Jan 31 01:27:40 np0005603623 kernel: Linux version 5.14.0-665.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026
Jan 31 01:27:40 np0005603623 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 31 01:27:40 np0005603623 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 31 01:27:40 np0005603623 kernel: BIOS-provided physical RAM map:
Jan 31 01:27:40 np0005603623 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 31 01:27:40 np0005603623 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 31 01:27:40 np0005603623 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 31 01:27:40 np0005603623 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 31 01:27:40 np0005603623 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 31 01:27:40 np0005603623 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 31 01:27:40 np0005603623 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 31 01:27:40 np0005603623 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 31 01:27:40 np0005603623 kernel: NX (Execute Disable) protection: active
Jan 31 01:27:40 np0005603623 kernel: APIC: Static calls initialized
Jan 31 01:27:40 np0005603623 kernel: SMBIOS 2.8 present.
Jan 31 01:27:40 np0005603623 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 31 01:27:40 np0005603623 kernel: Hypervisor detected: KVM
Jan 31 01:27:40 np0005603623 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 31 01:27:40 np0005603623 kernel: kvm-clock: using sched offset of 10648355110 cycles
Jan 31 01:27:40 np0005603623 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 31 01:27:40 np0005603623 kernel: tsc: Detected 2800.000 MHz processor
Jan 31 01:27:40 np0005603623 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 31 01:27:40 np0005603623 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 31 01:27:40 np0005603623 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 31 01:27:40 np0005603623 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 31 01:27:40 np0005603623 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 31 01:27:40 np0005603623 kernel: Using GB pages for direct mapping
Jan 31 01:27:40 np0005603623 kernel: RAMDISK: [mem 0x2d410000-0x329fffff]
Jan 31 01:27:40 np0005603623 kernel: ACPI: Early table checksum verification disabled
Jan 31 01:27:40 np0005603623 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 31 01:27:40 np0005603623 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 01:27:40 np0005603623 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 01:27:40 np0005603623 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 01:27:40 np0005603623 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 31 01:27:40 np0005603623 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 01:27:40 np0005603623 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 01:27:40 np0005603623 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 31 01:27:40 np0005603623 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 31 01:27:40 np0005603623 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 31 01:27:40 np0005603623 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 31 01:27:40 np0005603623 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 31 01:27:40 np0005603623 kernel: No NUMA configuration found
Jan 31 01:27:40 np0005603623 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 31 01:27:40 np0005603623 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 31 01:27:40 np0005603623 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 31 01:27:40 np0005603623 kernel: Zone ranges:
Jan 31 01:27:40 np0005603623 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 31 01:27:40 np0005603623 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 31 01:27:40 np0005603623 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 31 01:27:40 np0005603623 kernel:  Device   empty
Jan 31 01:27:40 np0005603623 kernel: Movable zone start for each node
Jan 31 01:27:40 np0005603623 kernel: Early memory node ranges
Jan 31 01:27:40 np0005603623 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 31 01:27:40 np0005603623 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 31 01:27:40 np0005603623 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 31 01:27:40 np0005603623 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 31 01:27:40 np0005603623 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 31 01:27:40 np0005603623 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 31 01:27:40 np0005603623 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 31 01:27:40 np0005603623 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 31 01:27:40 np0005603623 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 31 01:27:40 np0005603623 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 31 01:27:40 np0005603623 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 31 01:27:40 np0005603623 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 31 01:27:40 np0005603623 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 31 01:27:40 np0005603623 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 31 01:27:40 np0005603623 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 31 01:27:40 np0005603623 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 31 01:27:40 np0005603623 kernel: TSC deadline timer available
Jan 31 01:27:40 np0005603623 kernel: CPU topo: Max. logical packages:   8
Jan 31 01:27:40 np0005603623 kernel: CPU topo: Max. logical dies:       8
Jan 31 01:27:40 np0005603623 kernel: CPU topo: Max. dies per package:   1
Jan 31 01:27:40 np0005603623 kernel: CPU topo: Max. threads per core:   1
Jan 31 01:27:40 np0005603623 kernel: CPU topo: Num. cores per package:     1
Jan 31 01:27:40 np0005603623 kernel: CPU topo: Num. threads per package:   1
Jan 31 01:27:40 np0005603623 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 31 01:27:40 np0005603623 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 31 01:27:40 np0005603623 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 31 01:27:40 np0005603623 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 31 01:27:40 np0005603623 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 31 01:27:40 np0005603623 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 31 01:27:40 np0005603623 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 31 01:27:40 np0005603623 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 31 01:27:40 np0005603623 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 31 01:27:40 np0005603623 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 31 01:27:40 np0005603623 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 31 01:27:40 np0005603623 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 31 01:27:40 np0005603623 kernel: Booting paravirtualized kernel on KVM
Jan 31 01:27:40 np0005603623 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 31 01:27:40 np0005603623 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 31 01:27:40 np0005603623 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 31 01:27:40 np0005603623 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 31 01:27:40 np0005603623 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 31 01:27:40 np0005603623 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64", will be passed to user space.
Jan 31 01:27:40 np0005603623 kernel: random: crng init done
Jan 31 01:27:40 np0005603623 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 31 01:27:40 np0005603623 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 31 01:27:40 np0005603623 kernel: Fallback order for Node 0: 0 
Jan 31 01:27:40 np0005603623 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 31 01:27:40 np0005603623 kernel: Policy zone: Normal
Jan 31 01:27:40 np0005603623 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 31 01:27:40 np0005603623 kernel: software IO TLB: area num 8.
Jan 31 01:27:40 np0005603623 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 31 01:27:40 np0005603623 kernel: ftrace: allocating 49438 entries in 194 pages
Jan 31 01:27:40 np0005603623 kernel: ftrace: allocated 194 pages with 3 groups
Jan 31 01:27:40 np0005603623 kernel: Dynamic Preempt: voluntary
Jan 31 01:27:40 np0005603623 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 31 01:27:40 np0005603623 kernel: rcu: 	RCU event tracing is enabled.
Jan 31 01:27:40 np0005603623 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 31 01:27:40 np0005603623 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 31 01:27:40 np0005603623 kernel: 	Rude variant of Tasks RCU enabled.
Jan 31 01:27:40 np0005603623 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 31 01:27:40 np0005603623 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 31 01:27:40 np0005603623 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 31 01:27:40 np0005603623 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 31 01:27:40 np0005603623 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 31 01:27:40 np0005603623 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 31 01:27:40 np0005603623 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 31 01:27:40 np0005603623 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 31 01:27:40 np0005603623 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 31 01:27:40 np0005603623 kernel: Console: colour VGA+ 80x25
Jan 31 01:27:40 np0005603623 kernel: printk: console [ttyS0] enabled
Jan 31 01:27:40 np0005603623 kernel: ACPI: Core revision 20230331
Jan 31 01:27:40 np0005603623 kernel: APIC: Switch to symmetric I/O mode setup
Jan 31 01:27:40 np0005603623 kernel: x2apic enabled
Jan 31 01:27:40 np0005603623 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 31 01:27:40 np0005603623 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 31 01:27:40 np0005603623 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 31 01:27:40 np0005603623 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 31 01:27:40 np0005603623 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 31 01:27:40 np0005603623 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 31 01:27:40 np0005603623 kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Jan 31 01:27:40 np0005603623 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 31 01:27:40 np0005603623 kernel: Spectre V2 : Mitigation: Retpolines
Jan 31 01:27:40 np0005603623 kernel: RETBleed: Mitigation: untrained return thunk
Jan 31 01:27:40 np0005603623 kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Jan 31 01:27:40 np0005603623 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 31 01:27:40 np0005603623 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 31 01:27:40 np0005603623 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 31 01:27:40 np0005603623 kernel: active return thunk: retbleed_return_thunk
Jan 31 01:27:40 np0005603623 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 31 01:27:40 np0005603623 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 31 01:27:40 np0005603623 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 31 01:27:40 np0005603623 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 31 01:27:40 np0005603623 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 31 01:27:40 np0005603623 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 31 01:27:40 np0005603623 kernel: Freeing SMP alternatives memory: 40K
Jan 31 01:27:40 np0005603623 kernel: pid_max: default: 32768 minimum: 301
Jan 31 01:27:40 np0005603623 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 31 01:27:40 np0005603623 kernel: landlock: Up and running.
Jan 31 01:27:40 np0005603623 kernel: Yama: becoming mindful.
Jan 31 01:27:40 np0005603623 kernel: SELinux:  Initializing.
Jan 31 01:27:40 np0005603623 kernel: LSM support for eBPF active
Jan 31 01:27:40 np0005603623 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 31 01:27:40 np0005603623 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 31 01:27:40 np0005603623 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 31 01:27:40 np0005603623 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 31 01:27:40 np0005603623 kernel: ... version:                0
Jan 31 01:27:40 np0005603623 kernel: ... bit width:              48
Jan 31 01:27:40 np0005603623 kernel: ... generic registers:      6
Jan 31 01:27:40 np0005603623 kernel: ... value mask:             0000ffffffffffff
Jan 31 01:27:40 np0005603623 kernel: ... max period:             00007fffffffffff
Jan 31 01:27:40 np0005603623 kernel: ... fixed-purpose events:   0
Jan 31 01:27:40 np0005603623 kernel: ... event mask:             000000000000003f
Jan 31 01:27:40 np0005603623 kernel: signal: max sigframe size: 1776
Jan 31 01:27:40 np0005603623 kernel: rcu: Hierarchical SRCU implementation.
Jan 31 01:27:40 np0005603623 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 31 01:27:40 np0005603623 kernel: smp: Bringing up secondary CPUs ...
Jan 31 01:27:40 np0005603623 kernel: smpboot: x86: Booting SMP configuration:
Jan 31 01:27:40 np0005603623 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 31 01:27:40 np0005603623 kernel: smp: Brought up 1 node, 8 CPUs
Jan 31 01:27:40 np0005603623 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 31 01:27:40 np0005603623 kernel: node 0 deferred pages initialised in 26ms
Jan 31 01:27:40 np0005603623 kernel: Memory: 7763748K/8388068K available (16384K kernel code, 5801K rwdata, 13928K rodata, 4196K init, 7192K bss, 618400K reserved, 0K cma-reserved)
Jan 31 01:27:40 np0005603623 kernel: devtmpfs: initialized
Jan 31 01:27:40 np0005603623 kernel: x86/mm: Memory block size: 128MB
Jan 31 01:27:40 np0005603623 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 31 01:27:40 np0005603623 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 31 01:27:40 np0005603623 kernel: pinctrl core: initialized pinctrl subsystem
Jan 31 01:27:40 np0005603623 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 31 01:27:40 np0005603623 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 31 01:27:40 np0005603623 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 31 01:27:40 np0005603623 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 31 01:27:40 np0005603623 kernel: audit: initializing netlink subsys (disabled)
Jan 31 01:27:40 np0005603623 kernel: audit: type=2000 audit(1769840860.216:1): state=initialized audit_enabled=0 res=1
Jan 31 01:27:40 np0005603623 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 31 01:27:40 np0005603623 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 31 01:27:40 np0005603623 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 31 01:27:40 np0005603623 kernel: cpuidle: using governor menu
Jan 31 01:27:40 np0005603623 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 31 01:27:40 np0005603623 kernel: PCI: Using configuration type 1 for base access
Jan 31 01:27:40 np0005603623 kernel: PCI: Using configuration type 1 for extended access
Jan 31 01:27:40 np0005603623 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 31 01:27:40 np0005603623 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 31 01:27:40 np0005603623 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 31 01:27:40 np0005603623 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 31 01:27:40 np0005603623 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 31 01:27:40 np0005603623 kernel: Demotion targets for Node 0: null
Jan 31 01:27:40 np0005603623 kernel: cryptd: max_cpu_qlen set to 1000
Jan 31 01:27:40 np0005603623 kernel: ACPI: Added _OSI(Module Device)
Jan 31 01:27:40 np0005603623 kernel: ACPI: Added _OSI(Processor Device)
Jan 31 01:27:40 np0005603623 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 31 01:27:40 np0005603623 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 31 01:27:40 np0005603623 kernel: ACPI: Interpreter enabled
Jan 31 01:27:40 np0005603623 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 31 01:27:40 np0005603623 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 31 01:27:40 np0005603623 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 31 01:27:40 np0005603623 kernel: PCI: Using E820 reservations for host bridge windows
Jan 31 01:27:40 np0005603623 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 31 01:27:40 np0005603623 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 31 01:27:40 np0005603623 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [3] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [4] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [5] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [6] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [7] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [8] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [9] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [10] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [11] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [12] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [13] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [14] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [15] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [16] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [17] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [18] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [19] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [20] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [21] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [22] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [23] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [24] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [25] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [26] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [27] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [28] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [29] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [30] registered
Jan 31 01:27:40 np0005603623 kernel: acpiphp: Slot [31] registered
Jan 31 01:27:40 np0005603623 kernel: PCI host bridge to bus 0000:00
Jan 31 01:27:40 np0005603623 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 31 01:27:40 np0005603623 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 31 01:27:40 np0005603623 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 31 01:27:40 np0005603623 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 31 01:27:40 np0005603623 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 31 01:27:40 np0005603623 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 31 01:27:40 np0005603623 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 31 01:27:40 np0005603623 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 31 01:27:40 np0005603623 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 31 01:27:40 np0005603623 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 31 01:27:40 np0005603623 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 31 01:27:40 np0005603623 kernel: iommu: Default domain type: Translated
Jan 31 01:27:40 np0005603623 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 31 01:27:40 np0005603623 kernel: SCSI subsystem initialized
Jan 31 01:27:40 np0005603623 kernel: ACPI: bus type USB registered
Jan 31 01:27:40 np0005603623 kernel: usbcore: registered new interface driver usbfs
Jan 31 01:27:40 np0005603623 kernel: usbcore: registered new interface driver hub
Jan 31 01:27:40 np0005603623 kernel: usbcore: registered new device driver usb
Jan 31 01:27:40 np0005603623 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 31 01:27:40 np0005603623 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 31 01:27:40 np0005603623 kernel: PTP clock support registered
Jan 31 01:27:40 np0005603623 kernel: EDAC MC: Ver: 3.0.0
Jan 31 01:27:40 np0005603623 kernel: NetLabel: Initializing
Jan 31 01:27:40 np0005603623 kernel: NetLabel:  domain hash size = 128
Jan 31 01:27:40 np0005603623 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 31 01:27:40 np0005603623 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 31 01:27:40 np0005603623 kernel: PCI: Using ACPI for IRQ routing
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 31 01:27:40 np0005603623 kernel: vgaarb: loaded
Jan 31 01:27:40 np0005603623 kernel: clocksource: Switched to clocksource kvm-clock
Jan 31 01:27:40 np0005603623 kernel: VFS: Disk quotas dquot_6.6.0
Jan 31 01:27:40 np0005603623 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 31 01:27:40 np0005603623 kernel: pnp: PnP ACPI init
Jan 31 01:27:40 np0005603623 kernel: pnp: PnP ACPI: found 5 devices
Jan 31 01:27:40 np0005603623 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 31 01:27:40 np0005603623 kernel: NET: Registered PF_INET protocol family
Jan 31 01:27:40 np0005603623 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 31 01:27:40 np0005603623 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 31 01:27:40 np0005603623 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 31 01:27:40 np0005603623 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 31 01:27:40 np0005603623 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 31 01:27:40 np0005603623 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 31 01:27:40 np0005603623 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 31 01:27:40 np0005603623 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 31 01:27:40 np0005603623 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 31 01:27:40 np0005603623 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 31 01:27:40 np0005603623 kernel: NET: Registered PF_XDP protocol family
Jan 31 01:27:40 np0005603623 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 31 01:27:40 np0005603623 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 31 01:27:40 np0005603623 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 31 01:27:40 np0005603623 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 31 01:27:40 np0005603623 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 31 01:27:40 np0005603623 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 31 01:27:40 np0005603623 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 27449 usecs
Jan 31 01:27:40 np0005603623 kernel: PCI: CLS 0 bytes, default 64
Jan 31 01:27:40 np0005603623 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 31 01:27:40 np0005603623 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 31 01:27:40 np0005603623 kernel: ACPI: bus type thunderbolt registered
Jan 31 01:27:40 np0005603623 kernel: Trying to unpack rootfs image as initramfs...
Jan 31 01:27:40 np0005603623 kernel: Initialise system trusted keyrings
Jan 31 01:27:40 np0005603623 kernel: Key type blacklist registered
Jan 31 01:27:40 np0005603623 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 31 01:27:40 np0005603623 kernel: zbud: loaded
Jan 31 01:27:40 np0005603623 kernel: integrity: Platform Keyring initialized
Jan 31 01:27:40 np0005603623 kernel: integrity: Machine keyring initialized
Jan 31 01:27:40 np0005603623 kernel: Freeing initrd memory: 88000K
Jan 31 01:27:40 np0005603623 kernel: NET: Registered PF_ALG protocol family
Jan 31 01:27:40 np0005603623 kernel: xor: automatically using best checksumming function   avx       
Jan 31 01:27:40 np0005603623 kernel: Key type asymmetric registered
Jan 31 01:27:40 np0005603623 kernel: Asymmetric key parser 'x509' registered
Jan 31 01:27:40 np0005603623 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 31 01:27:40 np0005603623 kernel: io scheduler mq-deadline registered
Jan 31 01:27:40 np0005603623 kernel: io scheduler kyber registered
Jan 31 01:27:40 np0005603623 kernel: io scheduler bfq registered
Jan 31 01:27:40 np0005603623 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 31 01:27:40 np0005603623 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 31 01:27:40 np0005603623 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 31 01:27:40 np0005603623 kernel: ACPI: button: Power Button [PWRF]
Jan 31 01:27:40 np0005603623 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 31 01:27:40 np0005603623 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 31 01:27:40 np0005603623 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 31 01:27:40 np0005603623 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 31 01:27:40 np0005603623 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 31 01:27:40 np0005603623 kernel: Non-volatile memory driver v1.3
Jan 31 01:27:40 np0005603623 kernel: rdac: device handler registered
Jan 31 01:27:40 np0005603623 kernel: hp_sw: device handler registered
Jan 31 01:27:40 np0005603623 kernel: emc: device handler registered
Jan 31 01:27:40 np0005603623 kernel: alua: device handler registered
Jan 31 01:27:40 np0005603623 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 31 01:27:40 np0005603623 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 31 01:27:40 np0005603623 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 31 01:27:40 np0005603623 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 31 01:27:40 np0005603623 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 31 01:27:40 np0005603623 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 31 01:27:40 np0005603623 kernel: usb usb1: Product: UHCI Host Controller
Jan 31 01:27:40 np0005603623 kernel: usb usb1: Manufacturer: Linux 5.14.0-665.el9.x86_64 uhci_hcd
Jan 31 01:27:40 np0005603623 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 31 01:27:40 np0005603623 kernel: hub 1-0:1.0: USB hub found
Jan 31 01:27:40 np0005603623 kernel: hub 1-0:1.0: 2 ports detected
Jan 31 01:27:40 np0005603623 kernel: usbcore: registered new interface driver usbserial_generic
Jan 31 01:27:40 np0005603623 kernel: usbserial: USB Serial support registered for generic
Jan 31 01:27:40 np0005603623 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 31 01:27:40 np0005603623 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 31 01:27:40 np0005603623 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 31 01:27:40 np0005603623 kernel: mousedev: PS/2 mouse device common for all mice
Jan 31 01:27:40 np0005603623 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 31 01:27:40 np0005603623 kernel: rtc_cmos 00:04: registered as rtc0
Jan 31 01:27:40 np0005603623 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 31 01:27:40 np0005603623 kernel: rtc_cmos 00:04: setting system clock to 2026-01-31T06:27:40 UTC (1769840860)
Jan 31 01:27:40 np0005603623 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 31 01:27:40 np0005603623 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 31 01:27:40 np0005603623 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 31 01:27:40 np0005603623 kernel: usbcore: registered new interface driver usbhid
Jan 31 01:27:40 np0005603623 kernel: usbhid: USB HID core driver
Jan 31 01:27:40 np0005603623 kernel: drop_monitor: Initializing network drop monitor service
Jan 31 01:27:40 np0005603623 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 31 01:27:40 np0005603623 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 31 01:27:40 np0005603623 kernel: Initializing XFRM netlink socket
Jan 31 01:27:40 np0005603623 kernel: NET: Registered PF_INET6 protocol family
Jan 31 01:27:40 np0005603623 kernel: Segment Routing with IPv6
Jan 31 01:27:40 np0005603623 kernel: NET: Registered PF_PACKET protocol family
Jan 31 01:27:40 np0005603623 kernel: mpls_gso: MPLS GSO support
Jan 31 01:27:40 np0005603623 kernel: IPI shorthand broadcast: enabled
Jan 31 01:27:40 np0005603623 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 31 01:27:40 np0005603623 kernel: AES CTR mode by8 optimization enabled
Jan 31 01:27:40 np0005603623 kernel: sched_clock: Marking stable (1022002994, 151015296)->(1288426810, -115408520)
Jan 31 01:27:40 np0005603623 kernel: registered taskstats version 1
Jan 31 01:27:40 np0005603623 kernel: Loading compiled-in X.509 certificates
Jan 31 01:27:40 np0005603623 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Jan 31 01:27:40 np0005603623 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 31 01:27:40 np0005603623 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 31 01:27:40 np0005603623 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 31 01:27:40 np0005603623 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 31 01:27:40 np0005603623 kernel: Demotion targets for Node 0: null
Jan 31 01:27:40 np0005603623 kernel: page_owner is disabled
Jan 31 01:27:40 np0005603623 kernel: Key type .fscrypt registered
Jan 31 01:27:40 np0005603623 kernel: Key type fscrypt-provisioning registered
Jan 31 01:27:40 np0005603623 kernel: Key type big_key registered
Jan 31 01:27:40 np0005603623 kernel: Key type encrypted registered
Jan 31 01:27:40 np0005603623 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 31 01:27:40 np0005603623 kernel: Loading compiled-in module X.509 certificates
Jan 31 01:27:40 np0005603623 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Jan 31 01:27:40 np0005603623 kernel: ima: Allocated hash algorithm: sha256
Jan 31 01:27:40 np0005603623 kernel: ima: No architecture policies found
Jan 31 01:27:40 np0005603623 kernel: evm: Initialising EVM extended attributes:
Jan 31 01:27:40 np0005603623 kernel: evm: security.selinux
Jan 31 01:27:40 np0005603623 kernel: evm: security.SMACK64 (disabled)
Jan 31 01:27:40 np0005603623 kernel: evm: security.SMACK64EXEC (disabled)
Jan 31 01:27:40 np0005603623 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 31 01:27:40 np0005603623 kernel: evm: security.SMACK64MMAP (disabled)
Jan 31 01:27:40 np0005603623 kernel: evm: security.apparmor (disabled)
Jan 31 01:27:40 np0005603623 kernel: evm: security.ima
Jan 31 01:27:40 np0005603623 kernel: evm: security.capability
Jan 31 01:27:40 np0005603623 kernel: evm: HMAC attrs: 0x1
Jan 31 01:27:40 np0005603623 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 31 01:27:40 np0005603623 kernel: Running certificate verification RSA selftest
Jan 31 01:27:40 np0005603623 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 31 01:27:40 np0005603623 kernel: Running certificate verification ECDSA selftest
Jan 31 01:27:40 np0005603623 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 31 01:27:40 np0005603623 kernel: clk: Disabling unused clocks
Jan 31 01:27:40 np0005603623 kernel: Freeing unused decrypted memory: 2028K
Jan 31 01:27:40 np0005603623 kernel: Freeing unused kernel image (initmem) memory: 4196K
Jan 31 01:27:40 np0005603623 kernel: Write protecting the kernel read-only data: 30720k
Jan 31 01:27:40 np0005603623 kernel: Freeing unused kernel image (rodata/data gap) memory: 408K
Jan 31 01:27:40 np0005603623 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 31 01:27:40 np0005603623 kernel: Run /init as init process
Jan 31 01:27:40 np0005603623 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 31 01:27:40 np0005603623 systemd: Detected virtualization kvm.
Jan 31 01:27:40 np0005603623 systemd: Detected architecture x86-64.
Jan 31 01:27:40 np0005603623 systemd: Running in initrd.
Jan 31 01:27:40 np0005603623 systemd: No hostname configured, using default hostname.
Jan 31 01:27:40 np0005603623 systemd: Hostname set to <localhost>.
Jan 31 01:27:40 np0005603623 systemd: Initializing machine ID from VM UUID.
Jan 31 01:27:40 np0005603623 systemd: Queued start job for default target Initrd Default Target.
Jan 31 01:27:40 np0005603623 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 31 01:27:40 np0005603623 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 31 01:27:40 np0005603623 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 31 01:27:40 np0005603623 kernel: usb 1-1: Manufacturer: QEMU
Jan 31 01:27:40 np0005603623 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 31 01:27:40 np0005603623 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 31 01:27:40 np0005603623 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 31 01:27:40 np0005603623 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 31 01:27:40 np0005603623 systemd: Reached target Local Encrypted Volumes.
Jan 31 01:27:40 np0005603623 systemd: Reached target Initrd /usr File System.
Jan 31 01:27:40 np0005603623 systemd: Reached target Local File Systems.
Jan 31 01:27:40 np0005603623 systemd: Reached target Path Units.
Jan 31 01:27:40 np0005603623 systemd: Reached target Slice Units.
Jan 31 01:27:40 np0005603623 systemd: Reached target Swaps.
Jan 31 01:27:40 np0005603623 systemd: Reached target Timer Units.
Jan 31 01:27:40 np0005603623 systemd: Listening on D-Bus System Message Bus Socket.
Jan 31 01:27:40 np0005603623 systemd: Listening on Journal Socket (/dev/log).
Jan 31 01:27:40 np0005603623 systemd: Listening on Journal Socket.
Jan 31 01:27:40 np0005603623 systemd: Listening on udev Control Socket.
Jan 31 01:27:40 np0005603623 systemd: Listening on udev Kernel Socket.
Jan 31 01:27:40 np0005603623 systemd: Reached target Socket Units.
Jan 31 01:27:40 np0005603623 systemd: Starting Create List of Static Device Nodes...
Jan 31 01:27:40 np0005603623 systemd: Starting Journal Service...
Jan 31 01:27:40 np0005603623 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 31 01:27:40 np0005603623 systemd: Starting Apply Kernel Variables...
Jan 31 01:27:40 np0005603623 systemd: Starting Create System Users...
Jan 31 01:27:40 np0005603623 systemd: Starting Setup Virtual Console...
Jan 31 01:27:40 np0005603623 systemd: Finished Create List of Static Device Nodes.
Jan 31 01:27:40 np0005603623 systemd: Finished Apply Kernel Variables.
Jan 31 01:27:40 np0005603623 systemd: Finished Create System Users.
Jan 31 01:27:40 np0005603623 systemd-journald[310]: Journal started
Jan 31 01:27:40 np0005603623 systemd-journald[310]: Runtime Journal (/run/log/journal/4e15465d7c0349259fc3ba6a686b7adc) is 8.0M, max 153.6M, 145.6M free.
Jan 31 01:27:40 np0005603623 systemd-sysusers[315]: Creating group 'users' with GID 100.
Jan 31 01:27:40 np0005603623 systemd-sysusers[315]: Creating group 'dbus' with GID 81.
Jan 31 01:27:40 np0005603623 systemd-sysusers[315]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 31 01:27:41 np0005603623 systemd: Started Journal Service.
Jan 31 01:27:41 np0005603623 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 31 01:27:41 np0005603623 systemd[1]: Starting Create Volatile Files and Directories...
Jan 31 01:27:41 np0005603623 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 31 01:27:41 np0005603623 systemd[1]: Finished Create Volatile Files and Directories.
Jan 31 01:27:41 np0005603623 systemd[1]: Finished Setup Virtual Console.
Jan 31 01:27:41 np0005603623 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 31 01:27:41 np0005603623 systemd[1]: Starting dracut cmdline hook...
Jan 31 01:27:41 np0005603623 dracut-cmdline[328]: dracut-9 dracut-057-102.git20250818.el9
Jan 31 01:27:41 np0005603623 dracut-cmdline[328]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 31 01:27:41 np0005603623 systemd[1]: Finished dracut cmdline hook.
Jan 31 01:27:41 np0005603623 systemd[1]: Starting dracut pre-udev hook...
Jan 31 01:27:41 np0005603623 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 31 01:27:41 np0005603623 kernel: device-mapper: uevent: version 1.0.3
Jan 31 01:27:41 np0005603623 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 31 01:27:41 np0005603623 kernel: RPC: Registered named UNIX socket transport module.
Jan 31 01:27:41 np0005603623 kernel: RPC: Registered udp transport module.
Jan 31 01:27:41 np0005603623 kernel: RPC: Registered tcp transport module.
Jan 31 01:27:41 np0005603623 kernel: RPC: Registered tcp-with-tls transport module.
Jan 31 01:27:41 np0005603623 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 31 01:27:41 np0005603623 rpc.statd[446]: Version 2.5.4 starting
Jan 31 01:27:41 np0005603623 rpc.statd[446]: Initializing NSM state
Jan 31 01:27:41 np0005603623 rpc.idmapd[451]: Setting log level to 0
Jan 31 01:27:41 np0005603623 systemd[1]: Finished dracut pre-udev hook.
Jan 31 01:27:41 np0005603623 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 31 01:27:41 np0005603623 systemd-udevd[464]: Using default interface naming scheme 'rhel-9.0'.
Jan 31 01:27:41 np0005603623 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 31 01:27:41 np0005603623 systemd[1]: Starting dracut pre-trigger hook...
Jan 31 01:27:41 np0005603623 systemd[1]: Finished dracut pre-trigger hook.
Jan 31 01:27:41 np0005603623 systemd[1]: Starting Coldplug All udev Devices...
Jan 31 01:27:41 np0005603623 systemd[1]: Created slice Slice /system/modprobe.
Jan 31 01:27:41 np0005603623 systemd[1]: Starting Load Kernel Module configfs...
Jan 31 01:27:41 np0005603623 systemd[1]: Finished Coldplug All udev Devices.
Jan 31 01:27:41 np0005603623 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 31 01:27:41 np0005603623 systemd[1]: Finished Load Kernel Module configfs.
Jan 31 01:27:41 np0005603623 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 31 01:27:41 np0005603623 systemd[1]: Reached target Network.
Jan 31 01:27:41 np0005603623 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 31 01:27:41 np0005603623 systemd[1]: Starting dracut initqueue hook...
Jan 31 01:27:41 np0005603623 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 31 01:27:41 np0005603623 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 31 01:27:41 np0005603623 kernel: vda: vda1
Jan 31 01:27:41 np0005603623 kernel: scsi host0: ata_piix
Jan 31 01:27:41 np0005603623 kernel: scsi host1: ata_piix
Jan 31 01:27:41 np0005603623 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 31 01:27:41 np0005603623 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 31 01:27:41 np0005603623 systemd-udevd[494]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:27:41 np0005603623 systemd[1]: Found device /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Jan 31 01:27:41 np0005603623 systemd[1]: Reached target Initrd Root Device.
Jan 31 01:27:41 np0005603623 kernel: ata1: found unknown device (class 0)
Jan 31 01:27:41 np0005603623 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 31 01:27:41 np0005603623 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 31 01:27:41 np0005603623 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 31 01:27:41 np0005603623 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 31 01:27:41 np0005603623 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 31 01:27:41 np0005603623 systemd[1]: Mounting Kernel Configuration File System...
Jan 31 01:27:41 np0005603623 systemd[1]: Mounted Kernel Configuration File System.
Jan 31 01:27:41 np0005603623 systemd[1]: Reached target System Initialization.
Jan 31 01:27:41 np0005603623 systemd[1]: Reached target Basic System.
Jan 31 01:27:42 np0005603623 systemd[1]: Finished dracut initqueue hook.
Jan 31 01:27:42 np0005603623 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 31 01:27:42 np0005603623 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 31 01:27:42 np0005603623 systemd[1]: Reached target Remote File Systems.
Jan 31 01:27:42 np0005603623 systemd[1]: Starting dracut pre-mount hook...
Jan 31 01:27:42 np0005603623 systemd[1]: Finished dracut pre-mount hook.
Jan 31 01:27:42 np0005603623 systemd[1]: Starting File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8...
Jan 31 01:27:42 np0005603623 systemd-fsck[559]: /usr/sbin/fsck.xfs: XFS file system.
Jan 31 01:27:42 np0005603623 systemd[1]: Finished File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Jan 31 01:27:42 np0005603623 systemd[1]: Mounting /sysroot...
Jan 31 01:27:42 np0005603623 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 31 01:27:42 np0005603623 kernel: XFS (vda1): Mounting V5 Filesystem 822f14ea-6e7e-41df-b0d8-fbe282d9ded8
Jan 31 01:27:42 np0005603623 kernel: XFS (vda1): Ending clean mount
Jan 31 01:27:42 np0005603623 systemd[1]: Mounted /sysroot.
Jan 31 01:27:42 np0005603623 systemd[1]: Reached target Initrd Root File System.
Jan 31 01:27:42 np0005603623 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 31 01:27:42 np0005603623 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 31 01:27:42 np0005603623 systemd[1]: Reached target Initrd File Systems.
Jan 31 01:27:42 np0005603623 systemd[1]: Reached target Initrd Default Target.
Jan 31 01:27:42 np0005603623 systemd[1]: Starting dracut mount hook...
Jan 31 01:27:42 np0005603623 systemd[1]: Finished dracut mount hook.
Jan 31 01:27:42 np0005603623 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 31 01:27:42 np0005603623 rpc.idmapd[451]: exiting on signal 15
Jan 31 01:27:42 np0005603623 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 31 01:27:42 np0005603623 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Network.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Timer Units.
Jan 31 01:27:42 np0005603623 systemd[1]: dbus.socket: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 31 01:27:42 np0005603623 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Initrd Default Target.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Basic System.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Initrd Root Device.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Initrd /usr File System.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Path Units.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Remote File Systems.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Slice Units.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Socket Units.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target System Initialization.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Local File Systems.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Swaps.
Jan 31 01:27:42 np0005603623 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped dracut mount hook.
Jan 31 01:27:42 np0005603623 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped dracut pre-mount hook.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 31 01:27:42 np0005603623 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 31 01:27:42 np0005603623 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped dracut initqueue hook.
Jan 31 01:27:42 np0005603623 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped Apply Kernel Variables.
Jan 31 01:27:42 np0005603623 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 31 01:27:42 np0005603623 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped Coldplug All udev Devices.
Jan 31 01:27:42 np0005603623 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped dracut pre-trigger hook.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 31 01:27:42 np0005603623 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped Setup Virtual Console.
Jan 31 01:27:42 np0005603623 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 31 01:27:42 np0005603623 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 31 01:27:42 np0005603623 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Closed udev Control Socket.
Jan 31 01:27:42 np0005603623 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Closed udev Kernel Socket.
Jan 31 01:27:42 np0005603623 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped dracut pre-udev hook.
Jan 31 01:27:42 np0005603623 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped dracut cmdline hook.
Jan 31 01:27:42 np0005603623 systemd[1]: Starting Cleanup udev Database...
Jan 31 01:27:42 np0005603623 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 31 01:27:42 np0005603623 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 31 01:27:42 np0005603623 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Stopped Create System Users.
Jan 31 01:27:42 np0005603623 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 31 01:27:42 np0005603623 systemd[1]: Finished Cleanup udev Database.
Jan 31 01:27:42 np0005603623 systemd[1]: Reached target Switch Root.
Jan 31 01:27:42 np0005603623 systemd[1]: Starting Switch Root...
Jan 31 01:27:42 np0005603623 systemd[1]: Switching root.
Jan 31 01:27:42 np0005603623 systemd-journald[310]: Journal stopped
Jan 31 01:27:43 np0005603623 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 31 01:27:43 np0005603623 kernel: audit: type=1404 audit(1769840863.182:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 31 01:27:43 np0005603623 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 01:27:43 np0005603623 kernel: SELinux:  policy capability open_perms=1
Jan 31 01:27:43 np0005603623 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 01:27:43 np0005603623 kernel: SELinux:  policy capability always_check_network=0
Jan 31 01:27:43 np0005603623 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 01:27:43 np0005603623 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 01:27:43 np0005603623 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 01:27:43 np0005603623 kernel: audit: type=1403 audit(1769840863.290:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 31 01:27:43 np0005603623 systemd: Successfully loaded SELinux policy in 118.429ms.
Jan 31 01:27:43 np0005603623 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 31.311ms.
Jan 31 01:27:43 np0005603623 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 31 01:27:43 np0005603623 systemd: Detected virtualization kvm.
Jan 31 01:27:43 np0005603623 systemd: Detected architecture x86-64.
Jan 31 01:27:43 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:27:43 np0005603623 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 31 01:27:43 np0005603623 systemd: Stopped Switch Root.
Jan 31 01:27:43 np0005603623 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 31 01:27:43 np0005603623 systemd: Created slice Slice /system/getty.
Jan 31 01:27:43 np0005603623 systemd: Created slice Slice /system/serial-getty.
Jan 31 01:27:43 np0005603623 systemd: Created slice Slice /system/sshd-keygen.
Jan 31 01:27:43 np0005603623 systemd: Created slice User and Session Slice.
Jan 31 01:27:43 np0005603623 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 31 01:27:43 np0005603623 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 31 01:27:43 np0005603623 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 31 01:27:43 np0005603623 systemd: Reached target Local Encrypted Volumes.
Jan 31 01:27:43 np0005603623 systemd: Stopped target Switch Root.
Jan 31 01:27:43 np0005603623 systemd: Stopped target Initrd File Systems.
Jan 31 01:27:43 np0005603623 systemd: Stopped target Initrd Root File System.
Jan 31 01:27:43 np0005603623 systemd: Reached target Local Integrity Protected Volumes.
Jan 31 01:27:43 np0005603623 systemd: Reached target Path Units.
Jan 31 01:27:43 np0005603623 systemd: Reached target rpc_pipefs.target.
Jan 31 01:27:43 np0005603623 systemd: Reached target Slice Units.
Jan 31 01:27:43 np0005603623 systemd: Reached target Swaps.
Jan 31 01:27:43 np0005603623 systemd: Reached target Local Verity Protected Volumes.
Jan 31 01:27:43 np0005603623 systemd: Listening on RPCbind Server Activation Socket.
Jan 31 01:27:43 np0005603623 systemd: Reached target RPC Port Mapper.
Jan 31 01:27:43 np0005603623 systemd: Listening on Process Core Dump Socket.
Jan 31 01:27:43 np0005603623 systemd: Listening on initctl Compatibility Named Pipe.
Jan 31 01:27:43 np0005603623 systemd: Listening on udev Control Socket.
Jan 31 01:27:43 np0005603623 systemd: Listening on udev Kernel Socket.
Jan 31 01:27:43 np0005603623 systemd: Mounting Huge Pages File System...
Jan 31 01:27:43 np0005603623 systemd: Mounting POSIX Message Queue File System...
Jan 31 01:27:43 np0005603623 systemd: Mounting Kernel Debug File System...
Jan 31 01:27:43 np0005603623 systemd: Mounting Kernel Trace File System...
Jan 31 01:27:43 np0005603623 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 31 01:27:43 np0005603623 systemd: Starting Create List of Static Device Nodes...
Jan 31 01:27:43 np0005603623 systemd: Starting Load Kernel Module configfs...
Jan 31 01:27:43 np0005603623 systemd: Starting Load Kernel Module drm...
Jan 31 01:27:43 np0005603623 systemd: Starting Load Kernel Module efi_pstore...
Jan 31 01:27:43 np0005603623 systemd: Starting Load Kernel Module fuse...
Jan 31 01:27:43 np0005603623 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 31 01:27:43 np0005603623 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 31 01:27:43 np0005603623 systemd: Stopped File System Check on Root Device.
Jan 31 01:27:43 np0005603623 systemd: Stopped Journal Service.
Jan 31 01:27:43 np0005603623 systemd: Starting Journal Service...
Jan 31 01:27:43 np0005603623 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 31 01:27:43 np0005603623 systemd: Starting Generate network units from Kernel command line...
Jan 31 01:27:43 np0005603623 kernel: fuse: init (API version 7.37)
Jan 31 01:27:43 np0005603623 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 31 01:27:43 np0005603623 systemd: Starting Remount Root and Kernel File Systems...
Jan 31 01:27:43 np0005603623 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 31 01:27:43 np0005603623 systemd: Starting Apply Kernel Variables...
Jan 31 01:27:43 np0005603623 systemd: Starting Coldplug All udev Devices...
Jan 31 01:27:43 np0005603623 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 31 01:27:43 np0005603623 systemd: Mounted Huge Pages File System.
Jan 31 01:27:43 np0005603623 systemd: Mounted POSIX Message Queue File System.
Jan 31 01:27:43 np0005603623 systemd: Mounted Kernel Debug File System.
Jan 31 01:27:43 np0005603623 systemd: Mounted Kernel Trace File System.
Jan 31 01:27:43 np0005603623 systemd: Finished Create List of Static Device Nodes.
Jan 31 01:27:43 np0005603623 systemd: modprobe@configfs.service: Deactivated successfully.
Jan 31 01:27:43 np0005603623 systemd: Finished Load Kernel Module configfs.
Jan 31 01:27:43 np0005603623 systemd: modprobe@efi_pstore.service: Deactivated successfully.
Jan 31 01:27:43 np0005603623 kernel: ACPI: bus type drm_connector registered
Jan 31 01:27:43 np0005603623 systemd: Finished Load Kernel Module efi_pstore.
Jan 31 01:27:43 np0005603623 systemd-journald[680]: Journal started
Jan 31 01:27:43 np0005603623 systemd-journald[680]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Jan 31 01:27:43 np0005603623 systemd[1]: Queued start job for default target Multi-User System.
Jan 31 01:27:43 np0005603623 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 31 01:27:43 np0005603623 systemd: Started Journal Service.
Jan 31 01:27:43 np0005603623 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 31 01:27:43 np0005603623 systemd[1]: Finished Load Kernel Module drm.
Jan 31 01:27:43 np0005603623 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 31 01:27:43 np0005603623 systemd[1]: Finished Load Kernel Module fuse.
Jan 31 01:27:43 np0005603623 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 31 01:27:43 np0005603623 systemd[1]: Finished Generate network units from Kernel command line.
Jan 31 01:27:43 np0005603623 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 31 01:27:43 np0005603623 systemd[1]: Finished Apply Kernel Variables.
Jan 31 01:27:43 np0005603623 systemd[1]: Mounting FUSE Control File System...
Jan 31 01:27:43 np0005603623 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 31 01:27:43 np0005603623 systemd[1]: Starting Rebuild Hardware Database...
Jan 31 01:27:43 np0005603623 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 31 01:27:43 np0005603623 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 31 01:27:43 np0005603623 systemd[1]: Starting Load/Save OS Random Seed...
Jan 31 01:27:43 np0005603623 systemd[1]: Starting Create System Users...
Jan 31 01:27:43 np0005603623 systemd[1]: Mounted FUSE Control File System.
Jan 31 01:27:43 np0005603623 systemd-journald[680]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Jan 31 01:27:43 np0005603623 systemd-journald[680]: Received client request to flush runtime journal.
Jan 31 01:27:43 np0005603623 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 31 01:27:43 np0005603623 systemd[1]: Finished Coldplug All udev Devices.
Jan 31 01:27:43 np0005603623 systemd[1]: Finished Load/Save OS Random Seed.
Jan 31 01:27:43 np0005603623 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 31 01:27:44 np0005603623 systemd[1]: Finished Create System Users.
Jan 31 01:27:44 np0005603623 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 31 01:27:44 np0005603623 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 31 01:27:44 np0005603623 systemd[1]: Reached target Preparation for Local File Systems.
Jan 31 01:27:44 np0005603623 systemd[1]: Reached target Local File Systems.
Jan 31 01:27:44 np0005603623 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 31 01:27:44 np0005603623 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 31 01:27:44 np0005603623 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 31 01:27:44 np0005603623 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 31 01:27:44 np0005603623 systemd[1]: Starting Automatic Boot Loader Update...
Jan 31 01:27:44 np0005603623 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 31 01:27:44 np0005603623 systemd[1]: Starting Create Volatile Files and Directories...
Jan 31 01:27:44 np0005603623 bootctl[697]: Couldn't find EFI system partition, skipping.
Jan 31 01:27:44 np0005603623 systemd[1]: Finished Automatic Boot Loader Update.
Jan 31 01:27:44 np0005603623 systemd[1]: Finished Create Volatile Files and Directories.
Jan 31 01:27:44 np0005603623 systemd[1]: Starting Security Auditing Service...
Jan 31 01:27:44 np0005603623 systemd[1]: Starting RPC Bind...
Jan 31 01:27:44 np0005603623 systemd[1]: Starting Rebuild Journal Catalog...
Jan 31 01:27:44 np0005603623 auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 31 01:27:44 np0005603623 auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 31 01:27:44 np0005603623 systemd[1]: Finished Rebuild Journal Catalog.
Jan 31 01:27:44 np0005603623 systemd[1]: Started RPC Bind.
Jan 31 01:27:44 np0005603623 augenrules[708]: /sbin/augenrules: No change
Jan 31 01:27:44 np0005603623 augenrules[723]: No rules
Jan 31 01:27:44 np0005603623 augenrules[723]: enabled 1
Jan 31 01:27:44 np0005603623 augenrules[723]: failure 1
Jan 31 01:27:44 np0005603623 augenrules[723]: pid 703
Jan 31 01:27:44 np0005603623 augenrules[723]: rate_limit 0
Jan 31 01:27:44 np0005603623 augenrules[723]: backlog_limit 8192
Jan 31 01:27:44 np0005603623 augenrules[723]: lost 0
Jan 31 01:27:44 np0005603623 augenrules[723]: backlog 4
Jan 31 01:27:44 np0005603623 augenrules[723]: backlog_wait_time 60000
Jan 31 01:27:44 np0005603623 augenrules[723]: backlog_wait_time_actual 0
Jan 31 01:27:44 np0005603623 augenrules[723]: enabled 1
Jan 31 01:27:44 np0005603623 augenrules[723]: failure 1
Jan 31 01:27:44 np0005603623 augenrules[723]: pid 703
Jan 31 01:27:44 np0005603623 augenrules[723]: rate_limit 0
Jan 31 01:27:44 np0005603623 augenrules[723]: backlog_limit 8192
Jan 31 01:27:44 np0005603623 augenrules[723]: lost 0
Jan 31 01:27:44 np0005603623 augenrules[723]: backlog 3
Jan 31 01:27:44 np0005603623 augenrules[723]: backlog_wait_time 60000
Jan 31 01:27:44 np0005603623 augenrules[723]: backlog_wait_time_actual 0
Jan 31 01:27:44 np0005603623 augenrules[723]: enabled 1
Jan 31 01:27:44 np0005603623 augenrules[723]: failure 1
Jan 31 01:27:44 np0005603623 augenrules[723]: pid 703
Jan 31 01:27:44 np0005603623 augenrules[723]: rate_limit 0
Jan 31 01:27:44 np0005603623 augenrules[723]: backlog_limit 8192
Jan 31 01:27:44 np0005603623 augenrules[723]: lost 0
Jan 31 01:27:44 np0005603623 augenrules[723]: backlog 0
Jan 31 01:27:44 np0005603623 augenrules[723]: backlog_wait_time 60000
Jan 31 01:27:44 np0005603623 augenrules[723]: backlog_wait_time_actual 0
Jan 31 01:27:44 np0005603623 systemd[1]: Started Security Auditing Service.
Jan 31 01:27:44 np0005603623 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 31 01:27:44 np0005603623 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 31 01:27:44 np0005603623 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 31 01:27:44 np0005603623 systemd[1]: Finished Rebuild Hardware Database.
Jan 31 01:27:44 np0005603623 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 31 01:27:44 np0005603623 systemd[1]: Starting Update is Completed...
Jan 31 01:27:44 np0005603623 systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Jan 31 01:27:44 np0005603623 systemd[1]: Finished Update is Completed.
Jan 31 01:27:45 np0005603623 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 31 01:27:45 np0005603623 systemd[1]: Reached target System Initialization.
Jan 31 01:27:45 np0005603623 systemd[1]: Started dnf makecache --timer.
Jan 31 01:27:45 np0005603623 systemd[1]: Started Daily rotation of log files.
Jan 31 01:27:45 np0005603623 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 31 01:27:45 np0005603623 systemd[1]: Reached target Timer Units.
Jan 31 01:27:45 np0005603623 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 31 01:27:45 np0005603623 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 31 01:27:45 np0005603623 systemd[1]: Reached target Socket Units.
Jan 31 01:27:45 np0005603623 systemd[1]: Starting D-Bus System Message Bus...
Jan 31 01:27:45 np0005603623 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 31 01:27:45 np0005603623 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 31 01:27:45 np0005603623 systemd[1]: Starting Load Kernel Module configfs...
Jan 31 01:27:45 np0005603623 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 31 01:27:45 np0005603623 systemd[1]: Finished Load Kernel Module configfs.
Jan 31 01:27:45 np0005603623 systemd-udevd[759]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:27:45 np0005603623 systemd[1]: Started D-Bus System Message Bus.
Jan 31 01:27:45 np0005603623 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 31 01:27:45 np0005603623 systemd[1]: Reached target Basic System.
Jan 31 01:27:45 np0005603623 dbus-broker-lau[769]: Ready
Jan 31 01:27:45 np0005603623 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 31 01:27:45 np0005603623 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 31 01:27:45 np0005603623 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 31 01:27:45 np0005603623 systemd[1]: Starting NTP client/server...
Jan 31 01:27:45 np0005603623 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 31 01:27:45 np0005603623 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 31 01:27:45 np0005603623 systemd[1]: Starting IPv4 firewall with iptables...
Jan 31 01:27:45 np0005603623 systemd[1]: Started irqbalance daemon.
Jan 31 01:27:45 np0005603623 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 31 01:27:45 np0005603623 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 01:27:45 np0005603623 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 01:27:45 np0005603623 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 01:27:45 np0005603623 systemd[1]: Reached target sshd-keygen.target.
Jan 31 01:27:45 np0005603623 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 31 01:27:45 np0005603623 systemd[1]: Reached target User and Group Name Lookups.
Jan 31 01:27:45 np0005603623 systemd[1]: Starting User Login Management...
Jan 31 01:27:45 np0005603623 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 31 01:27:45 np0005603623 chronyd[800]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 31 01:27:45 np0005603623 chronyd[800]: Loaded 0 symmetric keys
Jan 31 01:27:45 np0005603623 chronyd[800]: Using right/UTC timezone to obtain leap second data
Jan 31 01:27:45 np0005603623 chronyd[800]: Loaded seccomp filter (level 2)
Jan 31 01:27:45 np0005603623 systemd[1]: Started NTP client/server.
Jan 31 01:27:45 np0005603623 systemd-logind[795]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 31 01:27:45 np0005603623 systemd-logind[795]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 31 01:27:45 np0005603623 systemd-logind[795]: New seat seat0.
Jan 31 01:27:45 np0005603623 systemd[1]: Started User Login Management.
Jan 31 01:27:45 np0005603623 kernel: kvm_amd: TSC scaling supported
Jan 31 01:27:45 np0005603623 kernel: kvm_amd: Nested Virtualization enabled
Jan 31 01:27:45 np0005603623 kernel: kvm_amd: Nested Paging enabled
Jan 31 01:27:45 np0005603623 kernel: kvm_amd: LBR virtualization supported
Jan 31 01:27:45 np0005603623 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 31 01:27:45 np0005603623 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 31 01:27:45 np0005603623 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 31 01:27:45 np0005603623 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 31 01:27:45 np0005603623 kernel: Console: switching to colour dummy device 80x25
Jan 31 01:27:45 np0005603623 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 31 01:27:45 np0005603623 kernel: [drm] features: -context_init
Jan 31 01:27:45 np0005603623 kernel: [drm] number of scanouts: 1
Jan 31 01:27:45 np0005603623 kernel: [drm] number of cap sets: 0
Jan 31 01:27:45 np0005603623 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 31 01:27:45 np0005603623 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 31 01:27:45 np0005603623 kernel: Console: switching to colour frame buffer device 128x48
Jan 31 01:27:45 np0005603623 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 31 01:27:45 np0005603623 iptables.init[788]: iptables: Applying firewall rules: [  OK  ]
Jan 31 01:27:45 np0005603623 systemd[1]: Finished IPv4 firewall with iptables.
Jan 31 01:27:46 np0005603623 cloud-init[840]: Cloud-init v. 24.4-8.el9 running 'init-local' at Sat, 31 Jan 2026 06:27:46 +0000. Up 6.76 seconds.
Jan 31 01:27:46 np0005603623 systemd[1]: run-cloud\x2dinit-tmp-tmpmv32r774.mount: Deactivated successfully.
Jan 31 01:27:46 np0005603623 systemd[1]: Starting Hostname Service...
Jan 31 01:27:46 np0005603623 systemd[1]: Started Hostname Service.
Jan 31 01:27:46 np0005603623 systemd-hostnamed[854]: Hostname set to <np0005603623.novalocal> (static)
Jan 31 01:27:46 np0005603623 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 31 01:27:46 np0005603623 systemd[1]: Reached target Preparation for Network.
Jan 31 01:27:46 np0005603623 systemd[1]: Starting Network Manager...
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9236] NetworkManager (version 1.54.3-2.el9) is starting... (boot:0253bde7-2e17-45fb-a5ca-f6e9e69beb56)
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9240] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9413] manager[0x558a81fe9000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9470] hostname: hostname: using hostnamed
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9471] hostname: static hostname changed from (none) to "np0005603623.novalocal"
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9473] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9554] manager[0x558a81fe9000]: rfkill: Wi-Fi hardware radio set enabled
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9555] manager[0x558a81fe9000]: rfkill: WWAN hardware radio set enabled
Jan 31 01:27:46 np0005603623 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9714] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9716] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9717] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9718] manager: Networking is enabled by state file
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9720] settings: Loaded settings plugin: keyfile (internal)
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9780] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9802] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9815] dhcp: init: Using DHCP client 'internal'
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9824] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9834] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9845] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9855] device (lo): Activation: starting connection 'lo' (b03d3090-8131-4757-b469-75b0afcb0833)
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9861] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9865] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9898] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9901] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9905] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9908] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 31 01:27:46 np0005603623 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9911] device (eth0): carrier: link connected
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9915] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9920] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9926] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9929] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9931] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9932] manager: NetworkManager state is now CONNECTING
Jan 31 01:27:46 np0005603623 systemd[1]: Started Network Manager.
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9934] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9940] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:27:46 np0005603623 NetworkManager[858]: <info>  [1769840866.9942] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:27:46 np0005603623 systemd[1]: Reached target Network.
Jan 31 01:27:46 np0005603623 systemd[1]: Starting Network Manager Wait Online...
Jan 31 01:27:46 np0005603623 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 31 01:27:47 np0005603623 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 01:27:47 np0005603623 NetworkManager[858]: <info>  [1769840867.0525] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 31 01:27:47 np0005603623 NetworkManager[858]: <info>  [1769840867.0549] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 31 01:27:47 np0005603623 NetworkManager[858]: <info>  [1769840867.0563] device (lo): Activation: successful, device activated.
Jan 31 01:27:47 np0005603623 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 31 01:27:47 np0005603623 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 31 01:27:47 np0005603623 systemd[1]: Reached target NFS client services.
Jan 31 01:27:47 np0005603623 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 31 01:27:47 np0005603623 systemd[1]: Reached target Remote File Systems.
Jan 31 01:27:47 np0005603623 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 31 01:27:48 np0005603623 NetworkManager[858]: <info>  [1769840868.4385] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Jan 31 01:27:48 np0005603623 NetworkManager[858]: <info>  [1769840868.4397] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 31 01:27:48 np0005603623 NetworkManager[858]: <info>  [1769840868.4419] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:27:48 np0005603623 NetworkManager[858]: <info>  [1769840868.4441] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:27:48 np0005603623 NetworkManager[858]: <info>  [1769840868.4442] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:27:48 np0005603623 NetworkManager[858]: <info>  [1769840868.4447] manager: NetworkManager state is now CONNECTED_SITE
Jan 31 01:27:48 np0005603623 NetworkManager[858]: <info>  [1769840868.4452] device (eth0): Activation: successful, device activated.
Jan 31 01:27:48 np0005603623 NetworkManager[858]: <info>  [1769840868.4458] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 31 01:27:48 np0005603623 NetworkManager[858]: <info>  [1769840868.4463] manager: startup complete
Jan 31 01:27:48 np0005603623 systemd[1]: Finished Network Manager Wait Online.
Jan 31 01:27:48 np0005603623 systemd[1]: Starting Cloud-init: Network Stage...
Jan 31 01:27:48 np0005603623 cloud-init[920]: Cloud-init v. 24.4-8.el9 running 'init' at Sat, 31 Jan 2026 06:27:48 +0000. Up 9.23 seconds.
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: |  eth0  | True |        38.102.83.110         | 255.255.255.0 | global | fa:16:3e:aa:a0:22 |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:feaa:a022/64 |       .       |  link  | fa:16:3e:aa:a0:22 |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Jan 31 01:27:48 np0005603623 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 31 01:27:53 np0005603623 chronyd[800]: Selected source 206.108.0.133 (2.centos.pool.ntp.org)
Jan 31 01:27:53 np0005603623 chronyd[800]: System clock TAI offset set to 37 seconds
Jan 31 01:27:55 np0005603623 irqbalance[789]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 31 01:27:55 np0005603623 irqbalance[789]: IRQ 25 affinity is now unmanaged
Jan 31 01:27:55 np0005603623 irqbalance[789]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 31 01:27:55 np0005603623 irqbalance[789]: IRQ 31 affinity is now unmanaged
Jan 31 01:27:55 np0005603623 irqbalance[789]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 31 01:27:55 np0005603623 irqbalance[789]: IRQ 28 affinity is now unmanaged
Jan 31 01:27:55 np0005603623 irqbalance[789]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 31 01:27:55 np0005603623 irqbalance[789]: IRQ 32 affinity is now unmanaged
Jan 31 01:27:55 np0005603623 irqbalance[789]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 31 01:27:55 np0005603623 irqbalance[789]: IRQ 30 affinity is now unmanaged
Jan 31 01:27:55 np0005603623 irqbalance[789]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 31 01:27:55 np0005603623 irqbalance[789]: IRQ 29 affinity is now unmanaged
Jan 31 01:27:58 np0005603623 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 01:28:04 np0005603623 cloud-init[920]: Generating public/private rsa key pair.
Jan 31 01:28:04 np0005603623 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 31 01:28:04 np0005603623 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 31 01:28:04 np0005603623 cloud-init[920]: The key fingerprint is:
Jan 31 01:28:04 np0005603623 cloud-init[920]: SHA256:q95z2u9YfpjoA9a/vS5J3bdPy+pm7yH7FzDBXWgLoLk root@np0005603623.novalocal
Jan 31 01:28:04 np0005603623 cloud-init[920]: The key's randomart image is:
Jan 31 01:28:04 np0005603623 cloud-init[920]: +---[RSA 3072]----+
Jan 31 01:28:04 np0005603623 cloud-init[920]: |          ... ..o|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |         o  .oo. |
Jan 31 01:28:04 np0005603623 cloud-init[920]: |        o    o.. |
Jan 31 01:28:04 np0005603623 cloud-init[920]: |         .   o.  |
Jan 31 01:28:04 np0005603623 cloud-init[920]: |        E.   .o. |
Jan 31 01:28:04 np0005603623 cloud-init[920]: |        o.. . ..o|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |       ... +.= .=|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |       o..++*+=o+|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |     .o o=++OOOB+|
Jan 31 01:28:04 np0005603623 cloud-init[920]: +----[SHA256]-----+
Jan 31 01:28:04 np0005603623 cloud-init[920]: Generating public/private ecdsa key pair.
Jan 31 01:28:04 np0005603623 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 31 01:28:04 np0005603623 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 31 01:28:04 np0005603623 cloud-init[920]: The key fingerprint is:
Jan 31 01:28:04 np0005603623 cloud-init[920]: SHA256:WRzie3NfKBd7HwFyNFCXT9pNXLU0iOsXQUzki5Peu3w root@np0005603623.novalocal
Jan 31 01:28:04 np0005603623 cloud-init[920]: The key's randomart image is:
Jan 31 01:28:04 np0005603623 cloud-init[920]: +---[ECDSA 256]---+
Jan 31 01:28:04 np0005603623 cloud-init[920]: |        . .oXX.=*|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |       . o o++=.*|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |        . o ..oBo|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |         + .o.o+=|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |        S ++o.=.o|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |         ..+o= oo|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |           .... .|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |            . .E |
Jan 31 01:28:04 np0005603623 cloud-init[920]: |             +o  |
Jan 31 01:28:04 np0005603623 cloud-init[920]: +----[SHA256]-----+
Jan 31 01:28:04 np0005603623 cloud-init[920]: Generating public/private ed25519 key pair.
Jan 31 01:28:04 np0005603623 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 31 01:28:04 np0005603623 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 31 01:28:04 np0005603623 cloud-init[920]: The key fingerprint is:
Jan 31 01:28:04 np0005603623 cloud-init[920]: SHA256:lVD/CJG46tfptya0X0PinbXOrbFLDl5gg5EOhd8XHdY root@np0005603623.novalocal
Jan 31 01:28:04 np0005603623 cloud-init[920]: The key's randomart image is:
Jan 31 01:28:04 np0005603623 cloud-init[920]: +--[ED25519 256]--+
Jan 31 01:28:04 np0005603623 cloud-init[920]: |        .++.   +o|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |        oo.=  o E|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |        .o*..  . |
Jan 31 01:28:04 np0005603623 cloud-init[920]: |        .+.+.o.  |
Jan 31 01:28:04 np0005603623 cloud-init[920]: |       .S o *.o .|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |      .   .o * o.|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |     .   o oo O. |
Jan 31 01:28:04 np0005603623 cloud-init[920]: |      . . =.o*o+.|
Jan 31 01:28:04 np0005603623 cloud-init[920]: |       . ..=+.==.|
Jan 31 01:28:04 np0005603623 cloud-init[920]: +----[SHA256]-----+
Jan 31 01:28:04 np0005603623 systemd[1]: Finished Cloud-init: Network Stage.
Jan 31 01:28:04 np0005603623 systemd[1]: Reached target Cloud-config availability.
Jan 31 01:28:04 np0005603623 systemd[1]: Reached target Network is Online.
Jan 31 01:28:04 np0005603623 systemd[1]: Starting Cloud-init: Config Stage...
Jan 31 01:28:04 np0005603623 systemd[1]: Starting Crash recovery kernel arming...
Jan 31 01:28:04 np0005603623 systemd[1]: Starting Notify NFS peers of a restart...
Jan 31 01:28:04 np0005603623 systemd[1]: Starting System Logging Service...
Jan 31 01:28:04 np0005603623 sm-notify[1005]: Version 2.5.4 starting
Jan 31 01:28:04 np0005603623 systemd[1]: Starting OpenSSH server daemon...
Jan 31 01:28:04 np0005603623 systemd[1]: Starting Permit User Sessions...
Jan 31 01:28:04 np0005603623 systemd[1]: Started Notify NFS peers of a restart.
Jan 31 01:28:04 np0005603623 systemd[1]: Started OpenSSH server daemon.
Jan 31 01:28:04 np0005603623 systemd[1]: Finished Permit User Sessions.
Jan 31 01:28:04 np0005603623 systemd[1]: Started Command Scheduler.
Jan 31 01:28:04 np0005603623 systemd[1]: Started Getty on tty1.
Jan 31 01:28:04 np0005603623 systemd[1]: Started Serial Getty on ttyS0.
Jan 31 01:28:04 np0005603623 systemd[1]: Reached target Login Prompts.
Jan 31 01:28:04 np0005603623 rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Jan 31 01:28:04 np0005603623 rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 31 01:28:04 np0005603623 systemd[1]: Started System Logging Service.
Jan 31 01:28:04 np0005603623 systemd[1]: Reached target Multi-User System.
Jan 31 01:28:04 np0005603623 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 31 01:28:04 np0005603623 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 31 01:28:04 np0005603623 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 31 01:28:04 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 01:28:04 np0005603623 kdumpctl[1021]: kdump: No kdump initial ramdisk found.
Jan 31 01:28:04 np0005603623 kdumpctl[1021]: kdump: Rebuilding /boot/initramfs-5.14.0-665.el9.x86_64kdump.img
Jan 31 01:28:04 np0005603623 cloud-init[1156]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Sat, 31 Jan 2026 06:28:04 +0000. Up 25.04 seconds.
Jan 31 01:28:04 np0005603623 systemd[1]: Finished Cloud-init: Config Stage.
Jan 31 01:28:04 np0005603623 systemd[1]: Starting Cloud-init: Final Stage...
Jan 31 01:28:04 np0005603623 dracut[1284]: dracut-057-102.git20250818.el9
Jan 31 01:28:04 np0005603623 dracut[1286]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-665.el9.x86_64kdump.img 5.14.0-665.el9.x86_64
Jan 31 01:28:04 np0005603623 cloud-init[1330]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Sat, 31 Jan 2026 06:28:04 +0000. Up 25.43 seconds.
Jan 31 01:28:05 np0005603623 cloud-init[1356]: #############################################################
Jan 31 01:28:05 np0005603623 cloud-init[1357]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 31 01:28:05 np0005603623 cloud-init[1359]: 256 SHA256:WRzie3NfKBd7HwFyNFCXT9pNXLU0iOsXQUzki5Peu3w root@np0005603623.novalocal (ECDSA)
Jan 31 01:28:05 np0005603623 cloud-init[1361]: 256 SHA256:lVD/CJG46tfptya0X0PinbXOrbFLDl5gg5EOhd8XHdY root@np0005603623.novalocal (ED25519)
Jan 31 01:28:05 np0005603623 cloud-init[1363]: 3072 SHA256:q95z2u9YfpjoA9a/vS5J3bdPy+pm7yH7FzDBXWgLoLk root@np0005603623.novalocal (RSA)
Jan 31 01:28:05 np0005603623 cloud-init[1364]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 31 01:28:05 np0005603623 cloud-init[1365]: #############################################################
Jan 31 01:28:05 np0005603623 cloud-init[1330]: Cloud-init v. 24.4-8.el9 finished at Sat, 31 Jan 2026 06:28:05 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 25.60 seconds
Jan 31 01:28:05 np0005603623 systemd[1]: Finished Cloud-init: Final Stage.
Jan 31 01:28:05 np0005603623 systemd[1]: Reached target Cloud-init target.
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: memstrack is not available
Jan 31 01:28:05 np0005603623 dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 31 01:28:05 np0005603623 dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 31 01:28:06 np0005603623 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 31 01:28:06 np0005603623 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 31 01:28:06 np0005603623 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 31 01:28:06 np0005603623 dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 31 01:28:06 np0005603623 dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 31 01:28:06 np0005603623 dracut[1286]: memstrack is not available
Jan 31 01:28:06 np0005603623 dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 31 01:28:06 np0005603623 dracut[1286]: *** Including module: systemd ***
Jan 31 01:28:06 np0005603623 dracut[1286]: *** Including module: fips ***
Jan 31 01:28:06 np0005603623 dracut[1286]: *** Including module: systemd-initrd ***
Jan 31 01:28:06 np0005603623 dracut[1286]: *** Including module: i18n ***
Jan 31 01:28:06 np0005603623 dracut[1286]: *** Including module: drm ***
Jan 31 01:28:06 np0005603623 dracut[1286]: *** Including module: prefixdevname ***
Jan 31 01:28:06 np0005603623 dracut[1286]: *** Including module: kernel-modules ***
Jan 31 01:28:07 np0005603623 kernel: block vda: the capability attribute has been deprecated.
Jan 31 01:28:07 np0005603623 dracut[1286]: *** Including module: kernel-modules-extra ***
Jan 31 01:28:07 np0005603623 dracut[1286]: *** Including module: qemu ***
Jan 31 01:28:07 np0005603623 dracut[1286]: *** Including module: fstab-sys ***
Jan 31 01:28:07 np0005603623 dracut[1286]: *** Including module: rootfs-block ***
Jan 31 01:28:07 np0005603623 dracut[1286]: *** Including module: terminfo ***
Jan 31 01:28:07 np0005603623 dracut[1286]: *** Including module: udev-rules ***
Jan 31 01:28:07 np0005603623 dracut[1286]: Skipping udev rule: 91-permissions.rules
Jan 31 01:28:07 np0005603623 dracut[1286]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 31 01:28:07 np0005603623 dracut[1286]: *** Including module: virtiofs ***
Jan 31 01:28:07 np0005603623 dracut[1286]: *** Including module: dracut-systemd ***
Jan 31 01:28:07 np0005603623 dracut[1286]: *** Including module: usrmount ***
Jan 31 01:28:07 np0005603623 dracut[1286]: *** Including module: base ***
Jan 31 01:28:08 np0005603623 dracut[1286]: *** Including module: fs-lib ***
Jan 31 01:28:08 np0005603623 dracut[1286]: *** Including module: kdumpbase ***
Jan 31 01:28:08 np0005603623 dracut[1286]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 31 01:28:08 np0005603623 dracut[1286]:  microcode_ctl module: mangling fw_dir
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: configuration "intel" is ignored
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 31 01:28:08 np0005603623 dracut[1286]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 31 01:28:08 np0005603623 dracut[1286]: *** Including module: openssl ***
Jan 31 01:28:08 np0005603623 dracut[1286]: *** Including module: shutdown ***
Jan 31 01:28:08 np0005603623 dracut[1286]: *** Including module: squash ***
Jan 31 01:28:08 np0005603623 dracut[1286]: *** Including modules done ***
Jan 31 01:28:08 np0005603623 dracut[1286]: *** Installing kernel module dependencies ***
Jan 31 01:28:09 np0005603623 dracut[1286]: *** Installing kernel module dependencies done ***
Jan 31 01:28:09 np0005603623 dracut[1286]: *** Resolving executable dependencies ***
Jan 31 01:28:10 np0005603623 dracut[1286]: *** Resolving executable dependencies done ***
Jan 31 01:28:11 np0005603623 dracut[1286]: *** Generating early-microcode cpio image ***
Jan 31 01:28:11 np0005603623 dracut[1286]: *** Store current command line parameters ***
Jan 31 01:28:11 np0005603623 dracut[1286]: Stored kernel commandline:
Jan 31 01:28:11 np0005603623 dracut[1286]: No dracut internal kernel commandline stored in the initramfs
Jan 31 01:28:11 np0005603623 dracut[1286]: *** Install squash loader ***
Jan 31 01:28:11 np0005603623 dracut[1286]: *** Squashing the files inside the initramfs ***
Jan 31 01:28:12 np0005603623 dracut[1286]: *** Squashing the files inside the initramfs done ***
Jan 31 01:28:12 np0005603623 dracut[1286]: *** Creating image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' ***
Jan 31 01:28:12 np0005603623 dracut[1286]: *** Hardlinking files ***
Jan 31 01:28:13 np0005603623 dracut[1286]: *** Hardlinking files done ***
Jan 31 01:28:13 np0005603623 dracut[1286]: *** Creating initramfs image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' done ***
Jan 31 01:28:13 np0005603623 kdumpctl[1021]: kdump: kexec: loaded kdump kernel
Jan 31 01:28:13 np0005603623 kdumpctl[1021]: kdump: Starting kdump: [OK]
Jan 31 01:28:13 np0005603623 systemd[1]: Finished Crash recovery kernel arming.
Jan 31 01:28:13 np0005603623 systemd[1]: Startup finished in 1.293s (kernel) + 2.356s (initrd) + 30.757s (userspace) = 34.407s.
Jan 31 01:28:16 np0005603623 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 01:29:00 np0005603623 chronyd[800]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Jan 31 01:31:01 np0005603623 systemd-logind[795]: New session 1 of user zuul.
Jan 31 01:31:01 np0005603623 systemd[1]: Created slice User Slice of UID 1000.
Jan 31 01:31:01 np0005603623 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 31 01:31:01 np0005603623 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 31 01:31:01 np0005603623 systemd[1]: Starting User Manager for UID 1000...
Jan 31 01:31:01 np0005603623 systemd[4312]: Queued start job for default target Main User Target.
Jan 31 01:31:01 np0005603623 systemd[4312]: Created slice User Application Slice.
Jan 31 01:31:01 np0005603623 systemd[4312]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 01:31:01 np0005603623 systemd[4312]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 01:31:01 np0005603623 systemd[4312]: Reached target Paths.
Jan 31 01:31:01 np0005603623 systemd[4312]: Reached target Timers.
Jan 31 01:31:01 np0005603623 systemd[4312]: Starting D-Bus User Message Bus Socket...
Jan 31 01:31:01 np0005603623 systemd[4312]: Starting Create User's Volatile Files and Directories...
Jan 31 01:31:01 np0005603623 systemd[4312]: Finished Create User's Volatile Files and Directories.
Jan 31 01:31:01 np0005603623 systemd[4312]: Listening on D-Bus User Message Bus Socket.
Jan 31 01:31:01 np0005603623 systemd[4312]: Reached target Sockets.
Jan 31 01:31:01 np0005603623 systemd[4312]: Reached target Basic System.
Jan 31 01:31:01 np0005603623 systemd[4312]: Reached target Main User Target.
Jan 31 01:31:01 np0005603623 systemd[4312]: Startup finished in 123ms.
Jan 31 01:31:01 np0005603623 systemd[1]: Started User Manager for UID 1000.
Jan 31 01:31:01 np0005603623 systemd[1]: Started Session 1 of User zuul.
Jan 31 01:31:01 np0005603623 python3[4394]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:31:36 np0005603623 python3[4422]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:31:42 np0005603623 python3[4480]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:31:43 np0005603623 python3[4520]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 31 01:31:45 np0005603623 python3[4546]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0NROJyvixYj14yxc9a1mzd1FlH8bHxigBuCuSXZp+XBwK5CWQYNe1kWs8LnwK1EvgGycvb2uWsCqXynoIDepSR4X45xPMVj2xEV2M0gYJN2FioWZRuHFYKZQNIY+ZFpOMgic6vKkz3uR6hw5OogchCCdEPofRUiDvA6imrPii/QP8S3YnwQCYwkeq72uqj4sslD467c/NglKPLZEKdfcnC4ZLM8nrcRiZwRfWls2oF0OWdbFwIn6RiwJGvZAk12ezTFzNyNkHfkadH1PD5F7tLZVrxU1P73llDzfyU8ppwjlIEtvATWFb1y5VF8VkOvjaen+/DMoFYiLvR6MUyI4JAZ7JmXxmvhLxQHPwYFTbzdjdZRYeQvPWAwtH9LWW2cdBkvLA/vGY+PSqXhb3aAM/O6R0lcyTmGVNEMRpwYZYmdoB8Cr9m2jOxZ7Ffwbs94foCUrIVlc3dkcMCTaUTrXBAqnbUteQ/Ctgp2pFJOSEse2AQi52Xm8A87QOl1wYmN8= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:31:46 np0005603623 python3[4570]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:46 np0005603623 python3[4669]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:31:47 np0005603623 python3[4740]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769841106.5330038-254-109285715928241/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=24510bce553d47bd8880c3ff7a9c0ec0_id_rsa follow=False checksum=3ef597b6e54d9d641aaa8554b0f170ef780d386a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:47 np0005603623 python3[4863]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:31:48 np0005603623 python3[4934]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769841107.5319216-308-71068351280852/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=24510bce553d47bd8880c3ff7a9c0ec0_id_rsa.pub follow=False checksum=9e6accf0af0859cd95b66274f88d88182d33bf59 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:49 np0005603623 python3[4982]: ansible-ping Invoked with data=pong
Jan 31 01:31:50 np0005603623 python3[5006]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:31:52 np0005603623 python3[5064]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 31 01:31:53 np0005603623 python3[5096]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:53 np0005603623 python3[5120]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:53 np0005603623 python3[5144]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:54 np0005603623 python3[5168]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:54 np0005603623 python3[5192]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:54 np0005603623 python3[5216]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:56 np0005603623 python3[5242]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:57 np0005603623 python3[5320]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:31:58 np0005603623 python3[5393]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769841117.1374288-34-255637883802381/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:59 np0005603623 python3[5441]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:31:59 np0005603623 python3[5465]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:31:59 np0005603623 python3[5489]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:31:59 np0005603623 python3[5513]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:00 np0005603623 python3[5537]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:00 np0005603623 python3[5561]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:00 np0005603623 python3[5585]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:01 np0005603623 python3[5609]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:01 np0005603623 python3[5633]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:01 np0005603623 python3[5657]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:01 np0005603623 python3[5681]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:02 np0005603623 python3[5705]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:02 np0005603623 python3[5729]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:02 np0005603623 python3[5753]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:03 np0005603623 python3[5777]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:03 np0005603623 python3[5801]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:03 np0005603623 python3[5825]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:03 np0005603623 python3[5849]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:04 np0005603623 python3[5873]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:04 np0005603623 python3[5897]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:04 np0005603623 python3[5921]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:05 np0005603623 python3[5945]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:05 np0005603623 python3[5969]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:05 np0005603623 python3[5993]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:06 np0005603623 python3[6017]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:06 np0005603623 python3[6041]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:09 np0005603623 python3[6067]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 31 01:32:09 np0005603623 systemd[1]: Starting Time & Date Service...
Jan 31 01:32:09 np0005603623 systemd[1]: Started Time & Date Service.
Jan 31 01:32:09 np0005603623 systemd-timedated[6069]: Changed time zone to 'UTC' (UTC).
Jan 31 01:32:10 np0005603623 python3[6098]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:32:10 np0005603623 python3[6174]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:32:11 np0005603623 python3[6245]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769841130.6899168-255-198103277561233/source _original_basename=tmprlx3m9zi follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:32:11 np0005603623 python3[6345]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:32:11 np0005603623 python3[6416]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769841131.4959953-305-12361971088235/source _original_basename=tmp_loyxpkk follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:32:12 np0005603623 python3[6518]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:32:13 np0005603623 python3[6591]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769841132.619544-384-25844121642166/source _original_basename=tmp1yhhx5wv follow=False checksum=54ceff67f46a00e80734f8bde7b737fc4d565204 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:32:13 np0005603623 python3[6639]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:32:14 np0005603623 python3[6665]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:32:14 np0005603623 python3[6745]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:32:15 np0005603623 python3[6818]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769841134.6798885-454-238250124163578/source _original_basename=tmpug1uy7fk follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:32:15 np0005603623 python3[6869]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-b97d-7d19-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:32:16 np0005603623 python3[6897]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-b97d-7d19-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 31 01:32:18 np0005603623 python3[6925]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:32:39 np0005603623 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 31 01:32:52 np0005603623 python3[6953]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:33:09 np0005603623 systemd[4312]: Starting Mark boot as successful...
Jan 31 01:33:10 np0005603623 systemd[4312]: Finished Mark boot as successful.
Jan 31 01:33:52 np0005603623 systemd-logind[795]: Session 1 logged out. Waiting for processes to exit.
Jan 31 01:34:25 np0005603623 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 31 01:34:25 np0005603623 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 31 01:34:25 np0005603623 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 31 01:34:25 np0005603623 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 31 01:34:25 np0005603623 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 31 01:34:25 np0005603623 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 31 01:34:25 np0005603623 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 31 01:34:25 np0005603623 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 31 01:34:25 np0005603623 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 31 01:34:25 np0005603623 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 31 01:34:25 np0005603623 NetworkManager[858]: <info>  [1769841265.4245] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 31 01:34:25 np0005603623 systemd-udevd[6956]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:34:25 np0005603623 NetworkManager[858]: <info>  [1769841265.4437] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:34:25 np0005603623 NetworkManager[858]: <info>  [1769841265.4468] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 31 01:34:25 np0005603623 NetworkManager[858]: <info>  [1769841265.4472] device (eth1): carrier: link connected
Jan 31 01:34:25 np0005603623 NetworkManager[858]: <info>  [1769841265.4474] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 31 01:34:25 np0005603623 NetworkManager[858]: <info>  [1769841265.4480] policy: auto-activating connection 'Wired connection 1' (723a2653-d37e-330e-8a33-ebd92ee3bb06)
Jan 31 01:34:25 np0005603623 NetworkManager[858]: <info>  [1769841265.4484] device (eth1): Activation: starting connection 'Wired connection 1' (723a2653-d37e-330e-8a33-ebd92ee3bb06)
Jan 31 01:34:25 np0005603623 NetworkManager[858]: <info>  [1769841265.4485] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:34:25 np0005603623 NetworkManager[858]: <info>  [1769841265.4488] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:34:25 np0005603623 NetworkManager[858]: <info>  [1769841265.4493] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:34:25 np0005603623 NetworkManager[858]: <info>  [1769841265.4497] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:34:26 np0005603623 systemd-logind[795]: New session 3 of user zuul.
Jan 31 01:34:26 np0005603623 systemd[1]: Started Session 3 of User zuul.
Jan 31 01:34:26 np0005603623 python3[6986]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-922d-b924-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:34:36 np0005603623 python3[7066]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:34:36 np0005603623 python3[7139]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769841275.9689302-206-143786327065383/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=568671ca46ed465cac0fa303924e63ccf4daf2e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:34:36 np0005603623 python3[7189]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:34:37 np0005603623 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 31 01:34:37 np0005603623 systemd[1]: Stopped Network Manager Wait Online.
Jan 31 01:34:37 np0005603623 systemd[1]: Stopping Network Manager Wait Online...
Jan 31 01:34:37 np0005603623 systemd[1]: Stopping Network Manager...
Jan 31 01:34:37 np0005603623 NetworkManager[858]: <info>  [1769841277.0261] caught SIGTERM, shutting down normally.
Jan 31 01:34:37 np0005603623 NetworkManager[858]: <info>  [1769841277.0272] dhcp4 (eth0): canceled DHCP transaction
Jan 31 01:34:37 np0005603623 NetworkManager[858]: <info>  [1769841277.0273] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:34:37 np0005603623 NetworkManager[858]: <info>  [1769841277.0273] dhcp4 (eth0): state changed no lease
Jan 31 01:34:37 np0005603623 NetworkManager[858]: <info>  [1769841277.0276] manager: NetworkManager state is now CONNECTING
Jan 31 01:34:37 np0005603623 NetworkManager[858]: <info>  [1769841277.0374] dhcp4 (eth1): canceled DHCP transaction
Jan 31 01:34:37 np0005603623 NetworkManager[858]: <info>  [1769841277.0375] dhcp4 (eth1): state changed no lease
Jan 31 01:34:37 np0005603623 NetworkManager[858]: <info>  [1769841277.0419] exiting (success)
Jan 31 01:34:37 np0005603623 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 01:34:37 np0005603623 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 01:34:37 np0005603623 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 31 01:34:37 np0005603623 systemd[1]: Stopped Network Manager.
Jan 31 01:34:37 np0005603623 systemd[1]: NetworkManager.service: Consumed 3.273s CPU time, 10.2M memory peak.
Jan 31 01:34:37 np0005603623 systemd[1]: Starting Network Manager...
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.0935] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:0253bde7-2e17-45fb-a5ca-f6e9e69beb56)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.0938] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.0989] manager[0x555e46b39000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 31 01:34:37 np0005603623 systemd[1]: Starting Hostname Service...
Jan 31 01:34:37 np0005603623 systemd[1]: Started Hostname Service.
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1592] hostname: hostname: using hostnamed
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1592] hostname: static hostname changed from (none) to "np0005603623.novalocal"
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1599] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1604] manager[0x555e46b39000]: rfkill: Wi-Fi hardware radio set enabled
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1604] manager[0x555e46b39000]: rfkill: WWAN hardware radio set enabled
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1630] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1631] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1631] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1632] manager: Networking is enabled by state file
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1634] settings: Loaded settings plugin: keyfile (internal)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1640] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1660] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1668] dhcp: init: Using DHCP client 'internal'
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1670] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1674] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1678] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1683] device (lo): Activation: starting connection 'lo' (b03d3090-8131-4757-b469-75b0afcb0833)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1688] device (eth0): carrier: link connected
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1690] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1693] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1694] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1697] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1702] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1705] device (eth1): carrier: link connected
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1708] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1712] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (723a2653-d37e-330e-8a33-ebd92ee3bb06) (indicated)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1712] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1715] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1719] device (eth1): Activation: starting connection 'Wired connection 1' (723a2653-d37e-330e-8a33-ebd92ee3bb06)
Jan 31 01:34:37 np0005603623 systemd[1]: Started Network Manager.
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1724] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1739] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1741] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1742] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1745] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1746] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1748] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1749] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1751] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1755] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1757] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1763] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1764] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1790] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1795] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1799] device (lo): Activation: successful, device activated.
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1809] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.1815] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 31 01:34:37 np0005603623 systemd[1]: Starting Network Manager Wait Online...
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.2195] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.2210] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.2212] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.2216] manager: NetworkManager state is now CONNECTED_SITE
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.2219] device (eth0): Activation: successful, device activated.
Jan 31 01:34:37 np0005603623 NetworkManager[7199]: <info>  [1769841277.2226] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 31 01:34:37 np0005603623 python3[7274]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-922d-b924-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:34:47 np0005603623 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 01:35:07 np0005603623 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5125] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 31 01:35:22 np0005603623 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 01:35:22 np0005603623 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5373] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5376] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5381] device (eth1): Activation: successful, device activated.
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5388] manager: startup complete
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5390] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <warn>  [1769841322.5397] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5406] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 31 01:35:22 np0005603623 systemd[1]: Finished Network Manager Wait Online.
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5515] dhcp4 (eth1): canceled DHCP transaction
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5516] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5516] dhcp4 (eth1): state changed no lease
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5531] policy: auto-activating connection 'ci-private-network' (30f6d43e-2343-50fd-9644-39fd990a5838)
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5536] device (eth1): Activation: starting connection 'ci-private-network' (30f6d43e-2343-50fd-9644-39fd990a5838)
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5537] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5539] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5545] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.5552] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.6214] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.6217] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:35:22 np0005603623 NetworkManager[7199]: <info>  [1769841322.6223] device (eth1): Activation: successful, device activated.
Jan 31 01:35:32 np0005603623 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 01:35:37 np0005603623 systemd[1]: session-3.scope: Deactivated successfully.
Jan 31 01:35:37 np0005603623 systemd[1]: session-3.scope: Consumed 1.341s CPU time.
Jan 31 01:35:37 np0005603623 systemd-logind[795]: Session 3 logged out. Waiting for processes to exit.
Jan 31 01:35:37 np0005603623 systemd-logind[795]: Removed session 3.
Jan 31 01:35:57 np0005603623 systemd-logind[795]: New session 4 of user zuul.
Jan 31 01:35:57 np0005603623 systemd[1]: Started Session 4 of User zuul.
Jan 31 01:35:57 np0005603623 python3[7387]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:35:58 np0005603623 python3[7460]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769841357.4699285-373-10035422320134/source _original_basename=tmplht2wh38 follow=False checksum=be2f7c16edb43e88d00ebc0882f9b60a044e6a9d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:36:00 np0005603623 systemd[1]: session-4.scope: Deactivated successfully.
Jan 31 01:36:00 np0005603623 systemd-logind[795]: Session 4 logged out. Waiting for processes to exit.
Jan 31 01:36:00 np0005603623 systemd-logind[795]: Removed session 4.
Jan 31 01:36:09 np0005603623 systemd[4312]: Created slice User Background Tasks Slice.
Jan 31 01:36:09 np0005603623 systemd[4312]: Starting Cleanup of User's Temporary Files and Directories...
Jan 31 01:36:09 np0005603623 systemd[4312]: Finished Cleanup of User's Temporary Files and Directories.
Jan 31 01:43:09 np0005603623 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 31 01:43:09 np0005603623 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 31 01:43:09 np0005603623 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 31 01:43:09 np0005603623 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 31 01:46:42 np0005603623 systemd-logind[795]: New session 5 of user zuul.
Jan 31 01:46:42 np0005603623 systemd[1]: Started Session 5 of User zuul.
Jan 31 01:46:42 np0005603623 python3[7520]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-bf7f-c771-000000000cb6-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:46:43 np0005603623 python3[7548]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:46:43 np0005603623 python3[7574]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:46:43 np0005603623 python3[7601]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:46:44 np0005603623 python3[7627]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:46:44 np0005603623 python3[7653]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:46:45 np0005603623 python3[7731]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:46:45 np0005603623 python3[7804]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769842004.9637814-378-20782885163639/source _original_basename=tmpaxzbimis follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:46:46 np0005603623 python3[7854]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:46:46 np0005603623 systemd[1]: Reloading.
Jan 31 01:46:46 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:46:48 np0005603623 python3[7910]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 31 01:46:48 np0005603623 python3[7936]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:46:49 np0005603623 python3[7964]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:46:49 np0005603623 python3[7992]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:46:49 np0005603623 python3[8020]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:46:50 np0005603623 python3[8047]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-bf7f-c771-000000000cbd-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:46:50 np0005603623 python3[8077]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 31 01:46:53 np0005603623 systemd-logind[795]: Session 5 logged out. Waiting for processes to exit.
Jan 31 01:46:53 np0005603623 systemd[1]: session-5.scope: Deactivated successfully.
Jan 31 01:46:53 np0005603623 systemd[1]: session-5.scope: Consumed 3.990s CPU time.
Jan 31 01:46:53 np0005603623 systemd-logind[795]: Removed session 5.
Jan 31 01:46:55 np0005603623 irqbalance[789]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 31 01:46:55 np0005603623 irqbalance[789]: IRQ 27 affinity is now unmanaged
Jan 31 01:46:55 np0005603623 systemd-logind[795]: New session 6 of user zuul.
Jan 31 01:46:55 np0005603623 systemd[1]: Started Session 6 of User zuul.
Jan 31 01:46:56 np0005603623 python3[8111]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 31 01:47:05 np0005603623 setsebool[8154]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 31 01:47:05 np0005603623 setsebool[8154]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 31 01:47:20 np0005603623 kernel: SELinux:  Converting 385 SID table entries...
Jan 31 01:47:20 np0005603623 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 01:47:20 np0005603623 kernel: SELinux:  policy capability open_perms=1
Jan 31 01:47:20 np0005603623 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 01:47:20 np0005603623 kernel: SELinux:  policy capability always_check_network=0
Jan 31 01:47:20 np0005603623 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 01:47:20 np0005603623 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 01:47:20 np0005603623 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 01:47:34 np0005603623 kernel: SELinux:  Converting 388 SID table entries...
Jan 31 01:47:34 np0005603623 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 01:47:34 np0005603623 kernel: SELinux:  policy capability open_perms=1
Jan 31 01:47:34 np0005603623 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 01:47:34 np0005603623 kernel: SELinux:  policy capability always_check_network=0
Jan 31 01:47:34 np0005603623 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 01:47:34 np0005603623 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 01:47:34 np0005603623 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 01:47:54 np0005603623 dbus-broker-launch[782]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 31 01:47:54 np0005603623 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 01:47:54 np0005603623 systemd[1]: Starting man-db-cache-update.service...
Jan 31 01:47:54 np0005603623 systemd[1]: Reloading.
Jan 31 01:47:54 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:47:55 np0005603623 systemd[1]: Starting dnf makecache...
Jan 31 01:47:55 np0005603623 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 01:47:55 np0005603623 dnf[8993]: Failed determining last makecache time.
Jan 31 01:47:55 np0005603623 dnf[8993]: CentOS Stream 9 - BaseOS                         38 kB/s | 6.1 kB     00:00
Jan 31 01:47:56 np0005603623 dnf[8993]: CentOS Stream 9 - AppStream                      60 kB/s | 6.5 kB     00:00
Jan 31 01:47:56 np0005603623 dnf[8993]: CentOS Stream 9 - CRB                            52 kB/s | 6.0 kB     00:00
Jan 31 01:47:56 np0005603623 dnf[8993]: CentOS Stream 9 - Extras packages                67 kB/s | 7.3 kB     00:00
Jan 31 01:47:56 np0005603623 dnf[8993]: Metadata cache created.
Jan 31 01:47:56 np0005603623 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 31 01:47:56 np0005603623 systemd[1]: Finished dnf makecache.
Jan 31 01:48:06 np0005603623 python3[15390]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-c34d-28f4-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:48:07 np0005603623 kernel: evm: overlay not supported
Jan 31 01:48:07 np0005603623 systemd[4312]: Starting D-Bus User Message Bus...
Jan 31 01:48:07 np0005603623 dbus-broker-launch[15903]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 31 01:48:07 np0005603623 dbus-broker-launch[15903]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 31 01:48:07 np0005603623 systemd[4312]: Started D-Bus User Message Bus.
Jan 31 01:48:07 np0005603623 dbus-broker-lau[15903]: Ready
Jan 31 01:48:07 np0005603623 systemd[4312]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 31 01:48:07 np0005603623 systemd[4312]: Created slice Slice /user.
Jan 31 01:48:07 np0005603623 systemd[4312]: podman-15827.scope: unit configures an IP firewall, but not running as root.
Jan 31 01:48:07 np0005603623 systemd[4312]: (This warning is only shown for the first unit using IP firewalling.)
Jan 31 01:48:07 np0005603623 systemd[4312]: Started podman-15827.scope.
Jan 31 01:48:07 np0005603623 systemd[4312]: Started podman-pause-590181f5.scope.
Jan 31 01:48:08 np0005603623 python3[16250]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.2:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.2:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:48:08 np0005603623 python3[16250]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 31 01:48:08 np0005603623 systemd[1]: session-6.scope: Deactivated successfully.
Jan 31 01:48:08 np0005603623 systemd[1]: session-6.scope: Consumed 53.790s CPU time.
Jan 31 01:48:08 np0005603623 systemd-logind[795]: Session 6 logged out. Waiting for processes to exit.
Jan 31 01:48:08 np0005603623 systemd-logind[795]: Removed session 6.
Jan 31 01:48:33 np0005603623 systemd-logind[795]: New session 7 of user zuul.
Jan 31 01:48:33 np0005603623 systemd[1]: Started Session 7 of User zuul.
Jan 31 01:48:33 np0005603623 python3[26558]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAYzMDYbycOT72ga9wDhD1NtUc7onT0cFXCjwfAnzaB2tvlINsgaQbDQ5ZwqYE9Er0Wi02qKQ4UqK2RbEye6MZA= zuul@np0005603620.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:48:34 np0005603623 python3[26938]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAYzMDYbycOT72ga9wDhD1NtUc7onT0cFXCjwfAnzaB2tvlINsgaQbDQ5ZwqYE9Er0Wi02qKQ4UqK2RbEye6MZA= zuul@np0005603620.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:48:35 np0005603623 python3[27347]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005603623.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 31 01:48:35 np0005603623 python3[27674]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAYzMDYbycOT72ga9wDhD1NtUc7onT0cFXCjwfAnzaB2tvlINsgaQbDQ5ZwqYE9Er0Wi02qKQ4UqK2RbEye6MZA= zuul@np0005603620.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:48:36 np0005603623 python3[27945]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:48:36 np0005603623 python3[28195]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769842116.1128201-170-96188127030303/source _original_basename=tmp1gunbo45 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:48:37 np0005603623 python3[28562]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Jan 31 01:48:37 np0005603623 systemd[1]: Starting Hostname Service...
Jan 31 01:48:37 np0005603623 systemd[1]: Started Hostname Service.
Jan 31 01:48:37 np0005603623 systemd-hostnamed[28676]: Changed pretty hostname to 'compute-2'
Jan 31 01:48:37 np0005603623 systemd-hostnamed[28676]: Hostname set to <compute-2> (static)
Jan 31 01:48:37 np0005603623 NetworkManager[7199]: <info>  [1769842117.8278] hostname: static hostname changed from "np0005603623.novalocal" to "compute-2"
Jan 31 01:48:37 np0005603623 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 01:48:37 np0005603623 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 01:48:38 np0005603623 systemd[1]: session-7.scope: Deactivated successfully.
Jan 31 01:48:38 np0005603623 systemd[1]: session-7.scope: Consumed 2.179s CPU time.
Jan 31 01:48:38 np0005603623 systemd-logind[795]: Session 7 logged out. Waiting for processes to exit.
Jan 31 01:48:38 np0005603623 systemd-logind[795]: Removed session 7.
Jan 31 01:48:40 np0005603623 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 01:48:40 np0005603623 systemd[1]: Finished man-db-cache-update.service.
Jan 31 01:48:40 np0005603623 systemd[1]: man-db-cache-update.service: Consumed 52.157s CPU time.
Jan 31 01:48:40 np0005603623 systemd[1]: run-rb6dec8909e2b4a96a26e3eb24ab9413a.service: Deactivated successfully.
Jan 31 01:48:47 np0005603623 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 01:49:07 np0005603623 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 01:53:20 np0005603623 systemd-logind[795]: New session 8 of user zuul.
Jan 31 01:53:20 np0005603623 systemd[1]: Started Session 8 of User zuul.
Jan 31 01:53:21 np0005603623 python3[30087]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:53:22 np0005603623 python3[30203]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:23 np0005603623 python3[30276]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5899653-34076-255588692931202/source mode=0755 _original_basename=delorean.repo follow=False checksum=cc4ab4695da8ec58c451521a3dd2f41014af145d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:23 np0005603623 python3[30302]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:23 np0005603623 python3[30375]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5899653-34076-255588692931202/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:24 np0005603623 python3[30401]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:24 np0005603623 python3[30474]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5899653-34076-255588692931202/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:24 np0005603623 python3[30500]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:25 np0005603623 python3[30573]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5899653-34076-255588692931202/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:25 np0005603623 python3[30599]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:25 np0005603623 python3[30672]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5899653-34076-255588692931202/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:25 np0005603623 python3[30698]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:26 np0005603623 python3[30771]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5899653-34076-255588692931202/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:26 np0005603623 python3[30797]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:26 np0005603623 python3[30870]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5899653-34076-255588692931202/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=362a603578148d54e8cd25942b88d7f471cc677a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:38 np0005603623 python3[30918]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:58:37 np0005603623 systemd[1]: session-8.scope: Deactivated successfully.
Jan 31 01:58:37 np0005603623 systemd[1]: session-8.scope: Consumed 4.894s CPU time.
Jan 31 01:58:37 np0005603623 systemd-logind[795]: Session 8 logged out. Waiting for processes to exit.
Jan 31 01:58:37 np0005603623 systemd-logind[795]: Removed session 8.
Jan 31 02:10:44 np0005603623 systemd-logind[795]: New session 9 of user zuul.
Jan 31 02:10:44 np0005603623 systemd[1]: Started Session 9 of User zuul.
Jan 31 02:10:45 np0005603623 python3.9[31096]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:10:46 np0005603623 python3.9[31277]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:10:55 np0005603623 systemd[1]: session-9.scope: Deactivated successfully.
Jan 31 02:10:55 np0005603623 systemd[1]: session-9.scope: Consumed 8.597s CPU time.
Jan 31 02:10:55 np0005603623 systemd-logind[795]: Session 9 logged out. Waiting for processes to exit.
Jan 31 02:10:55 np0005603623 systemd-logind[795]: Removed session 9.
Jan 31 02:11:15 np0005603623 systemd-logind[795]: New session 10 of user zuul.
Jan 31 02:11:15 np0005603623 systemd[1]: Started Session 10 of User zuul.
Jan 31 02:11:16 np0005603623 python3.9[31490]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 31 02:11:17 np0005603623 python3.9[31664]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:11:18 np0005603623 python3.9[31816]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:11:19 np0005603623 python3.9[31969]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:11:21 np0005603623 python3.9[32121]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:11:21 np0005603623 python3.9[32273]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:11:22 np0005603623 python3.9[32396]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843481.3096619-179-65002520114121/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:11:23 np0005603623 python3.9[32548]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:11:23 np0005603623 python3.9[32704]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:11:24 np0005603623 python3.9[32856]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:11:25 np0005603623 python3.9[33006]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:11:29 np0005603623 python3.9[33259]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:11:30 np0005603623 python3.9[33409]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:11:31 np0005603623 python3.9[33563]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:11:32 np0005603623 python3.9[33721]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:11:33 np0005603623 python3.9[33805]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:12:19 np0005603623 systemd[1]: Reloading.
Jan 31 02:12:19 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:12:19 np0005603623 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 31 02:12:19 np0005603623 systemd[1]: Reloading.
Jan 31 02:12:19 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:12:20 np0005603623 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 31 02:12:20 np0005603623 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 31 02:12:20 np0005603623 systemd[1]: Reloading.
Jan 31 02:12:20 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:12:20 np0005603623 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 31 02:12:20 np0005603623 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Jan 31 02:12:20 np0005603623 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Jan 31 02:12:20 np0005603623 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Jan 31 02:13:21 np0005603623 kernel: SELinux:  Converting 2727 SID table entries...
Jan 31 02:13:21 np0005603623 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 02:13:21 np0005603623 kernel: SELinux:  policy capability open_perms=1
Jan 31 02:13:21 np0005603623 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 02:13:21 np0005603623 kernel: SELinux:  policy capability always_check_network=0
Jan 31 02:13:21 np0005603623 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 02:13:21 np0005603623 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 02:13:21 np0005603623 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 02:13:21 np0005603623 dbus-broker-launch[782]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 31 02:13:21 np0005603623 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:13:21 np0005603623 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:13:21 np0005603623 systemd[1]: Reloading.
Jan 31 02:13:21 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:13:21 np0005603623 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:13:22 np0005603623 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:13:22 np0005603623 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:13:22 np0005603623 systemd[1]: run-rf4315b0e44824b6f83627de4d9b48c88.service: Deactivated successfully.
Jan 31 02:13:42 np0005603623 python3.9[35311]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:13:45 np0005603623 python3.9[35592]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 31 02:13:46 np0005603623 python3.9[35744]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 31 02:13:49 np0005603623 python3.9[35897]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:13:50 np0005603623 python3.9[36049]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 31 02:13:52 np0005603623 python3.9[36201]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:13:58 np0005603623 python3.9[36353]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:13:59 np0005603623 python3.9[36476]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843632.315424-669-61040167792608/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:14:00 np0005603623 python3.9[36628]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:14:01 np0005603623 python3.9[36780]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:14:02 np0005603623 python3.9[36933]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:14:03 np0005603623 python3.9[37085]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 31 02:14:03 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:14:04 np0005603623 python3.9[37239]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 02:14:06 np0005603623 python3.9[37397]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 02:14:06 np0005603623 python3.9[37557]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 31 02:14:07 np0005603623 python3.9[37710]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 02:14:08 np0005603623 python3.9[37868]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 31 02:14:09 np0005603623 python3.9[38020]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:14:13 np0005603623 python3.9[38173]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:14:14 np0005603623 python3.9[38325]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:14:14 np0005603623 python3.9[38448]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769843653.6630588-1026-250553284707415/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:14:15 np0005603623 python3.9[38600]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:14:15 np0005603623 systemd[1]: Starting Load Kernel Modules...
Jan 31 02:14:15 np0005603623 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 31 02:14:15 np0005603623 kernel: Bridge firewalling registered
Jan 31 02:14:15 np0005603623 systemd-modules-load[38604]: Inserted module 'br_netfilter'
Jan 31 02:14:15 np0005603623 systemd[1]: Finished Load Kernel Modules.
Jan 31 02:14:16 np0005603623 python3.9[38759]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:14:17 np0005603623 python3.9[38882]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769843656.1116285-1095-168170022345203/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:14:18 np0005603623 python3.9[39034]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:14:22 np0005603623 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Jan 31 02:14:22 np0005603623 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Jan 31 02:14:23 np0005603623 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:14:23 np0005603623 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:14:23 np0005603623 systemd[1]: Reloading.
Jan 31 02:14:23 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:14:23 np0005603623 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:14:25 np0005603623 python3.9[41004]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:14:26 np0005603623 python3.9[42456]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 31 02:14:27 np0005603623 python3.9[43085]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:14:27 np0005603623 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:14:27 np0005603623 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:14:27 np0005603623 systemd[1]: man-db-cache-update.service: Consumed 3.487s CPU time.
Jan 31 02:14:27 np0005603623 systemd[1]: run-r4881d884b61c4a1599dd191004df7e22.service: Deactivated successfully.
Jan 31 02:14:27 np0005603623 python3.9[43238]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:14:28 np0005603623 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 31 02:14:28 np0005603623 systemd[1]: Starting Authorization Manager...
Jan 31 02:14:28 np0005603623 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 31 02:14:28 np0005603623 polkitd[43456]: Started polkitd version 0.117
Jan 31 02:14:28 np0005603623 systemd[1]: Started Authorization Manager.
Jan 31 02:14:29 np0005603623 python3.9[43626]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:14:29 np0005603623 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 31 02:14:29 np0005603623 systemd[1]: tuned.service: Deactivated successfully.
Jan 31 02:14:29 np0005603623 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 31 02:14:29 np0005603623 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 31 02:14:29 np0005603623 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 31 02:14:30 np0005603623 python3.9[43788]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 31 02:14:34 np0005603623 python3.9[43940]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:14:34 np0005603623 systemd[1]: Reloading.
Jan 31 02:14:34 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:14:35 np0005603623 python3.9[44128]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:14:35 np0005603623 systemd[1]: Reloading.
Jan 31 02:14:35 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:14:35 np0005603623 irqbalance[789]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 31 02:14:35 np0005603623 irqbalance[789]: IRQ 26 affinity is now unmanaged
Jan 31 02:14:36 np0005603623 python3.9[44317]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:14:37 np0005603623 python3.9[44470]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:14:37 np0005603623 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 31 02:14:37 np0005603623 python3.9[44623]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:14:39 np0005603623 python3.9[44785]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:14:40 np0005603623 python3.9[44938]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:14:40 np0005603623 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 31 02:14:40 np0005603623 systemd[1]: Stopped Apply Kernel Variables.
Jan 31 02:14:40 np0005603623 systemd[1]: Stopping Apply Kernel Variables...
Jan 31 02:14:40 np0005603623 systemd[1]: Starting Apply Kernel Variables...
Jan 31 02:14:40 np0005603623 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 31 02:14:40 np0005603623 systemd[1]: Finished Apply Kernel Variables.
Jan 31 02:14:41 np0005603623 systemd[1]: session-10.scope: Deactivated successfully.
Jan 31 02:14:41 np0005603623 systemd[1]: session-10.scope: Consumed 2min 7.473s CPU time.
Jan 31 02:14:41 np0005603623 systemd-logind[795]: Session 10 logged out. Waiting for processes to exit.
Jan 31 02:14:41 np0005603623 systemd-logind[795]: Removed session 10.
Jan 31 02:14:46 np0005603623 systemd-logind[795]: New session 11 of user zuul.
Jan 31 02:14:46 np0005603623 systemd[1]: Started Session 11 of User zuul.
Jan 31 02:14:47 np0005603623 python3.9[45121]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:14:49 np0005603623 python3.9[45277]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 31 02:14:49 np0005603623 python3.9[45430]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 02:14:50 np0005603623 python3.9[45588]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 02:14:51 np0005603623 python3.9[45748]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:14:52 np0005603623 python3.9[45832]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 02:14:56 np0005603623 python3.9[45996]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:15:10 np0005603623 kernel: SELinux:  Converting 2739 SID table entries...
Jan 31 02:15:10 np0005603623 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 02:15:10 np0005603623 kernel: SELinux:  policy capability open_perms=1
Jan 31 02:15:10 np0005603623 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 02:15:10 np0005603623 kernel: SELinux:  policy capability always_check_network=0
Jan 31 02:15:10 np0005603623 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 02:15:10 np0005603623 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 02:15:10 np0005603623 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 02:15:11 np0005603623 dbus-broker-launch[782]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 31 02:15:11 np0005603623 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 31 02:15:12 np0005603623 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:15:12 np0005603623 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:15:12 np0005603623 systemd[1]: Reloading.
Jan 31 02:15:12 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:15:12 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:15:12 np0005603623 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:15:13 np0005603623 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:15:13 np0005603623 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:15:13 np0005603623 systemd[1]: run-r5f44b6f82f4140759bde1600e2a0e694.service: Deactivated successfully.
Jan 31 02:15:15 np0005603623 python3.9[47094]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:15:15 np0005603623 systemd[1]: Reloading.
Jan 31 02:15:15 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:15:15 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:15:15 np0005603623 systemd[1]: Starting Open vSwitch Database Unit...
Jan 31 02:15:15 np0005603623 chown[47136]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 31 02:15:15 np0005603623 ovs-ctl[47141]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 31 02:15:15 np0005603623 ovs-ctl[47141]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 31 02:15:15 np0005603623 ovs-ctl[47141]: Starting ovsdb-server [  OK  ]
Jan 31 02:15:15 np0005603623 ovs-vsctl[47190]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 31 02:15:16 np0005603623 ovs-vsctl[47208]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"7ec8bf38-9571-4400-a85c-6bd5ac54bdf3\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 31 02:15:16 np0005603623 ovs-ctl[47141]: Configuring Open vSwitch system IDs [  OK  ]
Jan 31 02:15:16 np0005603623 ovs-ctl[47141]: Enabling remote OVSDB managers [  OK  ]
Jan 31 02:15:16 np0005603623 systemd[1]: Started Open vSwitch Database Unit.
Jan 31 02:15:16 np0005603623 ovs-vsctl[47216]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 31 02:15:16 np0005603623 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 31 02:15:16 np0005603623 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 31 02:15:16 np0005603623 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 31 02:15:16 np0005603623 kernel: openvswitch: Open vSwitch switching datapath
Jan 31 02:15:16 np0005603623 ovs-ctl[47261]: Inserting openvswitch module [  OK  ]
Jan 31 02:15:16 np0005603623 ovs-ctl[47230]: Starting ovs-vswitchd [  OK  ]
Jan 31 02:15:16 np0005603623 ovs-vsctl[47278]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 31 02:15:16 np0005603623 ovs-ctl[47230]: Enabling remote OVSDB managers [  OK  ]
Jan 31 02:15:16 np0005603623 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 31 02:15:16 np0005603623 systemd[1]: Starting Open vSwitch...
Jan 31 02:15:16 np0005603623 systemd[1]: Finished Open vSwitch.
Jan 31 02:15:17 np0005603623 python3.9[47430]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:15:18 np0005603623 python3.9[47582]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 31 02:15:20 np0005603623 kernel: SELinux:  Converting 2753 SID table entries...
Jan 31 02:15:20 np0005603623 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 02:15:20 np0005603623 kernel: SELinux:  policy capability open_perms=1
Jan 31 02:15:20 np0005603623 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 02:15:20 np0005603623 kernel: SELinux:  policy capability always_check_network=0
Jan 31 02:15:20 np0005603623 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 02:15:20 np0005603623 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 02:15:20 np0005603623 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 02:15:22 np0005603623 python3.9[47738]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:15:23 np0005603623 dbus-broker-launch[782]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 31 02:15:23 np0005603623 python3.9[47896]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:15:25 np0005603623 python3.9[48049]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:15:27 np0005603623 python3.9[48336]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 31 02:15:28 np0005603623 python3.9[48486]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:15:29 np0005603623 python3.9[48640]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:15:31 np0005603623 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:15:31 np0005603623 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:15:31 np0005603623 systemd[1]: Reloading.
Jan 31 02:15:31 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:15:31 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:15:31 np0005603623 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:15:31 np0005603623 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:15:31 np0005603623 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:15:31 np0005603623 systemd[1]: run-r09e1feaa5c604c60857f02358a864a30.service: Deactivated successfully.
Jan 31 02:15:32 np0005603623 python3.9[48957]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:15:33 np0005603623 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 31 02:15:33 np0005603623 systemd[1]: Stopped Network Manager Wait Online.
Jan 31 02:15:33 np0005603623 systemd[1]: Stopping Network Manager Wait Online...
Jan 31 02:15:33 np0005603623 systemd[1]: Stopping Network Manager...
Jan 31 02:15:33 np0005603623 NetworkManager[7199]: <info>  [1769843733.0284] caught SIGTERM, shutting down normally.
Jan 31 02:15:33 np0005603623 NetworkManager[7199]: <info>  [1769843733.0300] dhcp4 (eth0): canceled DHCP transaction
Jan 31 02:15:33 np0005603623 NetworkManager[7199]: <info>  [1769843733.0301] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 02:15:33 np0005603623 NetworkManager[7199]: <info>  [1769843733.0301] dhcp4 (eth0): state changed no lease
Jan 31 02:15:33 np0005603623 NetworkManager[7199]: <info>  [1769843733.0303] manager: NetworkManager state is now CONNECTED_SITE
Jan 31 02:15:33 np0005603623 NetworkManager[7199]: <info>  [1769843733.0362] exiting (success)
Jan 31 02:15:33 np0005603623 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 02:15:33 np0005603623 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 02:15:33 np0005603623 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 31 02:15:33 np0005603623 systemd[1]: Stopped Network Manager.
Jan 31 02:15:33 np0005603623 systemd[1]: NetworkManager.service: Consumed 24.341s CPU time, 4.1M memory peak, read 0B from disk, written 28.0K to disk.
Jan 31 02:15:33 np0005603623 systemd[1]: Starting Network Manager...
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.0989] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:0253bde7-2e17-45fb-a5ca-f6e9e69beb56)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.0992] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1045] manager[0x55aa46e4e000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 31 02:15:33 np0005603623 systemd[1]: Starting Hostname Service...
Jan 31 02:15:33 np0005603623 systemd[1]: Started Hostname Service.
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1754] hostname: hostname: using hostnamed
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1754] hostname: static hostname changed from (none) to "compute-2"
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1759] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1763] manager[0x55aa46e4e000]: rfkill: Wi-Fi hardware radio set enabled
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1763] manager[0x55aa46e4e000]: rfkill: WWAN hardware radio set enabled
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1780] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1787] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1788] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1788] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1789] manager: Networking is enabled by state file
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1791] settings: Loaded settings plugin: keyfile (internal)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1793] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1814] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1822] dhcp: init: Using DHCP client 'internal'
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1825] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1829] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1832] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1839] device (lo): Activation: starting connection 'lo' (b03d3090-8131-4757-b469-75b0afcb0833)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1845] device (eth0): carrier: link connected
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1849] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1853] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1853] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1858] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1863] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1867] device (eth1): carrier: link connected
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1870] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1873] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (30f6d43e-2343-50fd-9644-39fd990a5838) (indicated)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1873] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1878] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1883] device (eth1): Activation: starting connection 'ci-private-network' (30f6d43e-2343-50fd-9644-39fd990a5838)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1889] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 31 02:15:33 np0005603623 systemd[1]: Started Network Manager.
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1897] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1899] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1900] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1902] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1905] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1907] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1910] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1913] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1919] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1921] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1942] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1957] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1967] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1968] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1974] device (lo): Activation: successful, device activated.
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1982] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1986] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1991] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.1993] device (eth1): Activation: successful, device activated.
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.2015] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.2020] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 31 02:15:33 np0005603623 systemd[1]: Starting Network Manager Wait Online...
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.2097] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.2127] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.2129] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.2133] manager: NetworkManager state is now CONNECTED_SITE
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.2137] device (eth0): Activation: successful, device activated.
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.2143] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 31 02:15:33 np0005603623 NetworkManager[48970]: <info>  [1769843733.2146] manager: startup complete
Jan 31 02:15:33 np0005603623 systemd[1]: Finished Network Manager Wait Online.
Jan 31 02:15:34 np0005603623 python3.9[49184]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:15:38 np0005603623 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:15:38 np0005603623 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:15:38 np0005603623 systemd[1]: Reloading.
Jan 31 02:15:38 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:15:38 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:15:38 np0005603623 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:15:40 np0005603623 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:15:40 np0005603623 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:15:40 np0005603623 systemd[1]: run-rb993eec2fc0542e6bade64d0748c13d2.service: Deactivated successfully.
Jan 31 02:15:43 np0005603623 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 02:15:43 np0005603623 python3.9[49647]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:15:44 np0005603623 python3.9[49799]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:45 np0005603623 python3.9[49953]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:46 np0005603623 python3.9[50105]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:46 np0005603623 python3.9[50257]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:47 np0005603623 python3.9[50409]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:48 np0005603623 python3.9[50561]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:15:49 np0005603623 python3.9[50684]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843747.8138337-649-186208396811808/.source _original_basename=.7w_sx2pk follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:50 np0005603623 python3.9[50836]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:51 np0005603623 python3.9[50988]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 31 02:15:52 np0005603623 python3.9[51140]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:55 np0005603623 python3.9[51567]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 31 02:15:56 np0005603623 ansible-async_wrapper.py[51742]: Invoked with j418371351315 300 /home/zuul/.ansible/tmp/ansible-tmp-1769843755.7663178-848-5186999890789/AnsiballZ_edpm_os_net_config.py _
Jan 31 02:15:56 np0005603623 ansible-async_wrapper.py[51745]: Starting module and watcher
Jan 31 02:15:56 np0005603623 ansible-async_wrapper.py[51745]: Start watching 51746 (300)
Jan 31 02:15:56 np0005603623 ansible-async_wrapper.py[51746]: Start module (51746)
Jan 31 02:15:56 np0005603623 ansible-async_wrapper.py[51742]: Return async_wrapper task started.
Jan 31 02:15:57 np0005603623 python3.9[51747]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 31 02:15:57 np0005603623 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 31 02:15:57 np0005603623 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 31 02:15:57 np0005603623 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 31 02:15:57 np0005603623 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 31 02:15:57 np0005603623 kernel: cfg80211: failed to load regulatory.db
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0052] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0074] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0623] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0625] audit: op="connection-add" uuid="fbaab2bb-9b55-4a41-9cc5-75f09ae47096" name="br-ex-br" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0639] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0640] audit: op="connection-add" uuid="f9f9d49c-4b85-4f4d-a2e0-e3c76cc002fe" name="br-ex-port" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0651] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0652] audit: op="connection-add" uuid="3637e29a-fbb1-4db9-94d4-1260b8b1f32f" name="eth1-port" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0666] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0668] audit: op="connection-add" uuid="26eaa47f-f9d0-4cd6-b46d-4ecc2b9a237c" name="vlan20-port" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0678] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0680] audit: op="connection-add" uuid="a10e8045-ee36-4f22-86c0-b4709996f851" name="vlan21-port" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0690] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0691] audit: op="connection-add" uuid="2663056e-56ef-42cf-8a42-acd3f2d2e56c" name="vlan22-port" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0701] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0702] audit: op="connection-add" uuid="46b3bd80-583a-4c66-b999-eca9f3748e02" name="vlan23-port" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0720] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0735] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.0736] audit: op="connection-add" uuid="4617eeea-97a1-460f-bf69-75c923d328be" name="br-ex-if" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1686] audit: op="connection-update" uuid="30f6d43e-2343-50fd-9644-39fd990a5838" name="ci-private-network" args="ovs-external-ids.data,ipv4.addresses,ipv4.dns,ipv4.routes,ipv4.method,ipv4.routing-rules,ipv4.never-default,connection.timestamp,connection.controller,connection.port-type,connection.master,connection.slave-type,ipv6.addresses,ipv6.dns,ipv6.routes,ipv6.method,ipv6.addr-gen-mode,ipv6.routing-rules,ovs-interface.type" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1708] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1710] audit: op="connection-add" uuid="78e869a4-e709-4b25-9b52-118f3670545c" name="vlan20-if" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1726] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1730] audit: op="connection-add" uuid="c40bcde9-8e2f-4ecd-b876-b5597c006f4c" name="vlan21-if" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1742] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1744] audit: op="connection-add" uuid="47794b39-1f32-4aa1-80b9-492f255fce7f" name="vlan22-if" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1757] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1758] audit: op="connection-add" uuid="e493e73d-39f0-4a9c-84b7-20a89e217950" name="vlan23-if" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1773] audit: op="connection-delete" uuid="723a2653-d37e-330e-8a33-ebd92ee3bb06" name="Wired connection 1" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1783] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <warn>  [1769843759.1787] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1792] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1795] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (fbaab2bb-9b55-4a41-9cc5-75f09ae47096)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1796] audit: op="connection-activate" uuid="fbaab2bb-9b55-4a41-9cc5-75f09ae47096" name="br-ex-br" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1798] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <warn>  [1769843759.1799] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1803] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1807] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (f9f9d49c-4b85-4f4d-a2e0-e3c76cc002fe)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1809] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <warn>  [1769843759.1810] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1814] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1817] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (3637e29a-fbb1-4db9-94d4-1260b8b1f32f)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1818] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <warn>  [1769843759.1819] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1823] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1827] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (26eaa47f-f9d0-4cd6-b46d-4ecc2b9a237c)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1829] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <warn>  [1769843759.1830] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1834] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1836] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (a10e8045-ee36-4f22-86c0-b4709996f851)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1838] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <warn>  [1769843759.1839] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1842] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1846] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (2663056e-56ef-42cf-8a42-acd3f2d2e56c)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1847] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <warn>  [1769843759.1848] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1852] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1855] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (46b3bd80-583a-4c66-b999-eca9f3748e02)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1856] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1857] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1859] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1864] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <warn>  [1769843759.1865] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1867] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1870] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (4617eeea-97a1-460f-bf69-75c923d328be)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1871] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1873] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1875] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1876] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1877] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1885] device (eth1): disconnecting for new activation request.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1886] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1899] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1902] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1903] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1907] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <warn>  [1769843759.1908] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1911] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1916] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (78e869a4-e709-4b25-9b52-118f3670545c)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1918] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1921] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1924] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1926] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1929] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <warn>  [1769843759.1930] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1933] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1937] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (c40bcde9-8e2f-4ecd-b876-b5597c006f4c)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1938] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1941] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1943] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1944] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1947] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <warn>  [1769843759.1948] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1951] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1955] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (47794b39-1f32-4aa1-80b9-492f255fce7f)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1956] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1958] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1960] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1961] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1964] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <warn>  [1769843759.1965] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1968] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1972] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (e493e73d-39f0-4a9c-84b7-20a89e217950)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1973] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1976] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1978] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1980] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1982] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.1997] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.method,ipv6.addr-gen-mode" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2000] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2003] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2005] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2011] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2015] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2037] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 kernel: ovs-system: entered promiscuous mode
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2042] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2043] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2048] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2051] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 systemd-udevd[51753]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:15:59 np0005603623 kernel: Timeout policy base is empty
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2053] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2056] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2060] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2063] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2065] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2067] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2070] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2073] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2075] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2076] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2080] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2083] dhcp4 (eth0): canceled DHCP transaction
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2083] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2083] dhcp4 (eth0): state changed no lease
Jan 31 02:15:59 np0005603623 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2093] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2106] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2110] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51748 uid=0 result="fail" reason="Device is not activated"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2115] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 31 02:15:59 np0005603623 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 02:15:59 np0005603623 kernel: br-ex: entered promiscuous mode
Jan 31 02:15:59 np0005603623 kernel: vlan20: entered promiscuous mode
Jan 31 02:15:59 np0005603623 systemd-udevd[51752]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2403] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2416] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 31 02:15:59 np0005603623 kernel: vlan21: entered promiscuous mode
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2435] device (eth1): disconnecting for new activation request.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2436] audit: op="connection-activate" uuid="30f6d43e-2343-50fd-9644-39fd990a5838" name="ci-private-network" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2436] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 31 02:15:59 np0005603623 systemd-udevd[51845]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:15:59 np0005603623 kernel: vlan22: entered promiscuous mode
Jan 31 02:15:59 np0005603623 kernel: vlan23: entered promiscuous mode
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2548] device (eth1): Activation: starting connection 'ci-private-network' (30f6d43e-2343-50fd-9644-39fd990a5838)
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2553] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2555] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2556] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2557] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2559] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2561] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2562] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2567] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2586] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2589] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2593] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2599] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2603] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2609] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2615] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2618] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2622] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2627] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2631] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2636] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2640] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2644] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2649] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2652] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2656] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2665] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2675] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51748 uid=0 result="success"
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2678] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2684] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2695] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2697] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2703] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2732] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2759] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2768] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2778] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2801] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2807] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2818] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2821] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2823] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2830] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2836] device (eth1): Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2841] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2847] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2853] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2858] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2866] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2869] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2871] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2879] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2886] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2892] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2898] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2904] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.2909] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603623 NetworkManager[48970]: <info>  [1769843759.7678] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Jan 31 02:16:00 np0005603623 NetworkManager[48970]: <info>  [1769843760.4651] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51748 uid=0 result="success"
Jan 31 02:16:00 np0005603623 NetworkManager[48970]: <info>  [1769843760.6504] checkpoint[0x55aa46e24950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 31 02:16:00 np0005603623 NetworkManager[48970]: <info>  [1769843760.6507] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51748 uid=0 result="success"
Jan 31 02:16:01 np0005603623 NetworkManager[48970]: <info>  [1769843761.0197] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51748 uid=0 result="success"
Jan 31 02:16:01 np0005603623 NetworkManager[48970]: <info>  [1769843761.0209] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51748 uid=0 result="success"
Jan 31 02:16:01 np0005603623 python3.9[52108]: ansible-ansible.legacy.async_status Invoked with jid=j418371351315.51742 mode=status _async_dir=/root/.ansible_async
Jan 31 02:16:01 np0005603623 NetworkManager[48970]: <info>  [1769843761.2747] audit: op="networking-control" arg="global-dns-configuration" pid=51748 uid=0 result="success"
Jan 31 02:16:01 np0005603623 NetworkManager[48970]: <info>  [1769843761.2828] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 31 02:16:01 np0005603623 NetworkManager[48970]: <info>  [1769843761.2884] audit: op="networking-control" arg="global-dns-configuration" pid=51748 uid=0 result="success"
Jan 31 02:16:01 np0005603623 NetworkManager[48970]: <info>  [1769843761.2908] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51748 uid=0 result="success"
Jan 31 02:16:01 np0005603623 NetworkManager[48970]: <info>  [1769843761.4112] checkpoint[0x55aa46e24a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 31 02:16:01 np0005603623 NetworkManager[48970]: <info>  [1769843761.4118] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51748 uid=0 result="success"
Jan 31 02:16:01 np0005603623 ansible-async_wrapper.py[51746]: Module complete (51746)
Jan 31 02:16:01 np0005603623 ansible-async_wrapper.py[51745]: Done in kid B.
Jan 31 02:16:03 np0005603623 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 02:16:04 np0005603623 python3.9[52215]: ansible-ansible.legacy.async_status Invoked with jid=j418371351315.51742 mode=status _async_dir=/root/.ansible_async
Jan 31 02:16:05 np0005603623 python3.9[52314]: ansible-ansible.legacy.async_status Invoked with jid=j418371351315.51742 mode=cleanup _async_dir=/root/.ansible_async
Jan 31 02:16:06 np0005603623 python3.9[52466]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:16:07 np0005603623 python3.9[52589]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843766.3263419-929-261121564765734/.source.returncode _original_basename=.6drtcn8s follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:16:10 np0005603623 python3.9[52742]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:16:11 np0005603623 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 02:16:11 np0005603623 python3.9[52866]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843770.2630792-976-207233387199512/.source.cfg _original_basename=.euiqgg2v follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:16:12 np0005603623 python3.9[53019]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:16:12 np0005603623 systemd[1]: Reloading Network Manager...
Jan 31 02:16:12 np0005603623 NetworkManager[48970]: <info>  [1769843772.2463] audit: op="reload" arg="0" pid=53023 uid=0 result="success"
Jan 31 02:16:12 np0005603623 NetworkManager[48970]: <info>  [1769843772.2471] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 31 02:16:12 np0005603623 systemd[1]: Reloaded Network Manager.
Jan 31 02:16:12 np0005603623 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 02:16:12 np0005603623 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 02:16:12 np0005603623 systemd[1]: session-11.scope: Deactivated successfully.
Jan 31 02:16:12 np0005603623 systemd[1]: session-11.scope: Consumed 50.344s CPU time.
Jan 31 02:16:12 np0005603623 systemd-logind[795]: Session 11 logged out. Waiting for processes to exit.
Jan 31 02:16:12 np0005603623 systemd-logind[795]: Removed session 11.
Jan 31 02:16:18 np0005603623 systemd-logind[795]: New session 12 of user zuul.
Jan 31 02:16:18 np0005603623 systemd[1]: Started Session 12 of User zuul.
Jan 31 02:16:19 np0005603623 python3.9[53211]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:16:20 np0005603623 python3.9[53365]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:16:21 np0005603623 python3.9[53559]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:16:21 np0005603623 systemd[1]: session-12.scope: Deactivated successfully.
Jan 31 02:16:21 np0005603623 systemd[1]: session-12.scope: Consumed 2.060s CPU time.
Jan 31 02:16:21 np0005603623 systemd-logind[795]: Session 12 logged out. Waiting for processes to exit.
Jan 31 02:16:21 np0005603623 systemd-logind[795]: Removed session 12.
Jan 31 02:16:22 np0005603623 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 02:16:28 np0005603623 systemd-logind[795]: New session 13 of user zuul.
Jan 31 02:16:28 np0005603623 systemd[1]: Started Session 13 of User zuul.
Jan 31 02:16:29 np0005603623 python3.9[53741]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:16:30 np0005603623 python3.9[53896]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:16:31 np0005603623 python3.9[54052]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:16:32 np0005603623 python3.9[54136]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:16:34 np0005603623 python3.9[54289]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:16:36 np0005603623 python3.9[54484]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:16:36 np0005603623 python3.9[54636]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:16:36 np0005603623 systemd[1]: var-lib-containers-storage-overlay-compat4081514600-merged.mount: Deactivated successfully.
Jan 31 02:16:36 np0005603623 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck317533262-merged.mount: Deactivated successfully.
Jan 31 02:16:36 np0005603623 podman[54637]: 2026-01-31 07:16:36.895268256 +0000 UTC m=+0.070498506 system refresh
Jan 31 02:16:37 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:16:37 np0005603623 python3.9[54799]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:16:38 np0005603623 python3.9[54922]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843797.417451-198-213594481294298/.source.json follow=False _original_basename=podman_network_config.j2 checksum=ff59a7d165ef46f625875ad6b95d570bbeac8c51 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:16:39 np0005603623 python3.9[55074]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:16:39 np0005603623 python3.9[55197]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769843798.957415-245-90798743449609/.source.conf follow=False _original_basename=registries.conf.j2 checksum=51dca2f6e7d675b0597f23a4e044edd3f4faff03 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:16:40 np0005603623 python3.9[55349]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:16:41 np0005603623 python3.9[55501]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:16:42 np0005603623 python3.9[55653]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:16:42 np0005603623 python3.9[55805]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:16:43 np0005603623 python3.9[55957]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:16:46 np0005603623 python3.9[56110]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:16:47 np0005603623 python3.9[56264]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:16:48 np0005603623 python3.9[56416]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:16:48 np0005603623 python3.9[56568]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:16:49 np0005603623 python3.9[56721]: ansible-service_facts Invoked
Jan 31 02:16:49 np0005603623 network[56738]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:16:49 np0005603623 network[56739]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:16:49 np0005603623 network[56740]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:16:58 np0005603623 python3.9[57192]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:17:02 np0005603623 python3.9[57345]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 31 02:17:03 np0005603623 python3.9[57497]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:04 np0005603623 python3.9[57622]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843823.3119464-677-44512540955297/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:05 np0005603623 python3.9[57776]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:05 np0005603623 python3.9[57901]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843824.6377358-721-281356059736729/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:07 np0005603623 python3.9[58055]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:08 np0005603623 python3.9[58209]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:17:09 np0005603623 python3.9[58293]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:17:12 np0005603623 python3.9[58447]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:17:13 np0005603623 python3.9[58531]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:17:13 np0005603623 chronyd[800]: chronyd exiting
Jan 31 02:17:13 np0005603623 systemd[1]: Stopping NTP client/server...
Jan 31 02:17:13 np0005603623 systemd[1]: chronyd.service: Deactivated successfully.
Jan 31 02:17:13 np0005603623 systemd[1]: Stopped NTP client/server.
Jan 31 02:17:13 np0005603623 systemd[1]: Starting NTP client/server...
Jan 31 02:17:13 np0005603623 chronyd[58540]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 31 02:17:13 np0005603623 chronyd[58540]: Frequency -32.013 +/- 0.041 ppm read from /var/lib/chrony/drift
Jan 31 02:17:13 np0005603623 chronyd[58540]: Loaded seccomp filter (level 2)
Jan 31 02:17:13 np0005603623 systemd[1]: Started NTP client/server.
Jan 31 02:17:13 np0005603623 systemd-logind[795]: Session 13 logged out. Waiting for processes to exit.
Jan 31 02:17:13 np0005603623 systemd[1]: session-13.scope: Deactivated successfully.
Jan 31 02:17:13 np0005603623 systemd[1]: session-13.scope: Consumed 23.078s CPU time.
Jan 31 02:17:13 np0005603623 systemd-logind[795]: Removed session 13.
Jan 31 02:17:19 np0005603623 systemd-logind[795]: New session 14 of user zuul.
Jan 31 02:17:19 np0005603623 systemd[1]: Started Session 14 of User zuul.
Jan 31 02:17:20 np0005603623 python3.9[58721]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:21 np0005603623 python3.9[58873]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:21 np0005603623 python3.9[58996]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843840.464678-65-11424365390470/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:22 np0005603623 systemd[1]: session-14.scope: Deactivated successfully.
Jan 31 02:17:22 np0005603623 systemd[1]: session-14.scope: Consumed 1.396s CPU time.
Jan 31 02:17:22 np0005603623 systemd-logind[795]: Session 14 logged out. Waiting for processes to exit.
Jan 31 02:17:22 np0005603623 systemd-logind[795]: Removed session 14.
Jan 31 02:17:27 np0005603623 systemd-logind[795]: New session 15 of user zuul.
Jan 31 02:17:27 np0005603623 systemd[1]: Started Session 15 of User zuul.
Jan 31 02:17:28 np0005603623 python3.9[59174]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:17:29 np0005603623 python3.9[59330]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:30 np0005603623 python3.9[59505]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:30 np0005603623 python3.9[59628]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769843849.651202-85-209926877541146/.source.json _original_basename=.0oa9ql53 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:31 np0005603623 python3.9[59780]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:32 np0005603623 python3.9[59903]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843851.4122732-154-245368368119407/.source _original_basename=.jluats32 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:33 np0005603623 python3.9[60055]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:17:33 np0005603623 python3.9[60207]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:34 np0005603623 python3.9[60330]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769843853.316281-226-34029375450045/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:17:34 np0005603623 python3.9[60482]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:35 np0005603623 python3.9[60605]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769843854.4981892-226-277702461002173/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:17:36 np0005603623 python3.9[60757]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:37 np0005603623 python3.9[60909]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:37 np0005603623 python3.9[61032]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843856.549967-337-90837056794288/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:38 np0005603623 python3.9[61184]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:38 np0005603623 python3.9[61307]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843857.7733219-383-180405538384754/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:40 np0005603623 python3.9[61459]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:17:40 np0005603623 systemd[1]: Reloading.
Jan 31 02:17:40 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:17:40 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:17:40 np0005603623 systemd[1]: Reloading.
Jan 31 02:17:40 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:17:40 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:17:40 np0005603623 systemd[1]: Starting EDPM Container Shutdown...
Jan 31 02:17:40 np0005603623 systemd[1]: Finished EDPM Container Shutdown.
Jan 31 02:17:41 np0005603623 python3.9[61685]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:41 np0005603623 python3.9[61808]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843860.8460627-452-49708541452691/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:42 np0005603623 python3.9[61960]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:42 np0005603623 python3.9[62083]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843861.9781094-496-41655640564997/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:43 np0005603623 python3.9[62235]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:17:43 np0005603623 systemd[1]: Reloading.
Jan 31 02:17:43 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:17:43 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:17:44 np0005603623 systemd[1]: Reloading.
Jan 31 02:17:44 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:17:44 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:17:44 np0005603623 systemd[1]: Starting Create netns directory...
Jan 31 02:17:44 np0005603623 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 31 02:17:44 np0005603623 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 31 02:17:44 np0005603623 systemd[1]: Finished Create netns directory.
Jan 31 02:17:45 np0005603623 python3.9[62463]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:17:45 np0005603623 network[62480]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:17:45 np0005603623 network[62481]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:17:45 np0005603623 network[62482]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:17:49 np0005603623 python3.9[62744]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:17:49 np0005603623 systemd[1]: Reloading.
Jan 31 02:17:49 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:17:49 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:17:49 np0005603623 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 31 02:17:49 np0005603623 iptables.init[62783]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 31 02:17:49 np0005603623 iptables.init[62783]: iptables: Flushing firewall rules: [  OK  ]
Jan 31 02:17:49 np0005603623 systemd[1]: iptables.service: Deactivated successfully.
Jan 31 02:17:49 np0005603623 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 31 02:17:50 np0005603623 python3.9[62979]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:17:51 np0005603623 python3.9[63133]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:17:51 np0005603623 systemd[1]: Reloading.
Jan 31 02:17:51 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:17:52 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:17:52 np0005603623 systemd[1]: Starting Netfilter Tables...
Jan 31 02:17:52 np0005603623 systemd[1]: Finished Netfilter Tables.
Jan 31 02:17:53 np0005603623 python3.9[63325]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:17:54 np0005603623 python3.9[63478]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:55 np0005603623 python3.9[63603]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843873.9291763-704-128134978158195/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:55 np0005603623 python3.9[63756]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:17:55 np0005603623 systemd[1]: Reloading OpenSSH server daemon...
Jan 31 02:17:55 np0005603623 systemd[1]: Reloaded OpenSSH server daemon.
Jan 31 02:17:56 np0005603623 python3.9[63912]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:57 np0005603623 python3.9[64064]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:57 np0005603623 python3.9[64187]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843876.785339-797-191844072664871/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:58 np0005603623 python3.9[64339]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 31 02:17:58 np0005603623 systemd[1]: Starting Time & Date Service...
Jan 31 02:17:58 np0005603623 systemd[1]: Started Time & Date Service.
Jan 31 02:18:00 np0005603623 python3.9[64495]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:00 np0005603623 python3.9[64647]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:01 np0005603623 python3.9[64770]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843880.2727466-902-182557252528081/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:02 np0005603623 python3.9[64922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:02 np0005603623 python3.9[65045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843881.9664748-947-47658931962606/.source.yaml _original_basename=.85kqqav9 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:03 np0005603623 python3.9[65197]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:04 np0005603623 python3.9[65320]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843883.065021-991-43673278176539/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:04 np0005603623 python3.9[65472]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:05 np0005603623 python3.9[65625]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:06 np0005603623 python3[65778]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 31 02:18:07 np0005603623 python3.9[65930]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:07 np0005603623 python3.9[66053]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843886.362378-1109-77661885762497/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:08 np0005603623 python3.9[66205]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:09 np0005603623 python3.9[66328]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843888.0798516-1154-198849896177706/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:10 np0005603623 python3.9[66480]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:10 np0005603623 python3.9[66603]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843889.6775079-1199-20247535897383/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:11 np0005603623 python3.9[66755]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:11 np0005603623 python3.9[66878]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843890.8789659-1244-136455263452638/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:13 np0005603623 python3.9[67030]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:13 np0005603623 python3.9[67153]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843892.1610205-1290-215415143267725/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:14 np0005603623 python3.9[67305]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:15 np0005603623 python3.9[67457]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:15 np0005603623 python3.9[67616]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:16 np0005603623 python3.9[67769]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:17 np0005603623 python3.9[67921]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:18 np0005603623 python3.9[68073]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 31 02:18:18 np0005603623 python3.9[68226]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 31 02:18:19 np0005603623 systemd[1]: session-15.scope: Deactivated successfully.
Jan 31 02:18:19 np0005603623 systemd[1]: session-15.scope: Consumed 31.126s CPU time.
Jan 31 02:18:19 np0005603623 systemd-logind[795]: Session 15 logged out. Waiting for processes to exit.
Jan 31 02:18:19 np0005603623 systemd-logind[795]: Removed session 15.
Jan 31 02:18:24 np0005603623 systemd-logind[795]: New session 16 of user zuul.
Jan 31 02:18:24 np0005603623 systemd[1]: Started Session 16 of User zuul.
Jan 31 02:18:25 np0005603623 python3.9[68407]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 31 02:18:26 np0005603623 python3.9[68559]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:18:27 np0005603623 python3.9[68711]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:18:28 np0005603623 python3.9[68863]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCSEo2WrFN8DnR2/d+p3YtsWos96nHz1MZInXN3md5cJXE0icMDwEWJuGIDUd5e0SA6Q7i33i/WIEmt/wGMoNhoTI+f3plB2NyAn5vyVQGTZv7m+tOLQI3/k50Kxnpu0c5gO509yln6RcLe4MutF0imS/fINCM+Nznh7oKbn6hELTDlxDz0JH8dNsZGmtVmgnhwIrglpxAg/WpeOWkCmuuXmysx1JcAhIK5016MzaM9cOtHAGzj5s0GE7nQoH4yG0Ak3zMU/DPKr91Xq/m9PCnGKautoHmHgrEG6u+1WubtakbBxlfmroKbvrIFL6KKQzY0SiTrBsH3nZRaFGCqE0ZEyHvJz8AO3quWg2oaXRJWN98f7k3l5dtVJIuwyJxVnv6fUGuLbGxOp4T6UDPqC7b2Eg17EtpUjy77F/+8yrX6NH+hXwcWBwHelRCDSiceGQTm1uexb8Xo1R1Wt9h24H2yRKPFrqzf1R9J2vipDouDo7RLefAiCXEJDdlewdKUM5c=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILwGOCpzCDE8uIHb4RBldbKfEvxhUdsBT4K7sPU4vZLU#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLS8teLqq0Lmt8g22OKhtEhLCXd5cBLM6W2oDJcWxQl8DloBMMFjgDlHt0rzjMKEL0SpxkPbH7sPV1zbWKKJI9M=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAgshePGnD7oc3Zg8kfD9lUGSfPfE1OzPUGBHE12jLoyHnXwKTxYFYSMTWRcYgdFu4HaP0ShO1gEQF+1nDXxrozH/m2qxK/YPC5cVYCPvscwRdlyUNPOV0rpiruVZptTQ1iibsmRwMbxliXD2t13CtsrNjy9iuLgtvvnkfUh0wZKcZ8Jglg6E4vRTBPgXo3fJCfPF9Iz7GE50DpWAU8OnoLNlOf54/tcd8CyOrmLF9RwHTgNtN9FXscdQ3/A8avCF0WPWNUmfLFc20yOtfrq/xxjJMLn4KOZu1D1yjK5BSJu2pv/j0NPrTFKgPKYWjiXPdttcyubkXNZP96jkK9dgTgsEGRKuM83QpDIu7823wv4/GtEi+IsJeyqCN+3VAJo9hDB9eES8qlX4jAg6Kxen1oNkL9M+tz7N0BSdnxbS3skWEw6MsHlsBLOw7KMYe8gq8JoqHLBKBFQZZbjwaK5kNTeu6l5zAYERpt8uAEZkplq2vV5+4EOh7RPncmKuH0Xs=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF//s6MNfOt3MK/jBcrJ5VkyeSY5eg1jUHN32BLTGZtT#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEtIHGkRVmmqcsRoXLuIEWyuaX3BoKld3DircbfvRpdFLzOwbxRaZ6uUN5f7sBun3oAcQLdnixnG3R/YK8L7HpM=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCXc4C5rCDfOfKEuMVHI9SatZ+NRO9lp335K0yZ19CDCOGSUNO2lblpRlgxO3tw3S+UGGiC/7/HHeZBA2Zd+SUVMb7ytbl5c3+XuZIIQF6DyIIDSELf0FoE0NhuSjKFilPsxyxxGYgH+gVaTZkuGhDoljaywQBSPGZdDwejVKWPVuui5xe0X4T0WVfT5avLSpIL3WjJ9hmzEaR0dUqrbKvPUAXJPDqQOZbQZbpXDIi48NPUDFwByej1xHWHRQaPJ/M6AsyrZKP/hiF2xt0mCIk1FANldusq4OUs9r/0KTVrPRCpSrsSimKBtEMJVdxqxAasE7H07sSdwFcWNC21LtsH8+/LM0oofIZ3D0Lom0NoLaC+Ocy2vqbIhOPYJ6c7Q8J/p4NFiA/lD+bgyjOOnm3Ls4VaaHXUyknu259henkVzJ+iZuRNY8ki345nrzPLoLYyxVwRkSuONyYlRp36jjp0QIL9kXLFlJ2OTHvb9FUhlG7RnxzPeHZhsihSHJv1rgU=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAT2MDVMbPz3xtbIO31qZj2gzOQiz4a8pTNWAmd0+CUW#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFU8ym/rLGJxMpEsk09j3JHOh1hW4Vrm23tIOjn4/YJIrK1UFRFiQLDm+yZuj1NhWfbg71SK8ZuZ2miEJ20BHno=#012 create=True mode=0644 path=/tmp/ansible.0z0q99z9 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:28 np0005603623 python3.9[69015]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.0z0q99z9' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:29 np0005603623 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 31 02:18:29 np0005603623 python3.9[69172]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.0z0q99z9 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:30 np0005603623 systemd[1]: session-16.scope: Deactivated successfully.
Jan 31 02:18:30 np0005603623 systemd[1]: session-16.scope: Consumed 2.700s CPU time.
Jan 31 02:18:30 np0005603623 systemd-logind[795]: Session 16 logged out. Waiting for processes to exit.
Jan 31 02:18:30 np0005603623 systemd-logind[795]: Removed session 16.
Jan 31 02:18:35 np0005603623 systemd-logind[795]: New session 17 of user zuul.
Jan 31 02:18:35 np0005603623 systemd[1]: Started Session 17 of User zuul.
Jan 31 02:18:35 np0005603623 python3.9[69350]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:18:37 np0005603623 python3.9[69506]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 31 02:18:38 np0005603623 python3.9[69660]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:18:38 np0005603623 python3.9[69813]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:39 np0005603623 python3.9[69966]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:18:40 np0005603623 python3.9[70120]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:40 np0005603623 python3.9[70275]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:41 np0005603623 systemd[1]: session-17.scope: Deactivated successfully.
Jan 31 02:18:41 np0005603623 systemd[1]: session-17.scope: Consumed 3.642s CPU time.
Jan 31 02:18:41 np0005603623 systemd-logind[795]: Session 17 logged out. Waiting for processes to exit.
Jan 31 02:18:41 np0005603623 systemd-logind[795]: Removed session 17.
Jan 31 02:18:47 np0005603623 systemd-logind[795]: New session 18 of user zuul.
Jan 31 02:18:47 np0005603623 systemd[1]: Started Session 18 of User zuul.
Jan 31 02:18:48 np0005603623 python3.9[70453]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:18:49 np0005603623 python3.9[70609]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:18:50 np0005603623 python3.9[70693]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 02:18:52 np0005603623 python3.9[70844]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:53 np0005603623 python3.9[70995]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 02:18:54 np0005603623 python3.9[71145]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:18:54 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:18:55 np0005603623 python3.9[71296]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:18:55 np0005603623 systemd[1]: session-18.scope: Deactivated successfully.
Jan 31 02:18:55 np0005603623 systemd[1]: session-18.scope: Consumed 5.455s CPU time.
Jan 31 02:18:55 np0005603623 systemd-logind[795]: Session 18 logged out. Waiting for processes to exit.
Jan 31 02:18:55 np0005603623 systemd-logind[795]: Removed session 18.
Jan 31 02:19:03 np0005603623 systemd-logind[795]: New session 19 of user zuul.
Jan 31 02:19:03 np0005603623 systemd[1]: Started Session 19 of User zuul.
Jan 31 02:19:09 np0005603623 python3[72062]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:19:11 np0005603623 python3[72157]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 31 02:19:12 np0005603623 python3[72184]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 31 02:19:13 np0005603623 python3[72210]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:19:13 np0005603623 kernel: loop: module loaded
Jan 31 02:19:13 np0005603623 kernel: loop3: detected capacity change from 0 to 14680064
Jan 31 02:19:13 np0005603623 python3[72245]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:19:13 np0005603623 lvm[72248]: PV /dev/loop3 not used.
Jan 31 02:19:13 np0005603623 lvm[72257]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 31 02:19:13 np0005603623 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 31 02:19:13 np0005603623 lvm[72259]:  1 logical volume(s) in volume group "ceph_vg0" now active
Jan 31 02:19:13 np0005603623 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 31 02:19:14 np0005603623 python3[72337]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 02:19:14 np0005603623 python3[72410]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769843953.8670616-37125-72354050078607/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:19:15 np0005603623 python3[72460]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:19:15 np0005603623 systemd[1]: Reloading.
Jan 31 02:19:15 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:19:15 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:19:15 np0005603623 systemd[1]: Starting Ceph OSD losetup...
Jan 31 02:19:15 np0005603623 bash[72501]: /dev/loop3: [64513]:4194935 (/var/lib/ceph-osd-0.img)
Jan 31 02:19:15 np0005603623 systemd[1]: Finished Ceph OSD losetup.
Jan 31 02:19:15 np0005603623 lvm[72502]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 31 02:19:15 np0005603623 lvm[72502]: VG ceph_vg0 finished
Jan 31 02:19:17 np0005603623 python3[72526]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:19:22 np0005603623 chronyd[58540]: Selected source 23.133.168.245 (pool.ntp.org)
Jan 31 02:21:24 np0005603623 systemd[1]: Created slice User Slice of UID 42477.
Jan 31 02:21:24 np0005603623 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 31 02:21:24 np0005603623 systemd-logind[795]: New session 20 of user ceph-admin.
Jan 31 02:21:24 np0005603623 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 31 02:21:24 np0005603623 systemd[1]: Starting User Manager for UID 42477...
Jan 31 02:21:24 np0005603623 systemd[72574]: Queued start job for default target Main User Target.
Jan 31 02:21:24 np0005603623 systemd[72574]: Created slice User Application Slice.
Jan 31 02:21:24 np0005603623 systemd[72574]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 02:21:24 np0005603623 systemd[72574]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 02:21:24 np0005603623 systemd[72574]: Reached target Paths.
Jan 31 02:21:24 np0005603623 systemd[72574]: Reached target Timers.
Jan 31 02:21:24 np0005603623 systemd[72574]: Starting D-Bus User Message Bus Socket...
Jan 31 02:21:24 np0005603623 systemd[72574]: Starting Create User's Volatile Files and Directories...
Jan 31 02:21:24 np0005603623 systemd-logind[795]: New session 22 of user ceph-admin.
Jan 31 02:21:24 np0005603623 systemd[72574]: Listening on D-Bus User Message Bus Socket.
Jan 31 02:21:24 np0005603623 systemd[72574]: Finished Create User's Volatile Files and Directories.
Jan 31 02:21:24 np0005603623 systemd[72574]: Reached target Sockets.
Jan 31 02:21:24 np0005603623 systemd[72574]: Reached target Basic System.
Jan 31 02:21:24 np0005603623 systemd[72574]: Reached target Main User Target.
Jan 31 02:21:24 np0005603623 systemd[72574]: Startup finished in 127ms.
Jan 31 02:21:24 np0005603623 systemd[1]: Started User Manager for UID 42477.
Jan 31 02:21:24 np0005603623 systemd[1]: Started Session 20 of User ceph-admin.
Jan 31 02:21:24 np0005603623 systemd[1]: Started Session 22 of User ceph-admin.
Jan 31 02:21:25 np0005603623 systemd-logind[795]: New session 23 of user ceph-admin.
Jan 31 02:21:25 np0005603623 systemd[1]: Started Session 23 of User ceph-admin.
Jan 31 02:21:25 np0005603623 systemd-logind[795]: New session 24 of user ceph-admin.
Jan 31 02:21:25 np0005603623 systemd[1]: Started Session 24 of User ceph-admin.
Jan 31 02:21:25 np0005603623 systemd-logind[795]: New session 25 of user ceph-admin.
Jan 31 02:21:25 np0005603623 systemd[1]: Started Session 25 of User ceph-admin.
Jan 31 02:21:26 np0005603623 systemd-logind[795]: New session 26 of user ceph-admin.
Jan 31 02:21:26 np0005603623 systemd[1]: Started Session 26 of User ceph-admin.
Jan 31 02:21:26 np0005603623 systemd-logind[795]: New session 27 of user ceph-admin.
Jan 31 02:21:26 np0005603623 systemd[1]: Started Session 27 of User ceph-admin.
Jan 31 02:21:26 np0005603623 systemd-logind[795]: New session 28 of user ceph-admin.
Jan 31 02:21:26 np0005603623 systemd[1]: Started Session 28 of User ceph-admin.
Jan 31 02:21:27 np0005603623 systemd-logind[795]: New session 29 of user ceph-admin.
Jan 31 02:21:27 np0005603623 systemd[1]: Started Session 29 of User ceph-admin.
Jan 31 02:21:27 np0005603623 systemd-logind[795]: New session 30 of user ceph-admin.
Jan 31 02:21:27 np0005603623 systemd[1]: Started Session 30 of User ceph-admin.
Jan 31 02:21:28 np0005603623 systemd-logind[795]: New session 31 of user ceph-admin.
Jan 31 02:21:28 np0005603623 systemd[1]: Started Session 31 of User ceph-admin.
Jan 31 02:21:28 np0005603623 systemd-logind[795]: New session 32 of user ceph-admin.
Jan 31 02:21:28 np0005603623 systemd[1]: Started Session 32 of User ceph-admin.
Jan 31 02:21:28 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:03 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:03 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:03 np0005603623 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73598 (sysctl)
Jan 31 02:22:04 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:04 np0005603623 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 31 02:22:04 np0005603623 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 31 02:22:04 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:05 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:07 np0005603623 systemd[1]: var-lib-containers-storage-overlay-compat3431657400-lower\x2dmapped.mount: Deactivated successfully.
Jan 31 02:22:18 np0005603623 podman[73876]: 2026-01-31 07:22:18.836617205 +0000 UTC m=+13.708071370 container create 631d019eb8cb0d78be42173a8071794100a05bc9c20edd73325148248dfe9f06 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_chebyshev, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:18 np0005603623 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 31 02:22:18 np0005603623 systemd[1]: Started libpod-conmon-631d019eb8cb0d78be42173a8071794100a05bc9c20edd73325148248dfe9f06.scope.
Jan 31 02:22:18 np0005603623 podman[73876]: 2026-01-31 07:22:18.817764473 +0000 UTC m=+13.689218688 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:18 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:18 np0005603623 podman[73876]: 2026-01-31 07:22:18.923616059 +0000 UTC m=+13.795070234 container init 631d019eb8cb0d78be42173a8071794100a05bc9c20edd73325148248dfe9f06 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_chebyshev, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:18 np0005603623 podman[73876]: 2026-01-31 07:22:18.929022564 +0000 UTC m=+13.800476729 container start 631d019eb8cb0d78be42173a8071794100a05bc9c20edd73325148248dfe9f06 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_chebyshev, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Jan 31 02:22:18 np0005603623 podman[73876]: 2026-01-31 07:22:18.933130582 +0000 UTC m=+13.804584767 container attach 631d019eb8cb0d78be42173a8071794100a05bc9c20edd73325148248dfe9f06 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_chebyshev, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 31 02:22:18 np0005603623 nifty_chebyshev[73943]: 167 167
Jan 31 02:22:18 np0005603623 systemd[1]: libpod-631d019eb8cb0d78be42173a8071794100a05bc9c20edd73325148248dfe9f06.scope: Deactivated successfully.
Jan 31 02:22:18 np0005603623 podman[73876]: 2026-01-31 07:22:18.936667074 +0000 UTC m=+13.808121269 container died 631d019eb8cb0d78be42173a8071794100a05bc9c20edd73325148248dfe9f06 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_chebyshev, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True)
Jan 31 02:22:18 np0005603623 systemd[1]: var-lib-containers-storage-overlay-fe47d5ed228dc5e852cf8b3b4bffe94307e90aa89a1bc5ca80e6826e6c5356c1-merged.mount: Deactivated successfully.
Jan 31 02:22:18 np0005603623 podman[73876]: 2026-01-31 07:22:18.973767312 +0000 UTC m=+13.845221487 container remove 631d019eb8cb0d78be42173a8071794100a05bc9c20edd73325148248dfe9f06 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_chebyshev, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:18 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:19 np0005603623 systemd[1]: libpod-conmon-631d019eb8cb0d78be42173a8071794100a05bc9c20edd73325148248dfe9f06.scope: Deactivated successfully.
Jan 31 02:22:19 np0005603623 podman[73966]: 2026-01-31 07:22:19.090765578 +0000 UTC m=+0.037376777 container create 25c2aa4292a49a8501e7c5f648cabd3c0017e4805d4112a74a32427e8bbbeb0f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_bose, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Jan 31 02:22:19 np0005603623 systemd[1]: Started libpod-conmon-25c2aa4292a49a8501e7c5f648cabd3c0017e4805d4112a74a32427e8bbbeb0f.scope.
Jan 31 02:22:19 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:19 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc0e220cb92b36c592fb7e6af88751c39bc5479bdd93c47f4e3670b0fda4549b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:19 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc0e220cb92b36c592fb7e6af88751c39bc5479bdd93c47f4e3670b0fda4549b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:19 np0005603623 podman[73966]: 2026-01-31 07:22:19.159126306 +0000 UTC m=+0.105737575 container init 25c2aa4292a49a8501e7c5f648cabd3c0017e4805d4112a74a32427e8bbbeb0f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_bose, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507)
Jan 31 02:22:19 np0005603623 podman[73966]: 2026-01-31 07:22:19.163594614 +0000 UTC m=+0.110205813 container start 25c2aa4292a49a8501e7c5f648cabd3c0017e4805d4112a74a32427e8bbbeb0f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_bose, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:19 np0005603623 podman[73966]: 2026-01-31 07:22:19.072424131 +0000 UTC m=+0.019035350 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:19 np0005603623 podman[73966]: 2026-01-31 07:22:19.168567197 +0000 UTC m=+0.115178416 container attach 25c2aa4292a49a8501e7c5f648cabd3c0017e4805d4112a74a32427e8bbbeb0f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_bose, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]: [
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:    {
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:        "available": false,
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:        "ceph_device": false,
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:        "lsm_data": {},
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:        "lvs": [],
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:        "path": "/dev/sr0",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:        "rejected_reasons": [
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "Insufficient space (<5GB)",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "Has a FileSystem"
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:        ],
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:        "sys_api": {
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "actuators": null,
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "device_nodes": "sr0",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "devname": "sr0",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "human_readable_size": "482.00 KB",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "id_bus": "ata",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "model": "QEMU DVD-ROM",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "nr_requests": "2",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "parent": "/dev/sr0",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "partitions": {},
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "path": "/dev/sr0",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "removable": "1",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "rev": "2.5+",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "ro": "0",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "rotational": "1",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "sas_address": "",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "sas_device_handle": "",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "scheduler_mode": "mq-deadline",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "sectors": 0,
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "sectorsize": "2048",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "size": 493568.0,
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "support_discard": "2048",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "type": "disk",
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:            "vendor": "QEMU"
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:        }
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]:    }
Jan 31 02:22:20 np0005603623 dreamy_bose[73982]: ]
Jan 31 02:22:20 np0005603623 systemd[1]: libpod-25c2aa4292a49a8501e7c5f648cabd3c0017e4805d4112a74a32427e8bbbeb0f.scope: Deactivated successfully.
Jan 31 02:22:20 np0005603623 podman[73966]: 2026-01-31 07:22:20.082335083 +0000 UTC m=+1.028946292 container died 25c2aa4292a49a8501e7c5f648cabd3c0017e4805d4112a74a32427e8bbbeb0f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_bose, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 31 02:22:20 np0005603623 systemd[1]: var-lib-containers-storage-overlay-dc0e220cb92b36c592fb7e6af88751c39bc5479bdd93c47f4e3670b0fda4549b-merged.mount: Deactivated successfully.
Jan 31 02:22:20 np0005603623 podman[73966]: 2026-01-31 07:22:20.122968162 +0000 UTC m=+1.069579361 container remove 25c2aa4292a49a8501e7c5f648cabd3c0017e4805d4112a74a32427e8bbbeb0f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_bose, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 02:22:20 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:20 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:20 np0005603623 systemd[1]: libpod-conmon-25c2aa4292a49a8501e7c5f648cabd3c0017e4805d4112a74a32427e8bbbeb0f.scope: Deactivated successfully.
Jan 31 02:22:23 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:23 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:23 np0005603623 podman[76689]: 2026-01-31 07:22:23.787217659 +0000 UTC m=+0.034110272 container create a3daaea1c4114df0d7bd3780126ce4339d12b705fc6e2f8ba9fe17209bfab1ea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_lalande, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:22:23 np0005603623 systemd[1]: Started libpod-conmon-a3daaea1c4114df0d7bd3780126ce4339d12b705fc6e2f8ba9fe17209bfab1ea.scope.
Jan 31 02:22:23 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:23 np0005603623 podman[76689]: 2026-01-31 07:22:23.836673272 +0000 UTC m=+0.083565905 container init a3daaea1c4114df0d7bd3780126ce4339d12b705fc6e2f8ba9fe17209bfab1ea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_lalande, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:22:23 np0005603623 podman[76689]: 2026-01-31 07:22:23.841490821 +0000 UTC m=+0.088383434 container start a3daaea1c4114df0d7bd3780126ce4339d12b705fc6e2f8ba9fe17209bfab1ea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_lalande, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef)
Jan 31 02:22:23 np0005603623 podman[76689]: 2026-01-31 07:22:23.844297552 +0000 UTC m=+0.091190165 container attach a3daaea1c4114df0d7bd3780126ce4339d12b705fc6e2f8ba9fe17209bfab1ea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_lalande, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:22:23 np0005603623 quizzical_lalande[76705]: 167 167
Jan 31 02:22:23 np0005603623 systemd[1]: libpod-a3daaea1c4114df0d7bd3780126ce4339d12b705fc6e2f8ba9fe17209bfab1ea.scope: Deactivated successfully.
Jan 31 02:22:23 np0005603623 conmon[76705]: conmon a3daaea1c4114df0d7bd <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a3daaea1c4114df0d7bd3780126ce4339d12b705fc6e2f8ba9fe17209bfab1ea.scope/container/memory.events
Jan 31 02:22:23 np0005603623 podman[76689]: 2026-01-31 07:22:23.845441735 +0000 UTC m=+0.092334368 container died a3daaea1c4114df0d7bd3780126ce4339d12b705fc6e2f8ba9fe17209bfab1ea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_lalande, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Jan 31 02:22:23 np0005603623 podman[76689]: 2026-01-31 07:22:23.770530868 +0000 UTC m=+0.017423501 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:23 np0005603623 podman[76689]: 2026-01-31 07:22:23.873809821 +0000 UTC m=+0.120702434 container remove a3daaea1c4114df0d7bd3780126ce4339d12b705fc6e2f8ba9fe17209bfab1ea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_lalande, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Jan 31 02:22:23 np0005603623 systemd[1]: libpod-conmon-a3daaea1c4114df0d7bd3780126ce4339d12b705fc6e2f8ba9fe17209bfab1ea.scope: Deactivated successfully.
Jan 31 02:22:23 np0005603623 podman[76725]: 2026-01-31 07:22:23.921237156 +0000 UTC m=+0.029650625 container create 3bf11e368cbe989298d82e95f38562fcbf87df4a4cbbe6dc6d7b570427d8afff (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_montalcini, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:22:23 np0005603623 systemd[1]: Started libpod-conmon-3bf11e368cbe989298d82e95f38562fcbf87df4a4cbbe6dc6d7b570427d8afff.scope.
Jan 31 02:22:23 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:23 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d3a87af80a6dfec8efb9d15f0b7b308c57b5debc481fdcff4cc387988bb6382/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:23 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d3a87af80a6dfec8efb9d15f0b7b308c57b5debc481fdcff4cc387988bb6382/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:23 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d3a87af80a6dfec8efb9d15f0b7b308c57b5debc481fdcff4cc387988bb6382/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:23 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d3a87af80a6dfec8efb9d15f0b7b308c57b5debc481fdcff4cc387988bb6382/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:23 np0005603623 podman[76725]: 2026-01-31 07:22:23.986693379 +0000 UTC m=+0.095106868 container init 3bf11e368cbe989298d82e95f38562fcbf87df4a4cbbe6dc6d7b570427d8afff (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_montalcini, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True)
Jan 31 02:22:23 np0005603623 podman[76725]: 2026-01-31 07:22:23.991248011 +0000 UTC m=+0.099661500 container start 3bf11e368cbe989298d82e95f38562fcbf87df4a4cbbe6dc6d7b570427d8afff (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_montalcini, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Jan 31 02:22:23 np0005603623 podman[76725]: 2026-01-31 07:22:23.995901804 +0000 UTC m=+0.104315303 container attach 3bf11e368cbe989298d82e95f38562fcbf87df4a4cbbe6dc6d7b570427d8afff (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_montalcini, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Jan 31 02:22:24 np0005603623 podman[76725]: 2026-01-31 07:22:23.907219713 +0000 UTC m=+0.015633182 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:24 np0005603623 systemd[1]: libpod-3bf11e368cbe989298d82e95f38562fcbf87df4a4cbbe6dc6d7b570427d8afff.scope: Deactivated successfully.
Jan 31 02:22:24 np0005603623 podman[76725]: 2026-01-31 07:22:24.0912922 +0000 UTC m=+0.199705669 container died 3bf11e368cbe989298d82e95f38562fcbf87df4a4cbbe6dc6d7b570427d8afff (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_montalcini, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:22:24 np0005603623 podman[76725]: 2026-01-31 07:22:24.129824848 +0000 UTC m=+0.238238317 container remove 3bf11e368cbe989298d82e95f38562fcbf87df4a4cbbe6dc6d7b570427d8afff (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_montalcini, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 31 02:22:24 np0005603623 systemd[1]: libpod-conmon-3bf11e368cbe989298d82e95f38562fcbf87df4a4cbbe6dc6d7b570427d8afff.scope: Deactivated successfully.
Jan 31 02:22:24 np0005603623 systemd[1]: Reloading.
Jan 31 02:22:24 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:24 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:24 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:24 np0005603623 systemd[1]: Reloading.
Jan 31 02:22:24 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:24 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:24 np0005603623 systemd[1]: Reached target All Ceph clusters and services.
Jan 31 02:22:24 np0005603623 systemd[1]: Reloading.
Jan 31 02:22:24 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:24 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:24 np0005603623 systemd[1]: Reached target Ceph cluster 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:22:24 np0005603623 systemd[1]: Reloading.
Jan 31 02:22:24 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:24 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:25 np0005603623 systemd[1]: Reloading.
Jan 31 02:22:25 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:25 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:25 np0005603623 systemd[1]: Created slice Slice /system/ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:22:25 np0005603623 systemd[1]: Reached target System Time Set.
Jan 31 02:22:25 np0005603623 systemd[1]: Reached target System Time Synchronized.
Jan 31 02:22:25 np0005603623 systemd[1]: Starting Ceph mon.compute-2 for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:22:25 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:25 np0005603623 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:22:25 np0005603623 podman[77017]: 2026-01-31 07:22:25.396416027 +0000 UTC m=+0.033781623 container create 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 31 02:22:25 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c273861bede722cd571d5f6210fec9b2cfc091123833ced0075bb6f90f5eb0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:25 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c273861bede722cd571d5f6210fec9b2cfc091123833ced0075bb6f90f5eb0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:25 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c273861bede722cd571d5f6210fec9b2cfc091123833ced0075bb6f90f5eb0/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:25 np0005603623 podman[77017]: 2026-01-31 07:22:25.457712191 +0000 UTC m=+0.095077807 container init 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 31 02:22:25 np0005603623 podman[77017]: 2026-01-31 07:22:25.461578502 +0000 UTC m=+0.098944098 container start 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:22:25 np0005603623 bash[77017]: 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35
Jan 31 02:22:25 np0005603623 podman[77017]: 2026-01-31 07:22:25.380106827 +0000 UTC m=+0.017472443 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:25 np0005603623 systemd[1]: Started Ceph mon.compute-2 for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: set uid:gid to 167:167 (ceph:ceph)
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: pidfile_write: ignore empty --pid-file
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: load: jerasure load: lrc 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: RocksDB version: 7.9.2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Git sha 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: DB SUMMARY
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: DB Session ID:  GEDCETBIRNCTO5ATJU9U
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: CURRENT file:  CURRENT
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: IDENTITY file:  IDENTITY
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                         Options.error_if_exists: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                       Options.create_if_missing: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                         Options.paranoid_checks: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                                     Options.env: 0x557fc5385c40
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                                Options.info_log: 0x557fc5f22fc0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                Options.max_file_opening_threads: 16
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                              Options.statistics: (nil)
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                               Options.use_fsync: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                       Options.max_log_file_size: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                         Options.allow_fallocate: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                        Options.use_direct_reads: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:          Options.create_missing_column_families: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                              Options.db_log_dir: 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                                 Options.wal_dir: 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                   Options.advise_random_on_open: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                    Options.write_buffer_manager: 0x557fc5f32b40
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                            Options.rate_limiter: (nil)
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                  Options.unordered_write: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                               Options.row_cache: None
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                              Options.wal_filter: None
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.allow_ingest_behind: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.two_write_queues: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.manual_wal_flush: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.wal_compression: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.atomic_flush: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                 Options.log_readahead_size: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.allow_data_in_errors: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.db_host_id: __hostname__
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.max_background_jobs: 2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.max_background_compactions: -1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.max_subcompactions: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.max_total_wal_size: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                          Options.max_open_files: -1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                          Options.bytes_per_sync: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:       Options.compaction_readahead_size: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                  Options.max_background_flushes: -1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Compression algorithms supported:
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: 	kZSTD supported: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: 	kXpressCompression supported: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: 	kBZip2Compression supported: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: 	kLZ4Compression supported: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: 	kZlibCompression supported: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: 	kSnappyCompression supported: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:           Options.merge_operator: 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557fc5f22c00)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557fc5f1b1f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:        Options.write_buffer_size: 33554432
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:  Options.max_write_buffer_number: 2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:          Options.compression: NoCompression
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: bff4eee8-7924-415a-b54a-245461f42486
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844145499316, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844145501134, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844145501208, "job": 1, "event": "recovery_finished"}
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x557fc5f44e00
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: DB pointer 0x557fc5fce000
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(???) e0 preinit fsid 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).mds e1 new map
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).mds e1 print_map#012e1#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: -1#012 #012No filesystems configured
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 1 up, 2 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e13 crush map has features 3314933000852226048, adjusting msgr requires
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e13 crush map has features 288514051259236352, adjusting msgr requires
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e13 crush map has features 288514051259236352, adjusting msgr requires
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).osd e13 crush map has features 288514051259236352, adjusting msgr requires
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Added host compute-0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Deploying cephadm binary to compute-1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Added host compute-1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Deploying cephadm binary to compute-2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Added host compute-2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Saving service mon spec with placement compute-0;compute-1;compute-2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Saving service mgr spec with placement compute-0;compute-1;compute-2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Marking host: compute-0 for OSDSpec preview refresh.
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Marking host: compute-1 for OSDSpec preview refresh.
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Saving service osd.default_drive_group spec with placement compute-0;compute-1;compute-2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Updating compute-1:/etc/ceph/ceph.conf
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Updating compute-1:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.conf
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Updating compute-1:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.client.admin.keyring
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Failed to apply mon spec MONSpec.from_json(yaml.safe_load('''service_type: mon#012service_name: mon#012placement:#012  hosts:#012  - compute-0#012  - compute-1#012  - compute-2#012''')): Cannot place <MONSpec for service_name=mon> on compute-2: Unknown hosts
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Failed to apply mgr spec ServiceSpec.from_json(yaml.safe_load('''service_type: mgr#012service_name: mgr#012placement:#012  hosts:#012  - compute-0#012  - compute-1#012  - compute-2#012''')): Cannot place <ServiceSpec for service_name=mgr> on compute-2: Unknown hosts
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Deploying daemon crash.compute-1 on compute-1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Health check failed: Failed to apply 2 service(s): mon,mgr (CEPHADM_APPLY_SPEC_FAIL)
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/2942024132' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "69ce1ba1-37ea-44ee-8e02-ae107b60d956"}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/2942024132' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "69ce1ba1-37ea-44ee-8e02-ae107b60d956"}]': finished
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.101:0/3660948089' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "c7b96aaa-43a0-4c7e-ac49-508c01d627b5"}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.101:0/3660948089' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "c7b96aaa-43a0-4c7e-ac49-508c01d627b5"}]': finished
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Deploying daemon osd.1 on compute-1
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Deploying daemon osd.0 on compute-0
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='osd.0 [v2:192.168.122.100:6802/1347694087,v1:192.168.122.100:6803/1347694087]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='osd.0 [v2:192.168.122.100:6802/1347694087,v1:192.168.122.100:6803/1347694087]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='osd.0 [v2:192.168.122.100:6802/1347694087,v1:192.168.122.100:6803/1347694087]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0068, "args": ["host=compute-0", "root=default"]}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='osd.1 [v2:192.168.122.101:6800/785741871,v1:192.168.122.101:6801/785741871]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='osd.0 [v2:192.168.122.100:6802/1347694087,v1:192.168.122.100:6803/1347694087]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0068, "args": ["host=compute-0", "root=default"]}]': finished
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='osd.1 [v2:192.168.122.101:6800/785741871,v1:192.168.122.101:6801/785741871]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='osd.1 [v2:192.168.122.101:6800/785741871,v1:192.168.122.101:6801/785741871]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0068, "args": ["host=compute-1", "root=default"]}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='osd.1 [v2:192.168.122.101:6800/785741871,v1:192.168.122.101:6801/785741871]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0068, "args": ["host=compute-1", "root=default"]}]': finished
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: OSD bench result of 8596.349487 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Adjusting osd_memory_target on compute-1 to  5247M
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: osd.0 [v2:192.168.122.100:6802/1347694087,v1:192.168.122.100:6803/1347694087] boot
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Adjusting osd_memory_target on compute-0 to 127.9M
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Unable to set osd_memory_target on compute-0 to 134197657: error parsing value: Value '134197657' is below minimum 939524096
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: OSD bench result of 6163.600447 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: osd.1 [v2:192.168.122.101:6800/785741871,v1:192.168.122.101:6801/785741871] boot
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Updating compute-2:/etc/ceph/ceph.conf
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Updating compute-2:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.conf
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Updating compute-2:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.client.admin.keyring
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Deploying daemon mon.compute-2 on compute-2
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: Cluster is now healthy
Jan 31 02:22:25 np0005603623 ceph-mon[77037]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 31 02:22:27 np0005603623 ceph-mon[77037]: mon.compute-2@-1(probing) e1  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 31 02:22:27 np0005603623 ceph-mon[77037]: mon.compute-2@-1(probing) e1  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 31 02:22:27 np0005603623 ceph-mon[77037]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Jan 31 02:22:27 np0005603623 ceph-mon[77037]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 31 02:22:27 np0005603623 ceph-mon[77037]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 31 02:22:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:22:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,created_at=2026-01-31T07:22:24.025618Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026,kernel_version=5.14.0-665.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864300,os=Linux}
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: Deploying daemon mon.compute-1 on compute-1
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: mon.compute-0 calling monitor election
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: mon.compute-2 calling monitor election
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.cdjvtw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 31 02:22:31 np0005603623 podman[77217]: 2026-01-31 07:22:31.171486967 +0000 UTC m=+0.041496605 container create 9514ef7a4c4fe853fff574762f7fe90026e8265385efa1c7c2a1214e2545d96d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_mccarthy, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:31 np0005603623 systemd[1]: Started libpod-conmon-9514ef7a4c4fe853fff574762f7fe90026e8265385efa1c7c2a1214e2545d96d.scope.
Jan 31 02:22:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 31 02:22:31 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:31 np0005603623 ceph-mon[77037]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 31 02:22:31 np0005603623 ceph-mon[77037]: paxos.1).electionLogic(10) init, last seen epoch 10
Jan 31 02:22:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:22:31 np0005603623 podman[77217]: 2026-01-31 07:22:31.244347234 +0000 UTC m=+0.114356882 container init 9514ef7a4c4fe853fff574762f7fe90026e8265385efa1c7c2a1214e2545d96d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_mccarthy, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Jan 31 02:22:31 np0005603623 podman[77217]: 2026-01-31 07:22:31.152678146 +0000 UTC m=+0.022687814 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:31 np0005603623 podman[77217]: 2026-01-31 07:22:31.253515638 +0000 UTC m=+0.123525286 container start 9514ef7a4c4fe853fff574762f7fe90026e8265385efa1c7c2a1214e2545d96d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_mccarthy, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:22:31 np0005603623 podman[77217]: 2026-01-31 07:22:31.256799781 +0000 UTC m=+0.126809449 container attach 9514ef7a4c4fe853fff574762f7fe90026e8265385efa1c7c2a1214e2545d96d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_mccarthy, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:22:31 np0005603623 dreamy_mccarthy[77234]: 167 167
Jan 31 02:22:31 np0005603623 systemd[1]: libpod-9514ef7a4c4fe853fff574762f7fe90026e8265385efa1c7c2a1214e2545d96d.scope: Deactivated successfully.
Jan 31 02:22:31 np0005603623 podman[77217]: 2026-01-31 07:22:31.260275732 +0000 UTC m=+0.130285390 container died 9514ef7a4c4fe853fff574762f7fe90026e8265385efa1c7c2a1214e2545d96d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_mccarthy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:31 np0005603623 systemd[1]: var-lib-containers-storage-overlay-b80cc4bc55f6e958bf06ce36beefdf877c244c80af738267805ef4fea533e20c-merged.mount: Deactivated successfully.
Jan 31 02:22:31 np0005603623 podman[77217]: 2026-01-31 07:22:31.294290871 +0000 UTC m=+0.164300519 container remove 9514ef7a4c4fe853fff574762f7fe90026e8265385efa1c7c2a1214e2545d96d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_mccarthy, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 31 02:22:31 np0005603623 systemd[1]: libpod-conmon-9514ef7a4c4fe853fff574762f7fe90026e8265385efa1c7c2a1214e2545d96d.scope: Deactivated successfully.
Jan 31 02:22:31 np0005603623 systemd[1]: Reloading.
Jan 31 02:22:31 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:31 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:31 np0005603623 systemd[1]: Reloading.
Jan 31 02:22:31 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:31 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:31 np0005603623 systemd[1]: Starting Ceph mgr.compute-2.cdjvtw for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:22:31 np0005603623 podman[77372]: 2026-01-31 07:22:31.936303146 +0000 UTC m=+0.047157559 container create b585f94a3e6bd3e7cba6966f28f858424113275073d51def70eb2b9f5ee89147 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:31 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47b0d4177b4aeaa5e4769712ed74c9081f5497493df11150e7533f19c41d4438/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:31 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47b0d4177b4aeaa5e4769712ed74c9081f5497493df11150e7533f19c41d4438/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:31 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47b0d4177b4aeaa5e4769712ed74c9081f5497493df11150e7533f19c41d4438/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:31 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47b0d4177b4aeaa5e4769712ed74c9081f5497493df11150e7533f19c41d4438/merged/var/lib/ceph/mgr/ceph-compute-2.cdjvtw supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:31 np0005603623 podman[77372]: 2026-01-31 07:22:31.98232643 +0000 UTC m=+0.093180863 container init b585f94a3e6bd3e7cba6966f28f858424113275073d51def70eb2b9f5ee89147 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Jan 31 02:22:31 np0005603623 podman[77372]: 2026-01-31 07:22:31.986250123 +0000 UTC m=+0.097104536 container start b585f94a3e6bd3e7cba6966f28f858424113275073d51def70eb2b9f5ee89147 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Jan 31 02:22:31 np0005603623 bash[77372]: b585f94a3e6bd3e7cba6966f28f858424113275073d51def70eb2b9f5ee89147
Jan 31 02:22:31 np0005603623 podman[77372]: 2026-01-31 07:22:31.918579846 +0000 UTC m=+0.029434279 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:31 np0005603623 systemd[1]: Started Ceph mgr.compute-2.cdjvtw for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:22:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 31 02:22:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 31 02:22:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 31 02:22:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 31 02:22:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 31 02:22:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 31 02:22:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 31 02:22:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:22:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:22:36 np0005603623 ceph-mon[77037]: mon.compute-0 calling monitor election
Jan 31 02:22:36 np0005603623 ceph-mon[77037]: mon.compute-2 calling monitor election
Jan 31 02:22:36 np0005603623 ceph-mon[77037]: mon.compute-1 calling monitor election
Jan 31 02:22:36 np0005603623 ceph-mon[77037]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 31 02:22:36 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 02:22:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 31 02:22:36 np0005603623 ceph-mgr[77391]: set uid:gid to 167:167 (ceph:ceph)
Jan 31 02:22:36 np0005603623 ceph-mgr[77391]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Jan 31 02:22:36 np0005603623 ceph-mgr[77391]: pidfile_write: ignore empty --pid-file
Jan 31 02:22:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 31 02:22:36 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'alerts'
Jan 31 02:22:36 np0005603623 ceph-mgr[77391]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 31 02:22:36 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'balancer'
Jan 31 02:22:36 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:36.931+0000 7f20c3e90140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 31 02:22:37 np0005603623 ceph-mgr[77391]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 31 02:22:37 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'cephadm'
Jan 31 02:22:37 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:37.200+0000 7f20c3e90140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 31 02:22:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.gxjgok", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 31 02:22:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.gxjgok", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 31 02:22:37 np0005603623 ceph-mon[77037]: Deploying daemon mgr.compute-1.gxjgok on compute-1
Jan 31 02:22:38 np0005603623 podman[77557]: 2026-01-31 07:22:38.255516074 +0000 UTC m=+0.034495500 container create 857c49731850bab714eb5b061a5f660d20f3cc0fdae18a1cac5f6621e5451e34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_galois, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:38 np0005603623 systemd[1]: Started libpod-conmon-857c49731850bab714eb5b061a5f660d20f3cc0fdae18a1cac5f6621e5451e34.scope.
Jan 31 02:22:38 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:38 np0005603623 podman[77557]: 2026-01-31 07:22:38.310037731 +0000 UTC m=+0.089017177 container init 857c49731850bab714eb5b061a5f660d20f3cc0fdae18a1cac5f6621e5451e34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_galois, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:22:38 np0005603623 podman[77557]: 2026-01-31 07:22:38.315154513 +0000 UTC m=+0.094133939 container start 857c49731850bab714eb5b061a5f660d20f3cc0fdae18a1cac5f6621e5451e34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_galois, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS)
Jan 31 02:22:38 np0005603623 podman[77557]: 2026-01-31 07:22:38.317829567 +0000 UTC m=+0.096809013 container attach 857c49731850bab714eb5b061a5f660d20f3cc0fdae18a1cac5f6621e5451e34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_galois, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507)
Jan 31 02:22:38 np0005603623 interesting_galois[77573]: 167 167
Jan 31 02:22:38 np0005603623 systemd[1]: libpod-857c49731850bab714eb5b061a5f660d20f3cc0fdae18a1cac5f6621e5451e34.scope: Deactivated successfully.
Jan 31 02:22:38 np0005603623 podman[77557]: 2026-01-31 07:22:38.320524112 +0000 UTC m=+0.099503538 container died 857c49731850bab714eb5b061a5f660d20f3cc0fdae18a1cac5f6621e5451e34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_galois, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True)
Jan 31 02:22:38 np0005603623 podman[77557]: 2026-01-31 07:22:38.239520409 +0000 UTC m=+0.018499845 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:38 np0005603623 systemd[1]: var-lib-containers-storage-overlay-8de365458873cc8b04d200d2aca08e0a759dcdf4562958aebbf2975592137f2e-merged.mount: Deactivated successfully.
Jan 31 02:22:38 np0005603623 podman[77557]: 2026-01-31 07:22:38.347126673 +0000 UTC m=+0.126106099 container remove 857c49731850bab714eb5b061a5f660d20f3cc0fdae18a1cac5f6621e5451e34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_galois, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 31 02:22:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 31 02:22:38 np0005603623 systemd[1]: libpod-conmon-857c49731850bab714eb5b061a5f660d20f3cc0fdae18a1cac5f6621e5451e34.scope: Deactivated successfully.
Jan 31 02:22:38 np0005603623 systemd[1]: Reloading.
Jan 31 02:22:38 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:38 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:38 np0005603623 systemd[1]: Reloading.
Jan 31 02:22:38 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:38 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:38 np0005603623 systemd[1]: Starting Ceph crash.compute-2 for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:22:39 np0005603623 podman[77725]: 2026-01-31 07:22:39.058382715 +0000 UTC m=+0.035745306 container create 263f01735433990d383450c452d6d04582be80ec1a23a77b785899375e1baf60 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 31 02:22:39 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ded8f9711ce411d9ff8aecbe09726d3b6d2d1eeae36d6d98c877068f5ba5f90/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:39 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ded8f9711ce411d9ff8aecbe09726d3b6d2d1eeae36d6d98c877068f5ba5f90/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:39 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ded8f9711ce411d9ff8aecbe09726d3b6d2d1eeae36d6d98c877068f5ba5f90/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:39 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ded8f9711ce411d9ff8aecbe09726d3b6d2d1eeae36d6d98c877068f5ba5f90/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:39 np0005603623 podman[77725]: 2026-01-31 07:22:39.132554368 +0000 UTC m=+0.109917009 container init 263f01735433990d383450c452d6d04582be80ec1a23a77b785899375e1baf60 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 31 02:22:39 np0005603623 podman[77725]: 2026-01-31 07:22:39.138407971 +0000 UTC m=+0.115770602 container start 263f01735433990d383450c452d6d04582be80ec1a23a77b785899375e1baf60 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 02:22:39 np0005603623 podman[77725]: 2026-01-31 07:22:39.041845705 +0000 UTC m=+0.019208316 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:39 np0005603623 bash[77725]: 263f01735433990d383450c452d6d04582be80ec1a23a77b785899375e1baf60
Jan 31 02:22:39 np0005603623 systemd[1]: Started Ceph crash.compute-2 for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:22:39 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'crash'
Jan 31 02:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 31 02:22:39 np0005603623 ceph-mon[77037]: Deploying daemon crash.compute-2 on compute-2
Jan 31 02:22:39 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/1597819399' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 31 02:22:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:22:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:22:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e14 e14: 2 total, 2 up, 2 in
Jan 31 02:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:39.505+0000 7f20c3e90140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 31 02:22:39 np0005603623 ceph-mgr[77391]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 31 02:22:39 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'dashboard'
Jan 31 02:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: 2026-01-31T07:22:39.510+0000 7fe4a8809640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 31 02:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: 2026-01-31T07:22:39.510+0000 7fe4a8809640 -1 AuthRegistry(0x7fe4a00675b0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 31 02:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: 2026-01-31T07:22:39.511+0000 7fe4a8809640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 31 02:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: 2026-01-31T07:22:39.511+0000 7fe4a8809640 -1 AuthRegistry(0x7fe4a8808000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 31 02:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: 2026-01-31T07:22:39.513+0000 7fe4a657e640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 31 02:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: 2026-01-31T07:22:39.514+0000 7fe4a5d7d640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 31 02:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: 2026-01-31T07:22:39.514+0000 7fe4a6d7f640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 31 02:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: 2026-01-31T07:22:39.514+0000 7fe4a8809640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 31 02:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 31 02:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 31 02:22:39 np0005603623 podman[77897]: 2026-01-31 07:22:39.709502205 +0000 UTC m=+0.038433880 container create a091d0ce74e1f7d7360c17b8779617d2da0fd0be1233790cc26d49133402fe67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_mayer, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True)
Jan 31 02:22:39 np0005603623 systemd[1]: Started libpod-conmon-a091d0ce74e1f7d7360c17b8779617d2da0fd0be1233790cc26d49133402fe67.scope.
Jan 31 02:22:39 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:39 np0005603623 podman[77897]: 2026-01-31 07:22:39.784530362 +0000 UTC m=+0.113462037 container init a091d0ce74e1f7d7360c17b8779617d2da0fd0be1233790cc26d49133402fe67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_mayer, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:22:39 np0005603623 podman[77897]: 2026-01-31 07:22:39.79091647 +0000 UTC m=+0.119848145 container start a091d0ce74e1f7d7360c17b8779617d2da0fd0be1233790cc26d49133402fe67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_mayer, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:39 np0005603623 podman[77897]: 2026-01-31 07:22:39.693406358 +0000 UTC m=+0.022338053 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:39 np0005603623 podman[77897]: 2026-01-31 07:22:39.79450937 +0000 UTC m=+0.123441055 container attach a091d0ce74e1f7d7360c17b8779617d2da0fd0be1233790cc26d49133402fe67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_mayer, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Jan 31 02:22:39 np0005603623 affectionate_mayer[77913]: 167 167
Jan 31 02:22:39 np0005603623 systemd[1]: libpod-a091d0ce74e1f7d7360c17b8779617d2da0fd0be1233790cc26d49133402fe67.scope: Deactivated successfully.
Jan 31 02:22:39 np0005603623 podman[77897]: 2026-01-31 07:22:39.795868317 +0000 UTC m=+0.124799992 container died a091d0ce74e1f7d7360c17b8779617d2da0fd0be1233790cc26d49133402fe67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_mayer, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Jan 31 02:22:39 np0005603623 systemd[1]: var-lib-containers-storage-overlay-b22732e6fa285cc754e0e7101a2cd399eabafdeffc3dd0408a29540956b7d6c8-merged.mount: Deactivated successfully.
Jan 31 02:22:39 np0005603623 podman[77897]: 2026-01-31 07:22:39.822273882 +0000 UTC m=+0.151205557 container remove a091d0ce74e1f7d7360c17b8779617d2da0fd0be1233790cc26d49133402fe67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_mayer, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 02:22:39 np0005603623 systemd[1]: libpod-conmon-a091d0ce74e1f7d7360c17b8779617d2da0fd0be1233790cc26d49133402fe67.scope: Deactivated successfully.
Jan 31 02:22:39 np0005603623 podman[77936]: 2026-01-31 07:22:39.952648348 +0000 UTC m=+0.038992956 container create 8270f016a28f46dddde83c72fe0ba7afb474cace32d7f5b7b7c46ac989cbd40c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_goodall, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507)
Jan 31 02:22:39 np0005603623 systemd[1]: Started libpod-conmon-8270f016a28f46dddde83c72fe0ba7afb474cace32d7f5b7b7c46ac989cbd40c.scope.
Jan 31 02:22:40 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:40 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8817074c3b81fa8ff9f7e0fa1aa4ba6ee980c7a187999c42151d58f682b7b5eb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:40 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8817074c3b81fa8ff9f7e0fa1aa4ba6ee980c7a187999c42151d58f682b7b5eb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:40 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8817074c3b81fa8ff9f7e0fa1aa4ba6ee980c7a187999c42151d58f682b7b5eb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:40 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8817074c3b81fa8ff9f7e0fa1aa4ba6ee980c7a187999c42151d58f682b7b5eb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:40 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8817074c3b81fa8ff9f7e0fa1aa4ba6ee980c7a187999c42151d58f682b7b5eb/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:40 np0005603623 podman[77936]: 2026-01-31 07:22:39.935425549 +0000 UTC m=+0.021770177 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:40 np0005603623 podman[77936]: 2026-01-31 07:22:40.056955619 +0000 UTC m=+0.143300227 container init 8270f016a28f46dddde83c72fe0ba7afb474cace32d7f5b7b7c46ac989cbd40c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_goodall, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:22:40 np0005603623 podman[77936]: 2026-01-31 07:22:40.061587698 +0000 UTC m=+0.147932306 container start 8270f016a28f46dddde83c72fe0ba7afb474cace32d7f5b7b7c46ac989cbd40c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_goodall, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Jan 31 02:22:40 np0005603623 podman[77936]: 2026-01-31 07:22:40.072236394 +0000 UTC m=+0.158581052 container attach 8270f016a28f46dddde83c72fe0ba7afb474cace32d7f5b7b7c46ac989cbd40c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_goodall, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Jan 31 02:22:40 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/1597819399' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 31 02:22:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e15 e15: 2 total, 2 up, 2 in
Jan 31 02:22:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e15 _set_new_cache_sizes cache_size:1019938018 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:22:40 np0005603623 elastic_goodall[77953]: --> passed data devices: 0 physical, 1 LVM
Jan 31 02:22:40 np0005603623 elastic_goodall[77953]: --> relative data size: 1.0
Jan 31 02:22:40 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 31 02:22:40 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new d561c1d2-064b-46a8-af35-64503a234a3c
Jan 31 02:22:40 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'devicehealth'
Jan 31 02:22:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "d561c1d2-064b-46a8-af35-64503a234a3c"} v 0) v1
Jan 31 02:22:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/935814870' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d561c1d2-064b-46a8-af35-64503a234a3c"}]: dispatch
Jan 31 02:22:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e16 e16: 3 total, 2 up, 3 in
Jan 31 02:22:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:41.247+0000 7f20c3e90140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 31 02:22:41 np0005603623 ceph-mgr[77391]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 31 02:22:41 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'diskprediction_local'
Jan 31 02:22:41 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 31 02:22:41 np0005603623 lvm[78001]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 31 02:22:41 np0005603623 lvm[78001]: VG ceph_vg0 finished
Jan 31 02:22:41 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Jan 31 02:22:41 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 31 02:22:41 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 31 02:22:41 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 31 02:22:41 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Jan 31 02:22:41 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/3898076589' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 31 02:22:41 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.102:0/935814870' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d561c1d2-064b-46a8-af35-64503a234a3c"}]: dispatch
Jan 31 02:22:41 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d561c1d2-064b-46a8-af35-64503a234a3c"}]: dispatch
Jan 31 02:22:41 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/3898076589' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 31 02:22:41 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "d561c1d2-064b-46a8-af35-64503a234a3c"}]': finished
Jan 31 02:22:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Jan 31 02:22:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2052894036' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Jan 31 02:22:41 np0005603623 elastic_goodall[77953]: stderr: got monmap epoch 3
Jan 31 02:22:41 np0005603623 elastic_goodall[77953]: --> Creating keyring file for osd.2
Jan 31 02:22:41 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Jan 31 02:22:41 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Jan 31 02:22:41 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid d561c1d2-064b-46a8-af35-64503a234a3c --setuser ceph --setgroup ceph
Jan 31 02:22:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 31 02:22:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 31 02:22:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]:  from numpy import show_config as show_numpy_config
Jan 31 02:22:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:41.817+0000 7f20c3e90140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 31 02:22:41 np0005603623 ceph-mgr[77391]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 31 02:22:41 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'influx'
Jan 31 02:22:42 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:42.075+0000 7f20c3e90140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 31 02:22:42 np0005603623 ceph-mgr[77391]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 31 02:22:42 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'insights'
Jan 31 02:22:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e17 e17: 3 total, 2 up, 3 in
Jan 31 02:22:42 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'iostat'
Jan 31 02:22:42 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/3326848097' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 31 02:22:42 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/3326848097' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 31 02:22:42 np0005603623 ceph-mon[77037]: Health check failed: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 31 02:22:42 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:42.543+0000 7f20c3e90140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 31 02:22:42 np0005603623 ceph-mgr[77391]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 31 02:22:42 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'k8sevents'
Jan 31 02:22:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e18 e18: 3 total, 2 up, 3 in
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: stderr: 2026-01-31T07:22:41.829+0000 7f3a66073740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: stderr: 2026-01-31T07:22:41.829+0000 7f3a66073740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: stderr: 2026-01-31T07:22:41.829+0000 7f3a66073740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: stderr: 2026-01-31T07:22:41.829+0000 7f3a66073740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 31 02:22:44 np0005603623 elastic_goodall[77953]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 31 02:22:44 np0005603623 systemd[1]: libpod-8270f016a28f46dddde83c72fe0ba7afb474cace32d7f5b7b7c46ac989cbd40c.scope: Deactivated successfully.
Jan 31 02:22:44 np0005603623 systemd[1]: libpod-8270f016a28f46dddde83c72fe0ba7afb474cace32d7f5b7b7c46ac989cbd40c.scope: Consumed 2.053s CPU time.
Jan 31 02:22:44 np0005603623 podman[78921]: 2026-01-31 07:22:44.168563548 +0000 UTC m=+0.019984427 container died 8270f016a28f46dddde83c72fe0ba7afb474cace32d7f5b7b7c46ac989cbd40c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_goodall, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:44 np0005603623 systemd[1]: var-lib-containers-storage-overlay-8817074c3b81fa8ff9f7e0fa1aa4ba6ee980c7a187999c42151d58f682b7b5eb-merged.mount: Deactivated successfully.
Jan 31 02:22:44 np0005603623 podman[78921]: 2026-01-31 07:22:44.206841424 +0000 UTC m=+0.058262303 container remove 8270f016a28f46dddde83c72fe0ba7afb474cace32d7f5b7b7c46ac989cbd40c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_goodall, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:44 np0005603623 systemd[1]: libpod-conmon-8270f016a28f46dddde83c72fe0ba7afb474cace32d7f5b7b7c46ac989cbd40c.scope: Deactivated successfully.
Jan 31 02:22:44 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/3012310796' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 31 02:22:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e19 e19: 3 total, 2 up, 3 in
Jan 31 02:22:44 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'localpool'
Jan 31 02:22:44 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'mds_autoscaler'
Jan 31 02:22:44 np0005603623 podman[79075]: 2026-01-31 07:22:44.763763614 +0000 UTC m=+0.036369693 container create 4a814a910e3d15852d32ea06dcb9054a9b1750018cfe43ee56a7353f9b91d06f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_cartwright, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:44 np0005603623 systemd[1]: Started libpod-conmon-4a814a910e3d15852d32ea06dcb9054a9b1750018cfe43ee56a7353f9b91d06f.scope.
Jan 31 02:22:44 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:44 np0005603623 podman[79075]: 2026-01-31 07:22:44.839754547 +0000 UTC m=+0.112360656 container init 4a814a910e3d15852d32ea06dcb9054a9b1750018cfe43ee56a7353f9b91d06f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_cartwright, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default)
Jan 31 02:22:44 np0005603623 podman[79075]: 2026-01-31 07:22:44.748514029 +0000 UTC m=+0.021120128 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:44 np0005603623 podman[79075]: 2026-01-31 07:22:44.848393077 +0000 UTC m=+0.120999156 container start 4a814a910e3d15852d32ea06dcb9054a9b1750018cfe43ee56a7353f9b91d06f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_cartwright, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Jan 31 02:22:44 np0005603623 podman[79075]: 2026-01-31 07:22:44.851641757 +0000 UTC m=+0.124247856 container attach 4a814a910e3d15852d32ea06dcb9054a9b1750018cfe43ee56a7353f9b91d06f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_cartwright, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Jan 31 02:22:44 np0005603623 sad_cartwright[79092]: 167 167
Jan 31 02:22:44 np0005603623 systemd[1]: libpod-4a814a910e3d15852d32ea06dcb9054a9b1750018cfe43ee56a7353f9b91d06f.scope: Deactivated successfully.
Jan 31 02:22:44 np0005603623 conmon[79092]: conmon 4a814a910e3d15852d32 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4a814a910e3d15852d32ea06dcb9054a9b1750018cfe43ee56a7353f9b91d06f.scope/container/memory.events
Jan 31 02:22:44 np0005603623 podman[79075]: 2026-01-31 07:22:44.854586469 +0000 UTC m=+0.127192558 container died 4a814a910e3d15852d32ea06dcb9054a9b1750018cfe43ee56a7353f9b91d06f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_cartwright, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 31 02:22:44 np0005603623 systemd[1]: var-lib-containers-storage-overlay-d0c78fb574ba6472ee7e2790366db5b7a12e1cf94ac5736e130a24862670f52f-merged.mount: Deactivated successfully.
Jan 31 02:22:44 np0005603623 podman[79075]: 2026-01-31 07:22:44.88841096 +0000 UTC m=+0.161017039 container remove 4a814a910e3d15852d32ea06dcb9054a9b1750018cfe43ee56a7353f9b91d06f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_cartwright, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:22:44 np0005603623 systemd[1]: libpod-conmon-4a814a910e3d15852d32ea06dcb9054a9b1750018cfe43ee56a7353f9b91d06f.scope: Deactivated successfully.
Jan 31 02:22:45 np0005603623 podman[79116]: 2026-01-31 07:22:45.001369562 +0000 UTC m=+0.037948087 container create a1ef6fc198e11921afa86dbdfd19ccce4d16976a834dffa4aae181d688d13720 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_villani, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True)
Jan 31 02:22:45 np0005603623 systemd[1]: Started libpod-conmon-a1ef6fc198e11921afa86dbdfd19ccce4d16976a834dffa4aae181d688d13720.scope.
Jan 31 02:22:45 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:45 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8a16eb099951ae56927d91150b0e4ab2b431fb0b56bb7368f055288057cff03/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:45 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8a16eb099951ae56927d91150b0e4ab2b431fb0b56bb7368f055288057cff03/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:45 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8a16eb099951ae56927d91150b0e4ab2b431fb0b56bb7368f055288057cff03/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:45 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8a16eb099951ae56927d91150b0e4ab2b431fb0b56bb7368f055288057cff03/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:45 np0005603623 podman[79116]: 2026-01-31 07:22:44.98221337 +0000 UTC m=+0.018791905 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:45 np0005603623 podman[79116]: 2026-01-31 07:22:45.082518529 +0000 UTC m=+0.119097074 container init a1ef6fc198e11921afa86dbdfd19ccce4d16976a834dffa4aae181d688d13720 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_villani, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:22:45 np0005603623 podman[79116]: 2026-01-31 07:22:45.087548499 +0000 UTC m=+0.124127024 container start a1ef6fc198e11921afa86dbdfd19ccce4d16976a834dffa4aae181d688d13720 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_villani, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 02:22:45 np0005603623 podman[79116]: 2026-01-31 07:22:45.091583112 +0000 UTC m=+0.128161667 container attach a1ef6fc198e11921afa86dbdfd19ccce4d16976a834dffa4aae181d688d13720 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_villani, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:22:45 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'mirroring'
Jan 31 02:22:45 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/3012310796' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 31 02:22:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:45 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/1249872425' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 31 02:22:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e20 e20: 3 total, 2 up, 3 in
Jan 31 02:22:45 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'nfs'
Jan 31 02:22:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e20 _set_new_cache_sizes cache_size:1020053332 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:22:45 np0005603623 gracious_villani[79133]: {
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:    "2": [
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:        {
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            "devices": [
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "/dev/loop3"
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            ],
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            "lv_name": "ceph_lv0",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            "lv_size": "7511998464",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=OZ6nUb-N1Zr-SPEz-Ai0O-q02q-jYcM-epsNuh,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=2f5ab832-5f2e-5a84-bd93-cf8bab960ee2,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=d561c1d2-064b-46a8-af35-64503a234a3c,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            "lv_uuid": "OZ6nUb-N1Zr-SPEz-Ai0O-q02q-jYcM-epsNuh",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            "name": "ceph_lv0",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            "path": "/dev/ceph_vg0/ceph_lv0",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            "tags": {
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "ceph.block_uuid": "OZ6nUb-N1Zr-SPEz-Ai0O-q02q-jYcM-epsNuh",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "ceph.cephx_lockbox_secret": "",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "ceph.cluster_fsid": "2f5ab832-5f2e-5a84-bd93-cf8bab960ee2",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "ceph.cluster_name": "ceph",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "ceph.crush_device_class": "",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "ceph.encrypted": "0",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "ceph.osd_fsid": "d561c1d2-064b-46a8-af35-64503a234a3c",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "ceph.osd_id": "2",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "ceph.osdspec_affinity": "default_drive_group",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "ceph.type": "block",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:                "ceph.vdo": "0"
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            },
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            "type": "block",
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:            "vg_name": "ceph_vg0"
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:        }
Jan 31 02:22:45 np0005603623 gracious_villani[79133]:    ]
Jan 31 02:22:45 np0005603623 gracious_villani[79133]: }
Jan 31 02:22:45 np0005603623 systemd[1]: libpod-a1ef6fc198e11921afa86dbdfd19ccce4d16976a834dffa4aae181d688d13720.scope: Deactivated successfully.
Jan 31 02:22:45 np0005603623 podman[79142]: 2026-01-31 07:22:45.882731956 +0000 UTC m=+0.028815733 container died a1ef6fc198e11921afa86dbdfd19ccce4d16976a834dffa4aae181d688d13720 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_villani, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 31 02:22:45 np0005603623 systemd[1]: var-lib-containers-storage-overlay-f8a16eb099951ae56927d91150b0e4ab2b431fb0b56bb7368f055288057cff03-merged.mount: Deactivated successfully.
Jan 31 02:22:45 np0005603623 podman[79142]: 2026-01-31 07:22:45.936668497 +0000 UTC m=+0.082752224 container remove a1ef6fc198e11921afa86dbdfd19ccce4d16976a834dffa4aae181d688d13720 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_villani, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 31 02:22:45 np0005603623 systemd[1]: libpod-conmon-a1ef6fc198e11921afa86dbdfd19ccce4d16976a834dffa4aae181d688d13720.scope: Deactivated successfully.
Jan 31 02:22:46 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:46.180+0000 7f20c3e90140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 31 02:22:46 np0005603623 ceph-mgr[77391]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 31 02:22:46 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'orchestrator'
Jan 31 02:22:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e21 e21: 3 total, 2 up, 3 in
Jan 31 02:22:46 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/1249872425' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 31 02:22:46 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 31 02:22:46 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/839230673' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 31 02:22:46 np0005603623 podman[79300]: 2026-01-31 07:22:46.52610096 +0000 UTC m=+0.029699266 container create 1777d2fc27d75ac5e4d075ef1a2001a41cfae14300c4cbd5d7757d9aa082b848 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_euler, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:22:46 np0005603623 systemd[1]: Started libpod-conmon-1777d2fc27d75ac5e4d075ef1a2001a41cfae14300c4cbd5d7757d9aa082b848.scope.
Jan 31 02:22:46 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:46 np0005603623 podman[79300]: 2026-01-31 07:22:46.590744178 +0000 UTC m=+0.094342574 container init 1777d2fc27d75ac5e4d075ef1a2001a41cfae14300c4cbd5d7757d9aa082b848 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_euler, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:22:46 np0005603623 podman[79300]: 2026-01-31 07:22:46.595749488 +0000 UTC m=+0.099347794 container start 1777d2fc27d75ac5e4d075ef1a2001a41cfae14300c4cbd5d7757d9aa082b848 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_euler, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Jan 31 02:22:46 np0005603623 condescending_euler[79316]: 167 167
Jan 31 02:22:46 np0005603623 systemd[1]: libpod-1777d2fc27d75ac5e4d075ef1a2001a41cfae14300c4cbd5d7757d9aa082b848.scope: Deactivated successfully.
Jan 31 02:22:46 np0005603623 podman[79300]: 2026-01-31 07:22:46.604664746 +0000 UTC m=+0.108263062 container attach 1777d2fc27d75ac5e4d075ef1a2001a41cfae14300c4cbd5d7757d9aa082b848 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_euler, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:46 np0005603623 podman[79300]: 2026-01-31 07:22:46.605225161 +0000 UTC m=+0.108823477 container died 1777d2fc27d75ac5e4d075ef1a2001a41cfae14300c4cbd5d7757d9aa082b848 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_euler, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Jan 31 02:22:46 np0005603623 podman[79300]: 2026-01-31 07:22:46.512926474 +0000 UTC m=+0.016524800 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:46 np0005603623 systemd[1]: var-lib-containers-storage-overlay-3416c64fd617a172a6fb70fd9cb71ff6bfbf6e305dc35a6329822d44f775c334-merged.mount: Deactivated successfully.
Jan 31 02:22:46 np0005603623 podman[79300]: 2026-01-31 07:22:46.63896159 +0000 UTC m=+0.142559906 container remove 1777d2fc27d75ac5e4d075ef1a2001a41cfae14300c4cbd5d7757d9aa082b848 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_euler, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:22:46 np0005603623 systemd[1]: libpod-conmon-1777d2fc27d75ac5e4d075ef1a2001a41cfae14300c4cbd5d7757d9aa082b848.scope: Deactivated successfully.
Jan 31 02:22:46 np0005603623 podman[79348]: 2026-01-31 07:22:46.846441871 +0000 UTC m=+0.049485078 container create b62d1fd29ad77350434d3c7a98ebeb54f75b66b82f873ec2b57f4d4a7342d704 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate-test, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:46 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:46.865+0000 7f20c3e90140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 31 02:22:46 np0005603623 ceph-mgr[77391]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 31 02:22:46 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'osd_perf_query'
Jan 31 02:22:46 np0005603623 systemd[1]: Started libpod-conmon-b62d1fd29ad77350434d3c7a98ebeb54f75b66b82f873ec2b57f4d4a7342d704.scope.
Jan 31 02:22:46 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:46 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcc22b5e2f3433b9f97d440e624399a35bda2c933d8a3ba03589f007563a3cb1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:46 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcc22b5e2f3433b9f97d440e624399a35bda2c933d8a3ba03589f007563a3cb1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:46 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcc22b5e2f3433b9f97d440e624399a35bda2c933d8a3ba03589f007563a3cb1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:46 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcc22b5e2f3433b9f97d440e624399a35bda2c933d8a3ba03589f007563a3cb1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:46 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcc22b5e2f3433b9f97d440e624399a35bda2c933d8a3ba03589f007563a3cb1/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:46 np0005603623 podman[79348]: 2026-01-31 07:22:46.819784509 +0000 UTC m=+0.022827766 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:46 np0005603623 podman[79348]: 2026-01-31 07:22:46.962286142 +0000 UTC m=+0.165329389 container init b62d1fd29ad77350434d3c7a98ebeb54f75b66b82f873ec2b57f4d4a7342d704 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default)
Jan 31 02:22:46 np0005603623 podman[79348]: 2026-01-31 07:22:46.968399873 +0000 UTC m=+0.171443080 container start b62d1fd29ad77350434d3c7a98ebeb54f75b66b82f873ec2b57f4d4a7342d704 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:46 np0005603623 podman[79348]: 2026-01-31 07:22:46.980844549 +0000 UTC m=+0.183887816 container attach b62d1fd29ad77350434d3c7a98ebeb54f75b66b82f873ec2b57f4d4a7342d704 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:22:47 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:47.142+0000 7f20c3e90140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 31 02:22:47 np0005603623 ceph-mgr[77391]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 31 02:22:47 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'osd_support'
Jan 31 02:22:47 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:47.378+0000 7f20c3e90140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 31 02:22:47 np0005603623 ceph-mgr[77391]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 31 02:22:47 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'pg_autoscaler'
Jan 31 02:22:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e22 e22: 3 total, 2 up, 3 in
Jan 31 02:22:47 np0005603623 ceph-mon[77037]: Deploying daemon osd.2 on compute-2
Jan 31 02:22:47 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/839230673' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 31 02:22:47 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate-test[79365]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Jan 31 02:22:47 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate-test[79365]:                            [--no-systemd] [--no-tmpfs]
Jan 31 02:22:47 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate-test[79365]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 31 02:22:47 np0005603623 systemd[1]: libpod-b62d1fd29ad77350434d3c7a98ebeb54f75b66b82f873ec2b57f4d4a7342d704.scope: Deactivated successfully.
Jan 31 02:22:47 np0005603623 podman[79348]: 2026-01-31 07:22:47.621430796 +0000 UTC m=+0.824474003 container died b62d1fd29ad77350434d3c7a98ebeb54f75b66b82f873ec2b57f4d4a7342d704 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:47 np0005603623 systemd[1]: var-lib-containers-storage-overlay-fcc22b5e2f3433b9f97d440e624399a35bda2c933d8a3ba03589f007563a3cb1-merged.mount: Deactivated successfully.
Jan 31 02:22:47 np0005603623 podman[79348]: 2026-01-31 07:22:47.66259144 +0000 UTC m=+0.865634648 container remove b62d1fd29ad77350434d3c7a98ebeb54f75b66b82f873ec2b57f4d4a7342d704 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate-test, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Jan 31 02:22:47 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:47.667+0000 7f20c3e90140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 31 02:22:47 np0005603623 ceph-mgr[77391]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 31 02:22:47 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'progress'
Jan 31 02:22:47 np0005603623 systemd[1]: libpod-conmon-b62d1fd29ad77350434d3c7a98ebeb54f75b66b82f873ec2b57f4d4a7342d704.scope: Deactivated successfully.
Jan 31 02:22:47 np0005603623 systemd[1]: Reloading.
Jan 31 02:22:47 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:47 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:47 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:47.915+0000 7f20c3e90140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 31 02:22:47 np0005603623 ceph-mgr[77391]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 31 02:22:47 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'prometheus'
Jan 31 02:22:48 np0005603623 systemd[1]: Reloading.
Jan 31 02:22:48 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:48 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:48 np0005603623 systemd[1]: Starting Ceph osd.2 for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:22:48 np0005603623 podman[79526]: 2026-01-31 07:22:48.486081875 +0000 UTC m=+0.040000654 container create 02434747f8151746615c076e0b4c05c863d0b04d790da027f799caf184fa0808 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:48 np0005603623 ceph-mon[77037]: Health check update: 6 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 31 02:22:48 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/1320339162' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Jan 31 02:22:48 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:48 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5019ccfc7cba4cbff3195545de576f94076419fd661f25a7860a76d1ae47dc84/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:48 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5019ccfc7cba4cbff3195545de576f94076419fd661f25a7860a76d1ae47dc84/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:48 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5019ccfc7cba4cbff3195545de576f94076419fd661f25a7860a76d1ae47dc84/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:48 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5019ccfc7cba4cbff3195545de576f94076419fd661f25a7860a76d1ae47dc84/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:48 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5019ccfc7cba4cbff3195545de576f94076419fd661f25a7860a76d1ae47dc84/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:48 np0005603623 podman[79526]: 2026-01-31 07:22:48.46754802 +0000 UTC m=+0.021466819 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:48 np0005603623 podman[79526]: 2026-01-31 07:22:48.569087323 +0000 UTC m=+0.123006102 container init 02434747f8151746615c076e0b4c05c863d0b04d790da027f799caf184fa0808 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Jan 31 02:22:48 np0005603623 podman[79526]: 2026-01-31 07:22:48.574281849 +0000 UTC m=+0.128200648 container start 02434747f8151746615c076e0b4c05c863d0b04d790da027f799caf184fa0808 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:22:48 np0005603623 podman[79526]: 2026-01-31 07:22:48.57864965 +0000 UTC m=+0.132568459 container attach 02434747f8151746615c076e0b4c05c863d0b04d790da027f799caf184fa0808 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True)
Jan 31 02:22:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e23 e23: 3 total, 2 up, 3 in
Jan 31 02:22:48 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:48.939+0000 7f20c3e90140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 31 02:22:48 np0005603623 ceph-mgr[77391]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 31 02:22:48 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'rbd_support'
Jan 31 02:22:49 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:49.242+0000 7f20c3e90140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 31 02:22:49 np0005603623 ceph-mgr[77391]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 31 02:22:49 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'restful'
Jan 31 02:22:49 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate[79542]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 31 02:22:49 np0005603623 bash[79526]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 31 02:22:49 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate[79542]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 31 02:22:49 np0005603623 bash[79526]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 31 02:22:49 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate[79542]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 31 02:22:49 np0005603623 bash[79526]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 31 02:22:49 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate[79542]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 31 02:22:49 np0005603623 bash[79526]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 31 02:22:49 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate[79542]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 31 02:22:49 np0005603623 bash[79526]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 31 02:22:49 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate[79542]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 31 02:22:49 np0005603623 bash[79526]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 31 02:22:49 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate[79542]: --> ceph-volume raw activate successful for osd ID: 2
Jan 31 02:22:49 np0005603623 bash[79526]: --> ceph-volume raw activate successful for osd ID: 2
Jan 31 02:22:49 np0005603623 systemd[1]: libpod-02434747f8151746615c076e0b4c05c863d0b04d790da027f799caf184fa0808.scope: Deactivated successfully.
Jan 31 02:22:49 np0005603623 podman[79526]: 2026-01-31 07:22:49.400118868 +0000 UTC m=+0.954037667 container died 02434747f8151746615c076e0b4c05c863d0b04d790da027f799caf184fa0808 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:22:49 np0005603623 systemd[1]: var-lib-containers-storage-overlay-5019ccfc7cba4cbff3195545de576f94076419fd661f25a7860a76d1ae47dc84-merged.mount: Deactivated successfully.
Jan 31 02:22:49 np0005603623 podman[79526]: 2026-01-31 07:22:49.440175432 +0000 UTC m=+0.994094221 container remove 02434747f8151746615c076e0b4c05c863d0b04d790da027f799caf184fa0808 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2-activate, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Jan 31 02:22:49 np0005603623 podman[79713]: 2026-01-31 07:22:49.591919873 +0000 UTC m=+0.034184592 container create a994b8021cb25b07227e64a3d84eae16f1cb45dc13d62a6996d41ff793e2b6c4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507)
Jan 31 02:22:49 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2d12a47be2dc87686d7edcb3c267edbf2af28ec90a3d08790adb37330a67e1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:49 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2d12a47be2dc87686d7edcb3c267edbf2af28ec90a3d08790adb37330a67e1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:49 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2d12a47be2dc87686d7edcb3c267edbf2af28ec90a3d08790adb37330a67e1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:49 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2d12a47be2dc87686d7edcb3c267edbf2af28ec90a3d08790adb37330a67e1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:49 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2d12a47be2dc87686d7edcb3c267edbf2af28ec90a3d08790adb37330a67e1/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:49 np0005603623 podman[79713]: 2026-01-31 07:22:49.649093473 +0000 UTC m=+0.091358272 container init a994b8021cb25b07227e64a3d84eae16f1cb45dc13d62a6996d41ff793e2b6c4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Jan 31 02:22:49 np0005603623 podman[79713]: 2026-01-31 07:22:49.652648991 +0000 UTC m=+0.094913740 container start a994b8021cb25b07227e64a3d84eae16f1cb45dc13d62a6996d41ff793e2b6c4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 31 02:22:49 np0005603623 bash[79713]: a994b8021cb25b07227e64a3d84eae16f1cb45dc13d62a6996d41ff793e2b6c4
Jan 31 02:22:49 np0005603623 podman[79713]: 2026-01-31 07:22:49.576179885 +0000 UTC m=+0.018444644 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:49 np0005603623 systemd[1]: Started Ceph osd.2 for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: set uid:gid to 167:167 (ceph:ceph)
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: pidfile_write: ignore empty --pid-file
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: bdev(0x55e60fcb5800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: bdev(0x55e60fcb5800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: bdev(0x55e60fcb5800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: bdev(0x55e60fcb5800 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0400 /var/lib/ceph/osd/ceph-2/block) close
Jan 31 02:22:49 np0005603623 ceph-osd[79732]: bdev(0x55e60fcb5800 /var/lib/ceph/osd/ceph-2/block) close
Jan 31 02:22:49 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'rgw'
Jan 31 02:22:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e24 e24: 3 total, 2 up, 3 in
Jan 31 02:22:50 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/1320339162' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: load: jerasure load: lrc 
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) close
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) close
Jan 31 02:22:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e24 _set_new_cache_sizes cache_size:1020054715 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:22:50 np0005603623 ceph-mgr[77391]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 31 02:22:50 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'rook'
Jan 31 02:22:50 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:50.633+0000 7f20c3e90140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 31 02:22:50 np0005603623 podman[79897]: 2026-01-31 07:22:50.660095893 +0000 UTC m=+0.035275233 container create 96629874dbd821d82a741153cb5dce9c8bb5f1fbbcabe42c28bf2d69410edc8d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_curran, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Jan 31 02:22:50 np0005603623 systemd[1]: Started libpod-conmon-96629874dbd821d82a741153cb5dce9c8bb5f1fbbcabe42c28bf2d69410edc8d.scope.
Jan 31 02:22:50 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:50 np0005603623 podman[79897]: 2026-01-31 07:22:50.735067088 +0000 UTC m=+0.110246438 container init 96629874dbd821d82a741153cb5dce9c8bb5f1fbbcabe42c28bf2d69410edc8d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_curran, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:22:50 np0005603623 podman[79897]: 2026-01-31 07:22:50.643710467 +0000 UTC m=+0.018889817 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:50 np0005603623 podman[79897]: 2026-01-31 07:22:50.741090726 +0000 UTC m=+0.116270056 container start 96629874dbd821d82a741153cb5dce9c8bb5f1fbbcabe42c28bf2d69410edc8d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_curran, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Jan 31 02:22:50 np0005603623 podman[79897]: 2026-01-31 07:22:50.7441165 +0000 UTC m=+0.119295860 container attach 96629874dbd821d82a741153cb5dce9c8bb5f1fbbcabe42c28bf2d69410edc8d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_curran, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Jan 31 02:22:50 np0005603623 pedantic_curran[79914]: 167 167
Jan 31 02:22:50 np0005603623 systemd[1]: libpod-96629874dbd821d82a741153cb5dce9c8bb5f1fbbcabe42c28bf2d69410edc8d.scope: Deactivated successfully.
Jan 31 02:22:50 np0005603623 podman[79897]: 2026-01-31 07:22:50.745957251 +0000 UTC m=+0.121136581 container died 96629874dbd821d82a741153cb5dce9c8bb5f1fbbcabe42c28bf2d69410edc8d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_curran, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True)
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610ac0000 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610cc8400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610cc8400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610cc8400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610cc8400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluefs mount
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluefs mount shared_bdev_used = 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: RocksDB version: 7.9.2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Git sha 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: DB SUMMARY
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: DB Session ID:  AXTAAW96B2SYT6S2KP0E
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: CURRENT file:  CURRENT
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: IDENTITY file:  IDENTITY
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                         Options.error_if_exists: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.create_if_missing: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                         Options.paranoid_checks: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                                     Options.env: 0x55e610b4fdc0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                                Options.info_log: 0x55e60fd32ba0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_file_opening_threads: 16
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                              Options.statistics: (nil)
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.use_fsync: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.max_log_file_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                         Options.allow_fallocate: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.use_direct_reads: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.create_missing_column_families: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                              Options.db_log_dir: 
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                                 Options.wal_dir: db.wal
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.advise_random_on_open: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.write_buffer_manager: 0x55e610c42460
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                            Options.rate_limiter: (nil)
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.unordered_write: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.row_cache: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                              Options.wal_filter: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.allow_ingest_behind: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.two_write_queues: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.manual_wal_flush: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.wal_compression: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.atomic_flush: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.log_readahead_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.allow_data_in_errors: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.db_host_id: __hostname__
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.max_background_jobs: 4
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.max_background_compactions: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.max_subcompactions: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.max_open_files: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.bytes_per_sync: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.max_background_flushes: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Compression algorithms supported:
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: 	kZSTD supported: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: 	kXpressCompression supported: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: 	kBZip2Compression supported: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: 	kLZ4Compression supported: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: 	kZlibCompression supported: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: 	kSnappyCompression supported: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd32600)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e60fd28dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd32600)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e60fd28dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd32600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e60fd28dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd32600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e60fd28dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:50 np0005603623 systemd[1]: var-lib-containers-storage-overlay-c0d37f8b52a36cd2b2d7bba5cc929fb02d2857db447737974d4124ffb0e7a497-merged.mount: Deactivated successfully.
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd32600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e60fd28dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd32600)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e60fd28dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd32600)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e60fd28dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd325c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e60fd28430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd325c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e60fd28430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd325c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e60fd28430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:50 np0005603623 podman[79897]: 2026-01-31 07:22:50.774565517 +0000 UTC m=+0.149744847 container remove 96629874dbd821d82a741153cb5dce9c8bb5f1fbbcabe42c28bf2d69410edc8d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_curran, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1128e447-2aad-4aee-9439-bbc728e05b8f
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844170782385, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844170782763, "job": 1, "event": "recovery_finished"}
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: freelist init
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: freelist _read_cfg
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bluefs umount
Jan 31 02:22:50 np0005603623 ceph-osd[79732]: bdev(0x55e610cc8400 /var/lib/ceph/osd/ceph-2/block) close
Jan 31 02:22:50 np0005603623 systemd[1]: libpod-conmon-96629874dbd821d82a741153cb5dce9c8bb5f1fbbcabe42c28bf2d69410edc8d.scope: Deactivated successfully.
Jan 31 02:22:50 np0005603623 podman[80131]: 2026-01-31 07:22:50.893070432 +0000 UTC m=+0.038078259 container create 12cb47cacd9e1dbfa02d48d1652ae5cce810164b13904acd85c5c7bae18315d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=charming_fermat, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:22:50 np0005603623 systemd[1]: Started libpod-conmon-12cb47cacd9e1dbfa02d48d1652ae5cce810164b13904acd85c5c7bae18315d5.scope.
Jan 31 02:22:50 np0005603623 podman[80131]: 2026-01-31 07:22:50.872441929 +0000 UTC m=+0.017449756 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:50 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:50 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e848f514a55f65f6c591617249da422727d1d0f3f2568fef26b4a5f0ab682680/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:50 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e848f514a55f65f6c591617249da422727d1d0f3f2568fef26b4a5f0ab682680/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:50 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e848f514a55f65f6c591617249da422727d1d0f3f2568fef26b4a5f0ab682680/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:50 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e848f514a55f65f6c591617249da422727d1d0f3f2568fef26b4a5f0ab682680/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:50 np0005603623 podman[80131]: 2026-01-31 07:22:50.995161443 +0000 UTC m=+0.140169260 container init 12cb47cacd9e1dbfa02d48d1652ae5cce810164b13904acd85c5c7bae18315d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=charming_fermat, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:51 np0005603623 podman[80131]: 2026-01-31 07:22:50.999898844 +0000 UTC m=+0.144906671 container start 12cb47cacd9e1dbfa02d48d1652ae5cce810164b13904acd85c5c7bae18315d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=charming_fermat, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3)
Jan 31 02:22:51 np0005603623 podman[80131]: 2026-01-31 07:22:51.003445263 +0000 UTC m=+0.148453080 container attach 12cb47cacd9e1dbfa02d48d1652ae5cce810164b13904acd85c5c7bae18315d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=charming_fermat, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: bdev(0x55e610cc8400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: bdev(0x55e610cc8400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: bdev(0x55e610cc8400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: bdev(0x55e610cc8400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: bluefs mount
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: bluefs mount shared_bdev_used = 4718592
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: RocksDB version: 7.9.2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Git sha 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: DB SUMMARY
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: DB Session ID:  AXTAAW96B2SYT6S2KP0F
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: CURRENT file:  CURRENT
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: IDENTITY file:  IDENTITY
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                         Options.error_if_exists: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.create_if_missing: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                         Options.paranoid_checks: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                                     Options.env: 0x55e60fe80000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                                Options.info_log: 0x55e60fd0f580
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_file_opening_threads: 16
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                              Options.statistics: (nil)
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.use_fsync: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.max_log_file_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                         Options.allow_fallocate: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.use_direct_reads: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.create_missing_column_families: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                              Options.db_log_dir: 
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                                 Options.wal_dir: db.wal
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.advise_random_on_open: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.write_buffer_manager: 0x55e610c42960
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                            Options.rate_limiter: (nil)
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.unordered_write: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.row_cache: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                              Options.wal_filter: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.allow_ingest_behind: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.two_write_queues: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.manual_wal_flush: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.wal_compression: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.atomic_flush: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.log_readahead_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.allow_data_in_errors: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.db_host_id: __hostname__
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.max_background_jobs: 4
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.max_background_compactions: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.max_subcompactions: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.max_open_files: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.bytes_per_sync: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.max_background_flushes: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Compression algorithms supported:
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: 	kZSTD supported: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: 	kXpressCompression supported: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: 	kBZip2Compression supported: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: 	kLZ4Compression supported: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: 	kZlibCompression supported: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: 	kSnappyCompression supported: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd33220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e60fd28f30
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd33220)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e60fd28f30#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd33220)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e60fd28f30#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd33220)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e60fd28f30#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd33220)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e60fd28f30#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd33220)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e60fd28f30#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd33220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e60fd28f30
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd33100)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e60fd29350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd33100)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e60fd29350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e60fd33100)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e60fd29350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1128e447-2aad-4aee-9439-bbc728e05b8f
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844171073746, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844171078939, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844171, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1128e447-2aad-4aee-9439-bbc728e05b8f", "db_session_id": "AXTAAW96B2SYT6S2KP0F", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844171082210, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844171, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1128e447-2aad-4aee-9439-bbc728e05b8f", "db_session_id": "AXTAAW96B2SYT6S2KP0F", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844171085557, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844171, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1128e447-2aad-4aee-9439-bbc728e05b8f", "db_session_id": "AXTAAW96B2SYT6S2KP0F", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844171087541, "job": 1, "event": "recovery_finished"}
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 31 02:22:51 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/218093747' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Jan 31 02:22:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:51 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/218093747' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Jan 31 02:22:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:51 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/1045044254' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55e60fdf8700
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: DB pointer 0x55e610c2ba00
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e60fd28f30#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e60fd28f30#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e60fd28f30#2 capacity: 460.80 MB usag
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: _get_class not permitted to load lua
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: _get_class not permitted to load sdk
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: _get_class not permitted to load test_remote_reads
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: osd.2 0 load_pgs
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: osd.2 0 load_pgs opened 0 pgs
Jan 31 02:22:51 np0005603623 ceph-osd[79732]: osd.2 0 log_to_monitors true
Jan 31 02:22:51 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2[79728]: 2026-01-31T07:22:51.121+0000 7f72e4de6740 -1 osd.2 0 log_to_monitors true
Jan 31 02:22:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0) v1
Jan 31 02:22:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/1205784752,v1:192.168.122.102:6801/1205784752]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 31 02:22:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e25 e25: 3 total, 2 up, 3 in
Jan 31 02:22:51 np0005603623 charming_fermat[80149]: {
Jan 31 02:22:51 np0005603623 charming_fermat[80149]:    "d561c1d2-064b-46a8-af35-64503a234a3c": {
Jan 31 02:22:51 np0005603623 charming_fermat[80149]:        "ceph_fsid": "2f5ab832-5f2e-5a84-bd93-cf8bab960ee2",
Jan 31 02:22:51 np0005603623 charming_fermat[80149]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Jan 31 02:22:51 np0005603623 charming_fermat[80149]:        "osd_id": 2,
Jan 31 02:22:51 np0005603623 charming_fermat[80149]:        "osd_uuid": "d561c1d2-064b-46a8-af35-64503a234a3c",
Jan 31 02:22:51 np0005603623 charming_fermat[80149]:        "type": "bluestore"
Jan 31 02:22:51 np0005603623 charming_fermat[80149]:    }
Jan 31 02:22:51 np0005603623 charming_fermat[80149]: }
Jan 31 02:22:51 np0005603623 systemd[1]: libpod-12cb47cacd9e1dbfa02d48d1652ae5cce810164b13904acd85c5c7bae18315d5.scope: Deactivated successfully.
Jan 31 02:22:51 np0005603623 podman[80131]: 2026-01-31 07:22:51.793727905 +0000 UTC m=+0.938735722 container died 12cb47cacd9e1dbfa02d48d1652ae5cce810164b13904acd85c5c7bae18315d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=charming_fermat, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:52 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 31 02:22:52 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 31 02:22:52 np0005603623 ceph-mon[77037]: from='osd.2 [v2:192.168.122.102:6800/1205784752,v1:192.168.122.102:6801/1205784752]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 31 02:22:52 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/1045044254' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 31 02:22:52 np0005603623 ceph-mon[77037]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 31 02:22:52 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/1692381800' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 31 02:22:52 np0005603623 systemd[1]: var-lib-containers-storage-overlay-e848f514a55f65f6c591617249da422727d1d0f3f2568fef26b4a5f0ab682680-merged.mount: Deactivated successfully.
Jan 31 02:22:52 np0005603623 podman[80131]: 2026-01-31 07:22:52.244895463 +0000 UTC m=+1.389903270 container remove 12cb47cacd9e1dbfa02d48d1652ae5cce810164b13904acd85c5c7bae18315d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=charming_fermat, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 31 02:22:52 np0005603623 systemd[1]: libpod-conmon-12cb47cacd9e1dbfa02d48d1652ae5cce810164b13904acd85c5c7bae18315d5.scope: Deactivated successfully.
Jan 31 02:22:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e26 e26: 3 total, 2 up, 3 in
Jan 31 02:22:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]} v 0) v1
Jan 31 02:22:52 np0005603623 ceph-mon[77037]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/1205784752,v1:192.168.122.102:6801/1205784752]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 31 02:22:52 np0005603623 ceph-mgr[77391]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 31 02:22:52 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'selftest'
Jan 31 02:22:52 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:52.773+0000 7f20c3e90140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 31 02:22:53 np0005603623 ceph-mgr[77391]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 31 02:22:53 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:53.075+0000 7f20c3e90140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 31 02:22:53 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'snap_schedule'
Jan 31 02:22:53 np0005603623 ceph-mon[77037]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 31 02:22:53 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/1692381800' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 31 02:22:53 np0005603623 ceph-mon[77037]: from='osd.2 [v2:192.168.122.102:6800/1205784752,v1:192.168.122.102:6801/1205784752]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 31 02:22:53 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:53 np0005603623 ceph-mon[77037]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 31 02:22:53 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:53 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/3921316751' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 31 02:22:53 np0005603623 ceph-mgr[77391]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 31 02:22:53 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'stats'
Jan 31 02:22:53 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:53.348+0000 7f20c3e90140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 31 02:22:53 np0005603623 podman[80618]: 2026-01-31 07:22:53.351595875 +0000 UTC m=+0.073712752 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 31 02:22:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e27 e27: 3 total, 2 up, 3 in
Jan 31 02:22:53 np0005603623 ceph-osd[79732]: osd.2 0 done with init, starting boot process
Jan 31 02:22:53 np0005603623 ceph-osd[79732]: osd.2 0 start_boot
Jan 31 02:22:53 np0005603623 ceph-osd[79732]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 31 02:22:53 np0005603623 ceph-osd[79732]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 31 02:22:53 np0005603623 ceph-osd[79732]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 31 02:22:53 np0005603623 ceph-osd[79732]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 31 02:22:53 np0005603623 ceph-osd[79732]: osd.2 0  bench count 12288000 bsize 4 KiB
Jan 31 02:22:53 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'status'
Jan 31 02:22:53 np0005603623 podman[80638]: 2026-01-31 07:22:53.687704933 +0000 UTC m=+0.048940522 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:22:53 np0005603623 podman[80618]: 2026-01-31 07:22:53.696513028 +0000 UTC m=+0.418629935 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:22:53 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:53.866+0000 7f20c3e90140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 31 02:22:53 np0005603623 ceph-mgr[77391]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 31 02:22:53 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'telegraf'
Jan 31 02:22:54 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:54.126+0000 7f20c3e90140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 31 02:22:54 np0005603623 ceph-mgr[77391]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 31 02:22:54 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'telemetry'
Jan 31 02:22:54 np0005603623 ceph-mon[77037]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 31 02:22:54 np0005603623 ceph-mon[77037]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]': finished
Jan 31 02:22:54 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/3921316751' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 31 02:22:54 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:54 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:54 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:54 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:54 np0005603623 podman[80842]: 2026-01-31 07:22:54.577613745 +0000 UTC m=+0.057918192 container create b30c1c813294fb9159654f85d2da48478071188a1d014a0038e18f64f95a1b06 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_williams, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2)
Jan 31 02:22:54 np0005603623 systemd[1]: Started libpod-conmon-b30c1c813294fb9159654f85d2da48478071188a1d014a0038e18f64f95a1b06.scope.
Jan 31 02:22:54 np0005603623 podman[80842]: 2026-01-31 07:22:54.539087733 +0000 UTC m=+0.019392190 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:54 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:54 np0005603623 podman[80842]: 2026-01-31 07:22:54.663790491 +0000 UTC m=+0.144094938 container init b30c1c813294fb9159654f85d2da48478071188a1d014a0038e18f64f95a1b06 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_williams, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 31 02:22:54 np0005603623 podman[80842]: 2026-01-31 07:22:54.671168007 +0000 UTC m=+0.151472434 container start b30c1c813294fb9159654f85d2da48478071188a1d014a0038e18f64f95a1b06 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_williams, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS)
Jan 31 02:22:54 np0005603623 hopeful_williams[80858]: 167 167
Jan 31 02:22:54 np0005603623 systemd[1]: libpod-b30c1c813294fb9159654f85d2da48478071188a1d014a0038e18f64f95a1b06.scope: Deactivated successfully.
Jan 31 02:22:54 np0005603623 podman[80842]: 2026-01-31 07:22:54.690082933 +0000 UTC m=+0.170387380 container attach b30c1c813294fb9159654f85d2da48478071188a1d014a0038e18f64f95a1b06 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_williams, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 02:22:54 np0005603623 podman[80842]: 2026-01-31 07:22:54.690976058 +0000 UTC m=+0.171280485 container died b30c1c813294fb9159654f85d2da48478071188a1d014a0038e18f64f95a1b06 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_williams, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:22:54 np0005603623 systemd[1]: var-lib-containers-storage-overlay-b385af16a23cfb7ef7892c9bc7cb9431960aaa8e1447922fa9d7a69aef4b9b08-merged.mount: Deactivated successfully.
Jan 31 02:22:54 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:54.751+0000 7f20c3e90140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 31 02:22:54 np0005603623 ceph-mgr[77391]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 31 02:22:54 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'test_orchestrator'
Jan 31 02:22:54 np0005603623 podman[80842]: 2026-01-31 07:22:54.790820265 +0000 UTC m=+0.271124702 container remove b30c1c813294fb9159654f85d2da48478071188a1d014a0038e18f64f95a1b06 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_williams, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Jan 31 02:22:54 np0005603623 systemd[1]: libpod-conmon-b30c1c813294fb9159654f85d2da48478071188a1d014a0038e18f64f95a1b06.scope: Deactivated successfully.
Jan 31 02:22:54 np0005603623 podman[80883]: 2026-01-31 07:22:54.947005199 +0000 UTC m=+0.039560222 container create a362ea4db2996aa048d730b48acbda66baa651ee9bb1a67a04c6a7ea4522fdd0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_dijkstra, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:55 np0005603623 systemd[1]: Started libpod-conmon-a362ea4db2996aa048d730b48acbda66baa651ee9bb1a67a04c6a7ea4522fdd0.scope.
Jan 31 02:22:55 np0005603623 podman[80883]: 2026-01-31 07:22:54.931110067 +0000 UTC m=+0.023665120 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:55 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:22:55 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd6dc4265d1cf3e549a8e303356123b23290cca7fc3dd94d687b3760e7dc531/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:55 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd6dc4265d1cf3e549a8e303356123b23290cca7fc3dd94d687b3760e7dc531/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:55 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd6dc4265d1cf3e549a8e303356123b23290cca7fc3dd94d687b3760e7dc531/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:55 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd6dc4265d1cf3e549a8e303356123b23290cca7fc3dd94d687b3760e7dc531/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:55 np0005603623 podman[80883]: 2026-01-31 07:22:55.065504935 +0000 UTC m=+0.158059988 container init a362ea4db2996aa048d730b48acbda66baa651ee9bb1a67a04c6a7ea4522fdd0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_dijkstra, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:55 np0005603623 podman[80883]: 2026-01-31 07:22:55.073788706 +0000 UTC m=+0.166343779 container start a362ea4db2996aa048d730b48acbda66baa651ee9bb1a67a04c6a7ea4522fdd0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_dijkstra, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 31 02:22:55 np0005603623 podman[80883]: 2026-01-31 07:22:55.095700015 +0000 UTC m=+0.188255068 container attach a362ea4db2996aa048d730b48acbda66baa651ee9bb1a67a04c6a7ea4522fdd0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_dijkstra, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Jan 31 02:22:55 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:55.411+0000 7f20c3e90140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 31 02:22:55 np0005603623 ceph-mgr[77391]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 31 02:22:55 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'volumes'
Jan 31 02:22:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e28 e28: 3 total, 2 up, 3 in
Jan 31 02:22:55 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/4251382841' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 31 02:22:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e28 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]: [
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:    {
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:        "available": false,
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:        "ceph_device": false,
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:        "lsm_data": {},
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:        "lvs": [],
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:        "path": "/dev/sr0",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:        "rejected_reasons": [
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "Has a FileSystem",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "Insufficient space (<5GB)"
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:        ],
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:        "sys_api": {
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "actuators": null,
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "device_nodes": "sr0",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "devname": "sr0",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "human_readable_size": "482.00 KB",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "id_bus": "ata",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "model": "QEMU DVD-ROM",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "nr_requests": "2",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "parent": "/dev/sr0",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "partitions": {},
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "path": "/dev/sr0",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "removable": "1",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "rev": "2.5+",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "ro": "0",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "rotational": "1",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "sas_address": "",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "sas_device_handle": "",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "scheduler_mode": "mq-deadline",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "sectors": 0,
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "sectorsize": "2048",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "size": 493568.0,
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "support_discard": "2048",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "type": "disk",
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:            "vendor": "QEMU"
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:        }
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]:    }
Jan 31 02:22:56 np0005603623 nifty_dijkstra[80899]: ]
Jan 31 02:22:56 np0005603623 systemd[1]: libpod-a362ea4db2996aa048d730b48acbda66baa651ee9bb1a67a04c6a7ea4522fdd0.scope: Deactivated successfully.
Jan 31 02:22:56 np0005603623 podman[80883]: 2026-01-31 07:22:56.084043964 +0000 UTC m=+1.176598997 container died a362ea4db2996aa048d730b48acbda66baa651ee9bb1a67a04c6a7ea4522fdd0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_dijkstra, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Jan 31 02:22:56 np0005603623 systemd[1]: var-lib-containers-storage-overlay-bdd6dc4265d1cf3e549a8e303356123b23290cca7fc3dd94d687b3760e7dc531-merged.mount: Deactivated successfully.
Jan 31 02:22:56 np0005603623 ceph-osd[79732]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 32.199 iops: 8242.819 elapsed_sec: 0.364
Jan 31 02:22:56 np0005603623 ceph-osd[79732]: log_channel(cluster) log [WRN] : OSD bench result of 8242.819324 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 31 02:22:56 np0005603623 ceph-osd[79732]: osd.2 0 waiting for initial osdmap
Jan 31 02:22:56 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2[79728]: 2026-01-31T07:22:56.124+0000 7f72e0d66640 -1 osd.2 0 waiting for initial osdmap
Jan 31 02:22:56 np0005603623 podman[80883]: 2026-01-31 07:22:56.133168531 +0000 UTC m=+1.225723564 container remove a362ea4db2996aa048d730b48acbda66baa651ee9bb1a67a04c6a7ea4522fdd0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_dijkstra, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Jan 31 02:22:56 np0005603623 ceph-osd[79732]: osd.2 28 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 31 02:22:56 np0005603623 ceph-osd[79732]: osd.2 28 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Jan 31 02:22:56 np0005603623 ceph-osd[79732]: osd.2 28 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 31 02:22:56 np0005603623 ceph-osd[79732]: osd.2 28 check_osdmap_features require_osd_release unknown -> reef
Jan 31 02:22:56 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:56.137+0000 7f20c3e90140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 31 02:22:56 np0005603623 systemd[1]: libpod-conmon-a362ea4db2996aa048d730b48acbda66baa651ee9bb1a67a04c6a7ea4522fdd0.scope: Deactivated successfully.
Jan 31 02:22:56 np0005603623 ceph-mgr[77391]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 31 02:22:56 np0005603623 ceph-mgr[77391]: mgr[py] Loading python module 'zabbix'
Jan 31 02:22:56 np0005603623 ceph-osd[79732]: osd.2 28 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 31 02:22:56 np0005603623 ceph-osd[79732]: osd.2 28 set_numa_affinity not setting numa affinity
Jan 31 02:22:56 np0005603623 ceph-osd[79732]: osd.2 28 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Jan 31 02:22:56 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-2[79728]: 2026-01-31T07:22:56.147+0000 7f72dc38e640 -1 osd.2 28 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 31 02:22:56 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-2-cdjvtw[77387]: 2026-01-31T07:22:56.392+0000 7f20c3e90140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 31 02:22:56 np0005603623 ceph-mgr[77391]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 31 02:22:56 np0005603623 ceph-mgr[77391]: ms_deliver_dispatch: unhandled message 0x5645f6eab080 mon_map magic: 0 v1 from mon.0 v2:192.168.122.100:3300/0
Jan 31 02:22:56 np0005603623 ceph-mgr[77391]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 02:22:56 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/4251382841' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 31 02:22:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Jan 31 02:22:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:22:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e29 e29: 3 total, 3 up, 3 in
Jan 31 02:22:56 np0005603623 ceph-osd[79732]: osd.2 29 state: booting -> active
Jan 31 02:22:57 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 29 pg[5.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=29) [2] r=0 lpr=29 pi=[19,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:22:57 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 29 pg[3.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [2] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:22:57 np0005603623 ceph-mon[77037]: OSD bench result of 8242.819324 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 31 02:22:57 np0005603623 ceph-mon[77037]: Adjusting osd_memory_target on compute-2 to 127.9M
Jan 31 02:22:57 np0005603623 ceph-mon[77037]: Unable to set osd_memory_target on compute-2 to 134203392: error parsing value: Value '134203392' is below minimum 939524096
Jan 31 02:22:57 np0005603623 ceph-mon[77037]: Updating compute-0:/etc/ceph/ceph.conf
Jan 31 02:22:57 np0005603623 ceph-mon[77037]: Updating compute-1:/etc/ceph/ceph.conf
Jan 31 02:22:57 np0005603623 ceph-mon[77037]: Updating compute-2:/etc/ceph/ceph.conf
Jan 31 02:22:57 np0005603623 ceph-mon[77037]: osd.2 [v2:192.168.122.102:6800/1205784752,v1:192.168.122.102:6801/1205784752] boot
Jan 31 02:22:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e30 e30: 3 total, 3 up, 3 in
Jan 31 02:22:57 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 30 pg[5.0( empty local-lis/les=29/30 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=29) [2] r=0 lpr=29 pi=[19,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:22:57 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 30 pg[3.0( empty local-lis/les=29/30 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [2] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: Updating compute-0:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.conf
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: Updating compute-2:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.conf
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: Updating compute-1:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.conf
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: Cluster is now healthy
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config assimilate-conf"} v 0) v1
Jan 31 02:22:58 np0005603623 ceph-mon[77037]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3074196547' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 31 02:22:59 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/3074196547' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 31 02:22:59 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 31 02:22:59 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 31 02:23:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:01 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/3569125940' entity='client.admin' 
Jan 31 02:23:02 np0005603623 ceph-mon[77037]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 31 02:23:02 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:02 np0005603623 ceph-mon[77037]: Saving service ingress.rgw.default spec with placement count:2
Jan 31 02:23:02 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e2 new map
Jan 31 02:23:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e2 print_map#012e2#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:03.855587+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 
Jan 31 02:23:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e31 e31: 3 total, 3 up, 3 in
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.ddmhwk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 31 02:23:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:05 np0005603623 ceph-mon[77037]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 31 02:23:05 np0005603623 ceph-mon[77037]: Reconfiguring mgr.compute-0.ddmhwk (monmap changed)...
Jan 31 02:23:05 np0005603623 ceph-mon[77037]: Reconfiguring daemon mgr.compute-0.ddmhwk on compute-0
Jan 31 02:23:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 31 02:23:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 31 02:23:06 np0005603623 ceph-mon[77037]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 31 02:23:06 np0005603623 ceph-mon[77037]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 31 02:23:06 np0005603623 ceph-mon[77037]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 31 02:23:06 np0005603623 ceph-mon[77037]: Reconfiguring osd.0 (monmap changed)...
Jan 31 02:23:06 np0005603623 ceph-mon[77037]: Reconfiguring daemon osd.0 on compute-0
Jan 31 02:23:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 31 02:23:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 31 02:23:07 np0005603623 ceph-mon[77037]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 31 02:23:07 np0005603623 ceph-mon[77037]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 31 02:23:07 np0005603623 ceph-mon[77037]: Reconfiguring osd.1 (monmap changed)...
Jan 31 02:23:07 np0005603623 ceph-mon[77037]: Reconfiguring daemon osd.1 on compute-1
Jan 31 02:23:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 31 02:23:08 np0005603623 podman[82883]: 2026-01-31 07:23:08.282569141 +0000 UTC m=+0.030872650 container create 9e88b5c0e904683af22efd46066b4245f3d3b56807a478c117f6239933b99d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_kepler, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Jan 31 02:23:08 np0005603623 systemd[1]: Started libpod-conmon-9e88b5c0e904683af22efd46066b4245f3d3b56807a478c117f6239933b99d5f.scope.
Jan 31 02:23:08 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:23:08 np0005603623 podman[82883]: 2026-01-31 07:23:08.332486769 +0000 UTC m=+0.080790308 container init 9e88b5c0e904683af22efd46066b4245f3d3b56807a478c117f6239933b99d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_kepler, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:23:08 np0005603623 podman[82883]: 2026-01-31 07:23:08.339761911 +0000 UTC m=+0.088065420 container start 9e88b5c0e904683af22efd46066b4245f3d3b56807a478c117f6239933b99d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_kepler, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2)
Jan 31 02:23:08 np0005603623 podman[82883]: 2026-01-31 07:23:08.343533196 +0000 UTC m=+0.091836725 container attach 9e88b5c0e904683af22efd46066b4245f3d3b56807a478c117f6239933b99d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_kepler, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:23:08 np0005603623 unruffled_kepler[82899]: 167 167
Jan 31 02:23:08 np0005603623 systemd[1]: libpod-9e88b5c0e904683af22efd46066b4245f3d3b56807a478c117f6239933b99d5f.scope: Deactivated successfully.
Jan 31 02:23:08 np0005603623 podman[82883]: 2026-01-31 07:23:08.344266477 +0000 UTC m=+0.092569986 container died 9e88b5c0e904683af22efd46066b4245f3d3b56807a478c117f6239933b99d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_kepler, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Jan 31 02:23:08 np0005603623 systemd[1]: var-lib-containers-storage-overlay-9e210891458c07f6ad4bcedf08dd194b4ab01f5014c4b4e28c5c9e4d47ecefab-merged.mount: Deactivated successfully.
Jan 31 02:23:08 np0005603623 podman[82883]: 2026-01-31 07:23:08.269180879 +0000 UTC m=+0.017484418 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:08 np0005603623 podman[82883]: 2026-01-31 07:23:08.372312317 +0000 UTC m=+0.120615816 container remove 9e88b5c0e904683af22efd46066b4245f3d3b56807a478c117f6239933b99d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_kepler, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:23:08 np0005603623 systemd[1]: libpod-conmon-9e88b5c0e904683af22efd46066b4245f3d3b56807a478c117f6239933b99d5f.scope: Deactivated successfully.
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/720950555' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/720950555' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: Reconfiguring mgr.compute-2.cdjvtw (monmap changed)...
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.cdjvtw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 31 02:23:08 np0005603623 ceph-mon[77037]: Reconfiguring daemon mgr.compute-2.cdjvtw on compute-2
Jan 31 02:23:08 np0005603623 podman[83035]: 2026-01-31 07:23:08.804472566 +0000 UTC m=+0.039796547 container create ea633eb13c4c39e7fb7f61b8c1acdc78f31e193fbe296fbce9ff4d93f1c553d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_hoover, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:23:08 np0005603623 systemd[1]: Started libpod-conmon-ea633eb13c4c39e7fb7f61b8c1acdc78f31e193fbe296fbce9ff4d93f1c553d4.scope.
Jan 31 02:23:08 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:23:08 np0005603623 podman[83035]: 2026-01-31 07:23:08.860997709 +0000 UTC m=+0.096321720 container init ea633eb13c4c39e7fb7f61b8c1acdc78f31e193fbe296fbce9ff4d93f1c553d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_hoover, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:23:08 np0005603623 podman[83035]: 2026-01-31 07:23:08.867662614 +0000 UTC m=+0.102986595 container start ea633eb13c4c39e7fb7f61b8c1acdc78f31e193fbe296fbce9ff4d93f1c553d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_hoover, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Jan 31 02:23:08 np0005603623 distracted_hoover[83051]: 167 167
Jan 31 02:23:08 np0005603623 systemd[1]: libpod-ea633eb13c4c39e7fb7f61b8c1acdc78f31e193fbe296fbce9ff4d93f1c553d4.scope: Deactivated successfully.
Jan 31 02:23:08 np0005603623 podman[83035]: 2026-01-31 07:23:08.870486483 +0000 UTC m=+0.105810484 container attach ea633eb13c4c39e7fb7f61b8c1acdc78f31e193fbe296fbce9ff4d93f1c553d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_hoover, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:23:08 np0005603623 podman[83035]: 2026-01-31 07:23:08.87148876 +0000 UTC m=+0.106812771 container died ea633eb13c4c39e7fb7f61b8c1acdc78f31e193fbe296fbce9ff4d93f1c553d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_hoover, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 31 02:23:08 np0005603623 podman[83035]: 2026-01-31 07:23:08.78660983 +0000 UTC m=+0.021933841 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:08 np0005603623 systemd[1]: var-lib-containers-storage-overlay-04061e78965c55607b0df911103c712f84a7112e904b50abe095fcd5d53f98a7-merged.mount: Deactivated successfully.
Jan 31 02:23:08 np0005603623 podman[83035]: 2026-01-31 07:23:08.904039906 +0000 UTC m=+0.139363887 container remove ea633eb13c4c39e7fb7f61b8c1acdc78f31e193fbe296fbce9ff4d93f1c553d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_hoover, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:23:08 np0005603623 systemd[1]: libpod-conmon-ea633eb13c4c39e7fb7f61b8c1acdc78f31e193fbe296fbce9ff4d93f1c553d4.scope: Deactivated successfully.
Jan 31 02:23:09 np0005603623 podman[83242]: 2026-01-31 07:23:09.491542307 +0000 UTC m=+0.045218309 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:23:09 np0005603623 podman[83242]: 2026-01-31 07:23:09.582862206 +0000 UTC m=+0.136538208 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:23:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:23:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:23:13 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/992138592' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 31 02:23:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.aejomu", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 31 02:23:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.aejomu", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 31 02:23:16 np0005603623 podman[83601]: 2026-01-31 07:23:16.568481865 +0000 UTC m=+0.033249076 container create ff5dee15a43070d8413ba9b054b5968e124985ebaea3ddad8e727fc1b7c15c69 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_bardeen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507)
Jan 31 02:23:16 np0005603623 systemd[1]: Started libpod-conmon-ff5dee15a43070d8413ba9b054b5968e124985ebaea3ddad8e727fc1b7c15c69.scope.
Jan 31 02:23:16 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:23:16 np0005603623 podman[83601]: 2026-01-31 07:23:16.64559114 +0000 UTC m=+0.110358391 container init ff5dee15a43070d8413ba9b054b5968e124985ebaea3ddad8e727fc1b7c15c69 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_bardeen, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Jan 31 02:23:16 np0005603623 podman[83601]: 2026-01-31 07:23:16.555164975 +0000 UTC m=+0.019932206 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:16 np0005603623 podman[83601]: 2026-01-31 07:23:16.653910742 +0000 UTC m=+0.118677963 container start ff5dee15a43070d8413ba9b054b5968e124985ebaea3ddad8e727fc1b7c15c69 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_bardeen, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True)
Jan 31 02:23:16 np0005603623 podman[83601]: 2026-01-31 07:23:16.656860243 +0000 UTC m=+0.121627464 container attach ff5dee15a43070d8413ba9b054b5968e124985ebaea3ddad8e727fc1b7c15c69 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_bardeen, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Jan 31 02:23:16 np0005603623 magical_bardeen[83617]: 167 167
Jan 31 02:23:16 np0005603623 systemd[1]: libpod-ff5dee15a43070d8413ba9b054b5968e124985ebaea3ddad8e727fc1b7c15c69.scope: Deactivated successfully.
Jan 31 02:23:16 np0005603623 conmon[83617]: conmon ff5dee15a43070d8413b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ff5dee15a43070d8413ba9b054b5968e124985ebaea3ddad8e727fc1b7c15c69.scope/container/memory.events
Jan 31 02:23:16 np0005603623 podman[83601]: 2026-01-31 07:23:16.657990654 +0000 UTC m=+0.122757875 container died ff5dee15a43070d8413ba9b054b5968e124985ebaea3ddad8e727fc1b7c15c69 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_bardeen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Jan 31 02:23:16 np0005603623 systemd[1]: var-lib-containers-storage-overlay-081f74feb793a042357f2288c7303c14cf6c17a9c9803cfb54d2791727ea614a-merged.mount: Deactivated successfully.
Jan 31 02:23:16 np0005603623 podman[83601]: 2026-01-31 07:23:16.693526853 +0000 UTC m=+0.158294094 container remove ff5dee15a43070d8413ba9b054b5968e124985ebaea3ddad8e727fc1b7c15c69 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_bardeen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:23:16 np0005603623 systemd[1]: libpod-conmon-ff5dee15a43070d8413ba9b054b5968e124985ebaea3ddad8e727fc1b7c15c69.scope: Deactivated successfully.
Jan 31 02:23:16 np0005603623 systemd[1]: Reloading.
Jan 31 02:23:16 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:23:16 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:23:17 np0005603623 systemd[1]: Reloading.
Jan 31 02:23:17 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:23:17 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:23:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:17 np0005603623 ceph-mon[77037]: Deploying daemon rgw.rgw.compute-2.aejomu on compute-2
Jan 31 02:23:17 np0005603623 systemd[1]: Starting Ceph rgw.rgw.compute-2.aejomu for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:23:17 np0005603623 podman[83762]: 2026-01-31 07:23:17.393087701 +0000 UTC m=+0.034267954 container create 5b79fb4fe2cbacd662a4389e9317752c0b9a7c92e7ce99005f23ab691258c59b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-rgw-rgw-compute-2-aejomu, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Jan 31 02:23:17 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0b8b1a7660954220fb953fee9f4b02ecf5bf85be94b6d57c911222cb7debd09/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:17 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0b8b1a7660954220fb953fee9f4b02ecf5bf85be94b6d57c911222cb7debd09/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:17 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0b8b1a7660954220fb953fee9f4b02ecf5bf85be94b6d57c911222cb7debd09/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:17 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0b8b1a7660954220fb953fee9f4b02ecf5bf85be94b6d57c911222cb7debd09/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.aejomu supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:17 np0005603623 podman[83762]: 2026-01-31 07:23:17.452185434 +0000 UTC m=+0.093365697 container init 5b79fb4fe2cbacd662a4389e9317752c0b9a7c92e7ce99005f23ab691258c59b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-rgw-rgw-compute-2-aejomu, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:23:17 np0005603623 podman[83762]: 2026-01-31 07:23:17.455596079 +0000 UTC m=+0.096776332 container start 5b79fb4fe2cbacd662a4389e9317752c0b9a7c92e7ce99005f23ab691258c59b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-rgw-rgw-compute-2-aejomu, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 31 02:23:17 np0005603623 bash[83762]: 5b79fb4fe2cbacd662a4389e9317752c0b9a7c92e7ce99005f23ab691258c59b
Jan 31 02:23:17 np0005603623 podman[83762]: 2026-01-31 07:23:17.378835754 +0000 UTC m=+0.020016037 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:17 np0005603623 systemd[1]: Started Ceph rgw.rgw.compute-2.aejomu for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:23:17 np0005603623 radosgw[83781]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 31 02:23:17 np0005603623 radosgw[83781]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process radosgw, pid 2
Jan 31 02:23:17 np0005603623 radosgw[83781]: framework: beast
Jan 31 02:23:17 np0005603623 radosgw[83781]: framework conf key: endpoint, val: 192.168.122.102:8082
Jan 31 02:23:17 np0005603623 radosgw[83781]: init_numa not setting numa affinity
Jan 31 02:23:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.bjsbdg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 31 02:23:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.bjsbdg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 31 02:23:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:18 np0005603623 ceph-mon[77037]: Deploying daemon rgw.rgw.compute-1.bjsbdg on compute-1
Jan 31 02:23:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e32 e32: 3 total, 3 up, 3 in
Jan 31 02:23:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0) v1
Jan 31 02:23:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/665557881' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 31 02:23:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e33 e33: 3 total, 3 up, 3 in
Jan 31 02:23:19 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.102:0/665557881' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 31 02:23:19 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 31 02:23:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.pnpmok", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 31 02:23:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.pnpmok", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 31 02:23:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e34 e34: 3 total, 3 up, 3 in
Jan 31 02:23:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0) v1
Jan 31 02:23:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/665557881' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 31 02:23:20 np0005603623 ceph-mon[77037]: Deploying daemon rgw.rgw.compute-0.pnpmok on compute-0
Jan 31 02:23:20 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 31 02:23:20 np0005603623 ceph-mon[77037]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 31 02:23:20 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.101:0/4082344861' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 31 02:23:20 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 31 02:23:21 np0005603623 podman[83981]: 2026-01-31 07:23:21.212138472 +0000 UTC m=+0.034785658 container create 868da4cf55cefa9b06dde1f1079f81b33777ed9e9405f7872cad72b0392d75ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_liskov, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 31 02:23:21 np0005603623 systemd[1]: Started libpod-conmon-868da4cf55cefa9b06dde1f1079f81b33777ed9e9405f7872cad72b0392d75ee.scope.
Jan 31 02:23:21 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:23:21 np0005603623 podman[83981]: 2026-01-31 07:23:21.291215322 +0000 UTC m=+0.113862528 container init 868da4cf55cefa9b06dde1f1079f81b33777ed9e9405f7872cad72b0392d75ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_liskov, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3)
Jan 31 02:23:21 np0005603623 podman[83981]: 2026-01-31 07:23:21.199089319 +0000 UTC m=+0.021736525 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:21 np0005603623 podman[83981]: 2026-01-31 07:23:21.296070906 +0000 UTC m=+0.118718092 container start 868da4cf55cefa9b06dde1f1079f81b33777ed9e9405f7872cad72b0392d75ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_liskov, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Jan 31 02:23:21 np0005603623 podman[83981]: 2026-01-31 07:23:21.298848544 +0000 UTC m=+0.121495730 container attach 868da4cf55cefa9b06dde1f1079f81b33777ed9e9405f7872cad72b0392d75ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_liskov, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:23:21 np0005603623 exciting_liskov[83997]: 167 167
Jan 31 02:23:21 np0005603623 systemd[1]: libpod-868da4cf55cefa9b06dde1f1079f81b33777ed9e9405f7872cad72b0392d75ee.scope: Deactivated successfully.
Jan 31 02:23:21 np0005603623 conmon[83997]: conmon 868da4cf55cefa9b06dd <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-868da4cf55cefa9b06dde1f1079f81b33777ed9e9405f7872cad72b0392d75ee.scope/container/memory.events
Jan 31 02:23:21 np0005603623 podman[83981]: 2026-01-31 07:23:21.300978524 +0000 UTC m=+0.123625710 container died 868da4cf55cefa9b06dde1f1079f81b33777ed9e9405f7872cad72b0392d75ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_liskov, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:23:21 np0005603623 systemd[1]: var-lib-containers-storage-overlay-45f1433f978b1e869f3f87a3e45dcb671ac3da84dcc155829f400b3765a774c2-merged.mount: Deactivated successfully.
Jan 31 02:23:21 np0005603623 podman[83981]: 2026-01-31 07:23:21.343072614 +0000 UTC m=+0.165719800 container remove 868da4cf55cefa9b06dde1f1079f81b33777ed9e9405f7872cad72b0392d75ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_liskov, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 31 02:23:21 np0005603623 systemd[1]: libpod-conmon-868da4cf55cefa9b06dde1f1079f81b33777ed9e9405f7872cad72b0392d75ee.scope: Deactivated successfully.
Jan 31 02:23:21 np0005603623 systemd[1]: Reloading.
Jan 31 02:23:21 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:23:21 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:23:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e35 e35: 3 total, 3 up, 3 in
Jan 31 02:23:21 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.102:0/665557881' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 31 02:23:21 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 31 02:23:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:21 np0005603623 ceph-mon[77037]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 31 02:23:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.asgtzy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 31 02:23:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.asgtzy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 31 02:23:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:21 np0005603623 systemd[1]: Reloading.
Jan 31 02:23:21 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:23:21 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:23:21 np0005603623 systemd[1]: Starting Ceph mds.cephfs.compute-2.asgtzy for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:23:22 np0005603623 podman[84142]: 2026-01-31 07:23:22.023221662 +0000 UTC m=+0.019249846 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:22 np0005603623 podman[84142]: 2026-01-31 07:23:22.122701128 +0000 UTC m=+0.118729292 container create c2b553744a5ced3412d816354c479efed9cac2c42e8d5b5f2d2c241cbcc1ca44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mds-cephfs-compute-2-asgtzy, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:23:22 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242b7f25c55552b7b4f464cb18d2464d8fbe0b888612441846579aa7c8b60183/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:22 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242b7f25c55552b7b4f464cb18d2464d8fbe0b888612441846579aa7c8b60183/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:22 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242b7f25c55552b7b4f464cb18d2464d8fbe0b888612441846579aa7c8b60183/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:22 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242b7f25c55552b7b4f464cb18d2464d8fbe0b888612441846579aa7c8b60183/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.asgtzy supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:22 np0005603623 podman[84142]: 2026-01-31 07:23:22.257858468 +0000 UTC m=+0.253886642 container init c2b553744a5ced3412d816354c479efed9cac2c42e8d5b5f2d2c241cbcc1ca44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mds-cephfs-compute-2-asgtzy, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:23:22 np0005603623 podman[84142]: 2026-01-31 07:23:22.26226186 +0000 UTC m=+0.258290024 container start c2b553744a5ced3412d816354c479efed9cac2c42e8d5b5f2d2c241cbcc1ca44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mds-cephfs-compute-2-asgtzy, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True)
Jan 31 02:23:22 np0005603623 bash[84142]: c2b553744a5ced3412d816354c479efed9cac2c42e8d5b5f2d2c241cbcc1ca44
Jan 31 02:23:22 np0005603623 systemd[1]: Started Ceph mds.cephfs.compute-2.asgtzy for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: set uid:gid to 167:167 (ceph:ceph)
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: main not setting numa affinity
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: pidfile_write: ignore empty --pid-file
Jan 31 02:23:22 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mds-cephfs-compute-2-asgtzy[84157]: starting mds.cephfs.compute-2.asgtzy at 
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy Updating MDS map to version 2 from mon.1
Jan 31 02:23:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e3 new map
Jan 31 02:23:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e3 print_map#012e3#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:03.855587+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.asgtzy{-1:24157} state up:standby seq 1 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]
Jan 31 02:23:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e36 e36: 3 total, 3 up, 3 in
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy Updating MDS map to version 3 from mon.1
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy Monitors have assigned me to become a standby.
Jan 31 02:23:22 np0005603623 ceph-mon[77037]: Deploying daemon mds.cephfs.compute-2.asgtzy on compute-2
Jan 31 02:23:22 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 31 02:23:22 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 31 02:23:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0) v1
Jan 31 02:23:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/665557881' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 31 02:23:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e4 new map
Jan 31 02:23:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e4 print_map#012e4#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:22.914433+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.asgtzy{0:24157} state up:creating seq 1 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy Updating MDS map to version 4 from mon.1
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.4 handle_mds_map i am now mds.0.4
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.cache creating system inode with ino:0x1
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.cache creating system inode with ino:0x100
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.cache creating system inode with ino:0x600
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.cache creating system inode with ino:0x601
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.cache creating system inode with ino:0x602
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.cache creating system inode with ino:0x603
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.cache creating system inode with ino:0x604
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.cache creating system inode with ino:0x605
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.cache creating system inode with ino:0x606
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.cache creating system inode with ino:0x607
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.cache creating system inode with ino:0x608
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.cache creating system inode with ino:0x609
Jan 31 02:23:22 np0005603623 ceph-mds[84161]: mds.0.4 creating_done
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3587791562' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e37 e37: 3 total, 3 up, 3 in
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jroeqh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: daemon mds.cephfs.compute-2.asgtzy assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/3859572337' entity='client.rgw.rgw.compute-0.pnpmok' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.101:0/4082344861' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.102:0/665557881' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jroeqh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: Deploying daemon mds.cephfs.compute-0.jroeqh on compute-0
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: daemon mds.cephfs.compute-2.asgtzy is now active in filesystem cephfs as rank 0
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e5 new map
Jan 31 02:23:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e5 print_map#012e5#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:23.967653+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.asgtzy{0:24157} state up:active seq 2 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Jan 31 02:23:23 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy Updating MDS map to version 5 from mon.1
Jan 31 02:23:23 np0005603623 ceph-mds[84161]: mds.0.4 handle_mds_map i am now mds.0.4
Jan 31 02:23:23 np0005603623 ceph-mds[84161]: mds.0.4 handle_mds_map state change up:creating --> up:active
Jan 31 02:23:23 np0005603623 ceph-mds[84161]: mds.0.4 recovery_done -- successful recovery!
Jan 31 02:23:23 np0005603623 ceph-mds[84161]: mds.0.4 active_start
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e38 e38: 3 total, 3 up, 3 in
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0) v1
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1907859104' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/3859572337' entity='client.rgw.rgw.compute-0.pnpmok' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.bkrghs", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.bkrghs", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: Deploying daemon mds.cephfs.compute-1.bkrghs on compute-1
Jan 31 02:23:24 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/487760849' entity='client.rgw.rgw.compute-0.pnpmok' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 31 02:23:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e6 new map
Jan 31 02:23:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e6 print_map#012e6#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:23.967653+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.asgtzy{0:24157} state up:active seq 2 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jroeqh{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/105956008,v1:192.168.122.100:6807/105956008] compat {c=[1],r=[1],i=[7ff]}]
Jan 31 02:23:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e7 new map
Jan 31 02:23:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e7 print_map#012e7#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:23.967653+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.asgtzy{0:24157} state up:active seq 2 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jroeqh{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/105956008,v1:192.168.122.100:6807/105956008] compat {c=[1],r=[1],i=[7ff]}]
Jan 31 02:23:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e39 e39: 3 total, 3 up, 3 in
Jan 31 02:23:26 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.102:0/1907859104' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 31 02:23:26 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.101:0/1698301580' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 31 02:23:26 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 31 02:23:26 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 31 02:23:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0) v1
Jan 31 02:23:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1907859104' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 31 02:23:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/487760849' entity='client.rgw.rgw.compute-0.pnpmok' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/487760849' entity='client.rgw.rgw.compute-0.pnpmok' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.102:0/1907859104' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.101:0/1698301580' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: Deploying daemon haproxy.rgw.default.compute-0.evwczw on compute-0
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='client.? 192.168.122.100:0/487760849' entity='client.rgw.rgw.compute-0.pnpmok' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e8 new map
Jan 31 02:23:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e8 print_map#012e8#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:27.037579+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.asgtzy{0:24157} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jroeqh{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/105956008,v1:192.168.122.100:6807/105956008] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.bkrghs{-1:24146} state up:standby seq 1 addr [v2:192.168.122.101:6804/4027255140,v1:192.168.122.101:6805/4027255140] compat {c=[1],r=[1],i=[7ff]}]
Jan 31 02:23:27 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy Updating MDS map to version 8 from mon.1
Jan 31 02:23:27 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-rgw-rgw-compute-2-aejomu[83777]: 2026-01-31T07:23:27.227+0000 7f2dbcbb8940 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 31 02:23:27 np0005603623 radosgw[83781]: LDAP not started since no server URIs were provided in the configuration.
Jan 31 02:23:27 np0005603623 radosgw[83781]: framework: beast
Jan 31 02:23:27 np0005603623 radosgw[83781]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 31 02:23:27 np0005603623 radosgw[83781]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 31 02:23:27 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 31 02:23:27 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 31 02:23:27 np0005603623 radosgw[83781]: starting handler: beast
Jan 31 02:23:27 np0005603623 radosgw[83781]: set uid:gid to 167:167 (ceph:ceph)
Jan 31 02:23:27 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 31 02:23:27 np0005603623 radosgw[83781]: mgrc service_daemon_register rgw.24172 metadata {arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.aejomu,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026,kernel_version=5.14.0-665.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864300,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=4a080c62-ffea-408b-958a-f1cf7a54b487,zone_name=default,zonegroup_id=4b3ae999-cc0f-4a4e-9689-957f89598a27,zonegroup_name=default}
Jan 31 02:23:27 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 31 02:23:27 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 31 02:23:27 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 31 02:23:27 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 31 02:23:27 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 31 02:23:29 np0005603623 ceph-mon[77037]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 31 02:23:29 np0005603623 ceph-mon[77037]: Cluster is now healthy
Jan 31 02:23:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e9 new map
Jan 31 02:23:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e9 print_map#012e9#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:27.037579+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.asgtzy{0:24157} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jroeqh{-1:14409} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/105956008,v1:192.168.122.100:6807/105956008] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.bkrghs{-1:24146} state up:standby seq 1 addr [v2:192.168.122.101:6804/4027255140,v1:192.168.122.101:6805/4027255140] compat {c=[1],r=[1],i=[7ff]}]
Jan 31 02:23:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:31 np0005603623 ceph-mon[77037]: Deploying daemon haproxy.rgw.default.compute-2.yyrexo on compute-2
Jan 31 02:23:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e10 new map
Jan 31 02:23:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).mds e10 print_map#012e10#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:27.037579+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.asgtzy{0:24157} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jroeqh{-1:14409} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/105956008,v1:192.168.122.100:6807/105956008] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.bkrghs{-1:24146} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/4027255140,v1:192.168.122.101:6805/4027255140] compat {c=[1],r=[1],i=[7ff]}]
Jan 31 02:23:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 31 02:23:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:31.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 31 02:23:32 np0005603623 podman[84884]: 2026-01-31 07:23:32.445021712 +0000 UTC m=+1.963019210 container create ed8c7378d49bf1011feb131b218085f49892a4f6847cdd41500a39466c85d883 (image=quay.io/ceph/haproxy:2.3, name=goofy_darwin)
Jan 31 02:23:32 np0005603623 systemd[72574]: Starting Mark boot as successful...
Jan 31 02:23:32 np0005603623 systemd[1]: Started libpod-conmon-ed8c7378d49bf1011feb131b218085f49892a4f6847cdd41500a39466c85d883.scope.
Jan 31 02:23:32 np0005603623 systemd[72574]: Finished Mark boot as successful.
Jan 31 02:23:32 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:23:32 np0005603623 podman[84884]: 2026-01-31 07:23:32.499164988 +0000 UTC m=+2.017162486 container init ed8c7378d49bf1011feb131b218085f49892a4f6847cdd41500a39466c85d883 (image=quay.io/ceph/haproxy:2.3, name=goofy_darwin)
Jan 31 02:23:32 np0005603623 podman[84884]: 2026-01-31 07:23:32.507830669 +0000 UTC m=+2.025828177 container start ed8c7378d49bf1011feb131b218085f49892a4f6847cdd41500a39466c85d883 (image=quay.io/ceph/haproxy:2.3, name=goofy_darwin)
Jan 31 02:23:32 np0005603623 podman[84884]: 2026-01-31 07:23:32.511539522 +0000 UTC m=+2.029537030 container attach ed8c7378d49bf1011feb131b218085f49892a4f6847cdd41500a39466c85d883 (image=quay.io/ceph/haproxy:2.3, name=goofy_darwin)
Jan 31 02:23:32 np0005603623 goofy_darwin[85001]: 0 0
Jan 31 02:23:32 np0005603623 systemd[1]: libpod-ed8c7378d49bf1011feb131b218085f49892a4f6847cdd41500a39466c85d883.scope: Deactivated successfully.
Jan 31 02:23:32 np0005603623 podman[84884]: 2026-01-31 07:23:32.512140449 +0000 UTC m=+2.030137957 container died ed8c7378d49bf1011feb131b218085f49892a4f6847cdd41500a39466c85d883 (image=quay.io/ceph/haproxy:2.3, name=goofy_darwin)
Jan 31 02:23:32 np0005603623 podman[84884]: 2026-01-31 07:23:32.430985712 +0000 UTC m=+1.948983230 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 31 02:23:32 np0005603623 systemd[1]: var-lib-containers-storage-overlay-d6d60f8adf2fd9460132d0e7fd0ee226eea80f1a73d6c8d860038c825ae8c1c5-merged.mount: Deactivated successfully.
Jan 31 02:23:32 np0005603623 podman[84884]: 2026-01-31 07:23:32.544802987 +0000 UTC m=+2.062800485 container remove ed8c7378d49bf1011feb131b218085f49892a4f6847cdd41500a39466c85d883 (image=quay.io/ceph/haproxy:2.3, name=goofy_darwin)
Jan 31 02:23:32 np0005603623 systemd[1]: libpod-conmon-ed8c7378d49bf1011feb131b218085f49892a4f6847cdd41500a39466c85d883.scope: Deactivated successfully.
Jan 31 02:23:32 np0005603623 systemd[1]: Reloading.
Jan 31 02:23:32 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:23:32 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:23:32 np0005603623 systemd[1]: Reloading.
Jan 31 02:23:32 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:23:32 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:23:33 np0005603623 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.yyrexo for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:23:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:33.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:33 np0005603623 podman[85147]: 2026-01-31 07:23:33.257664135 +0000 UTC m=+0.043171612 container create dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:23:33 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc899a870e91d7342096164774479d62302d6a0c348daf6c0a27aa0aeb639555/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:33 np0005603623 podman[85147]: 2026-01-31 07:23:33.317797547 +0000 UTC m=+0.103305124 container init dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:23:33 np0005603623 podman[85147]: 2026-01-31 07:23:33.321672805 +0000 UTC m=+0.107180312 container start dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:23:33 np0005603623 bash[85147]: dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720
Jan 31 02:23:33 np0005603623 podman[85147]: 2026-01-31 07:23:33.241720431 +0000 UTC m=+0.027227918 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 31 02:23:33 np0005603623 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.yyrexo for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:23:33 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo[85162]: [NOTICE] 030/072333 (2) : New worker #1 (4) forked
Jan 31 02:23:34 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:34 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:34 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:34 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:34 np0005603623 ceph-mon[77037]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 31 02:23:34 np0005603623 ceph-mon[77037]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 31 02:23:34 np0005603623 ceph-mon[77037]: Deploying daemon keepalived.rgw.default.compute-0.wujrgc on compute-0
Jan 31 02:23:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:34.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:35.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:36.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:37.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:38.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Jan 31 02:23:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:39.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:39 np0005603623 ceph-mon[77037]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 31 02:23:39 np0005603623 ceph-mon[77037]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 31 02:23:39 np0005603623 ceph-mon[77037]: Deploying daemon keepalived.rgw.default.compute-2.voilty on compute-2
Jan 31 02:23:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Jan 31 02:23:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:40.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:41 np0005603623 podman[85319]: 2026-01-31 07:23:41.119017399 +0000 UTC m=+2.477306884 container create bd8d68d5d07d357c58ecb59c20a76e6b41f91f6b094c9e5161c90a1ac1b7f0a7 (image=quay.io/ceph/keepalived:2.2.4, name=funny_jackson, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, distribution-scope=public, name=keepalived, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=keepalived-container, release=1793, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 31 02:23:41 np0005603623 systemd[1]: Started libpod-conmon-bd8d68d5d07d357c58ecb59c20a76e6b41f91f6b094c9e5161c90a1ac1b7f0a7.scope.
Jan 31 02:23:41 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:23:41 np0005603623 podman[85319]: 2026-01-31 07:23:41.104709811 +0000 UTC m=+2.462999316 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 31 02:23:41 np0005603623 podman[85319]: 2026-01-31 07:23:41.175786708 +0000 UTC m=+2.534076213 container init bd8d68d5d07d357c58ecb59c20a76e6b41f91f6b094c9e5161c90a1ac1b7f0a7 (image=quay.io/ceph/keepalived:2.2.4, name=funny_jackson, distribution-scope=public, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, architecture=x86_64, build-date=2023-02-22T09:23:20, name=keepalived, vcs-type=git, version=2.2.4)
Jan 31 02:23:41 np0005603623 podman[85319]: 2026-01-31 07:23:41.181226629 +0000 UTC m=+2.539516114 container start bd8d68d5d07d357c58ecb59c20a76e6b41f91f6b094c9e5161c90a1ac1b7f0a7 (image=quay.io/ceph/keepalived:2.2.4, name=funny_jackson, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, io.openshift.expose-services=, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, distribution-scope=public, architecture=x86_64, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20)
Jan 31 02:23:41 np0005603623 podman[85319]: 2026-01-31 07:23:41.184257623 +0000 UTC m=+2.542547118 container attach bd8d68d5d07d357c58ecb59c20a76e6b41f91f6b094c9e5161c90a1ac1b7f0a7 (image=quay.io/ceph/keepalived:2.2.4, name=funny_jackson, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, architecture=x86_64, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, build-date=2023-02-22T09:23:20)
Jan 31 02:23:41 np0005603623 funny_jackson[85416]: 0 0
Jan 31 02:23:41 np0005603623 systemd[1]: libpod-bd8d68d5d07d357c58ecb59c20a76e6b41f91f6b094c9e5161c90a1ac1b7f0a7.scope: Deactivated successfully.
Jan 31 02:23:41 np0005603623 conmon[85416]: conmon bd8d68d5d07d357c58ec <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-bd8d68d5d07d357c58ecb59c20a76e6b41f91f6b094c9e5161c90a1ac1b7f0a7.scope/container/memory.events
Jan 31 02:23:41 np0005603623 podman[85319]: 2026-01-31 07:23:41.186422054 +0000 UTC m=+2.544711539 container died bd8d68d5d07d357c58ecb59c20a76e6b41f91f6b094c9e5161c90a1ac1b7f0a7 (image=quay.io/ceph/keepalived:2.2.4, name=funny_jackson, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, description=keepalived for Ceph, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, architecture=x86_64, name=keepalived, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container)
Jan 31 02:23:41 np0005603623 systemd[1]: var-lib-containers-storage-overlay-3bafd3333c93ce5b26c3044fc0344f31f5f83ffc38ec17902206233d472c059a-merged.mount: Deactivated successfully.
Jan 31 02:23:41 np0005603623 podman[85319]: 2026-01-31 07:23:41.219077952 +0000 UTC m=+2.577367447 container remove bd8d68d5d07d357c58ecb59c20a76e6b41f91f6b094c9e5161c90a1ac1b7f0a7 (image=quay.io/ceph/keepalived:2.2.4, name=funny_jackson, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.openshift.expose-services=, release=1793, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, vcs-type=git, vendor=Red Hat, Inc., description=keepalived for Ceph, io.buildah.version=1.28.2, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64)
Jan 31 02:23:41 np0005603623 systemd[1]: libpod-conmon-bd8d68d5d07d357c58ecb59c20a76e6b41f91f6b094c9e5161c90a1ac1b7f0a7.scope: Deactivated successfully.
Jan 31 02:23:41 np0005603623 systemd[1]: Reloading.
Jan 31 02:23:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:41.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Jan 31 02:23:41 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:23:41 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:23:41 np0005603623 systemd[1]: Reloading.
Jan 31 02:23:41 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:23:41 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:23:41 np0005603623 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.voilty for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:23:41 np0005603623 podman[85560]: 2026-01-31 07:23:41.895544367 +0000 UTC m=+0.034616013 container create 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.buildah.version=1.28.2, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, version=2.2.4, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 31 02:23:41 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e675c29102a6c21fb26d78380e50bd6baa4cc05130b80825e4a9fae2686aed/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:41 np0005603623 podman[85560]: 2026-01-31 07:23:41.942329559 +0000 UTC m=+0.081401205 container init 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, version=2.2.4, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, release=1793)
Jan 31 02:23:41 np0005603623 podman[85560]: 2026-01-31 07:23:41.946616728 +0000 UTC m=+0.085688354 container start 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, version=2.2.4, io.openshift.tags=Ceph keepalived, name=keepalived, distribution-scope=public, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=)
Jan 31 02:23:41 np0005603623 bash[85560]: 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466
Jan 31 02:23:41 np0005603623 podman[85560]: 2026-01-31 07:23:41.878522403 +0000 UTC m=+0.017594099 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 31 02:23:41 np0005603623 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.voilty for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:23:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty[85575]: Sat Jan 31 07:23:41 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 31 02:23:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty[85575]: Sat Jan 31 07:23:41 2026: Running on Linux 5.14.0-665.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026 (built for Linux 5.14.0)
Jan 31 02:23:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty[85575]: Sat Jan 31 07:23:41 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 31 02:23:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty[85575]: Sat Jan 31 07:23:41 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 31 02:23:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty[85575]: Sat Jan 31 07:23:41 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 31 02:23:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty[85575]: Sat Jan 31 07:23:41 2026: Starting VRRP child process, pid=4
Jan 31 02:23:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty[85575]: Sat Jan 31 07:23:41 2026: Startup complete
Jan 31 02:23:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty[85575]: Sat Jan 31 07:23:41 2026: (VI_0) Entering BACKUP STATE (init)
Jan 31 02:23:41 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty[85575]: Sat Jan 31 07:23:41 2026: VRRP_Script(check_backend) succeeded
Jan 31 02:23:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Jan 31 02:23:42 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 44 pg[3.0( empty local-lis/les=29/30 n=0 ec=16/16 lis/c=29/29 les/c/f=30/30/0 sis=44 pruub=11.252832413s) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active pruub 62.446853638s@ mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:42 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 44 pg[3.0( empty local-lis/les=29/30 n=0 ec=16/16 lis/c=29/29 les/c/f=30/30/0 sis=44 pruub=11.252832413s) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown pruub 62.446853638s@ mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:23:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:42.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:23:43 np0005603623 podman[85855]: 2026-01-31 07:23:43.029110992 +0000 UTC m=+0.113964496 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:23:43 np0005603623 podman[85876]: 2026-01-31 07:23:43.171639682 +0000 UTC m=+0.051896639 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:23:43 np0005603623 podman[85855]: 2026-01-31 07:23:43.176660989 +0000 UTC m=+0.261514493 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:23:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:43.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1e( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1f( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1d( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1c( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.b( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.a( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.9( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.8( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.7( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.5( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.4( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.3( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.2( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.6( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.c( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.e( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.f( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.10( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.11( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.12( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.13( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.d( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.15( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.16( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.17( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.18( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.19( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1b( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1a( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.14( empty local-lis/les=29/30 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1e( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1d( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1f( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.b( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1c( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.a( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.9( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.8( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.5( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.7( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.3( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.2( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.6( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.0( empty local-lis/les=44/45 n=0 ec=16/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.e( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.10( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.f( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.11( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.12( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.4( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.13( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.15( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.16( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.17( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.18( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.d( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1b( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.1a( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.14( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.19( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 45 pg[3.c( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=29/29 les/c/f=30/30/0 sis=44) [2] r=0 lpr=44 pi=[29,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Jan 31 02:23:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Jan 31 02:23:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:43 np0005603623 podman[86006]: 2026-01-31 07:23:43.621685123 +0000 UTC m=+0.042912323 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:23:43 np0005603623 podman[86006]: 2026-01-31 07:23:43.630754123 +0000 UTC m=+0.051981293 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:23:43 np0005603623 podman[86073]: 2026-01-31 07:23:43.768864718 +0000 UTC m=+0.039094166 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, version=2.2.4, name=keepalived, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, com.redhat.component=keepalived-container, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, release=1793, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Jan 31 02:23:43 np0005603623 podman[86073]: 2026-01-31 07:23:43.77978091 +0000 UTC m=+0.050010358 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, io.openshift.tags=Ceph keepalived, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, architecture=x86_64, io.buildah.version=1.28.2, release=1793, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph)
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Jan 31 02:23:43 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Jan 31 02:23:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 31 02:23:44 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 46 pg[5.0( empty local-lis/les=29/30 n=0 ec=19/19 lis/c=29/29 les/c/f=30/30/0 sis=46 pruub=9.104440689s) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active pruub 62.443946838s@ mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:44 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 46 pg[5.0( empty local-lis/les=29/30 n=0 ec=19/19 lis/c=29/29 les/c/f=30/30/0 sis=46 pruub=9.104440689s) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown pruub 62.443946838s@ mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Jan 31 02:23:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:23:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:23:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:23:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:44.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:23:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:45.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1c( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1e( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1d( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1f( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.11( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.10( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.13( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.12( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.15( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.14( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.17( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.16( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.9( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.8( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.a( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.b( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.6( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.7( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.4( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.5( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.2( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.3( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.e( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.f( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.c( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.d( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1b( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.18( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.19( empty local-lis/les=29/30 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1c( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1e( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.10( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1f( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.15( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.17( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.14( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.8( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.0( empty local-lis/les=46/47 n=0 ec=19/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.a( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.6( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.b( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.7( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.4( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.5( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.e( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.2( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.3( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.d( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 47 pg[5.1b( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=29/29 les/c/f=30/30/0 sis=46) [2] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Jan 31 02:23:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:45 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty[85575]: Sat Jan 31 07:23:45 2026: (VI_0) Entering MASTER STATE
Jan 31 02:23:45 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty[85575]: Sat Jan 31 07:23:45 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Jan 31 02:23:45 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty[85575]: Sat Jan 31 07:23:45 2026: (VI_0) Entering BACKUP STATE
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Jan 31 02:23:45 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Jan 31 02:23:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 31 02:23:46 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:46 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:46 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:46.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:23:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:47.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:23:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 31 02:23:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 31 02:23:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:23:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:48.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:23:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:48 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Jan 31 02:23:48 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Jan 31 02:23:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:49.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 31 02:23:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:50.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:51 np0005603623 podman[86335]: 2026-01-31 07:23:51.249983063 +0000 UTC m=+0.048460117 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef)
Jan 31 02:23:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:51.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:51 np0005603623 podman[86335]: 2026-01-31 07:23:51.33109518 +0000 UTC m=+0.129572224 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Jan 31 02:23:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 31 02:23:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:51 np0005603623 podman[86487]: 2026-01-31 07:23:51.83306989 +0000 UTC m=+0.056246574 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:23:51 np0005603623 podman[86487]: 2026-01-31 07:23:51.840802346 +0000 UTC m=+0.063978990 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:23:51 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Jan 31 02:23:51 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Jan 31 02:23:52 np0005603623 podman[86552]: 2026-01-31 07:23:52.002967408 +0000 UTC m=+0.043042827 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, architecture=x86_64, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, com.redhat.component=keepalived-container, name=keepalived, release=1793)
Jan 31 02:23:52 np0005603623 podman[86552]: 2026-01-31 07:23:52.015849414 +0000 UTC m=+0.055924783 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, name=keepalived, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public)
Jan 31 02:23:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:23:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:23:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:52.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:53.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:54.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:54 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Jan 31 02:23:54 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Jan 31 02:23:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:55.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e52 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 31 02:23:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 31 02:23:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 31 02:23:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[11.16( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[4.19( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.15( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.16( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[6.5( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[4.8( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[4.1f( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[4.1d( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.11( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[4.1c( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[11.13( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.b( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.a( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.1f( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.725094795s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.297737122s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.1d( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.730384827s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303016663s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[4.6( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.1f( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.725027084s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.297737122s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.1d( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.730298042s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303016663s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[11.8( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.1c( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.724432945s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.297637939s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.1c( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.724387169s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.297637939s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.9( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.19( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.633590698s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.207054138s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.1e( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.724140167s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.297615051s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.1e( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.724118233s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.297615051s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.19( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.633560181s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.207054138s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.18( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.633350372s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206954956s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.17( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.633215904s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206871033s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.18( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.633279800s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206954956s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.17( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.633191109s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206871033s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.10( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.723636627s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.297622681s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.16( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632872581s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206863403s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.10( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.723608017s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.297622681s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.16( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632831573s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206863403s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.13( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632676125s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206756592s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.14( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632934570s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.207046509s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.13( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632653236s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206756592s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.14( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632905960s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.207046509s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.15( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.723862648s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.298027039s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.15( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.723834991s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.298027039s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.11( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.723382950s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.297592163s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.11( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.723352432s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.297592163s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.12( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632398605s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206726074s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.12( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632375717s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206726074s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.14( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.728336334s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.302719116s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.14( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.728312492s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.302719116s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.17( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.728261948s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.302711487s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.16( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.728091240s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.302551270s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.17( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.728237152s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.302711487s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.16( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.728071213s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.302551270s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.10( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632196426s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206695557s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.f( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632121086s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206634521s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.f( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632099152s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206634521s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.10( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632157326s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206695557s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.9( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.728123665s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.302749634s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.9( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.728100777s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.302749634s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.d( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632295609s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206970215s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.d( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632276535s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206970215s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.a( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.728261948s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303001404s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.c( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632315636s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.207061768s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.a( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.728237152s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303001404s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.c( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.632292747s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.207061768s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[6.7( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.6( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727812767s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303062439s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.6( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.631331444s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206588745s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.6( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727784157s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303062439s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.6( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.631308556s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206588745s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.1( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.631231308s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206527710s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.7( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727716446s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303062439s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.1( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.631190300s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206527710s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.7( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727680206s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303062439s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.2( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.631068230s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206520081s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.3( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.631025314s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206535339s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.2( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.631012917s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206520081s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.5( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727667809s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303215027s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.3( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.630991936s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206535339s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.4( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.631160736s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206733704s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.5( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727640152s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303215027s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.4( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.631128311s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206733704s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.2( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727541924s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303215027s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.2( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727511406s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303215027s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.5( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.630744934s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206481934s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.5( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.630706787s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206481934s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.7( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.630680084s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206497192s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.1( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727356911s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303222656s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.7( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.630647659s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206497192s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.3( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727498055s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303276062s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.1( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727325439s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303222656s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.3( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727327347s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303276062s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.a( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.630369186s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206336975s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.f( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727274895s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303291321s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.c( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727353096s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303382874s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.b( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.630258560s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206306458s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.f( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727241516s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303291321s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.c( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727320671s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303382874s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.b( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.630229950s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206306458s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.a( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.630341530s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206336975s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.1b( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727362633s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303573608s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.1c( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.630099297s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.206329346s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.1b( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727342606s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303573608s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.1c( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.630069733s) [1] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.206329346s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.1e( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.622651100s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.198936462s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.1f( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.622664452s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active pruub 76.198982239s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.1e( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.622617722s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.198936462s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[3.1f( empty local-lis/les=44/45 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53 pruub=11.622633934s) [0] r=-1 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.198982239s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.19( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727068901s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303436279s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.18( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727059364s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active pruub 78.303436279s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.18( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727036476s) [1] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303436279s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[5.19( empty local-lis/les=46/47 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=13.727032661s) [0] r=-1 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.303436279s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[4.3( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[6.1( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[11.a( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[6.3( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.f( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[4.1( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.3( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[6.d( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[11.e( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.d( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[4.2( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.c( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[6.b( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[11.3( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[6.f( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[4.9( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.6( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.5( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.1c( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.2( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[4.14( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[8.1f( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[4.15( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[11.17( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[11.19( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.1d( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.1c( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.1b( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[7.1f( empty local-lis/les=0/0 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[10.12( empty local-lis/les=0/0 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[10.1( empty local-lis/les=0/0 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[10.f( empty local-lis/les=0/0 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.5( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[7.5( empty local-lis/les=0/0 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.a( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.b( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.c( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[10.4( empty local-lis/les=0/0 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.d( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[10.3( empty local-lis/les=0/0 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[7.a( empty local-lis/les=0/0 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.10( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[7.14( empty local-lis/les=0/0 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.12( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[7.16( empty local-lis/les=0/0 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.13( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.f( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[7.11( empty local-lis/les=0/0 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.15( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[10.1e( empty local-lis/les=0/0 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[10.10( empty local-lis/les=0/0 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[2.18( empty local-lis/les=0/0 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[7.1d( empty local-lis/les=0/0 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 53 pg[10.11( empty local-lis/les=0/0 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Jan 31 02:23:55 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Jan 31 02:23:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 31 02:23:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:23:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:56.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[11.17( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[11.16( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[4.19( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.1c( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.16( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.1d( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[11.3( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.a( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.15( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.b( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[10.3( v 52'51 lc 37'43 (0'0,52'51] local-lis/les=53/54 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=52'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.2( v 33'8 (0'0,33'8] local-lis/les=53/54 n=1 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[10.1( v 37'48 (0'0,37'48] local-lis/les=53/54 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.3( v 33'8 (0'0,33'8] local-lis/les=53/54 n=1 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[6.d( v 40'39 lc 36'13 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[4.2( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.c( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[4.3( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[6.1( v 40'39 (0'0,40'39] local-lis/les=53/54 n=2 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[6.f( v 40'39 lc 36'1 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.5( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.f( v 33'8 lc 0'0 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.9( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[11.a( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.a( v 33'8 lc 0'0 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[4.6( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.b( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[6.7( v 40'39 lc 36'21 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[6.5( v 40'39 lc 36'11 (0'0,40'39] local-lis/les=53/54 n=2 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[11.8( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[7.5( empty local-lis/les=53/54 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[10.f( v 37'48 (0'0,37'48] local-lis/les=53/54 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[11.e( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[4.1( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[6.3( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=53/54 n=2 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=40'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[6.9( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.d( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.f( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.6( v 33'8 (0'0,33'8] local-lis/les=53/54 n=1 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[10.4( v 37'48 (0'0,37'48] local-lis/les=53/54 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[4.9( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[6.b( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=53) [2] r=0 lpr=53 pi=[46,53)/1 crt=40'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.5( v 33'8 (0'0,33'8] local-lis/les=53/54 n=1 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[4.1f( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[7.a( empty local-lis/les=53/54 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[7.14( empty local-lis/les=53/54 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.c( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.d( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[11.19( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[4.8( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[4.15( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[7.16( empty local-lis/les=53/54 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.13( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.12( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.15( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.1f( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[7.11( empty local-lis/les=53/54 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[4.14( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[7.1d( empty local-lis/les=53/54 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.1c( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[10.1e( v 37'48 (0'0,37'48] local-lis/les=53/54 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[10.11( v 37'48 (0'0,37'48] local-lis/les=53/54 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.18( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.1b( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[11.13( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[4.1c( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[4.1d( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [2] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[7.1f( empty local-lis/les=53/54 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53) [2] r=0 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[10.12( v 37'48 (0'0,37'48] local-lis/les=53/54 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[10.10( v 37'48 (0'0,37'48] local-lis/les=53/54 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53) [2] r=0 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[8.11( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [2] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 54 pg[2.10( empty local-lis/les=53/54 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53) [2] r=0 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 31 02:23:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 31 02:23:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:57.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 31 02:23:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 31 02:23:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 31 02:23:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:58.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 31 02:23:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 31 02:23:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 31 02:23:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:23:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:59.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:59 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Jan 31 02:23:59 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Jan 31 02:24:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:00.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:00 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.e scrub starts
Jan 31 02:24:00 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.e scrub ok
Jan 31 02:24:00 np0005603623 ceph-mon[77037]: Health check failed: Degraded data redundancy: 10/215 objects degraded (4.651%), 7 pgs degraded (PG_DEGRADED)
Jan 31 02:24:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:01.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:01 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:02.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:03.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:03 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Jan 31 02:24:03 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Jan 31 02:24:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:04.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:05.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:05 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Jan 31 02:24:05 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Jan 31 02:24:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 31 02:24:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 31 02:24:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[6.f( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=57 pruub=14.361312866s) [1] r=-1 lpr=57 pi=[53,57)/1 crt=40'39 mlcod 40'39 active pruub 89.585006714s@ mbc={255={}}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[6.f( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=57 pruub=14.360935211s) [1] r=-1 lpr=57 pi=[53,57)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 89.585006714s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[6.3( v 40'39 (0'0,40'39] local-lis/les=53/54 n=2 ec=46/20 lis/c=53/53 les/c/f=54/55/0 sis=57 pruub=14.361552238s) [1] r=-1 lpr=57 pi=[53,57)/1 crt=40'39 mlcod 40'39 active pruub 89.585670471s@ mbc={255={}}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[6.3( v 40'39 (0'0,40'39] local-lis/les=53/54 n=2 ec=46/20 lis/c=53/53 les/c/f=54/55/0 sis=57 pruub=14.361511230s) [1] r=-1 lpr=57 pi=[53,57)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 89.585670471s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[6.b( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=53/53 les/c/f=54/55/0 sis=57 pruub=14.361299515s) [1] r=-1 lpr=57 pi=[53,57)/1 crt=40'39 mlcod 40'39 active pruub 89.585723877s@ mbc={255={}}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[6.b( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=53/53 les/c/f=54/55/0 sis=57 pruub=14.361256599s) [1] r=-1 lpr=57 pi=[53,57)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 89.585723877s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[6.7( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=57 pruub=14.360944748s) [1] r=-1 lpr=57 pi=[53,57)/1 crt=40'39 mlcod 40'39 active pruub 89.585433960s@ mbc={255={}}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[6.7( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=57 pruub=14.360899925s) [1] r=-1 lpr=57 pi=[53,57)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 89.585433960s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[9.13( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[9.17( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[9.b( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[9.f( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[9.7( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[9.3( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[9.1f( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 57 pg[9.1b( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:06.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Jan 31 02:24:06 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Jan 31 02:24:07 np0005603623 ceph-mon[77037]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 1/215 objects degraded (0.465%), 1 pg degraded)
Jan 31 02:24:07 np0005603623 ceph-mon[77037]: Cluster is now healthy
Jan 31 02:24:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 31 02:24:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 31 02:24:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:07.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.13( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.13( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.b( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.b( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.3( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.f( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.17( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.3( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.17( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.f( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.7( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.7( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.1b( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.1b( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.1f( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:07 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 58 pg[9.1f( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=58) [2]/[0] r=-1 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 31 02:24:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 31 02:24:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 31 02:24:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 31 02:24:08 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 60 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:08 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 60 pg[9.1b( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:08 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 60 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:08 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 60 pg[9.1b( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:08 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 60 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:08 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 60 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:08 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 60 pg[9.17( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:08 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 60 pg[9.3( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:08 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 60 pg[9.17( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:08 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 60 pg[9.3( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:24:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:08.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:24:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:24:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:09.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:24:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 31 02:24:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 31 02:24:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 31 02:24:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 31 02:24:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.5( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.13( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.b( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.7( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.13( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.7( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.b( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.1d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.15( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[6.d( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=61 pruub=10.974006653s) [1] r=-1 lpr=61 pi=[53,61)/1 crt=40'39 mlcod 40'39 active pruub 89.584938049s@ mbc={255={}}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[6.d( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=61 pruub=10.973967552s) [1] r=-1 lpr=61 pi=[53,61)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 89.584938049s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[6.5( v 40'39 (0'0,40'39] local-lis/les=53/54 n=2 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=61 pruub=10.974349022s) [1] r=-1 lpr=61 pi=[53,61)/1 crt=40'39 mlcod 40'39 active pruub 89.585510254s@ mbc={255={}}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[6.5( v 40'39 (0'0,40'39] local-lis/les=53/54 n=2 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=61 pruub=10.974259377s) [1] r=-1 lpr=61 pi=[53,61)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 89.585510254s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=5 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.3( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=6 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=6 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.1b( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=5 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:09 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 61 pg[9.17( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=5 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=60) [2] r=0 lpr=60 pi=[49,60)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:24:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:10.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:24:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 31 02:24:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 31 02:24:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 31 02:24:10 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 62 pg[9.5( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[49,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:10 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 62 pg[9.15( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[49,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:10 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 62 pg[9.15( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[49,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:10 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 62 pg[9.5( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[49,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:10 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 62 pg[9.1d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[49,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:10 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 62 pg[9.1d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[49,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:10 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 62 pg[9.d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[49,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:10 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 62 pg[9.d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[49,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:10 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 62 pg[9.13( v 40'1015 (0'0,40'1015] local-lis/les=61/62 n=5 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:10 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 62 pg[9.7( v 40'1015 (0'0,40'1015] local-lis/les=61/62 n=6 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:10 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 62 pg[9.b( v 40'1015 (0'0,40'1015] local-lis/les=61/62 n=6 ec=49/34 lis/c=58/49 les/c/f=59/50/0 sis=61) [2] r=0 lpr=61 pi=[49,61)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:11.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 31 02:24:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 31 02:24:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 31 02:24:12 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Jan 31 02:24:12 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Jan 31 02:24:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:12.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:12 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 31 02:24:12 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 31 02:24:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 31 02:24:12 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 64 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=62/49 les/c/f=63/50/0 sis=64) [2] r=0 lpr=64 pi=[49,64)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:12 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 64 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=62/49 les/c/f=63/50/0 sis=64) [2] r=0 lpr=64 pi=[49,64)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:12 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 64 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=62/49 les/c/f=63/50/0 sis=64) [2] r=0 lpr=64 pi=[49,64)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:12 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 64 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=62/49 les/c/f=63/50/0 sis=64) [2] r=0 lpr=64 pi=[49,64)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:12 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 64 pg[9.5( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=62/49 les/c/f=63/50/0 sis=64) [2] r=0 lpr=64 pi=[49,64)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:12 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 64 pg[9.5( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=62/49 les/c/f=63/50/0 sis=64) [2] r=0 lpr=64 pi=[49,64)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:12 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 64 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=62/49 les/c/f=63/50/0 sis=64) [2] r=0 lpr=64 pi=[49,64)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:12 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 64 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=62/49 les/c/f=63/50/0 sis=64) [2] r=0 lpr=64 pi=[49,64)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:13.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:13 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Jan 31 02:24:13 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Jan 31 02:24:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 31 02:24:13 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 65 pg[9.5( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=6 ec=49/34 lis/c=62/49 les/c/f=63/50/0 sis=64) [2] r=0 lpr=64 pi=[49,64)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:13 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 65 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=5 ec=49/34 lis/c=62/49 les/c/f=63/50/0 sis=64) [2] r=0 lpr=64 pi=[49,64)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:13 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 65 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=5 ec=49/34 lis/c=62/49 les/c/f=63/50/0 sis=64) [2] r=0 lpr=64 pi=[49,64)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:13 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 65 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=6 ec=49/34 lis/c=62/49 les/c/f=63/50/0 sis=64) [2] r=0 lpr=64 pi=[49,64)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:24:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:14.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:24:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 31 02:24:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:15.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e66 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 31 02:24:16 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.4 deep-scrub starts
Jan 31 02:24:16 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.4 deep-scrub ok
Jan 31 02:24:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:24:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:16.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:24:16 np0005603623 systemd[1]: session-19.scope: Deactivated successfully.
Jan 31 02:24:16 np0005603623 systemd[1]: session-19.scope: Consumed 7.095s CPU time.
Jan 31 02:24:16 np0005603623 systemd-logind[795]: Session 19 logged out. Waiting for processes to exit.
Jan 31 02:24:16 np0005603623 systemd-logind[795]: Removed session 19.
Jan 31 02:24:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:17.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:18.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:19.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 31 02:24:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 31 02:24:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 31 02:24:19 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Jan 31 02:24:19 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Jan 31 02:24:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 31 02:24:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 31 02:24:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:20 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.b scrub starts
Jan 31 02:24:20 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.b scrub ok
Jan 31 02:24:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:20.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:24:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:21.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:24:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 31 02:24:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 31 02:24:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 31 02:24:22 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 69 pg[9.8( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=69) [2] r=0 lpr=69 pi=[49,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:22 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 69 pg[9.18( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=69) [2] r=0 lpr=69 pi=[49,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 31 02:24:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 31 02:24:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:22.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 31 02:24:22 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 70 pg[9.8( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[49,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:22 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 70 pg[9.8( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[49,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:22 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 70 pg[9.18( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[49,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:22 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 70 pg[9.18( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[49,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:23.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 31 02:24:23 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 71 pg[9.9( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=71) [2] r=0 lpr=71 pi=[49,71)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:23 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 71 pg[6.9( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=71 pruub=12.961437225s) [0] r=-1 lpr=71 pi=[53,71)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 105.585945129s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:23 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 71 pg[6.9( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=71 pruub=12.961262703s) [0] r=-1 lpr=71 pi=[53,71)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 105.585945129s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:23 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 71 pg[9.19( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=71) [2] r=0 lpr=71 pi=[49,71)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 31 02:24:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 31 02:24:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:24.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 31 02:24:24 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 72 pg[9.9( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=72) [2]/[0] r=-1 lpr=72 pi=[49,72)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:24 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 72 pg[9.9( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=72) [2]/[0] r=-1 lpr=72 pi=[49,72)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:24 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 72 pg[9.8( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=70/49 les/c/f=71/50/0 sis=72) [2] r=0 lpr=72 pi=[49,72)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:24 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 72 pg[9.8( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=70/49 les/c/f=71/50/0 sis=72) [2] r=0 lpr=72 pi=[49,72)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:24 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 72 pg[9.18( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=70/49 les/c/f=71/50/0 sis=72) [2] r=0 lpr=72 pi=[49,72)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:24 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 72 pg[9.18( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=70/49 les/c/f=71/50/0 sis=72) [2] r=0 lpr=72 pi=[49,72)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:24 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 72 pg[9.19( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=72) [2]/[0] r=-1 lpr=72 pi=[49,72)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:24 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 72 pg[9.19( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=72) [2]/[0] r=-1 lpr=72 pi=[49,72)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 31 02:24:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 31 02:24:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:24:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:25.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:24:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 31 02:24:25 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 73 pg[9.8( v 40'1015 (0'0,40'1015] local-lis/les=72/73 n=6 ec=49/34 lis/c=70/49 les/c/f=71/50/0 sis=72) [2] r=0 lpr=72 pi=[49,72)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:25 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 73 pg[9.18( v 40'1015 (0'0,40'1015] local-lis/les=72/73 n=5 ec=49/34 lis/c=70/49 les/c/f=71/50/0 sis=72) [2] r=0 lpr=72 pi=[49,72)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 31 02:24:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 31 02:24:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 31 02:24:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 31 02:24:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:26.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 31 02:24:26 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 74 pg[9.9( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=72/49 les/c/f=73/50/0 sis=74) [2] r=0 lpr=74 pi=[49,74)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:26 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 74 pg[9.9( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=72/49 les/c/f=73/50/0 sis=74) [2] r=0 lpr=74 pi=[49,74)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:26 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 74 pg[9.19( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=72/49 les/c/f=73/50/0 sis=74) [2] r=0 lpr=74 pi=[49,74)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:26 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 74 pg[9.19( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=72/49 les/c/f=73/50/0 sis=74) [2] r=0 lpr=74 pi=[49,74)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:27.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 31 02:24:27 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 75 pg[9.19( v 40'1015 (0'0,40'1015] local-lis/les=74/75 n=5 ec=49/34 lis/c=72/49 les/c/f=73/50/0 sis=74) [2] r=0 lpr=74 pi=[49,74)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:27 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 75 pg[9.9( v 40'1015 (0'0,40'1015] local-lis/les=74/75 n=6 ec=49/34 lis/c=72/49 les/c/f=73/50/0 sis=74) [2] r=0 lpr=74 pi=[49,74)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 31 02:24:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:24:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:28.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:24:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:29.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 31 02:24:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:30 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.d scrub starts
Jan 31 02:24:30 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.d scrub ok
Jan 31 02:24:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:30.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:31.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:32.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:24:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:33.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:24:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:34.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:34 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.e scrub starts
Jan 31 02:24:34 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.e scrub ok
Jan 31 02:24:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:24:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:35.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:24:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 31 02:24:35 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 31 02:24:35 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 31 02:24:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:36.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 31 02:24:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 31 02:24:36 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Jan 31 02:24:36 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Jan 31 02:24:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 31 02:24:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:37.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 31 02:24:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 31 02:24:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 31 02:24:38 np0005603623 systemd-logind[795]: New session 33 of user zuul.
Jan 31 02:24:38 np0005603623 systemd[1]: Started Session 33 of User zuul.
Jan 31 02:24:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:38.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 31 02:24:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 31 02:24:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:39.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:39 np0005603623 python3.9[86914]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:24:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 31 02:24:39 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 81 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=6 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=81 pruub=14.027600288s) [1] r=-1 lpr=81 pi=[64,81)/1 crt=40'1015 mlcod 0'0 active pruub 122.742355347s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:39 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 81 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=6 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=81 pruub=14.027521133s) [1] r=-1 lpr=81 pi=[64,81)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 122.742355347s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:39 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 81 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=5 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=81 pruub=14.027166367s) [1] r=-1 lpr=81 pi=[64,81)/1 crt=40'1015 mlcod 0'0 active pruub 122.742355347s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:39 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 81 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=5 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=81 pruub=14.027139664s) [1] r=-1 lpr=81 pi=[64,81)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 122.742355347s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 31 02:24:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 31 02:24:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:40.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:40 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Jan 31 02:24:40 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Jan 31 02:24:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 31 02:24:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 31 02:24:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 31 02:24:40 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 82 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=6 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=82) [1]/[2] r=0 lpr=82 pi=[64,82)/1 crt=40'1015 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:40 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 82 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=6 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=82) [1]/[2] r=0 lpr=82 pi=[64,82)/1 crt=40'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:40 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 82 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=5 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=82) [1]/[2] r=0 lpr=82 pi=[64,82)/1 crt=40'1015 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:40 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 82 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=5 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=82) [1]/[2] r=0 lpr=82 pi=[64,82)/1 crt=40'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:41 np0005603623 python3.9[87129]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:24:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:41.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 31 02:24:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 31 02:24:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 31 02:24:41 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 83 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=82/83 n=6 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=82) [1]/[2] async=[1] r=0 lpr=82 pi=[64,82)/1 crt=40'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:41 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 83 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=82/83 n=5 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=82) [1]/[2] async=[1] r=0 lpr=82 pi=[64,82)/1 crt=40'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:24:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:42.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:24:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 31 02:24:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 31 02:24:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 31 02:24:42 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 84 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=82/83 n=5 ec=49/34 lis/c=82/64 les/c/f=83/65/0 sis=84 pruub=14.981756210s) [1] async=[1] r=-1 lpr=84 pi=[64,84)/1 crt=40'1015 mlcod 40'1015 active pruub 126.800354004s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:42 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 84 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=82/83 n=5 ec=49/34 lis/c=82/64 les/c/f=83/65/0 sis=84 pruub=14.981669426s) [1] r=-1 lpr=84 pi=[64,84)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 126.800354004s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:42 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 84 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=82/83 n=6 ec=49/34 lis/c=82/64 les/c/f=83/65/0 sis=84 pruub=14.981833458s) [1] async=[1] r=-1 lpr=84 pi=[64,84)/1 crt=40'1015 mlcod 40'1015 active pruub 126.800331116s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:42 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 84 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=82/83 n=6 ec=49/34 lis/c=82/64 les/c/f=83/65/0 sis=84 pruub=14.981268883s) [1] r=-1 lpr=84 pi=[64,84)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 126.800331116s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:43.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 31 02:24:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 31 02:24:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 31 02:24:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 85 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=6 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=85 pruub=13.762257576s) [1] r=-1 lpr=85 pi=[60,85)/1 crt=40'1015 mlcod 0'0 active pruub 126.616928101s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 85 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=6 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=85 pruub=13.762109756s) [1] r=-1 lpr=85 pi=[60,85)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 126.616928101s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 85 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=5 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=85 pruub=13.758196831s) [1] r=-1 lpr=85 pi=[60,85)/1 crt=40'1015 mlcod 0'0 active pruub 126.613174438s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:43 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 85 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=5 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=85 pruub=13.758168221s) [1] r=-1 lpr=85 pi=[60,85)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 126.613174438s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:44.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:44 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Jan 31 02:24:44 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Jan 31 02:24:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 31 02:24:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 31 02:24:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 31 02:24:44 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 86 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=6 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=86) [1]/[2] r=0 lpr=86 pi=[60,86)/1 crt=40'1015 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:44 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 86 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=6 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=86) [1]/[2] r=0 lpr=86 pi=[60,86)/1 crt=40'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:44 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 86 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=5 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=86) [1]/[2] r=0 lpr=86 pi=[60,86)/1 crt=40'1015 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:44 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 86 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=5 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=86) [1]/[2] r=0 lpr=86 pi=[60,86)/1 crt=40'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:45.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 31 02:24:46 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 87 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=86/87 n=5 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=86) [1]/[2] async=[1] r=0 lpr=86 pi=[60,86)/1 crt=40'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:46 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 87 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=86/87 n=6 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=86) [1]/[2] async=[1] r=0 lpr=86 pi=[60,86)/1 crt=40'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:46.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 31 02:24:47 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 88 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=86/87 n=6 ec=49/34 lis/c=86/60 les/c/f=87/61/0 sis=88 pruub=14.986036301s) [1] async=[1] r=-1 lpr=88 pi=[60,88)/1 crt=40'1015 mlcod 40'1015 active pruub 130.919738770s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:47 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 88 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=86/87 n=6 ec=49/34 lis/c=86/60 les/c/f=87/61/0 sis=88 pruub=14.985909462s) [1] r=-1 lpr=88 pi=[60,88)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 130.919738770s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:47 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 88 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=86/87 n=5 ec=49/34 lis/c=86/60 les/c/f=87/61/0 sis=88 pruub=14.983409882s) [1] async=[1] r=-1 lpr=88 pi=[60,88)/1 crt=40'1015 mlcod 40'1015 active pruub 130.917572021s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:47 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 88 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=86/87 n=5 ec=49/34 lis/c=86/60 les/c/f=87/61/0 sis=88 pruub=14.983329773s) [1] r=-1 lpr=88 pi=[60,88)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 130.917572021s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:47.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 31 02:24:48 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.16 scrub starts
Jan 31 02:24:48 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.16 scrub ok
Jan 31 02:24:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:24:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:48.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:24:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:49.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:49 np0005603623 systemd[1]: session-33.scope: Deactivated successfully.
Jan 31 02:24:49 np0005603623 systemd[1]: session-33.scope: Consumed 7.378s CPU time.
Jan 31 02:24:49 np0005603623 systemd-logind[795]: Session 33 logged out. Waiting for processes to exit.
Jan 31 02:24:49 np0005603623 systemd-logind[795]: Removed session 33.
Jan 31 02:24:49 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Jan 31 02:24:49 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Jan 31 02:24:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 31 02:24:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 31 02:24:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:50.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 31 02:24:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 31 02:24:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:51.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 31 02:24:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 31 02:24:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 31 02:24:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:24:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:52.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:24:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 31 02:24:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:53.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 31 02:24:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:24:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:54.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:24:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 31 02:24:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:55.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 31 02:24:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:56.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:57.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:58 np0005603623 podman[87415]: 2026-01-31 07:24:58.587271694 +0000 UTC m=+0.051372467 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:24:58 np0005603623 podman[87415]: 2026-01-31 07:24:58.67997966 +0000 UTC m=+0.144080473 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:24:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:58.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:59 np0005603623 podman[87571]: 2026-01-31 07:24:59.249492292 +0000 UTC m=+0.067561952 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:24:59 np0005603623 podman[87571]: 2026-01-31 07:24:59.260757441 +0000 UTC m=+0.078827081 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:24:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:24:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:59.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:59 np0005603623 podman[87635]: 2026-01-31 07:24:59.452518125 +0000 UTC m=+0.042330114 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, io.openshift.tags=Ceph keepalived, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, io.buildah.version=1.28.2, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, description=keepalived for Ceph)
Jan 31 02:24:59 np0005603623 podman[87635]: 2026-01-31 07:24:59.488885047 +0000 UTC m=+0.078697036 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, release=1793, architecture=x86_64, vendor=Red Hat, Inc.)
Jan 31 02:24:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 31 02:24:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 31 02:25:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:00.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:00 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.16 deep-scrub starts
Jan 31 02:25:00 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.16 deep-scrub ok
Jan 31 02:25:01 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 31 02:25:01 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:25:01 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:25:01 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:25:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 31 02:25:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:01.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:01 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Jan 31 02:25:01 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Jan 31 02:25:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 31 02:25:02 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 31 02:25:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:02.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 31 02:25:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 31 02:25:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:03.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:03 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Jan 31 02:25:03 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Jan 31 02:25:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 31 02:25:04 np0005603623 systemd-logind[795]: New session 34 of user zuul.
Jan 31 02:25:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:04.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:04 np0005603623 systemd[1]: Started Session 34 of User zuul.
Jan 31 02:25:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:05.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:05 np0005603623 python3.9[88005]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 31 02:25:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:06 np0005603623 python3.9[88180]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:25:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:06.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:07.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:07 np0005603623 python3.9[88386]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:25:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:25:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:25:07 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Jan 31 02:25:07 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Jan 31 02:25:08 np0005603623 python3.9[88540]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:25:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:08.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:09.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:09 np0005603623 python3.9[88694]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:25:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 31 02:25:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 31 02:25:10 np0005603623 python3.9[88847]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:25:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:10.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 31 02:25:11 np0005603623 python3.9[88997]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:25:11 np0005603623 network[89014]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:25:11 np0005603623 network[89015]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:25:11 np0005603623 network[89016]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:25:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:11.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 31 02:25:11 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 103 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=5 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=103 pruub=14.036934853s) [1] r=-1 lpr=103 pi=[64,103)/1 crt=40'1015 mlcod 0'0 active pruub 154.743301392s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:11 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 103 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=5 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=103 pruub=14.036848068s) [1] r=-1 lpr=103 pi=[64,103)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 154.743301392s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 31 02:25:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:12.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 31 02:25:12 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 104 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=5 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=104) [1]/[2] r=0 lpr=104 pi=[64,104)/1 crt=40'1015 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:12 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 104 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=64/65 n=5 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=104) [1]/[2] r=0 lpr=104 pi=[64,104)/1 crt=40'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:12 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 31 02:25:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:13.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:13 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Jan 31 02:25:13 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Jan 31 02:25:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 31 02:25:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 31 02:25:14 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 105 pg[9.16( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=105) [2] r=0 lpr=105 pi=[66,105)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:14 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 105 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=104/105 n=5 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=104) [1]/[2] async=[1] r=0 lpr=104 pi=[64,104)/1 crt=40'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:25:14 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Jan 31 02:25:14 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Jan 31 02:25:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:14.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:14 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 31 02:25:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 31 02:25:14 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 106 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=104/105 n=5 ec=49/34 lis/c=104/64 les/c/f=105/65/0 sis=106 pruub=15.377679825s) [1] async=[1] r=-1 lpr=106 pi=[64,106)/1 crt=40'1015 mlcod 40'1015 active pruub 159.159271240s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:14 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 106 pg[9.16( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=106) [2]/[1] r=-1 lpr=106 pi=[66,106)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:14 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 106 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=104/105 n=5 ec=49/34 lis/c=104/64 les/c/f=105/65/0 sis=106 pruub=15.377606392s) [1] r=-1 lpr=106 pi=[64,106)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 159.159271240s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:14 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 106 pg[9.16( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=106) [2]/[1] r=-1 lpr=106 pi=[66,106)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:15 np0005603623 python3.9[89278]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:25:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:15.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:15 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Jan 31 02:25:15 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Jan 31 02:25:15 np0005603623 python3.9[89428]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:25:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 31 02:25:15 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 31 02:25:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:16.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 31 02:25:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 31 02:25:16 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 108 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=106/66 les/c/f=107/67/0 sis=108) [2] r=0 lpr=108 pi=[66,108)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:16 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 108 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=106/66 les/c/f=107/67/0 sis=108) [2] r=0 lpr=108 pi=[66,108)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:17 np0005603623 python3.9[89583]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:25:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:17.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:17 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.b scrub starts
Jan 31 02:25:17 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.b scrub ok
Jan 31 02:25:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 31 02:25:17 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 109 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=108/109 n=5 ec=49/34 lis/c=106/66 les/c/f=107/67/0 sis=108) [2] r=0 lpr=108 pi=[66,108)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:25:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 31 02:25:18 np0005603623 python3.9[89741]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:25:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:18.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 31 02:25:19 np0005603623 python3.9[89826]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:25:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:19.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:19 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Jan 31 02:25:19 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Jan 31 02:25:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:20.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:21.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:22.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:22 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Jan 31 02:25:22 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Jan 31 02:25:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:23.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 31 02:25:24 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 110 pg[9.19( v 40'1015 (0'0,40'1015] local-lis/les=74/75 n=5 ec=49/34 lis/c=74/74 les/c/f=75/75/0 sis=110 pruub=15.708768845s) [0] r=-1 lpr=110 pi=[74,110)/1 crt=40'1015 mlcod 0'0 active pruub 168.666458130s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:24 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 110 pg[9.19( v 40'1015 (0'0,40'1015] local-lis/les=74/75 n=5 ec=49/34 lis/c=74/74 les/c/f=75/75/0 sis=110 pruub=15.708709717s) [0] r=-1 lpr=110 pi=[74,110)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 168.666458130s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 31 02:25:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:24.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:24 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Jan 31 02:25:24 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Jan 31 02:25:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 31 02:25:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 31 02:25:25 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 111 pg[9.19( v 40'1015 (0'0,40'1015] local-lis/les=74/75 n=5 ec=49/34 lis/c=74/74 les/c/f=75/75/0 sis=111) [0]/[2] r=0 lpr=111 pi=[74,111)/1 crt=40'1015 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:25 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 111 pg[9.19( v 40'1015 (0'0,40'1015] local-lis/les=74/75 n=5 ec=49/34 lis/c=74/74 les/c/f=75/75/0 sis=111) [0]/[2] r=0 lpr=111 pi=[74,111)/1 crt=40'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:25.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:25 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.a scrub starts
Jan 31 02:25:25 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.a scrub ok
Jan 31 02:25:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 31 02:25:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 31 02:25:26 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 112 pg[9.19( v 40'1015 (0'0,40'1015] local-lis/les=111/112 n=5 ec=49/34 lis/c=74/74 les/c/f=75/75/0 sis=111) [0]/[2] async=[0] r=0 lpr=111 pi=[74,111)/1 crt=40'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:25:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:26.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:26 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Jan 31 02:25:26 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Jan 31 02:25:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 31 02:25:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 31 02:25:27 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 113 pg[9.19( v 40'1015 (0'0,40'1015] local-lis/les=111/112 n=5 ec=49/34 lis/c=111/74 les/c/f=112/75/0 sis=113 pruub=14.987223625s) [0] async=[0] r=-1 lpr=113 pi=[74,113)/1 crt=40'1015 mlcod 40'1015 active pruub 171.035125732s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:27 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 113 pg[9.19( v 40'1015 (0'0,40'1015] local-lis/les=111/112 n=5 ec=49/34 lis/c=111/74 les/c/f=112/75/0 sis=113 pruub=14.987106323s) [0] r=-1 lpr=113 pi=[74,113)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 171.035125732s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:27.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 31 02:25:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 31 02:25:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:28.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:29 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Jan 31 02:25:29 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Jan 31 02:25:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:29.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 31 02:25:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:30.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:31 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.f scrub starts
Jan 31 02:25:31 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.f scrub ok
Jan 31 02:25:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:31.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:32.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:33 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Jan 31 02:25:33 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Jan 31 02:25:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:33.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 31 02:25:33 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 117 pg[9.1b( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=5 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=117 pruub=11.843094826s) [0] r=-1 lpr=117 pi=[60,117)/1 crt=40'1015 mlcod 0'0 active pruub 174.618148804s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:33 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 117 pg[9.1b( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=5 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=117 pruub=11.843037605s) [0] r=-1 lpr=117 pi=[60,117)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 174.618148804s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:33 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 31 02:25:34 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.a deep-scrub starts
Jan 31 02:25:34 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.a deep-scrub ok
Jan 31 02:25:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:34.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:34 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 31 02:25:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 31 02:25:34 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 118 pg[9.1b( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=5 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=118) [0]/[2] r=0 lpr=118 pi=[60,118)/1 crt=40'1015 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:34 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 118 pg[9.1b( v 40'1015 (0'0,40'1015] local-lis/les=60/61 n=5 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=118) [0]/[2] r=0 lpr=118 pi=[60,118)/1 crt=40'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:35.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 31 02:25:35 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 119 pg[9.1b( v 40'1015 (0'0,40'1015] local-lis/les=118/119 n=5 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=118) [0]/[2] async=[0] r=0 lpr=118 pi=[60,118)/1 crt=40'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:25:35 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 31 02:25:36 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.e scrub starts
Jan 31 02:25:36 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.e scrub ok
Jan 31 02:25:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:36.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 31 02:25:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 31 02:25:36 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 31 02:25:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:25:36.973869) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:25:36 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 31 02:25:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844336973954, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7154, "num_deletes": 256, "total_data_size": 13224662, "memory_usage": 13405616, "flush_reason": "Manual Compaction"}
Jan 31 02:25:36 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 31 02:25:36 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 120 pg[9.1b( v 40'1015 (0'0,40'1015] local-lis/les=118/119 n=5 ec=49/34 lis/c=118/60 les/c/f=119/61/0 sis=120 pruub=14.963314056s) [0] async=[0] r=-1 lpr=120 pi=[60,120)/1 crt=40'1015 mlcod 40'1015 active pruub 180.830688477s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:36 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 120 pg[9.1b( v 40'1015 (0'0,40'1015] local-lis/les=118/119 n=5 ec=49/34 lis/c=118/60 les/c/f=119/61/0 sis=120 pruub=14.963251114s) [0] r=-1 lpr=120 pi=[60,120)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 180.830688477s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844337025077, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 7776453, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 249, "largest_seqno": 7159, "table_properties": {"data_size": 7749089, "index_size": 17905, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8389, "raw_key_size": 78301, "raw_average_key_size": 23, "raw_value_size": 7684003, "raw_average_value_size": 2296, "num_data_blocks": 793, "num_entries": 3346, "num_filter_entries": 3346, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 1769844145, "file_creation_time": 1769844336, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 51271 microseconds, and 20696 cpu microseconds.
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:25:37.025145) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 7776453 bytes OK
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:25:37.025166) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:25:37.026837) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:25:37.026857) EVENT_LOG_v1 {"time_micros": 1769844337026851, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:25:37.026875) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 13187576, prev total WAL file size 13187617, number of live WAL files 2.
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:25:37.028829) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(7594KB) 8(1648B)]
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844337028955, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 7778101, "oldest_snapshot_seqno": -1}
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3093 keys, 7772672 bytes, temperature: kUnknown
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844337075303, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 7772672, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7745999, "index_size": 17859, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7749, "raw_key_size": 74085, "raw_average_key_size": 23, "raw_value_size": 7684061, "raw_average_value_size": 2484, "num_data_blocks": 793, "num_entries": 3093, "num_filter_entries": 3093, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769844337, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:25:37.075583) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 7772672 bytes
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:25:37.077592) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 167.6 rd, 167.5 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(7.4, 0.0 +0.0 blob) out(7.4 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3351, records dropped: 258 output_compression: NoCompression
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:25:37.077623) EVENT_LOG_v1 {"time_micros": 1769844337077609, "job": 4, "event": "compaction_finished", "compaction_time_micros": 46416, "compaction_time_cpu_micros": 20556, "output_level": 6, "num_output_files": 1, "total_output_size": 7772672, "num_input_records": 3351, "num_output_records": 3093, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844337078665, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844337078711, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:25:37.028691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:25:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:37.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 31 02:25:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 31 02:25:38 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 121 pg[9.1d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=84/84 les/c/f=85/85/0 sis=121) [2] r=0 lpr=121 pi=[84,121)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 31 02:25:38 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 122 pg[9.1d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=84/84 les/c/f=85/85/0 sis=122) [2]/[1] r=-1 lpr=122 pi=[84,122)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:38 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 122 pg[9.1d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=84/84 les/c/f=85/85/0 sis=122) [2]/[1] r=-1 lpr=122 pi=[84,122)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:38.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 31 02:25:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:39.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 31 02:25:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 31 02:25:40 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 124 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=122/84 les/c/f=123/85/0 sis=124) [2] r=0 lpr=124 pi=[84,124)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:40 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 124 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=122/84 les/c/f=123/85/0 sis=124) [2] r=0 lpr=124 pi=[84,124)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:40.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:41.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 31 02:25:41 np0005603623 ceph-osd[79732]: osd.2 pg_epoch: 125 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=124/125 n=5 ec=49/34 lis/c=122/84 les/c/f=123/85/0 sis=124) [2] r=0 lpr=124 pi=[84,124)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:25:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:42.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:43.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 31 02:25:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 31 02:25:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:44.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 31 02:25:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 31 02:25:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:45.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:46 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:25:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 31 02:25:46 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.c scrub starts
Jan 31 02:25:46 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.c scrub ok
Jan 31 02:25:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:46.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:25:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Jan 31 02:25:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:47.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Jan 31 02:25:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Jan 31 02:25:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:48.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 31 02:25:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:49.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 31 02:25:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Jan 31 02:25:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:25:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:50.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:51.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:52.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:53 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Jan 31 02:25:53 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Jan 31 02:25:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:25:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:53.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:25:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:54.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:25:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:55.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:25:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:25:56 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Jan 31 02:25:56 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Jan 31 02:25:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 31 02:25:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:56.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 31 02:25:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:57.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:58 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Jan 31 02:25:58 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Jan 31 02:25:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:58.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:25:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:59.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:59 np0005603623 python3.9[90243]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:26:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:00.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:01.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:01 np0005603623 python3.9[90531]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 31 02:26:02 np0005603623 python3.9[90684]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 31 02:26:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:02.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:03 np0005603623 python3.9[90836]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:26:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:03.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:04 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.d deep-scrub starts
Jan 31 02:26:04 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.d deep-scrub ok
Jan 31 02:26:04 np0005603623 python3.9[91039]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 31 02:26:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:04.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:05.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:06 np0005603623 python3.9[91191]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:26:06 np0005603623 python3.9[91344]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:26:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:06.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:07 np0005603623 python3.9[91422]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:26:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:07.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:07 np0005603623 podman[91621]: 2026-01-31 07:26:07.765034123 +0000 UTC m=+0.049500404 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Jan 31 02:26:07 np0005603623 podman[91621]: 2026-01-31 07:26:07.890600585 +0000 UTC m=+0.175066856 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:26:08 np0005603623 podman[91882]: 2026-01-31 07:26:08.398265693 +0000 UTC m=+0.047553617 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:26:08 np0005603623 podman[91926]: 2026-01-31 07:26:08.462810841 +0000 UTC m=+0.053845053 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:26:08 np0005603623 podman[91882]: 2026-01-31 07:26:08.470311623 +0000 UTC m=+0.119599527 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:26:08 np0005603623 python3.9[91917]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:26:08 np0005603623 podman[91973]: 2026-01-31 07:26:08.661515855 +0000 UTC m=+0.055282295 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1793, vcs-type=git, architecture=x86_64, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, name=keepalived)
Jan 31 02:26:08 np0005603623 podman[91973]: 2026-01-31 07:26:08.67387562 +0000 UTC m=+0.067642010 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, name=keepalived, architecture=x86_64, version=2.2.4, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9)
Jan 31 02:26:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:08.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:09.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:09 np0005603623 python3.9[92287]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 31 02:26:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:26:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:26:10 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Jan 31 02:26:10 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Jan 31 02:26:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:10 np0005603623 python3.9[92441]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 31 02:26:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:10.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:11 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Jan 31 02:26:11 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Jan 31 02:26:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:11.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:11 np0005603623 python3.9[92594]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 02:26:12 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Jan 31 02:26:12 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Jan 31 02:26:12 np0005603623 python3.9[92747]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 31 02:26:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:12.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:13 np0005603623 python3.9[92899]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:26:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:13.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:14.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:15 np0005603623 python3.9[93053]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:26:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:15.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:16 np0005603623 python3.9[93255]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:26:16 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Jan 31 02:26:16 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Jan 31 02:26:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:16 np0005603623 python3.9[93334]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:26:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:16.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:17 np0005603623 python3.9[93486]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:26:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:17.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:17 np0005603623 python3.9[93564]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:26:18 np0005603623 python3.9[93717]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:26:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:18.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:19 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Jan 31 02:26:19 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Jan 31 02:26:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:19.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:20 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Jan 31 02:26:20 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Jan 31 02:26:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:20.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:20 np0005603623 python3.9[93869]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:26:21 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Jan 31 02:26:21 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Jan 31 02:26:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:21.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:21 np0005603623 python3.9[94021]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 31 02:26:22 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Jan 31 02:26:22 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Jan 31 02:26:22 np0005603623 python3.9[94172]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:26:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:22.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:23.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:23 np0005603623 python3.9[94374]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:26:23 np0005603623 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 31 02:26:24 np0005603623 systemd[1]: tuned.service: Deactivated successfully.
Jan 31 02:26:24 np0005603623 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 31 02:26:24 np0005603623 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 31 02:26:24 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Jan 31 02:26:24 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Jan 31 02:26:24 np0005603623 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 31 02:26:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:24.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:24 np0005603623 python3.9[94537]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 31 02:26:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:25.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:26.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:27.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:28 np0005603623 python3.9[94691]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:26:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:28.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:29 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Jan 31 02:26:29 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Jan 31 02:26:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:29.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:29 np0005603623 python3.9[94845]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:26:30 np0005603623 systemd[1]: session-34.scope: Deactivated successfully.
Jan 31 02:26:30 np0005603623 systemd[1]: session-34.scope: Consumed 58.630s CPU time.
Jan 31 02:26:30 np0005603623 systemd-logind[795]: Session 34 logged out. Waiting for processes to exit.
Jan 31 02:26:30 np0005603623 systemd-logind[795]: Removed session 34.
Jan 31 02:26:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:30.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:31 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Jan 31 02:26:31 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Jan 31 02:26:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:31.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:32.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:33 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.12 scrub starts
Jan 31 02:26:33 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.12 scrub ok
Jan 31 02:26:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:33.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:34 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.f scrub starts
Jan 31 02:26:34 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.f scrub ok
Jan 31 02:26:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:34.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:35 np0005603623 systemd-logind[795]: New session 35 of user zuul.
Jan 31 02:26:35 np0005603623 systemd[1]: Started Session 35 of User zuul.
Jan 31 02:26:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:35.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:36 np0005603623 python3.9[95028]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:26:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:36.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:37.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:37 np0005603623 python3.9[95185]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 31 02:26:38 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Jan 31 02:26:38 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Jan 31 02:26:38 np0005603623 python3.9[95339]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:26:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:38.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:39 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Jan 31 02:26:39 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Jan 31 02:26:39 np0005603623 python3.9[95423]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 02:26:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:39.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:40.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:41.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:41 np0005603623 python3.9[95577]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:26:42 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Jan 31 02:26:42 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Jan 31 02:26:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:42.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:43.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:44 np0005603623 python3.9[95781]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:26:44 np0005603623 python3.9[95935]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:26:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:44.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:44 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Jan 31 02:26:45 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Jan 31 02:26:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:45.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:45 np0005603623 python3.9[96087]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 31 02:26:45 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.a scrub starts
Jan 31 02:26:45 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.a scrub ok
Jan 31 02:26:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:46.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:46 np0005603623 python3.9[96238]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:26:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:47.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:47 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.b scrub starts
Jan 31 02:26:47 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.b scrub ok
Jan 31 02:26:47 np0005603623 python3.9[96396]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:26:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:48.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:48 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.c scrub starts
Jan 31 02:26:48 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.c scrub ok
Jan 31 02:26:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:49.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:50 np0005603623 python3.9[96550]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:26:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:50.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:51.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:51 np0005603623 python3.9[96838]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 31 02:26:52 np0005603623 python3.9[96989]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:26:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:52.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:53 np0005603623 python3.9[97143]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:26:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:53.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:26:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:54.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:26:55 np0005603623 python3.9[97297]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:26:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:55.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:56.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:57 np0005603623 python3.9[97451]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:26:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:57.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:57 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.4 scrub starts
Jan 31 02:26:57 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.4 scrub ok
Jan 31 02:26:58 np0005603623 python3.9[97606]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 31 02:26:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:58.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:59 np0005603623 systemd[1]: session-35.scope: Deactivated successfully.
Jan 31 02:26:59 np0005603623 systemd[1]: session-35.scope: Consumed 15.961s CPU time.
Jan 31 02:26:59 np0005603623 systemd-logind[795]: Session 35 logged out. Waiting for processes to exit.
Jan 31 02:26:59 np0005603623 systemd-logind[795]: Removed session 35.
Jan 31 02:26:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:26:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:59.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:00.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:01 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.d deep-scrub starts
Jan 31 02:27:01 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.d deep-scrub ok
Jan 31 02:27:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:01.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:02 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Jan 31 02:27:02 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Jan 31 02:27:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:02.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:03.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:04 np0005603623 systemd-logind[795]: New session 36 of user zuul.
Jan 31 02:27:04 np0005603623 systemd[1]: Started Session 36 of User zuul.
Jan 31 02:27:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:04.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:05 np0005603623 python3.9[97837]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:27:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:05.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:06 np0005603623 python3.9[97991]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:27:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:06.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:06 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.a scrub starts
Jan 31 02:27:07 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.a scrub ok
Jan 31 02:27:07 np0005603623 python3.9[98185]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:27:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:07.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:08 np0005603623 systemd[1]: session-36.scope: Deactivated successfully.
Jan 31 02:27:08 np0005603623 systemd[1]: session-36.scope: Consumed 1.950s CPU time.
Jan 31 02:27:08 np0005603623 systemd-logind[795]: Session 36 logged out. Waiting for processes to exit.
Jan 31 02:27:08 np0005603623 systemd-logind[795]: Removed session 36.
Jan 31 02:27:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:08.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:09.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:09 np0005603623 systemd[72574]: Created slice User Background Tasks Slice.
Jan 31 02:27:09 np0005603623 systemd[72574]: Starting Cleanup of User's Temporary Files and Directories...
Jan 31 02:27:09 np0005603623 systemd[72574]: Finished Cleanup of User's Temporary Files and Directories.
Jan 31 02:27:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:10.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:11.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:12.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:12 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Jan 31 02:27:12 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Jan 31 02:27:13 np0005603623 systemd-logind[795]: New session 37 of user zuul.
Jan 31 02:27:13 np0005603623 systemd[1]: Started Session 37 of User zuul.
Jan 31 02:27:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:13.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:14 np0005603623 python3.9[98369]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:27:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:14.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:15 np0005603623 python3.9[98523]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:27:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:15.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:16 np0005603623 python3.9[98793]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:27:16 np0005603623 podman[98854]: 2026-01-31 07:27:16.319782809 +0000 UTC m=+0.186344002 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:27:16 np0005603623 podman[98854]: 2026-01-31 07:27:16.443824315 +0000 UTC m=+0.310385458 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:27:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:16.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:16 np0005603623 python3.9[98970]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:27:17 np0005603623 podman[99095]: 2026-01-31 07:27:17.342359101 +0000 UTC m=+0.152799658 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:27:17 np0005603623 podman[99116]: 2026-01-31 07:27:17.412596805 +0000 UTC m=+0.053846116 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:27:17 np0005603623 podman[99095]: 2026-01-31 07:27:17.439709573 +0000 UTC m=+0.250150120 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:27:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:17.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:17 np0005603623 podman[99161]: 2026-01-31 07:27:17.876256335 +0000 UTC m=+0.207568993 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, vcs-type=git, release=1793, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public)
Jan 31 02:27:17 np0005603623 podman[99161]: 2026-01-31 07:27:17.893912883 +0000 UTC m=+0.225225621 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, com.redhat.component=keepalived-container, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=Ceph keepalived, name=keepalived, version=2.2.4, architecture=x86_64, description=keepalived for Ceph)
Jan 31 02:27:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:27:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:27:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:27:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:27:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:27:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:18.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:18 np0005603623 python3.9[99479]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:27:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:19.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:19 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Jan 31 02:27:19 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Jan 31 02:27:20 np0005603623 python3.9[99674]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:20 np0005603623 python3.9[99827]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:27:20 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Jan 31 02:27:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:20.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:20 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Jan 31 02:27:21 np0005603623 python3.9[99993]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:27:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:21.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:21 np0005603623 python3.9[100071]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:22 np0005603623 python3.9[100224]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:27:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:22.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:23 np0005603623 python3.9[100302]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:27:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:23.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:23 np0005603623 python3.9[100504]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:27:24 np0005603623 python3.9[100657]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:27:24 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.f scrub starts
Jan 31 02:27:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:24.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:24 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.f scrub ok
Jan 31 02:27:25 np0005603623 python3.9[100809]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:27:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:25.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:25 np0005603623 python3.9[100961]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:27:26 np0005603623 python3.9[101164]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:27:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:26.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:27:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:27:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:27.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:27 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Jan 31 02:27:27 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Jan 31 02:27:28 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Jan 31 02:27:28 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Jan 31 02:27:28 np0005603623 python3.9[101318]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:27:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:28.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:29 np0005603623 python3.9[101472]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:27:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:29.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:30 np0005603623 python3.9[101625]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:27:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:30.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:31 np0005603623 python3.9[101777]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:27:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:31.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:31 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.14 deep-scrub starts
Jan 31 02:27:31 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.14 deep-scrub ok
Jan 31 02:27:31 np0005603623 python3.9[101930]: ansible-service_facts Invoked
Jan 31 02:27:32 np0005603623 network[101947]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:27:32 np0005603623 network[101948]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:27:32 np0005603623 network[101950]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:27:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:32.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:33.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:33 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Jan 31 02:27:33 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Jan 31 02:27:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:34.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:35.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:36.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:37 np0005603623 python3.9[102404]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:27:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:37.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:38.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:39.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:39 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.1e scrub starts
Jan 31 02:27:39 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.1e scrub ok
Jan 31 02:27:40 np0005603623 python3.9[102558]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 31 02:27:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:40.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:41 np0005603623 python3.9[102711]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:27:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:41.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:41 np0005603623 python3.9[102789]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:42 np0005603623 python3.9[102942]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:27:42 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Jan 31 02:27:42 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Jan 31 02:27:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:42.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:43 np0005603623 python3.9[103020]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:43.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:44 np0005603623 python3.9[103223]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:44.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:45.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:46 np0005603623 python3.9[103376]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:27:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:46.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:47.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:47 np0005603623 python3.9[103460]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:27:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:27:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:48.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:27:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:49.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:49 np0005603623 systemd-logind[795]: Session 37 logged out. Waiting for processes to exit.
Jan 31 02:27:49 np0005603623 systemd[1]: session-37.scope: Deactivated successfully.
Jan 31 02:27:49 np0005603623 systemd[1]: session-37.scope: Consumed 20.572s CPU time.
Jan 31 02:27:49 np0005603623 systemd-logind[795]: Removed session 37.
Jan 31 02:27:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:50.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.014000448s ======
Jan 31 02:27:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:51.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.014000448s
Jan 31 02:27:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:52.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:53.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:54.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:55 np0005603623 systemd-logind[795]: New session 38 of user zuul.
Jan 31 02:27:55 np0005603623 systemd[1]: Started Session 38 of User zuul.
Jan 31 02:27:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:55.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:56 np0005603623 python3.9[103647]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:56.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:57 np0005603623 python3.9[103799]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:27:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:57.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:27:57 np0005603623 python3.9[103877]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:58 np0005603623 systemd[1]: session-38.scope: Deactivated successfully.
Jan 31 02:27:58 np0005603623 systemd[1]: session-38.scope: Consumed 1.188s CPU time.
Jan 31 02:27:58 np0005603623 systemd-logind[795]: Session 38 logged out. Waiting for processes to exit.
Jan 31 02:27:58 np0005603623 systemd-logind[795]: Removed session 38.
Jan 31 02:27:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:58.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:27:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:27:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:59.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:28:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:00.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:01.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:02 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Jan 31 02:28:02 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Jan 31 02:28:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:28:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:02.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:28:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:03.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:03 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.10 deep-scrub starts
Jan 31 02:28:03 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 10.10 deep-scrub ok
Jan 31 02:28:04 np0005603623 systemd-logind[795]: New session 39 of user zuul.
Jan 31 02:28:04 np0005603623 systemd[1]: Started Session 39 of User zuul.
Jan 31 02:28:04 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Jan 31 02:28:04 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Jan 31 02:28:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:04.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:05.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:05 np0005603623 python3.9[104112]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:28:06 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Jan 31 02:28:06 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Jan 31 02:28:06 np0005603623 python3.9[104269]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:06.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:07 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Jan 31 02:28:07 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Jan 31 02:28:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:07.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:07 np0005603623 python3.9[104444]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:08 np0005603623 python3.9[104523]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.qbhwlqx7 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:08.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:09 np0005603623 python3.9[104675]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:28:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:09.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:28:09 np0005603623 python3.9[104753]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.r35khzhy recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:10 np0005603623 python3.9[104906]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:28:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:10.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:11 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.13 scrub starts
Jan 31 02:28:11 np0005603623 python3.9[105058]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:11 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.13 scrub ok
Jan 31 02:28:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:11.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:11 np0005603623 python3.9[105136]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:28:12 np0005603623 python3.9[105289]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:12 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.b deep-scrub starts
Jan 31 02:28:12 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.b deep-scrub ok
Jan 31 02:28:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:28:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:12.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:28:12 np0005603623 python3.9[105367]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:28:13 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Jan 31 02:28:13 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Jan 31 02:28:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:28:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:13.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:28:13 np0005603623 python3.9[105519]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:14 np0005603623 python3.9[105672]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:14 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.5 deep-scrub starts
Jan 31 02:28:14 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.5 deep-scrub ok
Jan 31 02:28:14 np0005603623 python3.9[105750]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:28:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:14.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:28:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:15 np0005603623 python3.9[105902]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:15.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:16 np0005603623 python3.9[105980]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:16.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:17 np0005603623 python3.9[106133]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:28:17 np0005603623 systemd[1]: Reloading.
Jan 31 02:28:17 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:28:17 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:28:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:17.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:18 np0005603623 python3.9[106323]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:18 np0005603623 python3.9[106401]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:28:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:18.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:28:19 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Jan 31 02:28:19 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Jan 31 02:28:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:19.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:19 np0005603623 python3.9[106553]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:20 np0005603623 python3.9[106632]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:20 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Jan 31 02:28:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:20 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Jan 31 02:28:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:20.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:21 np0005603623 python3.9[106784]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:28:21 np0005603623 systemd[1]: Reloading.
Jan 31 02:28:21 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:28:21 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:28:21 np0005603623 systemd[1]: Starting Create netns directory...
Jan 31 02:28:21 np0005603623 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 31 02:28:21 np0005603623 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 31 02:28:21 np0005603623 systemd[1]: Finished Create netns directory.
Jan 31 02:28:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:21.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:22 np0005603623 python3.9[106977]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:28:22 np0005603623 network[106994]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:28:22 np0005603623 network[106995]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:28:22 np0005603623 network[106996]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:28:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:28:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:22.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:28:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:23.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.649589) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844504649800, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2650, "num_deletes": 251, "total_data_size": 5086331, "memory_usage": 5159328, "flush_reason": "Manual Compaction"}
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844504681775, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3315327, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7165, "largest_seqno": 9809, "table_properties": {"data_size": 3305440, "index_size": 5677, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3141, "raw_key_size": 27086, "raw_average_key_size": 21, "raw_value_size": 3282604, "raw_average_value_size": 2628, "num_data_blocks": 252, "num_entries": 1249, "num_filter_entries": 1249, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844336, "oldest_key_time": 1769844336, "file_creation_time": 1769844504, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 32422 microseconds, and 8801 cpu microseconds.
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.682014) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3315327 bytes OK
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.682055) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.684952) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.684991) EVENT_LOG_v1 {"time_micros": 1769844504684979, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.685053) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 5074010, prev total WAL file size 5074010, number of live WAL files 2.
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.686365) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3237KB)], [15(7590KB)]
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844504686487, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 11087999, "oldest_snapshot_seqno": -1}
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3821 keys, 9565337 bytes, temperature: kUnknown
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844504791101, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 9565337, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9534145, "index_size": 20522, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9605, "raw_key_size": 91995, "raw_average_key_size": 24, "raw_value_size": 9459644, "raw_average_value_size": 2475, "num_data_blocks": 896, "num_entries": 3821, "num_filter_entries": 3821, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769844504, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.791405) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 9565337 bytes
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.798361) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 105.9 rd, 91.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 7.4 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(6.2) write-amplify(2.9) OK, records in: 4342, records dropped: 521 output_compression: NoCompression
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.798397) EVENT_LOG_v1 {"time_micros": 1769844504798381, "job": 6, "event": "compaction_finished", "compaction_time_micros": 104721, "compaction_time_cpu_micros": 17970, "output_level": 6, "num_output_files": 1, "total_output_size": 9565337, "num_input_records": 4342, "num_output_records": 3821, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844504799141, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844504800411, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.686214) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.800520) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.800525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.800528) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.800531) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:28:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:28:24.800534) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:28:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:24.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:25.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:26 np0005603623 python3.9[107309]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:26 np0005603623 python3.9[107459]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:26 np0005603623 podman[107609]: 2026-01-31 07:28:26.907335026 +0000 UTC m=+0.072568757 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:28:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:26.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:27 np0005603623 podman[107609]: 2026-01-31 07:28:27.001474393 +0000 UTC m=+0.166708094 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Jan 31 02:28:27 np0005603623 python3.9[107803]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:27 np0005603623 podman[107867]: 2026-01-31 07:28:27.554600355 +0000 UTC m=+0.057661530 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:28:27 np0005603623 podman[107867]: 2026-01-31 07:28:27.601572921 +0000 UTC m=+0.104634076 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:28:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:28:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:27.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:28:27 np0005603623 podman[107979]: 2026-01-31 07:28:27.766652422 +0000 UTC m=+0.041450369 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, io.buildah.version=1.28.2, release=1793, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, description=keepalived for Ceph)
Jan 31 02:28:27 np0005603623 podman[107979]: 2026-01-31 07:28:27.775699723 +0000 UTC m=+0.050497650 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, vcs-type=git, com.redhat.component=keepalived-container, name=keepalived, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, version=2.2.4, io.openshift.tags=Ceph keepalived, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9)
Jan 31 02:28:28 np0005603623 python3.9[108192]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:28 np0005603623 python3.9[108328]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:28:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:28:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:28:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:28:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:28:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:28.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:29 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Jan 31 02:28:29 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Jan 31 02:28:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:29.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:29 np0005603623 python3.9[108480]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 31 02:28:29 np0005603623 systemd[1]: Starting Time & Date Service...
Jan 31 02:28:29 np0005603623 systemd[1]: Started Time & Date Service.
Jan 31 02:28:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:30 np0005603623 python3.9[108637]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:30.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:31 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Jan 31 02:28:31 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Jan 31 02:28:31 np0005603623 python3.9[108789]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:31.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:31 np0005603623 python3.9[108867]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:32 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Jan 31 02:28:32 np0005603623 ceph-osd[79732]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Jan 31 02:28:32 np0005603623 python3.9[109020]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:28:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:32.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:28:33 np0005603623 python3.9[109098]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.blohhjuj recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:28:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:33.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:28:33 np0005603623 python3.9[109250]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:34 np0005603623 python3.9[109329]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:35.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:35 np0005603623 python3.9[109481]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:28:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:35.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:28:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:28:36 np0005603623 python3[109685]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 31 02:28:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:37.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:37 np0005603623 python3.9[109837]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:37 np0005603623 python3.9[109915]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:37.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:38 np0005603623 python3.9[110068]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:39.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:39 np0005603623 python3.9[110193]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844517.8755467-902-257539494521364/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:39.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:40 np0005603623 python3.9[110346]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:40 np0005603623 python3.9[110425]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:28:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:41.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:28:41 np0005603623 python3.9[110577]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:41.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:42 np0005603623 python3.9[110655]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:42 np0005603623 python3.9[110808]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:43.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:43 np0005603623 python3.9[110886]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:43.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:44 np0005603623 python3.9[111038]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:28:45 np0005603623 python3.9[111244]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:45.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:28:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:45.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:28:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:45 np0005603623 python3.9[111396]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:46 np0005603623 python3.9[111549]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:47.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:47 np0005603623 python3.9[111701]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 31 02:28:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:28:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:47.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:28:48 np0005603623 python3.9[111853]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 31 02:28:48 np0005603623 systemd[1]: session-39.scope: Deactivated successfully.
Jan 31 02:28:48 np0005603623 systemd[1]: session-39.scope: Consumed 24.672s CPU time.
Jan 31 02:28:48 np0005603623 systemd-logind[795]: Session 39 logged out. Waiting for processes to exit.
Jan 31 02:28:48 np0005603623 systemd-logind[795]: Removed session 39.
Jan 31 02:28:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:49.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:49.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:28:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:51.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:28:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:51.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:53.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:53.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:54 np0005603623 systemd-logind[795]: New session 40 of user zuul.
Jan 31 02:28:54 np0005603623 systemd[1]: Started Session 40 of User zuul.
Jan 31 02:28:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:55.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:55 np0005603623 python3.9[112037]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 31 02:28:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:55.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:56 np0005603623 python3.9[112189]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:28:56 np0005603623 python3.9[112344]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 31 02:28:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:57.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:57 np0005603623 python3.9[112496]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.6dawx2gh follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:57.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:58 np0005603623 python3.9[112622]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.6dawx2gh mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844537.2083688-110-18886889164849/.source.6dawx2gh _original_basename=.lz6lye9h follow=False checksum=894df0945bb562bf664b2d53d15fbd1da03ff944 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:28:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:59.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:28:59 np0005603623 python3.9[112774]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:28:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:28:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:28:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:59.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:28:59 np0005603623 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 31 02:29:00 np0005603623 python3.9[112930]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCXc4C5rCDfOfKEuMVHI9SatZ+NRO9lp335K0yZ19CDCOGSUNO2lblpRlgxO3tw3S+UGGiC/7/HHeZBA2Zd+SUVMb7ytbl5c3+XuZIIQF6DyIIDSELf0FoE0NhuSjKFilPsxyxxGYgH+gVaTZkuGhDoljaywQBSPGZdDwejVKWPVuui5xe0X4T0WVfT5avLSpIL3WjJ9hmzEaR0dUqrbKvPUAXJPDqQOZbQZbpXDIi48NPUDFwByej1xHWHRQaPJ/M6AsyrZKP/hiF2xt0mCIk1FANldusq4OUs9r/0KTVrPRCpSrsSimKBtEMJVdxqxAasE7H07sSdwFcWNC21LtsH8+/LM0oofIZ3D0Lom0NoLaC+Ocy2vqbIhOPYJ6c7Q8J/p4NFiA/lD+bgyjOOnm3Ls4VaaHXUyknu259henkVzJ+iZuRNY8ki345nrzPLoLYyxVwRkSuONyYlRp36jjp0QIL9kXLFlJ2OTHvb9FUhlG7RnxzPeHZhsihSHJv1rgU=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAT2MDVMbPz3xtbIO31qZj2gzOQiz4a8pTNWAmd0+CUW#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFU8ym/rLGJxMpEsk09j3JHOh1hW4Vrm23tIOjn4/YJIrK1UFRFiQLDm+yZuj1NhWfbg71SK8ZuZ2miEJ20BHno=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAgshePGnD7oc3Zg8kfD9lUGSfPfE1OzPUGBHE12jLoyHnXwKTxYFYSMTWRcYgdFu4HaP0ShO1gEQF+1nDXxrozH/m2qxK/YPC5cVYCPvscwRdlyUNPOV0rpiruVZptTQ1iibsmRwMbxliXD2t13CtsrNjy9iuLgtvvnkfUh0wZKcZ8Jglg6E4vRTBPgXo3fJCfPF9Iz7GE50DpWAU8OnoLNlOf54/tcd8CyOrmLF9RwHTgNtN9FXscdQ3/A8avCF0WPWNUmfLFc20yOtfrq/xxjJMLn4KOZu1D1yjK5BSJu2pv/j0NPrTFKgPKYWjiXPdttcyubkXNZP96jkK9dgTgsEGRKuM83QpDIu7823wv4/GtEi+IsJeyqCN+3VAJo9hDB9eES8qlX4jAg6Kxen1oNkL9M+tz7N0BSdnxbS3skWEw6MsHlsBLOw7KMYe8gq8JoqHLBKBFQZZbjwaK5kNTeu6l5zAYERpt8uAEZkplq2vV5+4EOh7RPncmKuH0Xs=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF//s6MNfOt3MK/jBcrJ5VkyeSY5eg1jUHN32BLTGZtT#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEtIHGkRVmmqcsRoXLuIEWyuaX3BoKld3DircbfvRpdFLzOwbxRaZ6uUN5f7sBun3oAcQLdnixnG3R/YK8L7HpM=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCSEo2WrFN8DnR2/d+p3YtsWos96nHz1MZInXN3md5cJXE0icMDwEWJuGIDUd5e0SA6Q7i33i/WIEmt/wGMoNhoTI+f3plB2NyAn5vyVQGTZv7m+tOLQI3/k50Kxnpu0c5gO509yln6RcLe4MutF0imS/fINCM+Nznh7oKbn6hELTDlxDz0JH8dNsZGmtVmgnhwIrglpxAg/WpeOWkCmuuXmysx1JcAhIK5016MzaM9cOtHAGzj5s0GE7nQoH4yG0Ak3zMU/DPKr91Xq/m9PCnGKautoHmHgrEG6u+1WubtakbBxlfmroKbvrIFL6KKQzY0SiTrBsH3nZRaFGCqE0ZEyHvJz8AO3quWg2oaXRJWN98f7k3l5dtVJIuwyJxVnv6fUGuLbGxOp4T6UDPqC7b2Eg17EtpUjy77F/+8yrX6NH+hXwcWBwHelRCDSiceGQTm1uexb8Xo1R1Wt9h24H2yRKPFrqzf1R9J2vipDouDo7RLefAiCXEJDdlewdKUM5c=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILwGOCpzCDE8uIHb4RBldbKfEvxhUdsBT4K7sPU4vZLU#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLS8teLqq0Lmt8g22OKhtEhLCXd5cBLM6W2oDJcWxQl8DloBMMFjgDlHt0rzjMKEL0SpxkPbH7sPV1zbWKKJI9M=#012 create=True mode=0644 path=/tmp/ansible.6dawx2gh state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:01.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:01 np0005603623 python3.9[113082]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.6dawx2gh' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:29:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:01.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:02 np0005603623 python3.9[113237]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.6dawx2gh state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:02 np0005603623 systemd[1]: session-40.scope: Deactivated successfully.
Jan 31 02:29:02 np0005603623 systemd[1]: session-40.scope: Consumed 4.346s CPU time.
Jan 31 02:29:02 np0005603623 systemd-logind[795]: Session 40 logged out. Waiting for processes to exit.
Jan 31 02:29:02 np0005603623 systemd-logind[795]: Removed session 40.
Jan 31 02:29:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:03.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:03.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:05.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:05.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:07.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:07.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:09.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:09.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:09 np0005603623 systemd-logind[795]: New session 41 of user zuul.
Jan 31 02:29:09 np0005603623 systemd[1]: Started Session 41 of User zuul.
Jan 31 02:29:10 np0005603623 python3.9[113470]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:29:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:11.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:11.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:12 np0005603623 python3.9[113626]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 31 02:29:12 np0005603623 python3.9[113781]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:29:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:13.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:13 np0005603623 python3.9[113934]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:29:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:13.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:14 np0005603623 python3.9[114088]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:29:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:15.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:15 np0005603623 python3.9[114240]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:15 np0005603623 systemd[1]: session-41.scope: Deactivated successfully.
Jan 31 02:29:15 np0005603623 systemd[1]: session-41.scope: Consumed 3.119s CPU time.
Jan 31 02:29:15 np0005603623 systemd-logind[795]: Session 41 logged out. Waiting for processes to exit.
Jan 31 02:29:15 np0005603623 systemd-logind[795]: Removed session 41.
Jan 31 02:29:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:15.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:17.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:17.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:19.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:19.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:21.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:21 np0005603623 systemd-logind[795]: New session 42 of user zuul.
Jan 31 02:29:21 np0005603623 systemd[1]: Started Session 42 of User zuul.
Jan 31 02:29:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:21.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:22 np0005603623 python3.9[114421]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:29:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:29:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:23.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:29:23 np0005603623 python3.9[114578]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:29:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:23.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:24 np0005603623 python3.9[114662]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 02:29:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:25.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:25.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:26 np0005603623 python3.9[114865]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:29:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:27.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:27.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:27 np0005603623 python3.9[115016]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 02:29:28 np0005603623 python3.9[115167]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:29:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:29.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:29 np0005603623 python3.9[115317]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:29:29 np0005603623 systemd[1]: session-42.scope: Deactivated successfully.
Jan 31 02:29:29 np0005603623 systemd[1]: session-42.scope: Consumed 5.588s CPU time.
Jan 31 02:29:29 np0005603623 systemd-logind[795]: Session 42 logged out. Waiting for processes to exit.
Jan 31 02:29:29 np0005603623 systemd-logind[795]: Removed session 42.
Jan 31 02:29:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:29.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:31.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:31.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:33.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:33.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.242279) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844574242414, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 883, "num_deletes": 250, "total_data_size": 1859892, "memory_usage": 1880024, "flush_reason": "Manual Compaction"}
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844574252185, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 785535, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9814, "largest_seqno": 10692, "table_properties": {"data_size": 782043, "index_size": 1272, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8741, "raw_average_key_size": 19, "raw_value_size": 774753, "raw_average_value_size": 1768, "num_data_blocks": 56, "num_entries": 438, "num_filter_entries": 438, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844505, "oldest_key_time": 1769844505, "file_creation_time": 1769844574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 9982 microseconds, and 4209 cpu microseconds.
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.252256) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 785535 bytes OK
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.252279) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.255651) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.255753) EVENT_LOG_v1 {"time_micros": 1769844574255672, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.255777) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1855405, prev total WAL file size 1855405, number of live WAL files 2.
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.256643) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(767KB)], [18(9341KB)]
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844574256701, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 10350872, "oldest_snapshot_seqno": -1}
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 3773 keys, 7750800 bytes, temperature: kUnknown
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844574349626, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 7750800, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7722767, "index_size": 17491, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9477, "raw_key_size": 91467, "raw_average_key_size": 24, "raw_value_size": 7651812, "raw_average_value_size": 2028, "num_data_blocks": 764, "num_entries": 3773, "num_filter_entries": 3773, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769844574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.349837) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 7750800 bytes
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.351095) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 111.3 rd, 83.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 9.1 +0.0 blob) out(7.4 +0.0 blob), read-write-amplify(23.0) write-amplify(9.9) OK, records in: 4259, records dropped: 486 output_compression: NoCompression
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.351145) EVENT_LOG_v1 {"time_micros": 1769844574351106, "job": 8, "event": "compaction_finished", "compaction_time_micros": 92981, "compaction_time_cpu_micros": 29756, "output_level": 6, "num_output_files": 1, "total_output_size": 7750800, "num_input_records": 4259, "num_output_records": 3773, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844574351315, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844574352186, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.256489) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.352395) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.352406) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.352411) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.352415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:29:34.352422) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:29:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:35.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:35 np0005603623 systemd-logind[795]: New session 43 of user zuul.
Jan 31 02:29:35 np0005603623 systemd[1]: Started Session 43 of User zuul.
Jan 31 02:29:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:35.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:36 np0005603623 python3.9[115571]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:29:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:29:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:29:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:29:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:37.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:37.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:38 np0005603623 python3.9[115786]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:38 np0005603623 python3.9[115939]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:39.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:39 np0005603623 python3.9[116091]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:39.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:39 np0005603623 python3.9[116214]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844578.8926558-161-243788528752430/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=77c7bfdb54766e5798b52884d79620cacdfdd15e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:40 np0005603623 python3.9[116367]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:40 np0005603623 python3.9[116490]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844580.053268-161-154452396415191/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=a66cd34ae464c50bbe4c963e6eef9b60dc2a1e49 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:41.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:41 np0005603623 python3.9[116642]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:41.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:41 np0005603623 python3.9[116765]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844581.0139828-161-164652306701673/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=d4ed21120866ed41bebc0a6ec40cdc520792cb54 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:42 np0005603623 python3.9[116918]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:29:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:29:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:43.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:43 np0005603623 python3.9[117120]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:43.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:43 np0005603623 python3.9[117272]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:44 np0005603623 python3.9[117396]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844583.383958-332-189947389894123/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=81add942d3c85c11cdc326c88b4f4e07048619a0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:44 np0005603623 python3.9[117598]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:45.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:45 np0005603623 python3.9[117721]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844584.4278102-332-242413750662030/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=649eeea41a1e15889a1c750fd61fb88aa589bc91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:45.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:45 np0005603623 python3.9[117873]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:46 np0005603623 python3.9[117997]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844585.6172419-332-3223137072058/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=bf4ee34f56f6208d06fee554d6bb7d111cb96ec2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:47 np0005603623 python3.9[118149]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:47.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:47 np0005603623 python3.9[118301]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:47.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:48 np0005603623 python3.9[118454]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:48 np0005603623 python3.9[118577]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844587.8575623-513-11329066501254/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=c72014b7baba327e3f29bfc9d952bf308baf7e5e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:49.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:49 np0005603623 python3.9[118729]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:49 np0005603623 python3.9[118852]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844588.8952653-513-139119819755942/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=649eeea41a1e15889a1c750fd61fb88aa589bc91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:49.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:50 np0005603623 python3.9[119005]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:50 np0005603623 python3.9[119128]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844589.9231143-513-75966541142288/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=1ba628adef70a03c14745a3d104bc8915666a89a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:51.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:51.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:52 np0005603623 python3.9[119280]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:52 np0005603623 python3.9[119433]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:53 np0005603623 python3.9[119556]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844592.2069588-714-136802028853360/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:53.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:53 np0005603623 python3.9[119708]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:53.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:54 np0005603623 python3.9[119861]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:54 np0005603623 python3.9[119984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844593.9068432-784-149509130308949/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:55.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:55 np0005603623 python3.9[120136]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:55.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:56 np0005603623 python3.9[120288]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:56 np0005603623 python3.9[120412]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844595.6827917-856-95184624450625/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:57.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:57 np0005603623 python3.9[120564]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:57.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:57 np0005603623 python3.9[120716]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:58 np0005603623 python3.9[120840]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844597.3889773-928-189677343382680/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:59 np0005603623 python3.9[120992]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:29:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:59.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:29:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:29:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:59.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:00 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 02:30:00 np0005603623 python3.9[121146]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:01.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:01 np0005603623 python3.9[121269]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844599.2379024-1000-181889146240326/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:01.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:02 np0005603623 python3.9[121421]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:30:02 np0005603623 python3.9[121574]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:03 np0005603623 python3.9[121697]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844602.191054-1103-131610426066606/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:03.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:03 np0005603623 systemd[1]: session-43.scope: Deactivated successfully.
Jan 31 02:30:03 np0005603623 systemd[1]: session-43.scope: Consumed 19.317s CPU time.
Jan 31 02:30:03 np0005603623 systemd-logind[795]: Session 43 logged out. Waiting for processes to exit.
Jan 31 02:30:03 np0005603623 systemd-logind[795]: Removed session 43.
Jan 31 02:30:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:03.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:05.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:05.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:07.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:07.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:30:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:09.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:30:09 np0005603623 systemd-logind[795]: New session 44 of user zuul.
Jan 31 02:30:09 np0005603623 systemd[1]: Started Session 44 of User zuul.
Jan 31 02:30:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:09.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:09 np0005603623 python3.9[121930]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:10 np0005603623 python3.9[122083]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:11.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:11 np0005603623 python3.9[122206]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844610.3180583-64-87442546860206/.source.conf _original_basename=ceph.conf follow=False checksum=23cbd0a652332596774a4195d9b5b25af094d504 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:11.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:11 np0005603623 python3.9[122358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:12 np0005603623 python3.9[122482]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844611.5770373-64-54001827481070/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=35152db97829fbbc30ac5e5c6e1f42921e77a1a7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:12 np0005603623 systemd[1]: session-44.scope: Deactivated successfully.
Jan 31 02:30:12 np0005603623 systemd[1]: session-44.scope: Consumed 2.111s CPU time.
Jan 31 02:30:12 np0005603623 systemd-logind[795]: Session 44 logged out. Waiting for processes to exit.
Jan 31 02:30:12 np0005603623 systemd-logind[795]: Removed session 44.
Jan 31 02:30:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:13.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:13.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:30:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:15.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:30:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:15.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:17.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:17.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:18 np0005603623 systemd-logind[795]: New session 45 of user zuul.
Jan 31 02:30:18 np0005603623 systemd[1]: Started Session 45 of User zuul.
Jan 31 02:30:18 np0005603623 python3.9[122663]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:30:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:19.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:19.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:20 np0005603623 python3.9[122819]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:30:20 np0005603623 python3.9[122972]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:30:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:21.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:21 np0005603623 python3.9[123122]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:30:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:21.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:22 np0005603623 python3.9[123274]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 31 02:30:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:23.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:23.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:23 np0005603623 dbus-broker-launch[782]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 31 02:30:24 np0005603623 python3.9[123431]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:30:25 np0005603623 python3.9[123566]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:30:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:25.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:25.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:27.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:27 np0005603623 python3.9[123720]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:30:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:27.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:28 np0005603623 python3[123875]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 31 02:30:28 np0005603623 python3.9[124028]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:29.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:29 np0005603623 python3.9[124180]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:29.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:30 np0005603623 python3.9[124258]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:30 np0005603623 python3.9[124411]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:31 np0005603623 python3.9[124489]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.j1ucjmn9 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:31.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:31.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:31 np0005603623 python3.9[124641]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:32 np0005603623 python3.9[124720]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:32 np0005603623 python3.9[124872]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:33.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:33 np0005603623 python3[125025]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 31 02:30:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:33.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:34 np0005603623 python3.9[125178]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:35 np0005603623 python3.9[125303]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844633.9846587-434-265015642228667/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:30:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:35.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:30:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:35.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:35 np0005603623 python3.9[125455]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:36 np0005603623 python3.9[125581]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844635.423919-478-14382153029705/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:37.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:37 np0005603623 python3.9[125733]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:37 np0005603623 python3.9[125858]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844636.7353315-524-193932247231452/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:37.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:38 np0005603623 python3.9[126011]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:39 np0005603623 python3.9[126136]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844637.9891894-568-273935811331939/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:39.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:39 np0005603623 python3.9[126288]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:39.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:40 np0005603623 python3.9[126413]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844639.1984243-614-255860065274937/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:40 np0005603623 python3.9[126566]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:41.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:41 np0005603623 python3.9[126718]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:41.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:42 np0005603623 python3.9[126874]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:43 np0005603623 python3.9[127123]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:43.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:43 np0005603623 python3.9[127310]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:30:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:43.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:30:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:30:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:30:44 np0005603623 python3.9[127465]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:45 np0005603623 python3.9[127620]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:45.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:45.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:46 np0005603623 python3.9[127820]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:30:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:47.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:47 np0005603623 python3.9[127974]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:9e:41:65:cf" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:47 np0005603623 ovs-vsctl[127975]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:9e:41:65:cf external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 31 02:30:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:47.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:48 np0005603623 python3.9[128127]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:49 np0005603623 python3.9[128314]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:49 np0005603623 ovs-vsctl[128334]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 31 02:30:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:49.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:49 np0005603623 python3.9[128484]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:30:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:30:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:30:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:49.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:50 np0005603623 python3.9[128639]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:30:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:51 np0005603623 python3.9[128791]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:51.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:51 np0005603623 python3.9[128869]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:30:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:30:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:51.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:30:52 np0005603623 python3.9[129021]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:52 np0005603623 python3.9[129100]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:30:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:53.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:53 np0005603623 python3.9[129252]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:53.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:53 np0005603623 python3.9[129404]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:54 np0005603623 python3.9[129483]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:54 np0005603623 python3.9[129635]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:55.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:55 np0005603623 python3.9[129713]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:55.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:56 np0005603623 python3.9[129865]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:30:56 np0005603623 systemd[1]: Reloading.
Jan 31 02:30:56 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:30:56 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:30:57 np0005603623 python3.9[130055]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:30:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:57.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:30:57 np0005603623 python3.9[130133]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:57.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:58 np0005603623 python3.9[130285]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:58 np0005603623 python3.9[130364]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:59.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:59 np0005603623 python3.9[130516]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:30:59 np0005603623 systemd[1]: Reloading.
Jan 31 02:30:59 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:30:59 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:30:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:30:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:59.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:59 np0005603623 systemd[1]: Starting Create netns directory...
Jan 31 02:30:59 np0005603623 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 31 02:30:59 np0005603623 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 31 02:30:59 np0005603623 systemd[1]: Finished Create netns directory.
Jan 31 02:31:00 np0005603623 python3.9[130711]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:01.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:01 np0005603623 python3.9[130863]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:31:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:01.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:31:02 np0005603623 python3.9[130986]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844661.2383826-1366-254244001417156/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:03.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:03 np0005603623 python3.9[131139]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:03.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:03 np0005603623 python3.9[131291]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:04 np0005603623 python3.9[131444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:05 np0005603623 python3.9[131567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844664.1579647-1465-124359470145741/.source.json _original_basename=.3mvp518b follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:05.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:05 np0005603623 python3.9[131767]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:31:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:05.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:31:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:31:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:07.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:31:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:07.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:07 np0005603623 python3.9[132191]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 31 02:31:08 np0005603623 python3.9[132344]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 02:31:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:31:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:09.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:31:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:09.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:10 np0005603623 python3[132496]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 02:31:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:11.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:11.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:31:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:13.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:31:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:13.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:14 np0005603623 podman[132509]: 2026-01-31 07:31:14.314305295 +0000 UTC m=+4.252490203 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 31 02:31:14 np0005603623 podman[132631]: 2026-01-31 07:31:14.41494172 +0000 UTC m=+0.045392288 container create cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 02:31:14 np0005603623 podman[132631]: 2026-01-31 07:31:14.388019253 +0000 UTC m=+0.018469851 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 31 02:31:14 np0005603623 python3[132496]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host 
--privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 31 02:31:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:31:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:15.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:31:15 np0005603623 python3.9[132821]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:31:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:15.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:15 np0005603623 python3.9[132975]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:16 np0005603623 python3.9[133052]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:31:17 np0005603623 python3.9[133203]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769844676.4586985-1699-226428686815834/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:31:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:17.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:31:17 np0005603623 python3.9[133279]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:31:17 np0005603623 systemd[1]: Reloading.
Jan 31 02:31:17 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:31:17 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:31:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:17.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:18 np0005603623 python3.9[133391]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:31:18 np0005603623 systemd[1]: Reloading.
Jan 31 02:31:18 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:31:18 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:31:18 np0005603623 systemd[1]: Starting ovn_controller container...
Jan 31 02:31:18 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:31:18 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/343859af97c76bc0c6b84a6d2d802949882f000816824e65692704a09fd263d5/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 31 02:31:18 np0005603623 systemd[1]: Started /usr/bin/podman healthcheck run cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6.
Jan 31 02:31:18 np0005603623 podman[133433]: 2026-01-31 07:31:18.821739285 +0000 UTC m=+0.131447955 container init cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:31:18 np0005603623 ovn_controller[133449]: + sudo -E kolla_set_configs
Jan 31 02:31:18 np0005603623 podman[133433]: 2026-01-31 07:31:18.85794716 +0000 UTC m=+0.167655810 container start cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller)
Jan 31 02:31:18 np0005603623 edpm-start-podman-container[133433]: ovn_controller
Jan 31 02:31:18 np0005603623 systemd[1]: Created slice User Slice of UID 0.
Jan 31 02:31:18 np0005603623 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 31 02:31:18 np0005603623 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 31 02:31:18 np0005603623 systemd[1]: Starting User Manager for UID 0...
Jan 31 02:31:18 np0005603623 edpm-start-podman-container[133432]: Creating additional drop-in dependency for "ovn_controller" (cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6)
Jan 31 02:31:18 np0005603623 podman[133456]: 2026-01-31 07:31:18.9290788 +0000 UTC m=+0.063963994 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 31 02:31:18 np0005603623 systemd[1]: cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6-67954f7cccfb29b4.service: Main process exited, code=exited, status=1/FAILURE
Jan 31 02:31:18 np0005603623 systemd[1]: cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6-67954f7cccfb29b4.service: Failed with result 'exit-code'.
Jan 31 02:31:18 np0005603623 systemd[1]: Reloading.
Jan 31 02:31:19 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:31:19 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:31:19 np0005603623 systemd[133481]: Queued start job for default target Main User Target.
Jan 31 02:31:19 np0005603623 systemd[133481]: Created slice User Application Slice.
Jan 31 02:31:19 np0005603623 systemd[133481]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 31 02:31:19 np0005603623 systemd[133481]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 02:31:19 np0005603623 systemd[133481]: Reached target Paths.
Jan 31 02:31:19 np0005603623 systemd[133481]: Reached target Timers.
Jan 31 02:31:19 np0005603623 systemd[133481]: Starting D-Bus User Message Bus Socket...
Jan 31 02:31:19 np0005603623 systemd[133481]: Starting Create User's Volatile Files and Directories...
Jan 31 02:31:19 np0005603623 systemd[133481]: Listening on D-Bus User Message Bus Socket.
Jan 31 02:31:19 np0005603623 systemd[133481]: Reached target Sockets.
Jan 31 02:31:19 np0005603623 systemd[133481]: Finished Create User's Volatile Files and Directories.
Jan 31 02:31:19 np0005603623 systemd[133481]: Reached target Basic System.
Jan 31 02:31:19 np0005603623 systemd[133481]: Reached target Main User Target.
Jan 31 02:31:19 np0005603623 systemd[133481]: Startup finished in 131ms.
Jan 31 02:31:19 np0005603623 systemd[1]: Started User Manager for UID 0.
Jan 31 02:31:19 np0005603623 systemd[1]: Started ovn_controller container.
Jan 31 02:31:19 np0005603623 systemd[1]: Started Session c1 of User root.
Jan 31 02:31:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:19.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: INFO:__main__:Validating config file
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: INFO:__main__:Writing out command to execute
Jan 31 02:31:19 np0005603623 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: ++ cat /run_command
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: + ARGS=
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: + sudo kolla_copy_cacerts
Jan 31 02:31:19 np0005603623 systemd[1]: Started Session c2 of User root.
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: + [[ ! -n '' ]]
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: + . kolla_extend_start
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: + umask 0022
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 31 02:31:19 np0005603623 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 31 02:31:19 np0005603623 NetworkManager[48970]: <info>  [1769844679.3012] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 31 02:31:19 np0005603623 NetworkManager[48970]: <info>  [1769844679.3019] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:31:19 np0005603623 NetworkManager[48970]: <warn>  [1769844679.3022] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:31:19 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:31:19 np0005603623 NetworkManager[48970]: <info>  [1769844679.3028] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 31 02:31:19 np0005603623 NetworkManager[48970]: <info>  [1769844679.3033] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 31 02:31:19 np0005603623 NetworkManager[48970]: <info>  [1769844679.3036] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 31 02:31:19 np0005603623 kernel: br-int: entered promiscuous mode
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00017|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00018|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00019|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00020|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00021|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00022|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00023|main|INFO|OVS feature set changed, force recompute.
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 31 02:31:19 np0005603623 systemd-udevd[133586]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 31 02:31:19 np0005603623 NetworkManager[48970]: <info>  [1769844679.3487] manager: (ovn-71aaf7-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 31 02:31:19 np0005603623 NetworkManager[48970]: <info>  [1769844679.3500] manager: (ovn-59a8b9-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 31 02:31:19 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:19Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 31 02:31:19 np0005603623 kernel: genev_sys_6081: entered promiscuous mode
Jan 31 02:31:19 np0005603623 NetworkManager[48970]: <info>  [1769844679.3653] device (genev_sys_6081): carrier: link connected
Jan 31 02:31:19 np0005603623 NetworkManager[48970]: <info>  [1769844679.3661] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Jan 31 02:31:19 np0005603623 NetworkManager[48970]: <info>  [1769844679.7412] manager: (ovn-bd097f-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 31 02:31:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:19.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:20 np0005603623 python3.9[133717]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 31 02:31:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:31:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:21.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:31:21 np0005603623 python3.9[133869]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:21.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:21 np0005603623 python3.9[133992]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844681.0321305-1834-158729280545114/.source.yaml _original_basename=.6zxl4d3o follow=False checksum=869a4744df33825307102f8d7b13c7e3fcbb8f59 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:22 np0005603623 python3.9[134145]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:31:22 np0005603623 ovs-vsctl[134146]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 31 02:31:23 np0005603623 python3.9[134298]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:31:23 np0005603623 ovs-vsctl[134300]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 31 02:31:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:31:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:23.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:31:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:23.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:24 np0005603623 python3.9[134453]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:31:24 np0005603623 ovs-vsctl[134455]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 31 02:31:24 np0005603623 systemd-logind[795]: Session 45 logged out. Waiting for processes to exit.
Jan 31 02:31:24 np0005603623 systemd[1]: session-45.scope: Deactivated successfully.
Jan 31 02:31:24 np0005603623 systemd[1]: session-45.scope: Consumed 49.413s CPU time.
Jan 31 02:31:24 np0005603623 systemd-logind[795]: Removed session 45.
Jan 31 02:31:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:25.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:25.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:31:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:27.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:31:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:27.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:31:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:29.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:31:29 np0005603623 systemd[1]: Stopping User Manager for UID 0...
Jan 31 02:31:29 np0005603623 systemd[133481]: Activating special unit Exit the Session...
Jan 31 02:31:29 np0005603623 systemd[133481]: Stopped target Main User Target.
Jan 31 02:31:29 np0005603623 systemd[133481]: Stopped target Basic System.
Jan 31 02:31:29 np0005603623 systemd[133481]: Stopped target Paths.
Jan 31 02:31:29 np0005603623 systemd[133481]: Stopped target Sockets.
Jan 31 02:31:29 np0005603623 systemd[133481]: Stopped target Timers.
Jan 31 02:31:29 np0005603623 systemd[133481]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 31 02:31:29 np0005603623 systemd[133481]: Closed D-Bus User Message Bus Socket.
Jan 31 02:31:29 np0005603623 systemd[133481]: Stopped Create User's Volatile Files and Directories.
Jan 31 02:31:29 np0005603623 systemd[133481]: Removed slice User Application Slice.
Jan 31 02:31:29 np0005603623 systemd[133481]: Reached target Shutdown.
Jan 31 02:31:29 np0005603623 systemd[133481]: Finished Exit the Session.
Jan 31 02:31:29 np0005603623 systemd[133481]: Reached target Exit the Session.
Jan 31 02:31:29 np0005603623 systemd[1]: user@0.service: Deactivated successfully.
Jan 31 02:31:29 np0005603623 systemd[1]: Stopped User Manager for UID 0.
Jan 31 02:31:29 np0005603623 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 31 02:31:29 np0005603623 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 31 02:31:29 np0005603623 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 31 02:31:29 np0005603623 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 31 02:31:29 np0005603623 systemd[1]: Removed slice User Slice of UID 0.
Jan 31 02:31:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:29.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:30 np0005603623 systemd-logind[795]: New session 47 of user zuul.
Jan 31 02:31:30 np0005603623 systemd[1]: Started Session 47 of User zuul.
Jan 31 02:31:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:31.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:31 np0005603623 python3.9[134688]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:31:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:31.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:32 np0005603623 python3.9[134845]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:32 np0005603623 python3.9[134998]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:33.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:33 np0005603623 python3.9[135150]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:31:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:33.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:31:33 np0005603623 python3.9[135302]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:34 np0005603623 python3.9[135455]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:35.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:35 np0005603623 python3.9[135605]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:31:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:35.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:36 np0005603623 python3.9[135762]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 31 02:31:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:37.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:37.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:38 np0005603623 python3.9[135914]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:38 np0005603623 python3.9[136036]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844697.5512323-219-161806953371/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:39.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:39 np0005603623 python3.9[136186]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:31:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:39.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:31:40 np0005603623 python3.9[136307]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844699.2749176-264-252153820940058/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:41 np0005603623 python3.9[136460]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:31:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:41.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:31:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:41.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:31:42 np0005603623 python3.9[136544]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:31:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:43.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:43.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:45 np0005603623 python3.9[136701]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:31:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:31:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:45.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:31:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:45.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:45 np0005603623 python3.9[136904]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:46 np0005603623 python3.9[137026]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844705.5808399-375-132988035438016/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:46 np0005603623 python3.9[137176]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:47.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:47 np0005603623 python3.9[137297]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844706.5717607-375-10434607060284/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:47.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:48 np0005603623 python3.9[137448]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:49 np0005603623 python3.9[137569]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844708.144328-508-4149558485578/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:49 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:49Z|00025|memory|INFO|16128 kB peak resident set size after 29.8 seconds
Jan 31 02:31:49 np0005603623 ovn_controller[133449]: 2026-01-31T07:31:49Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 31 02:31:49 np0005603623 podman[137594]: 2026-01-31 07:31:49.07342284 +0000 UTC m=+0.077271897 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 31 02:31:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:49.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:49 np0005603623 python3.9[137859]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:49.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:49 np0005603623 python3.9[137994]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844709.1394103-508-196767086379776/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:50 np0005603623 python3.9[138145]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:31:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:31:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:51.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:31:51 np0005603623 python3.9[138299]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:31:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:31:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:51.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:52 np0005603623 python3.9[138451]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:52 np0005603623 python3.9[138530]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:53 np0005603623 python3.9[138682]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:53.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:53 np0005603623 python3.9[138760]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:53.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:55.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:55 np0005603623 python3.9[138913]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:31:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:55.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:31:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:56 np0005603623 python3.9[139065]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:56 np0005603623 python3.9[139144]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:57 np0005603623 python3.9[139296]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:57.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:57 np0005603623 python3.9[139424]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:57.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:58 np0005603623 python3.9[139577]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:31:58 np0005603623 systemd[1]: Reloading.
Jan 31 02:31:58 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:31:58 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:31:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:59.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:59 np0005603623 python3.9[139766]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:31:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:31:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:59.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:32:00 np0005603623 python3.9[139844]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:32:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:01.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:32:01 np0005603623 python3.9[139997]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:32:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:01.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:02 np0005603623 python3.9[140075]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:02 np0005603623 python3.9[140228]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:02 np0005603623 systemd[1]: Reloading.
Jan 31 02:32:03 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:32:03 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:32:03 np0005603623 systemd[1]: Starting Create netns directory...
Jan 31 02:32:03 np0005603623 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 31 02:32:03 np0005603623 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 31 02:32:03 np0005603623 systemd[1]: Finished Create netns directory.
Jan 31 02:32:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:32:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:03.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:32:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:03.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:04 np0005603623 python3.9[140421]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:32:04 np0005603623 python3.9[140574]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:32:05 np0005603623 python3.9[140697]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844724.3879254-960-178734110299166/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:32:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:05.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:32:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:05.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:32:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:06 np0005603623 python3.9[140900]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:06 np0005603623 python3.9[141052]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:32:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:07.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:07 np0005603623 python3.9[141204]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:32:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:32:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:07.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:32:08 np0005603623 python3.9[141327]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844727.1167672-1059-167537665719309/.source.json _original_basename=.z96ng6xo follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:08 np0005603623 python3.9[141478]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:09.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:09.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.156342) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844731156503, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1702, "num_deletes": 251, "total_data_size": 4140911, "memory_usage": 4187520, "flush_reason": "Manual Compaction"}
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844731203911, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 2702218, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10697, "largest_seqno": 12394, "table_properties": {"data_size": 2695154, "index_size": 4135, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 14061, "raw_average_key_size": 19, "raw_value_size": 2681102, "raw_average_value_size": 3698, "num_data_blocks": 187, "num_entries": 725, "num_filter_entries": 725, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844574, "oldest_key_time": 1769844574, "file_creation_time": 1769844731, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 47730 microseconds, and 10352 cpu microseconds.
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.204069) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 2702218 bytes OK
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.204125) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.210979) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.211001) EVENT_LOG_v1 {"time_micros": 1769844731210994, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.211022) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4133234, prev total WAL file size 4133234, number of live WAL files 2.
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.212177) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2638KB)], [21(7569KB)]
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844731212260, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 10453018, "oldest_snapshot_seqno": -1}
Jan 31 02:32:11 np0005603623 python3.9[141902]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 3981 keys, 8239046 bytes, temperature: kUnknown
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844731288646, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 8239046, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8209899, "index_size": 18093, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9989, "raw_key_size": 96460, "raw_average_key_size": 24, "raw_value_size": 8135487, "raw_average_value_size": 2043, "num_data_blocks": 782, "num_entries": 3981, "num_filter_entries": 3981, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769844731, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.288956) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 8239046 bytes
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.290608) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 136.7 rd, 107.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 7.4 +0.0 blob) out(7.9 +0.0 blob), read-write-amplify(6.9) write-amplify(3.0) OK, records in: 4498, records dropped: 517 output_compression: NoCompression
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.290630) EVENT_LOG_v1 {"time_micros": 1769844731290620, "job": 10, "event": "compaction_finished", "compaction_time_micros": 76486, "compaction_time_cpu_micros": 23449, "output_level": 6, "num_output_files": 1, "total_output_size": 8239046, "num_input_records": 4498, "num_output_records": 3981, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844731291005, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844731291688, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.212037) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.291743) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.291749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.291751) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.291753) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:32:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:32:11.291755) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:32:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:32:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:11.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:32:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:32:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:11.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:32:12 np0005603623 python3.9[142055]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 02:32:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:13.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:13 np0005603623 python3[142207]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 02:32:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:13.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:15.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:32:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:15.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:32:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:32:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:17.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:32:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:17.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:19.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:32:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:19.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:32:20 np0005603623 podman[142311]: 2026-01-31 07:32:20.229168808 +0000 UTC m=+0.324984161 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 02:32:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:21 np0005603623 podman[142220]: 2026-01-31 07:32:21.332240564 +0000 UTC m=+7.898927933 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:32:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:32:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:21.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:32:21 np0005603623 podman[142383]: 2026-01-31 07:32:21.428281748 +0000 UTC m=+0.028386891 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:32:21 np0005603623 podman[142383]: 2026-01-31 07:32:21.576630804 +0000 UTC m=+0.176735947 container create 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 02:32:21 np0005603623 python3[142207]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:32:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:21.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:23.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:32:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:23.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:32:24 np0005603623 python3.9[142575]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:32:25 np0005603623 python3.9[142729]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:25.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:32:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2210 writes, 12K keys, 2210 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s#012Cumulative WAL: 2210 writes, 2210 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2210 writes, 12K keys, 2210 commit groups, 1.0 writes per commit group, ingest: 23.45 MB, 0.04 MB/s#012Interval WAL: 2210 writes, 2210 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     97.1      0.14              0.04         5    0.029       0      0       0.0       0.0#012  L6      1/0    7.86 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.3    118.0     99.1      0.32              0.09         4    0.080     16K   1782       0.0       0.0#012 Sum      1/0    7.86 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.3     81.6     98.5      0.46              0.14         9    0.052     16K   1782       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.3     81.9     98.9      0.46              0.14         8    0.058     16K   1782       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0    118.0     99.1      0.32              0.09         4    0.080     16K   1782       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     98.3      0.14              0.04         4    0.035       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.014, interval 0.014#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.5 seconds#012Interval compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 1.27 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 4.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(61,1.11 MB,0.364524%) FilterBlock(9,53.86 KB,0.0173017%) IndexBlock(9,116.48 KB,0.0374192%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 02:32:25 np0005603623 python3.9[142855]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:32:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:25.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:26 np0005603623 python3.9[143007]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769844745.7470906-1293-193295644492477/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:26 np0005603623 python3.9[143083]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:32:26 np0005603623 systemd[1]: Reloading.
Jan 31 02:32:26 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:32:26 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:32:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:27.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:27 np0005603623 python3.9[143195]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:27 np0005603623 systemd[1]: Reloading.
Jan 31 02:32:27 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:32:27 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:32:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:27.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:28 np0005603623 systemd[1]: Starting ovn_metadata_agent container...
Jan 31 02:32:28 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:32:28 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81ad508df03a0c7a5502069b594c0a2334647d83825f6b6874fd5a69a227d9fc/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 31 02:32:28 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81ad508df03a0c7a5502069b594c0a2334647d83825f6b6874fd5a69a227d9fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:32:28 np0005603623 systemd[1]: Started /usr/bin/podman healthcheck run 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e.
Jan 31 02:32:28 np0005603623 podman[143236]: 2026-01-31 07:32:28.275973631 +0000 UTC m=+0.218183119 container init 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: + sudo -E kolla_set_configs
Jan 31 02:32:28 np0005603623 podman[143236]: 2026-01-31 07:32:28.301639396 +0000 UTC m=+0.243848904 container start 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:32:28 np0005603623 edpm-start-podman-container[143236]: ovn_metadata_agent
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Validating config file
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Copying service configuration files
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Writing out command to execute
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: ++ cat /run_command
Jan 31 02:32:28 np0005603623 podman[143260]: 2026-01-31 07:32:28.368097332 +0000 UTC m=+0.059945363 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: + CMD=neutron-ovn-metadata-agent
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: + ARGS=
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: + sudo kolla_copy_cacerts
Jan 31 02:32:28 np0005603623 edpm-start-podman-container[143235]: Creating additional drop-in dependency for "ovn_metadata_agent" (1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e)
Jan 31 02:32:28 np0005603623 systemd[1]: Reloading.
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: + [[ ! -n '' ]]
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: + . kolla_extend_start
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: Running command: 'neutron-ovn-metadata-agent'
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: + umask 0022
Jan 31 02:32:28 np0005603623 ovn_metadata_agent[143253]: + exec neutron-ovn-metadata-agent
Jan 31 02:32:28 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:32:28 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:32:28 np0005603623 systemd[1]: Started ovn_metadata_agent container.
Jan 31 02:32:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:32:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:29.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:32:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:29.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.029 143258 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.029 143258 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.029 143258 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.030 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.030 143258 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.030 143258 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.030 143258 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.030 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.030 143258 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.030 143258 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.031 143258 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.031 143258 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.031 143258 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.031 143258 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.031 143258 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.031 143258 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.031 143258 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.031 143258 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.031 143258 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.031 143258 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.032 143258 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.032 143258 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.032 143258 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.032 143258 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.032 143258 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.032 143258 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.032 143258 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.032 143258 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.032 143258 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.032 143258 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.033 143258 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.033 143258 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.033 143258 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.033 143258 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.033 143258 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.033 143258 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.033 143258 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.033 143258 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.033 143258 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.034 143258 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.034 143258 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.034 143258 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.034 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.034 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.034 143258 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.034 143258 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.034 143258 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.034 143258 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.034 143258 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.034 143258 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.035 143258 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.035 143258 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.035 143258 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.035 143258 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.035 143258 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.035 143258 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.035 143258 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.035 143258 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.035 143258 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.035 143258 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.036 143258 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.036 143258 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.036 143258 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.036 143258 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.036 143258 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.036 143258 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.036 143258 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.036 143258 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.036 143258 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.037 143258 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.037 143258 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.037 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.037 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.037 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.037 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.037 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.037 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.037 143258 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.037 143258 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.038 143258 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.038 143258 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.038 143258 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.038 143258 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.038 143258 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.038 143258 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.038 143258 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.038 143258 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.038 143258 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.038 143258 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.039 143258 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.039 143258 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.039 143258 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.039 143258 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.039 143258 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.039 143258 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.039 143258 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.039 143258 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.039 143258 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.039 143258 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.039 143258 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.040 143258 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.040 143258 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.040 143258 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.040 143258 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.040 143258 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.040 143258 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.040 143258 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.040 143258 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.040 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.040 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.041 143258 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.041 143258 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.041 143258 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.041 143258 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.041 143258 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.041 143258 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.041 143258 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.041 143258 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.041 143258 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.042 143258 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.042 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.042 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.042 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.042 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.042 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.042 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.042 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.042 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.043 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.043 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.043 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.043 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.043 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.043 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.043 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.043 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.044 143258 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.044 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.044 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.044 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.044 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.044 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.044 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.044 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.044 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.045 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.045 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.045 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.045 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.045 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.045 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.045 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.045 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.045 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.045 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.046 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.046 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.046 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.046 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.046 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.046 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.046 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.046 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.046 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.047 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.047 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.047 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.047 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.047 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.047 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.047 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.047 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.047 143258 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.048 143258 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.048 143258 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.048 143258 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.048 143258 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.048 143258 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.048 143258 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.048 143258 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.048 143258 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.048 143258 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.048 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.049 143258 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.049 143258 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.049 143258 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.049 143258 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.049 143258 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.049 143258 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.049 143258 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.049 143258 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.049 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.049 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.050 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.050 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.050 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.050 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.050 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.050 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.050 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.050 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.050 143258 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.051 143258 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.051 143258 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.051 143258 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.051 143258 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.051 143258 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.051 143258 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.051 143258 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.051 143258 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.051 143258 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.051 143258 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.052 143258 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.052 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.052 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.052 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.052 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.052 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.052 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.053 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.053 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.053 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.053 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.053 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.053 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.053 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.053 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.053 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.053 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.054 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.054 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.054 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.054 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.054 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.054 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.054 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.054 143258 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.054 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.054 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.055 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.055 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.055 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.055 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.055 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.055 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.055 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.055 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.056 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.056 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.056 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.056 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.056 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.056 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.056 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.057 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.057 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.057 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.057 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.057 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.057 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.057 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.058 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.058 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.058 143258 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.058 143258 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.058 143258 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.058 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.058 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.059 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.059 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.059 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.059 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.059 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.059 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.059 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.060 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.060 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.060 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.060 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.060 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.060 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.060 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.061 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.061 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.061 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.061 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.061 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.061 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.061 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.061 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.062 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.062 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.062 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.062 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.062 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.062 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.062 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.063 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.063 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.063 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.063 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.063 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.063 143258 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.063 143258 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.073 143258 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.073 143258 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.073 143258 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.074 143258 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.074 143258 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.088 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 7ec8bf38-9571-4400-a85c-6bd5ac54bdf3 (UUID: 7ec8bf38-9571-4400-a85c-6bd5ac54bdf3) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.113 143258 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.113 143258 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.113 143258 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.113 143258 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.116 143258 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.121 143258 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.126 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '7ec8bf38-9571-4400-a85c-6bd5ac54bdf3'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], external_ids={}, name=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, nb_cfg_timestamp=1769844687335, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.127 143258 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f2985ce7be0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.128 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.128 143258 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.128 143258 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.128 143258 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.131 143258 DEBUG oslo_service.service [-] Started child 143493 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.134 143258 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpeopn5vel/privsep.sock']#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.135 143493 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-243350'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.163 143493 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.164 143493 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.164 143493 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.168 143493 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.177 143493 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.184 143493 INFO eventlet.wsgi.server [-] (143493) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Jan 31 02:32:30 np0005603623 python3.9[143492]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 31 02:32:30 np0005603623 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.703 143258 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.704 143258 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpeopn5vel/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.598 143522 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.601 143522 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.604 143522 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.605 143522 INFO oslo.privsep.daemon [-] privsep daemon running as pid 143522#033[00m
Jan 31 02:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:30.706 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[7dd3af7b-c806-412f-8ab0-452c1e04093e]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:32:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.164 143522 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.164 143522 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.164 143522 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:32:31 np0005603623 python3.9[143654]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:32:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:31.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.665 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[b58f9be0-787e-4fcf-9072-88eb8bc75131]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.667 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, column=external_ids, values=({'neutron:ovn-metadata-id': 'e7d269c5-ff06-5b3d-87b9-18fad4de05bc'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.677 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.683 143258 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.683 143258 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.683 143258 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.683 143258 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.683 143258 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.683 143258 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.684 143258 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.684 143258 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.684 143258 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.684 143258 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.684 143258 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.684 143258 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.684 143258 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.684 143258 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.685 143258 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.685 143258 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.685 143258 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.685 143258 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.685 143258 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.685 143258 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.685 143258 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.685 143258 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.685 143258 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.686 143258 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.686 143258 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.686 143258 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.686 143258 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.686 143258 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.686 143258 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.686 143258 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.686 143258 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.687 143258 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.687 143258 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.687 143258 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.687 143258 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.687 143258 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.687 143258 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.687 143258 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.687 143258 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.687 143258 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.688 143258 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.688 143258 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.688 143258 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.688 143258 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.688 143258 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.688 143258 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.688 143258 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.688 143258 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.688 143258 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.689 143258 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.689 143258 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.689 143258 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.689 143258 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.689 143258 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.689 143258 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.689 143258 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.689 143258 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.689 143258 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.689 143258 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.689 143258 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.690 143258 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.690 143258 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.690 143258 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.690 143258 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.690 143258 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.690 143258 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.690 143258 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.690 143258 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.690 143258 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.691 143258 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.691 143258 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.691 143258 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.691 143258 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.691 143258 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.691 143258 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.691 143258 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.691 143258 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.692 143258 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.692 143258 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.692 143258 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.692 143258 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.692 143258 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.692 143258 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.692 143258 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.692 143258 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.693 143258 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.693 143258 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.693 143258 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.693 143258 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.693 143258 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.693 143258 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.693 143258 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.693 143258 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.694 143258 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.694 143258 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.694 143258 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.694 143258 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.694 143258 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.694 143258 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.694 143258 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.695 143258 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.695 143258 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.695 143258 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.695 143258 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.695 143258 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.695 143258 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.695 143258 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.695 143258 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.696 143258 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.696 143258 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.696 143258 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.696 143258 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.696 143258 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.696 143258 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.697 143258 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.697 143258 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.697 143258 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.697 143258 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.697 143258 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.697 143258 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.697 143258 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.697 143258 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.698 143258 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.698 143258 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.698 143258 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.698 143258 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.698 143258 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.698 143258 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.698 143258 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.698 143258 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.698 143258 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.699 143258 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.699 143258 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.699 143258 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.699 143258 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.699 143258 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.699 143258 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.699 143258 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.699 143258 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.699 143258 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.699 143258 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.700 143258 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.700 143258 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.700 143258 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.700 143258 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.700 143258 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.700 143258 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.700 143258 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.700 143258 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.700 143258 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.700 143258 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.701 143258 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.701 143258 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.701 143258 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.701 143258 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.701 143258 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.701 143258 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.701 143258 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.701 143258 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.701 143258 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.701 143258 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.701 143258 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.701 143258 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.702 143258 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.702 143258 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.702 143258 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.702 143258 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.702 143258 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.702 143258 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.702 143258 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.702 143258 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.702 143258 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.702 143258 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.703 143258 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.703 143258 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.703 143258 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.703 143258 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.703 143258 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.703 143258 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.703 143258 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.703 143258 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.703 143258 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.703 143258 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.704 143258 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.704 143258 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.704 143258 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.704 143258 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.704 143258 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.704 143258 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.704 143258 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.704 143258 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.704 143258 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.704 143258 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.705 143258 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.705 143258 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.705 143258 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.705 143258 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.705 143258 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.705 143258 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.705 143258 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.705 143258 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.705 143258 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.705 143258 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.706 143258 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.706 143258 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.706 143258 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.706 143258 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.706 143258 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.706 143258 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.706 143258 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.706 143258 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.706 143258 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.706 143258 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.707 143258 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.707 143258 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.707 143258 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.707 143258 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.707 143258 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.707 143258 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.707 143258 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.707 143258 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.707 143258 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.708 143258 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.708 143258 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.708 143258 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.708 143258 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.708 143258 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.708 143258 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.708 143258 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.708 143258 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.708 143258 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.708 143258 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.708 143258 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.709 143258 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.709 143258 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.709 143258 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.709 143258 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.709 143258 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.709 143258 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.709 143258 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.709 143258 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.709 143258 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.709 143258 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.710 143258 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.710 143258 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.710 143258 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.710 143258 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.710 143258 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.710 143258 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.710 143258 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.710 143258 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.710 143258 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.710 143258 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.711 143258 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.711 143258 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.711 143258 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.711 143258 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.711 143258 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.711 143258 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.711 143258 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.711 143258 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.711 143258 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.711 143258 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.712 143258 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.712 143258 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.712 143258 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.712 143258 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.712 143258 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.712 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.712 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.712 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.712 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.712 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.713 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.713 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.713 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.713 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.713 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.713 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.713 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.713 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.713 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.713 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.714 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.714 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.714 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.714 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.714 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.714 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.714 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.714 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.714 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.714 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.715 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.715 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.715 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.715 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.715 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.715 143258 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.715 143258 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.715 143258 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.715 143258 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.716 143258 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:32:31.716 143258 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 31 02:32:31 np0005603623 python3.9[143779]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844750.9636707-1429-158556351885097/.source.yaml _original_basename=.9oggihh3 follow=False checksum=87ad539680adb8db4e1be011e7c446590196a675 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:31.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:32 np0005603623 systemd[1]: session-47.scope: Deactivated successfully.
Jan 31 02:32:32 np0005603623 systemd[1]: session-47.scope: Consumed 46.350s CPU time.
Jan 31 02:32:32 np0005603623 systemd-logind[795]: Session 47 logged out. Waiting for processes to exit.
Jan 31 02:32:32 np0005603623 systemd-logind[795]: Removed session 47.
Jan 31 02:32:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:33.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:33.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:35.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:35.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:37.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:37 np0005603623 systemd-logind[795]: New session 48 of user zuul.
Jan 31 02:32:37 np0005603623 systemd[1]: Started Session 48 of User zuul.
Jan 31 02:32:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:37.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:38 np0005603623 python3.9[143961]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:32:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:39.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 02:32:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:39.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:40 np0005603623 python3.9[144117]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:32:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:41 np0005603623 python3.9[144281]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:32:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:41.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:41 np0005603623 systemd[1]: Reloading.
Jan 31 02:32:41 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:32:41 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:32:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:32:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:41.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:32:42 np0005603623 python3.9[144467]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:32:42 np0005603623 network[144484]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:32:42 np0005603623 network[144485]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:32:42 np0005603623 network[144486]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:32:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:43.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:43.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:45.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:45.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:47 np0005603623 python3.9[144800]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:47.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:47 np0005603623 python3.9[144953]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:47.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:48 np0005603623 python3.9[145107]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:49 np0005603623 python3.9[145260]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:32:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:49.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:32:49 np0005603623 python3.9[145413]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:49.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:50 np0005603623 python3.9[145567]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:50 np0005603623 podman[145692]: 2026-01-31 07:32:50.991558766 +0000 UTC m=+0.073618092 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 02:32:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:32:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 5423 writes, 24K keys, 5423 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5423 writes, 789 syncs, 6.87 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5423 writes, 24K keys, 5423 commit groups, 1.0 writes per commit group, ingest: 18.70 MB, 0.03 MB/s#012Interval WAL: 5423 writes, 789 syncs, 6.87 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e60fd28f30#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e60fd28f30#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Jan 31 02:32:51 np0005603623 python3.9[145734]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:51.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:51.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:52 np0005603623 python3.9[145898]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:52 np0005603623 python3.9[146051]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:53 np0005603623 python3.9[146203]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:53.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:53.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:54 np0005603623 python3.9[146355]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:55 np0005603623 python3.9[146509]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:32:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:55.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:32:55 np0005603623 python3.9[146661]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:32:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:55.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:32:56 np0005603623 python3.9[146813]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:56 np0005603623 python3.9[146966]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:57 np0005603623 python3.9[147118]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:57.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:57 np0005603623 python3.9[147370]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:32:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:57.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:32:58 np0005603623 python3.9[147643]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:58 np0005603623 podman[147798]: 2026-01-31 07:32:58.817394137 +0000 UTC m=+0.066376084 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:32:58 np0005603623 python3.9[147837]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:32:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:32:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 02:32:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 02:32:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 31 02:32:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 02:32:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:32:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:32:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:32:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:59.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:32:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:59.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:00 np0005603623 python3.9[147997]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:33:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:01.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:01 np0005603623 python3.9[148150]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:33:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:01.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:02 np0005603623 python3.9[148302]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:02 np0005603623 python3.9[148455]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 02:33:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:03.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:03 np0005603623 python3.9[148607]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:33:03 np0005603623 systemd[1]: Reloading.
Jan 31 02:33:03 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:33:03 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:33:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:03.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:33:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:33:04 np0005603623 python3.9[148845]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:05 np0005603623 python3.9[148998]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:05.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:05 np0005603623 python3.9[149199]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:06.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:06 np0005603623 python3.9[149355]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:07.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:08.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:08 np0005603623 python3.9[149508]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:08 np0005603623 python3.9[149662]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:09 np0005603623 python3.9[149815]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:09.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:10.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:10 np0005603623 python3.9[149969]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 31 02:33:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:11 np0005603623 python3.9[150122]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 02:33:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:11.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:12.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:12 np0005603623 python3.9[150281]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 02:33:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:13.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:13 np0005603623 python3.9[150441]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:33:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:14.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:14 np0005603623 python3.9[150526]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:33:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:15.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:16.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:17.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:33:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:18.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:33:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:19.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:20.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:21.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:22 np0005603623 podman[150552]: 2026-01-31 07:33:21.99977309 +0000 UTC m=+0.089223742 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:33:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:22.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:23.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:24.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:25.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:26.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:27 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 31 02:33:27 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 31 02:33:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:27.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:27 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 31 02:33:28 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 31 02:33:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:28.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:28 np0005603623 podman[150793]: 2026-01-31 07:33:28.951093474 +0000 UTC m=+0.050008751 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 31 02:33:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:29.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:30.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:33:30.066 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:33:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:33:30.068 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:33:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:33:30.069 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:33:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:31.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:32.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:33.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:34.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:35.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:36.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:37.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:38.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:39.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:39 np0005603623 kernel: SELinux:  Converting 2780 SID table entries...
Jan 31 02:33:39 np0005603623 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 02:33:39 np0005603623 kernel: SELinux:  policy capability open_perms=1
Jan 31 02:33:39 np0005603623 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 02:33:39 np0005603623 kernel: SELinux:  policy capability always_check_network=0
Jan 31 02:33:39 np0005603623 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 02:33:39 np0005603623 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 02:33:39 np0005603623 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 02:33:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:40.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:41.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:42.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:43.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:44.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:45.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:45 np0005603623 dbus-broker-launch[782]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 31 02:33:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:46.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:47.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:48.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:48 np0005603623 kernel: SELinux:  Converting 2780 SID table entries...
Jan 31 02:33:48 np0005603623 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 02:33:48 np0005603623 kernel: SELinux:  policy capability open_perms=1
Jan 31 02:33:48 np0005603623 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 02:33:48 np0005603623 kernel: SELinux:  policy capability always_check_network=0
Jan 31 02:33:48 np0005603623 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 02:33:48 np0005603623 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 02:33:48 np0005603623 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 02:33:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:49.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:50.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:51.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:52.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:52 np0005603623 dbus-broker-launch[782]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 31 02:33:53 np0005603623 podman[150895]: 2026-01-31 07:33:53.048142638 +0000 UTC m=+0.130117608 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 31 02:33:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:53.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:54.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:33:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:55.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:33:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:56.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:57.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:58.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:33:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:59.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:59 np0005603623 podman[150924]: 2026-01-31 07:33:59.947058793 +0000 UTC m=+0.045175627 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Jan 31 02:34:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:00.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:01.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:02.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:03.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:04.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:34:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:34:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:34:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:05.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:06.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:07.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:08.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:09.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:10.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:34:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:34:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:11.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:12.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:13.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:14.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:15.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:16.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:17.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:18.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:19.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:20.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:21.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:22.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:23.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:24 np0005603623 podman[168055]: 2026-01-31 07:34:24.067136545 +0000 UTC m=+0.125962388 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 02:34:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:24.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:25.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:26.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:27.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:28.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:29.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:34:30.067 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:34:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:34:30.068 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:34:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:34:30.068 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:34:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:30.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:30 np0005603623 kernel: SELinux:  Converting 2781 SID table entries...
Jan 31 02:34:30 np0005603623 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 02:34:30 np0005603623 kernel: SELinux:  policy capability open_perms=1
Jan 31 02:34:30 np0005603623 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 02:34:30 np0005603623 kernel: SELinux:  policy capability always_check_network=0
Jan 31 02:34:30 np0005603623 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 02:34:30 np0005603623 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 02:34:30 np0005603623 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 02:34:30 np0005603623 dbus-broker-launch[782]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 31 02:34:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:30 np0005603623 podman[168141]: 2026-01-31 07:34:30.992384958 +0000 UTC m=+0.076673437 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:34:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:31.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:31 np0005603623 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Jan 31 02:34:31 np0005603623 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Jan 31 02:34:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:32.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:33.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:34.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:35.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:36.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:37.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:38.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:39 np0005603623 systemd[1]: Stopping OpenSSH server daemon...
Jan 31 02:34:39 np0005603623 systemd[1]: sshd.service: Deactivated successfully.
Jan 31 02:34:39 np0005603623 systemd[1]: Stopped OpenSSH server daemon.
Jan 31 02:34:39 np0005603623 systemd[1]: sshd.service: Consumed 1.967s CPU time, read 32.0K from disk, written 0B to disk.
Jan 31 02:34:39 np0005603623 systemd[1]: Stopped target sshd-keygen.target.
Jan 31 02:34:39 np0005603623 systemd[1]: Stopping sshd-keygen.target...
Jan 31 02:34:39 np0005603623 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 02:34:39 np0005603623 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 02:34:39 np0005603623 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 02:34:39 np0005603623 systemd[1]: Reached target sshd-keygen.target.
Jan 31 02:34:39 np0005603623 systemd[1]: Starting OpenSSH server daemon...
Jan 31 02:34:39 np0005603623 systemd[1]: Started OpenSSH server daemon.
Jan 31 02:34:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:39.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:40.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:40 np0005603623 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:34:40 np0005603623 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:34:40 np0005603623 systemd[1]: Reloading.
Jan 31 02:34:40 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:34:40 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:34:40 np0005603623 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:34:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:41.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:42.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:43.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:44.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:45.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:45 np0005603623 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:34:45 np0005603623 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:34:45 np0005603623 systemd[1]: man-db-cache-update.service: Consumed 6.518s CPU time.
Jan 31 02:34:45 np0005603623 systemd[1]: run-r26ed237d39094f1fb27dc7d7e8d298da.service: Deactivated successfully.
Jan 31 02:34:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:46.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:47.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:48.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:49.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:50.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 31 02:34:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:51.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 31 02:34:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:52.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:53.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:54.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:55 np0005603623 podman[177750]: 2026-01-31 07:34:55.118378652 +0000 UTC m=+0.211655736 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 02:34:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:55.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:56.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:57.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:58.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:34:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:59.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:00.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:01 np0005603623 podman[177878]: 2026-01-31 07:35:01.215050755 +0000 UTC m=+0.085485414 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:35:01 np0005603623 python3.9[177916]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:35:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:01.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:01 np0005603623 systemd[1]: Reloading.
Jan 31 02:35:01 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:01 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:02.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:02 np0005603623 python3.9[178116]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:35:02 np0005603623 systemd[1]: Reloading.
Jan 31 02:35:02 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:02 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:03 np0005603623 python3.9[178306]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:35:03 np0005603623 systemd[1]: Reloading.
Jan 31 02:35:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:03.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:03 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:03 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:04.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:04 np0005603623 python3.9[178496]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:35:04 np0005603623 systemd[1]: Reloading.
Jan 31 02:35:04 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:04 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:05 np0005603623 python3.9[178687]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:05 np0005603623 systemd[1]: Reloading.
Jan 31 02:35:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:05.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:05 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:05 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:06.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:06 np0005603623 python3.9[178876]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:06 np0005603623 systemd[1]: Reloading.
Jan 31 02:35:06 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:06 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:07 np0005603623 python3.9[179118]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:07 np0005603623 systemd[1]: Reloading.
Jan 31 02:35:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:07.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:07 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:07 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:08.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:08 np0005603623 python3.9[179309]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:09 np0005603623 python3.9[179464]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:09 np0005603623 systemd[1]: Reloading.
Jan 31 02:35:09 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:09 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:09.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:10.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:10 np0005603623 python3.9[179760]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:35:10 np0005603623 systemd[1]: Reloading.
Jan 31 02:35:10 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:10 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:10 np0005603623 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 31 02:35:10 np0005603623 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 31 02:35:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:11.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:11 np0005603623 python3.9[179978]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:35:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:35:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:12.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:12 np0005603623 python3.9[180134]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:12 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:35:12 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:35:12 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:35:13 np0005603623 python3.9[180289]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:13.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:13 np0005603623 python3.9[180444]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:14.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:14 np0005603623 python3.9[180600]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:15 np0005603623 python3.9[180755]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:15.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:16 np0005603623 python3.9[180910]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:16.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:16 np0005603623 python3.9[181066]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:17 np0005603623 python3.9[181221]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:17.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:18 np0005603623 python3.9[181376]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:18.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:18 np0005603623 python3.9[181580]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:35:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:19.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:19 np0005603623 python3.9[181737]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:19.916794) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844919916891, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1953, "num_deletes": 251, "total_data_size": 4915505, "memory_usage": 4971984, "flush_reason": "Manual Compaction"}
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844919989512, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 3225713, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12399, "largest_seqno": 14347, "table_properties": {"data_size": 3217646, "index_size": 4946, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 15085, "raw_average_key_size": 18, "raw_value_size": 3201662, "raw_average_value_size": 3962, "num_data_blocks": 221, "num_entries": 808, "num_filter_entries": 808, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844732, "oldest_key_time": 1769844732, "file_creation_time": 1769844919, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 72808 microseconds, and 8494 cpu microseconds.
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:19.989598) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 3225713 bytes OK
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:19.989620) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:19.996128) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:19.996145) EVENT_LOG_v1 {"time_micros": 1769844919996140, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:19.996162) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 4906839, prev total WAL file size 4906839, number of live WAL files 2.
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:19.996930) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323531' seq:0, type:0; will stop at (end)
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(3150KB)], [24(8045KB)]
Jan 31 02:35:19 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844919996966, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 11464759, "oldest_snapshot_seqno": -1}
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4269 keys, 10960943 bytes, temperature: kUnknown
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844920115388, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 10960943, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10927370, "index_size": 21815, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10693, "raw_key_size": 103961, "raw_average_key_size": 24, "raw_value_size": 10845355, "raw_average_value_size": 2540, "num_data_blocks": 929, "num_entries": 4269, "num_filter_entries": 4269, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769844919, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:20.115690) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 10960943 bytes
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:20.136545) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 96.7 rd, 92.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 7.9 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(7.0) write-amplify(3.4) OK, records in: 4789, records dropped: 520 output_compression: NoCompression
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:20.136599) EVENT_LOG_v1 {"time_micros": 1769844920136574, "job": 12, "event": "compaction_finished", "compaction_time_micros": 118537, "compaction_time_cpu_micros": 17421, "output_level": 6, "num_output_files": 1, "total_output_size": 10960943, "num_input_records": 4789, "num_output_records": 4269, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844920137237, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844920138663, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:19.996871) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:20.138783) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:20.138788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:20.138789) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:20.138791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:35:20.138793) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:35:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:20.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:20 np0005603623 python3.9[181893]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:21 np0005603623 python3.9[182048]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:21.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:22.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:22 np0005603623 python3.9[182204]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:35:23 np0005603623 python3.9[182356]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:35:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:23.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:24 np0005603623 python3.9[182508]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:35:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:24.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:24 np0005603623 python3.9[182661]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:35:25 np0005603623 python3.9[182813]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:35:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:25.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:25 np0005603623 podman[182937]: 2026-01-31 07:35:25.787960616 +0000 UTC m=+0.098824857 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 02:35:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:25 np0005603623 python3.9[182982]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:35:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:26.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:26 np0005603623 python3.9[183142]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:35:27 np0005603623 python3.9[183344]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:27.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:28.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:28 np0005603623 python3.9[183470]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844927.0482152-1649-47392428794459/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:28 np0005603623 python3.9[183622]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:29 np0005603623 python3.9[183747]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844928.4029534-1649-167348218766330/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:29.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:29 np0005603623 python3.9[183899]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:35:30.068 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:35:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:35:30.068 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:35:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:35:30.068 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:35:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:30.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:30 np0005603623 python3.9[184025]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844929.5110502-1649-147217531664473/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:30 np0005603623 python3.9[184177]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:31 np0005603623 python3.9[184302]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844930.5349636-1649-181101723597966/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:31 np0005603623 podman[184303]: 2026-01-31 07:35:31.500495471 +0000 UTC m=+0.034829922 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 31 02:35:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:31.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:31 np0005603623 python3.9[184473]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:32.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:32 np0005603623 python3.9[184599]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844931.5975685-1649-183096706457424/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:32 np0005603623 python3.9[184751]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:33 np0005603623 python3.9[184876]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844932.600609-1649-153248586259671/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:33.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:33 np0005603623 python3.9[185028]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:34.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:34 np0005603623 python3.9[185152]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844933.5983546-1649-12279854184225/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:34 np0005603623 python3.9[185304]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:35.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:35 np0005603623 python3.9[185429]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844934.5781043-1649-271751099265987/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:36.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:37 np0005603623 python3.9[185582]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 31 02:35:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:37.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:37 np0005603623 python3.9[185735]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:38.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:38 np0005603623 python3.9[185888]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:38 np0005603623 python3.9[186040]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:39.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:39 np0005603623 python3.9[186192]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:40 np0005603623 python3.9[186344]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:40.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:40 np0005603623 python3.9[186497]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:41 np0005603623 python3.9[186649]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:41.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:41 np0005603623 python3.9[186801]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:42.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:42 np0005603623 python3.9[186954]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:42 np0005603623 python3.9[187106]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:43.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:43 np0005603623 python3.9[187258]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:44 np0005603623 python3.9[187410]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:44.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:44 np0005603623 python3.9[187563]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:45 np0005603623 python3.9[187715]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:45.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:46.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:46 np0005603623 python3.9[187868]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:46 np0005603623 python3.9[188039]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844945.887946-2311-80613227603262/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:47 np0005603623 python3.9[188193]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:47.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:47 np0005603623 python3.9[188316]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844946.9778366-2311-96463719579486/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:48.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:48 np0005603623 python3.9[188469]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:48 np0005603623 python3.9[188592]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844948.0616558-2311-158028503012180/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:49 np0005603623 python3.9[188744]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:49.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:49 np0005603623 python3.9[188867]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844949.096504-2311-202075208256394/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:50.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:50 np0005603623 python3.9[189020]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:50 np0005603623 python3.9[189143]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844950.0537295-2311-72449123215521/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:51 np0005603623 python3.9[189295]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:51.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:51 np0005603623 python3.9[189418]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844951.0521789-2311-156615807598169/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:52.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:52 np0005603623 python3.9[189571]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:53 np0005603623 python3.9[189694]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844952.0520415-2311-17325659274641/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:53.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:53 np0005603623 python3.9[189846]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:54 np0005603623 python3.9[189969]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844953.1508608-2311-105363620575385/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:54.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:54 np0005603623 python3.9[190122]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:55 np0005603623 python3.9[190245]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844954.2588048-2311-141123237178984/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:55.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:55 np0005603623 python3.9[190397]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:56 np0005603623 podman[190467]: 2026-01-31 07:35:56.046438423 +0000 UTC m=+0.135634632 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller)
Jan 31 02:35:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:56 np0005603623 python3.9[190546]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844955.279511-2311-103495649249557/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:56.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:56 np0005603623 python3.9[190699]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:57 np0005603623 python3.9[190822]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844956.3846316-2311-90590037855099/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:57.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:57 np0005603623 python3.9[190974]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:58.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:58 np0005603623 python3.9[191097]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844957.3781133-2311-177851389828942/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:58 np0005603623 python3.9[191250]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:59 np0005603623 python3.9[191373]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844958.4100976-2311-40586755567261/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:35:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:59.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:59 np0005603623 python3.9[191525]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:00.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:00 np0005603623 python3.9[191649]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844959.5532255-2311-132653966042669/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:01.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:01 np0005603623 podman[191674]: 2026-01-31 07:36:01.968087386 +0000 UTC m=+0.057997401 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 31 02:36:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:02.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:03 np0005603623 python3.9[191819]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:36:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:03.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:36:03 np0005603623 python3.9[191974]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 31 02:36:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:04.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:05.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:06.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:06 np0005603623 dbus-broker-launch[782]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 31 02:36:06 np0005603623 python3.9[192132]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:07 np0005603623 python3.9[192334]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:07.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:07 np0005603623 python3.9[192486]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:08.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:08 np0005603623 python3.9[192639]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:08 np0005603623 python3.9[192791]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:09.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:09 np0005603623 python3.9[192943]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:10.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:10 np0005603623 python3.9[193096]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:11 np0005603623 python3.9[193248]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:11.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:11 np0005603623 python3.9[193400]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:12.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:12 np0005603623 python3.9[193553]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:13 np0005603623 python3.9[193705]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:36:13 np0005603623 systemd[1]: Reloading.
Jan 31 02:36:13 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:13 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:13.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:13 np0005603623 systemd[1]: Starting libvirt logging daemon socket...
Jan 31 02:36:13 np0005603623 systemd[1]: Listening on libvirt logging daemon socket.
Jan 31 02:36:13 np0005603623 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 31 02:36:13 np0005603623 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 31 02:36:13 np0005603623 systemd[1]: Starting libvirt logging daemon...
Jan 31 02:36:13 np0005603623 systemd[1]: Started libvirt logging daemon.
Jan 31 02:36:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:14.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:14 np0005603623 python3.9[193899]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:36:14 np0005603623 systemd[1]: Reloading.
Jan 31 02:36:14 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:14 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:14 np0005603623 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 31 02:36:14 np0005603623 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 31 02:36:14 np0005603623 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 31 02:36:14 np0005603623 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 31 02:36:14 np0005603623 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 31 02:36:14 np0005603623 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 31 02:36:14 np0005603623 systemd[1]: Starting libvirt nodedev daemon...
Jan 31 02:36:14 np0005603623 systemd[1]: Started libvirt nodedev daemon.
Jan 31 02:36:15 np0005603623 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 31 02:36:15 np0005603623 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 31 02:36:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:15.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:15 np0005603623 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 31 02:36:15 np0005603623 python3.9[194116]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:36:15 np0005603623 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 31 02:36:15 np0005603623 systemd[1]: Reloading.
Jan 31 02:36:15 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:15 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:15 np0005603623 auditd[703]: Audit daemon rotating log files
Jan 31 02:36:15 np0005603623 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 31 02:36:15 np0005603623 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 31 02:36:15 np0005603623 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 31 02:36:15 np0005603623 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 31 02:36:16 np0005603623 systemd[1]: Starting libvirt proxy daemon...
Jan 31 02:36:16 np0005603623 systemd[1]: Started libvirt proxy daemon.
Jan 31 02:36:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:16.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:16 np0005603623 setroubleshoot[193962]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 56cd6035-daf0-4f26-a7e0-2b34ee113a6c
Jan 31 02:36:16 np0005603623 setroubleshoot[193962]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 31 02:36:16 np0005603623 setroubleshoot[193962]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 56cd6035-daf0-4f26-a7e0-2b34ee113a6c
Jan 31 02:36:16 np0005603623 setroubleshoot[193962]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 31 02:36:16 np0005603623 python3.9[194337]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:36:16 np0005603623 systemd[1]: Reloading.
Jan 31 02:36:16 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:16 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:17 np0005603623 systemd[1]: Listening on libvirt locking daemon socket.
Jan 31 02:36:17 np0005603623 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 31 02:36:17 np0005603623 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 31 02:36:17 np0005603623 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 31 02:36:17 np0005603623 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 31 02:36:17 np0005603623 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 31 02:36:17 np0005603623 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 31 02:36:17 np0005603623 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 31 02:36:17 np0005603623 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 31 02:36:17 np0005603623 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 31 02:36:17 np0005603623 systemd[1]: Starting libvirt QEMU daemon...
Jan 31 02:36:17 np0005603623 systemd[1]: Started libvirt QEMU daemon.
Jan 31 02:36:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:17.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:17 np0005603623 python3.9[194554]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:36:17 np0005603623 systemd[1]: Reloading.
Jan 31 02:36:17 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:17 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:18 np0005603623 systemd[1]: Starting libvirt secret daemon socket...
Jan 31 02:36:18 np0005603623 systemd[1]: Listening on libvirt secret daemon socket.
Jan 31 02:36:18 np0005603623 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 31 02:36:18 np0005603623 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 31 02:36:18 np0005603623 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 31 02:36:18 np0005603623 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 31 02:36:18 np0005603623 systemd[1]: Starting libvirt secret daemon...
Jan 31 02:36:18 np0005603623 systemd[1]: Started libvirt secret daemon.
Jan 31 02:36:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:18.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:19 np0005603623 python3.9[194867]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:19.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:19 np0005603623 python3.9[195050]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 02:36:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:20.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:20 np0005603623 python3.9[195203]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:21 np0005603623 python3.9[195357]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 02:36:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:21.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:36:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:36:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:36:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:36:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:36:22 np0005603623 python3.9[195507]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:22.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:22 np0005603623 python3.9[195629]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844981.7093022-3386-216112346771140/.source.xml follow=False _original_basename=secret.xml.j2 checksum=450e5279e3f961806683176060af91f2a100b4e1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:23 np0005603623 python3.9[195781]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:23.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:24.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:24 np0005603623 python3.9[195944]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:25.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:26.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:26 np0005603623 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 31 02:36:26 np0005603623 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 31 02:36:26 np0005603623 podman[196380]: 2026-01-31 07:36:26.743407918 +0000 UTC m=+0.124363777 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 02:36:26 np0005603623 python3.9[196414]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:27 np0005603623 python3.9[196637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:27.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:28 np0005603623 python3.9[196760]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844987.142102-3551-107318114239063/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:28.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:29 np0005603623 python3.9[196913]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:29.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:29 np0005603623 python3.9[197065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:36:30.068 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:36:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:36:30.069 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:36:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:36:30.069 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:36:30 np0005603623 python3.9[197143]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:30.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:30 np0005603623 python3.9[197296]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:31 np0005603623 python3.9[197374]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.iqpt0kry recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:31.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:31 np0005603623 python3.9[197576]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:36:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:36:32 np0005603623 podman[197626]: 2026-01-31 07:36:32.106320695 +0000 UTC m=+0.066569192 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 31 02:36:32 np0005603623 python3.9[197673]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:32.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:32 np0005603623 python3.9[197826]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:33 np0005603623 python3[197979]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 31 02:36:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:33.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:34 np0005603623 python3.9[198132]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:34.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:34 np0005603623 python3.9[198210]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:35 np0005603623 python3.9[198362]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:35.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:35 np0005603623 python3.9[198487]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844994.8760428-3818-7624133860315/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:36.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:36 np0005603623 python3.9[198640]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:37 np0005603623 python3.9[198718]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:37.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:37 np0005603623 python3.9[198870]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:38 np0005603623 python3.9[198948]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:38.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:38 np0005603623 python3.9[199101]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:39 np0005603623 python3.9[199226]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844998.405621-3935-207125515122742/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:39.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:40 np0005603623 python3.9[199378]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:40.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:40 np0005603623 python3.9[199531]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:41.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:41 np0005603623 python3.9[199686]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:42.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:42 np0005603623 python3.9[199839]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:43 np0005603623 python3.9[199992]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:36:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:36:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:43.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:36:44 np0005603623 python3.9[200146]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:44.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:45 np0005603623 python3.9[200303]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:45.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:45 np0005603623 python3.9[200455]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:46 np0005603623 python3.9[200578]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845005.2622635-4150-29587896640322/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:46.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:46 np0005603623 python3.9[200731]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:47 np0005603623 python3.9[200904]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845006.4968665-4196-120033256972109/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:47.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:48 np0005603623 python3.9[201056]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:48.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:48 np0005603623 python3.9[201180]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845007.6513474-4242-156546687218771/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:49 np0005603623 python3.9[201332]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:36:49 np0005603623 systemd[1]: Reloading.
Jan 31 02:36:49 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:49 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:49.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:49 np0005603623 systemd[1]: Reached target edpm_libvirt.target.
Jan 31 02:36:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:50.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:50 np0005603623 python3.9[201525]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 31 02:36:50 np0005603623 systemd[1]: Reloading.
Jan 31 02:36:50 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:50 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:50 np0005603623 systemd[1]: Reloading.
Jan 31 02:36:51 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:51 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:51.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:51 np0005603623 systemd-logind[795]: Session 48 logged out. Waiting for processes to exit.
Jan 31 02:36:51 np0005603623 systemd[1]: session-48.scope: Deactivated successfully.
Jan 31 02:36:51 np0005603623 systemd[1]: session-48.scope: Consumed 2min 50.268s CPU time.
Jan 31 02:36:51 np0005603623 systemd-logind[795]: Removed session 48.
Jan 31 02:36:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:52.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:36:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:53.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:36:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:54.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:55.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:36:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:56.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:36:56 np0005603623 podman[201625]: 2026-01-31 07:36:56.992413458 +0000 UTC m=+0.086801498 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 02:36:57 np0005603623 systemd-logind[795]: New session 49 of user zuul.
Jan 31 02:36:57 np0005603623 systemd[1]: Started Session 49 of User zuul.
Jan 31 02:36:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:57.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:58 np0005603623 python3.9[201807]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:36:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:36:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:58.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:36:59 np0005603623 python3.9[201962]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:36:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:36:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:59.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:59 np0005603623 network[201979]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:36:59 np0005603623 network[201980]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:36:59 np0005603623 network[201981]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:37:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:00.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:01.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:02.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:02 np0005603623 podman[202128]: 2026-01-31 07:37:02.976504447 +0000 UTC m=+0.068763893 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 02:37:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:03.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:04 np0005603623 python3.9[202275]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:37:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:04.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:05 np0005603623 python3.9[202360]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:37:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:37:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:05.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:37:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:06.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:07.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:37:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:08.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:37:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:09.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:10.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:11 np0005603623 python3.9[202566]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:37:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:11.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:12 np0005603623 python3.9[202718]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:37:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:12.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:12 np0005603623 python3.9[202872]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:37:13 np0005603623 python3.9[203024]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:37:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:13.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:14 np0005603623 python3.9[203178]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:37:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:14.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:14 np0005603623 python3.9[203301]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845033.8357778-247-95757590070517/.source.iscsi _original_basename=.29zau3kh follow=False checksum=047494f7bb8b2e3d4102ed656af55448433e58e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:15.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:15 np0005603623 python3.9[203453]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:16.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:16 np0005603623 python3.9[203606]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:17.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:17 np0005603623 python3.9[203758]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:37:17 np0005603623 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 31 02:37:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:18.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:18 np0005603623 python3.9[203915]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:37:18 np0005603623 systemd[1]: Reloading.
Jan 31 02:37:18 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:37:18 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:37:18 np0005603623 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 31 02:37:18 np0005603623 systemd[1]: Starting Open-iSCSI...
Jan 31 02:37:18 np0005603623 kernel: Loading iSCSI transport class v2.0-870.
Jan 31 02:37:18 np0005603623 systemd[1]: Started Open-iSCSI.
Jan 31 02:37:18 np0005603623 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 31 02:37:18 np0005603623 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 31 02:37:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:19.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:20 np0005603623 python3.9[204114]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:37:20 np0005603623 network[204132]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:37:20 np0005603623 network[204133]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:37:20 np0005603623 network[204134]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:37:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:37:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:20.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:37:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:37:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:21.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:37:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:22.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:23.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:24 np0005603623 python3.9[204407]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:37:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:24.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:25.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:26 np0005603623 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:37:26 np0005603623 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:37:26 np0005603623 systemd[1]: Reloading.
Jan 31 02:37:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:26 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:37:26 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:37:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:37:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:26.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:37:26 np0005603623 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:37:26 np0005603623 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:37:26 np0005603623 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:37:26 np0005603623 systemd[1]: run-rf7e4378eee524373824b1d9978d2f717.service: Deactivated successfully.
Jan 31 02:37:27 np0005603623 podman[204622]: 2026-01-31 07:37:27.329379469 +0000 UTC m=+0.115706011 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 02:37:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:27.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:27 np0005603623 python3.9[204801]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 31 02:37:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:28.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:28 np0005603623 python3.9[204954]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 31 02:37:28 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:37:29 np0005603623 python3.9[205111]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:37:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:37:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:29.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:37:29 np0005603623 python3.9[205234]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845049.0730164-512-214477179503549/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:37:30.069 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:37:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:37:30.070 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:37:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:37:30.070 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:37:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:30.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:30 np0005603623 python3.9[205387]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:31.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:31 np0005603623 python3.9[205656]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:37:32 np0005603623 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 31 02:37:32 np0005603623 systemd[1]: Stopped Load Kernel Modules.
Jan 31 02:37:32 np0005603623 systemd[1]: Stopping Load Kernel Modules...
Jan 31 02:37:32 np0005603623 systemd[1]: Starting Load Kernel Modules...
Jan 31 02:37:32 np0005603623 systemd[1]: Finished Load Kernel Modules.
Jan 31 02:37:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:37:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:37:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:37:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:32.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:32 np0005603623 python3.9[205827]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:37:33 np0005603623 podman[205952]: 2026-01-31 07:37:33.534993822 +0000 UTC m=+0.077371943 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:37:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:33.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:33 np0005603623 python3.9[205997]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:37:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:34.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:34 np0005603623 python3.9[206153]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:37:35 np0005603623 python3.9[206276]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845054.0039294-665-278028872744849/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:37:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:35.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:37:35 np0005603623 python3.9[206428]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:37:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:36 np0005603623 python3.9[206582]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:36.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:37.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:37:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:37:38 np0005603623 python3.9[206735]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:38.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:39 np0005603623 python3.9[206937]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:39.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:39 np0005603623 python3.9[207089]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:40.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:40 np0005603623 python3.9[207242]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:41 np0005603623 python3.9[207394]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:41.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:41 np0005603623 python3.9[207546]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:42.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:43 np0005603623 python3.9[207699]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:37:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:43.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:43 np0005603623 python3.9[207853]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:37:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:44.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:45 np0005603623 python3.9[208007]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:37:45 np0005603623 systemd[1]: Listening on multipathd control socket.
Jan 31 02:37:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:45.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:46.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:46 np0005603623 python3.9[208164]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:37:47 np0005603623 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 31 02:37:47 np0005603623 udevadm[208169]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 31 02:37:47 np0005603623 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 31 02:37:47 np0005603623 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 31 02:37:47 np0005603623 multipathd[208173]: --------start up--------
Jan 31 02:37:47 np0005603623 multipathd[208173]: read /etc/multipath.conf
Jan 31 02:37:47 np0005603623 multipathd[208173]: path checkers start up
Jan 31 02:37:47 np0005603623 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 31 02:37:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:47.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:48 np0005603623 python3.9[208383]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 31 02:37:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:48.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:49 np0005603623 python3.9[208535]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 31 02:37:49 np0005603623 kernel: Key type psk registered
Jan 31 02:37:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:49.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:50 np0005603623 python3.9[208698]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:37:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:50.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:50 np0005603623 python3.9[208822]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845069.5278165-1055-54375078560022/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:51 np0005603623 python3.9[208974]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:51.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:52 np0005603623 python3.9[209126]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:37:52 np0005603623 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 31 02:37:52 np0005603623 systemd[1]: Stopped Load Kernel Modules.
Jan 31 02:37:52 np0005603623 systemd[1]: Stopping Load Kernel Modules...
Jan 31 02:37:52 np0005603623 systemd[1]: Starting Load Kernel Modules...
Jan 31 02:37:52 np0005603623 systemd[1]: Finished Load Kernel Modules.
Jan 31 02:37:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:52.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:53.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:54.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.849581) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845074849639, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 1855, "num_deletes": 501, "total_data_size": 4016985, "memory_usage": 4072664, "flush_reason": "Manual Compaction"}
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845074868442, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 1530405, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14352, "largest_seqno": 16202, "table_properties": {"data_size": 1524843, "index_size": 2317, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 16629, "raw_average_key_size": 19, "raw_value_size": 1510835, "raw_average_value_size": 1744, "num_data_blocks": 107, "num_entries": 866, "num_filter_entries": 866, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844920, "oldest_key_time": 1769844920, "file_creation_time": 1769845074, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 18938 microseconds, and 3342 cpu microseconds.
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.868512) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 1530405 bytes OK
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.868534) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.872612) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.872633) EVENT_LOG_v1 {"time_micros": 1769845074872627, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.872651) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 4007711, prev total WAL file size 4007711, number of live WAL files 2.
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.873561) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353031' seq:0, type:0; will stop at (end)
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(1494KB)], [27(10MB)]
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845074873603, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 12491348, "oldest_snapshot_seqno": -1}
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4191 keys, 8056653 bytes, temperature: kUnknown
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845074962035, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 8056653, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8027258, "index_size": 17829, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10501, "raw_key_size": 103457, "raw_average_key_size": 24, "raw_value_size": 7950084, "raw_average_value_size": 1896, "num_data_blocks": 752, "num_entries": 4191, "num_filter_entries": 4191, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769845074, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.962777) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 8056653 bytes
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.979894) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 140.3 rd, 90.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 10.5 +0.0 blob) out(7.7 +0.0 blob), read-write-amplify(13.4) write-amplify(5.3) OK, records in: 5135, records dropped: 944 output_compression: NoCompression
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.979920) EVENT_LOG_v1 {"time_micros": 1769845074979909, "job": 14, "event": "compaction_finished", "compaction_time_micros": 89019, "compaction_time_cpu_micros": 17455, "output_level": 6, "num_output_files": 1, "total_output_size": 8056653, "num_input_records": 5135, "num_output_records": 4191, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845074980165, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845074981036, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.873495) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.981114) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.981119) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.981120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.981122) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:37:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:37:54.981124) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:37:55 np0005603623 python3.9[209284]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:37:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:37:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:55.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:37:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:37:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:56.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:37:57 np0005603623 systemd[1]: Reloading.
Jan 31 02:37:57 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:37:57 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:37:57 np0005603623 systemd[1]: Reloading.
Jan 31 02:37:57 np0005603623 ceph-mgr[77391]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 02:37:57 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:37:57 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:37:57 np0005603623 systemd-logind[795]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 31 02:37:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:57.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:57 np0005603623 systemd-logind[795]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 31 02:37:57 np0005603623 podman[209363]: 2026-01-31 07:37:57.725102923 +0000 UTC m=+0.107993561 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller)
Jan 31 02:37:57 np0005603623 lvm[209426]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 31 02:37:57 np0005603623 lvm[209426]: VG ceph_vg0 finished
Jan 31 02:37:57 np0005603623 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:37:57 np0005603623 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:37:57 np0005603623 systemd[1]: Reloading.
Jan 31 02:37:58 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:37:58 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:37:58 np0005603623 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:37:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:37:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:58.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:37:59 np0005603623 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:37:59 np0005603623 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:37:59 np0005603623 systemd[1]: run-r07eb52c5462d42d480d536758802f1eb.service: Deactivated successfully.
Jan 31 02:37:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:37:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:37:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:59.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:38:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:00.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:00 np0005603623 python3.9[210783]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:38:00 np0005603623 systemd[1]: Stopping Open-iSCSI...
Jan 31 02:38:00 np0005603623 iscsid[203955]: iscsid shutting down.
Jan 31 02:38:00 np0005603623 systemd[1]: iscsid.service: Deactivated successfully.
Jan 31 02:38:00 np0005603623 systemd[1]: Stopped Open-iSCSI.
Jan 31 02:38:00 np0005603623 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 31 02:38:00 np0005603623 systemd[1]: Starting Open-iSCSI...
Jan 31 02:38:00 np0005603623 systemd[1]: Started Open-iSCSI.
Jan 31 02:38:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:01 np0005603623 python3.9[210940]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:38:01 np0005603623 multipathd[208173]: exit (signal)
Jan 31 02:38:01 np0005603623 multipathd[208173]: --------shut down-------
Jan 31 02:38:01 np0005603623 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 31 02:38:01 np0005603623 systemd[1]: multipathd.service: Deactivated successfully.
Jan 31 02:38:01 np0005603623 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 31 02:38:01 np0005603623 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 31 02:38:01 np0005603623 multipathd[210946]: --------start up--------
Jan 31 02:38:01 np0005603623 multipathd[210946]: read /etc/multipath.conf
Jan 31 02:38:01 np0005603623 multipathd[210946]: path checkers start up
Jan 31 02:38:01 np0005603623 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 31 02:38:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:01.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:02 np0005603623 python3.9[211103]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:38:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:38:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:02.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:38:03 np0005603623 python3.9[211260]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:03.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:03 np0005603623 podman[211285]: 2026-01-31 07:38:03.998170497 +0000 UTC m=+0.058761680 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 02:38:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:04.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:04 np0005603623 python3.9[211432]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:38:04 np0005603623 systemd[1]: Reloading.
Jan 31 02:38:04 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:38:04 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:38:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:38:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:05.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:38:05 np0005603623 python3.9[211617]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:38:05 np0005603623 network[211634]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:38:05 np0005603623 network[211635]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:38:05 np0005603623 network[211636]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:38:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:06.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:07.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:08.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:09.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:10 np0005603623 python3.9[211962]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:10.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:11 np0005603623 python3.9[212115]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:11.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:11 np0005603623 python3.9[212268]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:12.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:12 np0005603623 python3.9[212422]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:13 np0005603623 python3.9[212575]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:13.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:14 np0005603623 python3.9[212728]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:14.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:14 np0005603623 python3.9[212882]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:15 np0005603623 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 31 02:38:15 np0005603623 python3.9[213036]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:15.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:16 np0005603623 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 31 02:38:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:16.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:16 np0005603623 python3.9[213191]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:17 np0005603623 python3.9[213343]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:17.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:17 np0005603623 python3.9[213495]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:18.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:18 np0005603623 python3.9[213648]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:19 np0005603623 python3.9[213800]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:19 np0005603623 python3.9[213952]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:19.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:20 np0005603623 python3.9[214105]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:20.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:20 np0005603623 python3.9[214257]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:38:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:21.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:38:22 np0005603623 python3.9[214409]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:38:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:22.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:38:22 np0005603623 python3.9[214562]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:23 np0005603623 python3.9[214714]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:23.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:24 np0005603623 python3.9[214866]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:38:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:24.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:38:24 np0005603623 python3.9[215019]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:25 np0005603623 python3.9[215171]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:38:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:25.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:38:26 np0005603623 python3.9[215323]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:26 np0005603623 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 31 02:38:26 np0005603623 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 31 02:38:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:38:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:26.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:38:26 np0005603623 python3.9[215478]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:27.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:27 np0005603623 python3.9[215680]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:27 np0005603623 podman[215683]: 2026-01-31 07:38:27.934543764 +0000 UTC m=+0.084450233 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:38:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:28.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:28 np0005603623 python3.9[215859]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 02:38:29 np0005603623 python3.9[216011]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:38:29 np0005603623 systemd[1]: Reloading.
Jan 31 02:38:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:38:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:29.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:38:29 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:38:29 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:38:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:38:30.070 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:38:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:38:30.071 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:38:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:38:30.071 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:38:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:30.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:31 np0005603623 python3.9[216198]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:31 np0005603623 python3.9[216351]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.003000095s ======
Jan 31 02:38:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:31.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 31 02:38:32 np0005603623 python3.9[216504]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:32.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:32 np0005603623 python3.9[216658]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:33 np0005603623 python3.9[216811]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:38:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:33.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:38:34 np0005603623 python3.9[216964]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:34 np0005603623 podman[216966]: 2026-01-31 07:38:34.133046714 +0000 UTC m=+0.049138529 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 02:38:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:34.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:34 np0005603623 python3.9[217140]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:35 np0005603623 python3.9[217293]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:35.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:36.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:37 np0005603623 python3.9[217447]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:37.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:37 np0005603623 python3.9[217599]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:38 np0005603623 python3.9[217752]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:38.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:38 np0005603623 podman[218001]: 2026-01-31 07:38:38.863114929 +0000 UTC m=+0.045147574 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:38:38 np0005603623 podman[218001]: 2026-01-31 07:38:38.946818628 +0000 UTC m=+0.128851273 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:38:39 np0005603623 python3.9[218115]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:39 np0005603623 podman[218311]: 2026-01-31 07:38:39.413604387 +0000 UTC m=+0.046315720 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:38:39 np0005603623 podman[218357]: 2026-01-31 07:38:39.47124115 +0000 UTC m=+0.046451904 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:38:39 np0005603623 podman[218311]: 2026-01-31 07:38:39.476735773 +0000 UTC m=+0.109447156 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:38:39 np0005603623 podman[218452]: 2026-01-31 07:38:39.646382432 +0000 UTC m=+0.049110318 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, vcs-type=git, architecture=x86_64, distribution-scope=public, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., release=1793, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 31 02:38:39 np0005603623 podman[218452]: 2026-01-31 07:38:39.65589803 +0000 UTC m=+0.058625886 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, name=keepalived, version=2.2.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, vcs-type=git, architecture=x86_64)
Jan 31 02:38:39 np0005603623 python3.9[218426]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:39.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:38:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:38:40 np0005603623 python3.9[218745]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:40.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:40 np0005603623 python3.9[218921]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:38:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:38:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:38:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:41 np0005603623 python3.9[219073]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:38:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:41.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:38:42 np0005603623 python3.9[219226]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:38:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:42.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:38:42 np0005603623 python3.9[219378]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:43.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:44.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:45.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:38:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:46.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:38:46 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:38:46 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:38:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:38:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:47.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:38:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:48.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:48 np0005603623 python3.9[219633]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 31 02:38:49 np0005603623 python3.9[219786]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 02:38:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:38:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:49.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:38:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:50.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:50 np0005603623 python3.9[219945]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 02:38:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:38:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:51.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:38:52 np0005603623 systemd-logind[795]: New session 50 of user zuul.
Jan 31 02:38:52 np0005603623 systemd[1]: Started Session 50 of User zuul.
Jan 31 02:38:52 np0005603623 systemd[1]: session-50.scope: Deactivated successfully.
Jan 31 02:38:52 np0005603623 systemd-logind[795]: Session 50 logged out. Waiting for processes to exit.
Jan 31 02:38:52 np0005603623 systemd-logind[795]: Removed session 50.
Jan 31 02:38:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:52.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:52 np0005603623 python3.9[220132]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:38:53 np0005603623 python3.9[220253]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845132.4305966-2662-87212640288156/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:53.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:53 np0005603623 python3.9[220403]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:38:54 np0005603623 python3.9[220479]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:54.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:54 np0005603623 python3.9[220630]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:38:55 np0005603623 python3.9[220751]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845134.3123417-2662-136890329658452/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:55.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:55 np0005603623 python3.9[220901]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:38:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:56 np0005603623 python3.9[221023]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845135.3150823-2662-141959305732441/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:56.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:56 np0005603623 python3.9[221173]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:38:57 np0005603623 python3.9[221294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845136.4612412-2662-92721247401373/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:57.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:57 np0005603623 python3.9[221444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:38:58 np0005603623 podman[221540]: 2026-01-31 07:38:58.198926215 +0000 UTC m=+0.074899707 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 02:38:58 np0005603623 python3.9[221577]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845137.4933026-2662-3817262492364/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:58.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:59 np0005603623 python3.9[221744]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:38:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:59.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:00 np0005603623 python3.9[221896]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:39:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:00.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:00 np0005603623 python3.9[222049]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:01 np0005603623 python3.9[222201]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:39:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:01.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:02 np0005603623 python3.9[222324]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769845141.1161761-2983-116397663886101/.source _original_basename=.o2a0mfqb follow=False checksum=ad58de695b11fac3fa7345417f2c10dd5d6dc81a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 31 02:39:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:02.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:02 np0005603623 python3.9[222477]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:03 np0005603623 python3.9[222629]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:39:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:03.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:04 np0005603623 python3.9[222750]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845143.1986468-3061-273782909374302/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:39:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:04.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:04 np0005603623 podman[222875]: 2026-01-31 07:39:04.774775886 +0000 UTC m=+0.042498022 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 02:39:04 np0005603623 python3.9[222914]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:39:05 np0005603623 python3.9[223041]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845144.3256044-3106-37148545573497/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:39:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:39:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:05.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:39:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:06 np0005603623 python3.9[223194]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 31 02:39:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:06.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:07 np0005603623 python3.9[223346]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 02:39:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:39:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:07.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:39:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:08.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:08 np0005603623 python3[223549]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 02:39:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:09.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:39:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:10.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:39:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:11.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:39:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:12.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:39:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:13.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:14.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:15.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:16.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:17.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:18.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:19.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:20.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:21.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:39:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:22.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:39:23 np0005603623 ceph-mds[84161]: mds.beacon.cephfs.compute-2.asgtzy missed beacon ack from the monitors
Jan 31 02:39:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:23.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:24.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:25.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:26.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:27 np0005603623 ceph-mds[84161]: mds.beacon.cephfs.compute-2.asgtzy missed beacon ack from the monitors
Jan 31 02:39:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:27.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).paxos(paxos updating c 1005..1710) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 5.705521107s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Jan 31 02:39:28 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2[77033]: 2026-01-31T07:39:28.006+0000 7fb35ec97640 -1 mon.compute-2@1(peon).paxos(paxos updating c 1005..1710) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 5.705521107s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Jan 31 02:39:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:39:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:28.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:39:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:39:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:29.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:39:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:39:30.072 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:39:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:39:30.072 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:39:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:39:30.072 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:39:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:39:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:30.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:39:31 np0005603623 ceph-mds[84161]: mds.beacon.cephfs.compute-2.asgtzy missed beacon ack from the monitors
Jan 31 02:39:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:31.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:31 np0005603623 ceph-mon[77037]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 31 02:39:31 np0005603623 ceph-mon[77037]: paxos.1).electionLogic(15) init, last seen epoch 15, mid-election, bumping
Jan 31 02:39:32 np0005603623 ceph-mds[84161]: mds.beacon.cephfs.compute-2.asgtzy MDS connection to Monitors appears to be laggy; 17.3065s since last acked beacon
Jan 31 02:39:32 np0005603623 ceph-mds[84161]: mds.0.4 skipping upkeep work because connection to Monitors appears laggy
Jan 31 02:39:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:39:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:32.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:39:32 np0005603623 podman[223685]: 2026-01-31 07:39:32.832039326 +0000 UTC m=+3.833085581 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Jan 31 02:39:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:39:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:33.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:39:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:34.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:34 np0005603623 podman[223715]: 2026-01-31 07:39:34.948892855 +0000 UTC m=+0.038463236 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:39:35 np0005603623 ceph-mds[84161]: mds.beacon.cephfs.compute-2.asgtzy missed beacon ack from the monitors
Jan 31 02:39:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:35.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:36.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:39:37 np0005603623 ceph-mds[84161]: mds.0.4 skipping upkeep work because connection to Monitors appears laggy
Jan 31 02:39:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:37.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:38.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:39 np0005603623 ceph-mds[84161]: mds.beacon.cephfs.compute-2.asgtzy missed beacon ack from the monitors
Jan 31 02:39:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:39:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:39.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:39:39 np0005603623 ceph-mgr[77391]: client.0 ms_handle_reset on v2:192.168.122.100:3300/0
Jan 31 02:39:39 np0005603623 ceph-mgr[77391]: client.0 ms_handle_reset on v2:192.168.122.100:3300/0
Jan 31 02:39:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:40.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:39:40 np0005603623 ceph-mgr[77391]: ms_deliver_dispatch: unhandled message 0x5645f6eaaf20 mon_map magic: 0 v1 from mon.1 v2:192.168.122.102:3300/0
Jan 31 02:39:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:41.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:39:41 np0005603623 ceph-mds[84161]: mds.beacon.cephfs.compute-2.asgtzy  MDS is no longer laggy
Jan 31 02:39:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:42.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:43 np0005603623 podman[223561]: 2026-01-31 07:39:43.39624268 +0000 UTC m=+34.720101827 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 31 02:39:43 np0005603623 ceph-mon[77037]: mon.compute-0 calling monitor election
Jan 31 02:39:43 np0005603623 ceph-mon[77037]: mon.compute-1 calling monitor election
Jan 31 02:39:43 np0005603623 ceph-mon[77037]: mon.compute-2 calling monitor election
Jan 31 02:39:43 np0005603623 ceph-mon[77037]: mon.compute-0 calling monitor election
Jan 31 02:39:43 np0005603623 ceph-mon[77037]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 31 02:39:43 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 02:39:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:39:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:39:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:39:43 np0005603623 podman[223764]: 2026-01-31 07:39:43.522661118 +0000 UTC m=+0.060329380 container create b8f9d0758750e96f7d898f257e3b805b90943c4ed2917d32885547f8acfd2015 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 02:39:43 np0005603623 podman[223764]: 2026-01-31 07:39:43.479569329 +0000 UTC m=+0.017237601 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 31 02:39:43 np0005603623 python3[223549]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 31 02:39:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:43.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:44 np0005603623 python3.9[223954]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:44.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:45 np0005603623 python3.9[224109]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 31 02:39:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:45.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:46 np0005603623 python3.9[224262]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 02:39:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:46.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:47 np0005603623 python3[224414]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 02:39:47 np0005603623 podman[224452]: 2026-01-31 07:39:47.792419563 +0000 UTC m=+0.051066140 container create 54b55af4484bb0c643e6bbf1561c1d87fdda2d75731c1a36a5acae575377dd3a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Jan 31 02:39:47 np0005603623 podman[224452]: 2026-01-31 07:39:47.766964866 +0000 UTC m=+0.025611453 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 31 02:39:47 np0005603623 python3[224414]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 31 02:39:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:47.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:39:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:48.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:39:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:39:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:39:49 np0005603623 python3.9[224743]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:39:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:49.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:39:50 np0005603623 python3.9[224897]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:39:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:50.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:50 np0005603623 python3.9[225049]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769845190.2169812-3394-14747999821441/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:39:51 np0005603623 python3.9[225125]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:39:51 np0005603623 systemd[1]: Reloading.
Jan 31 02:39:51 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:39:51 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:39:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:51.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:52 np0005603623 python3.9[225236]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:39:52 np0005603623 systemd[1]: Reloading.
Jan 31 02:39:52 np0005603623 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:39:52 np0005603623 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:39:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:39:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:52.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:39:52 np0005603623 systemd[1]: Starting nova_compute container...
Jan 31 02:39:52 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:39:52 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17098bae5c5841c547f70966a7dbbedc08543324b5a857df8cbaf25e4f6795f5/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 31 02:39:52 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17098bae5c5841c547f70966a7dbbedc08543324b5a857df8cbaf25e4f6795f5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 31 02:39:52 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17098bae5c5841c547f70966a7dbbedc08543324b5a857df8cbaf25e4f6795f5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 31 02:39:52 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17098bae5c5841c547f70966a7dbbedc08543324b5a857df8cbaf25e4f6795f5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 31 02:39:52 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17098bae5c5841c547f70966a7dbbedc08543324b5a857df8cbaf25e4f6795f5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 31 02:39:52 np0005603623 podman[225277]: 2026-01-31 07:39:52.795252548 +0000 UTC m=+0.096278925 container init 54b55af4484bb0c643e6bbf1561c1d87fdda2d75731c1a36a5acae575377dd3a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=nova_compute)
Jan 31 02:39:52 np0005603623 podman[225277]: 2026-01-31 07:39:52.799923185 +0000 UTC m=+0.100949542 container start 54b55af4484bb0c643e6bbf1561c1d87fdda2d75731c1a36a5acae575377dd3a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 02:39:52 np0005603623 podman[225277]: nova_compute
Jan 31 02:39:52 np0005603623 nova_compute[225292]: + sudo -E kolla_set_configs
Jan 31 02:39:52 np0005603623 systemd[1]: Started nova_compute container.
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Validating config file
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Copying service configuration files
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Deleting /etc/ceph
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Creating directory /etc/ceph
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /etc/ceph
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Writing out command to execute
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:39:52 np0005603623 nova_compute[225292]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 31 02:39:52 np0005603623 nova_compute[225292]: ++ cat /run_command
Jan 31 02:39:52 np0005603623 nova_compute[225292]: + CMD=nova-compute
Jan 31 02:39:52 np0005603623 nova_compute[225292]: + ARGS=
Jan 31 02:39:52 np0005603623 nova_compute[225292]: + sudo kolla_copy_cacerts
Jan 31 02:39:52 np0005603623 nova_compute[225292]: + [[ ! -n '' ]]
Jan 31 02:39:52 np0005603623 nova_compute[225292]: + . kolla_extend_start
Jan 31 02:39:52 np0005603623 nova_compute[225292]: + echo 'Running command: '\''nova-compute'\'''
Jan 31 02:39:52 np0005603623 nova_compute[225292]: Running command: 'nova-compute'
Jan 31 02:39:52 np0005603623 nova_compute[225292]: + umask 0022
Jan 31 02:39:52 np0005603623 nova_compute[225292]: + exec nova-compute
Jan 31 02:39:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:53.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:54 np0005603623 python3.9[225455]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:54.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:55 np0005603623 nova_compute[225292]: 2026-01-31 07:39:55.141 225296 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 31 02:39:55 np0005603623 nova_compute[225292]: 2026-01-31 07:39:55.142 225296 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 31 02:39:55 np0005603623 nova_compute[225292]: 2026-01-31 07:39:55.142 225296 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 31 02:39:55 np0005603623 nova_compute[225292]: 2026-01-31 07:39:55.142 225296 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 31 02:39:55 np0005603623 nova_compute[225292]: 2026-01-31 07:39:55.294 225296 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:39:55 np0005603623 nova_compute[225292]: 2026-01-31 07:39:55.319 225296 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:39:55 np0005603623 nova_compute[225292]: 2026-01-31 07:39:55.319 225296 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 02:39:55 np0005603623 python3.9[225608]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:55.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:56 np0005603623 nova_compute[225292]: 2026-01-31 07:39:56.555 225296 INFO nova.virt.driver [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 31 02:39:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:39:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:56.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:39:56 np0005603623 python3.9[225760]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:56 np0005603623 nova_compute[225292]: 2026-01-31 07:39:56.694 225296 INFO nova.compute.provider_config [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.034 225296 DEBUG oslo_concurrency.lockutils [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.035 225296 DEBUG oslo_concurrency.lockutils [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.035 225296 DEBUG oslo_concurrency.lockutils [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.035 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.035 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.036 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.036 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.036 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.036 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.036 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.036 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.037 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.037 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.037 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.037 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.037 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.037 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.037 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.038 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.038 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.038 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.038 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.038 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.038 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.038 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.039 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.039 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.039 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.039 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.039 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.039 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.040 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.040 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.040 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.040 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.040 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.040 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.041 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.041 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.041 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.041 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.041 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.041 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.041 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.042 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.042 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.042 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.042 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.042 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.042 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.043 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.043 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.043 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.043 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.043 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.043 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.044 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.044 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.044 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.044 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.044 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.044 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.045 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.045 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.045 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.045 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.045 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.046 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.046 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.046 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.046 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.046 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.046 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.047 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.047 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.047 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.047 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.047 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.047 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.047 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.048 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.048 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.048 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.048 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.048 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.048 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.048 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.048 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.049 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.049 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.049 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.049 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.049 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.049 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.049 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.050 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.050 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.050 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.050 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.050 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.050 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.050 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.051 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.051 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.051 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.051 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.051 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.051 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.051 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.051 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.052 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.052 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.052 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.052 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.052 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.052 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.052 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.053 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.053 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.053 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.053 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.053 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.053 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.053 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.054 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.054 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.054 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.054 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.054 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.054 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.054 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.054 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.055 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.055 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.055 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.055 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.055 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.055 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.056 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.056 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.056 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.056 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.056 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.056 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.056 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.057 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.057 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.057 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.057 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.057 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.057 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.057 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.058 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.058 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.058 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.058 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.058 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.058 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.058 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.059 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.059 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.059 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.059 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.059 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.059 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.059 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.059 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.060 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.060 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.060 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.060 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.060 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.060 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.060 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.061 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.061 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.061 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.061 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.061 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.061 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.061 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.062 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.062 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.062 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.062 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.062 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.062 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.062 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.063 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.063 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.063 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.063 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.063 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.063 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.063 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.064 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.064 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.064 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.064 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.064 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.064 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.064 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.065 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.065 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.065 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.065 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.065 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.065 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.065 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.066 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.066 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.066 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.066 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.066 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.066 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.066 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.067 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.067 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.067 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.067 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.067 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.067 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.067 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.068 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.068 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.068 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.068 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.068 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.069 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.069 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.069 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.069 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.069 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.069 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.070 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.070 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.070 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.070 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.070 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.071 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.071 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.071 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.071 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.071 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.072 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.072 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.072 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.072 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.072 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.072 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.072 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.072 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.073 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.073 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.073 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.073 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.073 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.074 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.074 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.074 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.074 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.074 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.074 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.074 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.075 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.075 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.075 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.075 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.075 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.075 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.075 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.076 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.076 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.076 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.076 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.076 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.077 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.077 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.077 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.077 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.077 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.077 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.078 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.078 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.078 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.078 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.078 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.078 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.078 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.078 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.079 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.079 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.079 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.079 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.079 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.079 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.079 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.080 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.080 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.080 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.080 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.080 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.080 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.080 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.081 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.081 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.081 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.081 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.081 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.081 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.082 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.082 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.082 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.082 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.082 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.082 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.082 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.083 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.083 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.083 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.083 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.083 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.083 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.083 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.084 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.084 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.084 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.084 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.084 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.084 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.084 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.085 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.085 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.085 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.085 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.085 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.085 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.085 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.086 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.086 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.086 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.086 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.086 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.086 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.086 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.087 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.087 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.087 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.087 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.087 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.087 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.088 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.088 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.088 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.088 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.088 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.088 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.088 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.089 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.089 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.089 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.089 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.089 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.090 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.090 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.090 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.090 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.090 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.090 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.090 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.091 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.091 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.091 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.091 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.091 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.091 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.091 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.091 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.092 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.092 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.092 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.092 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.092 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.092 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.092 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.093 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.093 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.093 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.093 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.093 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.093 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.094 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.094 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.094 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.094 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.094 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.094 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.094 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.095 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.095 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.095 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.095 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.095 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.095 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.095 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.096 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.096 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.096 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.096 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.096 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.096 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.097 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.097 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.097 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.097 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.097 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.097 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.098 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.098 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.098 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.098 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.098 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.098 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.099 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.099 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.099 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.099 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.099 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.099 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.099 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.100 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.100 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.100 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.100 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.100 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.100 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.101 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.101 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.101 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.101 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.101 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.102 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.102 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.102 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.102 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.102 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.102 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.102 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.103 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.103 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.103 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.103 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.103 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.103 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.104 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.104 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.104 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.104 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.104 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.104 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.104 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.105 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.105 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.105 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.105 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.105 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.105 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.105 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.106 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.106 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.106 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.106 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.106 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.106 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.106 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.107 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.107 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.107 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.107 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.107 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.107 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.107 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.108 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.108 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.108 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.108 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.108 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.108 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.108 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.108 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.109 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.109 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.109 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.109 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.109 225296 WARNING oslo_config.cfg [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 31 02:39:57 np0005603623 nova_compute[225292]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 31 02:39:57 np0005603623 nova_compute[225292]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 31 02:39:57 np0005603623 nova_compute[225292]: and ``live_migration_inbound_addr`` respectively.
Jan 31 02:39:57 np0005603623 nova_compute[225292]: ).  Its value may be silently ignored in the future.#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.109 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.110 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.110 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.110 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.110 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.110 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.110 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.111 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.111 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.111 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.111 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.111 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.111 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.111 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.112 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.112 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.112 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.112 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.112 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.rbd_secret_uuid        = 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.112 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.112 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.113 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.113 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.113 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.113 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.113 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.113 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.113 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.114 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.114 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.114 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.114 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.115 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.115 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.115 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.115 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.115 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.115 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.116 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.116 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.116 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.116 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.116 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.116 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.117 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.117 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.117 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.117 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.117 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.117 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.117 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.118 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.118 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.118 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.118 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.118 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.118 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.118 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.119 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.119 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.119 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.119 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.119 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.119 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.120 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.120 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.120 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.120 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.120 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.120 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.120 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.121 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.121 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.121 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.121 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.121 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.121 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.121 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.122 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.122 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.122 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.122 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.122 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.122 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.122 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.123 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.123 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.123 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.123 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.123 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.123 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.123 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.124 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.124 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.124 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.124 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.124 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.124 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.124 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.124 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.125 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.125 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.125 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.125 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.125 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.125 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.125 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.125 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.126 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.126 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.126 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.126 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.126 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.126 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.126 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.126 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.127 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.127 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.127 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.127 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.127 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.127 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.127 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.128 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.128 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.128 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.128 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.128 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.128 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.128 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.128 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.129 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.129 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.129 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.129 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.129 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.129 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.129 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.130 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.130 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.130 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.130 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.130 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.130 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.130 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.131 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.131 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.131 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.131 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.131 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.131 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.131 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.131 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.132 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.132 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.132 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.132 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.132 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.132 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.132 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.133 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.133 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.133 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.133 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.133 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.133 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.133 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.133 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.134 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.134 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.134 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.134 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.134 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.134 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.134 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.135 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.135 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.135 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.135 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.135 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.135 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.135 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.136 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.136 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.136 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.136 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.136 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.136 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.136 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.136 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.137 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.137 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.137 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.137 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.137 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.137 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.137 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.138 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.138 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.138 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.138 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.138 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.138 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.138 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.139 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.139 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.139 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.139 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.139 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.139 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.139 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.139 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.140 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.140 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.140 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.140 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.140 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.140 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.140 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.141 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.141 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.141 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.141 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.141 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.141 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.141 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.141 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.142 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.142 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.142 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.142 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.142 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.142 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.142 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.142 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.143 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.143 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.143 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.143 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.143 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.143 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.143 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.144 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.144 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.144 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.144 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.144 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.144 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.144 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.145 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.145 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.145 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.145 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.145 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.145 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.145 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.146 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.146 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.146 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.146 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.146 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.146 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.146 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.146 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.147 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.147 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.147 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.147 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.147 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.147 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.147 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.147 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.148 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.148 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.148 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.148 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.148 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.148 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.148 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.149 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.149 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.149 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.149 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.149 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.149 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.149 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.149 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.150 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.150 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.150 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.150 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.150 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.150 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.151 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.151 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.151 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.151 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.151 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.151 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.151 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.151 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.152 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.152 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.152 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.152 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.152 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.152 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.152 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.153 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.153 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.153 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.153 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.153 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.153 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.153 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.153 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.154 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.154 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.154 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.154 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.154 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.154 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.154 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.155 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.155 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.155 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.155 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.155 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.155 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.155 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.155 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.156 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.156 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.156 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.156 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.156 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.156 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.156 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.157 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.157 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.157 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.157 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.157 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.157 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.157 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.158 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.158 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.158 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.158 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.158 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.158 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.158 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.158 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.159 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.159 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.159 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.159 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.159 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.159 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.159 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.159 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.160 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.160 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.160 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.160 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.160 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.160 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.160 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.160 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.161 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.161 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.161 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.161 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.161 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.161 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.161 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.161 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.162 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.162 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.162 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.162 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.162 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.162 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.162 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.163 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.163 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.163 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.163 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.163 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.163 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.163 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.163 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.164 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.164 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.164 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.164 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.164 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.164 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.164 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.164 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.165 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.165 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.165 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.165 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.165 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.165 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.165 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.166 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.166 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.166 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.166 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.166 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.166 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.166 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.166 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.167 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.167 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.167 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.167 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.167 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.167 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.167 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.167 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.168 225296 DEBUG oslo_service.service [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.169 225296 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.230 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.231 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.231 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.231 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 31 02:39:57 np0005603623 systemd[1]: Starting libvirt QEMU daemon...
Jan 31 02:39:57 np0005603623 systemd[1]: Started libvirt QEMU daemon.
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.340 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f58e2357af0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.343 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f58e2357af0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.343 225296 INFO nova.virt.libvirt.driver [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 31 02:39:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.460 225296 WARNING nova.virt.libvirt.driver [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 31 02:39:57 np0005603623 nova_compute[225292]: 2026-01-31 07:39:57.460 225296 DEBUG nova.virt.libvirt.volume.mount [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 31 02:39:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:57.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:58 np0005603623 python3.9[225964]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 31 02:39:58 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:39:58 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.397 225296 INFO nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Libvirt host capabilities <capabilities>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <host>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <uuid>4e15465d-7c03-4925-9fc3-ba6a686b7adc</uuid>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <cpu>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <arch>x86_64</arch>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model>EPYC-Rome-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <vendor>AMD</vendor>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <microcode version='16777317'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <signature family='23' model='49' stepping='0'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='x2apic'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='tsc-deadline'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='osxsave'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='hypervisor'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='tsc_adjust'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='spec-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='stibp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='arch-capabilities'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='cmp_legacy'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='topoext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='virt-ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='lbrv'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='tsc-scale'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='vmcb-clean'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='pause-filter'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='pfthreshold'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='svme-addr-chk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='rdctl-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='skip-l1dfl-vmentry'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='mds-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature name='pschange-mc-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <pages unit='KiB' size='4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <pages unit='KiB' size='2048'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <pages unit='KiB' size='1048576'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </cpu>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <power_management>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <suspend_mem/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </power_management>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <iommu support='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <migration_features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <live/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <uri_transports>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <uri_transport>tcp</uri_transport>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <uri_transport>rdma</uri_transport>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </uri_transports>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </migration_features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <topology>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <cells num='1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <cell id='0'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:          <memory unit='KiB'>7864300</memory>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:          <pages unit='KiB' size='4'>1966075</pages>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:          <pages unit='KiB' size='2048'>0</pages>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:          <distances>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:            <sibling id='0' value='10'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:          </distances>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:          <cpus num='8'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:          </cpus>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        </cell>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </cells>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </topology>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <cache>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </cache>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <secmodel>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model>selinux</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <doi>0</doi>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </secmodel>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <secmodel>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model>dac</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <doi>0</doi>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </secmodel>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </host>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <guest>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <os_type>hvm</os_type>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <arch name='i686'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <wordsize>32</wordsize>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <domain type='qemu'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <domain type='kvm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </arch>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <pae/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <nonpae/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <acpi default='on' toggle='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <apic default='on' toggle='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <cpuselection/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <deviceboot/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <disksnapshot default='on' toggle='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <externalSnapshot/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </guest>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <guest>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <os_type>hvm</os_type>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <arch name='x86_64'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <wordsize>64</wordsize>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <domain type='qemu'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <domain type='kvm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </arch>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <acpi default='on' toggle='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <apic default='on' toggle='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <cpuselection/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <deviceboot/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <disksnapshot default='on' toggle='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <externalSnapshot/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </guest>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 
Jan 31 02:39:58 np0005603623 nova_compute[225292]: </capabilities>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: #033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.405 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.424 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 31 02:39:58 np0005603623 nova_compute[225292]: <domainCapabilities>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <domain>kvm</domain>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <arch>i686</arch>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <vcpu max='240'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <iothreads supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <os supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <enum name='firmware'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <loader supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>rom</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pflash</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='readonly'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>yes</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>no</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='secure'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>no</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </loader>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </os>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <cpu>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>on</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>off</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='maximum' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='maximumMigratable'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>on</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>off</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='host-model' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <vendor>AMD</vendor>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='x2apic'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='stibp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='succor'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='lbrv'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='custom' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='ClearwaterForest'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cooperlake'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cooperlake-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cooperlake-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Dhyana-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Genoa'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Turin'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Turin-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v6'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v7'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='KnightsMill'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='KnightsMill-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G4-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G5-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='athlon'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='athlon-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='core2duo'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='core2duo-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='coreduo'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='coreduo-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='n270'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='n270-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='phenom'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='phenom-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </cpu>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <memoryBacking supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <enum name='sourceType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>file</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>anonymous</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>memfd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </memoryBacking>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <devices>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <disk supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='diskDevice'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>disk</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>cdrom</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>floppy</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>lun</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='bus'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>ide</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>fdc</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>scsi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>usb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>sata</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </disk>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <graphics supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vnc</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>egl-headless</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>dbus</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </graphics>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <video supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='modelType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vga</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>cirrus</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>none</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>bochs</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>ramfb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </video>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <hostdev supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='mode'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>subsystem</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='startupPolicy'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>default</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>mandatory</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>requisite</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>optional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='subsysType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>usb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pci</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>scsi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='capsType'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='pciBackend'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </hostdev>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <rng supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>random</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>egd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>builtin</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </rng>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <filesystem supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='driverType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>path</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>handle</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtiofs</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </filesystem>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <tpm supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tpm-tis</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tpm-crb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>emulator</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>external</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendVersion'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>2.0</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </tpm>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <redirdev supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='bus'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>usb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </redirdev>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <channel supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pty</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>unix</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </channel>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <crypto supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>qemu</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>builtin</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </crypto>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <interface supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>default</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>passt</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </interface>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <panic supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>isa</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>hyperv</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </panic>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <console supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>null</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vc</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pty</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>dev</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>file</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pipe</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>stdio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>udp</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tcp</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>unix</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>qemu-vdagent</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>dbus</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </console>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </devices>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <gic supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <vmcoreinfo supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <genid supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <backingStoreInput supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <backup supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <async-teardown supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <s390-pv supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <ps2 supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <tdx supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <sev supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <sgx supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <hyperv supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='features'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>relaxed</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vapic</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>spinlocks</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vpindex</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>runtime</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>synic</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>stimer</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>reset</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vendor_id</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>frequencies</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>reenlightenment</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tlbflush</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>ipi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>avic</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>emsr_bitmap</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>xmm_input</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <defaults>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <spinlocks>4095</spinlocks>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <stimer_direct>on</stimer_direct>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </defaults>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </hyperv>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <launchSecurity supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: </domainCapabilities>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.430 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 31 02:39:58 np0005603623 nova_compute[225292]: <domainCapabilities>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <domain>kvm</domain>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <arch>i686</arch>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <vcpu max='4096'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <iothreads supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <os supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <enum name='firmware'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <loader supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>rom</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pflash</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='readonly'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>yes</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>no</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='secure'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>no</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </loader>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </os>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <cpu>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>on</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>off</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='maximum' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='maximumMigratable'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>on</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>off</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='host-model' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <vendor>AMD</vendor>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='x2apic'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='stibp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='succor'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='lbrv'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='custom' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='ClearwaterForest'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cooperlake'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cooperlake-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cooperlake-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Dhyana-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Genoa'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Turin'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Turin-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v6'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v7'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='KnightsMill'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='KnightsMill-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G4-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G5-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='athlon'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='athlon-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='core2duo'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='core2duo-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='coreduo'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='coreduo-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='n270'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='n270-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='phenom'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='phenom-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </cpu>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <memoryBacking supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <enum name='sourceType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>file</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>anonymous</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>memfd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </memoryBacking>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <devices>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <disk supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='diskDevice'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>disk</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>cdrom</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>floppy</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>lun</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='bus'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>fdc</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>scsi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>usb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>sata</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </disk>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <graphics supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vnc</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>egl-headless</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>dbus</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </graphics>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <video supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='modelType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vga</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>cirrus</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>none</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>bochs</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>ramfb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </video>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <hostdev supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='mode'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>subsystem</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='startupPolicy'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>default</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>mandatory</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>requisite</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>optional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='subsysType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>usb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pci</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>scsi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='capsType'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='pciBackend'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </hostdev>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <rng supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>random</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>egd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>builtin</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </rng>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <filesystem supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='driverType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>path</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>handle</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtiofs</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </filesystem>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <tpm supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tpm-tis</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tpm-crb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>emulator</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>external</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendVersion'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>2.0</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </tpm>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <redirdev supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='bus'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>usb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </redirdev>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <channel supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pty</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>unix</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </channel>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <crypto supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>qemu</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>builtin</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </crypto>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <interface supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>default</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>passt</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </interface>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <panic supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>isa</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>hyperv</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </panic>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <console supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>null</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vc</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pty</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>dev</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>file</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pipe</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>stdio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>udp</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tcp</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>unix</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>qemu-vdagent</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>dbus</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </console>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </devices>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <gic supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <vmcoreinfo supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <genid supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <backingStoreInput supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <backup supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <async-teardown supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <s390-pv supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <ps2 supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <tdx supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <sev supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <sgx supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <hyperv supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='features'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>relaxed</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vapic</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>spinlocks</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vpindex</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>runtime</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>synic</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>stimer</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>reset</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vendor_id</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>frequencies</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>reenlightenment</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tlbflush</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>ipi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>avic</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>emsr_bitmap</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>xmm_input</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <defaults>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <spinlocks>4095</spinlocks>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <stimer_direct>on</stimer_direct>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </defaults>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </hyperv>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <launchSecurity supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: </domainCapabilities>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.471 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.475 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 31 02:39:58 np0005603623 nova_compute[225292]: <domainCapabilities>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <domain>kvm</domain>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <arch>x86_64</arch>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <vcpu max='4096'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <iothreads supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <os supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <enum name='firmware'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>efi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <loader supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>rom</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pflash</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='readonly'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>yes</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>no</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='secure'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>yes</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>no</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </loader>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </os>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <cpu>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>on</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>off</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='maximum' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='maximumMigratable'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>on</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>off</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='host-model' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <vendor>AMD</vendor>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='x2apic'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='stibp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='succor'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='lbrv'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='custom' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='ClearwaterForest'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cooperlake'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cooperlake-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cooperlake-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Dhyana-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Genoa'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Turin'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Turin-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:39:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:58.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v6'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v7'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='KnightsMill'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='KnightsMill-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G4-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G5-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='athlon'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='athlon-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='core2duo'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='core2duo-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='coreduo'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='coreduo-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='n270'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='n270-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='phenom'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='phenom-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </cpu>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <memoryBacking supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <enum name='sourceType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>file</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>anonymous</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>memfd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </memoryBacking>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <devices>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <disk supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='diskDevice'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>disk</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>cdrom</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>floppy</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>lun</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='bus'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>fdc</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>scsi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>usb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>sata</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </disk>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <graphics supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vnc</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>egl-headless</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>dbus</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </graphics>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <video supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='modelType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vga</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>cirrus</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>none</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>bochs</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>ramfb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </video>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <hostdev supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='mode'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>subsystem</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='startupPolicy'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>default</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>mandatory</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>requisite</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>optional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='subsysType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>usb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pci</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>scsi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='capsType'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='pciBackend'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </hostdev>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <rng supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>random</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>egd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>builtin</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </rng>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <filesystem supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='driverType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>path</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>handle</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtiofs</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </filesystem>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <tpm supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tpm-tis</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tpm-crb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>emulator</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>external</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendVersion'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>2.0</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </tpm>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <redirdev supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='bus'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>usb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </redirdev>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <channel supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pty</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>unix</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </channel>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <crypto supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>qemu</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>builtin</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </crypto>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <interface supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>default</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>passt</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </interface>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <panic supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>isa</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>hyperv</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </panic>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <console supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>null</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vc</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pty</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>dev</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>file</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pipe</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>stdio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>udp</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tcp</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>unix</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>qemu-vdagent</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>dbus</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </console>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </devices>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <gic supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <vmcoreinfo supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <genid supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <backingStoreInput supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <backup supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <async-teardown supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <s390-pv supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <ps2 supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <tdx supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <sev supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <sgx supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <hyperv supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='features'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>relaxed</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vapic</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>spinlocks</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vpindex</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>runtime</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>synic</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>stimer</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>reset</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vendor_id</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>frequencies</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>reenlightenment</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tlbflush</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>ipi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>avic</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>emsr_bitmap</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>xmm_input</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <defaults>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <spinlocks>4095</spinlocks>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <stimer_direct>on</stimer_direct>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </defaults>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </hyperv>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <launchSecurity supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: </domainCapabilities>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.543 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 31 02:39:58 np0005603623 nova_compute[225292]: <domainCapabilities>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <domain>kvm</domain>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <arch>x86_64</arch>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <vcpu max='240'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <iothreads supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <os supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <enum name='firmware'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <loader supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>rom</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pflash</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='readonly'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>yes</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>no</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='secure'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>no</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </loader>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </os>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <cpu>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>on</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>off</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='maximum' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='maximumMigratable'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>on</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>off</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='host-model' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <vendor>AMD</vendor>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='x2apic'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='stibp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='succor'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='lbrv'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <mode name='custom' supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Broadwell-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='ClearwaterForest'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cooperlake'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cooperlake-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Cooperlake-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Denverton-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Dhyana-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Genoa'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Turin'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-Turin-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='EPYC-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='GraniteRapids-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Haswell-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v6'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Icelake-Server-v7'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='IvyBridge-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='KnightsMill'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='KnightsMill-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G4-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Opteron_G5-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SapphireRapids-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='SierraForest-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Client-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Skylake-Server-v5'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v2'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v3'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='Snowridge-v4'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='athlon'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='athlon-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='core2duo'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='core2duo-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='coreduo'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='coreduo-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='n270'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='n270-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='phenom'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <blockers model='phenom-v1'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </blockers>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </mode>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </cpu>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <memoryBacking supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <enum name='sourceType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>file</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>anonymous</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <value>memfd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </memoryBacking>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <devices>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <disk supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='diskDevice'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>disk</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>cdrom</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>floppy</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>lun</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='bus'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>ide</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>fdc</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>scsi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>usb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>sata</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </disk>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <graphics supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vnc</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>egl-headless</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>dbus</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </graphics>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <video supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='modelType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vga</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>cirrus</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>none</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>bochs</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>ramfb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </video>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <hostdev supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='mode'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>subsystem</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='startupPolicy'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>default</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>mandatory</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>requisite</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>optional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='subsysType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>usb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pci</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>scsi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='capsType'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='pciBackend'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </hostdev>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <rng supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>random</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>egd</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>builtin</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </rng>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <filesystem supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='driverType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>path</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>handle</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>virtiofs</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </filesystem>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <tpm supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tpm-tis</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tpm-crb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>emulator</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>external</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendVersion'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>2.0</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </tpm>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <redirdev supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='bus'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>usb</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </redirdev>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <channel supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pty</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>unix</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </channel>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <crypto supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>qemu</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>builtin</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </crypto>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <interface supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='backendType'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>default</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>passt</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </interface>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <panic supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='model'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>isa</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>hyperv</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </panic>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <console supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='type'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>null</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vc</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pty</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>dev</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>file</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>pipe</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>stdio</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>udp</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tcp</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>unix</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>qemu-vdagent</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>dbus</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </console>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </devices>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <gic supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <vmcoreinfo supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <genid supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <backingStoreInput supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <backup supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <async-teardown supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <s390-pv supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <ps2 supported='yes'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <tdx supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <sev supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <sgx supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <hyperv supported='yes'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <enum name='features'>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>relaxed</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vapic</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>spinlocks</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vpindex</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>runtime</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>synic</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>stimer</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>reset</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>vendor_id</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>frequencies</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>reenlightenment</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>tlbflush</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>ipi</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>avic</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>emsr_bitmap</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <value>xmm_input</value>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </enum>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      <defaults>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <spinlocks>4095</spinlocks>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <stimer_direct>on</stimer_direct>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:      </defaults>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    </hyperv>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:    <launchSecurity supported='no'/>
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  </features>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: </domainCapabilities>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.610 225296 DEBUG nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.610 225296 INFO nova.virt.libvirt.host [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Secure Boot support detected#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.612 225296 INFO nova.virt.libvirt.driver [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.612 225296 INFO nova.virt.libvirt.driver [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.619 225296 DEBUG nova.virt.libvirt.driver [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] cpu compare xml: <cpu match="exact">
Jan 31 02:39:58 np0005603623 nova_compute[225292]:  <model>Nehalem</model>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: </cpu>
Jan 31 02:39:58 np0005603623 nova_compute[225292]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.621 225296 DEBUG nova.virt.libvirt.driver [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.691 225296 INFO nova.virt.node [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Determined node identity 492dc482-9d1e-49ca-87f3-0104a8508b72 from /var/lib/nova/compute_id#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.754 225296 WARNING nova.compute.manager [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Compute nodes ['492dc482-9d1e-49ca-87f3-0104a8508b72'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.835 225296 INFO nova.compute.manager [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.907 225296 WARNING nova.compute.manager [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.907 225296 DEBUG oslo_concurrency.lockutils [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.907 225296 DEBUG oslo_concurrency.lockutils [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.908 225296 DEBUG oslo_concurrency.lockutils [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.908 225296 DEBUG nova.compute.resource_tracker [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:39:58 np0005603623 nova_compute[225292]: 2026-01-31 07:39:58.908 225296 DEBUG oslo_concurrency.processutils [None req-c38ec75c-95df-4a8b-8374-e8571b9ebd0a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:39:59 np0005603623 python3.9[226152]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:39:59 np0005603623 systemd[1]: Stopping nova_compute container...
Jan 31 02:39:59 np0005603623 nova_compute[225292]: 2026-01-31 07:39:59.417 225296 DEBUG oslo_concurrency.lockutils [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:39:59 np0005603623 nova_compute[225292]: 2026-01-31 07:39:59.419 225296 DEBUG oslo_concurrency.lockutils [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:39:59 np0005603623 nova_compute[225292]: 2026-01-31 07:39:59.419 225296 DEBUG oslo_concurrency.lockutils [None req-1ac22602-71fd-47b0-b4f6-6320df3c2f5d - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:39:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:39:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:59.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:59 np0005603623 virtqemud[225858]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 31 02:39:59 np0005603623 virtqemud[225858]: hostname: compute-2
Jan 31 02:39:59 np0005603623 virtqemud[225858]: End of file while reading data: Input/output error
Jan 31 02:39:59 np0005603623 systemd[1]: libpod-54b55af4484bb0c643e6bbf1561c1d87fdda2d75731c1a36a5acae575377dd3a.scope: Deactivated successfully.
Jan 31 02:39:59 np0005603623 systemd[1]: libpod-54b55af4484bb0c643e6bbf1561c1d87fdda2d75731c1a36a5acae575377dd3a.scope: Consumed 3.502s CPU time.
Jan 31 02:39:59 np0005603623 podman[226176]: 2026-01-31 07:39:59.846350474 +0000 UTC m=+0.458310475 container died 54b55af4484bb0c643e6bbf1561c1d87fdda2d75731c1a36a5acae575377dd3a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:39:59 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-54b55af4484bb0c643e6bbf1561c1d87fdda2d75731c1a36a5acae575377dd3a-userdata-shm.mount: Deactivated successfully.
Jan 31 02:39:59 np0005603623 systemd[1]: var-lib-containers-storage-overlay-17098bae5c5841c547f70966a7dbbedc08543324b5a857df8cbaf25e4f6795f5-merged.mount: Deactivated successfully.
Jan 31 02:39:59 np0005603623 podman[226176]: 2026-01-31 07:39:59.900481119 +0000 UTC m=+0.512441130 container cleanup 54b55af4484bb0c643e6bbf1561c1d87fdda2d75731c1a36a5acae575377dd3a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:39:59 np0005603623 podman[226176]: nova_compute
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:39:59.950356) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845199950437, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1217, "num_deletes": 255, "total_data_size": 2710848, "memory_usage": 2757864, "flush_reason": "Manual Compaction"}
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 31 02:39:59 np0005603623 podman[226206]: nova_compute
Jan 31 02:39:59 np0005603623 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 31 02:39:59 np0005603623 systemd[1]: Stopped nova_compute container.
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845199971138, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 1768003, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16207, "largest_seqno": 17419, "table_properties": {"data_size": 1762790, "index_size": 2673, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11139, "raw_average_key_size": 18, "raw_value_size": 1752061, "raw_average_value_size": 2984, "num_data_blocks": 121, "num_entries": 587, "num_filter_entries": 587, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845075, "oldest_key_time": 1769845075, "file_creation_time": 1769845199, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 20820 microseconds, and 3263 cpu microseconds.
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:39:59.971179) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 1768003 bytes OK
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:39:59.971195) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:39:59.974610) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:39:59.974625) EVENT_LOG_v1 {"time_micros": 1769845199974620, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:39:59.974642) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 2705038, prev total WAL file size 2705038, number of live WAL files 2.
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:39:59.975118) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323532' seq:0, type:0; will stop at (end)
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(1726KB)], [30(7867KB)]
Jan 31 02:39:59 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845199975157, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 9824656, "oldest_snapshot_seqno": -1}
Jan 31 02:39:59 np0005603623 systemd[1]: Starting nova_compute container...
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4250 keys, 9463507 bytes, temperature: kUnknown
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845200075741, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 9463507, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9432585, "index_size": 19164, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10693, "raw_key_size": 105991, "raw_average_key_size": 24, "raw_value_size": 9353059, "raw_average_value_size": 2200, "num_data_blocks": 801, "num_entries": 4250, "num_filter_entries": 4250, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769845199, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:40:00.075986) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 9463507 bytes
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:40:00.077964) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 97.6 rd, 94.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 7.7 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(10.9) write-amplify(5.4) OK, records in: 4778, records dropped: 528 output_compression: NoCompression
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:40:00.077982) EVENT_LOG_v1 {"time_micros": 1769845200077972, "job": 16, "event": "compaction_finished", "compaction_time_micros": 100703, "compaction_time_cpu_micros": 15377, "output_level": 6, "num_output_files": 1, "total_output_size": 9463507, "num_input_records": 4778, "num_output_records": 4250, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845200078176, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845200078715, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:39:59.975066) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:40:00.078840) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:40:00.078850) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:40:00.078853) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:40:00.078857) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:40:00 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:40:00.078860) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:40:00 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:40:00 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17098bae5c5841c547f70966a7dbbedc08543324b5a857df8cbaf25e4f6795f5/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:00 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17098bae5c5841c547f70966a7dbbedc08543324b5a857df8cbaf25e4f6795f5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:00 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17098bae5c5841c547f70966a7dbbedc08543324b5a857df8cbaf25e4f6795f5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:00 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17098bae5c5841c547f70966a7dbbedc08543324b5a857df8cbaf25e4f6795f5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:00 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17098bae5c5841c547f70966a7dbbedc08543324b5a857df8cbaf25e4f6795f5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:00 np0005603623 podman[226219]: 2026-01-31 07:40:00.143170971 +0000 UTC m=+0.158936100 container init 54b55af4484bb0c643e6bbf1561c1d87fdda2d75731c1a36a5acae575377dd3a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:40:00 np0005603623 podman[226219]: 2026-01-31 07:40:00.15463129 +0000 UTC m=+0.170396359 container start 54b55af4484bb0c643e6bbf1561c1d87fdda2d75731c1a36a5acae575377dd3a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3)
Jan 31 02:40:00 np0005603623 nova_compute[226235]: + sudo -E kolla_set_configs
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Validating config file
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Copying service configuration files
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Deleting /etc/ceph
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Creating directory /etc/ceph
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /etc/ceph
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Writing out command to execute
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:40:00 np0005603623 nova_compute[226235]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 31 02:40:00 np0005603623 podman[226219]: nova_compute
Jan 31 02:40:00 np0005603623 systemd[1]: Started nova_compute container.
Jan 31 02:40:00 np0005603623 nova_compute[226235]: ++ cat /run_command
Jan 31 02:40:00 np0005603623 nova_compute[226235]: + CMD=nova-compute
Jan 31 02:40:00 np0005603623 nova_compute[226235]: + ARGS=
Jan 31 02:40:00 np0005603623 nova_compute[226235]: + sudo kolla_copy_cacerts
Jan 31 02:40:00 np0005603623 nova_compute[226235]: + [[ ! -n '' ]]
Jan 31 02:40:00 np0005603623 nova_compute[226235]: + . kolla_extend_start
Jan 31 02:40:00 np0005603623 nova_compute[226235]: Running command: 'nova-compute'
Jan 31 02:40:00 np0005603623 nova_compute[226235]: + echo 'Running command: '\''nova-compute'\'''
Jan 31 02:40:00 np0005603623 nova_compute[226235]: + umask 0022
Jan 31 02:40:00 np0005603623 nova_compute[226235]: + exec nova-compute
Jan 31 02:40:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:40:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:40:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:40:00.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:40:01 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 02:40:01 np0005603623 python3.9[226399]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 31 02:40:01 np0005603623 systemd[1]: Started libpod-conmon-b8f9d0758750e96f7d898f257e3b805b90943c4ed2917d32885547f8acfd2015.scope.
Jan 31 02:40:01 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:40:01 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/549c1032c48ac67938323e13ddaf76828db1a810ca214000fb66b0803c8e48a7/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:01 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/549c1032c48ac67938323e13ddaf76828db1a810ca214000fb66b0803c8e48a7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:01 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/549c1032c48ac67938323e13ddaf76828db1a810ca214000fb66b0803c8e48a7/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:01 np0005603623 podman[226424]: 2026-01-31 07:40:01.457905207 +0000 UTC m=+0.208818591 container init b8f9d0758750e96f7d898f257e3b805b90943c4ed2917d32885547f8acfd2015 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 02:40:01 np0005603623 podman[226424]: 2026-01-31 07:40:01.467749926 +0000 UTC m=+0.218663320 container start b8f9d0758750e96f7d898f257e3b805b90943c4ed2917d32885547f8acfd2015 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm)
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Applying nova statedir ownership
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 31 02:40:01 np0005603623 nova_compute_init[226445]: INFO:nova_statedir:Nova statedir ownership complete
Jan 31 02:40:01 np0005603623 systemd[1]: libpod-b8f9d0758750e96f7d898f257e3b805b90943c4ed2917d32885547f8acfd2015.scope: Deactivated successfully.
Jan 31 02:40:01 np0005603623 python3.9[226399]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 31 02:40:01 np0005603623 podman[226446]: 2026-01-31 07:40:01.571534795 +0000 UTC m=+0.040668634 container died b8f9d0758750e96f7d898f257e3b805b90943c4ed2917d32885547f8acfd2015 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:40:01 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8f9d0758750e96f7d898f257e3b805b90943c4ed2917d32885547f8acfd2015-userdata-shm.mount: Deactivated successfully.
Jan 31 02:40:01 np0005603623 systemd[1]: var-lib-containers-storage-overlay-549c1032c48ac67938323e13ddaf76828db1a810ca214000fb66b0803c8e48a7-merged.mount: Deactivated successfully.
Jan 31 02:40:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:40:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:40:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:40:01.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:40:01 np0005603623 podman[226446]: 2026-01-31 07:40:01.85358482 +0000 UTC m=+0.322718589 container cleanup b8f9d0758750e96f7d898f257e3b805b90943c4ed2917d32885547f8acfd2015 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 02:40:01 np0005603623 systemd[1]: libpod-conmon-b8f9d0758750e96f7d898f257e3b805b90943c4ed2917d32885547f8acfd2015.scope: Deactivated successfully.
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.016 226239 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.016 226239 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.016 226239 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.016 226239 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.136 226239 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.145 226239 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.145 226239 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 02:40:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:40:02 np0005603623 systemd[1]: session-49.scope: Deactivated successfully.
Jan 31 02:40:02 np0005603623 systemd[1]: session-49.scope: Consumed 1min 42.442s CPU time.
Jan 31 02:40:02 np0005603623 systemd-logind[795]: Session 49 logged out. Waiting for processes to exit.
Jan 31 02:40:02 np0005603623 systemd-logind[795]: Removed session 49.
Jan 31 02:40:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:40:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:40:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:40:02.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.697 226239 INFO nova.virt.driver [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.818 226239 INFO nova.compute.provider_config [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.834 226239 DEBUG oslo_concurrency.lockutils [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.835 226239 DEBUG oslo_concurrency.lockutils [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.835 226239 DEBUG oslo_concurrency.lockutils [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.836 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.836 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.836 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.836 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.837 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.837 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.837 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.837 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.838 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.838 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.838 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.838 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.839 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.839 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.839 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.839 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.839 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.839 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.840 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.840 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.840 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.840 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.840 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.841 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.841 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.841 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.841 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.841 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.842 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.842 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.842 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.842 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.842 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.843 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.843 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.843 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.843 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.843 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.844 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.844 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.844 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.844 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.844 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.845 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.845 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.845 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.845 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.845 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.846 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.846 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.846 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.846 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.847 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.847 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.847 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.847 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.848 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.848 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.848 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.848 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.848 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.849 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.849 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.849 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.849 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.849 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.850 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.850 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.850 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.850 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.850 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.851 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.851 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.851 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.851 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.851 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.852 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.852 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.852 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.852 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.852 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.853 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.853 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.853 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.853 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.853 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.854 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.854 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.854 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.854 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.854 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.855 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.855 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.855 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.855 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.855 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.856 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.856 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.856 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.856 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.856 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.856 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.857 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.857 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.857 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.857 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.857 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.858 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.858 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.858 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.858 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.858 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.859 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.859 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.859 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.859 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.860 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.860 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.860 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.860 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.860 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.861 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.861 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.861 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.861 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.861 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.861 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.862 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.862 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.862 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.862 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.862 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.863 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.863 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.863 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.863 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.863 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.863 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.864 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.864 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.864 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.864 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.864 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.865 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.865 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.865 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.865 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.865 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.865 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.866 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.866 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.866 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.866 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.867 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.867 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.867 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.867 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.867 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.868 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.868 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.868 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.868 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.868 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.869 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.869 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.869 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.869 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.869 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.870 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.870 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.870 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.870 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.870 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.871 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.871 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.871 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.871 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.871 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.872 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.872 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.872 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.872 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.872 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.872 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.873 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.873 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.873 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.873 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.873 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.873 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.874 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.874 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.874 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.874 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.874 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.874 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.875 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.875 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.875 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.875 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.875 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.876 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.876 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.876 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.876 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.876 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.877 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.877 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.877 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.877 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.877 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.878 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.878 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.878 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.878 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.879 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.879 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.879 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.879 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.879 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.879 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.880 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.880 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.880 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.880 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.880 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.881 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.881 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.881 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.881 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.881 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.882 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.882 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.882 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.882 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.882 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.882 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.883 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.883 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.883 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.883 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.883 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.883 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.883 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.884 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.884 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.884 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.884 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.884 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.884 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.884 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.884 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.885 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.885 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.885 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.885 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.885 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.885 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.886 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.886 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.886 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.886 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.886 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.886 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.887 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.887 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.887 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.887 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.887 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.887 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.887 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.888 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.888 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.888 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.888 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.888 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.889 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.889 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.889 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.889 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.889 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.889 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.889 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.889 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.890 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.890 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.890 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.890 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.890 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.890 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.890 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.891 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.891 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.891 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.891 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.891 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.891 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.892 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.892 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.892 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.892 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.892 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.892 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.892 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.893 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.893 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.893 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.893 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.893 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.893 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.894 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.894 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.894 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.894 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.894 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.894 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.894 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.895 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.895 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.895 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.895 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.895 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.896 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.896 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.896 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.896 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.896 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.896 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.897 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.897 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.897 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.897 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.897 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.897 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.897 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.898 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.898 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.898 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.898 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.898 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.898 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.899 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.899 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.899 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.899 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.899 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.899 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.899 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.900 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.900 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.900 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.900 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.900 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.900 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.900 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.901 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.901 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.901 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.901 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.902 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.902 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.902 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.902 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.902 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.902 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.903 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.903 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.903 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.903 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.903 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.903 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.903 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.904 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.904 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.904 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.904 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.904 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.904 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.904 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.904 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.905 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.905 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.905 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.905 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.905 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.905 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.906 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.906 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.906 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.906 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.906 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.906 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.907 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.907 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.907 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.907 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.907 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.908 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.908 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.908 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.908 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.908 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.908 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.908 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.909 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.909 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.909 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.909 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.909 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.909 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.910 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.910 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.910 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.910 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.910 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.910 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.911 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.911 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.911 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.911 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.911 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.912 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.912 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.912 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.912 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.912 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.912 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.913 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.913 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.913 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.913 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.914 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.914 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.914 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.914 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.914 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.915 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.915 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.915 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.915 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.915 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.916 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.916 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.916 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.916 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.916 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.916 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.916 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.917 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.917 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.917 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.917 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.917 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.917 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.918 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.918 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.918 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.918 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.918 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.919 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.919 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.919 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.919 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.919 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.919 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.920 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.920 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.920 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.920 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.920 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.920 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.921 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.921 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.921 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.921 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.921 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.921 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.922 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.922 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.922 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.922 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.922 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.922 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.922 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.923 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.923 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.923 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.923 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.923 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.923 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.924 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.924 226239 WARNING oslo_config.cfg [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 31 02:40:02 np0005603623 nova_compute[226235]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 31 02:40:02 np0005603623 nova_compute[226235]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 31 02:40:02 np0005603623 nova_compute[226235]: and ``live_migration_inbound_addr`` respectively.
Jan 31 02:40:02 np0005603623 nova_compute[226235]: ).  Its value may be silently ignored in the future.#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.924 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.924 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.924 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.925 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.925 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.925 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.925 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.925 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.925 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.925 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.926 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.926 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.926 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.926 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.926 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.926 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.926 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.927 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.927 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.rbd_secret_uuid        = 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.927 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.927 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.927 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.927 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.927 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.928 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.928 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.928 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.928 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.928 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.928 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.929 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.929 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.929 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.929 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.929 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.929 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.929 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.930 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.930 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.930 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.930 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.930 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.930 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.930 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.931 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.931 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.931 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.931 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.931 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.931 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.931 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.932 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.932 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.932 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.932 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.932 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.932 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.932 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.933 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.933 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.933 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.933 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.933 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.933 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.933 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.934 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.934 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.934 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.934 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.934 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.934 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.934 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.935 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.935 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.935 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.935 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.935 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.935 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.936 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.936 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.936 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.936 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.936 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.936 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.936 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.937 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.937 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.937 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.937 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.937 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.937 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.937 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.937 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.938 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.938 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.938 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.938 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.938 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.938 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.939 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.939 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.939 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.939 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.939 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.939 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.939 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.940 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.940 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.940 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.940 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.940 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.940 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.940 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.941 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.941 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.941 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.941 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.941 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.941 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.941 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.942 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.942 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.942 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.942 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.942 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.942 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.942 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.943 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.943 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.943 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.943 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.943 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.943 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.943 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.944 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.944 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.944 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.944 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.944 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.944 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.945 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.945 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.945 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.945 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.945 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.945 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.945 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.945 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.946 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.946 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.946 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.946 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.946 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.946 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.947 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.947 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.947 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.947 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.947 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.947 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.948 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.948 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.948 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.948 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.948 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.948 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.948 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.948 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.949 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.949 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.949 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.949 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.949 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.949 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.949 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.950 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.950 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.950 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.950 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.950 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.950 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.951 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.951 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.951 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.951 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.951 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.951 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.951 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.952 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.952 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.952 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.952 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.952 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.952 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.953 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.953 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.953 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.953 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.954 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.954 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.954 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.954 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.954 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.955 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.955 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.955 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.955 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.955 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.956 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.956 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.956 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.956 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.956 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.957 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.957 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.957 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.957 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.957 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.958 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.958 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.958 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.958 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.958 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.958 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.959 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.959 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.959 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.959 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.959 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.959 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.959 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.960 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.960 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.960 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.960 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.960 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.960 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.960 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.961 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.961 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.961 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.961 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.961 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.961 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.961 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.962 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.962 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.962 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.962 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.962 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.962 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.963 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.963 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.963 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.963 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.963 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.963 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.963 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.964 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.964 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.964 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.964 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.964 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.964 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.964 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.965 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.965 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.965 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.965 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.965 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.965 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.965 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.966 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.966 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.966 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.966 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.966 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.966 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.966 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.966 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.967 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.967 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.967 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.967 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.967 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.967 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.967 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.968 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.968 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.968 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.968 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.968 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.968 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.968 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.969 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.969 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.969 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.969 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.969 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.969 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.969 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.970 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.970 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.970 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.970 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.970 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.970 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.970 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.971 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.971 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.971 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.971 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.971 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.971 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.971 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.972 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.972 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.972 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.972 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.972 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.972 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.972 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.973 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.973 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.973 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.973 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.973 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.973 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.973 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.973 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.974 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.974 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.974 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.974 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.974 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.974 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.974 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.975 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.975 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.975 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.975 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.975 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.975 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.975 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.976 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.976 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.976 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.976 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.976 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.976 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.976 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.976 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.977 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.977 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.977 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.977 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.977 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.977 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.977 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.978 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.978 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.978 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.978 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.978 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.978 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.978 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.978 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.979 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.979 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.979 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.979 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.979 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.979 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.979 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.979 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.980 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.980 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.980 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.980 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.980 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.980 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.980 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.981 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.981 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.981 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.981 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.981 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.981 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.982 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.982 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.982 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.982 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.982 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.982 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.983 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.983 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.983 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.983 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.983 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.983 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.984 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.984 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.984 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.984 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.984 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.984 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.984 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.985 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.985 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.985 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.985 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.985 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.985 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.985 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.986 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.986 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.986 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.986 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.986 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.986 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.986 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.987 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.987 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.987 226239 DEBUG oslo_service.service [None req-6831cb49-8892-4bdf-9b9f-10404cdeaec6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 31 02:40:02 np0005603623 nova_compute[226235]: 2026-01-31 07:40:02.988 226239 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)#033[00m
Jan 31 02:40:03 np0005603623 podman[226513]: 2026-01-31 07:40:03.010700849 +0000 UTC m=+0.103703519 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.017 226239 INFO nova.virt.node [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Determined node identity 492dc482-9d1e-49ca-87f3-0104a8508b72 from /var/lib/nova/compute_id#033[00m
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.018 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.019 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.019 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.019 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.029 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fb6f68fde20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.031 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fb6f68fde20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.032 226239 INFO nova.virt.libvirt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.037 226239 INFO nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Libvirt host capabilities <capabilities>
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <host>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <uuid>4e15465d-7c03-4925-9fc3-ba6a686b7adc</uuid>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <cpu>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <arch>x86_64</arch>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model>EPYC-Rome-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <vendor>AMD</vendor>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <microcode version='16777317'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <signature family='23' model='49' stepping='0'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='x2apic'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='tsc-deadline'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='osxsave'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='hypervisor'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='tsc_adjust'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='spec-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='stibp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='arch-capabilities'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='ssbd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='cmp_legacy'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='topoext'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='virt-ssbd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='lbrv'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='tsc-scale'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='vmcb-clean'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='pause-filter'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='pfthreshold'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='svme-addr-chk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='rdctl-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='skip-l1dfl-vmentry'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='mds-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature name='pschange-mc-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <pages unit='KiB' size='4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <pages unit='KiB' size='2048'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <pages unit='KiB' size='1048576'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </cpu>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <power_management>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <suspend_mem/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </power_management>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <iommu support='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <migration_features>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <live/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <uri_transports>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <uri_transport>tcp</uri_transport>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <uri_transport>rdma</uri_transport>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </uri_transports>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </migration_features>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <topology>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <cells num='1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <cell id='0'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:          <memory unit='KiB'>7864300</memory>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:          <pages unit='KiB' size='4'>1966075</pages>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:          <pages unit='KiB' size='2048'>0</pages>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:          <distances>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:            <sibling id='0' value='10'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:          </distances>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:          <cpus num='8'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:          </cpus>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        </cell>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </cells>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </topology>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <cache>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </cache>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <secmodel>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model>selinux</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <doi>0</doi>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </secmodel>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <secmodel>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model>dac</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <doi>0</doi>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </secmodel>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </host>
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <guest>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <os_type>hvm</os_type>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <arch name='i686'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <wordsize>32</wordsize>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <domain type='qemu'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <domain type='kvm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </arch>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <features>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <pae/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <nonpae/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <acpi default='on' toggle='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <apic default='on' toggle='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <cpuselection/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <deviceboot/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <disksnapshot default='on' toggle='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <externalSnapshot/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </features>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </guest>
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <guest>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <os_type>hvm</os_type>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <arch name='x86_64'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <wordsize>64</wordsize>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <domain type='qemu'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <domain type='kvm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </arch>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <features>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <acpi default='on' toggle='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <apic default='on' toggle='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <cpuselection/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <deviceboot/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <disksnapshot default='on' toggle='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <externalSnapshot/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </features>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </guest>
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 
Jan 31 02:40:03 np0005603623 nova_compute[226235]: </capabilities>
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.042 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.045 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 31 02:40:03 np0005603623 nova_compute[226235]: <domainCapabilities>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <domain>kvm</domain>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <arch>i686</arch>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <vcpu max='240'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <iothreads supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <os supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <enum name='firmware'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <loader supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='type'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>rom</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>pflash</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='readonly'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>yes</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>no</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='secure'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>no</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </loader>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <cpu>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>on</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>off</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </mode>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <mode name='maximum' supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='maximumMigratable'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>on</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>off</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </mode>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <mode name='host-model' supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <vendor>AMD</vendor>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='x2apic'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='stibp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='ssbd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='succor'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='lbrv'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </mode>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <mode name='custom' supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='ClearwaterForest'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ddpd-u'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sha512'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sm3'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sm4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ddpd-u'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sha512'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sm3'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sm4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cooperlake'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cooperlake-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cooperlake-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Denverton'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Denverton-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Denverton-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Denverton-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Dhyana-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Genoa'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Milan'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Rome'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Turin'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibpb-brtype'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbpb'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Turin-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibpb-brtype'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbpb'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-v5'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='GraniteRapids'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='GraniteRapids-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='GraniteRapids-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10-128'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10-256'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10-512'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='GraniteRapids-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10-128'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10-256'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10-512'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-noTSX'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v5'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v6'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v7'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='IvyBridge'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='IvyBridge-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='IvyBridge-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='IvyBridge-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='KnightsMill'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-4fmaps'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-4vnniw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512er'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512pf'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='KnightsMill-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-4fmaps'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-4vnniw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512er'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512pf'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Opteron_G4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Opteron_G4-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Opteron_G5'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tbm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Opteron_G5-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tbm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SapphireRapids'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SapphireRapids-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SapphireRapids-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SapphireRapids-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SapphireRapids-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SierraForest'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SierraForest-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SierraForest-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SierraForest-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-v5'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Snowridge'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Snowridge-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Snowridge-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Snowridge-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Snowridge-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='athlon'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='athlon-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='core2duo'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='core2duo-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='coreduo'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='coreduo-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='n270'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='n270-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='phenom'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='phenom-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </mode>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <memoryBacking supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <enum name='sourceType'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <value>file</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <value>anonymous</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <value>memfd</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </memoryBacking>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <disk supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='diskDevice'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>disk</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>cdrom</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>floppy</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>lun</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='bus'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>ide</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>fdc</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>scsi</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>usb</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>sata</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='model'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio-transitional</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio-non-transitional</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <graphics supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='type'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>vnc</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>egl-headless</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>dbus</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </graphics>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <video supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='modelType'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>vga</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>cirrus</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>none</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>bochs</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>ramfb</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <hostdev supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='mode'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>subsystem</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='startupPolicy'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>default</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>mandatory</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>requisite</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>optional</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='subsysType'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>usb</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>pci</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>scsi</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='capsType'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='pciBackend'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </hostdev>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <rng supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='model'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio-transitional</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio-non-transitional</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='backendModel'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>random</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>egd</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>builtin</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <filesystem supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='driverType'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>path</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>handle</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtiofs</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </filesystem>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <tpm supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='model'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>tpm-tis</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>tpm-crb</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='backendModel'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>emulator</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>external</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='backendVersion'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>2.0</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </tpm>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <redirdev supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='bus'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>usb</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </redirdev>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <channel supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='type'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>pty</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>unix</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </channel>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <crypto supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='model'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='type'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>qemu</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='backendModel'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>builtin</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </crypto>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <interface supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='backendType'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>default</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>passt</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <panic supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='model'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>isa</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>hyperv</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </panic>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <console supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='type'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>null</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>vc</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>pty</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>dev</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>file</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>pipe</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>stdio</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>udp</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>tcp</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>unix</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>qemu-vdagent</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>dbus</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </console>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <gic supported='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <vmcoreinfo supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <genid supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <backingStoreInput supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <backup supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <async-teardown supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <s390-pv supported='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <ps2 supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <tdx supported='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <sev supported='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <sgx supported='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <hyperv supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='features'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>relaxed</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>vapic</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>spinlocks</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>vpindex</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>runtime</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>synic</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>stimer</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>reset</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>vendor_id</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>frequencies</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>reenlightenment</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>tlbflush</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>ipi</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>avic</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>emsr_bitmap</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>xmm_input</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <defaults>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <spinlocks>4095</spinlocks>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <stimer_direct>on</stimer_direct>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </defaults>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </hyperv>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <launchSecurity supported='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:40:03 np0005603623 nova_compute[226235]: </domainCapabilities>
Jan 31 02:40:03 np0005603623 nova_compute[226235]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.052 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 31 02:40:03 np0005603623 nova_compute[226235]: <domainCapabilities>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <domain>kvm</domain>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <arch>i686</arch>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <vcpu max='4096'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <iothreads supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <os supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <enum name='firmware'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <loader supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='type'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>rom</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>pflash</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='readonly'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>yes</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>no</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='secure'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>no</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </loader>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <cpu>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>on</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>off</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </mode>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <mode name='maximum' supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='maximumMigratable'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>on</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>off</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </mode>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <mode name='host-model' supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <vendor>AMD</vendor>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='x2apic'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='stibp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='ssbd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='succor'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='lbrv'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </mode>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <mode name='custom' supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='ClearwaterForest'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ddpd-u'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sha512'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sm3'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sm4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ddpd-u'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sha512'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sm3'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sm4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cooperlake'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cooperlake-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cooperlake-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Denverton'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Denverton-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Denverton-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Denverton-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Dhyana-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Genoa'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Milan'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Rome'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Turin'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibpb-brtype'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbpb'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Turin-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibpb-brtype'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbpb'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-v5'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='GraniteRapids'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='GraniteRapids-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='GraniteRapids-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10-128'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10-256'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10-512'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='GraniteRapids-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10-128'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10-256'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx10-512'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-noTSX'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Haswell-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v5'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v6'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Icelake-Server-v7'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='IvyBridge'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='IvyBridge-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='IvyBridge-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='IvyBridge-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='KnightsMill'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-4fmaps'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-4vnniw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512er'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512pf'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='KnightsMill-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-4fmaps'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-4vnniw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512er'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512pf'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Opteron_G4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Opteron_G4-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Opteron_G5'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tbm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Opteron_G5-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tbm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SapphireRapids'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SapphireRapids-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SapphireRapids-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SapphireRapids-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SapphireRapids-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SierraForest'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SierraForest-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SierraForest-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='SierraForest-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Client-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Skylake-Server-v5'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Snowridge'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Snowridge-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Snowridge-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Snowridge-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Snowridge-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='athlon'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='athlon-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='core2duo'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='core2duo-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='coreduo'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='coreduo-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='n270'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='n270-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='phenom'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='phenom-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </mode>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <memoryBacking supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <enum name='sourceType'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <value>file</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <value>anonymous</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <value>memfd</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </memoryBacking>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <disk supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='diskDevice'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>disk</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>cdrom</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>floppy</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>lun</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='bus'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>fdc</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>scsi</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>usb</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>sata</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='model'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio-transitional</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio-non-transitional</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <graphics supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='type'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>vnc</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>egl-headless</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>dbus</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </graphics>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <video supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='modelType'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>vga</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>cirrus</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>none</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>bochs</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>ramfb</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <hostdev supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='mode'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>subsystem</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='startupPolicy'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>default</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>mandatory</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>requisite</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>optional</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='subsysType'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>usb</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>pci</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>scsi</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='capsType'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='pciBackend'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </hostdev>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <rng supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='model'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio-transitional</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtio-non-transitional</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='backendModel'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>random</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>egd</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>builtin</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <filesystem supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='driverType'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>path</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>handle</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>virtiofs</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </filesystem>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <tpm supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='model'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>tpm-tis</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>tpm-crb</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='backendModel'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>emulator</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>external</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='backendVersion'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>2.0</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </tpm>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <redirdev supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='bus'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>usb</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </redirdev>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <channel supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='type'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>pty</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>unix</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </channel>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <crypto supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='model'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='type'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>qemu</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='backendModel'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>builtin</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </crypto>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <interface supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='backendType'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>default</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>passt</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <panic supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='model'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>isa</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>hyperv</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </panic>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <console supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='type'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>null</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>vc</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>pty</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>dev</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>file</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>pipe</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>stdio</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>udp</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>tcp</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>unix</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>qemu-vdagent</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>dbus</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </console>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <gic supported='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <vmcoreinfo supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <genid supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <backingStoreInput supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <backup supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <async-teardown supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <s390-pv supported='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <ps2 supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <tdx supported='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <sev supported='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <sgx supported='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <hyperv supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='features'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>relaxed</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>vapic</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>spinlocks</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>vpindex</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>runtime</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>synic</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>stimer</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>reset</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>vendor_id</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>frequencies</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>reenlightenment</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>tlbflush</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>ipi</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>avic</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>emsr_bitmap</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>xmm_input</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <defaults>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <spinlocks>4095</spinlocks>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <stimer_direct>on</stimer_direct>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </defaults>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </hyperv>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <launchSecurity supported='no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:40:03 np0005603623 nova_compute[226235]: </domainCapabilities>
Jan 31 02:40:03 np0005603623 nova_compute[226235]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.096 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.098 226239 DEBUG nova.virt.libvirt.volume.mount [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 31 02:40:03 np0005603623 nova_compute[226235]: 2026-01-31 07:40:03.102 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 31 02:40:03 np0005603623 nova_compute[226235]: <domainCapabilities>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <domain>kvm</domain>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <arch>x86_64</arch>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <vcpu max='240'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <iothreads supported='yes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <os supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <enum name='firmware'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <loader supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='type'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>rom</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>pflash</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='readonly'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>yes</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>no</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='secure'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>no</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </loader>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:  <cpu>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>on</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>off</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </mode>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <mode name='maximum' supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <enum name='maximumMigratable'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>on</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <value>off</value>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </enum>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </mode>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <mode name='host-model' supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <vendor>AMD</vendor>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='x2apic'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='stibp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='ssbd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='succor'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='lbrv'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    </mode>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:    <mode name='custom' supported='yes'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Broadwell-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='ClearwaterForest'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ddpd-u'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sha512'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sm3'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sm4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bhi-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ddpd-u'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sha512'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sm3'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='sm4'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cooperlake'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cooperlake-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Cooperlake-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Denverton'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Denverton-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Denverton-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Denverton-v3'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='Dhyana-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Genoa'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      </blockers>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603623 nova_compute[226235]:        <feature name='avx512bw'/>
Jan 31 02:41:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:41:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:25.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:41:26 np0005603623 rsyslogd[1006]: imjournal: 4081 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 31 02:41:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:41:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:26.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:41:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:27.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:28.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:29.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:41:30.075 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:41:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:41:30.076 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:41:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:41:30.076 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:41:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:30.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:41:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:31.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:41:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:32.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:33.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:34.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:41:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:35.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:41:36 np0005603623 podman[227258]: 2026-01-31 07:41:36.023317334 +0000 UTC m=+0.117616451 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 02:41:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:36.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:37.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:38.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:38 np0005603623 podman[227286]: 2026-01-31 07:41:38.951834998 +0000 UTC m=+0.048258892 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Jan 31 02:41:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:39.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:40.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:41.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:41:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:42.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:41:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:41:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:43.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:41:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:44.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:45.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:46.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:47.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:41:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:48.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:41:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:41:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:49.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:41:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:50.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:51.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:52.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:53.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:54.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:55.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:56.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:41:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:41:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:41:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:41:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:41:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:41:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:41:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:41:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:57.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:41:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:58.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:41:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:59.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:00.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:01.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:02.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:42:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:42:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:03.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:03 np0005603623 nova_compute[226235]: 2026-01-31 07:42:03.998 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.024 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.025 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.025 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.040 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.040 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.040 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.040 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.041 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.041 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.041 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.041 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.057 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.057 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.057 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.057 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.057 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:42:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:42:04 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3506620085' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.466 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.592 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.594 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5310MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.594 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.594 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.713 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.714 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:42:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:04.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:04 np0005603623 nova_compute[226235]: 2026-01-31 07:42:04.731 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:42:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:42:05 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3338843083' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:42:05 np0005603623 nova_compute[226235]: 2026-01-31 07:42:05.123 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.392s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:42:05 np0005603623 nova_compute[226235]: 2026-01-31 07:42:05.128 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:42:05 np0005603623 nova_compute[226235]: 2026-01-31 07:42:05.146 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:42:05 np0005603623 nova_compute[226235]: 2026-01-31 07:42:05.149 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:42:05 np0005603623 nova_compute[226235]: 2026-01-31 07:42:05.150 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.556s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:42:05 np0005603623 nova_compute[226235]: 2026-01-31 07:42:05.265 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:05 np0005603623 nova_compute[226235]: 2026-01-31 07:42:05.265 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:42:05.445 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:42:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:42:05.446 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:42:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:42:05.447 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:42:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:05.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:06.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:06 np0005603623 podman[227594]: 2026-01-31 07:42:06.984719445 +0000 UTC m=+0.079566030 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:42:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:42:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:07.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:42:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:08.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:09 np0005603623 podman[227646]: 2026-01-31 07:42:09.209143769 +0000 UTC m=+0.054151990 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 02:42:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:09.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:10.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:11.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:12.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:13.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:14.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:15.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:16.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:17.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:18.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:19.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:20.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:21.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:22.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:42:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:23.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:42:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:24.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:42:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111]
    ** DB Stats **
    Uptime(secs): 1200.0 total, 600.0 interval
    Cumulative writes: 3559 writes, 18K keys, 3559 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.03 MB/s
    Cumulative WAL: 3559 writes, 3559 syncs, 1.00 writes per sync, written: 0.04 GB, 0.03 MB/s
    Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
    Interval writes: 1349 writes, 6492 keys, 1349 commit groups, 1.0 writes per commit group, ingest: 14.38 MB, 0.02 MB/s
    Interval WAL: 1349 writes, 1349 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s
    Interval stall: 00:00:0.000 H:M:S, 0.0 percent

    ** Compaction Stats [default] **
    Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
    ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
      L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     78.6      0.26              0.06         9    0.029       0      0       0.0       0.0
      L6      1/0    7.41 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.2    111.6     93.3      0.71              0.16         8    0.089     35K   4284       0.0       0.0
     Sum      1/0    7.41 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.2     81.7     89.4      0.97              0.22        17    0.057     35K   4284       0.0       0.0
     Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   6.3     81.9     81.0      0.51              0.08         8    0.063     19K   2502       0.0       0.0

    ** Compaction Stats [default] **
    Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
    ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
     Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    111.6     93.3      0.71              0.16         8    0.089     35K   4284       0.0       0.0
    High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     79.2      0.26              0.06         8    0.032       0      0       0.0       0.0
    User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0

    Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

    Uptime(secs): 1200.0 total, 600.0 interval
    Flush(GB): cumulative 0.020, interval 0.006
    AddFile(GB): cumulative 0.000, interval 0.000
    AddFile(Total Files): cumulative 0, interval 0
    AddFile(L0 Files): cumulative 0, interval 0
    AddFile(Keys): cumulative 0, interval 0
    Cumulative compaction: 0.08 GB write, 0.07 MB/s write, 0.08 GB read, 0.07 MB/s read, 1.0 seconds
    Interval compaction: 0.04 GB write, 0.07 MB/s write, 0.04 GB read, 0.07 MB/s read, 0.5 seconds
    Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
    Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 4.63 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000123 secs_since: 0
    Block cache entry stats(count,size,portion): DataBlock(252,4.32 MB,1.41985%) FilterBlock(17,106.98 KB,0.0343674%) IndexBlock(17,211.83 KB,0.0680472%) Misc(1,0.00 KB,0%)

    ** File Read Latency Histogram By Level [default] **
Jan 31 02:42:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:25.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:26.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:27.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:28.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:29.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:42:30.077 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:42:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:42:30.078 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:42:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:42:30.078 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:42:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:30.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:31.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:32.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:33.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:34.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:36.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:42:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:36.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:42:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:37 np0005603623 podman[227756]: 2026-01-31 07:42:37.971086542 +0000 UTC m=+0.060508250 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 31 02:42:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:38.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:38.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 02:42:39 np0005603623 podman[227784]: 2026-01-31 07:42:39.942951948 +0000 UTC m=+0.043403751 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible)
Jan 31 02:42:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:40.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:40.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:42.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:42.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:44.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:44.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:46.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:46.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:48.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:48.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:50.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:50.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:42:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111]
    ** DB Stats **
    Uptime(secs): 1200.1 total, 600.0 interval
    Cumulative writes: 5891 writes, 24K keys, 5891 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
    Cumulative WAL: 5891 writes, 1018 syncs, 5.79 writes per sync, written: 0.02 GB, 0.02 MB/s
    Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
    Interval writes: 468 writes, 710 keys, 468 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
    Interval WAL: 468 writes, 229 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
    Interval stall: 00:00:0.000 H:M:S, 0.0 percent

    ** Compaction Stats [default] **
    Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
    ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
      L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
     Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
     Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

    ** Compaction Stats [default] **
    Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
    ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0

    Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

    Uptime(secs): 1200.1 total, 600.0 interval
    Flush(GB): cumulative 0.000, interval 0.000
    AddFile(GB): cumulative 0.000, interval 0.000
    AddFile(Total Files): cumulative 0, interval 0
    AddFile(L0 Files): cumulative 0, interval 0
    AddFile(Keys): cumulative 0, interval 0
    Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
    Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
    Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
    Block cache BinnedLRUCache@0x55e60fd28f30#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
    Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

    ** File Read Latency Histogram By Level [default] **

    ** Compaction Stats [m-0] **
    Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
    ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
     Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
     Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

    ** Compaction Stats [m-0] **
    Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
    ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

    Uptime(secs): 1200.1 total, 600.0 interval
    Flush(GB): cumulative 0.000, interval 0.000
    AddFile(GB): cumulative 0.000, interval 0.000
    AddFile(Total Files): cumulative 0, interval 0
    AddFile(L0 Files): cumulative 0, interval 0
    AddFile(Keys): cumulative 0, interval 0
    Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
    Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
    Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
    Block cache BinnedLRUCache@0x55e60fd28f30#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
    Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

    ** File Read Latency Histogram By Level [m-0] **

    ** Compaction Stats [m-1] **
    Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
    ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
     Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
     Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

    ** Compaction Stats [m-1] **
    Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
    ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

    Uptime(secs): 1200.1 total, 600.0 interval
    Flush(GB): cumulative 0.000, interval 0.000
    AddFile(GB): cumulative 0.000, interval 0.000
    AddFile(Total Files): cumulative 0, interval 0
    AddFile(L0 Files): cumulative 0, interval 0
    AddFile(Keys): cumulative 0, interval 0
    Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
    Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
    Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Jan 31 02:42:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:52.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:42:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:52.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:42:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:54.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:54.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:56.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:56.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:58.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:42:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:58.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:00.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:00.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:02.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:02 np0005603623 nova_compute[226235]: 2026-01-31 07:43:02.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:02 np0005603623 nova_compute[226235]: 2026-01-31 07:43:02.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:43:02 np0005603623 nova_compute[226235]: 2026-01-31 07:43:02.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:43:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:02 np0005603623 nova_compute[226235]: 2026-01-31 07:43:02.605 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:43:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:02.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:03 np0005603623 nova_compute[226235]: 2026-01-31 07:43:03.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:03 np0005603623 nova_compute[226235]: 2026-01-31 07:43:03.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:03 np0005603623 nova_compute[226235]: 2026-01-31 07:43:03.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:03 np0005603623 nova_compute[226235]: 2026-01-31 07:43:03.507 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:43:03 np0005603623 nova_compute[226235]: 2026-01-31 07:43:03.508 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:43:03 np0005603623 nova_compute[226235]: 2026-01-31 07:43:03.508 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:43:03 np0005603623 nova_compute[226235]: 2026-01-31 07:43:03.508 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:43:03 np0005603623 nova_compute[226235]: 2026-01-31 07:43:03.508 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:43:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:04.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:43:04 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1811060913' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.256 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.748s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.377 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.379 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5294MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.379 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.379 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.475 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.475 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.501 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:43:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:04.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:43:04 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3717774752' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.884 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.382s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.889 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.947 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.950 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:43:04 np0005603623 nova_compute[226235]: 2026-01-31 07:43:04.950 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.571s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:43:05 np0005603623 nova_compute[226235]: 2026-01-31 07:43:05.951 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:05 np0005603623 nova_compute[226235]: 2026-01-31 07:43:05.952 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:05 np0005603623 nova_compute[226235]: 2026-01-31 07:43:05.952 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:05 np0005603623 nova_compute[226235]: 2026-01-31 07:43:05.952 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:05 np0005603623 nova_compute[226235]: 2026-01-31 07:43:05.953 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:43:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:43:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:06.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:43:06 np0005603623 nova_compute[226235]: 2026-01-31 07:43:06.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:06.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:43:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 02:43:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 02:43:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:08.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:08.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:08 np0005603623 podman[228164]: 2026-01-31 07:43:08.975442634 +0000 UTC m=+0.075157192 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 02:43:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:43:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 02:43:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:43:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:43:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:43:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:10.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:10.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:10 np0005603623 podman[228241]: 2026-01-31 07:43:10.997184063 +0000 UTC m=+0.069587318 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 02:43:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:12.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:12.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:14.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:14.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:16.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:16.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:18.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:18.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:19 np0005603623 ceph-mds[84161]: mds.beacon.cephfs.compute-2.asgtzy missed beacon ack from the monitors
Jan 31 02:43:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:43:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3863359228' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:43:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:43:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3863359228' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:43:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:20.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:43:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:43:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:43:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:20.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:43:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:22.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:22 np0005603623 nova_compute[226235]: 2026-01-31 07:43:22.769 226239 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 0.83 sec#033[00m
Jan 31 02:43:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:22.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:24.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:24.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:26.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:26.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:28.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:28.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:30.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:43:30.077 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:43:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:43:30.078 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:43:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:43:30.078 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:43:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:30.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:32.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:43:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:32.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:43:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:34.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=404 latency=0.001000031s ======
Jan 31 02:43:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:34.137 +0000] "GET /healthcheck HTTP/1.1" 404 240 - "python-urllib3/1.26.5" - latency=0.001000031s
Jan 31 02:43:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:34.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:36.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:43:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:36.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:43:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:38.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:43:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:38.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:43:40 np0005603623 podman[228375]: 2026-01-31 07:43:40.026549546 +0000 UTC m=+0.119809867 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 02:43:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:40.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:40.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:41 np0005603623 podman[228402]: 2026-01-31 07:43:41.945363211 +0000 UTC m=+0.046210113 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Jan 31 02:43:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Jan 31 02:43:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:42.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:42.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Jan 31 02:43:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:44.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Jan 31 02:43:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:44.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:46.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:43:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:46.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:43:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Jan 31 02:43:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:48.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:43:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:48.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:43:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:50.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Jan 31 02:43:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:50.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:43:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:52.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:43:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:43:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:52.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:43:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:54.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:54.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:56.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:43:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:56.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:43:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:58.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:43:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:58.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:00.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:00.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:02.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:02.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:03 np0005603623 nova_compute[226235]: 2026-01-31 07:44:03.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:44:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:04.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:04 np0005603623 nova_compute[226235]: 2026-01-31 07:44:04.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:44:04 np0005603623 nova_compute[226235]: 2026-01-31 07:44:04.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 02:44:04 np0005603623 nova_compute[226235]: 2026-01-31 07:44:04.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 02:44:04 np0005603623 nova_compute[226235]: 2026-01-31 07:44:04.169 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 31 02:44:04 np0005603623 nova_compute[226235]: 2026-01-31 07:44:04.169 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:44:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:04.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.201 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.201 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.201 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.201 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.248 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.248 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.249 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.249 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.249 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:44:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:44:05 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3053683518' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.641 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.392s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.762 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.763 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5329MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.763 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.764 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.866 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:44:05 np0005603623 nova_compute[226235]: 2026-01-31 07:44:05.867 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:44:06 np0005603623 nova_compute[226235]: 2026-01-31 07:44:06.085 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:06.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:44:06 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1012629918' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:44:06 np0005603623 nova_compute[226235]: 2026-01-31 07:44:06.504 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:06 np0005603623 nova_compute[226235]: 2026-01-31 07:44:06.508 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:44:06 np0005603623 nova_compute[226235]: 2026-01-31 07:44:06.534 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:44:06 np0005603623 nova_compute[226235]: 2026-01-31 07:44:06.536 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:44:06 np0005603623 nova_compute[226235]: 2026-01-31 07:44:06.536 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.772s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:06.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:07 np0005603623 nova_compute[226235]: 2026-01-31 07:44:07.489 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:07 np0005603623 nova_compute[226235]: 2026-01-31 07:44:07.490 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:07 np0005603623 nova_compute[226235]: 2026-01-31 07:44:07.490 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:08.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:08.734 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:44:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:08.736 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:44:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:08.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:44:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:10.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:44:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:44:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:10.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:44:10 np0005603623 podman[228580]: 2026-01-31 07:44:10.961313036 +0000 UTC m=+0.060630561 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 02:44:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:12.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:44:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:12.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:44:12 np0005603623 podman[228607]: 2026-01-31 07:44:12.945246941 +0000 UTC m=+0.039122764 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:44:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:14.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:14.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.104837) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845455104889, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 2409, "num_deletes": 251, "total_data_size": 6167333, "memory_usage": 6239192, "flush_reason": "Manual Compaction"}
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845455166399, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 4037929, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17850, "largest_seqno": 20254, "table_properties": {"data_size": 4028036, "index_size": 6323, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19870, "raw_average_key_size": 20, "raw_value_size": 4008304, "raw_average_value_size": 4094, "num_data_blocks": 281, "num_entries": 979, "num_filter_entries": 979, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845218, "oldest_key_time": 1769845218, "file_creation_time": 1769845455, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 61641 microseconds, and 5861 cpu microseconds.
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.166484) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 4037929 bytes OK
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.166500) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.169710) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.169755) EVENT_LOG_v1 {"time_micros": 1769845455169745, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.169778) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 6156867, prev total WAL file size 6156867, number of live WAL files 2.
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.171038) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(3943KB)], [36(7586KB)]
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845455171066, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 11806970, "oldest_snapshot_seqno": -1}
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4512 keys, 9755866 bytes, temperature: kUnknown
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845455296648, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 9755866, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9723134, "index_size": 20334, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11333, "raw_key_size": 112734, "raw_average_key_size": 24, "raw_value_size": 9638779, "raw_average_value_size": 2136, "num_data_blocks": 844, "num_entries": 4512, "num_filter_entries": 4512, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769845455, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.296866) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 9755866 bytes
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.300643) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 94.0 rd, 77.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 7.4 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(5.3) write-amplify(2.4) OK, records in: 5035, records dropped: 523 output_compression: NoCompression
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.300663) EVENT_LOG_v1 {"time_micros": 1769845455300653, "job": 20, "event": "compaction_finished", "compaction_time_micros": 125639, "compaction_time_cpu_micros": 18113, "output_level": 6, "num_output_files": 1, "total_output_size": 9755866, "num_input_records": 5035, "num_output_records": 4512, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845455301082, "job": 20, "event": "table_file_deletion", "file_number": 38}
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845455301859, "job": 20, "event": "table_file_deletion", "file_number": 36}
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.170973) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.301903) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.301907) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.301908) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.301910) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:44:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:44:15.301911) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:44:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:16.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:16.739 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:44:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:16.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:44:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:18.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:44:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Jan 31 02:44:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:44:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:18.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:44:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Jan 31 02:44:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:20.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:20.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:44:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:44:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:44:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:22.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:22.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:24.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:24.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:26.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:44:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:26.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:44:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:44:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:44:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:28.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:28.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.065 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "0d6cd237-f032-42fe-a952-64ac0baa4a66" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.066 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:30.078 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:30.079 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:30.079 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.089 226239 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:44:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:44:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:30.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.216 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.216 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e140 e140: 3 total, 3 up, 3 in
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.222 226239 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.222 226239 INFO nova.compute.claims [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.355 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:44:30 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1386690314' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.739 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.384s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.744 226239 DEBUG nova.compute.provider_tree [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.764 226239 DEBUG nova.scheduler.client.report [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.831 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.832 226239 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.902 226239 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.903 226239 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:44:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:30.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.932 226239 INFO nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:44:30 np0005603623 nova_compute[226235]: 2026-01-31 07:44:30.974 226239 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:44:31 np0005603623 nova_compute[226235]: 2026-01-31 07:44:31.102 226239 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:44:31 np0005603623 nova_compute[226235]: 2026-01-31 07:44:31.104 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:44:31 np0005603623 nova_compute[226235]: 2026-01-31 07:44:31.104 226239 INFO nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Creating image(s)#033[00m
Jan 31 02:44:31 np0005603623 nova_compute[226235]: 2026-01-31 07:44:31.126 226239 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 0d6cd237-f032-42fe-a952-64ac0baa4a66_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:31 np0005603623 nova_compute[226235]: 2026-01-31 07:44:31.149 226239 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 0d6cd237-f032-42fe-a952-64ac0baa4a66_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:31 np0005603623 nova_compute[226235]: 2026-01-31 07:44:31.170 226239 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 0d6cd237-f032-42fe-a952-64ac0baa4a66_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:31 np0005603623 nova_compute[226235]: 2026-01-31 07:44:31.172 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:31 np0005603623 nova_compute[226235]: 2026-01-31 07:44:31.174 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:31 np0005603623 nova_compute[226235]: 2026-01-31 07:44:31.673 226239 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Automatically allocating a network for project dc2f6584d8b64364b13683f53c58617f. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460#033[00m
Jan 31 02:44:31 np0005603623 nova_compute[226235]: 2026-01-31 07:44:31.864 226239 DEBUG nova.virt.libvirt.imagebackend [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 02:44:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:44:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:32.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:44:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:32.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:34.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:34 np0005603623 nova_compute[226235]: 2026-01-31 07:44:34.622 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:34 np0005603623 nova_compute[226235]: 2026-01-31 07:44:34.687 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.part --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:34 np0005603623 nova_compute[226235]: 2026-01-31 07:44:34.688 226239 DEBUG nova.virt.images [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 31 02:44:34 np0005603623 nova_compute[226235]: 2026-01-31 07:44:34.691 226239 DEBUG nova.privsep.utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 31 02:44:34 np0005603623 nova_compute[226235]: 2026-01-31 07:44:34.692 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.part /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:34.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.142 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.part /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.converted" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.145 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.218 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.converted --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.219 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 4.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.237 226239 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 0d6cd237-f032-42fe-a952-64ac0baa4a66_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.240 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 0d6cd237-f032-42fe-a952-64ac0baa4a66_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.573 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 0d6cd237-f032-42fe-a952-64ac0baa4a66_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.333s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.641 226239 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] resizing rbd image 0d6cd237-f032-42fe-a952-64ac0baa4a66_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.768 226239 DEBUG nova.objects.instance [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lazy-loading 'migration_context' on Instance uuid 0d6cd237-f032-42fe-a952-64ac0baa4a66 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.788 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.788 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Ensure instance console log exists: /var/lib/nova/instances/0d6cd237-f032-42fe-a952-64ac0baa4a66/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.789 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.789 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:35 np0005603623 nova_compute[226235]: 2026-01-31 07:44:35.790 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:44:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:36.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:44:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:36.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:37 np0005603623 nova_compute[226235]: 2026-01-31 07:44:37.400 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:37 np0005603623 nova_compute[226235]: 2026-01-31 07:44:37.401 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:37 np0005603623 nova_compute[226235]: 2026-01-31 07:44:37.419 226239 DEBUG nova.compute.manager [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:44:37 np0005603623 nova_compute[226235]: 2026-01-31 07:44:37.490 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:37 np0005603623 nova_compute[226235]: 2026-01-31 07:44:37.490 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:37 np0005603623 nova_compute[226235]: 2026-01-31 07:44:37.497 226239 DEBUG nova.virt.hardware [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:44:37 np0005603623 nova_compute[226235]: 2026-01-31 07:44:37.497 226239 INFO nova.compute.claims [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:44:37 np0005603623 nova_compute[226235]: 2026-01-31 07:44:37.634 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:44:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2491975839' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.023 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.389s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.027 226239 DEBUG nova.compute.provider_tree [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.066 226239 ERROR nova.scheduler.client.report [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [req-96579092-4e73-4795-9c08-69458a740c80] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 492dc482-9d1e-49ca-87f3-0104a8508b72.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-96579092-4e73-4795-9c08-69458a740c80"}]}#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.087 226239 DEBUG nova.scheduler.client.report [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.107 226239 DEBUG nova.scheduler.client.report [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.107 226239 DEBUG nova.compute.provider_tree [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.130 226239 DEBUG nova.scheduler.client.report [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 02:44:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:38.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.150 226239 DEBUG nova.scheduler.client.report [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.216 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:44:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3768550617' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.618 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.623 226239 DEBUG nova.compute.provider_tree [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.746 226239 DEBUG nova.scheduler.client.report [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Updated inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.747 226239 DEBUG nova.compute.provider_tree [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Updating resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.747 226239 DEBUG nova.compute.provider_tree [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.794 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.795 226239 DEBUG nova.compute.manager [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.895 226239 DEBUG nova.compute.manager [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.896 226239 DEBUG nova.network.neutron [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:44:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:38.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.930 226239 INFO nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:44:38 np0005603623 nova_compute[226235]: 2026-01-31 07:44:38.952 226239 DEBUG nova.compute.manager [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.054 226239 DEBUG nova.compute.manager [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.055 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.055 226239 INFO nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Creating image(s)#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.080 226239 DEBUG nova.storage.rbd_utils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.103 226239 DEBUG nova.storage.rbd_utils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.128 226239 DEBUG nova.storage.rbd_utils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.131 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.170 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.171 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.172 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.172 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.195 226239 DEBUG nova.storage.rbd_utils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.198 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:39 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.484 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.286s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.546 226239 DEBUG nova.storage.rbd_utils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] resizing rbd image a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.657 226239 WARNING oslo_policy.policy [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.657 226239 WARNING oslo_policy.policy [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.659 226239 DEBUG nova.policy [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cddf948da8f14dc998b0b2434d23e7fc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6cefbc155c2449f4a8fe9fd88475b366', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.856 226239 DEBUG nova.objects.instance [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lazy-loading 'migration_context' on Instance uuid a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.875 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.875 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Ensure instance console log exists: /var/lib/nova/instances/a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.875 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.876 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:39 np0005603623 nova_compute[226235]: 2026-01-31 07:44:39.876 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:40.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:40.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:41 np0005603623 nova_compute[226235]: 2026-01-31 07:44:41.866 226239 DEBUG nova.network.neutron [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Successfully created port: 57df6b87-e6f0-4399-b69f-ee772bb1f551 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:44:41 np0005603623 podman[229285]: 2026-01-31 07:44:41.96124727 +0000 UTC m=+0.059866748 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:44:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:42.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:42.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:43 np0005603623 podman[229313]: 2026-01-31 07:44:43.946241738 +0000 UTC m=+0.042859910 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 02:44:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:44.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:44 np0005603623 nova_compute[226235]: 2026-01-31 07:44:44.762 226239 DEBUG nova.network.neutron [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Successfully updated port: 57df6b87-e6f0-4399-b69f-ee772bb1f551 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:44:44 np0005603623 nova_compute[226235]: 2026-01-31 07:44:44.800 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "refresh_cache-a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:44:44 np0005603623 nova_compute[226235]: 2026-01-31 07:44:44.800 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquired lock "refresh_cache-a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:44:44 np0005603623 nova_compute[226235]: 2026-01-31 07:44:44.801 226239 DEBUG nova.network.neutron [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:44:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e141 e141: 3 total, 3 up, 3 in
Jan 31 02:44:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:44:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:44.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:44:45 np0005603623 nova_compute[226235]: 2026-01-31 07:44:45.178 226239 DEBUG nova.network.neutron [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:44:45 np0005603623 nova_compute[226235]: 2026-01-31 07:44:45.437 226239 DEBUG nova.compute.manager [req-671c7fd5-c3ff-4465-9b69-46931bfb6a7a req-320643bb-2e26-44f6-9bff-cbaaec0ec6bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Received event network-changed-57df6b87-e6f0-4399-b69f-ee772bb1f551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:44:45 np0005603623 nova_compute[226235]: 2026-01-31 07:44:45.438 226239 DEBUG nova.compute.manager [req-671c7fd5-c3ff-4465-9b69-46931bfb6a7a req-320643bb-2e26-44f6-9bff-cbaaec0ec6bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Refreshing instance network info cache due to event network-changed-57df6b87-e6f0-4399-b69f-ee772bb1f551. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:44:45 np0005603623 nova_compute[226235]: 2026-01-31 07:44:45.438 226239 DEBUG oslo_concurrency.lockutils [req-671c7fd5-c3ff-4465-9b69-46931bfb6a7a req-320643bb-2e26-44f6-9bff-cbaaec0ec6bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:44:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e142 e142: 3 total, 3 up, 3 in
Jan 31 02:44:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:46.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.631 226239 DEBUG nova.network.neutron [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Updating instance_info_cache with network_info: [{"id": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "address": "fa:16:3e:8d:01:44", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57df6b87-e6", "ovs_interfaceid": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.669 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Releasing lock "refresh_cache-a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.670 226239 DEBUG nova.compute.manager [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Instance network_info: |[{"id": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "address": "fa:16:3e:8d:01:44", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57df6b87-e6", "ovs_interfaceid": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.670 226239 DEBUG oslo_concurrency.lockutils [req-671c7fd5-c3ff-4465-9b69-46931bfb6a7a req-320643bb-2e26-44f6-9bff-cbaaec0ec6bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.670 226239 DEBUG nova.network.neutron [req-671c7fd5-c3ff-4465-9b69-46931bfb6a7a req-320643bb-2e26-44f6-9bff-cbaaec0ec6bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Refreshing network info cache for port 57df6b87-e6f0-4399-b69f-ee772bb1f551 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.673 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Start _get_guest_xml network_info=[{"id": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "address": "fa:16:3e:8d:01:44", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57df6b87-e6", "ovs_interfaceid": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.679 226239 WARNING nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.684 226239 DEBUG nova.virt.libvirt.host [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.684 226239 DEBUG nova.virt.libvirt.host [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.689 226239 DEBUG nova.virt.libvirt.host [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.690 226239 DEBUG nova.virt.libvirt.host [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.691 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.692 226239 DEBUG nova.virt.hardware [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:44:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1811245453',id=5,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-1496925635',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.692 226239 DEBUG nova.virt.hardware [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.692 226239 DEBUG nova.virt.hardware [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.693 226239 DEBUG nova.virt.hardware [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.693 226239 DEBUG nova.virt.hardware [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.693 226239 DEBUG nova.virt.hardware [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.693 226239 DEBUG nova.virt.hardware [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.694 226239 DEBUG nova.virt.hardware [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.694 226239 DEBUG nova.virt.hardware [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.694 226239 DEBUG nova.virt.hardware [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.695 226239 DEBUG nova.virt.hardware [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.700 226239 DEBUG nova.privsep.utils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 31 02:44:46 np0005603623 nova_compute[226235]: 2026-01-31 07:44:46.701 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:46.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:44:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1766568978' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.155 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.179 226239 DEBUG nova.storage.rbd_utils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.182 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:44:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/587394933' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.611 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.612 226239 DEBUG nova.virt.libvirt.vif [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:44:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1597704400',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1597704400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1597704400',id=5,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ89RiM/elI2ZqD0wPCKPg5OJ135CdqHdbE4RgDrlMkDtFUdRT49WLIZycY+7iaVAOnhx3ISEENdI3O2SmWMNgi6poXdDkUtOq9xREV6KVqBpvm5DBEGb+Du/A2bOrXUMA==',key_name='tempest-keypair-889434625',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6cefbc155c2449f4a8fe9fd88475b366',ramdisk_id='',reservation_id='r-f46r9yis',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1634965997',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1634965997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:44:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cddf948da8f14dc998b0b2434d23e7fc',uuid=a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "address": "fa:16:3e:8d:01:44", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57df6b87-e6", "ovs_interfaceid": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.612 226239 DEBUG nova.network.os_vif_util [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Converting VIF {"id": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "address": "fa:16:3e:8d:01:44", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57df6b87-e6", "ovs_interfaceid": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.613 226239 DEBUG nova.network.os_vif_util [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8d:01:44,bridge_name='br-int',has_traffic_filtering=True,id=57df6b87-e6f0-4399-b69f-ee772bb1f551,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57df6b87-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.615 226239 DEBUG nova.objects.instance [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lazy-loading 'pci_devices' on Instance uuid a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.653 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  <uuid>a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd</uuid>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  <name>instance-00000005</name>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-1597704400</nova:name>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:44:46</nova:creationTime>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <nova:flavor name="tempest-flavor_with_ephemeral_0-1496925635">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <nova:user uuid="cddf948da8f14dc998b0b2434d23e7fc">tempest-ServersWithSpecificFlavorTestJSON-1634965997-project-member</nova:user>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <nova:project uuid="6cefbc155c2449f4a8fe9fd88475b366">tempest-ServersWithSpecificFlavorTestJSON-1634965997</nova:project>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <nova:port uuid="57df6b87-e6f0-4399-b69f-ee772bb1f551">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <entry name="serial">a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd</entry>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <entry name="uuid">a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd</entry>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk.config">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:8d:01:44"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <target dev="tap57df6b87-e6"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd/console.log" append="off"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:44:47 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:44:47 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:44:47 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:44:47 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.655 226239 DEBUG nova.compute.manager [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Preparing to wait for external event network-vif-plugged-57df6b87-e6f0-4399-b69f-ee772bb1f551 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.655 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.655 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.655 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.656 226239 DEBUG nova.virt.libvirt.vif [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:44:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1597704400',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1597704400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1597704400',id=5,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ89RiM/elI2ZqD0wPCKPg5OJ135CdqHdbE4RgDrlMkDtFUdRT49WLIZycY+7iaVAOnhx3ISEENdI3O2SmWMNgi6poXdDkUtOq9xREV6KVqBpvm5DBEGb+Du/A2bOrXUMA==',key_name='tempest-keypair-889434625',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6cefbc155c2449f4a8fe9fd88475b366',ramdisk_id='',reservation_id='r-f46r9yis',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1634965997',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1634965997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:44:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cddf948da8f14dc998b0b2434d23e7fc',uuid=a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "address": "fa:16:3e:8d:01:44", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57df6b87-e6", "ovs_interfaceid": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.656 226239 DEBUG nova.network.os_vif_util [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Converting VIF {"id": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "address": "fa:16:3e:8d:01:44", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57df6b87-e6", "ovs_interfaceid": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.657 226239 DEBUG nova.network.os_vif_util [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8d:01:44,bridge_name='br-int',has_traffic_filtering=True,id=57df6b87-e6f0-4399-b69f-ee772bb1f551,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57df6b87-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.657 226239 DEBUG os_vif [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:01:44,bridge_name='br-int',has_traffic_filtering=True,id=57df6b87-e6f0-4399-b69f-ee772bb1f551,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57df6b87-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.694 226239 DEBUG ovsdbapp.backend.ovs_idl [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.694 226239 DEBUG ovsdbapp.backend.ovs_idl [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.695 226239 DEBUG ovsdbapp.backend.ovs_idl [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.695 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.696 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.696 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.697 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.698 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.701 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.712 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.712 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.712 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:44:47 np0005603623 nova_compute[226235]: 2026-01-31 07:44:47.713 226239 INFO oslo.privsep.daemon [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpih0_nhxd/privsep.sock']#033[00m
Jan 31 02:44:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:44:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:48.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.349 226239 INFO oslo.privsep.daemon [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.240 229403 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.243 229403 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.245 229403 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.245 229403 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229403#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.662 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.662 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57df6b87-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.663 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap57df6b87-e6, col_values=(('external_ids', {'iface-id': '57df6b87-e6f0-4399-b69f-ee772bb1f551', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8d:01:44', 'vm-uuid': 'a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.704 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:48 np0005603623 NetworkManager[48970]: <info>  [1769845488.7048] manager: (tap57df6b87-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.707 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.710 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.711 226239 INFO os_vif [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:01:44,bridge_name='br-int',has_traffic_filtering=True,id=57df6b87-e6f0-4399-b69f-ee772bb1f551,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57df6b87-e6')#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.765 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.765 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.765 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] No VIF found with MAC fa:16:3e:8d:01:44, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.765 226239 INFO nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Using config drive#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.783 226239 DEBUG nova.storage.rbd_utils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.823 226239 DEBUG nova.network.neutron [req-671c7fd5-c3ff-4465-9b69-46931bfb6a7a req-320643bb-2e26-44f6-9bff-cbaaec0ec6bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Updated VIF entry in instance network info cache for port 57df6b87-e6f0-4399-b69f-ee772bb1f551. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.824 226239 DEBUG nova.network.neutron [req-671c7fd5-c3ff-4465-9b69-46931bfb6a7a req-320643bb-2e26-44f6-9bff-cbaaec0ec6bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Updating instance_info_cache with network_info: [{"id": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "address": "fa:16:3e:8d:01:44", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57df6b87-e6", "ovs_interfaceid": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:44:48 np0005603623 nova_compute[226235]: 2026-01-31 07:44:48.839 226239 DEBUG oslo_concurrency.lockutils [req-671c7fd5-c3ff-4465-9b69-46931bfb6a7a req-320643bb-2e26-44f6-9bff-cbaaec0ec6bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:44:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:48.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:50.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:50 np0005603623 nova_compute[226235]: 2026-01-31 07:44:50.167 226239 INFO nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Creating config drive at /var/lib/nova/instances/a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd/disk.config#033[00m
Jan 31 02:44:50 np0005603623 nova_compute[226235]: 2026-01-31 07:44:50.171 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpanfdlmgq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:50 np0005603623 nova_compute[226235]: 2026-01-31 07:44:50.291 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpanfdlmgq" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:50 np0005603623 nova_compute[226235]: 2026-01-31 07:44:50.313 226239 DEBUG nova.storage.rbd_utils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:50 np0005603623 nova_compute[226235]: 2026-01-31 07:44:50.316 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd/disk.config a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:50 np0005603623 nova_compute[226235]: 2026-01-31 07:44:50.429 226239 DEBUG oslo_concurrency.processutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd/disk.config a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.113s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:50 np0005603623 nova_compute[226235]: 2026-01-31 07:44:50.430 226239 INFO nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Deleting local config drive /var/lib/nova/instances/a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd/disk.config because it was imported into RBD.#033[00m
Jan 31 02:44:50 np0005603623 systemd[1]: Starting libvirt secret daemon...
Jan 31 02:44:50 np0005603623 systemd[1]: Started libvirt secret daemon.
Jan 31 02:44:50 np0005603623 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 31 02:44:50 np0005603623 kernel: tap57df6b87-e6: entered promiscuous mode
Jan 31 02:44:50 np0005603623 NetworkManager[48970]: <info>  [1769845490.5472] manager: (tap57df6b87-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Jan 31 02:44:50 np0005603623 ovn_controller[133449]: 2026-01-31T07:44:50Z|00027|binding|INFO|Claiming lport 57df6b87-e6f0-4399-b69f-ee772bb1f551 for this chassis.
Jan 31 02:44:50 np0005603623 ovn_controller[133449]: 2026-01-31T07:44:50Z|00028|binding|INFO|57df6b87-e6f0-4399-b69f-ee772bb1f551: Claiming fa:16:3e:8d:01:44 10.100.0.14
Jan 31 02:44:50 np0005603623 nova_compute[226235]: 2026-01-31 07:44:50.550 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:50 np0005603623 nova_compute[226235]: 2026-01-31 07:44:50.552 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:50.562 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:01:44 10.100.0.14'], port_security=['fa:16:3e:8d:01:44 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cefbc155c2449f4a8fe9fd88475b366', 'neutron:revision_number': '2', 'neutron:security_group_ids': '587c819d-03ba-401e-b9cf-985ddd1bff18', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9b241bb-46b8-4aac-aa2a-6c4356b4b2d7, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=57df6b87-e6f0-4399-b69f-ee772bb1f551) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:44:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:50.564 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 57df6b87-e6f0-4399-b69f-ee772bb1f551 in datapath dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 bound to our chassis#033[00m
Jan 31 02:44:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:50.568 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dad80b1a-444d-4c9c-97e5-f4375a4ed8d6#033[00m
Jan 31 02:44:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:50.570 143258 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpycfy35zi/privsep.sock']#033[00m
Jan 31 02:44:50 np0005603623 systemd-udevd[229554]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:44:50 np0005603623 nova_compute[226235]: 2026-01-31 07:44:50.594 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:50 np0005603623 systemd-machined[194379]: New machine qemu-1-instance-00000005.
Jan 31 02:44:50 np0005603623 NetworkManager[48970]: <info>  [1769845490.5994] device (tap57df6b87-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:44:50 np0005603623 ovn_controller[133449]: 2026-01-31T07:44:50Z|00029|binding|INFO|Setting lport 57df6b87-e6f0-4399-b69f-ee772bb1f551 ovn-installed in OVS
Jan 31 02:44:50 np0005603623 ovn_controller[133449]: 2026-01-31T07:44:50Z|00030|binding|INFO|Setting lport 57df6b87-e6f0-4399-b69f-ee772bb1f551 up in Southbound
Jan 31 02:44:50 np0005603623 NetworkManager[48970]: <info>  [1769845490.6000] device (tap57df6b87-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:44:50 np0005603623 nova_compute[226235]: 2026-01-31 07:44:50.601 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:50 np0005603623 systemd[1]: Started Virtual Machine qemu-1-instance-00000005.
Jan 31 02:44:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:50.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:51 np0005603623 nova_compute[226235]: 2026-01-31 07:44:51.269 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845491.268397, a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:44:51 np0005603623 nova_compute[226235]: 2026-01-31 07:44:51.269 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] VM Started (Lifecycle Event)#033[00m
Jan 31 02:44:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:51.282 143258 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 31 02:44:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:51.282 143258 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpycfy35zi/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 31 02:44:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:51.164 229607 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 31 02:44:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:51.168 229607 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 31 02:44:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:51.170 229607 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Jan 31 02:44:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:51.170 229607 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229607#033[00m
Jan 31 02:44:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:51.284 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c167b498-bf92-4c99-88e8-947a509a4498]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:51 np0005603623 nova_compute[226235]: 2026-01-31 07:44:51.291 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:44:51 np0005603623 nova_compute[226235]: 2026-01-31 07:44:51.294 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845491.2717044, a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:44:51 np0005603623 nova_compute[226235]: 2026-01-31 07:44:51.294 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:44:51 np0005603623 nova_compute[226235]: 2026-01-31 07:44:51.316 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:44:51 np0005603623 nova_compute[226235]: 2026-01-31 07:44:51.318 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:44:51 np0005603623 nova_compute[226235]: 2026-01-31 07:44:51.357 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:44:51 np0005603623 nova_compute[226235]: 2026-01-31 07:44:51.577 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:51.770 229607 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:51.770 229607 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:51.771 229607 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:52.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:52.343 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3af00a62-a603-4a52-a79e-83ca0ea2624c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:52.344 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdad80b1a-41 in ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:44:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:52.346 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdad80b1a-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:44:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:52.346 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ded711-6a52-4acd-b78e-6574d769a662]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:52.349 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[def8e8c4-f999-4a1c-b161-c7f81dee94b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:52.365 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[b833dac8-5ede-4ff5-b8b8-7a38df27c9d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:52.433 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4793e0b9-138c-4db9-86d8-4b1bc88745a5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:52.435 143258 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpzjd33e1b/privsep.sock']#033[00m
Jan 31 02:44:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:52.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:53.013 143258 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 31 02:44:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:53.014 143258 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpzjd33e1b/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 31 02:44:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:52.903 229627 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 31 02:44:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:52.906 229627 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 31 02:44:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:52.908 229627 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 31 02:44:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:52.908 229627 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229627#033[00m
Jan 31 02:44:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:53.016 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[0c0c199e-8aa6-4ad9-85d0-0b4dfd510440]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:53 np0005603623 nova_compute[226235]: 2026-01-31 07:44:53.260 226239 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Automatically allocated network: {'id': '992dcec1-3019-47a1-a14c-defd99a80f3d', 'name': 'auto_allocated_network', 'tenant_id': 'dc2f6584d8b64364b13683f53c58617f', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['84a7580e-2eb6-465f-aef5-8fe041bdab03', 'c97780ea-8676-4982-b5b9-dfb934b09fd9'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2026-01-31T07:44:32Z', 'updated_at': '2026-01-31T07:44:43Z', 'revision_number': 4, 'project_id': 'dc2f6584d8b64364b13683f53c58617f'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Jan 31 02:44:53 np0005603623 nova_compute[226235]: 2026-01-31 07:44:53.262 226239 DEBUG nova.policy [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0eb58e8663574849b17616075ce5c43e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dc2f6584d8b64364b13683f53c58617f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:44:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:53.511 229627 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:53.511 229627 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:53.511 229627 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:53 np0005603623 nova_compute[226235]: 2026-01-31 07:44:53.705 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.056 226239 DEBUG nova.compute.manager [req-042176a9-3792-4fd9-9e70-2e69c5d46c04 req-eb1de8a5-e2cd-4395-9118-edce54362d6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Received event network-vif-plugged-57df6b87-e6f0-4399-b69f-ee772bb1f551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.057 226239 DEBUG oslo_concurrency.lockutils [req-042176a9-3792-4fd9-9e70-2e69c5d46c04 req-eb1de8a5-e2cd-4395-9118-edce54362d6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.057 226239 DEBUG oslo_concurrency.lockutils [req-042176a9-3792-4fd9-9e70-2e69c5d46c04 req-eb1de8a5-e2cd-4395-9118-edce54362d6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.058 226239 DEBUG oslo_concurrency.lockutils [req-042176a9-3792-4fd9-9e70-2e69c5d46c04 req-eb1de8a5-e2cd-4395-9118-edce54362d6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.058 226239 DEBUG nova.compute.manager [req-042176a9-3792-4fd9-9e70-2e69c5d46c04 req-eb1de8a5-e2cd-4395-9118-edce54362d6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Processing event network-vif-plugged-57df6b87-e6f0-4399-b69f-ee772bb1f551 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.059 226239 DEBUG nova.compute.manager [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.062 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845494.0617979, a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.063 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.065 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.068 226239 INFO nova.virt.libvirt.driver [-] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Instance spawned successfully.#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.068 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.089 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.094 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.094 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[39bbf445-ef9c-4b8b-88cb-a063548182b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.098 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.098 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.099 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.099 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.100 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.100 226239 DEBUG nova.virt.libvirt.driver [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.112 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0a1bb6-ed15-44a6-9d64-8632f439db17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:54 np0005603623 NetworkManager[48970]: <info>  [1769845494.1134] manager: (tapdad80b1a-40): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Jan 31 02:44:54 np0005603623 systemd-udevd[229640]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.137 226239 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Successfully created port: e842d32e-dd00-462d-97bb-d28ef68b5985 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.138 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b868bf11-1322-410e-865b-03d9f42d3bb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.141 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[92efc72d-df56-4b27-9301-8873745b6848]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:54.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:54 np0005603623 NetworkManager[48970]: <info>  [1769845494.1602] device (tapdad80b1a-40): carrier: link connected
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.164 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f9f960-c49d-4948-8b0c-59a195b68aab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.170 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.178 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0e7c82c8-475f-4680-9c3d-efc7931c0534]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdad80b1a-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:d8:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463458, 'reachable_time': 33636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229658, 'error': None, 'target': 'ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.192 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4eb7d1-4a9f-4471-b237-ea45cc727374]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:d870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463458, 'tstamp': 463458}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229659, 'error': None, 'target': 'ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.212 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a41eea-e1c2-41e8-a0de-992c514633d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdad80b1a-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:d8:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463458, 'reachable_time': 33636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229661, 'error': None, 'target': 'ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.227 226239 INFO nova.compute.manager [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Took 15.17 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.229 226239 DEBUG nova.compute.manager [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.237 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[252fa6cd-3dd0-4894-b428-6d8da47dae94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.279 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2f9075b4-afd5-485a-971b-136fa44c654e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.280 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdad80b1a-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.281 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.281 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdad80b1a-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:44:54 np0005603623 NetworkManager[48970]: <info>  [1769845494.2835] manager: (tapdad80b1a-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.283 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:54 np0005603623 kernel: tapdad80b1a-40: entered promiscuous mode
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.287 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdad80b1a-40, col_values=(('external_ids', {'iface-id': 'ff1de611-ad43-4ac1-9afc-1469794e5ef0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.289 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:54 np0005603623 ovn_controller[133449]: 2026-01-31T07:44:54Z|00031|binding|INFO|Releasing lport ff1de611-ad43-4ac1-9afc-1469794e5ef0 from this chassis (sb_readonly=0)
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.291 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dad80b1a-444d-4c9c-97e5-f4375a4ed8d6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dad80b1a-444d-4c9c-97e5-f4375a4ed8d6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.292 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[02e6ce4b-b648-42c3-a5d9-dd31be8a35dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.293 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.294 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/dad80b1a-444d-4c9c-97e5-f4375a4ed8d6.pid.haproxy
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID dad80b1a-444d-4c9c-97e5-f4375a4ed8d6
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:44:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:44:54.294 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'env', 'PROCESS_TAG=haproxy-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dad80b1a-444d-4c9c-97e5-f4375a4ed8d6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.319 226239 INFO nova.compute.manager [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Took 16.85 seconds to build instance.#033[00m
Jan 31 02:44:54 np0005603623 nova_compute[226235]: 2026-01-31 07:44:54.340 226239 DEBUG oslo_concurrency.lockutils [None req-1d7c528a-364f-499c-b59e-c17c24ab528a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.939s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:54 np0005603623 podman[229694]: 2026-01-31 07:44:54.663174502 +0000 UTC m=+0.022875830 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:44:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:54.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:54 np0005603623 podman[229694]: 2026-01-31 07:44:54.994307272 +0000 UTC m=+0.354008570 container create eafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:44:55 np0005603623 systemd[1]: Started libpod-conmon-eafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3.scope.
Jan 31 02:44:55 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:44:55 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b78861133363e35fb11e82b7192b3e8d0044c59f3f4e3bb3f04647d193b37740/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:44:55 np0005603623 nova_compute[226235]: 2026-01-31 07:44:55.182 226239 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Successfully updated port: e842d32e-dd00-462d-97bb-d28ef68b5985 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:44:55 np0005603623 nova_compute[226235]: 2026-01-31 07:44:55.201 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "refresh_cache-0d6cd237-f032-42fe-a952-64ac0baa4a66" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:44:55 np0005603623 nova_compute[226235]: 2026-01-31 07:44:55.202 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquired lock "refresh_cache-0d6cd237-f032-42fe-a952-64ac0baa4a66" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:44:55 np0005603623 nova_compute[226235]: 2026-01-31 07:44:55.202 226239 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:44:55 np0005603623 podman[229694]: 2026-01-31 07:44:55.312710826 +0000 UTC m=+0.672412164 container init eafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Jan 31 02:44:55 np0005603623 podman[229694]: 2026-01-31 07:44:55.318357021 +0000 UTC m=+0.678058319 container start eafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 02:44:55 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[229709]: [NOTICE]   (229713) : New worker (229715) forked
Jan 31 02:44:55 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[229709]: [NOTICE]   (229713) : Loading success.
Jan 31 02:44:55 np0005603623 nova_compute[226235]: 2026-01-31 07:44:55.432 226239 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:44:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 e143: 3 total, 3 up, 3 in
Jan 31 02:44:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:44:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:56.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:44:56 np0005603623 nova_compute[226235]: 2026-01-31 07:44:56.163 226239 DEBUG nova.compute.manager [req-41d02c86-bb1b-4697-b4ff-8cf8ee99a92f req-6bc15433-442e-475c-8d03-f29cc020acf6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Received event network-vif-plugged-57df6b87-e6f0-4399-b69f-ee772bb1f551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:44:56 np0005603623 nova_compute[226235]: 2026-01-31 07:44:56.164 226239 DEBUG oslo_concurrency.lockutils [req-41d02c86-bb1b-4697-b4ff-8cf8ee99a92f req-6bc15433-442e-475c-8d03-f29cc020acf6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:56 np0005603623 nova_compute[226235]: 2026-01-31 07:44:56.164 226239 DEBUG oslo_concurrency.lockutils [req-41d02c86-bb1b-4697-b4ff-8cf8ee99a92f req-6bc15433-442e-475c-8d03-f29cc020acf6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:56 np0005603623 nova_compute[226235]: 2026-01-31 07:44:56.164 226239 DEBUG oslo_concurrency.lockutils [req-41d02c86-bb1b-4697-b4ff-8cf8ee99a92f req-6bc15433-442e-475c-8d03-f29cc020acf6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:56 np0005603623 nova_compute[226235]: 2026-01-31 07:44:56.164 226239 DEBUG nova.compute.manager [req-41d02c86-bb1b-4697-b4ff-8cf8ee99a92f req-6bc15433-442e-475c-8d03-f29cc020acf6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] No waiting events found dispatching network-vif-plugged-57df6b87-e6f0-4399-b69f-ee772bb1f551 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:44:56 np0005603623 nova_compute[226235]: 2026-01-31 07:44:56.165 226239 WARNING nova.compute.manager [req-41d02c86-bb1b-4697-b4ff-8cf8ee99a92f req-6bc15433-442e-475c-8d03-f29cc020acf6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Received unexpected event network-vif-plugged-57df6b87-e6f0-4399-b69f-ee772bb1f551 for instance with vm_state active and task_state None.#033[00m
Jan 31 02:44:56 np0005603623 nova_compute[226235]: 2026-01-31 07:44:56.228 226239 DEBUG nova.compute.manager [req-9a680257-9b8d-4359-915f-01c0f38ee2a5 req-e7dbfee5-1095-4926-bd43-4298e4183fdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Received event network-changed-e842d32e-dd00-462d-97bb-d28ef68b5985 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:44:56 np0005603623 nova_compute[226235]: 2026-01-31 07:44:56.228 226239 DEBUG nova.compute.manager [req-9a680257-9b8d-4359-915f-01c0f38ee2a5 req-e7dbfee5-1095-4926-bd43-4298e4183fdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Refreshing instance network info cache due to event network-changed-e842d32e-dd00-462d-97bb-d28ef68b5985. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:44:56 np0005603623 nova_compute[226235]: 2026-01-31 07:44:56.228 226239 DEBUG oslo_concurrency.lockutils [req-9a680257-9b8d-4359-915f-01c0f38ee2a5 req-e7dbfee5-1095-4926-bd43-4298e4183fdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-0d6cd237-f032-42fe-a952-64ac0baa4a66" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:44:56 np0005603623 nova_compute[226235]: 2026-01-31 07:44:56.576 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:44:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:56.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:44:57 np0005603623 nova_compute[226235]: 2026-01-31 07:44:57.378 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:57 np0005603623 NetworkManager[48970]: <info>  [1769845497.3812] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/27)
Jan 31 02:44:57 np0005603623 NetworkManager[48970]: <info>  [1769845497.3824] device (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:44:57 np0005603623 NetworkManager[48970]: <warn>  [1769845497.3826] device (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:44:57 np0005603623 NetworkManager[48970]: <info>  [1769845497.3832] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/28)
Jan 31 02:44:57 np0005603623 NetworkManager[48970]: <info>  [1769845497.3836] device (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:44:57 np0005603623 NetworkManager[48970]: <warn>  [1769845497.3836] device (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:44:57 np0005603623 NetworkManager[48970]: <info>  [1769845497.3842] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Jan 31 02:44:57 np0005603623 NetworkManager[48970]: <info>  [1769845497.3847] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 31 02:44:57 np0005603623 NetworkManager[48970]: <info>  [1769845497.3851] device (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 31 02:44:57 np0005603623 NetworkManager[48970]: <info>  [1769845497.3853] device (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 31 02:44:57 np0005603623 nova_compute[226235]: 2026-01-31 07:44:57.458 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:57 np0005603623 ovn_controller[133449]: 2026-01-31T07:44:57Z|00032|binding|INFO|Releasing lport ff1de611-ad43-4ac1-9afc-1469794e5ef0 from this chassis (sb_readonly=0)
Jan 31 02:44:57 np0005603623 nova_compute[226235]: 2026-01-31 07:44:57.486 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:44:58 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/524416334' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:44:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:58.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.246 226239 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Updating instance_info_cache with network_info: [{"id": "e842d32e-dd00-462d-97bb-d28ef68b5985", "address": "fa:16:3e:ce:67:d9", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::16f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape842d32e-dd", "ovs_interfaceid": "e842d32e-dd00-462d-97bb-d28ef68b5985", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.265 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Releasing lock "refresh_cache-0d6cd237-f032-42fe-a952-64ac0baa4a66" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.266 226239 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Instance network_info: |[{"id": "e842d32e-dd00-462d-97bb-d28ef68b5985", "address": "fa:16:3e:ce:67:d9", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::16f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape842d32e-dd", "ovs_interfaceid": "e842d32e-dd00-462d-97bb-d28ef68b5985", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.266 226239 DEBUG oslo_concurrency.lockutils [req-9a680257-9b8d-4359-915f-01c0f38ee2a5 req-e7dbfee5-1095-4926-bd43-4298e4183fdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-0d6cd237-f032-42fe-a952-64ac0baa4a66" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.266 226239 DEBUG nova.network.neutron [req-9a680257-9b8d-4359-915f-01c0f38ee2a5 req-e7dbfee5-1095-4926-bd43-4298e4183fdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Refreshing network info cache for port e842d32e-dd00-462d-97bb-d28ef68b5985 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.269 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Start _get_guest_xml network_info=[{"id": "e842d32e-dd00-462d-97bb-d28ef68b5985", "address": "fa:16:3e:ce:67:d9", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::16f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape842d32e-dd", "ovs_interfaceid": "e842d32e-dd00-462d-97bb-d28ef68b5985", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.273 226239 WARNING nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.278 226239 DEBUG nova.virt.libvirt.host [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.279 226239 DEBUG nova.virt.libvirt.host [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.284 226239 DEBUG nova.virt.libvirt.host [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.285 226239 DEBUG nova.virt.libvirt.host [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.286 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.286 226239 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.287 226239 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.287 226239 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.287 226239 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.288 226239 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.288 226239 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.288 226239 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.288 226239 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.289 226239 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.289 226239 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.289 226239 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.292 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.540 226239 DEBUG nova.compute.manager [req-b8369897-4a9d-4c0c-9801-f20eb2850fed req-641f58af-4bfa-477f-997c-ee0b1c53f752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Received event network-changed-57df6b87-e6f0-4399-b69f-ee772bb1f551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.541 226239 DEBUG nova.compute.manager [req-b8369897-4a9d-4c0c-9801-f20eb2850fed req-641f58af-4bfa-477f-997c-ee0b1c53f752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Refreshing instance network info cache due to event network-changed-57df6b87-e6f0-4399-b69f-ee772bb1f551. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.541 226239 DEBUG oslo_concurrency.lockutils [req-b8369897-4a9d-4c0c-9801-f20eb2850fed req-641f58af-4bfa-477f-997c-ee0b1c53f752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.541 226239 DEBUG oslo_concurrency.lockutils [req-b8369897-4a9d-4c0c-9801-f20eb2850fed req-641f58af-4bfa-477f-997c-ee0b1c53f752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.541 226239 DEBUG nova.network.neutron [req-b8369897-4a9d-4c0c-9801-f20eb2850fed req-641f58af-4bfa-477f-997c-ee0b1c53f752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Refreshing network info cache for port 57df6b87-e6f0-4399-b69f-ee772bb1f551 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:44:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:44:58 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1312831767' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:44:58 np0005603623 nova_compute[226235]: 2026-01-31 07:44:58.708 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:44:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:44:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:58.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.031 226239 DEBUG nova.network.neutron [req-9a680257-9b8d-4359-915f-01c0f38ee2a5 req-e7dbfee5-1095-4926-bd43-4298e4183fdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Updated VIF entry in instance network info cache for port e842d32e-dd00-462d-97bb-d28ef68b5985. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.031 226239 DEBUG nova.network.neutron [req-9a680257-9b8d-4359-915f-01c0f38ee2a5 req-e7dbfee5-1095-4926-bd43-4298e4183fdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Updating instance_info_cache with network_info: [{"id": "e842d32e-dd00-462d-97bb-d28ef68b5985", "address": "fa:16:3e:ce:67:d9", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::16f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape842d32e-dd", "ovs_interfaceid": "e842d32e-dd00-462d-97bb-d28ef68b5985", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.043 226239 DEBUG oslo_concurrency.lockutils [req-9a680257-9b8d-4359-915f-01c0f38ee2a5 req-e7dbfee5-1095-4926-bd43-4298e4183fdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-0d6cd237-f032-42fe-a952-64ac0baa4a66" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.092 226239 DEBUG nova.network.neutron [req-b8369897-4a9d-4c0c-9801-f20eb2850fed req-641f58af-4bfa-477f-997c-ee0b1c53f752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Updated VIF entry in instance network info cache for port 57df6b87-e6f0-4399-b69f-ee772bb1f551. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.093 226239 DEBUG nova.network.neutron [req-b8369897-4a9d-4c0c-9801-f20eb2850fed req-641f58af-4bfa-477f-997c-ee0b1c53f752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Updating instance_info_cache with network_info: [{"id": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "address": "fa:16:3e:8d:01:44", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57df6b87-e6", "ovs_interfaceid": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.118 226239 DEBUG oslo_concurrency.lockutils [req-b8369897-4a9d-4c0c-9801-f20eb2850fed req-641f58af-4bfa-477f-997c-ee0b1c53f752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:45:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:00.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.456 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.492 226239 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 0d6cd237-f032-42fe-a952-64ac0baa4a66_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.496 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:45:00 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1961844200' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.922 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.924 226239 DEBUG nova.virt.libvirt.vif [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1079069239-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1079069239-2',id=3,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dc2f6584d8b64364b13683f53c58617f',ramdisk_id='',reservation_id='r-jd4rofhy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-2135409609',owner_user_name='tempest-AutoAllocateNetworkTest-2135409609-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:44:31Z,user_data=None,user_id='0eb58e8663574849b17616075ce5c43e',uuid=0d6cd237-f032-42fe-a952-64ac0baa4a66,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e842d32e-dd00-462d-97bb-d28ef68b5985", "address": "fa:16:3e:ce:67:d9", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::16f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape842d32e-dd", "ovs_interfaceid": "e842d32e-dd00-462d-97bb-d28ef68b5985", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.924 226239 DEBUG nova.network.os_vif_util [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Converting VIF {"id": "e842d32e-dd00-462d-97bb-d28ef68b5985", "address": "fa:16:3e:ce:67:d9", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::16f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape842d32e-dd", "ovs_interfaceid": "e842d32e-dd00-462d-97bb-d28ef68b5985", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.925 226239 DEBUG nova.network.os_vif_util [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:67:d9,bridge_name='br-int',has_traffic_filtering=True,id=e842d32e-dd00-462d-97bb-d28ef68b5985,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape842d32e-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.926 226239 DEBUG nova.objects.instance [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lazy-loading 'pci_devices' on Instance uuid 0d6cd237-f032-42fe-a952-64ac0baa4a66 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.939 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  <uuid>0d6cd237-f032-42fe-a952-64ac0baa4a66</uuid>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  <name>instance-00000003</name>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <nova:name>tempest-tempest.common.compute-instance-1079069239-2</nova:name>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:44:58</nova:creationTime>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <nova:user uuid="0eb58e8663574849b17616075ce5c43e">tempest-AutoAllocateNetworkTest-2135409609-project-member</nova:user>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <nova:project uuid="dc2f6584d8b64364b13683f53c58617f">tempest-AutoAllocateNetworkTest-2135409609</nova:project>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <nova:port uuid="e842d32e-dd00-462d-97bb-d28ef68b5985">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.1.0.86" ipVersion="4"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="fdfe:381f:8400::16f" ipVersion="6"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <entry name="serial">0d6cd237-f032-42fe-a952-64ac0baa4a66</entry>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <entry name="uuid">0d6cd237-f032-42fe-a952-64ac0baa4a66</entry>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/0d6cd237-f032-42fe-a952-64ac0baa4a66_disk">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/0d6cd237-f032-42fe-a952-64ac0baa4a66_disk.config">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:ce:67:d9"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <target dev="tape842d32e-dd"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/0d6cd237-f032-42fe-a952-64ac0baa4a66/console.log" append="off"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:45:00 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:45:00 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:45:00 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:45:00 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.940 226239 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Preparing to wait for external event network-vif-plugged-e842d32e-dd00-462d-97bb-d28ef68b5985 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.940 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.941 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.941 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.941 226239 DEBUG nova.virt.libvirt.vif [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1079069239-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1079069239-2',id=3,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dc2f6584d8b64364b13683f53c58617f',ramdisk_id='',reservation_id='r-jd4rofhy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-2135409609',owner_user_name='tempest-AutoAllocateNetworkTest-2135409609-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:44:31Z,user_data=None,user_id='0eb58e8663574849b17616075ce5c43e',uuid=0d6cd237-f032-42fe-a952-64ac0baa4a66,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e842d32e-dd00-462d-97bb-d28ef68b5985", "address": "fa:16:3e:ce:67:d9", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::16f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape842d32e-dd", "ovs_interfaceid": "e842d32e-dd00-462d-97bb-d28ef68b5985", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.942 226239 DEBUG nova.network.os_vif_util [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Converting VIF {"id": "e842d32e-dd00-462d-97bb-d28ef68b5985", "address": "fa:16:3e:ce:67:d9", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::16f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape842d32e-dd", "ovs_interfaceid": "e842d32e-dd00-462d-97bb-d28ef68b5985", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.943 226239 DEBUG nova.network.os_vif_util [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:67:d9,bridge_name='br-int',has_traffic_filtering=True,id=e842d32e-dd00-462d-97bb-d28ef68b5985,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape842d32e-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.943 226239 DEBUG os_vif [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:67:d9,bridge_name='br-int',has_traffic_filtering=True,id=e842d32e-dd00-462d-97bb-d28ef68b5985,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape842d32e-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.944 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.944 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.945 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.948 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.948 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape842d32e-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.949 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape842d32e-dd, col_values=(('external_ids', {'iface-id': 'e842d32e-dd00-462d-97bb-d28ef68b5985', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:67:d9', 'vm-uuid': '0d6cd237-f032-42fe-a952-64ac0baa4a66'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.951 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:00 np0005603623 NetworkManager[48970]: <info>  [1769845500.9519] manager: (tape842d32e-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.953 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:45:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:00.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.958 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:00 np0005603623 nova_compute[226235]: 2026-01-31 07:45:00.958 226239 INFO os_vif [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:67:d9,bridge_name='br-int',has_traffic_filtering=True,id=e842d32e-dd00-462d-97bb-d28ef68b5985,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape842d32e-dd')#033[00m
Jan 31 02:45:01 np0005603623 nova_compute[226235]: 2026-01-31 07:45:01.067 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:45:01 np0005603623 nova_compute[226235]: 2026-01-31 07:45:01.068 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:45:01 np0005603623 nova_compute[226235]: 2026-01-31 07:45:01.068 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] No VIF found with MAC fa:16:3e:ce:67:d9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:45:01 np0005603623 nova_compute[226235]: 2026-01-31 07:45:01.068 226239 INFO nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Using config drive#033[00m
Jan 31 02:45:01 np0005603623 nova_compute[226235]: 2026-01-31 07:45:01.093 226239 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 0d6cd237-f032-42fe-a952-64ac0baa4a66_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:01 np0005603623 nova_compute[226235]: 2026-01-31 07:45:01.486 226239 INFO nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Creating config drive at /var/lib/nova/instances/0d6cd237-f032-42fe-a952-64ac0baa4a66/disk.config#033[00m
Jan 31 02:45:01 np0005603623 nova_compute[226235]: 2026-01-31 07:45:01.493 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0d6cd237-f032-42fe-a952-64ac0baa4a66/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjemli7pw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:01 np0005603623 nova_compute[226235]: 2026-01-31 07:45:01.579 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:01 np0005603623 nova_compute[226235]: 2026-01-31 07:45:01.617 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0d6cd237-f032-42fe-a952-64ac0baa4a66/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjemli7pw" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:01 np0005603623 nova_compute[226235]: 2026-01-31 07:45:01.687 226239 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 0d6cd237-f032-42fe-a952-64ac0baa4a66_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:01 np0005603623 nova_compute[226235]: 2026-01-31 07:45:01.691 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0d6cd237-f032-42fe-a952-64ac0baa4a66/disk.config 0d6cd237-f032-42fe-a952-64ac0baa4a66_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.002 226239 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0d6cd237-f032-42fe-a952-64ac0baa4a66/disk.config 0d6cd237-f032-42fe-a952-64ac0baa4a66_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.311s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.003 226239 INFO nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Deleting local config drive /var/lib/nova/instances/0d6cd237-f032-42fe-a952-64ac0baa4a66/disk.config because it was imported into RBD.#033[00m
Jan 31 02:45:02 np0005603623 kernel: tape842d32e-dd: entered promiscuous mode
Jan 31 02:45:02 np0005603623 NetworkManager[48970]: <info>  [1769845502.0449] manager: (tape842d32e-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Jan 31 02:45:02 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:02Z|00033|binding|INFO|Claiming lport e842d32e-dd00-462d-97bb-d28ef68b5985 for this chassis.
Jan 31 02:45:02 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:02Z|00034|binding|INFO|e842d32e-dd00-462d-97bb-d28ef68b5985: Claiming fa:16:3e:ce:67:d9 10.1.0.86 fdfe:381f:8400::16f
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.047 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.054 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:02 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:02Z|00035|binding|INFO|Setting lport e842d32e-dd00-462d-97bb-d28ef68b5985 ovn-installed in OVS
Jan 31 02:45:02 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:02Z|00036|binding|INFO|Setting lport e842d32e-dd00-462d-97bb-d28ef68b5985 up in Southbound
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.056 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:67:d9 10.1.0.86 fdfe:381f:8400::16f'], port_security=['fa:16:3e:ce:67:d9 10.1.0.86 fdfe:381f:8400::16f'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.86/26 fdfe:381f:8400::16f/64', 'neutron:device_id': '0d6cd237-f032-42fe-a952-64ac0baa4a66', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-992dcec1-3019-47a1-a14c-defd99a80f3d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc2f6584d8b64364b13683f53c58617f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f48a740a-df16-488d-83ce-01edcece1d5f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d5eabbe-dd4d-4e48-a2ac-b48c29338142, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=e842d32e-dd00-462d-97bb-d28ef68b5985) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.058 143258 INFO neutron.agent.ovn.metadata.agent [-] Port e842d32e-dd00-462d-97bb-d28ef68b5985 in datapath 992dcec1-3019-47a1-a14c-defd99a80f3d bound to our chassis#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.060 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 992dcec1-3019-47a1-a14c-defd99a80f3d#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.061 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.066 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.073 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d4e4806a-06eb-4662-abd4-1eba019b9e6a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.074 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap992dcec1-31 in ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.076 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap992dcec1-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.076 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b7625f81-6b81-4224-aedb-1c4f5e7c843f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.077 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8fd74cd2-b46d-4a3e-97e7-0e04678c7100]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 systemd-machined[194379]: New machine qemu-2-instance-00000003.
Jan 31 02:45:02 np0005603623 systemd[1]: Started Virtual Machine qemu-2-instance-00000003.
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.107 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[7afabe9e-fc8c-4ad5-ba00-5402dc66dab1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 systemd-udevd[229869]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:45:02 np0005603623 NetworkManager[48970]: <info>  [1769845502.1233] device (tape842d32e-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:45:02 np0005603623 NetworkManager[48970]: <info>  [1769845502.1244] device (tape842d32e-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.127 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6446f96d-5cbc-4fb4-a379-39a89c0c118f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.153 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[a0113236-2895-449a-a1b8-66080c3043a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:02.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:02 np0005603623 NetworkManager[48970]: <info>  [1769845502.1613] manager: (tap992dcec1-30): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.160 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a16d18a3-55a1-41be-b9ca-c067b34a82e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 systemd-udevd[229872]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.184 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d39899ed-7909-4e1c-b8dd-4220443a74a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.186 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4e46443b-84f9-416b-bb24-780cc59b349c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.192 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.193 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.193 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 02:45:02 np0005603623 NetworkManager[48970]: <info>  [1769845502.2071] device (tap992dcec1-30): carrier: link connected
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.214 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[9088c023-b4b3-4710-914f-6a5459b4882a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.229 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.231 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c3c560be-1002-41e5-9292-fb5969718e01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap992dcec1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:60:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464263, 'reachable_time': 16291, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229900, 'error': None, 'target': 'ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.250 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0a48addb-e7d7-4332-9e9a-b4d52f369f34]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:6054'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 464263, 'tstamp': 464263}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229901, 'error': None, 'target': 'ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.268 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ced74b97-7d35-48ad-a36c-7663b13e5212]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap992dcec1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:60:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464263, 'reachable_time': 16291, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229902, 'error': None, 'target': 'ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.299 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e22b5899-2e8e-4cfe-be8f-b06118559125]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.371 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8eb8d294-e31b-4797-8902-4b518e7d10fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.373 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap992dcec1-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.373 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.373 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap992dcec1-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.375 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:02 np0005603623 NetworkManager[48970]: <info>  [1769845502.3759] manager: (tap992dcec1-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 31 02:45:02 np0005603623 kernel: tap992dcec1-30: entered promiscuous mode
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.379 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap992dcec1-30, col_values=(('external_ids', {'iface-id': '1443ed6f-926c-4e3e-8e41-280e0ddc0f64'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:02 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:02Z|00037|binding|INFO|Releasing lport 1443ed6f-926c-4e3e-8e41-280e0ddc0f64 from this chassis (sb_readonly=0)
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.381 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.385 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.386 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/992dcec1-3019-47a1-a14c-defd99a80f3d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/992dcec1-3019-47a1-a14c-defd99a80f3d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.387 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[62dd43bc-e366-4483-8371-109d2fe39c8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.388 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-992dcec1-3019-47a1-a14c-defd99a80f3d
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/992dcec1-3019-47a1-a14c-defd99a80f3d.pid.haproxy
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 992dcec1-3019-47a1-a14c-defd99a80f3d
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 31 02:45:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:02.388 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d', 'env', 'PROCESS_TAG=haproxy-992dcec1-3019-47a1-a14c-defd99a80f3d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/992dcec1-3019-47a1-a14c-defd99a80f3d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.418 226239 DEBUG nova.compute.manager [req-37154419-3d9b-4362-94ce-3578d242c6af req-c49cd8e8-c330-46e7-9243-d6c82373aeb6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Received event network-vif-plugged-e842d32e-dd00-462d-97bb-d28ef68b5985 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.419 226239 DEBUG oslo_concurrency.lockutils [req-37154419-3d9b-4362-94ce-3578d242c6af req-c49cd8e8-c330-46e7-9243-d6c82373aeb6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.419 226239 DEBUG oslo_concurrency.lockutils [req-37154419-3d9b-4362-94ce-3578d242c6af req-c49cd8e8-c330-46e7-9243-d6c82373aeb6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.420 226239 DEBUG oslo_concurrency.lockutils [req-37154419-3d9b-4362-94ce-3578d242c6af req-c49cd8e8-c330-46e7-9243-d6c82373aeb6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.420 226239 DEBUG nova.compute.manager [req-37154419-3d9b-4362-94ce-3578d242c6af req-c49cd8e8-c330-46e7-9243-d6c82373aeb6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Processing event network-vif-plugged-e842d32e-dd00-462d-97bb-d28ef68b5985 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.730 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845502.7302864, 0d6cd237-f032-42fe-a952-64ac0baa4a66 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.732 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] VM Started (Lifecycle Event)
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.734 226239 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.737 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.741 226239 INFO nova.virt.libvirt.driver [-] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Instance spawned successfully.
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.741 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.762 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:45:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.765 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.766 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.766 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.767 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.767 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.767 226239 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.771 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.812 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.812 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845502.7315645, 0d6cd237-f032-42fe-a952-64ac0baa4a66 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.812 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] VM Paused (Lifecycle Event)
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.841 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.846 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845502.7375956, 0d6cd237-f032-42fe-a952-64ac0baa4a66 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.846 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] VM Resumed (Lifecycle Event)
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.860 226239 INFO nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Took 31.76 seconds to spawn the instance on the hypervisor.
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.861 226239 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.863 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.871 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:45:02 np0005603623 podman[229973]: 2026-01-31 07:45:02.775905227 +0000 UTC m=+0.046982408 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.900 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.931 226239 INFO nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Took 32.77 seconds to build instance.
Jan 31 02:45:02 np0005603623 nova_compute[226235]: 2026-01-31 07:45:02.955 226239 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 32.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:02.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:03 np0005603623 podman[229973]: 2026-01-31 07:45:03.560812859 +0000 UTC m=+0.831890020 container create dec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 02:45:03 np0005603623 systemd[1]: Started libpod-conmon-dec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a.scope.
Jan 31 02:45:03 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:45:03 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c321601a9073ddb2d929a531fb85e2ef9579495004cf0125454a867409997ee/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:45:03 np0005603623 podman[229973]: 2026-01-31 07:45:03.752330257 +0000 UTC m=+1.023407438 container init dec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:45:03 np0005603623 podman[229973]: 2026-01-31 07:45:03.757110745 +0000 UTC m=+1.028187896 container start dec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:45:03 np0005603623 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[229989]: [NOTICE]   (229993) : New worker (229995) forked
Jan 31 02:45:03 np0005603623 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[229989]: [NOTICE]   (229993) : Loading success.
Jan 31 02:45:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:45:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:04.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:45:04 np0005603623 nova_compute[226235]: 2026-01-31 07:45:04.252 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:04 np0005603623 nova_compute[226235]: 2026-01-31 07:45:04.518 226239 DEBUG nova.compute.manager [req-04d90286-ca30-438a-a9a3-e6c74929f0bf req-16eb9b2a-d937-44d9-86ae-182f45266b10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Received event network-vif-plugged-e842d32e-dd00-462d-97bb-d28ef68b5985 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:45:04 np0005603623 nova_compute[226235]: 2026-01-31 07:45:04.519 226239 DEBUG oslo_concurrency.lockutils [req-04d90286-ca30-438a-a9a3-e6c74929f0bf req-16eb9b2a-d937-44d9-86ae-182f45266b10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:04 np0005603623 nova_compute[226235]: 2026-01-31 07:45:04.519 226239 DEBUG oslo_concurrency.lockutils [req-04d90286-ca30-438a-a9a3-e6c74929f0bf req-16eb9b2a-d937-44d9-86ae-182f45266b10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:04 np0005603623 nova_compute[226235]: 2026-01-31 07:45:04.519 226239 DEBUG oslo_concurrency.lockutils [req-04d90286-ca30-438a-a9a3-e6c74929f0bf req-16eb9b2a-d937-44d9-86ae-182f45266b10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:04 np0005603623 nova_compute[226235]: 2026-01-31 07:45:04.519 226239 DEBUG nova.compute.manager [req-04d90286-ca30-438a-a9a3-e6c74929f0bf req-16eb9b2a-d937-44d9-86ae-182f45266b10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] No waiting events found dispatching network-vif-plugged-e842d32e-dd00-462d-97bb-d28ef68b5985 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 02:45:04 np0005603623 nova_compute[226235]: 2026-01-31 07:45:04.519 226239 WARNING nova.compute.manager [req-04d90286-ca30-438a-a9a3-e6c74929f0bf req-16eb9b2a-d937-44d9-86ae-182f45266b10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Received unexpected event network-vif-plugged-e842d32e-dd00-462d-97bb-d28ef68b5985 for instance with vm_state active and task_state None.
Jan 31 02:45:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:04.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:05 np0005603623 nova_compute[226235]: 2026-01-31 07:45:05.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:05 np0005603623 nova_compute[226235]: 2026-01-31 07:45:05.952 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:06 np0005603623 nova_compute[226235]: 2026-01-31 07:45:06.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:06 np0005603623 nova_compute[226235]: 2026-01-31 07:45:06.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:06 np0005603623 nova_compute[226235]: 2026-01-31 07:45:06.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 02:45:06 np0005603623 nova_compute[226235]: 2026-01-31 07:45:06.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 02:45:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:06.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:06 np0005603623 nova_compute[226235]: 2026-01-31 07:45:06.582 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:06.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:07 np0005603623 nova_compute[226235]: 2026-01-31 07:45:07.194 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-0d6cd237-f032-42fe-a952-64ac0baa4a66" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:45:07 np0005603623 nova_compute[226235]: 2026-01-31 07:45:07.195 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-0d6cd237-f032-42fe-a952-64ac0baa4a66" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 02:45:07 np0005603623 nova_compute[226235]: 2026-01-31 07:45:07.195 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 31 02:45:07 np0005603623 nova_compute[226235]: 2026-01-31 07:45:07.195 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0d6cd237-f032-42fe-a952-64ac0baa4a66 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:45:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:08.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:08.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:45:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:10.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:45:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:10.227 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 02:45:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:10.229 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.233 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:10 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:10Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8d:01:44 10.100.0.14
Jan 31 02:45:10 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:10Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8d:01:44 10.100.0.14
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.381 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Updating instance_info_cache with network_info: [{"id": "e842d32e-dd00-462d-97bb-d28ef68b5985", "address": "fa:16:3e:ce:67:d9", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::16f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape842d32e-dd", "ovs_interfaceid": "e842d32e-dd00-462d-97bb-d28ef68b5985", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.404 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-0d6cd237-f032-42fe-a952-64ac0baa4a66" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.405 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.406 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.406 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.406 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.407 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.407 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.408 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.425 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.426 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.426 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.427 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.427 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.953 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:45:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:10.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:45:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:45:10 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/231468512' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:45:10 np0005603623 nova_compute[226235]: 2026-01-31 07:45:10.990 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.563s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.071 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000005 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.071 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000005 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.075 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.075 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.258 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.261 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4630MB free_disk=20.778430938720703GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.261 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.261 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.346 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0d6cd237-f032-42fe-a952-64ac0baa4a66 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.346 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.347 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.347 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.561 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.585 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:45:11 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1255912892' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.952 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.392s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.959 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:45:11 np0005603623 nova_compute[226235]: 2026-01-31 07:45:11.990 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:45:12 np0005603623 nova_compute[226235]: 2026-01-31 07:45:12.071 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:45:12 np0005603623 nova_compute[226235]: 2026-01-31 07:45:12.071 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:12.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:12.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:13 np0005603623 podman[230106]: 2026-01-31 07:45:13.025311272 +0000 UTC m=+0.115628477 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:45:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:14.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:45:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2506885120' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:45:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:45:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2506885120' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:45:14 np0005603623 podman[230133]: 2026-01-31 07:45:14.944263122 +0000 UTC m=+0.039739692 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 31 02:45:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:14.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:15.231 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:15 np0005603623 nova_compute[226235]: 2026-01-31 07:45:15.956 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:16.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:16 np0005603623 nova_compute[226235]: 2026-01-31 07:45:16.586 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:16.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:17Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:67:d9 10.1.0.86
Jan 31 02:45:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:17Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:67:d9 10.1.0.86
Jan 31 02:45:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:18.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:18.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:20.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:20 np0005603623 nova_compute[226235]: 2026-01-31 07:45:20.958 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:20.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:21 np0005603623 nova_compute[226235]: 2026-01-31 07:45:21.588 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:22.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:22.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:23 np0005603623 nova_compute[226235]: 2026-01-31 07:45:23.933 226239 DEBUG oslo_concurrency.lockutils [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "0d6cd237-f032-42fe-a952-64ac0baa4a66" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:23 np0005603623 nova_compute[226235]: 2026-01-31 07:45:23.934 226239 DEBUG oslo_concurrency.lockutils [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:23 np0005603623 nova_compute[226235]: 2026-01-31 07:45:23.934 226239 DEBUG oslo_concurrency.lockutils [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:23 np0005603623 nova_compute[226235]: 2026-01-31 07:45:23.934 226239 DEBUG oslo_concurrency.lockutils [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:23 np0005603623 nova_compute[226235]: 2026-01-31 07:45:23.934 226239 DEBUG oslo_concurrency.lockutils [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:23 np0005603623 nova_compute[226235]: 2026-01-31 07:45:23.936 226239 INFO nova.compute.manager [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Terminating instance#033[00m
Jan 31 02:45:23 np0005603623 nova_compute[226235]: 2026-01-31 07:45:23.938 226239 DEBUG nova.compute.manager [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:45:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:24.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:24 np0005603623 kernel: tape842d32e-dd (unregistering): left promiscuous mode
Jan 31 02:45:24 np0005603623 NetworkManager[48970]: <info>  [1769845524.2595] device (tape842d32e-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.265 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:24 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:24Z|00038|binding|INFO|Releasing lport e842d32e-dd00-462d-97bb-d28ef68b5985 from this chassis (sb_readonly=0)
Jan 31 02:45:24 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:24Z|00039|binding|INFO|Setting lport e842d32e-dd00-462d-97bb-d28ef68b5985 down in Southbound
Jan 31 02:45:24 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:24Z|00040|binding|INFO|Removing iface tape842d32e-dd ovn-installed in OVS
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.272 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.275 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:24 np0005603623 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Deactivated successfully.
Jan 31 02:45:24 np0005603623 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Consumed 13.508s CPU time.
Jan 31 02:45:24 np0005603623 systemd-machined[194379]: Machine qemu-2-instance-00000003 terminated.
Jan 31 02:45:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:24.346 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:67:d9 10.1.0.86 fdfe:381f:8400::16f'], port_security=['fa:16:3e:ce:67:d9 10.1.0.86 fdfe:381f:8400::16f'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.86/26 fdfe:381f:8400::16f/64', 'neutron:device_id': '0d6cd237-f032-42fe-a952-64ac0baa4a66', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-992dcec1-3019-47a1-a14c-defd99a80f3d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc2f6584d8b64364b13683f53c58617f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f48a740a-df16-488d-83ce-01edcece1d5f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d5eabbe-dd4d-4e48-a2ac-b48c29338142, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=e842d32e-dd00-462d-97bb-d28ef68b5985) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:45:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:24.347 143258 INFO neutron.agent.ovn.metadata.agent [-] Port e842d32e-dd00-462d-97bb-d28ef68b5985 in datapath 992dcec1-3019-47a1-a14c-defd99a80f3d unbound from our chassis
Jan 31 02:45:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:24.349 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 992dcec1-3019-47a1-a14c-defd99a80f3d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 31 02:45:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:24.350 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0594246a-e2aa-4dfa-8861-d1c8482f9159]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:24.351 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d namespace which is not needed anymore
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.359 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.363 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.373 226239 INFO nova.virt.libvirt.driver [-] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Instance destroyed successfully.
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.373 226239 DEBUG nova.objects.instance [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lazy-loading 'resources' on Instance uuid 0d6cd237-f032-42fe-a952-64ac0baa4a66 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.424 226239 DEBUG nova.virt.libvirt.vif [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1079069239-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1079069239-2',id=3,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-31T07:45:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dc2f6584d8b64364b13683f53c58617f',ramdisk_id='',reservation_id='r-jd4rofhy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-2135409609',owner_user_name='tempest-AutoAllocateNetworkTest-2135409609-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:45:02Z,user_data=None,user_id='0eb58e8663574849b17616075ce5c43e',uuid=0d6cd237-f032-42fe-a952-64ac0baa4a66,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e842d32e-dd00-462d-97bb-d28ef68b5985", "address": "fa:16:3e:ce:67:d9", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::16f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape842d32e-dd", "ovs_interfaceid": "e842d32e-dd00-462d-97bb-d28ef68b5985", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.424 226239 DEBUG nova.network.os_vif_util [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Converting VIF {"id": "e842d32e-dd00-462d-97bb-d28ef68b5985", "address": "fa:16:3e:ce:67:d9", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::16f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape842d32e-dd", "ovs_interfaceid": "e842d32e-dd00-462d-97bb-d28ef68b5985", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.425 226239 DEBUG nova.network.os_vif_util [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:67:d9,bridge_name='br-int',has_traffic_filtering=True,id=e842d32e-dd00-462d-97bb-d28ef68b5985,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape842d32e-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.425 226239 DEBUG os_vif [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:67:d9,bridge_name='br-int',has_traffic_filtering=True,id=e842d32e-dd00-462d-97bb-d28ef68b5985,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape842d32e-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.427 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.427 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape842d32e-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.428 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.431 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.433 226239 INFO os_vif [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:67:d9,bridge_name='br-int',has_traffic_filtering=True,id=e842d32e-dd00-462d-97bb-d28ef68b5985,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape842d32e-dd')
Jan 31 02:45:24 np0005603623 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[229989]: [NOTICE]   (229993) : haproxy version is 2.8.14-c23fe91
Jan 31 02:45:24 np0005603623 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[229989]: [NOTICE]   (229993) : path to executable is /usr/sbin/haproxy
Jan 31 02:45:24 np0005603623 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[229989]: [WARNING]  (229993) : Exiting Master process...
Jan 31 02:45:24 np0005603623 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[229989]: [WARNING]  (229993) : Exiting Master process...
Jan 31 02:45:24 np0005603623 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[229989]: [ALERT]    (229993) : Current worker (229995) exited with code 143 (Terminated)
Jan 31 02:45:24 np0005603623 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[229989]: [WARNING]  (229993) : All workers exited. Exiting... (0)
Jan 31 02:45:24 np0005603623 systemd[1]: libpod-dec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a.scope: Deactivated successfully.
Jan 31 02:45:24 np0005603623 podman[230200]: 2026-01-31 07:45:24.568996213 +0000 UTC m=+0.113970190 container died dec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.654 226239 DEBUG nova.compute.manager [req-2cd3a761-997f-405f-a704-948b29a79f00 req-17b3ab28-ba63-428b-9ce8-296393bd8bdc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Received event network-vif-unplugged-e842d32e-dd00-462d-97bb-d28ef68b5985 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.654 226239 DEBUG oslo_concurrency.lockutils [req-2cd3a761-997f-405f-a704-948b29a79f00 req-17b3ab28-ba63-428b-9ce8-296393bd8bdc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.654 226239 DEBUG oslo_concurrency.lockutils [req-2cd3a761-997f-405f-a704-948b29a79f00 req-17b3ab28-ba63-428b-9ce8-296393bd8bdc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.655 226239 DEBUG oslo_concurrency.lockutils [req-2cd3a761-997f-405f-a704-948b29a79f00 req-17b3ab28-ba63-428b-9ce8-296393bd8bdc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.655 226239 DEBUG nova.compute.manager [req-2cd3a761-997f-405f-a704-948b29a79f00 req-17b3ab28-ba63-428b-9ce8-296393bd8bdc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] No waiting events found dispatching network-vif-unplugged-e842d32e-dd00-462d-97bb-d28ef68b5985 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 02:45:24 np0005603623 nova_compute[226235]: 2026-01-31 07:45:24.655 226239 DEBUG nova.compute.manager [req-2cd3a761-997f-405f-a704-948b29a79f00 req-17b3ab28-ba63-428b-9ce8-296393bd8bdc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Received event network-vif-unplugged-e842d32e-dd00-462d-97bb-d28ef68b5985 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 31 02:45:24 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a-userdata-shm.mount: Deactivated successfully.
Jan 31 02:45:24 np0005603623 systemd[1]: var-lib-containers-storage-overlay-0c321601a9073ddb2d929a531fb85e2ef9579495004cf0125454a867409997ee-merged.mount: Deactivated successfully.
Jan 31 02:45:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:24 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:45:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:24.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:25 np0005603623 podman[230200]: 2026-01-31 07:45:25.211799586 +0000 UTC m=+0.756773523 container cleanup dec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 02:45:25 np0005603623 systemd[1]: libpod-conmon-dec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a.scope: Deactivated successfully.
Jan 31 02:45:25 np0005603623 podman[230244]: 2026-01-31 07:45:25.514041077 +0000 UTC m=+0.286702046 container remove dec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:45:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:25.520 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4ad81120-3bb3-4021-a34d-241ae5a525bb]: (4, ('Sat Jan 31 07:45:24 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d (dec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a)\ndec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a\nSat Jan 31 07:45:25 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d (dec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a)\ndec45b2b75ccd815a2db75749cbd2bf050f511aeab3708e0b61cdf1c33b3e72a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:25.522 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9e161d31-5cef-4649-ad1b-1785aca20296]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:25.523 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap992dcec1-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:45:25 np0005603623 nova_compute[226235]: 2026-01-31 07:45:25.524 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:25 np0005603623 kernel: tap992dcec1-30: left promiscuous mode
Jan 31 02:45:25 np0005603623 nova_compute[226235]: 2026-01-31 07:45:25.532 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:25.535 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9d680b6d-477c-4795-8cc5-b8c08af9cebf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:25.551 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7b326fef-3296-43ed-a51e-252a2b128262]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:25.552 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ef62892e-5d6b-4502-993a-815d67424dee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:25.567 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9089aa24-bdf9-46d3-8d95-6217538cd8a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464257, 'reachable_time': 44480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230258, 'error': None, 'target': 'ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:25 np0005603623 systemd[1]: run-netns-ovnmeta\x2d992dcec1\x2d3019\x2d47a1\x2da14c\x2ddefd99a80f3d.mount: Deactivated successfully.
Jan 31 02:45:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:25.575 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 31 02:45:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:25.576 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[d062f36e-f5fd-41bf-88f7-08ac919446c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:26.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:26 np0005603623 nova_compute[226235]: 2026-01-31 07:45:26.592 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:26 np0005603623 nova_compute[226235]: 2026-01-31 07:45:26.745 226239 DEBUG nova.compute.manager [req-771c1271-3829-4f2e-b1da-a14d6f36787c req-47032e8e-ac8a-4ed6-b512-2dce30ddefc2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Received event network-vif-plugged-e842d32e-dd00-462d-97bb-d28ef68b5985 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:45:26 np0005603623 nova_compute[226235]: 2026-01-31 07:45:26.746 226239 DEBUG oslo_concurrency.lockutils [req-771c1271-3829-4f2e-b1da-a14d6f36787c req-47032e8e-ac8a-4ed6-b512-2dce30ddefc2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:26 np0005603623 nova_compute[226235]: 2026-01-31 07:45:26.746 226239 DEBUG oslo_concurrency.lockutils [req-771c1271-3829-4f2e-b1da-a14d6f36787c req-47032e8e-ac8a-4ed6-b512-2dce30ddefc2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:26 np0005603623 nova_compute[226235]: 2026-01-31 07:45:26.747 226239 DEBUG oslo_concurrency.lockutils [req-771c1271-3829-4f2e-b1da-a14d6f36787c req-47032e8e-ac8a-4ed6-b512-2dce30ddefc2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:26 np0005603623 nova_compute[226235]: 2026-01-31 07:45:26.747 226239 DEBUG nova.compute.manager [req-771c1271-3829-4f2e-b1da-a14d6f36787c req-47032e8e-ac8a-4ed6-b512-2dce30ddefc2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] No waiting events found dispatching network-vif-plugged-e842d32e-dd00-462d-97bb-d28ef68b5985 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 02:45:26 np0005603623 nova_compute[226235]: 2026-01-31 07:45:26.747 226239 WARNING nova.compute.manager [req-771c1271-3829-4f2e-b1da-a14d6f36787c req-47032e8e-ac8a-4ed6-b512-2dce30ddefc2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Received unexpected event network-vif-plugged-e842d32e-dd00-462d-97bb-d28ef68b5985 for instance with vm_state active and task_state deleting.
Jan 31 02:45:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:26.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:27 np0005603623 nova_compute[226235]: 2026-01-31 07:45:27.299 226239 INFO nova.virt.libvirt.driver [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Deleting instance files /var/lib/nova/instances/0d6cd237-f032-42fe-a952-64ac0baa4a66_del
Jan 31 02:45:27 np0005603623 nova_compute[226235]: 2026-01-31 07:45:27.300 226239 INFO nova.virt.libvirt.driver [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Deletion of /var/lib/nova/instances/0d6cd237-f032-42fe-a952-64ac0baa4a66_del complete
Jan 31 02:45:27 np0005603623 nova_compute[226235]: 2026-01-31 07:45:27.347 226239 DEBUG nova.virt.libvirt.host [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 31 02:45:27 np0005603623 nova_compute[226235]: 2026-01-31 07:45:27.348 226239 INFO nova.virt.libvirt.host [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] UEFI support detected
Jan 31 02:45:27 np0005603623 nova_compute[226235]: 2026-01-31 07:45:27.350 226239 INFO nova.compute.manager [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Took 3.41 seconds to destroy the instance on the hypervisor.
Jan 31 02:45:27 np0005603623 nova_compute[226235]: 2026-01-31 07:45:27.350 226239 DEBUG oslo.service.loopingcall [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 31 02:45:27 np0005603623 nova_compute[226235]: 2026-01-31 07:45:27.350 226239 DEBUG nova.compute.manager [-] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 31 02:45:27 np0005603623 nova_compute[226235]: 2026-01-31 07:45:27.350 226239 DEBUG nova.network.neutron [-] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 31 02:45:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:27 np0005603623 nova_compute[226235]: 2026-01-31 07:45:27.976 226239 DEBUG nova.network.neutron [-] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:45:27 np0005603623 nova_compute[226235]: 2026-01-31 07:45:27.993 226239 INFO nova.compute.manager [-] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Took 0.64 seconds to deallocate network for instance.
Jan 31 02:45:28 np0005603623 nova_compute[226235]: 2026-01-31 07:45:28.155 226239 DEBUG oslo_concurrency.lockutils [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:28 np0005603623 nova_compute[226235]: 2026-01-31 07:45:28.156 226239 DEBUG oslo_concurrency.lockutils [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:28.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:28 np0005603623 nova_compute[226235]: 2026-01-31 07:45:28.223 226239 DEBUG oslo_concurrency.processutils [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:45:28 np0005603623 nova_compute[226235]: 2026-01-31 07:45:28.881 226239 DEBUG nova.compute.manager [req-95dc0295-2601-4885-8e2e-449e91b8f203 req-385285c0-efa5-481e-b0dc-070cc4e59055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Received event network-vif-deleted-e842d32e-dd00-462d-97bb-d28ef68b5985 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:45:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:28.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:45:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2555061592' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:45:29 np0005603623 nova_compute[226235]: 2026-01-31 07:45:29.393 226239 DEBUG oslo_concurrency.processutils [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:45:29 np0005603623 nova_compute[226235]: 2026-01-31 07:45:29.398 226239 DEBUG nova.compute.provider_tree [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:45:29 np0005603623 nova_compute[226235]: 2026-01-31 07:45:29.417 226239 DEBUG nova.scheduler.client.report [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:45:29 np0005603623 nova_compute[226235]: 2026-01-31 07:45:29.430 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:29 np0005603623 nova_compute[226235]: 2026-01-31 07:45:29.448 226239 DEBUG oslo_concurrency.lockutils [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:29 np0005603623 nova_compute[226235]: 2026-01-31 07:45:29.480 226239 INFO nova.scheduler.client.report [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Deleted allocations for instance 0d6cd237-f032-42fe-a952-64ac0baa4a66
Jan 31 02:45:29 np0005603623 nova_compute[226235]: 2026-01-31 07:45:29.576 226239 DEBUG oslo_concurrency.lockutils [None req-7a54833b-c8be-42fd-b955-970f47c5424a 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "0d6cd237-f032-42fe-a952-64ac0baa4a66" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:30.080 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:30.080 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:30.081 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:30.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:30 np0005603623 nova_compute[226235]: 2026-01-31 07:45:30.776 226239 DEBUG oslo_concurrency.lockutils [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:30 np0005603623 nova_compute[226235]: 2026-01-31 07:45:30.776 226239 DEBUG oslo_concurrency.lockutils [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:30 np0005603623 nova_compute[226235]: 2026-01-31 07:45:30.777 226239 DEBUG oslo_concurrency.lockutils [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:30 np0005603623 nova_compute[226235]: 2026-01-31 07:45:30.777 226239 DEBUG oslo_concurrency.lockutils [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:30 np0005603623 nova_compute[226235]: 2026-01-31 07:45:30.777 226239 DEBUG oslo_concurrency.lockutils [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:30 np0005603623 nova_compute[226235]: 2026-01-31 07:45:30.779 226239 INFO nova.compute.manager [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Terminating instance
Jan 31 02:45:30 np0005603623 nova_compute[226235]: 2026-01-31 07:45:30.781 226239 DEBUG nova.compute.manager [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 02:45:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:30.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:45:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:45:31 np0005603623 nova_compute[226235]: 2026-01-31 07:45:31.594 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:32.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:32 np0005603623 kernel: tap57df6b87-e6 (unregistering): left promiscuous mode
Jan 31 02:45:32 np0005603623 NetworkManager[48970]: <info>  [1769845532.6200] device (tap57df6b87-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:45:32 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:32Z|00041|binding|INFO|Releasing lport 57df6b87-e6f0-4399-b69f-ee772bb1f551 from this chassis (sb_readonly=0)
Jan 31 02:45:32 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:32Z|00042|binding|INFO|Setting lport 57df6b87-e6f0-4399-b69f-ee772bb1f551 down in Southbound
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.665 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:32 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:32Z|00043|binding|INFO|Removing iface tap57df6b87-e6 ovn-installed in OVS
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.669 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:32.674 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:01:44 10.100.0.14'], port_security=['fa:16:3e:8d:01:44 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cefbc155c2449f4a8fe9fd88475b366', 'neutron:revision_number': '4', 'neutron:security_group_ids': '587c819d-03ba-401e-b9cf-985ddd1bff18', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9b241bb-46b8-4aac-aa2a-6c4356b4b2d7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=57df6b87-e6f0-4399-b69f-ee772bb1f551) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 02:45:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:32.676 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 57df6b87-e6f0-4399-b69f-ee772bb1f551 in datapath dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 unbound from our chassis
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.677 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:32.680 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 31 02:45:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:32.681 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[29c35bee-3fb4-41c9-a89a-aa1875581fa4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:32.681 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 namespace which is not needed anymore
Jan 31 02:45:32 np0005603623 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000005.scope: Deactivated successfully.
Jan 31 02:45:32 np0005603623 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000005.scope: Consumed 14.281s CPU time.
Jan 31 02:45:32 np0005603623 systemd-machined[194379]: Machine qemu-1-instance-00000005 terminated.
Jan 31 02:45:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:45:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:45:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.817 226239 INFO nova.virt.libvirt.driver [-] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Instance destroyed successfully.
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.818 226239 DEBUG nova.objects.instance [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lazy-loading 'resources' on Instance uuid a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.832 226239 DEBUG nova.virt.libvirt.vif [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:44:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1597704400',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1597704400',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1597704400',id=5,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ89RiM/elI2ZqD0wPCKPg5OJ135CdqHdbE4RgDrlMkDtFUdRT49WLIZycY+7iaVAOnhx3ISEENdI3O2SmWMNgi6poXdDkUtOq9xREV6KVqBpvm5DBEGb+Du/A2bOrXUMA==',key_name='tempest-keypair-889434625',keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:44:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6cefbc155c2449f4a8fe9fd88475b366',ramdisk_id='',reservation_id='r-f46r9yis',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1634965997',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1634965997-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:44:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cddf948da8f14dc998b0b2434d23e7fc',uuid=a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "address": "fa:16:3e:8d:01:44", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57df6b87-e6", "ovs_interfaceid": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.833 226239 DEBUG nova.network.os_vif_util [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Converting VIF {"id": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "address": "fa:16:3e:8d:01:44", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57df6b87-e6", "ovs_interfaceid": "57df6b87-e6f0-4399-b69f-ee772bb1f551", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.834 226239 DEBUG nova.network.os_vif_util [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8d:01:44,bridge_name='br-int',has_traffic_filtering=True,id=57df6b87-e6f0-4399-b69f-ee772bb1f551,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57df6b87-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.835 226239 DEBUG os_vif [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8d:01:44,bridge_name='br-int',has_traffic_filtering=True,id=57df6b87-e6f0-4399-b69f-ee772bb1f551,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57df6b87-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.838 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.839 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57df6b87-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.841 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.843 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.846 226239 INFO os_vif [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8d:01:44,bridge_name='br-int',has_traffic_filtering=True,id=57df6b87-e6f0-4399-b69f-ee772bb1f551,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57df6b87-e6')
Jan 31 02:45:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.967 226239 DEBUG nova.compute.manager [req-a7493e11-93d4-4606-805a-2e65bd5c4760 req-7de39d35-a3ea-4d3e-9220-d7769ad0cd39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Received event network-vif-unplugged-57df6b87-e6f0-4399-b69f-ee772bb1f551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.968 226239 DEBUG oslo_concurrency.lockutils [req-a7493e11-93d4-4606-805a-2e65bd5c4760 req-7de39d35-a3ea-4d3e-9220-d7769ad0cd39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.969 226239 DEBUG oslo_concurrency.lockutils [req-a7493e11-93d4-4606-805a-2e65bd5c4760 req-7de39d35-a3ea-4d3e-9220-d7769ad0cd39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.970 226239 DEBUG oslo_concurrency.lockutils [req-a7493e11-93d4-4606-805a-2e65bd5c4760 req-7de39d35-a3ea-4d3e-9220-d7769ad0cd39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.970 226239 DEBUG nova.compute.manager [req-a7493e11-93d4-4606-805a-2e65bd5c4760 req-7de39d35-a3ea-4d3e-9220-d7769ad0cd39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] No waiting events found dispatching network-vif-unplugged-57df6b87-e6f0-4399-b69f-ee772bb1f551 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 02:45:32 np0005603623 nova_compute[226235]: 2026-01-31 07:45:32.971 226239 DEBUG nova.compute.manager [req-a7493e11-93d4-4606-805a-2e65bd5c4760 req-7de39d35-a3ea-4d3e-9220-d7769ad0cd39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Received event network-vif-unplugged-57df6b87-e6f0-4399-b69f-ee772bb1f551 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 31 02:45:32 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[229709]: [NOTICE]   (229713) : haproxy version is 2.8.14-c23fe91
Jan 31 02:45:32 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[229709]: [NOTICE]   (229713) : path to executable is /usr/sbin/haproxy
Jan 31 02:45:32 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[229709]: [WARNING]  (229713) : Exiting Master process...
Jan 31 02:45:32 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[229709]: [ALERT]    (229713) : Current worker (229715) exited with code 143 (Terminated)
Jan 31 02:45:32 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[229709]: [WARNING]  (229713) : All workers exited. Exiting... (0)
Jan 31 02:45:32 np0005603623 systemd[1]: libpod-eafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3.scope: Deactivated successfully.
Jan 31 02:45:32 np0005603623 podman[230493]: 2026-01-31 07:45:32.996331111 +0000 UTC m=+0.230101353 container died eafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 02:45:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:33.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.104 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "2607420d-fc87-4068-9a05-edeafb58d216" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.105 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.121 226239 DEBUG nova.compute.manager [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.184 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.185 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.191 226239 DEBUG nova.virt.hardware [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.191 226239 INFO nova.compute.claims [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:45:33 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3-userdata-shm.mount: Deactivated successfully.
Jan 31 02:45:33 np0005603623 systemd[1]: var-lib-containers-storage-overlay-b78861133363e35fb11e82b7192b3e8d0044c59f3f4e3bb3f04647d193b37740-merged.mount: Deactivated successfully.
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.327 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:33 np0005603623 podman[230493]: 2026-01-31 07:45:33.492753104 +0000 UTC m=+0.726523386 container cleanup eafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 02:45:33 np0005603623 systemd[1]: libpod-conmon-eafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3.scope: Deactivated successfully.
Jan 31 02:45:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:45:33 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3770056147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.780 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.787 226239 DEBUG nova.compute.provider_tree [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.811 226239 DEBUG nova.scheduler.client.report [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.841 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.843 226239 DEBUG nova.compute.manager [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.899 226239 DEBUG nova.compute.manager [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.900 226239 DEBUG nova.network.neutron [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.919 226239 INFO nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:45:33 np0005603623 nova_compute[226235]: 2026-01-31 07:45:33.954 226239 DEBUG nova.compute.manager [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.050 226239 DEBUG nova.compute.manager [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.052 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.053 226239 INFO nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Creating image(s)#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.093 226239 DEBUG nova.storage.rbd_utils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image 2607420d-fc87-4068-9a05-edeafb58d216_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.136 226239 DEBUG nova.storage.rbd_utils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image 2607420d-fc87-4068-9a05-edeafb58d216_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:34.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:34 np0005603623 podman[230568]: 2026-01-31 07:45:34.304324723 +0000 UTC m=+0.790357381 container remove eafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 02:45:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:34.308 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[efdddbf2-6311-4c92-8dc7-3f2d3caa969e]: (4, ('Sat Jan 31 07:45:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 (eafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3)\neafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3\nSat Jan 31 07:45:33 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 (eafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3)\neafeebdd18ff2514ec08a5b77d7ac8451e3ce8333abb7c3653649a6eaa2bd9c3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:34.311 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7364c062-4e47-4eb3-b119-c2a0614e20e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:34.313 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdad80b1a-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:34 np0005603623 kernel: tapdad80b1a-40: left promiscuous mode
Jan 31 02:45:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:34.325 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7de19b07-64a0-4a28-b270-d33804694ef5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:34.341 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[699257d1-92a0-46e6-b6a1-75303994ed9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:34.342 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[69d8bfc1-26e1-4bd1-b476-d7410f2be61d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:34.359 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9b17d510-8eb8-4f7e-8d67-e3672f443cae]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463451, 'reachable_time': 38665, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230637, 'error': None, 'target': 'ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:34.362 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:45:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:34.362 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[f72cd615-db14-4b0d-9c4b-1ab2ec397ebc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:34 np0005603623 systemd[1]: run-netns-ovnmeta\x2ddad80b1a\x2d444d\x2d4c9c\x2d97e5\x2df4375a4ed8d6.mount: Deactivated successfully.
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.550 226239 DEBUG nova.storage.rbd_utils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image 2607420d-fc87-4068-9a05-edeafb58d216_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.554 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.572 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.578 226239 DEBUG nova.policy [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cddf948da8f14dc998b0b2434d23e7fc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6cefbc155c2449f4a8fe9fd88475b366', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.637 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.638 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.639 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.640 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.863 226239 DEBUG nova.storage.rbd_utils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image 2607420d-fc87-4068-9a05-edeafb58d216_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.868 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 2607420d-fc87-4068-9a05-edeafb58d216_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.977 226239 DEBUG nova.compute.manager [req-0452c681-3d68-41d4-9e02-28456757725e req-038eac39-946b-45ec-861e-1a2db4505c09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Received event network-vif-plugged-57df6b87-e6f0-4399-b69f-ee772bb1f551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.977 226239 DEBUG oslo_concurrency.lockutils [req-0452c681-3d68-41d4-9e02-28456757725e req-038eac39-946b-45ec-861e-1a2db4505c09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.978 226239 DEBUG oslo_concurrency.lockutils [req-0452c681-3d68-41d4-9e02-28456757725e req-038eac39-946b-45ec-861e-1a2db4505c09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.978 226239 DEBUG oslo_concurrency.lockutils [req-0452c681-3d68-41d4-9e02-28456757725e req-038eac39-946b-45ec-861e-1a2db4505c09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.979 226239 DEBUG nova.compute.manager [req-0452c681-3d68-41d4-9e02-28456757725e req-038eac39-946b-45ec-861e-1a2db4505c09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] No waiting events found dispatching network-vif-plugged-57df6b87-e6f0-4399-b69f-ee772bb1f551 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:45:34 np0005603623 nova_compute[226235]: 2026-01-31 07:45:34.979 226239 WARNING nova.compute.manager [req-0452c681-3d68-41d4-9e02-28456757725e req-038eac39-946b-45ec-861e-1a2db4505c09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Received unexpected event network-vif-plugged-57df6b87-e6f0-4399-b69f-ee772bb1f551 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:45:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:35.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:36.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:36 np0005603623 nova_compute[226235]: 2026-01-31 07:45:36.598 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:36 np0005603623 nova_compute[226235]: 2026-01-31 07:45:36.997 226239 DEBUG nova.network.neutron [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Successfully created port: 29abd150-9de3-49f3-80a2-85432282ba29 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:45:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:37.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:37 np0005603623 nova_compute[226235]: 2026-01-31 07:45:37.841 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:38 np0005603623 nova_compute[226235]: 2026-01-31 07:45:38.146 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 2607420d-fc87-4068-9a05-edeafb58d216_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.278s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:38.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:38 np0005603623 nova_compute[226235]: 2026-01-31 07:45:38.232 226239 DEBUG nova.storage.rbd_utils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] resizing rbd image 2607420d-fc87-4068-9a05-edeafb58d216_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:45:38 np0005603623 nova_compute[226235]: 2026-01-31 07:45:38.738 226239 DEBUG nova.network.neutron [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Successfully updated port: 29abd150-9de3-49f3-80a2-85432282ba29 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:45:38 np0005603623 nova_compute[226235]: 2026-01-31 07:45:38.740 226239 DEBUG nova.compute.manager [req-6e1f05ab-29da-4b31-8127-76485ba6c252 req-2f0764ae-0d72-4a41-a53d-3fc9ce56b876 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Received event network-changed-29abd150-9de3-49f3-80a2-85432282ba29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:45:38 np0005603623 nova_compute[226235]: 2026-01-31 07:45:38.740 226239 DEBUG nova.compute.manager [req-6e1f05ab-29da-4b31-8127-76485ba6c252 req-2f0764ae-0d72-4a41-a53d-3fc9ce56b876 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Refreshing instance network info cache due to event network-changed-29abd150-9de3-49f3-80a2-85432282ba29. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:45:38 np0005603623 nova_compute[226235]: 2026-01-31 07:45:38.740 226239 DEBUG oslo_concurrency.lockutils [req-6e1f05ab-29da-4b31-8127-76485ba6c252 req-2f0764ae-0d72-4a41-a53d-3fc9ce56b876 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-2607420d-fc87-4068-9a05-edeafb58d216" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:45:38 np0005603623 nova_compute[226235]: 2026-01-31 07:45:38.740 226239 DEBUG oslo_concurrency.lockutils [req-6e1f05ab-29da-4b31-8127-76485ba6c252 req-2f0764ae-0d72-4a41-a53d-3fc9ce56b876 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-2607420d-fc87-4068-9a05-edeafb58d216" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:45:38 np0005603623 nova_compute[226235]: 2026-01-31 07:45:38.740 226239 DEBUG nova.network.neutron [req-6e1f05ab-29da-4b31-8127-76485ba6c252 req-2f0764ae-0d72-4a41-a53d-3fc9ce56b876 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Refreshing network info cache for port 29abd150-9de3-49f3-80a2-85432282ba29 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:45:38 np0005603623 nova_compute[226235]: 2026-01-31 07:45:38.770 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "refresh_cache-2607420d-fc87-4068-9a05-edeafb58d216" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:45:38 np0005603623 nova_compute[226235]: 2026-01-31 07:45:38.948 226239 DEBUG nova.network.neutron [req-6e1f05ab-29da-4b31-8127-76485ba6c252 req-2f0764ae-0d72-4a41-a53d-3fc9ce56b876 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:45:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:39.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:39 np0005603623 nova_compute[226235]: 2026-01-31 07:45:39.187 226239 DEBUG nova.network.neutron [req-6e1f05ab-29da-4b31-8127-76485ba6c252 req-2f0764ae-0d72-4a41-a53d-3fc9ce56b876 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:45:39 np0005603623 nova_compute[226235]: 2026-01-31 07:45:39.204 226239 DEBUG oslo_concurrency.lockutils [req-6e1f05ab-29da-4b31-8127-76485ba6c252 req-2f0764ae-0d72-4a41-a53d-3fc9ce56b876 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-2607420d-fc87-4068-9a05-edeafb58d216" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:45:39 np0005603623 nova_compute[226235]: 2026-01-31 07:45:39.205 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquired lock "refresh_cache-2607420d-fc87-4068-9a05-edeafb58d216" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:45:39 np0005603623 nova_compute[226235]: 2026-01-31 07:45:39.206 226239 DEBUG nova.network.neutron [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:45:39 np0005603623 nova_compute[226235]: 2026-01-31 07:45:39.338 226239 DEBUG nova.network.neutron [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:45:39 np0005603623 nova_compute[226235]: 2026-01-31 07:45:39.370 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845524.3693917, 0d6cd237-f032-42fe-a952-64ac0baa4a66 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:45:39 np0005603623 nova_compute[226235]: 2026-01-31 07:45:39.371 226239 INFO nova.compute.manager [-] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:45:39 np0005603623 nova_compute[226235]: 2026-01-31 07:45:39.392 226239 DEBUG nova.compute.manager [None req-d3c25678-8a84-4d56-aa2d-a5c80d8db3ab - - - - - -] [instance: 0d6cd237-f032-42fe-a952-64ac0baa4a66] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:45:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:40.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:40 np0005603623 nova_compute[226235]: 2026-01-31 07:45:40.731 226239 DEBUG nova.objects.instance [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lazy-loading 'migration_context' on Instance uuid 2607420d-fc87-4068-9a05-edeafb58d216 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:45:40 np0005603623 nova_compute[226235]: 2026-01-31 07:45:40.773 226239 DEBUG nova.storage.rbd_utils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image 2607420d-fc87-4068-9a05-edeafb58d216_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:40 np0005603623 nova_compute[226235]: 2026-01-31 07:45:40.799 226239 DEBUG nova.storage.rbd_utils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image 2607420d-fc87-4068-9a05-edeafb58d216_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:40 np0005603623 nova_compute[226235]: 2026-01-31 07:45:40.803 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:40 np0005603623 nova_compute[226235]: 2026-01-31 07:45:40.804 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:40 np0005603623 nova_compute[226235]: 2026-01-31 07:45:40.805 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:40 np0005603623 nova_compute[226235]: 2026-01-31 07:45:40.823 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:40 np0005603623 nova_compute[226235]: 2026-01-31 07:45:40.824 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:40 np0005603623 nova_compute[226235]: 2026-01-31 07:45:40.926 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:40 np0005603623 nova_compute[226235]: 2026-01-31 07:45:40.928 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:40 np0005603623 nova_compute[226235]: 2026-01-31 07:45:40.964 226239 DEBUG nova.storage.rbd_utils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image 2607420d-fc87-4068-9a05-edeafb58d216_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:40 np0005603623 nova_compute[226235]: 2026-01-31 07:45:40.968 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/ephemeral_1_0706d66 2607420d-fc87-4068-9a05-edeafb58d216_disk.eph0 --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:41.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:41 np0005603623 nova_compute[226235]: 2026-01-31 07:45:41.427 226239 DEBUG nova.network.neutron [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Updating instance_info_cache with network_info: [{"id": "29abd150-9de3-49f3-80a2-85432282ba29", "address": "fa:16:3e:3f:a2:d5", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29abd150-9d", "ovs_interfaceid": "29abd150-9de3-49f3-80a2-85432282ba29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:45:41 np0005603623 nova_compute[226235]: 2026-01-31 07:45:41.563 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Releasing lock "refresh_cache-2607420d-fc87-4068-9a05-edeafb58d216" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:45:41 np0005603623 nova_compute[226235]: 2026-01-31 07:45:41.563 226239 DEBUG nova.compute.manager [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Instance network_info: |[{"id": "29abd150-9de3-49f3-80a2-85432282ba29", "address": "fa:16:3e:3f:a2:d5", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29abd150-9d", "ovs_interfaceid": "29abd150-9de3-49f3-80a2-85432282ba29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:45:41 np0005603623 nova_compute[226235]: 2026-01-31 07:45:41.598 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:42.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.277 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/ephemeral_1_0706d66 2607420d-fc87-4068-9a05-edeafb58d216_disk.eph0 --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.309s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.387 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.388 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Ensure instance console log exists: /var/lib/nova/instances/2607420d-fc87-4068-9a05-edeafb58d216/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.389 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.389 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.390 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.393 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Start _get_guest_xml network_info=[{"id": "29abd150-9de3-49f3-80a2-85432282ba29", "address": "fa:16:3e:3f:a2:d5", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29abd150-9d", "ovs_interfaceid": "29abd150-9de3-49f3-80a2-85432282ba29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [{'guest_format': None, 'device_name': '/dev/vdb', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 1, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.396 226239 WARNING nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.415 226239 DEBUG nova.virt.libvirt.host [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.416 226239 DEBUG nova.virt.libvirt.host [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.423 226239 DEBUG nova.virt.libvirt.host [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.423 226239 DEBUG nova.virt.libvirt.host [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.424 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.425 226239 DEBUG nova.virt.hardware [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:44:28Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={hw_rng:allowed='True'},flavorid='1888648442',id=4,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_1-218485872',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.425 226239 DEBUG nova.virt.hardware [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.425 226239 DEBUG nova.virt.hardware [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.425 226239 DEBUG nova.virt.hardware [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.426 226239 DEBUG nova.virt.hardware [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.426 226239 DEBUG nova.virt.hardware [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.426 226239 DEBUG nova.virt.hardware [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.426 226239 DEBUG nova.virt.hardware [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.427 226239 DEBUG nova.virt.hardware [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.427 226239 DEBUG nova.virt.hardware [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.427 226239 DEBUG nova.virt.hardware [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.445 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.843 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:45:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3812092801' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.911 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:42 np0005603623 nova_compute[226235]: 2026-01-31 07:45:42.912 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:43.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:45:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3933859774' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:45:43 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.334 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:43 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.364 226239 DEBUG nova.storage.rbd_utils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image 2607420d-fc87-4068-9a05-edeafb58d216_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:43 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.368 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:45:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2605566876' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:45:43 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.973 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.605s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:43 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.975 226239 DEBUG nova.virt.libvirt.vif [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:45:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1471853240',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1471853240',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(4),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1471853240',id=9,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=4,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ89RiM/elI2ZqD0wPCKPg5OJ135CdqHdbE4RgDrlMkDtFUdRT49WLIZycY+7iaVAOnhx3ISEENdI3O2SmWMNgi6poXdDkUtOq9xREV6KVqBpvm5DBEGb+Du/A2bOrXUMA==',key_name='tempest-keypair-889434625',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6cefbc155c2449f4a8fe9fd88475b366',ramdisk_id='',reservation_id='r-pkqkht2r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1634965997',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1634965997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:45:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cddf948da8f14dc998b0b2434d23e7fc',uuid=2607420d-fc87-4068-9a05-edeafb58d216,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29abd150-9de3-49f3-80a2-85432282ba29", "address": "fa:16:3e:3f:a2:d5", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29abd150-9d", "ovs_interfaceid": "29abd150-9de3-49f3-80a2-85432282ba29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:45:43 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.975 226239 DEBUG nova.network.os_vif_util [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Converting VIF {"id": "29abd150-9de3-49f3-80a2-85432282ba29", "address": "fa:16:3e:3f:a2:d5", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29abd150-9d", "ovs_interfaceid": "29abd150-9de3-49f3-80a2-85432282ba29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:45:43 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.976 226239 DEBUG nova.network.os_vif_util [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:a2:d5,bridge_name='br-int',has_traffic_filtering=True,id=29abd150-9de3-49f3-80a2-85432282ba29,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29abd150-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:45:43 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.977 226239 DEBUG nova.objects.instance [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2607420d-fc87-4068-9a05-edeafb58d216 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:45:43 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.994 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  <uuid>2607420d-fc87-4068-9a05-edeafb58d216</uuid>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  <name>instance-00000009</name>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-1471853240</nova:name>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:45:42</nova:creationTime>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <nova:flavor name="tempest-flavor_with_ephemeral_1-218485872">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <nova:ephemeral>1</nova:ephemeral>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <nova:user uuid="cddf948da8f14dc998b0b2434d23e7fc">tempest-ServersWithSpecificFlavorTestJSON-1634965997-project-member</nova:user>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <nova:project uuid="6cefbc155c2449f4a8fe9fd88475b366">tempest-ServersWithSpecificFlavorTestJSON-1634965997</nova:project>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <nova:port uuid="29abd150-9de3-49f3-80a2-85432282ba29">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <entry name="serial">2607420d-fc87-4068-9a05-edeafb58d216</entry>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <entry name="uuid">2607420d-fc87-4068-9a05-edeafb58d216</entry>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/2607420d-fc87-4068-9a05-edeafb58d216_disk">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/2607420d-fc87-4068-9a05-edeafb58d216_disk.eph0">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <target dev="vdb" bus="virtio"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/2607420d-fc87-4068-9a05-edeafb58d216_disk.config">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:3f:a2:d5"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <target dev="tap29abd150-9d"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/2607420d-fc87-4068-9a05-edeafb58d216/console.log" append="off"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:45:43 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:45:43 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:45:43 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:45:44 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.995 226239 DEBUG nova.compute.manager [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Preparing to wait for external event network-vif-plugged-29abd150-9de3-49f3-80a2-85432282ba29 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.996 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "2607420d-fc87-4068-9a05-edeafb58d216-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.996 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.996 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.997 226239 DEBUG nova.virt.libvirt.vif [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:45:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1471853240',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1471853240',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(4),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1471853240',id=9,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=4,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ89RiM/elI2ZqD0wPCKPg5OJ135CdqHdbE4RgDrlMkDtFUdRT49WLIZycY+7iaVAOnhx3ISEENdI3O2SmWMNgi6poXdDkUtOq9xREV6KVqBpvm5DBEGb+Du/A2bOrXUMA==',key_name='tempest-keypair-889434625',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6cefbc155c2449f4a8fe9fd88475b366',ramdisk_id='',reservation_id='r-pkqkht2r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1634965997',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1634965997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:45:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cddf948da8f14dc998b0b2434d23e7fc',uuid=2607420d-fc87-4068-9a05-edeafb58d216,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29abd150-9de3-49f3-80a2-85432282ba29", "address": "fa:16:3e:3f:a2:d5", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29abd150-9d", "ovs_interfaceid": "29abd150-9de3-49f3-80a2-85432282ba29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.997 226239 DEBUG nova.network.os_vif_util [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Converting VIF {"id": "29abd150-9de3-49f3-80a2-85432282ba29", "address": "fa:16:3e:3f:a2:d5", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29abd150-9d", "ovs_interfaceid": "29abd150-9de3-49f3-80a2-85432282ba29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.998 226239 DEBUG nova.network.os_vif_util [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:a2:d5,bridge_name='br-int',has_traffic_filtering=True,id=29abd150-9de3-49f3-80a2-85432282ba29,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29abd150-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.999 226239 DEBUG os_vif [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:a2:d5,bridge_name='br-int',has_traffic_filtering=True,id=29abd150-9de3-49f3-80a2-85432282ba29,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29abd150-9d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:43.999 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.000 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.000 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.003 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.003 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29abd150-9d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.003 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap29abd150-9d, col_values=(('external_ids', {'iface-id': '29abd150-9de3-49f3-80a2-85432282ba29', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3f:a2:d5', 'vm-uuid': '2607420d-fc87-4068-9a05-edeafb58d216'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.005 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:44 np0005603623 NetworkManager[48970]: <info>  [1769845544.0061] manager: (tap29abd150-9d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.007 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.010 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.011 226239 INFO os_vif [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:a2:d5,bridge_name='br-int',has_traffic_filtering=True,id=29abd150-9de3-49f3-80a2-85432282ba29,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29abd150-9d')#033[00m
Jan 31 02:45:44 np0005603623 podman[230973]: 2026-01-31 07:45:44.016602612 +0000 UTC m=+0.103494749 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.068 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.068 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.068 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.069 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] No VIF found with MAC fa:16:3e:3f:a2:d5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.069 226239 INFO nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Using config drive#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.097 226239 DEBUG nova.storage.rbd_utils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image 2607420d-fc87-4068-9a05-edeafb58d216_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:44.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.369 226239 INFO nova.virt.libvirt.driver [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Deleting instance files /var/lib/nova/instances/a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_del#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.370 226239 INFO nova.virt.libvirt.driver [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Deletion of /var/lib/nova/instances/a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd_del complete#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.451 226239 INFO nova.compute.manager [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Took 13.67 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.452 226239 DEBUG oslo.service.loopingcall [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.452 226239 DEBUG nova.compute.manager [-] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:45:44 np0005603623 nova_compute[226235]: 2026-01-31 07:45:44.453 226239 DEBUG nova.network.neutron [-] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:45:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:45.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:45 np0005603623 nova_compute[226235]: 2026-01-31 07:45:45.225 226239 INFO nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Creating config drive at /var/lib/nova/instances/2607420d-fc87-4068-9a05-edeafb58d216/disk.config#033[00m
Jan 31 02:45:45 np0005603623 nova_compute[226235]: 2026-01-31 07:45:45.231 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2607420d-fc87-4068-9a05-edeafb58d216/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6jizq7vt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:45 np0005603623 nova_compute[226235]: 2026-01-31 07:45:45.347 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2607420d-fc87-4068-9a05-edeafb58d216/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6jizq7vt" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:45 np0005603623 nova_compute[226235]: 2026-01-31 07:45:45.382 226239 DEBUG nova.storage.rbd_utils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] rbd image 2607420d-fc87-4068-9a05-edeafb58d216_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:45 np0005603623 nova_compute[226235]: 2026-01-31 07:45:45.386 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2607420d-fc87-4068-9a05-edeafb58d216/disk.config 2607420d-fc87-4068-9a05-edeafb58d216_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:45 np0005603623 podman[231062]: 2026-01-31 07:45:45.95016372 +0000 UTC m=+0.044329418 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.036 226239 DEBUG nova.network.neutron [-] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.065 226239 INFO nova.compute.manager [-] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Took 1.61 seconds to deallocate network for instance.#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.142 226239 DEBUG oslo_concurrency.lockutils [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.142 226239 DEBUG oslo_concurrency.lockutils [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:46.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.239 226239 DEBUG oslo_concurrency.processutils [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.275 226239 DEBUG nova.compute.manager [req-5f1163d4-2bde-421a-b917-63b4755a8225 req-a83815ea-5df4-4c65-a20f-7aca028cb32c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Received event network-vif-deleted-57df6b87-e6f0-4399-b69f-ee772bb1f551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.527 226239 DEBUG oslo_concurrency.processutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2607420d-fc87-4068-9a05-edeafb58d216/disk.config 2607420d-fc87-4068-9a05-edeafb58d216_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.527 226239 INFO nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Deleting local config drive /var/lib/nova/instances/2607420d-fc87-4068-9a05-edeafb58d216/disk.config because it was imported into RBD.#033[00m
Jan 31 02:45:46 np0005603623 NetworkManager[48970]: <info>  [1769845546.5606] manager: (tap29abd150-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Jan 31 02:45:46 np0005603623 kernel: tap29abd150-9d: entered promiscuous mode
Jan 31 02:45:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:46Z|00044|binding|INFO|Claiming lport 29abd150-9de3-49f3-80a2-85432282ba29 for this chassis.
Jan 31 02:45:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:46Z|00045|binding|INFO|29abd150-9de3-49f3-80a2-85432282ba29: Claiming fa:16:3e:3f:a2:d5 10.100.0.11
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.567 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:46Z|00046|binding|INFO|Setting lport 29abd150-9de3-49f3-80a2-85432282ba29 ovn-installed in OVS
Jan 31 02:45:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:46Z|00047|binding|INFO|Setting lport 29abd150-9de3-49f3-80a2-85432282ba29 up in Southbound
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.571 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:a2:d5 10.100.0.11'], port_security=['fa:16:3e:3f:a2:d5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '2607420d-fc87-4068-9a05-edeafb58d216', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cefbc155c2449f4a8fe9fd88475b366', 'neutron:revision_number': '2', 'neutron:security_group_ids': '587c819d-03ba-401e-b9cf-985ddd1bff18', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9b241bb-46b8-4aac-aa2a-6c4356b4b2d7, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=29abd150-9de3-49f3-80a2-85432282ba29) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.572 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 29abd150-9de3-49f3-80a2-85432282ba29 in datapath dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 bound to our chassis#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.573 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dad80b1a-444d-4c9c-97e5-f4375a4ed8d6#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.574 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.581 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7a857748-74d5-4ea0-b29f-27597d817e44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.582 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdad80b1a-41 in ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:45:46 np0005603623 systemd-machined[194379]: New machine qemu-3-instance-00000009.
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.584 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdad80b1a-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.584 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dca52b33-e257-4aff-a1be-98a724e8f064]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.585 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8af07650-c05e-46f9-97e1-6a8277d00e8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.595 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[5bbb4191-d5fd-4de8-9100-20d84c44e763]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 systemd[1]: Started Virtual Machine qemu-3-instance-00000009.
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.599 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:46 np0005603623 systemd-udevd[231122]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.617 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc0a306-2f68-4019-9608-eac95d8fec2f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 NetworkManager[48970]: <info>  [1769845546.6276] device (tap29abd150-9d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:45:46 np0005603623 NetworkManager[48970]: <info>  [1769845546.6280] device (tap29abd150-9d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.637 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[365d3b5c-7860-4abe-874e-1c68448da87e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.641 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[02badd86-7dda-484a-a433-24c5c475c722]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 systemd-udevd[231127]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:45:46 np0005603623 NetworkManager[48970]: <info>  [1769845546.6421] manager: (tapdad80b1a-40): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.660 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ed8cf02d-ce3a-4b91-b823-f8b5c35e9af5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.663 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0c231e-cfd6-4399-b732-5cdb53482f5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 NetworkManager[48970]: <info>  [1769845546.6785] device (tapdad80b1a-40): carrier: link connected
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.683 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f2ebcb7a-b4fb-4ade-b46d-c759a3c1535d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.696 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[22af8399-8a8d-4d24-9027-fdac00cd07e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdad80b1a-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:d8:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468710, 'reachable_time': 44435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231148, 'error': None, 'target': 'ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:45:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/879757586' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.707 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c91e2937-f808-4636-aefe-faba17d0b554]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:d870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468710, 'tstamp': 468710}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231150, 'error': None, 'target': 'ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.719 226239 DEBUG oslo_concurrency.processutils [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.719 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6c411b10-dbe7-434a-9558-403a3b61bb9f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdad80b1a-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:d8:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468710, 'reachable_time': 44435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231152, 'error': None, 'target': 'ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.723 226239 DEBUG nova.compute.provider_tree [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.740 226239 DEBUG nova.scheduler.client.report [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.742 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[db2a3a1a-9000-47ed-8a65-058e87c1579f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.758 226239 DEBUG oslo_concurrency.lockutils [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.782 226239 INFO nova.scheduler.client.report [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Deleted allocations for instance a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.782 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a3744af4-1ab1-40fe-be4e-d06a545955de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.783 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdad80b1a-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.783 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.784 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdad80b1a-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.785 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:46 np0005603623 kernel: tapdad80b1a-40: entered promiscuous mode
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.787 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:46 np0005603623 NetworkManager[48970]: <info>  [1769845546.7872] manager: (tapdad80b1a-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.792 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdad80b1a-40, col_values=(('external_ids', {'iface-id': 'ff1de611-ad43-4ac1-9afc-1469794e5ef0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.793 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:46Z|00048|binding|INFO|Releasing lport ff1de611-ad43-4ac1-9afc-1469794e5ef0 from this chassis (sb_readonly=0)
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.794 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dad80b1a-444d-4c9c-97e5-f4375a4ed8d6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dad80b1a-444d-4c9c-97e5-f4375a4ed8d6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.795 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7d6552a3-c579-4a85-bd89-63bbd0386854]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.795 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/dad80b1a-444d-4c9c-97e5-f4375a4ed8d6.pid.haproxy
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID dad80b1a-444d-4c9c-97e5-f4375a4ed8d6
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:45:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:45:46.796 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'env', 'PROCESS_TAG=haproxy-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dad80b1a-444d-4c9c-97e5-f4375a4ed8d6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.801 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:46 np0005603623 nova_compute[226235]: 2026-01-31 07:45:46.948 226239 DEBUG oslo_concurrency.lockutils [None req-dac71bce-665c-4781-afea-f393ae7ee54a cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 16.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:47.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:47 np0005603623 podman[231253]: 2026-01-31 07:45:47.093651978 +0000 UTC m=+0.030811684 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:45:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:45:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:45:47 np0005603623 podman[231253]: 2026-01-31 07:45:47.258524644 +0000 UTC m=+0.195684260 container create 72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 02:45:47 np0005603623 systemd[1]: Started libpod-conmon-72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f.scope.
Jan 31 02:45:47 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:45:47 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2919815f844d7739f4bec13767861c949e7e136919287d6c424fe44a637189f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:45:47 np0005603623 podman[231253]: 2026-01-31 07:45:47.507440573 +0000 UTC m=+0.444600249 container init 72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 02:45:47 np0005603623 podman[231253]: 2026-01-31 07:45:47.513352294 +0000 UTC m=+0.450511910 container start 72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 02:45:47 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[231307]: [NOTICE]   (231315) : New worker (231317) forked
Jan 31 02:45:47 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[231307]: [NOTICE]   (231315) : Loading success.
Jan 31 02:45:47 np0005603623 nova_compute[226235]: 2026-01-31 07:45:47.578 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845547.577374, 2607420d-fc87-4068-9a05-edeafb58d216 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:45:47 np0005603623 nova_compute[226235]: 2026-01-31 07:45:47.578 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] VM Started (Lifecycle Event)#033[00m
Jan 31 02:45:47 np0005603623 nova_compute[226235]: 2026-01-31 07:45:47.671 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:45:47 np0005603623 nova_compute[226235]: 2026-01-31 07:45:47.675 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845547.5785406, 2607420d-fc87-4068-9a05-edeafb58d216 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:45:47 np0005603623 nova_compute[226235]: 2026-01-31 07:45:47.675 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:45:47 np0005603623 nova_compute[226235]: 2026-01-31 07:45:47.816 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845532.8158767, a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:45:47 np0005603623 nova_compute[226235]: 2026-01-31 07:45:47.817 226239 INFO nova.compute.manager [-] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:45:47 np0005603623 nova_compute[226235]: 2026-01-31 07:45:47.840 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:45:47 np0005603623 nova_compute[226235]: 2026-01-31 07:45:47.845 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:45:47 np0005603623 nova_compute[226235]: 2026-01-31 07:45:47.859 226239 DEBUG nova.compute.manager [None req-cdd80043-1ec4-4afc-8b99-79e3a4b3a3ef - - - - - -] [instance: a61ef754-a5b0-4364-8ef1-a3bd7d51a2bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:45:47 np0005603623 nova_compute[226235]: 2026-01-31 07:45:47.945 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:45:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:48.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.865 226239 DEBUG nova.compute.manager [req-c685966e-5121-47c0-9e2b-0fba943984d1 req-1e90bb5f-34f9-4146-88d5-41260a1a63a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Received event network-vif-plugged-29abd150-9de3-49f3-80a2-85432282ba29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.866 226239 DEBUG oslo_concurrency.lockutils [req-c685966e-5121-47c0-9e2b-0fba943984d1 req-1e90bb5f-34f9-4146-88d5-41260a1a63a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2607420d-fc87-4068-9a05-edeafb58d216-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.866 226239 DEBUG oslo_concurrency.lockutils [req-c685966e-5121-47c0-9e2b-0fba943984d1 req-1e90bb5f-34f9-4146-88d5-41260a1a63a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.867 226239 DEBUG oslo_concurrency.lockutils [req-c685966e-5121-47c0-9e2b-0fba943984d1 req-1e90bb5f-34f9-4146-88d5-41260a1a63a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.867 226239 DEBUG nova.compute.manager [req-c685966e-5121-47c0-9e2b-0fba943984d1 req-1e90bb5f-34f9-4146-88d5-41260a1a63a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Processing event network-vif-plugged-29abd150-9de3-49f3-80a2-85432282ba29 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.867 226239 DEBUG nova.compute.manager [req-c685966e-5121-47c0-9e2b-0fba943984d1 req-1e90bb5f-34f9-4146-88d5-41260a1a63a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Received event network-vif-plugged-29abd150-9de3-49f3-80a2-85432282ba29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.867 226239 DEBUG oslo_concurrency.lockutils [req-c685966e-5121-47c0-9e2b-0fba943984d1 req-1e90bb5f-34f9-4146-88d5-41260a1a63a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2607420d-fc87-4068-9a05-edeafb58d216-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.867 226239 DEBUG oslo_concurrency.lockutils [req-c685966e-5121-47c0-9e2b-0fba943984d1 req-1e90bb5f-34f9-4146-88d5-41260a1a63a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.868 226239 DEBUG oslo_concurrency.lockutils [req-c685966e-5121-47c0-9e2b-0fba943984d1 req-1e90bb5f-34f9-4146-88d5-41260a1a63a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.868 226239 DEBUG nova.compute.manager [req-c685966e-5121-47c0-9e2b-0fba943984d1 req-1e90bb5f-34f9-4146-88d5-41260a1a63a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] No waiting events found dispatching network-vif-plugged-29abd150-9de3-49f3-80a2-85432282ba29 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.868 226239 WARNING nova.compute.manager [req-c685966e-5121-47c0-9e2b-0fba943984d1 req-1e90bb5f-34f9-4146-88d5-41260a1a63a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Received unexpected event network-vif-plugged-29abd150-9de3-49f3-80a2-85432282ba29 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.869 226239 DEBUG nova.compute.manager [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.873 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845548.8726852, 2607420d-fc87-4068-9a05-edeafb58d216 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.873 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.875 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.880 226239 INFO nova.virt.libvirt.driver [-] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Instance spawned successfully.#033[00m
Jan 31 02:45:48 np0005603623 nova_compute[226235]: 2026-01-31 07:45:48.880 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.003 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.006 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.008 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.009 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.009 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.010 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.010 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.010 226239 DEBUG nova.virt.libvirt.driver [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.013 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:45:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:49.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.141 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.262 226239 INFO nova.compute.manager [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Took 15.21 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.262 226239 DEBUG nova.compute.manager [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.336 226239 INFO nova.compute.manager [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Took 16.18 seconds to build instance.#033[00m
Jan 31 02:45:49 np0005603623 nova_compute[226235]: 2026-01-31 07:45:49.353 226239 DEBUG oslo_concurrency.lockutils [None req-94ad4302-cd0a-468c-82a6-e6fccf5f097e cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:50.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:51.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:51 np0005603623 nova_compute[226235]: 2026-01-31 07:45:51.602 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:52.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:52 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:52Z|00049|binding|INFO|Releasing lport ff1de611-ad43-4ac1-9afc-1469794e5ef0 from this chassis (sb_readonly=0)
Jan 31 02:45:52 np0005603623 nova_compute[226235]: 2026-01-31 07:45:52.857 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:52 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:52Z|00050|binding|INFO|Releasing lport ff1de611-ad43-4ac1-9afc-1469794e5ef0 from this chassis (sb_readonly=0)
Jan 31 02:45:52 np0005603623 nova_compute[226235]: 2026-01-31 07:45:52.974 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:53.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:53 np0005603623 nova_compute[226235]: 2026-01-31 07:45:53.313 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:53 np0005603623 NetworkManager[48970]: <info>  [1769845553.3145] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 31 02:45:53 np0005603623 NetworkManager[48970]: <info>  [1769845553.3164] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 31 02:45:53 np0005603623 nova_compute[226235]: 2026-01-31 07:45:53.381 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:53 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:53Z|00051|binding|INFO|Releasing lport ff1de611-ad43-4ac1-9afc-1469794e5ef0 from this chassis (sb_readonly=0)
Jan 31 02:45:53 np0005603623 nova_compute[226235]: 2026-01-31 07:45:53.396 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:53 np0005603623 nova_compute[226235]: 2026-01-31 07:45:53.873 226239 DEBUG nova.compute.manager [req-1c00dc2a-a5b8-4d4f-8e1d-f6abaaac6591 req-846a5c60-e8b1-4541-b478-c8b06d92049f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Received event network-changed-29abd150-9de3-49f3-80a2-85432282ba29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:45:53 np0005603623 nova_compute[226235]: 2026-01-31 07:45:53.874 226239 DEBUG nova.compute.manager [req-1c00dc2a-a5b8-4d4f-8e1d-f6abaaac6591 req-846a5c60-e8b1-4541-b478-c8b06d92049f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Refreshing instance network info cache due to event network-changed-29abd150-9de3-49f3-80a2-85432282ba29. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:45:53 np0005603623 nova_compute[226235]: 2026-01-31 07:45:53.875 226239 DEBUG oslo_concurrency.lockutils [req-1c00dc2a-a5b8-4d4f-8e1d-f6abaaac6591 req-846a5c60-e8b1-4541-b478-c8b06d92049f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-2607420d-fc87-4068-9a05-edeafb58d216" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:45:53 np0005603623 nova_compute[226235]: 2026-01-31 07:45:53.875 226239 DEBUG oslo_concurrency.lockutils [req-1c00dc2a-a5b8-4d4f-8e1d-f6abaaac6591 req-846a5c60-e8b1-4541-b478-c8b06d92049f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-2607420d-fc87-4068-9a05-edeafb58d216" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:45:53 np0005603623 nova_compute[226235]: 2026-01-31 07:45:53.876 226239 DEBUG nova.network.neutron [req-1c00dc2a-a5b8-4d4f-8e1d-f6abaaac6591 req-846a5c60-e8b1-4541-b478-c8b06d92049f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Refreshing network info cache for port 29abd150-9de3-49f3-80a2-85432282ba29 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:45:54 np0005603623 nova_compute[226235]: 2026-01-31 07:45:54.008 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:45:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:54.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:45:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:55.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:55 np0005603623 nova_compute[226235]: 2026-01-31 07:45:55.950 226239 DEBUG nova.network.neutron [req-1c00dc2a-a5b8-4d4f-8e1d-f6abaaac6591 req-846a5c60-e8b1-4541-b478-c8b06d92049f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Updated VIF entry in instance network info cache for port 29abd150-9de3-49f3-80a2-85432282ba29. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:45:55 np0005603623 nova_compute[226235]: 2026-01-31 07:45:55.951 226239 DEBUG nova.network.neutron [req-1c00dc2a-a5b8-4d4f-8e1d-f6abaaac6591 req-846a5c60-e8b1-4541-b478-c8b06d92049f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Updating instance_info_cache with network_info: [{"id": "29abd150-9de3-49f3-80a2-85432282ba29", "address": "fa:16:3e:3f:a2:d5", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29abd150-9d", "ovs_interfaceid": "29abd150-9de3-49f3-80a2-85432282ba29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:45:56 np0005603623 nova_compute[226235]: 2026-01-31 07:45:56.286 226239 DEBUG oslo_concurrency.lockutils [req-1c00dc2a-a5b8-4d4f-8e1d-f6abaaac6591 req-846a5c60-e8b1-4541-b478-c8b06d92049f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-2607420d-fc87-4068-9a05-edeafb58d216" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:45:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:56.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:56 np0005603623 ovn_controller[133449]: 2026-01-31T07:45:56Z|00052|binding|INFO|Releasing lport ff1de611-ad43-4ac1-9afc-1469794e5ef0 from this chassis (sb_readonly=0)
Jan 31 02:45:56 np0005603623 nova_compute[226235]: 2026-01-31 07:45:56.463 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:56 np0005603623 nova_compute[226235]: 2026-01-31 07:45:56.669 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:57.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:58.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:59 np0005603623 nova_compute[226235]: 2026-01-31 07:45:59.011 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:45:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:59.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:00.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:01.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:01 np0005603623 ovn_controller[133449]: 2026-01-31T07:46:01Z|00053|binding|INFO|Releasing lport ff1de611-ad43-4ac1-9afc-1469794e5ef0 from this chassis (sb_readonly=0)
Jan 31 02:46:01 np0005603623 nova_compute[226235]: 2026-01-31 07:46:01.231 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:01 np0005603623 nova_compute[226235]: 2026-01-31 07:46:01.670 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:02.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:03.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:04 np0005603623 nova_compute[226235]: 2026-01-31 07:46:04.014 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:04 np0005603623 ovn_controller[133449]: 2026-01-31T07:46:04Z|00054|binding|INFO|Releasing lport ff1de611-ad43-4ac1-9afc-1469794e5ef0 from this chassis (sb_readonly=0)
Jan 31 02:46:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:04.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:04 np0005603623 nova_compute[226235]: 2026-01-31 07:46:04.332 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:05.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:05 np0005603623 ovn_controller[133449]: 2026-01-31T07:46:05Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3f:a2:d5 10.100.0.11
Jan 31 02:46:05 np0005603623 ovn_controller[133449]: 2026-01-31T07:46:05Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3f:a2:d5 10.100.0.11
Jan 31 02:46:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:06.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:06 np0005603623 nova_compute[226235]: 2026-01-31 07:46:06.673 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:06 np0005603623 nova_compute[226235]: 2026-01-31 07:46:06.819 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:06 np0005603623 nova_compute[226235]: 2026-01-31 07:46:06.820 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:06 np0005603623 nova_compute[226235]: 2026-01-31 07:46:06.820 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:07.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.195 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.195 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.196 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.196 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.196 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.196 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.222 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.223 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.223 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.223 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.224 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:46:07 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3904472329' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.826 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.602s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.900 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.900 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:46:07 np0005603623 nova_compute[226235]: 2026-01-31 07:46:07.901 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:46:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.026 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.027 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4779MB free_disk=20.943607330322266GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.027 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.028 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.105 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 2607420d-fc87-4068-9a05-edeafb58d216 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.105 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.105 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.136 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:08.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:46:08 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/657313587' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.559 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.564 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.586 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.605 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:46:08 np0005603623 nova_compute[226235]: 2026-01-31 07:46:08.605 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:09 np0005603623 nova_compute[226235]: 2026-01-31 07:46:09.016 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:09.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:10.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:10.556 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:46:10 np0005603623 nova_compute[226235]: 2026-01-31 07:46:10.557 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:10.558 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:46:10 np0005603623 nova_compute[226235]: 2026-01-31 07:46:10.563 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:10 np0005603623 nova_compute[226235]: 2026-01-31 07:46:10.592 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:11.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:11 np0005603623 nova_compute[226235]: 2026-01-31 07:46:11.675 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:12.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:13.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:14 np0005603623 nova_compute[226235]: 2026-01-31 07:46:14.019 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:14.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:14.560 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:46:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:15.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:15 np0005603623 podman[231487]: 2026-01-31 07:46:15.087166967 +0000 UTC m=+0.159593626 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:46:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:16.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:16 np0005603623 nova_compute[226235]: 2026-01-31 07:46:16.677 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:16 np0005603623 podman[231514]: 2026-01-31 07:46:16.991596694 +0000 UTC m=+0.083011422 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 31 02:46:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:17.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:18.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:19 np0005603623 nova_compute[226235]: 2026-01-31 07:46:19.022 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:19.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:20.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:21.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:21 np0005603623 nova_compute[226235]: 2026-01-31 07:46:21.679 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:22.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:23.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:24 np0005603623 nova_compute[226235]: 2026-01-31 07:46:24.025 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:24.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:25.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:26.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:26 np0005603623 nova_compute[226235]: 2026-01-31 07:46:26.452 226239 DEBUG oslo_concurrency.lockutils [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "2607420d-fc87-4068-9a05-edeafb58d216" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:26 np0005603623 nova_compute[226235]: 2026-01-31 07:46:26.453 226239 DEBUG oslo_concurrency.lockutils [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:26 np0005603623 nova_compute[226235]: 2026-01-31 07:46:26.453 226239 DEBUG oslo_concurrency.lockutils [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "2607420d-fc87-4068-9a05-edeafb58d216-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:26 np0005603623 nova_compute[226235]: 2026-01-31 07:46:26.453 226239 DEBUG oslo_concurrency.lockutils [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:26 np0005603623 nova_compute[226235]: 2026-01-31 07:46:26.453 226239 DEBUG oslo_concurrency.lockutils [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:26 np0005603623 nova_compute[226235]: 2026-01-31 07:46:26.455 226239 INFO nova.compute.manager [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Terminating instance#033[00m
Jan 31 02:46:26 np0005603623 nova_compute[226235]: 2026-01-31 07:46:26.457 226239 DEBUG nova.compute.manager [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:46:26 np0005603623 nova_compute[226235]: 2026-01-31 07:46:26.680 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:27.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:27 np0005603623 kernel: tap29abd150-9d (unregistering): left promiscuous mode
Jan 31 02:46:27 np0005603623 NetworkManager[48970]: <info>  [1769845587.1574] device (tap29abd150-9d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.157 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:27 np0005603623 ovn_controller[133449]: 2026-01-31T07:46:27Z|00055|binding|INFO|Releasing lport 29abd150-9de3-49f3-80a2-85432282ba29 from this chassis (sb_readonly=0)
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.165 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:27 np0005603623 ovn_controller[133449]: 2026-01-31T07:46:27Z|00056|binding|INFO|Setting lport 29abd150-9de3-49f3-80a2-85432282ba29 down in Southbound
Jan 31 02:46:27 np0005603623 ovn_controller[133449]: 2026-01-31T07:46:27Z|00057|binding|INFO|Removing iface tap29abd150-9d ovn-installed in OVS
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.170 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:27 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:27.177 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:a2:d5 10.100.0.11'], port_security=['fa:16:3e:3f:a2:d5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '2607420d-fc87-4068-9a05-edeafb58d216', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cefbc155c2449f4a8fe9fd88475b366', 'neutron:revision_number': '4', 'neutron:security_group_ids': '587c819d-03ba-401e-b9cf-985ddd1bff18', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9b241bb-46b8-4aac-aa2a-6c4356b4b2d7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=29abd150-9de3-49f3-80a2-85432282ba29) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.179 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:27 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:27.179 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 29abd150-9de3-49f3-80a2-85432282ba29 in datapath dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 unbound from our chassis#033[00m
Jan 31 02:46:27 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:27.182 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:46:27 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:27.184 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9a347849-17e4-4610-8d7f-81b161cef60f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:46:27 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:27.185 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 namespace which is not needed anymore#033[00m
Jan 31 02:46:27 np0005603623 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 31 02:46:27 np0005603623 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Consumed 14.118s CPU time.
Jan 31 02:46:27 np0005603623 systemd-machined[194379]: Machine qemu-3-instance-00000009 terminated.
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.288 226239 INFO nova.virt.libvirt.driver [-] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Instance destroyed successfully.#033[00m
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.289 226239 DEBUG nova.objects.instance [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lazy-loading 'resources' on Instance uuid 2607420d-fc87-4068-9a05-edeafb58d216 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.302 226239 DEBUG nova.virt.libvirt.vif [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:45:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1471853240',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1471853240',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(4),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1471853240',id=9,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=4,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ89RiM/elI2ZqD0wPCKPg5OJ135CdqHdbE4RgDrlMkDtFUdRT49WLIZycY+7iaVAOnhx3ISEENdI3O2SmWMNgi6poXdDkUtOq9xREV6KVqBpvm5DBEGb+Du/A2bOrXUMA==',key_name='tempest-keypair-889434625',keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:45:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6cefbc155c2449f4a8fe9fd88475b366',ramdisk_id='',reservation_id='r-pkqkht2r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1634965997',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1634965997-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:45:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cddf948da8f14dc998b0b2434d23e7fc',uuid=2607420d-fc87-4068-9a05-edeafb58d216,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29abd150-9de3-49f3-80a2-85432282ba29", "address": "fa:16:3e:3f:a2:d5", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29abd150-9d", "ovs_interfaceid": "29abd150-9de3-49f3-80a2-85432282ba29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.302 226239 DEBUG nova.network.os_vif_util [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Converting VIF {"id": "29abd150-9de3-49f3-80a2-85432282ba29", "address": "fa:16:3e:3f:a2:d5", "network": {"id": "dad80b1a-444d-4c9c-97e5-f4375a4ed8d6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1863685988-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cefbc155c2449f4a8fe9fd88475b366", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29abd150-9d", "ovs_interfaceid": "29abd150-9de3-49f3-80a2-85432282ba29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.303 226239 DEBUG nova.network.os_vif_util [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3f:a2:d5,bridge_name='br-int',has_traffic_filtering=True,id=29abd150-9de3-49f3-80a2-85432282ba29,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29abd150-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.303 226239 DEBUG os_vif [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:a2:d5,bridge_name='br-int',has_traffic_filtering=True,id=29abd150-9de3-49f3-80a2-85432282ba29,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29abd150-9d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.304 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.305 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29abd150-9d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.306 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.307 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.308 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:27 np0005603623 nova_compute[226235]: 2026-01-31 07:46:27.310 226239 INFO os_vif [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:a2:d5,bridge_name='br-int',has_traffic_filtering=True,id=29abd150-9de3-49f3-80a2-85432282ba29,network=Network(dad80b1a-444d-4c9c-97e5-f4375a4ed8d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29abd150-9d')#033[00m
Jan 31 02:46:27 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[231307]: [NOTICE]   (231315) : haproxy version is 2.8.14-c23fe91
Jan 31 02:46:27 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[231307]: [NOTICE]   (231315) : path to executable is /usr/sbin/haproxy
Jan 31 02:46:27 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[231307]: [WARNING]  (231315) : Exiting Master process...
Jan 31 02:46:27 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[231307]: [WARNING]  (231315) : Exiting Master process...
Jan 31 02:46:27 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[231307]: [ALERT]    (231315) : Current worker (231317) exited with code 143 (Terminated)
Jan 31 02:46:27 np0005603623 neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6[231307]: [WARNING]  (231315) : All workers exited. Exiting... (0)
Jan 31 02:46:27 np0005603623 systemd[1]: libpod-72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f.scope: Deactivated successfully.
Jan 31 02:46:27 np0005603623 podman[231565]: 2026-01-31 07:46:27.50889905 +0000 UTC m=+0.250853089 container died 72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 02:46:28 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f-userdata-shm.mount: Deactivated successfully.
Jan 31 02:46:28 np0005603623 systemd[1]: var-lib-containers-storage-overlay-2919815f844d7739f4bec13767861c949e7e136919287d6c424fe44a637189f9-merged.mount: Deactivated successfully.
Jan 31 02:46:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:28 np0005603623 podman[231565]: 2026-01-31 07:46:28.272951276 +0000 UTC m=+1.014905305 container cleanup 72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:46:28 np0005603623 systemd[1]: libpod-conmon-72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f.scope: Deactivated successfully.
Jan 31 02:46:28 np0005603623 nova_compute[226235]: 2026-01-31 07:46:28.298 226239 DEBUG nova.compute.manager [req-3c028e4f-7ac8-48d0-98da-fdd180a7d9c4 req-27d23535-2044-4d96-8a06-197b6d58d27d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Received event network-vif-unplugged-29abd150-9de3-49f3-80a2-85432282ba29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:46:28 np0005603623 nova_compute[226235]: 2026-01-31 07:46:28.299 226239 DEBUG oslo_concurrency.lockutils [req-3c028e4f-7ac8-48d0-98da-fdd180a7d9c4 req-27d23535-2044-4d96-8a06-197b6d58d27d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2607420d-fc87-4068-9a05-edeafb58d216-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:28 np0005603623 nova_compute[226235]: 2026-01-31 07:46:28.300 226239 DEBUG oslo_concurrency.lockutils [req-3c028e4f-7ac8-48d0-98da-fdd180a7d9c4 req-27d23535-2044-4d96-8a06-197b6d58d27d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:28 np0005603623 nova_compute[226235]: 2026-01-31 07:46:28.300 226239 DEBUG oslo_concurrency.lockutils [req-3c028e4f-7ac8-48d0-98da-fdd180a7d9c4 req-27d23535-2044-4d96-8a06-197b6d58d27d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:28 np0005603623 nova_compute[226235]: 2026-01-31 07:46:28.301 226239 DEBUG nova.compute.manager [req-3c028e4f-7ac8-48d0-98da-fdd180a7d9c4 req-27d23535-2044-4d96-8a06-197b6d58d27d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] No waiting events found dispatching network-vif-unplugged-29abd150-9de3-49f3-80a2-85432282ba29 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:46:28 np0005603623 nova_compute[226235]: 2026-01-31 07:46:28.301 226239 DEBUG nova.compute.manager [req-3c028e4f-7ac8-48d0-98da-fdd180a7d9c4 req-27d23535-2044-4d96-8a06-197b6d58d27d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Received event network-vif-unplugged-29abd150-9de3-49f3-80a2-85432282ba29 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:46:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:28.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:28 np0005603623 podman[231627]: 2026-01-31 07:46:28.785977507 +0000 UTC m=+0.494473375 container remove 72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:46:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:28.790 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2514c4b7-f332-472b-bfb6-36f70ecc0805]: (4, ('Sat Jan 31 07:46:27 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 (72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f)\n72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f\nSat Jan 31 07:46:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 (72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f)\n72b2e98adfb1912bdeb7a0e3b5049c887e8ed25a93962e89cf13f903e8d4229f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:46:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:28.792 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[61b26a95-0b9d-4f10-8892-9f92af5a4550]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:46:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:28.794 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdad80b1a-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:46:28 np0005603623 nova_compute[226235]: 2026-01-31 07:46:28.797 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:28 np0005603623 kernel: tapdad80b1a-40: left promiscuous mode
Jan 31 02:46:28 np0005603623 nova_compute[226235]: 2026-01-31 07:46:28.807 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:28.812 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd379c2-1c76-4a0b-911c-abd35becd61a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:46:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:28.838 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ff849dae-4d61-4439-aba5-38f0eaa8a817]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:46:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:28.840 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[19a2f630-646a-4c0b-b839-fce38b323f92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:46:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:28.856 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3772d338-24fc-448d-a167-8911a6f5a590]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468706, 'reachable_time': 27409, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231643, 'error': None, 'target': 'ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:46:28 np0005603623 systemd[1]: run-netns-ovnmeta\x2ddad80b1a\x2d444d\x2d4c9c\x2d97e5\x2df4375a4ed8d6.mount: Deactivated successfully.
Jan 31 02:46:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:28.860 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dad80b1a-444d-4c9c-97e5-f4375a4ed8d6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:46:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:28.860 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[c6491aef-8d7e-4558-a048-f352cde14347]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:46:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:29.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:30.081 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:30.082 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:30.082 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:30.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:30 np0005603623 nova_compute[226235]: 2026-01-31 07:46:30.386 226239 DEBUG nova.compute.manager [req-2fbd933e-46c0-4094-b41f-002738bdcd9a req-7e75b827-828f-4f3b-805f-bec3714458d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Received event network-vif-plugged-29abd150-9de3-49f3-80a2-85432282ba29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:46:30 np0005603623 nova_compute[226235]: 2026-01-31 07:46:30.386 226239 DEBUG oslo_concurrency.lockutils [req-2fbd933e-46c0-4094-b41f-002738bdcd9a req-7e75b827-828f-4f3b-805f-bec3714458d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2607420d-fc87-4068-9a05-edeafb58d216-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:30 np0005603623 nova_compute[226235]: 2026-01-31 07:46:30.386 226239 DEBUG oslo_concurrency.lockutils [req-2fbd933e-46c0-4094-b41f-002738bdcd9a req-7e75b827-828f-4f3b-805f-bec3714458d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:30 np0005603623 nova_compute[226235]: 2026-01-31 07:46:30.387 226239 DEBUG oslo_concurrency.lockutils [req-2fbd933e-46c0-4094-b41f-002738bdcd9a req-7e75b827-828f-4f3b-805f-bec3714458d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:30 np0005603623 nova_compute[226235]: 2026-01-31 07:46:30.387 226239 DEBUG nova.compute.manager [req-2fbd933e-46c0-4094-b41f-002738bdcd9a req-7e75b827-828f-4f3b-805f-bec3714458d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] No waiting events found dispatching network-vif-plugged-29abd150-9de3-49f3-80a2-85432282ba29 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:46:30 np0005603623 nova_compute[226235]: 2026-01-31 07:46:30.387 226239 WARNING nova.compute.manager [req-2fbd933e-46c0-4094-b41f-002738bdcd9a req-7e75b827-828f-4f3b-805f-bec3714458d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Received unexpected event network-vif-plugged-29abd150-9de3-49f3-80a2-85432282ba29 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:46:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:31.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:31 np0005603623 nova_compute[226235]: 2026-01-31 07:46:31.682 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:32 np0005603623 nova_compute[226235]: 2026-01-31 07:46:32.307 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:32.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:33.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:34.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:35.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:35 np0005603623 ovn_controller[133449]: 2026-01-31T07:46:35Z|00058|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 02:46:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:36.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:36 np0005603623 nova_compute[226235]: 2026-01-31 07:46:36.685 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:37.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:37 np0005603623 nova_compute[226235]: 2026-01-31 07:46:37.353 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:38.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:39.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:40.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:41.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:41 np0005603623 nova_compute[226235]: 2026-01-31 07:46:41.686 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:42 np0005603623 nova_compute[226235]: 2026-01-31 07:46:42.288 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845587.2862453, 2607420d-fc87-4068-9a05-edeafb58d216 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:46:42 np0005603623 nova_compute[226235]: 2026-01-31 07:46:42.288 226239 INFO nova.compute.manager [-] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:46:42 np0005603623 nova_compute[226235]: 2026-01-31 07:46:42.319 226239 DEBUG nova.compute.manager [None req-40eb7e75-c878-45fb-b83f-e8114cee0d4f - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:46:42 np0005603623 nova_compute[226235]: 2026-01-31 07:46:42.323 226239 DEBUG nova.compute.manager [None req-40eb7e75-c878-45fb-b83f-e8114cee0d4f - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: deleting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:46:42 np0005603623 nova_compute[226235]: 2026-01-31 07:46:42.351 226239 INFO nova.compute.manager [None req-40eb7e75-c878-45fb-b83f-e8114cee0d4f - - - - - -] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 02:46:42 np0005603623 nova_compute[226235]: 2026-01-31 07:46:42.354 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:42.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:43.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:43 np0005603623 nova_compute[226235]: 2026-01-31 07:46:43.348 226239 INFO nova.virt.libvirt.driver [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Deleting instance files /var/lib/nova/instances/2607420d-fc87-4068-9a05-edeafb58d216_del#033[00m
Jan 31 02:46:43 np0005603623 nova_compute[226235]: 2026-01-31 07:46:43.349 226239 INFO nova.virt.libvirt.driver [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Deletion of /var/lib/nova/instances/2607420d-fc87-4068-9a05-edeafb58d216_del complete#033[00m
Jan 31 02:46:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:43 np0005603623 nova_compute[226235]: 2026-01-31 07:46:43.410 226239 INFO nova.compute.manager [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Took 16.95 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:46:43 np0005603623 nova_compute[226235]: 2026-01-31 07:46:43.410 226239 DEBUG oslo.service.loopingcall [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:46:43 np0005603623 nova_compute[226235]: 2026-01-31 07:46:43.411 226239 DEBUG nova.compute.manager [-] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:46:43 np0005603623 nova_compute[226235]: 2026-01-31 07:46:43.411 226239 DEBUG nova.network.neutron [-] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:46:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:44.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:45.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:45 np0005603623 nova_compute[226235]: 2026-01-31 07:46:45.335 226239 DEBUG nova.compute.manager [req-f67f81a3-dcc2-4a2d-8f09-a79cec2cbd2d req-b6928977-6bd4-48eb-bb19-0754aa757dce fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Received event network-vif-deleted-29abd150-9de3-49f3-80a2-85432282ba29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:46:45 np0005603623 nova_compute[226235]: 2026-01-31 07:46:45.336 226239 INFO nova.compute.manager [req-f67f81a3-dcc2-4a2d-8f09-a79cec2cbd2d req-b6928977-6bd4-48eb-bb19-0754aa757dce fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Neutron deleted interface 29abd150-9de3-49f3-80a2-85432282ba29; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 02:46:45 np0005603623 nova_compute[226235]: 2026-01-31 07:46:45.336 226239 DEBUG nova.network.neutron [req-f67f81a3-dcc2-4a2d-8f09-a79cec2cbd2d req-b6928977-6bd4-48eb-bb19-0754aa757dce fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:46:45 np0005603623 nova_compute[226235]: 2026-01-31 07:46:45.340 226239 DEBUG nova.network.neutron [-] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:46:45 np0005603623 nova_compute[226235]: 2026-01-31 07:46:45.751 226239 INFO nova.compute.manager [-] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Took 2.34 seconds to deallocate network for instance.#033[00m
Jan 31 02:46:45 np0005603623 nova_compute[226235]: 2026-01-31 07:46:45.761 226239 DEBUG nova.compute.manager [req-f67f81a3-dcc2-4a2d-8f09-a79cec2cbd2d req-b6928977-6bd4-48eb-bb19-0754aa757dce fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2607420d-fc87-4068-9a05-edeafb58d216] Detach interface failed, port_id=29abd150-9de3-49f3-80a2-85432282ba29, reason: Instance 2607420d-fc87-4068-9a05-edeafb58d216 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 02:46:45 np0005603623 nova_compute[226235]: 2026-01-31 07:46:45.879 226239 DEBUG oslo_concurrency.lockutils [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:45 np0005603623 nova_compute[226235]: 2026-01-31 07:46:45.879 226239 DEBUG oslo_concurrency.lockutils [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:45 np0005603623 nova_compute[226235]: 2026-01-31 07:46:45.943 226239 DEBUG oslo_concurrency.processutils [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:46 np0005603623 podman[231703]: 2026-01-31 07:46:46.005698091 +0000 UTC m=+0.101424635 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 02:46:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:46:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3495718245' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:46:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:46.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:46 np0005603623 nova_compute[226235]: 2026-01-31 07:46:46.375 226239 DEBUG oslo_concurrency.processutils [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:46 np0005603623 nova_compute[226235]: 2026-01-31 07:46:46.380 226239 DEBUG nova.compute.provider_tree [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:46:46 np0005603623 nova_compute[226235]: 2026-01-31 07:46:46.399 226239 DEBUG nova.scheduler.client.report [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:46:46 np0005603623 nova_compute[226235]: 2026-01-31 07:46:46.434 226239 DEBUG oslo_concurrency.lockutils [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:46 np0005603623 nova_compute[226235]: 2026-01-31 07:46:46.466 226239 INFO nova.scheduler.client.report [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Deleted allocations for instance 2607420d-fc87-4068-9a05-edeafb58d216#033[00m
Jan 31 02:46:46 np0005603623 nova_compute[226235]: 2026-01-31 07:46:46.527 226239 DEBUG oslo_concurrency.lockutils [None req-97685685-c8d3-456b-848f-8ffecb7d134b cddf948da8f14dc998b0b2434d23e7fc 6cefbc155c2449f4a8fe9fd88475b366 - - default default] Lock "2607420d-fc87-4068-9a05-edeafb58d216" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 20.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:46 np0005603623 nova_compute[226235]: 2026-01-31 07:46:46.689 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:47.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:47 np0005603623 podman[231851]: 2026-01-31 07:46:47.11621772 +0000 UTC m=+0.050223658 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 02:46:47 np0005603623 nova_compute[226235]: 2026-01-31 07:46:47.356 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:48.236 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:46:48 np0005603623 nova_compute[226235]: 2026-01-31 07:46:48.236 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:48.238 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:46:48 np0005603623 nova_compute[226235]: 2026-01-31 07:46:48.352 226239 DEBUG oslo_concurrency.lockutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Acquiring lock "d3f77a29-d3a7-444c-9528-ab679b9b946c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:48 np0005603623 nova_compute[226235]: 2026-01-31 07:46:48.353 226239 DEBUG oslo_concurrency.lockutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "d3f77a29-d3a7-444c-9528-ab679b9b946c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:48.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:48 np0005603623 nova_compute[226235]: 2026-01-31 07:46:48.381 226239 DEBUG nova.compute.manager [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:46:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:48 np0005603623 nova_compute[226235]: 2026-01-31 07:46:48.473 226239 DEBUG oslo_concurrency.lockutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:48 np0005603623 nova_compute[226235]: 2026-01-31 07:46:48.474 226239 DEBUG oslo_concurrency.lockutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:48 np0005603623 nova_compute[226235]: 2026-01-31 07:46:48.483 226239 DEBUG nova.virt.hardware [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:46:48 np0005603623 nova_compute[226235]: 2026-01-31 07:46:48.483 226239 INFO nova.compute.claims [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:46:48 np0005603623 nova_compute[226235]: 2026-01-31 07:46:48.628 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:46:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2538250321' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.054 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.058 226239 DEBUG nova.compute.provider_tree [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.078 226239 DEBUG nova.scheduler.client.report [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.100 226239 DEBUG oslo_concurrency.lockutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.101 226239 DEBUG nova.compute.manager [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:46:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:49.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.157 226239 DEBUG nova.compute.manager [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.182 226239 INFO nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.203 226239 DEBUG nova.compute.manager [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.325 226239 DEBUG oslo_concurrency.lockutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Acquiring lock "b99980e3-e183-4ce8-b3d6-606bdf46f451" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.326 226239 DEBUG oslo_concurrency.lockutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "b99980e3-e183-4ce8-b3d6-606bdf46f451" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.407 226239 DEBUG nova.compute.manager [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.417 226239 DEBUG nova.compute.manager [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.419 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.419 226239 INFO nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Creating image(s)
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.464 226239 DEBUG nova.storage.rbd_utils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.509 226239 DEBUG nova.storage.rbd_utils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.536 226239 DEBUG nova.storage.rbd_utils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.539 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.577 226239 DEBUG oslo_concurrency.lockutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.577 226239 DEBUG oslo_concurrency.lockutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.584 226239 DEBUG nova.virt.hardware [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.585 226239 INFO nova.compute.claims [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Claim successful on node compute-2.ctlplane.example.com
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.617 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.617 226239 DEBUG oslo_concurrency.lockutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.618 226239 DEBUG oslo_concurrency.lockutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.618 226239 DEBUG oslo_concurrency.lockutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.648 226239 DEBUG nova.storage.rbd_utils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.652 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 d3f77a29-d3a7-444c-9528-ab679b9b946c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:46:49 np0005603623 nova_compute[226235]: 2026-01-31 07:46:49.789 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:46:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:46:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:46:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:46:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:46:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:46:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:46:50 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2732021004' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.253 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.258 226239 DEBUG nova.compute.provider_tree [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.271 226239 DEBUG nova.scheduler.client.report [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.293 226239 DEBUG oslo_concurrency.lockutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.716s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.294 226239 DEBUG nova.compute.manager [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.347 226239 DEBUG nova.compute.manager [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.347 226239 DEBUG nova.network.neutron [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.368 226239 INFO nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 02:46:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:50.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.391 226239 DEBUG nova.compute.manager [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.495 226239 DEBUG nova.compute.manager [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.496 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.496 226239 INFO nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Creating image(s)
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.525 226239 DEBUG nova.storage.rbd_utils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] rbd image b99980e3-e183-4ce8-b3d6-606bdf46f451_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.557 226239 DEBUG nova.storage.rbd_utils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] rbd image b99980e3-e183-4ce8-b3d6-606bdf46f451_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.585 226239 DEBUG nova.storage.rbd_utils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] rbd image b99980e3-e183-4ce8-b3d6-606bdf46f451_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.588 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.631 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.631 226239 DEBUG oslo_concurrency.lockutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.632 226239 DEBUG oslo_concurrency.lockutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.632 226239 DEBUG oslo_concurrency.lockutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.654 226239 DEBUG nova.storage.rbd_utils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] rbd image b99980e3-e183-4ce8-b3d6-606bdf46f451_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.657 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 b99980e3-e183-4ce8-b3d6-606bdf46f451_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.884 226239 DEBUG nova.network.neutron [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 31 02:46:50 np0005603623 nova_compute[226235]: 2026-01-31 07:46:50.884 226239 DEBUG nova.compute.manager [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 02:46:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:51.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:51 np0005603623 nova_compute[226235]: 2026-01-31 07:46:51.690 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:52 np0005603623 nova_compute[226235]: 2026-01-31 07:46:52.359 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:52.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:53.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:54 np0005603623 nova_compute[226235]: 2026-01-31 07:46:54.355 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 d3f77a29-d3a7-444c-9528-ab679b9b946c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.703s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:46:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:54.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:54 np0005603623 nova_compute[226235]: 2026-01-31 07:46:54.429 226239 DEBUG nova.storage.rbd_utils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] resizing rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 02:46:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:55.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:55 np0005603623 nova_compute[226235]: 2026-01-31 07:46:55.152 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:55 np0005603623 nova_compute[226235]: 2026-01-31 07:46:55.193 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:46:55.240 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:46:55 np0005603623 nova_compute[226235]: 2026-01-31 07:46:55.743 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 b99980e3-e183-4ce8-b3d6-606bdf46f451_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 5.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:46:55 np0005603623 nova_compute[226235]: 2026-01-31 07:46:55.890 226239 DEBUG nova.storage.rbd_utils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] resizing rbd image b99980e3-e183-4ce8-b3d6-606bdf46f451_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 02:46:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:56.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:56 np0005603623 nova_compute[226235]: 2026-01-31 07:46:56.722 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:56 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Jan 31 02:46:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:56.799872) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:46:56 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Jan 31 02:46:56 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845616800022, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 1977, "num_deletes": 250, "total_data_size": 4544641, "memory_usage": 4588040, "flush_reason": "Manual Compaction"}
Jan 31 02:46:56 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Jan 31 02:46:56 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845616911780, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 1817482, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20259, "largest_seqno": 22231, "table_properties": {"data_size": 1811257, "index_size": 3172, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 16364, "raw_average_key_size": 20, "raw_value_size": 1797426, "raw_average_value_size": 2289, "num_data_blocks": 140, "num_entries": 785, "num_filter_entries": 785, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845456, "oldest_key_time": 1769845456, "file_creation_time": 1769845616, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:46:56 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 111931 microseconds, and 4568 cpu microseconds.
Jan 31 02:46:56 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:56.911828) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 1817482 bytes OK
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:56.911849) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.114631) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.114673) EVENT_LOG_v1 {"time_micros": 1769845617114663, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.114698) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 4535748, prev total WAL file size 4552977, number of live WAL files 2.
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.115709) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353030' seq:72057594037927935, type:22 .. '6D67727374617400373531' seq:0, type:0; will stop at (end)
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(1774KB)], [39(9527KB)]
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845617115789, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 11573348, "oldest_snapshot_seqno": -1}
Jan 31 02:46:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:57.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:57 np0005603623 nova_compute[226235]: 2026-01-31 07:46:57.360 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 4852 keys, 8804292 bytes, temperature: kUnknown
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845617481314, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 8804292, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8771626, "index_size": 19436, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12165, "raw_key_size": 120360, "raw_average_key_size": 24, "raw_value_size": 8683685, "raw_average_value_size": 1789, "num_data_blocks": 804, "num_entries": 4852, "num_filter_entries": 4852, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769845617, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.481906) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 8804292 bytes
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.755036) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 31.6 rd, 24.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 9.3 +0.0 blob) out(8.4 +0.0 blob), read-write-amplify(11.2) write-amplify(4.8) OK, records in: 5297, records dropped: 445 output_compression: NoCompression
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.755088) EVENT_LOG_v1 {"time_micros": 1769845617755067, "job": 22, "event": "compaction_finished", "compaction_time_micros": 365890, "compaction_time_cpu_micros": 30621, "output_level": 6, "num_output_files": 1, "total_output_size": 8804292, "num_input_records": 5297, "num_output_records": 4852, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845617755682, "job": 22, "event": "table_file_deletion", "file_number": 41}
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845617757568, "job": 22, "event": "table_file_deletion", "file_number": 39}
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.115589) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.757658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.757667) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.757672) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.757676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:46:57 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:46:57.757681) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:46:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:46:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:58.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:46:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:46:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:59.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:00.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.874 226239 DEBUG nova.objects.instance [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lazy-loading 'migration_context' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.878 226239 DEBUG nova.objects.instance [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lazy-loading 'migration_context' on Instance uuid b99980e3-e183-4ce8-b3d6-606bdf46f451 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.890 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.891 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Ensure instance console log exists: /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.891 226239 DEBUG oslo_concurrency.lockutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.892 226239 DEBUG oslo_concurrency.lockutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.892 226239 DEBUG oslo_concurrency.lockutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.894 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.896 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.896 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Ensure instance console log exists: /var/lib/nova/instances/b99980e3-e183-4ce8-b3d6-606bdf46f451/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.896 226239 DEBUG oslo_concurrency.lockutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.897 226239 DEBUG oslo_concurrency.lockutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.897 226239 DEBUG oslo_concurrency.lockutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.898 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.903 226239 WARNING nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.905 226239 WARNING nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.912 226239 DEBUG nova.virt.libvirt.host [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.913 226239 DEBUG nova.virt.libvirt.host [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.913 226239 DEBUG nova.virt.libvirt.host [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.914 226239 DEBUG nova.virt.libvirt.host [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.917 226239 DEBUG nova.virt.libvirt.host [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.917 226239 DEBUG nova.virt.libvirt.host [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.919 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.919 226239 DEBUG nova.virt.hardware [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.920 226239 DEBUG nova.virt.hardware [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.920 226239 DEBUG nova.virt.hardware [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.920 226239 DEBUG nova.virt.hardware [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.920 226239 DEBUG nova.virt.hardware [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.921 226239 DEBUG nova.virt.hardware [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.921 226239 DEBUG nova.virt.hardware [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.921 226239 DEBUG nova.virt.hardware [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.921 226239 DEBUG nova.virt.hardware [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.922 226239 DEBUG nova.virt.hardware [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.922 226239 DEBUG nova.virt.hardware [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.925 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.938 226239 DEBUG nova.virt.libvirt.host [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.939 226239 DEBUG nova.virt.libvirt.host [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.940 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.940 226239 DEBUG nova.virt.hardware [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.941 226239 DEBUG nova.virt.hardware [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.941 226239 DEBUG nova.virt.hardware [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.941 226239 DEBUG nova.virt.hardware [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.942 226239 DEBUG nova.virt.hardware [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.942 226239 DEBUG nova.virt.hardware [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.942 226239 DEBUG nova.virt.hardware [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.943 226239 DEBUG nova.virt.hardware [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.943 226239 DEBUG nova.virt.hardware [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.943 226239 DEBUG nova.virt.hardware [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.944 226239 DEBUG nova.virt.hardware [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:47:00 np0005603623 nova_compute[226235]: 2026-01-31 07:47:00.947 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:47:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:47:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:01.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:47:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:47:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2011156060' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:47:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:47:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4220924356' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:47:01 np0005603623 nova_compute[226235]: 2026-01-31 07:47:01.713 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.788s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:01 np0005603623 nova_compute[226235]: 2026-01-31 07:47:01.738 226239 DEBUG nova.storage.rbd_utils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:47:01 np0005603623 nova_compute[226235]: 2026-01-31 07:47:01.741 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:47:01 np0005603623 nova_compute[226235]: 2026-01-31 07:47:01.753 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.805s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:01 np0005603623 nova_compute[226235]: 2026-01-31 07:47:01.753 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:01 np0005603623 nova_compute[226235]: 2026-01-31 07:47:01.781 226239 DEBUG nova.storage.rbd_utils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] rbd image b99980e3-e183-4ce8-b3d6-606bdf46f451_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:47:01 np0005603623 nova_compute[226235]: 2026-01-31 07:47:01.786 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:47:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:47:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:02.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:47:02 np0005603623 nova_compute[226235]: 2026-01-31 07:47:02.386 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:47:02 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1601280976' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:47:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:47:02 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/567245419' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:47:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:47:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:03.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.177 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.178 226239 DEBUG nova.objects.instance [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lazy-loading 'pci_devices' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.179 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.394s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.181 226239 DEBUG nova.objects.instance [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lazy-loading 'pci_devices' on Instance uuid b99980e3-e183-4ce8-b3d6-606bdf46f451 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.211 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <uuid>b99980e3-e183-4ce8-b3d6-606bdf46f451</uuid>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <name>instance-0000000d</name>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:name>tempest-DeleteServersAdminTestJSON-server-566424991</nova:name>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:47:00</nova:creationTime>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:user uuid="af3dc76e8c644930baa58a646b54b535">tempest-DeleteServersAdminTestJSON-1768399059-project-member</nova:user>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:project uuid="f5246170c80a4be2beb961122f12fcaf">tempest-DeleteServersAdminTestJSON-1768399059</nova:project>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <entry name="serial">b99980e3-e183-4ce8-b3d6-606bdf46f451</entry>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <entry name="uuid">b99980e3-e183-4ce8-b3d6-606bdf46f451</entry>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/b99980e3-e183-4ce8-b3d6-606bdf46f451_disk">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/b99980e3-e183-4ce8-b3d6-606bdf46f451_disk.config">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/b99980e3-e183-4ce8-b3d6-606bdf46f451/console.log" append="off"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:47:03 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:47:03 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.214 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <uuid>d3f77a29-d3a7-444c-9528-ab679b9b946c</uuid>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <name>instance-0000000c</name>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersAdmin275Test-server-1493730364</nova:name>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:47:00</nova:creationTime>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:user uuid="4e887d8783db44ff93a55e1ea75aa78e">tempest-ServersAdmin275Test-200317158-project-member</nova:user>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <nova:project uuid="2ca2f28405884a6ea92bcde9c8f91ff9">tempest-ServersAdmin275Test-200317158</nova:project>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <entry name="serial">d3f77a29-d3a7-444c-9528-ab679b9b946c</entry>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <entry name="uuid">d3f77a29-d3a7-444c-9528-ab679b9b946c</entry>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/d3f77a29-d3a7-444c-9528-ab679b9b946c_disk">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/console.log" append="off"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:47:03 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:47:03 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:47:03 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:47:03 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.408 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.408 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.409 226239 INFO nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Using config drive
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.440 226239 DEBUG nova.storage.rbd_utils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] rbd image b99980e3-e183-4ce8-b3d6-606bdf46f451_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.451 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.451 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.452 226239 INFO nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Using config drive
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.476 226239 DEBUG nova.storage.rbd_utils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:47:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.719 226239 INFO nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Creating config drive at /var/lib/nova/instances/b99980e3-e183-4ce8-b3d6-606bdf46f451/disk.config
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.723 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b99980e3-e183-4ce8-b3d6-606bdf46f451/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp70hmnp43 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.750 226239 INFO nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Creating config drive at /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.753 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpirsfvg0_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.837 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b99980e3-e183-4ce8-b3d6-606bdf46f451/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp70hmnp43" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.890 226239 DEBUG nova.storage.rbd_utils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] rbd image b99980e3-e183-4ce8-b3d6-606bdf46f451_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.894 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b99980e3-e183-4ce8-b3d6-606bdf46f451/disk.config b99980e3-e183-4ce8-b3d6-606bdf46f451_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.912 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpirsfvg0_" returned: 0 in 0.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.947 226239 DEBUG nova.storage.rbd_utils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:47:03 np0005603623 nova_compute[226235]: 2026-01-31 07:47:03.951 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:47:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:04.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:05.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:05 np0005603623 nova_compute[226235]: 2026-01-31 07:47:05.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:05 np0005603623 nova_compute[226235]: 2026-01-31 07:47:05.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:06 np0005603623 nova_compute[226235]: 2026-01-31 07:47:06.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:06 np0005603623 nova_compute[226235]: 2026-01-31 07:47:06.182 226239 DEBUG oslo_concurrency.processutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b99980e3-e183-4ce8-b3d6-606bdf46f451/disk.config b99980e3-e183-4ce8-b3d6-606bdf46f451_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.288s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:47:06 np0005603623 nova_compute[226235]: 2026-01-31 07:47:06.183 226239 INFO nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Deleting local config drive /var/lib/nova/instances/b99980e3-e183-4ce8-b3d6-606bdf46f451/disk.config because it was imported into RBD.#033[00m
Jan 31 02:47:06 np0005603623 nova_compute[226235]: 2026-01-31 07:47:06.187 226239 DEBUG oslo_concurrency.processutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.236s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:47:06 np0005603623 nova_compute[226235]: 2026-01-31 07:47:06.188 226239 INFO nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Deleting local config drive /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config because it was imported into RBD.#033[00m
Jan 31 02:47:06 np0005603623 systemd-machined[194379]: New machine qemu-4-instance-0000000d.
Jan 31 02:47:06 np0005603623 systemd[1]: Started Virtual Machine qemu-4-instance-0000000d.
Jan 31 02:47:06 np0005603623 systemd-machined[194379]: New machine qemu-5-instance-0000000c.
Jan 31 02:47:06 np0005603623 systemd[1]: Started Virtual Machine qemu-5-instance-0000000c.
Jan 31 02:47:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:06.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:06 np0005603623 nova_compute[226235]: 2026-01-31 07:47:06.724 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.061 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845627.0608804, b99980e3-e183-4ce8-b3d6-606bdf46f451 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.062 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.064 226239 DEBUG nova.compute.manager [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.065 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.067 226239 INFO nova.virt.libvirt.driver [-] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Instance spawned successfully.#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.068 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.091 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.094 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.103 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.104 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.104 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.105 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.105 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.106 226239 DEBUG nova.virt.libvirt.driver [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.133 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.133 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845627.0619497, b99980e3-e183-4ce8-b3d6-606bdf46f451 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:47:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.134 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] VM Started (Lifecycle Event)#033[00m
Jan 31 02:47:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:07.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.162 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.164 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.172 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.172 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.173 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.173 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.173 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.188 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.189 226239 INFO nova.compute.manager [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Took 16.69 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.190 226239 DEBUG nova.compute.manager [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.245 226239 INFO nova.compute.manager [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Took 17.69 seconds to build instance.#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.267 226239 DEBUG oslo_concurrency.lockutils [None req-af323e55-69f4-4ee8-ad29-72b408a62f69 af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "b99980e3-e183-4ce8-b3d6-606bdf46f451" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.941s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.388 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.632 226239 DEBUG nova.compute.manager [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.633 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.638 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845627.6376595, d3f77a29-d3a7-444c-9528-ab679b9b946c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.638 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.643 226239 INFO nova.virt.libvirt.driver [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance spawned successfully.#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.644 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.658 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.664 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.667 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.667 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.667 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.668 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.668 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.669 226239 DEBUG nova.virt.libvirt.driver [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.687 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.687 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845627.6378653, d3f77a29-d3a7-444c-9528-ab679b9b946c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.687 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] VM Started (Lifecycle Event)#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.706 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.709 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.734 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.745 226239 INFO nova.compute.manager [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Took 18.33 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.745 226239 DEBUG nova.compute.manager [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.796 226239 INFO nova.compute.manager [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Took 19.36 seconds to build instance.#033[00m
Jan 31 02:47:07 np0005603623 nova_compute[226235]: 2026-01-31 07:47:07.839 226239 DEBUG oslo_concurrency.lockutils [None req-01a8a307-f767-486c-abd6-1e397a076137 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "d3f77a29-d3a7-444c-9528-ab679b9b946c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.487s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:47:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 31 02:47:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:08.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 31 02:47:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:47:09 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/675341361' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.087 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.913s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:47:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:09.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.170 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:47:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.192 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.196 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000000d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.197 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000000d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.284 226239 DEBUG oslo_concurrency.lockutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Acquiring lock "b99980e3-e183-4ce8-b3d6-606bdf46f451" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.284 226239 DEBUG oslo_concurrency.lockutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "b99980e3-e183-4ce8-b3d6-606bdf46f451" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.284 226239 DEBUG oslo_concurrency.lockutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Acquiring lock "b99980e3-e183-4ce8-b3d6-606bdf46f451-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.285 226239 DEBUG oslo_concurrency.lockutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "b99980e3-e183-4ce8-b3d6-606bdf46f451-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.285 226239 DEBUG oslo_concurrency.lockutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "b99980e3-e183-4ce8-b3d6-606bdf46f451-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.286 226239 INFO nova.compute.manager [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Terminating instance#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.287 226239 DEBUG oslo_concurrency.lockutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Acquiring lock "refresh_cache-b99980e3-e183-4ce8-b3d6-606bdf46f451" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.287 226239 DEBUG oslo_concurrency.lockutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Acquired lock "refresh_cache-b99980e3-e183-4ce8-b3d6-606bdf46f451" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.288 226239 DEBUG nova.network.neutron [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.345 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.346 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4758MB free_disk=20.926021575927734GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.346 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.346 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.448 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance d3f77a29-d3a7-444c-9528-ab679b9b946c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.448 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance b99980e3-e183-4ce8-b3d6-606bdf46f451 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.448 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.449 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.563 226239 DEBUG nova.network.neutron [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.682 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.926 226239 DEBUG nova.network.neutron [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.945 226239 DEBUG oslo_concurrency.lockutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Releasing lock "refresh_cache-b99980e3-e183-4ce8-b3d6-606bdf46f451" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:47:09 np0005603623 nova_compute[226235]: 2026-01-31 07:47:09.946 226239 DEBUG nova.compute.manager [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.006 226239 INFO nova.compute.manager [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Rebuilding instance#033[00m
Jan 31 02:47:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:47:10 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2083110011' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.337 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.655s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.343 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.353 226239 DEBUG nova.objects.instance [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.356 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.368 226239 DEBUG nova.compute.manager [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.381 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.382 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:47:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:10.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:10 np0005603623 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 31 02:47:10 np0005603623 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000d.scope: Consumed 3.498s CPU time.
Jan 31 02:47:10 np0005603623 systemd-machined[194379]: Machine qemu-4-instance-0000000d terminated.
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.423 226239 DEBUG nova.objects.instance [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lazy-loading 'pci_requests' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.437 226239 DEBUG nova.objects.instance [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lazy-loading 'pci_devices' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.453 226239 DEBUG nova.objects.instance [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lazy-loading 'resources' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.469 226239 DEBUG nova.objects.instance [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lazy-loading 'migration_context' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.480 226239 DEBUG nova.objects.instance [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.483 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.560 226239 INFO nova.virt.libvirt.driver [-] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Instance destroyed successfully.#033[00m
Jan 31 02:47:10 np0005603623 nova_compute[226235]: 2026-01-31 07:47:10.560 226239 DEBUG nova.objects.instance [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lazy-loading 'resources' on Instance uuid b99980e3-e183-4ce8-b3d6-606bdf46f451 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:47:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:11.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:11 np0005603623 nova_compute[226235]: 2026-01-31 07:47:11.727 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:12 np0005603623 nova_compute[226235]: 2026-01-31 07:47:12.382 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:12 np0005603623 nova_compute[226235]: 2026-01-31 07:47:12.383 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:47:12 np0005603623 nova_compute[226235]: 2026-01-31 07:47:12.383 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:47:12 np0005603623 nova_compute[226235]: 2026-01-31 07:47:12.391 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:12.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:12 np0005603623 nova_compute[226235]: 2026-01-31 07:47:12.410 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 02:47:12 np0005603623 nova_compute[226235]: 2026-01-31 07:47:12.411 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-d3f77a29-d3a7-444c-9528-ab679b9b946c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:47:12 np0005603623 nova_compute[226235]: 2026-01-31 07:47:12.411 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-d3f77a29-d3a7-444c-9528-ab679b9b946c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:47:12 np0005603623 nova_compute[226235]: 2026-01-31 07:47:12.412 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:47:12 np0005603623 nova_compute[226235]: 2026-01-31 07:47:12.412 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:47:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:13.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:13 np0005603623 nova_compute[226235]: 2026-01-31 07:47:13.223 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:47:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:47:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.185 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:47:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.201 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-d3f77a29-d3a7-444c-9528-ab679b9b946c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.202 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.202 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.202 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.203 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.203 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:47:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:47:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1637183502' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:47:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:47:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1637183502' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.333 226239 INFO nova.virt.libvirt.driver [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Deleting instance files /var/lib/nova/instances/b99980e3-e183-4ce8-b3d6-606bdf46f451_del#033[00m
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.334 226239 INFO nova.virt.libvirt.driver [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Deletion of /var/lib/nova/instances/b99980e3-e183-4ce8-b3d6-606bdf46f451_del complete#033[00m
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.382 226239 INFO nova.compute.manager [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Took 4.44 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.383 226239 DEBUG oslo.service.loopingcall [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.383 226239 DEBUG nova.compute.manager [-] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:47:14 np0005603623 nova_compute[226235]: 2026-01-31 07:47:14.383 226239 DEBUG nova.network.neutron [-] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:47:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:14.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:15.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:15 np0005603623 nova_compute[226235]: 2026-01-31 07:47:15.215 226239 DEBUG nova.network.neutron [-] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:47:15 np0005603623 nova_compute[226235]: 2026-01-31 07:47:15.236 226239 DEBUG nova.network.neutron [-] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:47:15 np0005603623 nova_compute[226235]: 2026-01-31 07:47:15.254 226239 INFO nova.compute.manager [-] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Took 0.87 seconds to deallocate network for instance.#033[00m
Jan 31 02:47:15 np0005603623 nova_compute[226235]: 2026-01-31 07:47:15.321 226239 DEBUG oslo_concurrency.lockutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:47:15 np0005603623 nova_compute[226235]: 2026-01-31 07:47:15.322 226239 DEBUG oslo_concurrency.lockutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:47:15 np0005603623 nova_compute[226235]: 2026-01-31 07:47:15.374 226239 DEBUG oslo_concurrency.processutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:47:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:47:16 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/657776033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:47:16 np0005603623 nova_compute[226235]: 2026-01-31 07:47:16.249 226239 DEBUG oslo_concurrency.processutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.876s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:16 np0005603623 nova_compute[226235]: 2026-01-31 07:47:16.255 226239 DEBUG nova.compute.provider_tree [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:47:16 np0005603623 nova_compute[226235]: 2026-01-31 07:47:16.273 226239 DEBUG nova.scheduler.client.report [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:47:16 np0005603623 nova_compute[226235]: 2026-01-31 07:47:16.296 226239 DEBUG oslo_concurrency.lockutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.974s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:47:16 np0005603623 nova_compute[226235]: 2026-01-31 07:47:16.343 226239 INFO nova.scheduler.client.report [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Deleted allocations for instance b99980e3-e183-4ce8-b3d6-606bdf46f451
Jan 31 02:47:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:16.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:16 np0005603623 nova_compute[226235]: 2026-01-31 07:47:16.406 226239 DEBUG oslo_concurrency.lockutils [None req-172bce12-b340-436f-917a-c475320b3e0a af3dc76e8c644930baa58a646b54b535 f5246170c80a4be2beb961122f12fcaf - - default default] Lock "b99980e3-e183-4ce8-b3d6-606bdf46f451" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:47:16 np0005603623 nova_compute[226235]: 2026-01-31 07:47:16.728 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:17 np0005603623 podman[232887]: 2026-01-31 07:47:17.007300948 +0000 UTC m=+0.097303889 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:47:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:17.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:17 np0005603623 nova_compute[226235]: 2026-01-31 07:47:17.394 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:17 np0005603623 podman[232914]: 2026-01-31 07:47:17.954194369 +0000 UTC m=+0.043708739 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 02:47:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:18.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:19.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:47:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:20.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:47:20 np0005603623 nova_compute[226235]: 2026-01-31 07:47:20.525 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 31 02:47:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:21.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:21.717668) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845641717806, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 518, "num_deletes": 256, "total_data_size": 714181, "memory_usage": 725720, "flush_reason": "Manual Compaction"}
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Jan 31 02:47:21 np0005603623 nova_compute[226235]: 2026-01-31 07:47:21.729 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845641787324, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 471342, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22237, "largest_seqno": 22749, "table_properties": {"data_size": 468543, "index_size": 771, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 6417, "raw_average_key_size": 17, "raw_value_size": 462919, "raw_average_value_size": 1289, "num_data_blocks": 34, "num_entries": 359, "num_filter_entries": 359, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845616, "oldest_key_time": 1769845616, "file_creation_time": 1769845641, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 69711 microseconds, and 2511 cpu microseconds.
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:21.787390) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 471342 bytes OK
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:21.787417) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:21.817220) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:21.817250) EVENT_LOG_v1 {"time_micros": 1769845641817241, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:21.817277) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 711079, prev total WAL file size 727474, number of live WAL files 2.
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:21.817903) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323531' seq:72057594037927935, type:22 .. '6C6F676D00353033' seq:0, type:0; will stop at (end)
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(460KB)], [42(8597KB)]
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845641817970, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 9275634, "oldest_snapshot_seqno": -1}
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 4688 keys, 9133494 bytes, temperature: kUnknown
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845641997660, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 9133494, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9101068, "index_size": 19580, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11781, "raw_key_size": 118185, "raw_average_key_size": 25, "raw_value_size": 9015205, "raw_average_value_size": 1923, "num_data_blocks": 805, "num_entries": 4688, "num_filter_entries": 4688, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769845641, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:47:21 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:21.998015) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 9133494 bytes
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:22.019660) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 51.6 rd, 50.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 8.4 +0.0 blob) out(8.7 +0.0 blob), read-write-amplify(39.1) write-amplify(19.4) OK, records in: 5211, records dropped: 523 output_compression: NoCompression
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:22.019717) EVENT_LOG_v1 {"time_micros": 1769845642019693, "job": 24, "event": "compaction_finished", "compaction_time_micros": 179791, "compaction_time_cpu_micros": 27562, "output_level": 6, "num_output_files": 1, "total_output_size": 9133494, "num_input_records": 5211, "num_output_records": 4688, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845642019991, "job": 24, "event": "table_file_deletion", "file_number": 44}
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845642021894, "job": 24, "event": "table_file_deletion", "file_number": 42}
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:21.817765) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:22.021957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:22.021964) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:22.021967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:22.021970) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:22.021974) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:22 np0005603623 nova_compute[226235]: 2026-01-31 07:47:22.396 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:47:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:22.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:47:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:47:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:23.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:47:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:24.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:25.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:25 np0005603623 nova_compute[226235]: 2026-01-31 07:47:25.558 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845630.5577402, b99980e3-e183-4ce8-b3d6-606bdf46f451 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:47:25 np0005603623 nova_compute[226235]: 2026-01-31 07:47:25.559 226239 INFO nova.compute.manager [-] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] VM Stopped (Lifecycle Event)
Jan 31 02:47:25 np0005603623 nova_compute[226235]: 2026-01-31 07:47:25.579 226239 DEBUG nova.compute.manager [None req-767c2042-14ac-43a1-80e2-1f9a5906c18c - - - - - -] [instance: b99980e3-e183-4ce8-b3d6-606bdf46f451] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:47:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:47:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:26.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:47:26 np0005603623 nova_compute[226235]: 2026-01-31 07:47:26.731 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:47:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:27.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:47:27 np0005603623 nova_compute[226235]: 2026-01-31 07:47:27.397 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:28.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:29.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:29 np0005603623 nova_compute[226235]: 2026-01-31 07:47:29.193 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:47:29.194 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 02:47:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:47:29.196 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 02:47:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:47:30.082 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:47:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:47:30.083 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:47:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:47:30.083 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:47:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:30.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:31.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:31 np0005603623 ovn_controller[133449]: 2026-01-31T07:47:31Z|00059|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 02:47:31 np0005603623 nova_compute[226235]: 2026-01-31 07:47:31.573 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 31 02:47:31 np0005603623 nova_compute[226235]: 2026-01-31 07:47:31.734 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:32 np0005603623 nova_compute[226235]: 2026-01-31 07:47:32.400 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:32.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:33.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:47:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:34.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:47:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:35.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:36 np0005603623 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 31 02:47:36 np0005603623 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Consumed 13.439s CPU time.
Jan 31 02:47:36 np0005603623 systemd-machined[194379]: Machine qemu-5-instance-0000000c terminated.
Jan 31 02:47:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e144 e144: 3 total, 3 up, 3 in
Jan 31 02:47:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:36.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:36 np0005603623 nova_compute[226235]: 2026-01-31 07:47:36.594 226239 INFO nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance shutdown successfully after 26 seconds.
Jan 31 02:47:36 np0005603623 nova_compute[226235]: 2026-01-31 07:47:36.600 226239 INFO nova.virt.libvirt.driver [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance destroyed successfully.
Jan 31 02:47:36 np0005603623 nova_compute[226235]: 2026-01-31 07:47:36.604 226239 INFO nova.virt.libvirt.driver [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance destroyed successfully.
Jan 31 02:47:36 np0005603623 nova_compute[226235]: 2026-01-31 07:47:36.736 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:37.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:37 np0005603623 nova_compute[226235]: 2026-01-31 07:47:37.402 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:47:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2352693422' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:47:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:38.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:47:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:39.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:47:39 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:47:39.199 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:47:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:40.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:41.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:41 np0005603623 nova_compute[226235]: 2026-01-31 07:47:41.738 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:42 np0005603623 nova_compute[226235]: 2026-01-31 07:47:42.403 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:42.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:42.778289) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845662778385, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 448, "num_deletes": 251, "total_data_size": 592241, "memory_usage": 602088, "flush_reason": "Manual Compaction"}
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845662867744, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 390742, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22754, "largest_seqno": 23197, "table_properties": {"data_size": 388238, "index_size": 604, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6114, "raw_average_key_size": 18, "raw_value_size": 383229, "raw_average_value_size": 1179, "num_data_blocks": 27, "num_entries": 325, "num_filter_entries": 325, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845641, "oldest_key_time": 1769845641, "file_creation_time": 1769845662, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 89521 microseconds, and 2414 cpu microseconds.
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:42.867788) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 390742 bytes OK
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:42.867834) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:42.988662) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:42.988711) EVENT_LOG_v1 {"time_micros": 1769845662988700, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:42.988737) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 589466, prev total WAL file size 589466, number of live WAL files 2.
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:42.989320) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(381KB)], [45(8919KB)]
Jan 31 02:47:42 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845662989385, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 9524236, "oldest_snapshot_seqno": -1}
Jan 31 02:47:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:43.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 4499 keys, 7495842 bytes, temperature: kUnknown
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845663294446, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 7495842, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7466058, "index_size": 17450, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11269, "raw_key_size": 114937, "raw_average_key_size": 25, "raw_value_size": 7384804, "raw_average_value_size": 1641, "num_data_blocks": 707, "num_entries": 4499, "num_filter_entries": 4499, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769845662, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:43.294717) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 7495842 bytes
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:43.660821) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 31.2 rd, 24.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 8.7 +0.0 blob) out(7.1 +0.0 blob), read-write-amplify(43.6) write-amplify(19.2) OK, records in: 5013, records dropped: 514 output_compression: NoCompression
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:43.660861) EVENT_LOG_v1 {"time_micros": 1769845663660845, "job": 26, "event": "compaction_finished", "compaction_time_micros": 305166, "compaction_time_cpu_micros": 14376, "output_level": 6, "num_output_files": 1, "total_output_size": 7495842, "num_input_records": 5013, "num_output_records": 4499, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845663661094, "job": 26, "event": "table_file_deletion", "file_number": 47}
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845663661930, "job": 26, "event": "table_file_deletion", "file_number": 45}
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:42.989244) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:43.662006) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:43.662012) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:43.662014) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:43.662016) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:47:43.662018) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:44.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:45.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:46.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:46 np0005603623 nova_compute[226235]: 2026-01-31 07:47:46.739 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:47.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:47 np0005603623 nova_compute[226235]: 2026-01-31 07:47:47.406 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:48 np0005603623 podman[233020]: 2026-01-31 07:47:48.020325847 +0000 UTC m=+0.103419652 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 02:47:48 np0005603623 podman[233046]: 2026-01-31 07:47:48.053194699 +0000 UTC m=+0.037962033 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 02:47:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:47:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:48.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:47:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:49.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:47:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:50.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:47:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:51.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:51 np0005603623 nova_compute[226235]: 2026-01-31 07:47:51.396 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845656.3947017, d3f77a29-d3a7-444c-9528-ab679b9b946c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:47:51 np0005603623 nova_compute[226235]: 2026-01-31 07:47:51.397 226239 INFO nova.compute.manager [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] VM Stopped (Lifecycle Event)
Jan 31 02:47:51 np0005603623 nova_compute[226235]: 2026-01-31 07:47:51.445 226239 DEBUG nova.compute.manager [None req-fec58593-7d20-449d-b1dd-5327d45b0fc6 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:47:51 np0005603623 nova_compute[226235]: 2026-01-31 07:47:51.447 226239 DEBUG nova.compute.manager [None req-fec58593-7d20-449d-b1dd-5327d45b0fc6 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: rebuilding, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:47:51 np0005603623 nova_compute[226235]: 2026-01-31 07:47:51.474 226239 INFO nova.compute.manager [None req-fec58593-7d20-449d-b1dd-5327d45b0fc6 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] During sync_power_state the instance has a pending task (rebuilding). Skip.
Jan 31 02:47:51 np0005603623 nova_compute[226235]: 2026-01-31 07:47:51.741 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:52 np0005603623 nova_compute[226235]: 2026-01-31 07:47:52.005 226239 INFO nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Deleting instance files /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c_del
Jan 31 02:47:52 np0005603623 nova_compute[226235]: 2026-01-31 07:47:52.006 226239 INFO nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Deletion of /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c_del complete
Jan 31 02:47:52 np0005603623 nova_compute[226235]: 2026-01-31 07:47:52.191 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 02:47:52 np0005603623 nova_compute[226235]: 2026-01-31 07:47:52.192 226239 INFO nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Creating image(s)
Jan 31 02:47:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:52.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:52 np0005603623 nova_compute[226235]: 2026-01-31 07:47:52.688 226239 DEBUG nova.storage.rbd_utils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:47:52 np0005603623 nova_compute[226235]: 2026-01-31 07:47:52.725 226239 DEBUG nova.storage.rbd_utils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:47:52 np0005603623 nova_compute[226235]: 2026-01-31 07:47:52.761 226239 DEBUG nova.storage.rbd_utils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:47:52 np0005603623 nova_compute[226235]: 2026-01-31 07:47:52.765 226239 DEBUG oslo_concurrency.lockutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Acquiring lock "365f9823d2619ef09948bdeed685488da63755b5" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:47:52 np0005603623 nova_compute[226235]: 2026-01-31 07:47:52.766 226239 DEBUG oslo_concurrency.lockutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:47:52 np0005603623 nova_compute[226235]: 2026-01-31 07:47:52.769 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:53.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:53 np0005603623 nova_compute[226235]: 2026-01-31 07:47:53.395 226239 DEBUG nova.virt.libvirt.imagebackend [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/0864ca59-9877-4e6d-adfc-f0a3204ed8f8/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/0864ca59-9877-4e6d-adfc-f0a3204ed8f8/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 02:47:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:54.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e145 e145: 3 total, 3 up, 3 in
Jan 31 02:47:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:55.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:56.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:56 np0005603623 nova_compute[226235]: 2026-01-31 07:47:56.862 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:47:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:57.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:47:57 np0005603623 nova_compute[226235]: 2026-01-31 07:47:57.417 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:47:57 np0005603623 nova_compute[226235]: 2026-01-31 07:47:57.488 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.part --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:47:57 np0005603623 nova_compute[226235]: 2026-01-31 07:47:57.490 226239 DEBUG nova.virt.images [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] 0864ca59-9877-4e6d-adfc-f0a3204ed8f8 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 31 02:47:57 np0005603623 nova_compute[226235]: 2026-01-31 07:47:57.675 226239 DEBUG nova.privsep.utils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 31 02:47:57 np0005603623 nova_compute[226235]: 2026-01-31 07:47:57.676 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.part /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:47:57 np0005603623 nova_compute[226235]: 2026-01-31 07:47:57.772 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:58.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:47:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:59.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:00.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:01.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:01 np0005603623 nova_compute[226235]: 2026-01-31 07:48:01.743 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:02.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:02 np0005603623 nova_compute[226235]: 2026-01-31 07:48:02.776 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:03 np0005603623 ceph-mds[84161]: mds.beacon.cephfs.compute-2.asgtzy missed beacon ack from the monitors
Jan 31 02:48:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:03.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:03 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.069866180s, txc = 0x55e61162f500
Jan 31 02:48:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:04.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:05.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:06.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:06 np0005603623 nova_compute[226235]: 2026-01-31 07:48:06.744 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:07 np0005603623 nova_compute[226235]: 2026-01-31 07:48:07.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:07 np0005603623 nova_compute[226235]: 2026-01-31 07:48:07.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:48:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:07.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:48:07 np0005603623 nova_compute[226235]: 2026-01-31 07:48:07.778 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:08 np0005603623 nova_compute[226235]: 2026-01-31 07:48:08.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:08.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.195 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.195 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.196 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.196 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.197 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:09.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:09 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.310748577s, txc = 0x55e6117af200
Jan 31 02:48:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:48:09 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1720258811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.694 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.842 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.844 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4927MB free_disk=20.942729949951172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.844 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.844 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.908 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance d3f77a29-d3a7-444c-9528-ab679b9b946c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.909 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.909 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:48:09 np0005603623 nova_compute[226235]: 2026-01-31 07:48:09.940 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:48:10 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/739609239' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:48:10 np0005603623 nova_compute[226235]: 2026-01-31 07:48:10.351 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:10 np0005603623 nova_compute[226235]: 2026-01-31 07:48:10.361 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:48:10 np0005603623 nova_compute[226235]: 2026-01-31 07:48:10.386 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:48:10 np0005603623 nova_compute[226235]: 2026-01-31 07:48:10.412 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:48:10 np0005603623 nova_compute[226235]: 2026-01-31 07:48:10.412 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:10.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:48:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:11.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.408 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.429 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.429 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.430 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.451 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-d3f77a29-d3a7-444c-9528-ab679b9b946c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.451 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-d3f77a29-d3a7-444c-9528-ab679b9b946c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.452 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.452 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.680 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.745 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.943 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.part /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.converted" returned: 0 in 14.266s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.946 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.960 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.979 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-d3f77a29-d3a7-444c-9528-ab679b9b946c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.980 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.980 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.981 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:48:11 np0005603623 nova_compute[226235]: 2026-01-31 07:48:11.981 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 02:48:12 np0005603623 nova_compute[226235]: 2026-01-31 07:48:12.011 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.converted --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:48:12 np0005603623 nova_compute[226235]: 2026-01-31 07:48:12.012 226239 DEBUG oslo_concurrency.lockutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 19.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:48:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:48:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2303136280' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:48:12 np0005603623 nova_compute[226235]: 2026-01-31 07:48:12.115 226239 DEBUG nova.storage.rbd_utils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:48:12 np0005603623 nova_compute[226235]: 2026-01-31 07:48:12.119 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 d3f77a29-d3a7-444c-9528-ab679b9b946c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:48:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:48:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:12.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:48:12 np0005603623 nova_compute[226235]: 2026-01-31 07:48:12.780 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:48:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:48:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:13.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:48:14 np0005603623 nova_compute[226235]: 2026-01-31 07:48:14.329 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 d3f77a29-d3a7-444c-9528-ab679b9b946c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:48:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:14 np0005603623 nova_compute[226235]: 2026-01-31 07:48:14.441 226239 DEBUG nova.storage.rbd_utils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] resizing rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 02:48:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:14.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:14 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:48:14 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:48:14 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:48:14 np0005603623 nova_compute[226235]: 2026-01-31 07:48:14.994 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 02:48:14 np0005603623 nova_compute[226235]: 2026-01-31 07:48:14.995 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Ensure instance console log exists: /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 02:48:14 np0005603623 nova_compute[226235]: 2026-01-31 07:48:14.996 226239 DEBUG oslo_concurrency.lockutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:48:14 np0005603623 nova_compute[226235]: 2026-01-31 07:48:14.996 226239 DEBUG oslo_concurrency.lockutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:48:14 np0005603623 nova_compute[226235]: 2026-01-31 07:48:14.997 226239 DEBUG oslo_concurrency.lockutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:48:14 np0005603623 nova_compute[226235]: 2026-01-31 07:48:14.998 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.002 226239 WARNING nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.008 226239 DEBUG nova.virt.libvirt.host [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.009 226239 DEBUG nova.virt.libvirt.host [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.012 226239 DEBUG nova.virt.libvirt.host [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.013 226239 DEBUG nova.virt.libvirt.host [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.015 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.015 226239 DEBUG nova.virt.hardware [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.016 226239 DEBUG nova.virt.hardware [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.016 226239 DEBUG nova.virt.hardware [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.016 226239 DEBUG nova.virt.hardware [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.017 226239 DEBUG nova.virt.hardware [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.017 226239 DEBUG nova.virt.hardware [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.017 226239 DEBUG nova.virt.hardware [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.017 226239 DEBUG nova.virt.hardware [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.018 226239 DEBUG nova.virt.hardware [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.018 226239 DEBUG nova.virt.hardware [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.018 226239 DEBUG nova.virt.hardware [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.018 226239 DEBUG nova.objects.instance [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lazy-loading 'vcpu_model' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.042 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:48:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:15.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:48:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3662791080' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.484 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.506 226239 DEBUG nova.storage.rbd_utils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:48:15 np0005603623 nova_compute[226235]: 2026-01-31 07:48:15.511 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:48:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:48:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2493221916' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.120 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.610s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.124 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  <uuid>d3f77a29-d3a7-444c-9528-ab679b9b946c</uuid>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  <name>instance-0000000c</name>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersAdmin275Test-server-1493730364</nova:name>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:48:15</nova:creationTime>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <nova:user uuid="4e887d8783db44ff93a55e1ea75aa78e">tempest-ServersAdmin275Test-200317158-project-member</nova:user>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <nova:project uuid="2ca2f28405884a6ea92bcde9c8f91ff9">tempest-ServersAdmin275Test-200317158</nova:project>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="0864ca59-9877-4e6d-adfc-f0a3204ed8f8"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <entry name="serial">d3f77a29-d3a7-444c-9528-ab679b9b946c</entry>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <entry name="uuid">d3f77a29-d3a7-444c-9528-ab679b9b946c</entry>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/d3f77a29-d3a7-444c-9528-ab679b9b946c_disk">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/console.log" append="off"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:48:16 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:48:16 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:48:16 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:48:16 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.283 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.284 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.285 226239 INFO nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Using config drive
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.350 226239 DEBUG nova.storage.rbd_utils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.373 226239 DEBUG nova.objects.instance [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lazy-loading 'ec2_ids' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.443 226239 DEBUG nova.objects.instance [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lazy-loading 'keypairs' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:48:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:16.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.621 226239 INFO nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Creating config drive at /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config#033[00m
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.624 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp7qkuemmi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.747 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.752 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp7qkuemmi" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.781 226239 DEBUG nova.storage.rbd_utils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:16 np0005603623 nova_compute[226235]: 2026-01-31 07:48:16.785 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e146 e146: 3 total, 3 up, 3 in
Jan 31 02:48:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:48:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:17.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:48:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:17.576 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:48:17 np0005603623 nova_compute[226235]: 2026-01-31 07:48:17.576 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:17.577 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:48:17 np0005603623 nova_compute[226235]: 2026-01-31 07:48:17.783 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.258 226239 DEBUG oslo_concurrency.processutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.259 226239 INFO nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Deleting local config drive /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config because it was imported into RBD.#033[00m
Jan 31 02:48:18 np0005603623 systemd-machined[194379]: New machine qemu-6-instance-0000000c.
Jan 31 02:48:18 np0005603623 systemd[1]: Started Virtual Machine qemu-6-instance-0000000c.
Jan 31 02:48:18 np0005603623 podman[233664]: 2026-01-31 07:48:18.387287836 +0000 UTC m=+0.054272007 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 02:48:18 np0005603623 podman[233665]: 2026-01-31 07:48:18.41827208 +0000 UTC m=+0.086826090 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 02:48:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:18.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.911 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845698.910986, d3f77a29-d3a7-444c-9528-ab679b9b946c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.912 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.915 226239 DEBUG nova.compute.manager [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.915 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.918 226239 INFO nova.virt.libvirt.driver [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance spawned successfully.#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.919 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.952 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.952 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.953 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.953 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.954 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.954 226239 DEBUG nova.virt.libvirt.driver [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.959 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:48:18 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.964 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:48:19 np0005603623 nova_compute[226235]: 2026-01-31 07:48:18.999 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 31 02:48:19 np0005603623 nova_compute[226235]: 2026-01-31 07:48:19.000 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845698.9111164, d3f77a29-d3a7-444c-9528-ab679b9b946c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:48:19 np0005603623 nova_compute[226235]: 2026-01-31 07:48:19.000 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] VM Started (Lifecycle Event)#033[00m
Jan 31 02:48:19 np0005603623 nova_compute[226235]: 2026-01-31 07:48:19.040 226239 DEBUG nova.compute.manager [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:48:19 np0005603623 nova_compute[226235]: 2026-01-31 07:48:19.068 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:48:19 np0005603623 nova_compute[226235]: 2026-01-31 07:48:19.072 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:48:19 np0005603623 nova_compute[226235]: 2026-01-31 07:48:19.104 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 31 02:48:19 np0005603623 nova_compute[226235]: 2026-01-31 07:48:19.129 226239 DEBUG oslo_concurrency.lockutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:19 np0005603623 nova_compute[226235]: 2026-01-31 07:48:19.130 226239 DEBUG oslo_concurrency.lockutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:19 np0005603623 nova_compute[226235]: 2026-01-31 07:48:19.130 226239 DEBUG nova.objects.instance [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 02:48:19 np0005603623 nova_compute[226235]: 2026-01-31 07:48:19.180 226239 DEBUG oslo_concurrency.lockutils [None req-853982a4-9bfd-4487-8d4e-7160fac98f33 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:19.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:20.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:21.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:21 np0005603623 nova_compute[226235]: 2026-01-31 07:48:21.530 226239 INFO nova.compute.manager [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Rebuilding instance#033[00m
Jan 31 02:48:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:21.579 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:21 np0005603623 nova_compute[226235]: 2026-01-31 07:48:21.749 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:21 np0005603623 nova_compute[226235]: 2026-01-31 07:48:21.913 226239 DEBUG nova.objects.instance [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lazy-loading 'trusted_certs' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:21 np0005603623 nova_compute[226235]: 2026-01-31 07:48:21.948 226239 DEBUG nova.compute.manager [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:48:22 np0005603623 nova_compute[226235]: 2026-01-31 07:48:22.037 226239 DEBUG nova.objects.instance [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lazy-loading 'pci_requests' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:22 np0005603623 nova_compute[226235]: 2026-01-31 07:48:22.061 226239 DEBUG nova.objects.instance [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lazy-loading 'pci_devices' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:22 np0005603623 nova_compute[226235]: 2026-01-31 07:48:22.077 226239 DEBUG nova.objects.instance [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lazy-loading 'resources' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:22 np0005603623 nova_compute[226235]: 2026-01-31 07:48:22.094 226239 DEBUG nova.objects.instance [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lazy-loading 'migration_context' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:22 np0005603623 nova_compute[226235]: 2026-01-31 07:48:22.115 226239 DEBUG nova.objects.instance [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 02:48:22 np0005603623 nova_compute[226235]: 2026-01-31 07:48:22.117 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 02:48:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:22.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:22 np0005603623 nova_compute[226235]: 2026-01-31 07:48:22.784 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:23.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:24.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:25.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:26.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:26 np0005603623 nova_compute[226235]: 2026-01-31 07:48:26.751 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:26 np0005603623 nova_compute[226235]: 2026-01-31 07:48:26.786 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:26 np0005603623 nova_compute[226235]: 2026-01-31 07:48:26.787 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:26 np0005603623 nova_compute[226235]: 2026-01-31 07:48:26.802 226239 DEBUG nova.compute.manager [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:48:26 np0005603623 nova_compute[226235]: 2026-01-31 07:48:26.863 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:26 np0005603623 nova_compute[226235]: 2026-01-31 07:48:26.863 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:26 np0005603623 nova_compute[226235]: 2026-01-31 07:48:26.870 226239 DEBUG nova.virt.hardware [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:48:26 np0005603623 nova_compute[226235]: 2026-01-31 07:48:26.870 226239 INFO nova.compute.claims [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:48:27 np0005603623 nova_compute[226235]: 2026-01-31 07:48:27.030 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:27.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:27 np0005603623 nova_compute[226235]: 2026-01-31 07:48:27.787 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:48:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1457567476' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:48:27 np0005603623 nova_compute[226235]: 2026-01-31 07:48:27.890 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.859s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:27 np0005603623 nova_compute[226235]: 2026-01-31 07:48:27.898 226239 DEBUG nova.compute.provider_tree [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:48:27 np0005603623 nova_compute[226235]: 2026-01-31 07:48:27.984 226239 DEBUG nova.scheduler.client.report [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.019 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.020 226239 DEBUG nova.compute.manager [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.088 226239 DEBUG nova.compute.manager [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.088 226239 DEBUG nova.network.neutron [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.109 226239 INFO nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.136 226239 DEBUG nova.compute.manager [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.218 226239 DEBUG nova.compute.manager [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.219 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.219 226239 INFO nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Creating image(s)#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.374 226239 DEBUG nova.storage.rbd_utils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] rbd image 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.399 226239 DEBUG nova.storage.rbd_utils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] rbd image 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.427 226239 DEBUG nova.storage.rbd_utils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] rbd image 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.431 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:48:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:28.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.478 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.478 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.479 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.479 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.503 226239 DEBUG nova.storage.rbd_utils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] rbd image 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.506 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:28 np0005603623 nova_compute[226235]: 2026-01-31 07:48:28.759 226239 DEBUG nova.policy [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '37ed25cc14814a29867ac308b3cce8cf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:48:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:29.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:48:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:48:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:30.084 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:30.084 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:30.084 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:48:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:30.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:48:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:31.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:31 np0005603623 nova_compute[226235]: 2026-01-31 07:48:31.503 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.997s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:31 np0005603623 nova_compute[226235]: 2026-01-31 07:48:31.573 226239 DEBUG nova.storage.rbd_utils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] resizing rbd image 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:48:31 np0005603623 nova_compute[226235]: 2026-01-31 07:48:31.857 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:32 np0005603623 nova_compute[226235]: 2026-01-31 07:48:32.174 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 02:48:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:32.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:32 np0005603623 nova_compute[226235]: 2026-01-31 07:48:32.790 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:33 np0005603623 nova_compute[226235]: 2026-01-31 07:48:33.102 226239 DEBUG nova.objects.instance [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lazy-loading 'migration_context' on Instance uuid 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:33 np0005603623 nova_compute[226235]: 2026-01-31 07:48:33.117 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:48:33 np0005603623 nova_compute[226235]: 2026-01-31 07:48:33.118 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Ensure instance console log exists: /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:48:33 np0005603623 nova_compute[226235]: 2026-01-31 07:48:33.119 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:33 np0005603623 nova_compute[226235]: 2026-01-31 07:48:33.119 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:33 np0005603623 nova_compute[226235]: 2026-01-31 07:48:33.119 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:33.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:33 np0005603623 nova_compute[226235]: 2026-01-31 07:48:33.380 226239 DEBUG nova.network.neutron [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Successfully updated port: 06448a4a-1828-42e2-810c-09e0ca21c35f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:48:33 np0005603623 nova_compute[226235]: 2026-01-31 07:48:33.403 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:48:33 np0005603623 nova_compute[226235]: 2026-01-31 07:48:33.403 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquired lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:48:33 np0005603623 nova_compute[226235]: 2026-01-31 07:48:33.403 226239 DEBUG nova.network.neutron [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:48:33 np0005603623 nova_compute[226235]: 2026-01-31 07:48:33.618 226239 DEBUG nova.network.neutron [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:48:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:34.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:34 np0005603623 nova_compute[226235]: 2026-01-31 07:48:34.651 226239 DEBUG nova.compute.manager [req-60f542a6-7e83-4a3b-bec9-6227bd2e58bb req-3f5ffd85-40ea-4124-9fcf-2546b614cb41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-changed-06448a4a-1828-42e2-810c-09e0ca21c35f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:48:34 np0005603623 nova_compute[226235]: 2026-01-31 07:48:34.652 226239 DEBUG nova.compute.manager [req-60f542a6-7e83-4a3b-bec9-6227bd2e58bb req-3f5ffd85-40ea-4124-9fcf-2546b614cb41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Refreshing instance network info cache due to event network-changed-06448a4a-1828-42e2-810c-09e0ca21c35f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:48:34 np0005603623 nova_compute[226235]: 2026-01-31 07:48:34.652 226239 DEBUG oslo_concurrency.lockutils [req-60f542a6-7e83-4a3b-bec9-6227bd2e58bb req-3f5ffd85-40ea-4124-9fcf-2546b614cb41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:48:35 np0005603623 ovn_controller[133449]: 2026-01-31T07:48:35Z|00060|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 31 02:48:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:35.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.598 226239 DEBUG nova.network.neutron [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Updating instance_info_cache with network_info: [{"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.629 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Releasing lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.630 226239 DEBUG nova.compute.manager [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Instance network_info: |[{"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.631 226239 DEBUG oslo_concurrency.lockutils [req-60f542a6-7e83-4a3b-bec9-6227bd2e58bb req-3f5ffd85-40ea-4124-9fcf-2546b614cb41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.632 226239 DEBUG nova.network.neutron [req-60f542a6-7e83-4a3b-bec9-6227bd2e58bb req-3f5ffd85-40ea-4124-9fcf-2546b614cb41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Refreshing network info cache for port 06448a4a-1828-42e2-810c-09e0ca21c35f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.637 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Start _get_guest_xml network_info=[{"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.644 226239 WARNING nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.651 226239 DEBUG nova.virt.libvirt.host [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.652 226239 DEBUG nova.virt.libvirt.host [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.660 226239 DEBUG nova.virt.libvirt.host [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.661 226239 DEBUG nova.virt.libvirt.host [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.662 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.662 226239 DEBUG nova.virt.hardware [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.663 226239 DEBUG nova.virt.hardware [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.663 226239 DEBUG nova.virt.hardware [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.663 226239 DEBUG nova.virt.hardware [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.664 226239 DEBUG nova.virt.hardware [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.664 226239 DEBUG nova.virt.hardware [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.664 226239 DEBUG nova.virt.hardware [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.664 226239 DEBUG nova.virt.hardware [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.664 226239 DEBUG nova.virt.hardware [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.665 226239 DEBUG nova.virt.hardware [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.665 226239 DEBUG nova.virt.hardware [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:48:35 np0005603623 nova_compute[226235]: 2026-01-31 07:48:35.668 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:48:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1295809723' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:48:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:36.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:36 np0005603623 nova_compute[226235]: 2026-01-31 07:48:36.735 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:36 np0005603623 nova_compute[226235]: 2026-01-31 07:48:36.761 226239 DEBUG nova.storage.rbd_utils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] rbd image 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:36 np0005603623 nova_compute[226235]: 2026-01-31 07:48:36.764 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:36 np0005603623 nova_compute[226235]: 2026-01-31 07:48:36.782 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:48:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/399283427' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.165 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.168 226239 DEBUG nova.virt.libvirt.vif [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:48:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1707336579',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1707336579',id=16,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-q29px744',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:48:28Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=5e4f7ec6-bb38-4a62-88f4-5e5b869452f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.168 226239 DEBUG nova.network.os_vif_util [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Converting VIF {"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.169 226239 DEBUG nova.network.os_vif_util [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.171 226239 DEBUG nova.objects.instance [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.203 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  <uuid>5e4f7ec6-bb38-4a62-88f4-5e5b869452f0</uuid>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  <name>instance-00000010</name>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1707336579</nova:name>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:48:35</nova:creationTime>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <nova:user uuid="37ed25cc14814a29867ac308b3cce8cf">tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member</nova:user>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <nova:project uuid="e66a774f63ae4139a4e75c7973fbe077">tempest-LiveAutoBlockMigrationV225Test-2072827810</nova:project>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <nova:port uuid="06448a4a-1828-42e2-810c-09e0ca21c35f">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <entry name="serial">5e4f7ec6-bb38-4a62-88f4-5e5b869452f0</entry>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <entry name="uuid">5e4f7ec6-bb38-4a62-88f4-5e5b869452f0</entry>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk.config">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:da:f3:77"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <target dev="tap06448a4a-18"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0/console.log" append="off"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:48:37 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:48:37 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:48:37 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:48:37 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.204 226239 DEBUG nova.compute.manager [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Preparing to wait for external event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.204 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.205 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.205 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.206 226239 DEBUG nova.virt.libvirt.vif [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:48:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1707336579',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1707336579',id=16,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-q29px744',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:48:28Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=5e4f7ec6-bb38-4a62-88f4-5e5b869452f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.206 226239 DEBUG nova.network.os_vif_util [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Converting VIF {"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.206 226239 DEBUG nova.network.os_vif_util [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.207 226239 DEBUG os_vif [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.207 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.208 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.208 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.212 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.212 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06448a4a-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.213 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06448a4a-18, col_values=(('external_ids', {'iface-id': '06448a4a-1828-42e2-810c-09e0ca21c35f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:f3:77', 'vm-uuid': '5e4f7ec6-bb38-4a62-88f4-5e5b869452f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.214 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:37 np0005603623 NetworkManager[48970]: <info>  [1769845717.2160] manager: (tap06448a4a-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.220 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.221 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.222 226239 INFO os_vif [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18')#033[00m
Jan 31 02:48:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:37.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.515 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.516 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.516 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] No VIF found with MAC fa:16:3e:da:f3:77, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.517 226239 INFO nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Using config drive#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.551 226239 DEBUG nova.storage.rbd_utils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] rbd image 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.620 226239 DEBUG nova.network.neutron [req-60f542a6-7e83-4a3b-bec9-6227bd2e58bb req-3f5ffd85-40ea-4124-9fcf-2546b614cb41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Updated VIF entry in instance network info cache for port 06448a4a-1828-42e2-810c-09e0ca21c35f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.621 226239 DEBUG nova.network.neutron [req-60f542a6-7e83-4a3b-bec9-6227bd2e58bb req-3f5ffd85-40ea-4124-9fcf-2546b614cb41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Updating instance_info_cache with network_info: [{"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:48:37 np0005603623 nova_compute[226235]: 2026-01-31 07:48:37.818 226239 DEBUG oslo_concurrency.lockutils [req-60f542a6-7e83-4a3b-bec9-6227bd2e58bb req-3f5ffd85-40ea-4124-9fcf-2546b614cb41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:48:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:38.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:38 np0005603623 nova_compute[226235]: 2026-01-31 07:48:38.591 226239 INFO nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Creating config drive at /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0/disk.config#033[00m
Jan 31 02:48:38 np0005603623 nova_compute[226235]: 2026-01-31 07:48:38.595 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpwvqii26_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:38 np0005603623 nova_compute[226235]: 2026-01-31 07:48:38.714 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpwvqii26_" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:38 np0005603623 nova_compute[226235]: 2026-01-31 07:48:38.980 226239 DEBUG nova.storage.rbd_utils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] rbd image 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:38 np0005603623 nova_compute[226235]: 2026-01-31 07:48:38.986 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0/disk.config 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:48:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:39.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:48:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:40.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:48:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:41.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:48:41 np0005603623 nova_compute[226235]: 2026-01-31 07:48:41.756 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:42 np0005603623 nova_compute[226235]: 2026-01-31 07:48:42.215 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:42.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:43 np0005603623 nova_compute[226235]: 2026-01-31 07:48:43.224 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 02:48:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:43.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:48:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:44.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:48:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:45.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:48:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:46.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:48:46 np0005603623 nova_compute[226235]: 2026-01-31 07:48:46.502 226239 DEBUG oslo_concurrency.processutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0/disk.config 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 7.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:46 np0005603623 nova_compute[226235]: 2026-01-31 07:48:46.502 226239 INFO nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Deleting local config drive /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0/disk.config because it was imported into RBD.#033[00m
Jan 31 02:48:46 np0005603623 kernel: tap06448a4a-18: entered promiscuous mode
Jan 31 02:48:46 np0005603623 NetworkManager[48970]: <info>  [1769845726.5488] manager: (tap06448a4a-18): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 31 02:48:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:48:46Z|00061|binding|INFO|Claiming lport 06448a4a-1828-42e2-810c-09e0ca21c35f for this chassis.
Jan 31 02:48:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:48:46Z|00062|binding|INFO|06448a4a-1828-42e2-810c-09e0ca21c35f: Claiming fa:16:3e:da:f3:77 10.100.0.14
Jan 31 02:48:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:48:46Z|00063|binding|INFO|Claiming lport 3162329c-f03f-465e-9f99-799a29d883a0 for this chassis.
Jan 31 02:48:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:48:46Z|00064|binding|INFO|3162329c-f03f-465e-9f99-799a29d883a0: Claiming fa:16:3e:a4:c3:df 19.80.0.151
Jan 31 02:48:46 np0005603623 nova_compute[226235]: 2026-01-31 07:48:46.549 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:46 np0005603623 nova_compute[226235]: 2026-01-31 07:48:46.556 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.571 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:f3:77 10.100.0.14'], port_security=['fa:16:3e:da:f3:77 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-116143868', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5e4f7ec6-bb38-4a62-88f4-5e5b869452f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-116143868', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227ca833-938d-48d2-86c8-5d09dd658c40, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=06448a4a-1828-42e2-810c-09e0ca21c35f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:48:46 np0005603623 systemd-udevd[234198]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.573 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:c3:df 19.80.0.151'], port_security=['fa:16:3e:a4:c3:df 19.80.0.151'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['06448a4a-1828-42e2-810c-09e0ca21c35f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1877730947', 'neutron:cidrs': '19.80.0.151/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1877730947', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=0d46eecb-5024-425b-affd-165dd8eae0e4, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3162329c-f03f-465e-9f99-799a29d883a0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.574 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 06448a4a-1828-42e2-810c-09e0ca21c35f in datapath 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d bound to our chassis#033[00m
Jan 31 02:48:46 np0005603623 systemd-machined[194379]: New machine qemu-7-instance-00000010.
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.577 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d#033[00m
Jan 31 02:48:46 np0005603623 NetworkManager[48970]: <info>  [1769845726.5833] device (tap06448a4a-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:48:46 np0005603623 NetworkManager[48970]: <info>  [1769845726.5837] device (tap06448a4a-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.587 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[17628263-bf8d-48ec-8ed0-d78b0f90d92e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.588 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap60bb4bea-d1 in ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.590 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap60bb4bea-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.590 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[95f93435-c1ef-40d3-9799-dada39539851]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.591 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d1be81-a525-4243-a68c-0b60f21eadf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.597 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[f9cbb589-5464-4df8-a1c4-d165b4785e96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 systemd[1]: Started Virtual Machine qemu-7-instance-00000010.
Jan 31 02:48:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:48:46Z|00065|binding|INFO|Setting lport 06448a4a-1828-42e2-810c-09e0ca21c35f ovn-installed in OVS
Jan 31 02:48:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:48:46Z|00066|binding|INFO|Setting lport 06448a4a-1828-42e2-810c-09e0ca21c35f up in Southbound
Jan 31 02:48:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:48:46Z|00067|binding|INFO|Setting lport 3162329c-f03f-465e-9f99-799a29d883a0 up in Southbound
Jan 31 02:48:46 np0005603623 nova_compute[226235]: 2026-01-31 07:48:46.609 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.610 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[93fc8775-26b4-4f70-ab99-61f036b94c91]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.634 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f92c58fe-4745-42bd-8952-e46597cfde36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.641 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[78698e81-fe72-4a88-8869-b2c2f8302a89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 NetworkManager[48970]: <info>  [1769845726.6419] manager: (tap60bb4bea-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.666 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e4c4b78c-2cd2-48b0-beac-6fc17a491711]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.670 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3dec1991-47f1-4023-af33-0283ec88ac09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 NetworkManager[48970]: <info>  [1769845726.6867] device (tap60bb4bea-d0): carrier: link connected
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.689 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[cdba8cec-82a0-400e-811f-10176c3e95b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.704 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a6b3575d-eb9b-47ba-a0ef-e4ed64087278]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap60bb4bea-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:b1:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486711, 'reachable_time': 26417, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234232, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.716 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ca2192-10b4-433a-98be-90e6dae4f0b3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4e:b1c0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486711, 'tstamp': 486711}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234233, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.729 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6eecc60d-58e5-4647-ad43-6ad442fbe19e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap60bb4bea-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:b1:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486711, 'reachable_time': 26417, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234234, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.755 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d9bf4bdf-9743-458a-bb95-5296b5cbe950]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 nova_compute[226235]: 2026-01-31 07:48:46.756 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.792 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8b42371e-2ba7-45e1-ba5e-a815104ffc36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.793 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60bb4bea-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.793 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.794 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60bb4bea-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:46 np0005603623 nova_compute[226235]: 2026-01-31 07:48:46.795 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:46 np0005603623 NetworkManager[48970]: <info>  [1769845726.7962] manager: (tap60bb4bea-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 31 02:48:46 np0005603623 kernel: tap60bb4bea-d0: entered promiscuous mode
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.798 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap60bb4bea-d0, col_values=(('external_ids', {'iface-id': 'eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:46 np0005603623 nova_compute[226235]: 2026-01-31 07:48:46.799 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:48:46Z|00068|binding|INFO|Releasing lport eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8 from this chassis (sb_readonly=0)
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.801 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.802 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[78b35e5f-3293-4321-9443-7801a008d35e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.802 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.pid.haproxy
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:48:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:46.803 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'env', 'PROCESS_TAG=haproxy-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:48:46 np0005603623 nova_compute[226235]: 2026-01-31 07:48:46.804 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:47 np0005603623 podman[234266]: 2026-01-31 07:48:47.118895174 +0000 UTC m=+0.027337669 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:48:47 np0005603623 nova_compute[226235]: 2026-01-31 07:48:47.398 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:48:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:47.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:48:48 np0005603623 podman[234266]: 2026-01-31 07:48:48.4417787 +0000 UTC m=+1.350221175 container create 2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 02:48:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:48.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:48 np0005603623 systemd[1]: Started libpod-conmon-2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9.scope.
Jan 31 02:48:48 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:48:48 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f67975f633fd0d016f345aebdb3ca181c0e818fd3645476c012efab03a72c3ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.759 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845728.7585552, 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.759 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] VM Started (Lifecycle Event)#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.776 226239 DEBUG nova.compute.manager [req-ea65a495-24f2-45ba-ab83-f9f86b7a85b0 req-1b73306c-d20c-4549-8507-05fd57488a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.776 226239 DEBUG oslo_concurrency.lockutils [req-ea65a495-24f2-45ba-ab83-f9f86b7a85b0 req-1b73306c-d20c-4549-8507-05fd57488a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.777 226239 DEBUG oslo_concurrency.lockutils [req-ea65a495-24f2-45ba-ab83-f9f86b7a85b0 req-1b73306c-d20c-4549-8507-05fd57488a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.777 226239 DEBUG oslo_concurrency.lockutils [req-ea65a495-24f2-45ba-ab83-f9f86b7a85b0 req-1b73306c-d20c-4549-8507-05fd57488a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.777 226239 DEBUG nova.compute.manager [req-ea65a495-24f2-45ba-ab83-f9f86b7a85b0 req-1b73306c-d20c-4549-8507-05fd57488a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Processing event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.778 226239 DEBUG nova.compute.manager [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.782 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.786 226239 INFO nova.virt.libvirt.driver [-] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Instance spawned successfully.#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.786 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.813 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.818 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.822 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.822 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.823 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.823 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.824 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.824 226239 DEBUG nova.virt.libvirt.driver [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.855 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.855 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845728.759365, 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.856 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.893 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.896 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845728.7821512, 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.896 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.906 226239 INFO nova.compute.manager [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Took 20.69 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.907 226239 DEBUG nova.compute.manager [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.925 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.929 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.959 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:48:48 np0005603623 nova_compute[226235]: 2026-01-31 07:48:48.980 226239 INFO nova.compute.manager [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Took 22.14 seconds to build instance.#033[00m
Jan 31 02:48:49 np0005603623 nova_compute[226235]: 2026-01-31 07:48:49.002 226239 DEBUG oslo_concurrency.lockutils [None req-49e6ab27-ac76-4cad-aeb4-7c149dbcf1a6 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:49 np0005603623 podman[234266]: 2026-01-31 07:48:49.045376169 +0000 UTC m=+1.953818664 container init 2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:48:49 np0005603623 podman[234266]: 2026-01-31 07:48:49.052596015 +0000 UTC m=+1.961038530 container start 2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Jan 31 02:48:49 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[234351]: [NOTICE]   (234372) : New worker (234374) forked
Jan 31 02:48:49 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[234351]: [NOTICE]   (234372) : Loading success.
Jan 31 02:48:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:49.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:49 np0005603623 podman[234317]: 2026-01-31 07:48:49.637717444 +0000 UTC m=+1.157424205 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Jan 31 02:48:49 np0005603623 podman[234316]: 2026-01-31 07:48:49.638996674 +0000 UTC m=+1.162928509 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.670 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 3162329c-f03f-465e-9f99-799a29d883a0 in datapath 057fed11-d4e4-4c56-8ba3-81a6235ed1bf unbound from our chassis#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.672 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 057fed11-d4e4-4c56-8ba3-81a6235ed1bf#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.680 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[50f1501e-9132-443f-8952-801414ef6c46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.681 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap057fed11-d1 in ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.685 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap057fed11-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.685 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[71a8337a-5d65-496f-b880-a68fc9246531]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.686 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[42eba736-c934-447a-8667-27986a42349d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.697 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[fcf56ff3-2a95-4ded-a634-1391892f0c96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.709 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[332ecfe3-ad70-4fc1-ad98-cdf69dce9d25]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.739 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[647c04a7-ee94-4a94-b059-2ab551b5f911]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 NetworkManager[48970]: <info>  [1769845729.7468] manager: (tap057fed11-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.746 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7bbf7739-4c05-4063-a8dd-751ba7900ea3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.779 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[49cf8991-0c81-496f-aadd-072af42c7833]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.784 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[220e0413-4ce4-41a4-8bba-3a1ed9935f6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 systemd-udevd[234392]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:48:49 np0005603623 NetworkManager[48970]: <info>  [1769845729.8099] device (tap057fed11-d0): carrier: link connected
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.812 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[329036ce-7429-4af6-a0c2-190eae0ecc94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.830 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1311b74b-3496-45b8-8eb3-4a0062b7cd01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap057fed11-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:6a:78'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487023, 'reachable_time': 29854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234411, 'error': None, 'target': 'ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.847 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9f9c0c0f-9021-479a-8e59-8622d970d8b9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:6a78'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487023, 'tstamp': 487023}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234412, 'error': None, 'target': 'ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.864 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[983e24cb-cd2a-428a-936c-0a3f06fdec6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap057fed11-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:6a:78'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487023, 'reachable_time': 29854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234413, 'error': None, 'target': 'ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.892 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2be13515-2990-49d5-839a-e362c382149e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.953 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb47c6c-d224-49f4-84df-88a4a4e018cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.955 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap057fed11-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.955 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.956 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap057fed11-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:49 np0005603623 nova_compute[226235]: 2026-01-31 07:48:49.958 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:49 np0005603623 NetworkManager[48970]: <info>  [1769845729.9587] manager: (tap057fed11-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 31 02:48:49 np0005603623 kernel: tap057fed11-d0: entered promiscuous mode
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.962 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap057fed11-d0, col_values=(('external_ids', {'iface-id': '1e50dcb0-cab9-4443-a53d-151822a6eb9a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:49 np0005603623 ovn_controller[133449]: 2026-01-31T07:48:49Z|00069|binding|INFO|Releasing lport 1e50dcb0-cab9-4443-a53d-151822a6eb9a from this chassis (sb_readonly=0)
Jan 31 02:48:49 np0005603623 nova_compute[226235]: 2026-01-31 07:48:49.963 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.963 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/057fed11-d4e4-4c56-8ba3-81a6235ed1bf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/057fed11-d4e4-4c56-8ba3-81a6235ed1bf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:48:49 np0005603623 nova_compute[226235]: 2026-01-31 07:48:49.967 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.967 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9f774b26-5a8b-4340-8459-791e7fd8cbcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.967 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-057fed11-d4e4-4c56-8ba3-81a6235ed1bf
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/057fed11-d4e4-4c56-8ba3-81a6235ed1bf.pid.haproxy
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 057fed11-d4e4-4c56-8ba3-81a6235ed1bf
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:48:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:48:49.968 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'env', 'PROCESS_TAG=haproxy-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/057fed11-d4e4-4c56-8ba3-81a6235ed1bf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:48:50 np0005603623 podman[234447]: 2026-01-31 07:48:50.322178375 +0000 UTC m=+0.039362808 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:48:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:50.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:50 np0005603623 nova_compute[226235]: 2026-01-31 07:48:50.866 226239 DEBUG nova.compute.manager [req-1bd38402-0a52-4e92-8a90-a0825dafcf3d req-2cdfd6cb-f221-4e80-997c-bb5bf11a134a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:48:50 np0005603623 nova_compute[226235]: 2026-01-31 07:48:50.867 226239 DEBUG oslo_concurrency.lockutils [req-1bd38402-0a52-4e92-8a90-a0825dafcf3d req-2cdfd6cb-f221-4e80-997c-bb5bf11a134a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:50 np0005603623 nova_compute[226235]: 2026-01-31 07:48:50.868 226239 DEBUG oslo_concurrency.lockutils [req-1bd38402-0a52-4e92-8a90-a0825dafcf3d req-2cdfd6cb-f221-4e80-997c-bb5bf11a134a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:50 np0005603623 nova_compute[226235]: 2026-01-31 07:48:50.868 226239 DEBUG oslo_concurrency.lockutils [req-1bd38402-0a52-4e92-8a90-a0825dafcf3d req-2cdfd6cb-f221-4e80-997c-bb5bf11a134a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:50 np0005603623 nova_compute[226235]: 2026-01-31 07:48:50.869 226239 DEBUG nova.compute.manager [req-1bd38402-0a52-4e92-8a90-a0825dafcf3d req-2cdfd6cb-f221-4e80-997c-bb5bf11a134a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] No waiting events found dispatching network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:48:50 np0005603623 nova_compute[226235]: 2026-01-31 07:48:50.869 226239 WARNING nova.compute.manager [req-1bd38402-0a52-4e92-8a90-a0825dafcf3d req-2cdfd6cb-f221-4e80-997c-bb5bf11a134a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received unexpected event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f for instance with vm_state active and task_state None.#033[00m
Jan 31 02:48:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:51.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:51 np0005603623 nova_compute[226235]: 2026-01-31 07:48:51.420 226239 INFO nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance shutdown successfully after 29 seconds.#033[00m
Jan 31 02:48:51 np0005603623 podman[234447]: 2026-01-31 07:48:51.43412341 +0000 UTC m=+1.151307823 container create 9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Jan 31 02:48:51 np0005603623 nova_compute[226235]: 2026-01-31 07:48:51.759 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:51 np0005603623 systemd[1]: Started libpod-conmon-9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8.scope.
Jan 31 02:48:51 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:48:51 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c36673826f20559cb3653fe08e3ce55f88bf346a0f0cb281dad09c112aef597/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:48:52 np0005603623 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 31 02:48:52 np0005603623 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Consumed 12.932s CPU time.
Jan 31 02:48:52 np0005603623 systemd-machined[194379]: Machine qemu-6-instance-0000000c terminated.
Jan 31 02:48:52 np0005603623 nova_compute[226235]: 2026-01-31 07:48:52.239 226239 INFO nova.virt.libvirt.driver [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance destroyed successfully.#033[00m
Jan 31 02:48:52 np0005603623 nova_compute[226235]: 2026-01-31 07:48:52.244 226239 INFO nova.virt.libvirt.driver [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance destroyed successfully.#033[00m
Jan 31 02:48:52 np0005603623 nova_compute[226235]: 2026-01-31 07:48:52.401 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:52 np0005603623 podman[234447]: 2026-01-31 07:48:52.460198726 +0000 UTC m=+2.177383239 container init 9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 02:48:52 np0005603623 podman[234447]: 2026-01-31 07:48:52.46890512 +0000 UTC m=+2.186089573 container start 9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:48:52 np0005603623 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[234512]: [NOTICE]   (234534) : New worker (234536) forked
Jan 31 02:48:52 np0005603623 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[234512]: [NOTICE]   (234534) : Loading success.
Jan 31 02:48:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:52.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:52 np0005603623 nova_compute[226235]: 2026-01-31 07:48:52.857 226239 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Check if temp file /var/lib/nova/instances/tmphvxots3o exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 31 02:48:52 np0005603623 nova_compute[226235]: 2026-01-31 07:48:52.857 226239 DEBUG nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvxots3o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5e4f7ec6-bb38-4a62-88f4-5e5b869452f0',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 31 02:48:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:53.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:53 np0005603623 nova_compute[226235]: 2026-01-31 07:48:53.911 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:48:53 np0005603623 nova_compute[226235]: 2026-01-31 07:48:53.912 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:48:53 np0005603623 nova_compute[226235]: 2026-01-31 07:48:53.927 226239 INFO nova.compute.rpcapi [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Jan 31 02:48:53 np0005603623 nova_compute[226235]: 2026-01-31 07:48:53.928 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:48:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:48:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:54.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:48:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:48:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:55.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:48:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:56.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:56 np0005603623 nova_compute[226235]: 2026-01-31 07:48:56.760 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:57 np0005603623 nova_compute[226235]: 2026-01-31 07:48:57.246 226239 DEBUG nova.compute.manager [req-cfa8f04b-4426-4cd4-973c-f47d5e899b13 req-a39c70c3-2780-42f9-85d0-a78f299a70b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-unplugged-06448a4a-1828-42e2-810c-09e0ca21c35f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:48:57 np0005603623 nova_compute[226235]: 2026-01-31 07:48:57.247 226239 DEBUG oslo_concurrency.lockutils [req-cfa8f04b-4426-4cd4-973c-f47d5e899b13 req-a39c70c3-2780-42f9-85d0-a78f299a70b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:57 np0005603623 nova_compute[226235]: 2026-01-31 07:48:57.247 226239 DEBUG oslo_concurrency.lockutils [req-cfa8f04b-4426-4cd4-973c-f47d5e899b13 req-a39c70c3-2780-42f9-85d0-a78f299a70b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:57 np0005603623 nova_compute[226235]: 2026-01-31 07:48:57.247 226239 DEBUG oslo_concurrency.lockutils [req-cfa8f04b-4426-4cd4-973c-f47d5e899b13 req-a39c70c3-2780-42f9-85d0-a78f299a70b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:57 np0005603623 nova_compute[226235]: 2026-01-31 07:48:57.248 226239 DEBUG nova.compute.manager [req-cfa8f04b-4426-4cd4-973c-f47d5e899b13 req-a39c70c3-2780-42f9-85d0-a78f299a70b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] No waiting events found dispatching network-vif-unplugged-06448a4a-1828-42e2-810c-09e0ca21c35f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:48:57 np0005603623 nova_compute[226235]: 2026-01-31 07:48:57.248 226239 DEBUG nova.compute.manager [req-cfa8f04b-4426-4cd4-973c-f47d5e899b13 req-a39c70c3-2780-42f9-85d0-a78f299a70b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-unplugged-06448a4a-1828-42e2-810c-09e0ca21c35f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:48:57 np0005603623 nova_compute[226235]: 2026-01-31 07:48:57.403 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:57.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:58.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:58 np0005603623 nova_compute[226235]: 2026-01-31 07:48:58.588 226239 INFO nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Took 4.68 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.#033[00m
Jan 31 02:48:58 np0005603623 nova_compute[226235]: 2026-01-31 07:48:58.590 226239 DEBUG nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:48:58 np0005603623 nova_compute[226235]: 2026-01-31 07:48:58.772 226239 DEBUG nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvxots3o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5e4f7ec6-bb38-4a62-88f4-5e5b869452f0',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(13553142-40d2-4bbf-8fa5-7f14b3beb57a),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 31 02:48:58 np0005603623 nova_compute[226235]: 2026-01-31 07:48:58.776 226239 DEBUG nova.objects.instance [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lazy-loading 'migration_context' on Instance uuid 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:58 np0005603623 nova_compute[226235]: 2026-01-31 07:48:58.778 226239 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 31 02:48:58 np0005603623 nova_compute[226235]: 2026-01-31 07:48:58.779 226239 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 31 02:48:58 np0005603623 nova_compute[226235]: 2026-01-31 07:48:58.779 226239 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 31 02:48:58 np0005603623 nova_compute[226235]: 2026-01-31 07:48:58.820 226239 DEBUG nova.virt.libvirt.vif [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:48:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1707336579',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1707336579',id=16,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:48:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-q29px744',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:48:48Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=5e4f7ec6-bb38-4a62-88f4-5e5b869452f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:48:58 np0005603623 nova_compute[226235]: 2026-01-31 07:48:58.820 226239 DEBUG nova.network.os_vif_util [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converting VIF {"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:48:58 np0005603623 nova_compute[226235]: 2026-01-31 07:48:58.821 226239 DEBUG nova.network.os_vif_util [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:48:58 np0005603623 nova_compute[226235]: 2026-01-31 07:48:58.822 226239 DEBUG nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Updating guest XML with vif config: <interface type="ethernet">
Jan 31 02:48:58 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:da:f3:77"/>
Jan 31 02:48:58 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 02:48:58 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:48:58 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 02:48:58 np0005603623 nova_compute[226235]:  <target dev="tap06448a4a-18"/>
Jan 31 02:48:58 np0005603623 nova_compute[226235]: </interface>
Jan 31 02:48:58 np0005603623 nova_compute[226235]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 31 02:48:58 np0005603623 nova_compute[226235]: 2026-01-31 07:48:58.823 226239 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.282 226239 DEBUG nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.284 226239 INFO nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.353 226239 DEBUG nova.compute.manager [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.354 226239 DEBUG oslo_concurrency.lockutils [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.354 226239 DEBUG oslo_concurrency.lockutils [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.355 226239 DEBUG oslo_concurrency.lockutils [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.355 226239 DEBUG nova.compute.manager [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] No waiting events found dispatching network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.355 226239 WARNING nova.compute.manager [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received unexpected event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.355 226239 DEBUG nova.compute.manager [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-changed-06448a4a-1828-42e2-810c-09e0ca21c35f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.356 226239 DEBUG nova.compute.manager [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Refreshing instance network info cache due to event network-changed-06448a4a-1828-42e2-810c-09e0ca21c35f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.356 226239 DEBUG oslo_concurrency.lockutils [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.356 226239 DEBUG oslo_concurrency.lockutils [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.356 226239 DEBUG nova.network.neutron [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Refreshing network info cache for port 06448a4a-1828-42e2-810c-09e0ca21c35f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.361 226239 INFO nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 31 02:48:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:48:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:59.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.863 226239 DEBUG nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.864 226239 DEBUG nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:48:59 np0005603623 nova_compute[226235]: 2026-01-31 07:48:59.953 226239 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Creating tmpfile /var/lib/nova/instances/tmp8ndf3l1m to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 31 02:49:00 np0005603623 nova_compute[226235]: 2026-01-31 07:49:00.104 226239 DEBUG nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ndf3l1m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 31 02:49:00 np0005603623 nova_compute[226235]: 2026-01-31 07:49:00.367 226239 DEBUG nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:49:00 np0005603623 nova_compute[226235]: 2026-01-31 07:49:00.368 226239 DEBUG nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:49:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:00.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:00 np0005603623 nova_compute[226235]: 2026-01-31 07:49:00.529 226239 DEBUG nova.network.neutron [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Updated VIF entry in instance network info cache for port 06448a4a-1828-42e2-810c-09e0ca21c35f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:49:00 np0005603623 nova_compute[226235]: 2026-01-31 07:49:00.530 226239 DEBUG nova.network.neutron [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Updating instance_info_cache with network_info: [{"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:49:00 np0005603623 nova_compute[226235]: 2026-01-31 07:49:00.564 226239 DEBUG oslo_concurrency.lockutils [req-6cd83d8f-5d24-4169-ba1d-db8956d658b5 req-d1d67dcb-fba3-45fe-868f-179415594362 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:49:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e147 e147: 3 total, 3 up, 3 in
Jan 31 02:49:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:00 np0005603623 nova_compute[226235]: 2026-01-31 07:49:00.870 226239 DEBUG nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:49:00 np0005603623 nova_compute[226235]: 2026-01-31 07:49:00.871 226239 DEBUG nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:49:01 np0005603623 nova_compute[226235]: 2026-01-31 07:49:01.068 226239 DEBUG nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ndf3l1m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 31 02:49:01 np0005603623 nova_compute[226235]: 2026-01-31 07:49:01.132 226239 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:49:01 np0005603623 nova_compute[226235]: 2026-01-31 07:49:01.132 226239 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquired lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:49:01 np0005603623 nova_compute[226235]: 2026-01-31 07:49:01.133 226239 DEBUG nova.network.neutron [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:49:01 np0005603623 nova_compute[226235]: 2026-01-31 07:49:01.374 226239 DEBUG nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:49:01 np0005603623 nova_compute[226235]: 2026-01-31 07:49:01.374 226239 DEBUG nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:49:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:01.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:01 np0005603623 nova_compute[226235]: 2026-01-31 07:49:01.763 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.130 226239 DEBUG nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Current 50 elapsed 3 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.131 226239 DEBUG nova.virt.libvirt.migration [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.170 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845742.1700902, 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.170 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.308 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.313 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.337 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.405 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:49:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:02.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.628 226239 DEBUG nova.network.neutron [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updating instance_info_cache with network_info: [{"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.647 226239 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Releasing lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.651 226239 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ndf3l1m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.652 226239 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Creating instance directory: /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.652 226239 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Ensure instance console log exists: /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.653 226239 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.655 226239 DEBUG nova.virt.libvirt.vif [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:48:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1007393486',display_name='tempest-LiveMigrationTest-server-1007393486',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1007393486',id=17,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:48:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-egj05et1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-126681982',owner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:48:55Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.655 226239 DEBUG nova.network.os_vif_util [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converting VIF {"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.656 226239 DEBUG nova.network.os_vif_util [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.657 226239 DEBUG os_vif [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.658 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.659 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.660 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.663 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.664 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3aff2339-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.665 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3aff2339-cc, col_values=(('external_ids', {'iface-id': '3aff2339-ccc0-4845-8728-4ede26d0c11a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:f2:07', 'vm-uuid': '14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.666 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:02 np0005603623 NetworkManager[48970]: <info>  [1769845742.6683] manager: (tap3aff2339-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.669 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.672 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.674 226239 INFO os_vif [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc')#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.675 226239 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 31 02:49:02 np0005603623 nova_compute[226235]: 2026-01-31 07:49:02.675 226239 DEBUG nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ndf3l1m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 31 02:49:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:03.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:03 np0005603623 kernel: tap06448a4a-18 (unregistering): left promiscuous mode
Jan 31 02:49:03 np0005603623 NetworkManager[48970]: <info>  [1769845743.7627] device (tap06448a4a-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:49:03 np0005603623 nova_compute[226235]: 2026-01-31 07:49:03.762 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:03 np0005603623 nova_compute[226235]: 2026-01-31 07:49:03.768 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:03 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:03Z|00070|binding|INFO|Releasing lport 06448a4a-1828-42e2-810c-09e0ca21c35f from this chassis (sb_readonly=0)
Jan 31 02:49:03 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:03Z|00071|binding|INFO|Setting lport 06448a4a-1828-42e2-810c-09e0ca21c35f down in Southbound
Jan 31 02:49:03 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:03Z|00072|binding|INFO|Releasing lport 3162329c-f03f-465e-9f99-799a29d883a0 from this chassis (sb_readonly=0)
Jan 31 02:49:03 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:03Z|00073|binding|INFO|Setting lport 3162329c-f03f-465e-9f99-799a29d883a0 down in Southbound
Jan 31 02:49:03 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:03Z|00074|binding|INFO|Removing iface tap06448a4a-18 ovn-installed in OVS
Jan 31 02:49:03 np0005603623 nova_compute[226235]: 2026-01-31 07:49:03.771 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:03 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:03Z|00075|binding|INFO|Releasing lport eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8 from this chassis (sb_readonly=0)
Jan 31 02:49:03 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:03Z|00076|binding|INFO|Releasing lport 1e50dcb0-cab9-4443-a53d-151822a6eb9a from this chassis (sb_readonly=0)
Jan 31 02:49:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:03.775 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:f3:77 10.100.0.14'], port_security=['fa:16:3e:da:f3:77 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'bd097fed-e54b-4ed7-90f0-078b39b8b13a'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-116143868', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5e4f7ec6-bb38-4a62-88f4-5e5b869452f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-116143868', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227ca833-938d-48d2-86c8-5d09dd658c40, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=06448a4a-1828-42e2-810c-09e0ca21c35f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:49:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:03.778 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:c3:df 19.80.0.151'], port_security=['fa:16:3e:a4:c3:df 19.80.0.151'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['06448a4a-1828-42e2-810c-09e0ca21c35f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1877730947', 'neutron:cidrs': '19.80.0.151/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1877730947', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=0d46eecb-5024-425b-affd-165dd8eae0e4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3162329c-f03f-465e-9f99-799a29d883a0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:49:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:03.780 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 06448a4a-1828-42e2-810c-09e0ca21c35f in datapath 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d unbound from our chassis#033[00m
Jan 31 02:49:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:03.781 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:49:03 np0005603623 nova_compute[226235]: 2026-01-31 07:49:03.781 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:03.782 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[755128f2-30f7-4aaf-822c-004525fe3ec6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:03.783 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d namespace which is not needed anymore#033[00m
Jan 31 02:49:03 np0005603623 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 31 02:49:03 np0005603623 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000010.scope: Consumed 12.399s CPU time.
Jan 31 02:49:03 np0005603623 nova_compute[226235]: 2026-01-31 07:49:03.804 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:03 np0005603623 systemd-machined[194379]: Machine qemu-7-instance-00000010 terminated.
Jan 31 02:49:03 np0005603623 virtqemud[225858]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk: No such file or directory
Jan 31 02:49:03 np0005603623 virtqemud[225858]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_disk: No such file or directory
Jan 31 02:49:03 np0005603623 nova_compute[226235]: 2026-01-31 07:49:03.899 226239 DEBUG nova.virt.libvirt.guest [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 31 02:49:03 np0005603623 nova_compute[226235]: 2026-01-31 07:49:03.900 226239 INFO nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Migration operation has completed#033[00m
Jan 31 02:49:03 np0005603623 nova_compute[226235]: 2026-01-31 07:49:03.900 226239 INFO nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] _post_live_migration() is started..#033[00m
Jan 31 02:49:03 np0005603623 nova_compute[226235]: 2026-01-31 07:49:03.904 226239 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 31 02:49:03 np0005603623 nova_compute[226235]: 2026-01-31 07:49:03.904 226239 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 31 02:49:03 np0005603623 nova_compute[226235]: 2026-01-31 07:49:03.904 226239 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 31 02:49:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:04.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:04.592 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:49:04 np0005603623 nova_compute[226235]: 2026-01-31 07:49:04.593 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:04 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[234351]: [NOTICE]   (234372) : haproxy version is 2.8.14-c23fe91
Jan 31 02:49:04 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[234351]: [NOTICE]   (234372) : path to executable is /usr/sbin/haproxy
Jan 31 02:49:04 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[234351]: [WARNING]  (234372) : Exiting Master process...
Jan 31 02:49:04 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[234351]: [ALERT]    (234372) : Current worker (234374) exited with code 143 (Terminated)
Jan 31 02:49:04 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[234351]: [WARNING]  (234372) : All workers exited. Exiting... (0)
Jan 31 02:49:04 np0005603623 systemd[1]: libpod-2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9.scope: Deactivated successfully.
Jan 31 02:49:04 np0005603623 podman[234587]: 2026-01-31 07:49:04.785482272 +0000 UTC m=+0.919966183 container died 2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.284 226239 DEBUG nova.network.neutron [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Port 3aff2339-ccc0-4845-8728-4ede26d0c11a updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.285 226239 DEBUG nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ndf3l1m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 31 02:49:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:49:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:05.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.620 226239 DEBUG nova.network.neutron [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Activated binding for port 06448a4a-1828-42e2-810c-09e0ca21c35f and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.621 226239 DEBUG nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.623 226239 DEBUG nova.virt.libvirt.vif [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:48:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1707336579',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1707336579',id=16,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:48:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-q29px744',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:48:52Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=5e4f7ec6-bb38-4a62-88f4-5e5b869452f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.623 226239 DEBUG nova.network.os_vif_util [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converting VIF {"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.625 226239 DEBUG nova.network.os_vif_util [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.625 226239 DEBUG os_vif [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.629 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.629 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06448a4a-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.632 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.636 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.638 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.641 226239 INFO os_vif [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18')#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.642 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.642 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.643 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.643 226239 DEBUG nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.644 226239 INFO nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Deleting instance files /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_del#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.644 226239 INFO nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Deletion of /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_del complete#033[00m
Jan 31 02:49:05 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9-userdata-shm.mount: Deactivated successfully.
Jan 31 02:49:05 np0005603623 systemd[1]: var-lib-containers-storage-overlay-f67975f633fd0d016f345aebdb3ca181c0e818fd3645476c012efab03a72c3ef-merged.mount: Deactivated successfully.
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.826 226239 DEBUG nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-unplugged-06448a4a-1828-42e2-810c-09e0ca21c35f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.827 226239 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.827 226239 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.827 226239 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.828 226239 DEBUG nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] No waiting events found dispatching network-vif-unplugged-06448a4a-1828-42e2-810c-09e0ca21c35f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.828 226239 DEBUG nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-unplugged-06448a4a-1828-42e2-810c-09e0ca21c35f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.829 226239 DEBUG nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.829 226239 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.830 226239 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.830 226239 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.831 226239 DEBUG nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] No waiting events found dispatching network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.831 226239 WARNING nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received unexpected event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f for instance with vm_state active and task_state migrating.
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.832 226239 DEBUG nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.832 226239 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.833 226239 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.833 226239 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.833 226239 DEBUG nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] No waiting events found dispatching network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 02:49:05 np0005603623 nova_compute[226235]: 2026-01-31 07:49:05.834 226239 WARNING nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received unexpected event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f for instance with vm_state active and task_state migrating.
Jan 31 02:49:05 np0005603623 systemd[1]: Starting libvirt proxy daemon...
Jan 31 02:49:05 np0005603623 systemd[1]: Started libvirt proxy daemon.
Jan 31 02:49:06 np0005603623 kernel: tap3aff2339-cc: entered promiscuous mode
Jan 31 02:49:06 np0005603623 systemd-udevd[234566]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:49:06 np0005603623 NetworkManager[48970]: <info>  [1769845746.0479] manager: (tap3aff2339-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Jan 31 02:49:06 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:06Z|00077|binding|INFO|Claiming lport 3aff2339-ccc0-4845-8728-4ede26d0c11a for this additional chassis.
Jan 31 02:49:06 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:06Z|00078|binding|INFO|3aff2339-ccc0-4845-8728-4ede26d0c11a: Claiming fa:16:3e:e0:f2:07 10.100.0.11
Jan 31 02:49:06 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:06Z|00079|binding|INFO|Claiming lport fc5261b7-0e3f-49d1-8fbf-8dcf40626991 for this additional chassis.
Jan 31 02:49:06 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:06Z|00080|binding|INFO|fc5261b7-0e3f-49d1-8fbf-8dcf40626991: Claiming fa:16:3e:0b:cb:fc 19.80.0.218
Jan 31 02:49:06 np0005603623 nova_compute[226235]: 2026-01-31 07:49:06.048 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:06 np0005603623 NetworkManager[48970]: <info>  [1769845746.0589] device (tap3aff2339-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:49:06 np0005603623 NetworkManager[48970]: <info>  [1769845746.0596] device (tap3aff2339-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:49:06 np0005603623 systemd-machined[194379]: New machine qemu-8-instance-00000011.
Jan 31 02:49:06 np0005603623 nova_compute[226235]: 2026-01-31 07:49:06.089 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:06 np0005603623 systemd[1]: Started Virtual Machine qemu-8-instance-00000011.
Jan 31 02:49:06 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:06Z|00081|binding|INFO|Setting lport 3aff2339-ccc0-4845-8728-4ede26d0c11a ovn-installed in OVS
Jan 31 02:49:06 np0005603623 nova_compute[226235]: 2026-01-31 07:49:06.098 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:06 np0005603623 podman[234587]: 2026-01-31 07:49:06.239353643 +0000 UTC m=+2.373837564 container cleanup 2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:49:06 np0005603623 systemd[1]: libpod-conmon-2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9.scope: Deactivated successfully.
Jan 31 02:49:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:06.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:06 np0005603623 nova_compute[226235]: 2026-01-31 07:49:06.778 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.238 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845732.2375066, d3f77a29-d3a7-444c-9528-ab679b9b946c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.239 226239 INFO nova.compute.manager [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] VM Stopped (Lifecycle Event)
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.264 226239 DEBUG nova.compute.manager [None req-c6f74ff2-154f-474b-9a2e-688fee7d2299 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.271 226239 DEBUG nova.compute.manager [None req-c6f74ff2-154f-474b-9a2e-688fee7d2299 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: rebuilding, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.297 226239 INFO nova.compute.manager [None req-c6f74ff2-154f-474b-9a2e-688fee7d2299 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] During sync_power_state the instance has a pending task (rebuilding). Skip.
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.424 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845747.4242473, 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.425 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] VM Started (Lifecycle Event)
Jan 31 02:49:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:49:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:07.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.461 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:49:07 np0005603623 podman[234674]: 2026-01-31 07:49:07.821083472 +0000 UTC m=+1.559894963 container remove 2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.827 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[de3d404d-8e9f-4d19-a04d-41e246eee661]: (4, ('Sat Jan 31 07:49:03 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d (2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9)\n2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9\nSat Jan 31 07:49:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d (2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9)\n2b425384097e75f92748816fb53b40eac5cf896eed346e9cd848392f717d45c9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.830 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9c7210a4-7e99-4036-9f56-27b73fd09731]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.832 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60bb4bea-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.873 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:07 np0005603623 kernel: tap60bb4bea-d0: left promiscuous mode
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.883 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.887 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[20de2a8a-300f-4f0b-a246-134d8ea07828]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.901 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb3fe82-81db-4dfc-8b24-6e24cdf049ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.903 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a893aca8-2340-43d0-96b3-bb8c3973dd6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.915 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[061bb00a-e233-4678-84f1-8b318d2ffbe1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486706, 'reachable_time': 23864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234734, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.920 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.920 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[83aa3d25-eb07-4810-8806-eba80d3c6ba9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:49:07 np0005603623 systemd[1]: run-netns-ovnmeta\x2d60bb4bea\x2dd9f0\x2d41fc\x2d9c0f\x2d6fcd644c255d.mount: Deactivated successfully.
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.922 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 3162329c-f03f-465e-9f99-799a29d883a0 in datapath 057fed11-d4e4-4c56-8ba3-81a6235ed1bf unbound from our chassis
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.923 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 057fed11-d4e4-4c56-8ba3-81a6235ed1bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.924 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[59777930-6710-40b0-9a57-a73ab27075c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:49:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:07.924 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf namespace which is not needed anymore
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.980 226239 DEBUG nova.compute.manager [req-7101e823-39e1-437e-bacd-4249b090ce85 req-f1bba3a4-93e6-41e9-89ec-014c916d58bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.980 226239 DEBUG oslo_concurrency.lockutils [req-7101e823-39e1-437e-bacd-4249b090ce85 req-f1bba3a4-93e6-41e9-89ec-014c916d58bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.981 226239 DEBUG oslo_concurrency.lockutils [req-7101e823-39e1-437e-bacd-4249b090ce85 req-f1bba3a4-93e6-41e9-89ec-014c916d58bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.981 226239 DEBUG oslo_concurrency.lockutils [req-7101e823-39e1-437e-bacd-4249b090ce85 req-f1bba3a4-93e6-41e9-89ec-014c916d58bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.981 226239 DEBUG nova.compute.manager [req-7101e823-39e1-437e-bacd-4249b090ce85 req-f1bba3a4-93e6-41e9-89ec-014c916d58bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] No waiting events found dispatching network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 02:49:07 np0005603623 nova_compute[226235]: 2026-01-31 07:49:07.981 226239 WARNING nova.compute.manager [req-7101e823-39e1-437e-bacd-4249b090ce85 req-f1bba3a4-93e6-41e9-89ec-014c916d58bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received unexpected event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f for instance with vm_state active and task_state migrating.
Jan 31 02:49:08 np0005603623 nova_compute[226235]: 2026-01-31 07:49:08.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:49:08 np0005603623 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[234512]: [NOTICE]   (234534) : haproxy version is 2.8.14-c23fe91
Jan 31 02:49:08 np0005603623 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[234512]: [NOTICE]   (234534) : path to executable is /usr/sbin/haproxy
Jan 31 02:49:08 np0005603623 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[234512]: [WARNING]  (234534) : Exiting Master process...
Jan 31 02:49:08 np0005603623 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[234512]: [ALERT]    (234534) : Current worker (234536) exited with code 143 (Terminated)
Jan 31 02:49:08 np0005603623 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[234512]: [WARNING]  (234534) : All workers exited. Exiting... (0)
Jan 31 02:49:08 np0005603623 systemd[1]: libpod-9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8.scope: Deactivated successfully.
Jan 31 02:49:08 np0005603623 podman[234752]: 2026-01-31 07:49:08.431408233 +0000 UTC m=+0.436192399 container died 9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 02:49:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:08.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.156 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.190 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.191 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.191 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.192 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.193 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:49:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:09.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.449 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845749.4484363, 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.450 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] VM Resumed (Lifecycle Event)
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.473 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.478 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.502 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com
Jan 31 02:49:09 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8-userdata-shm.mount: Deactivated successfully.
Jan 31 02:49:09 np0005603623 systemd[1]: var-lib-containers-storage-overlay-2c36673826f20559cb3653fe08e3ce55f88bf346a0f0cb281dad09c112aef597-merged.mount: Deactivated successfully.
Jan 31 02:49:09 np0005603623 podman[234752]: 2026-01-31 07:49:09.96435982 +0000 UTC m=+1.969144006 container cleanup 9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 02:49:09 np0005603623 systemd[1]: libpod-conmon-9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8.scope: Deactivated successfully.
Jan 31 02:49:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:49:09 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2725549927' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:49:09 np0005603623 nova_compute[226235]: 2026-01-31 07:49:09.994 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.801s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.087 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.087 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.090 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.090 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.223 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.225 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4678MB free_disk=20.84790802001953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.226 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.226 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.274 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration for instance 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.274 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration for instance 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.308 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.309 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.309 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.311 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.341 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.343 226239 INFO nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updating resource usage from migration 9755ad61-6df1-4456-8860-6bd237ad62cb#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.343 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Starting to track incoming migration 9755ad61-6df1-4456-8860-6bd237ad62cb with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.397 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance d3f77a29-d3a7-444c-9528-ab679b9b946c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.398 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration 13553142-40d2-4bbf-8fa5-7f14b3beb57a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.436 226239 WARNING nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.437 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.437 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:49:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:10.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.520 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.635 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:10 np0005603623 podman[234806]: 2026-01-31 07:49:10.914749508 +0000 UTC m=+0.931225207 container remove 9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 02:49:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:10.919 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[339e84dd-914f-4a8c-bdbe-42c6e8860a01]: (4, ('Sat Jan 31 07:49:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf (9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8)\n9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8\nSat Jan 31 07:49:09 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf (9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8)\n9d4b29f1b419887b96c39d82bb84ade214a8dd3a86495ef1ef7f8d356b69eab8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:10.921 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f159ee89-2950-43c0-8202-eff7c915e2a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:10.922 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap057fed11-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.923 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:10 np0005603623 kernel: tap057fed11-d0: left promiscuous mode
Jan 31 02:49:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:10.929 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[aa08e56e-30f8-44bb-b8b5-9ab1bb609fc6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:10 np0005603623 nova_compute[226235]: 2026-01-31 07:49:10.931 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:10.942 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[71618012-1725-4eec-bfab-8cba7adb289e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:10.943 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a919783a-31f4-48e7-b05b-9095f7c30ff2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:10.956 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bbbe0a99-20bb-45a5-a0eb-1660f4bce069]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487016, 'reachable_time': 30302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234835, 'error': None, 'target': 'ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:10.958 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:49:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:10.958 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[77b5b34a-439f-46b2-8b9d-01ea78066bed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:10.958 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:49:10 np0005603623 systemd[1]: run-netns-ovnmeta\x2d057fed11\x2dd4e4\x2d4c56\x2d8ba3\x2d81a6235ed1bf.mount: Deactivated successfully.
Jan 31 02:49:11 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Jan 31 02:49:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:11.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:49:11 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2904921565' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.529 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.535 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.549 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.571 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.571 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.572 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.572 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.572 226239 DEBUG nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.573 226239 DEBUG oslo_concurrency.processutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.781 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.838 226239 INFO nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Deleting instance files /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c_del#033[00m
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.839 226239 INFO nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Deletion of /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c_del complete#033[00m
Jan 31 02:49:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:49:11 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4162205607' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.992 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:49:11 np0005603623 nova_compute[226235]: 2026-01-31 07:49:11.993 226239 INFO nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Creating image(s)
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.027 226239 DEBUG nova.storage.rbd_utils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.062 226239 DEBUG nova.storage.rbd_utils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.092 226239 DEBUG nova.storage.rbd_utils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.095 226239 DEBUG oslo_concurrency.processutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.109 226239 DEBUG oslo_concurrency.processutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.144 226239 DEBUG oslo_concurrency.processutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.145 226239 DEBUG oslo_concurrency.lockutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.146 226239 DEBUG oslo_concurrency.lockutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.146 226239 DEBUG oslo_concurrency.lockutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.168 226239 DEBUG nova.storage.rbd_utils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.171 226239 DEBUG oslo_concurrency.processutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 d3f77a29-d3a7-444c-9528-ab679b9b946c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.203 226239 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.203 226239 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.313 226239 WARNING nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.314 226239 DEBUG nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4679MB free_disk=20.84123992919922GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.315 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.315 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.360 226239 DEBUG nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Migration for instance 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.361 226239 DEBUG nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Migration for instance 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.389 226239 DEBUG nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.411 226239 INFO nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updating resource usage from migration 9755ad61-6df1-4456-8860-6bd237ad62cb
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.412 226239 DEBUG nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Starting to track incoming migration 9755ad61-6df1-4456-8860-6bd237ad62cb with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.443 226239 DEBUG nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Instance d3f77a29-d3a7-444c-9528-ab679b9b946c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.444 226239 DEBUG nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Migration 13553142-40d2-4bbf-8fa5-7f14b3beb57a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.469 226239 WARNING nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Instance 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.470 226239 DEBUG nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.470 226239 DEBUG nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 31 02:49:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:12.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.566 226239 DEBUG oslo_concurrency.processutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.586 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.588 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.588 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.589 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.612 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-d3f77a29-d3a7-444c-9528-ab679b9b946c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.612 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-d3f77a29-d3a7-444c-9528-ab679b9b946c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.613 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 31 02:49:12 np0005603623 nova_compute[226235]: 2026-01-31 07:49:12.614 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:49:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:49:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2396706813' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:49:13 np0005603623 nova_compute[226235]: 2026-01-31 07:49:13.112 226239 DEBUG oslo_concurrency.processutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.546s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:49:13 np0005603623 nova_compute[226235]: 2026-01-31 07:49:13.117 226239 DEBUG nova.compute.provider_tree [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:49:13 np0005603623 nova_compute[226235]: 2026-01-31 07:49:13.132 226239 DEBUG nova.scheduler.client.report [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:49:13 np0005603623 nova_compute[226235]: 2026-01-31 07:49:13.153 226239 DEBUG nova.compute.resource_tracker [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 02:49:13 np0005603623 nova_compute[226235]: 2026-01-31 07:49:13.154 226239 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.839s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:13 np0005603623 nova_compute[226235]: 2026-01-31 07:49:13.158 226239 INFO nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 31 02:49:13 np0005603623 nova_compute[226235]: 2026-01-31 07:49:13.255 226239 INFO nova.scheduler.client.report [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Deleted allocation for migration 13553142-40d2-4bbf-8fa5-7f14b3beb57a
Jan 31 02:49:13 np0005603623 nova_compute[226235]: 2026-01-31 07:49:13.255 226239 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 31 02:49:13 np0005603623 nova_compute[226235]: 2026-01-31 07:49:13.275 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 02:49:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:13.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:14 np0005603623 nova_compute[226235]: 2026-01-31 07:49:14.389 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:49:14 np0005603623 nova_compute[226235]: 2026-01-31 07:49:14.410 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-d3f77a29-d3a7-444c-9528-ab679b9b946c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 02:49:14 np0005603623 nova_compute[226235]: 2026-01-31 07:49:14.410 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 31 02:49:14 np0005603623 nova_compute[226235]: 2026-01-31 07:49:14.410 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:49:14 np0005603623 nova_compute[226235]: 2026-01-31 07:49:14.411 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:49:14 np0005603623 nova_compute[226235]: 2026-01-31 07:49:14.411 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:49:14 np0005603623 nova_compute[226235]: 2026-01-31 07:49:14.411 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 02:49:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:14.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:15.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:15 np0005603623 nova_compute[226235]: 2026-01-31 07:49:15.638 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:16 np0005603623 nova_compute[226235]: 2026-01-31 07:49:16.123 226239 DEBUG oslo_concurrency.processutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 d3f77a29-d3a7-444c-9528-ab679b9b946c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.952s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:49:16 np0005603623 nova_compute[226235]: 2026-01-31 07:49:16.208 226239 DEBUG nova.storage.rbd_utils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] resizing rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 02:49:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:16.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:16 np0005603623 nova_compute[226235]: 2026-01-31 07:49:16.781 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e148 e148: 3 total, 3 up, 3 in
Jan 31 02:49:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:49:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:17.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.557 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.557 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Ensure instance console log exists: /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.558 226239 DEBUG oslo_concurrency.lockutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.558 226239 DEBUG oslo_concurrency.lockutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.558 226239 DEBUG oslo_concurrency.lockutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.560 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.564 226239 WARNING nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.567 226239 DEBUG nova.virt.libvirt.host [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.568 226239 DEBUG nova.virt.libvirt.host [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.570 226239 DEBUG nova.virt.libvirt.host [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.571 226239 DEBUG nova.virt.libvirt.host [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.573 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.573 226239 DEBUG nova.virt.hardware [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.574 226239 DEBUG nova.virt.hardware [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.574 226239 DEBUG nova.virt.hardware [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.574 226239 DEBUG nova.virt.hardware [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.575 226239 DEBUG nova.virt.hardware [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.575 226239 DEBUG nova.virt.hardware [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.575 226239 DEBUG nova.virt.hardware [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.575 226239 DEBUG nova.virt.hardware [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.575 226239 DEBUG nova.virt.hardware [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.576 226239 DEBUG nova.virt.hardware [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.576 226239 DEBUG nova.virt.hardware [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.576 226239 DEBUG nova.objects.instance [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lazy-loading 'vcpu_model' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:49:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:17Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e0:f2:07 10.100.0.11
Jan 31 02:49:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:17Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e0:f2:07 10.100.0.11
Jan 31 02:49:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:17Z|00082|binding|INFO|Claiming lport 3aff2339-ccc0-4845-8728-4ede26d0c11a for this chassis.
Jan 31 02:49:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:17Z|00083|binding|INFO|3aff2339-ccc0-4845-8728-4ede26d0c11a: Claiming fa:16:3e:e0:f2:07 10.100.0.11
Jan 31 02:49:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:17Z|00084|binding|INFO|Claiming lport fc5261b7-0e3f-49d1-8fbf-8dcf40626991 for this chassis.
Jan 31 02:49:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:17Z|00085|binding|INFO|fc5261b7-0e3f-49d1-8fbf-8dcf40626991: Claiming fa:16:3e:0b:cb:fc 19.80.0.218
Jan 31 02:49:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:17Z|00086|binding|INFO|Setting lport 3aff2339-ccc0-4845-8728-4ede26d0c11a up in Southbound
Jan 31 02:49:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:17Z|00087|binding|INFO|Setting lport fc5261b7-0e3f-49d1-8fbf-8dcf40626991 up in Southbound
Jan 31 02:49:17 np0005603623 nova_compute[226235]: 2026-01-31 07:49:17.868 226239 DEBUG oslo_concurrency.processutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.878 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:f2:07 10.100.0.11'], port_security=['fa:16:3e:e0:f2:07 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1970562059', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1970562059', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f860fcac-4f6a-4e88-8005-0fd323fc8053, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=3aff2339-ccc0-4845-8728-4ede26d0c11a) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.879 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:cb:fc 19.80.0.218'], port_security=['fa:16:3e:0b:cb:fc 19.80.0.218'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['3aff2339-ccc0-4845-8728-4ede26d0c11a'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-427751920', 'neutron:cidrs': '19.80.0.218/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3613479-5299-41cd-b6dd-df1fae2ae862', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-427751920', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=0ad390ce-c29b-4af4-b946-e8404e058f9b, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fc5261b7-0e3f-49d1-8fbf-8dcf40626991) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.880 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 3aff2339-ccc0-4845-8728-4ede26d0c11a in datapath 850ad6ca-6166-4382-94bb-4b7c10d9a136 bound to our chassis#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.882 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 850ad6ca-6166-4382-94bb-4b7c10d9a136#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.889 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[089f787b-612f-43a5-8037-f2e36edd9cc4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.889 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap850ad6ca-61 in ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.890 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap850ad6ca-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.891 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[340ed295-6451-4c6b-b7dc-0b23697c4e32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.891 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[69aef55b-0b5d-4240-9132-924a970c27f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.899 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[afedc64a-d48f-4553-a2eb-828a30fbb60b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.920 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[22a959d0-a514-4fc4-a99d-2863b1deafdb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.936 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f400a81d-6c31-48cd-b9b4-592b3a59d788]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:17 np0005603623 NetworkManager[48970]: <info>  [1769845757.9430] manager: (tap850ad6ca-60): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.942 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7db4aa31-131a-467a-acef-bb7c2af6e192]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.960 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:17 np0005603623 systemd-udevd[235122]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.970 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[007e6cdd-da48-4636-9b84-ff730ce895a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.973 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[0c88d77c-7fd6-42d8-9dbb-6cf5be220f63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:17 np0005603623 NetworkManager[48970]: <info>  [1769845757.9958] device (tap850ad6ca-60): carrier: link connected
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:17.999 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[25ef15cf-caed-4b66-91f5-4d542f7ccc75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.012 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[20a88897-4158-480e-8da7-55e70516b097]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap850ad6ca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:99:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489842, 'reachable_time': 29262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235151, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.025 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[94016719-0a93-42aa-862e-8e1bdc072cd6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2d:996f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489842, 'tstamp': 489842}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235161, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.043 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[98df78a7-1525-4b8c-b5d3-d0ad4889c25c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap850ad6ca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:99:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489842, 'reachable_time': 29262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235162, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.077 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3c29db9b-7239-404b-88ba-b202f99ecb1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.138 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[079bf7d3-41c4-4e73-b176-06bcd166b6e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.139 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap850ad6ca-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.139 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.140 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap850ad6ca-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:18 np0005603623 NetworkManager[48970]: <info>  [1769845758.1897] manager: (tap850ad6ca-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.189 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:18 np0005603623 kernel: tap850ad6ca-60: entered promiscuous mode
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.191 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.192 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap850ad6ca-60, col_values=(('external_ids', {'iface-id': '61b6889f-b848-4873-9650-8b2715794d29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.193 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.195 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.195 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/850ad6ca-6166-4382-94bb-4b7c10d9a136.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/850ad6ca-6166-4382-94bb-4b7c10d9a136.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:49:18 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:18Z|00088|binding|INFO|Releasing lport 61b6889f-b848-4873-9650-8b2715794d29 from this chassis (sb_readonly=0)
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.196 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d2eced77-43c9-4491-8ea9-08ac5dbd02da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.197 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-850ad6ca-6166-4382-94bb-4b7c10d9a136
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/850ad6ca-6166-4382-94bb-4b7c10d9a136.pid.haproxy
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 850ad6ca-6166-4382-94bb-4b7c10d9a136
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:49:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:18.197 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'env', 'PROCESS_TAG=haproxy-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/850ad6ca-6166-4382-94bb-4b7c10d9a136.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.202 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:49:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/324832964' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.312 226239 DEBUG oslo_concurrency.processutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.342 226239 DEBUG nova.storage.rbd_utils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.347 226239 DEBUG oslo_concurrency.processutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:18.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:18 np0005603623 podman[235225]: 2026-01-31 07:49:18.477557804 +0000 UTC m=+0.023566192 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.755 226239 INFO nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Post operation of migration started#033[00m
Jan 31 02:49:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:49:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2104307244' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.899 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845743.8975198, 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.900 226239 INFO nova.compute.manager [-] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.903 226239 DEBUG oslo_concurrency.processutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.556s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.908 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  <uuid>d3f77a29-d3a7-444c-9528-ab679b9b946c</uuid>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  <name>instance-0000000c</name>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersAdmin275Test-server-1493730364</nova:name>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:49:17</nova:creationTime>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <nova:user uuid="4e887d8783db44ff93a55e1ea75aa78e">tempest-ServersAdmin275Test-200317158-project-member</nova:user>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <nova:project uuid="2ca2f28405884a6ea92bcde9c8f91ff9">tempest-ServersAdmin275Test-200317158</nova:project>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <entry name="serial">d3f77a29-d3a7-444c-9528-ab679b9b946c</entry>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <entry name="uuid">d3f77a29-d3a7-444c-9528-ab679b9b946c</entry>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/d3f77a29-d3a7-444c-9528-ab679b9b946c_disk">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/console.log" append="off"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:49:18 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:49:18 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:49:18 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:49:18 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:49:18 np0005603623 nova_compute[226235]: 2026-01-31 07:49:18.932 226239 DEBUG nova.compute.manager [None req-d7e524bc-aa1e-43b1-b16b-b9522cfb9b9b - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:49:19 np0005603623 podman[235225]: 2026-01-31 07:49:19.164375289 +0000 UTC m=+0.710383617 container create 0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:49:19 np0005603623 nova_compute[226235]: 2026-01-31 07:49:19.318 226239 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:49:19 np0005603623 nova_compute[226235]: 2026-01-31 07:49:19.319 226239 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquired lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:49:19 np0005603623 nova_compute[226235]: 2026-01-31 07:49:19.319 226239 DEBUG nova.network.neutron [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:49:19 np0005603623 nova_compute[226235]: 2026-01-31 07:49:19.343 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:49:19 np0005603623 nova_compute[226235]: 2026-01-31 07:49:19.343 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:49:19 np0005603623 nova_compute[226235]: 2026-01-31 07:49:19.344 226239 INFO nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Using config drive#033[00m
Jan 31 02:49:19 np0005603623 nova_compute[226235]: 2026-01-31 07:49:19.371 226239 DEBUG nova.storage.rbd_utils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:49:19 np0005603623 nova_compute[226235]: 2026-01-31 07:49:19.395 226239 DEBUG nova.objects.instance [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lazy-loading 'ec2_ids' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:49:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:49:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:19.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:49:19 np0005603623 nova_compute[226235]: 2026-01-31 07:49:19.451 226239 DEBUG nova.objects.instance [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lazy-loading 'keypairs' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:49:19 np0005603623 systemd[1]: Started libpod-conmon-0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620.scope.
Jan 31 02:49:19 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:49:19 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d23ee60fd1830d50ca50c85cdad2a63a22a2e4a796b6a00c5b251fa1fd800a52/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:49:20 np0005603623 podman[235225]: 2026-01-31 07:49:20.001401793 +0000 UTC m=+1.547410171 container init 0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 02:49:20 np0005603623 nova_compute[226235]: 2026-01-31 07:49:20.001 226239 INFO nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Creating config drive at /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config#033[00m
Jan 31 02:49:20 np0005603623 podman[235225]: 2026-01-31 07:49:20.009333472 +0000 UTC m=+1.555341830 container start 0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 02:49:20 np0005603623 nova_compute[226235]: 2026-01-31 07:49:20.008 226239 DEBUG oslo_concurrency.processutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpbme_jujg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:20 np0005603623 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[235269]: [NOTICE]   (235296) : New worker (235299) forked
Jan 31 02:49:20 np0005603623 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[235269]: [NOTICE]   (235296) : Loading success.
Jan 31 02:49:20 np0005603623 nova_compute[226235]: 2026-01-31 07:49:20.134 226239 DEBUG oslo_concurrency.processutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpbme_jujg" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.147 143258 INFO neutron.agent.ovn.metadata.agent [-] Port fc5261b7-0e3f-49d1-8fbf-8dcf40626991 in datapath c3613479-5299-41cd-b6dd-df1fae2ae862 unbound from our chassis#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.150 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c3613479-5299-41cd-b6dd-df1fae2ae862#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.161 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7c0033cc-7841-421d-975b-9dd2305380ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.162 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc3613479-51 in ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.164 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc3613479-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.164 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b60fd97d-b26c-4b83-92ce-76b91cd11693]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.166 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[edbd42e9-e4d2-4398-9015-14f391533ea6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.177 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea43cd7-dfb6-4c30-899f-518d42c0c4b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.188 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[54472aac-bb29-4c9b-b33e-078a5e88ae56]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.211 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[41622a52-cf2b-4592-ae4a-3679fd2d082b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.218 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d9d445f8-5b35-4ef4-a3f8-061b1bc217f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 podman[235270]: 2026-01-31 07:49:20.217346539 +0000 UTC m=+0.484209928 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 02:49:20 np0005603623 NetworkManager[48970]: <info>  [1769845760.2187] manager: (tapc3613479-50): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Jan 31 02:49:20 np0005603623 podman[235272]: 2026-01-31 07:49:20.246078312 +0000 UTC m=+0.512800877 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.245 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b8571007-a7e4-4e24-b7cb-a35028e276e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.250 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[52256f96-c51e-4754-a341-1ee12e40cacc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 NetworkManager[48970]: <info>  [1769845760.2690] device (tapc3613479-50): carrier: link connected
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.273 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[9fc26d25-e3a9-4399-bc28-d61d5cfa1b00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.288 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b74b7e90-63b5-4c70-9cdb-28cfa1ef7598]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3613479-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e6:bf:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490069, 'reachable_time': 27420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235351, 'error': None, 'target': 'ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.302 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e8ad2495-0d58-4eb6-b933-4cf4f872aab7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee6:bf67'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490069, 'tstamp': 490069}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235352, 'error': None, 'target': 'ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 nova_compute[226235]: 2026-01-31 07:49:20.313 226239 DEBUG nova.storage.rbd_utils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] rbd image d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:49:20 np0005603623 nova_compute[226235]: 2026-01-31 07:49:20.316 226239 DEBUG oslo_concurrency.processutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.317 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[14e78b25-5e7f-4a0f-bbc9-a75466619169]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3613479-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e6:bf:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490069, 'reachable_time': 27420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235362, 'error': None, 'target': 'ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.335 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a1e6cd1f-2f5b-4850-b4cc-30a5478b5fe9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.370 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fcec57c6-7271-45a2-9f92-0e37707b1750]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.372 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3613479-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.372 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.372 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3613479-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:20 np0005603623 NetworkManager[48970]: <info>  [1769845760.4299] manager: (tapc3613479-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 31 02:49:20 np0005603623 kernel: tapc3613479-50: entered promiscuous mode
Jan 31 02:49:20 np0005603623 nova_compute[226235]: 2026-01-31 07:49:20.429 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.434 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc3613479-50, col_values=(('external_ids', {'iface-id': '9dcf2f9f-4a2b-44f0-988c-28c5222b394c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:20 np0005603623 nova_compute[226235]: 2026-01-31 07:49:20.435 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:20 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:20Z|00089|binding|INFO|Releasing lport 9dcf2f9f-4a2b-44f0-988c-28c5222b394c from this chassis (sb_readonly=0)
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.436 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c3613479-5299-41cd-b6dd-df1fae2ae862.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c3613479-5299-41cd-b6dd-df1fae2ae862.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.436 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dfee4940-84e2-48b7-8cf9-fa40acef03d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.437 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-c3613479-5299-41cd-b6dd-df1fae2ae862
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/c3613479-5299-41cd-b6dd-df1fae2ae862.pid.haproxy
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID c3613479-5299-41cd-b6dd-df1fae2ae862
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:49:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:20.438 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862', 'env', 'PROCESS_TAG=haproxy-c3613479-5299-41cd-b6dd-df1fae2ae862', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c3613479-5299-41cd-b6dd-df1fae2ae862.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:49:20 np0005603623 nova_compute[226235]: 2026-01-31 07:49:20.440 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:20.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:20 np0005603623 nova_compute[226235]: 2026-01-31 07:49:20.640 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:20 np0005603623 podman[235413]: 2026-01-31 07:49:20.708288708 +0000 UTC m=+0.016391546 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:49:21 np0005603623 podman[235413]: 2026-01-31 07:49:21.264724565 +0000 UTC m=+0.572827383 container create ec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:49:21 np0005603623 systemd[1]: Started libpod-conmon-ec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5.scope.
Jan 31 02:49:21 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:49:21 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c79bc6a3627715558b5cdba1eaa316f64550838ab27096df54011ee4d24ae108/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:49:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:21.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:21 np0005603623 nova_compute[226235]: 2026-01-31 07:49:21.553 226239 DEBUG nova.network.neutron [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updating instance_info_cache with network_info: [{"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:49:21 np0005603623 podman[235413]: 2026-01-31 07:49:21.572503028 +0000 UTC m=+0.880605866 container init ec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:49:21 np0005603623 podman[235413]: 2026-01-31 07:49:21.576346199 +0000 UTC m=+0.884449027 container start ec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 02:49:21 np0005603623 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[235429]: [NOTICE]   (235433) : New worker (235435) forked
Jan 31 02:49:21 np0005603623 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[235429]: [NOTICE]   (235433) : Loading success.
Jan 31 02:49:21 np0005603623 nova_compute[226235]: 2026-01-31 07:49:21.670 226239 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Releasing lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:49:21 np0005603623 nova_compute[226235]: 2026-01-31 07:49:21.697 226239 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:21 np0005603623 nova_compute[226235]: 2026-01-31 07:49:21.697 226239 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:21 np0005603623 nova_compute[226235]: 2026-01-31 07:49:21.698 226239 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:21 np0005603623 nova_compute[226235]: 2026-01-31 07:49:21.702 226239 INFO nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 31 02:49:21 np0005603623 virtqemud[225858]: Domain id=8 name='instance-00000011' uuid=14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 is tainted: custom-monitor
Jan 31 02:49:21 np0005603623 nova_compute[226235]: 2026-01-31 07:49:21.783 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:49:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:22.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:49:22 np0005603623 nova_compute[226235]: 2026-01-31 07:49:22.710 226239 INFO nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 31 02:49:23 np0005603623 nova_compute[226235]: 2026-01-31 07:49:23.166 226239 DEBUG oslo_concurrency.processutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config d3f77a29-d3a7-444c-9528-ab679b9b946c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.850s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:23 np0005603623 nova_compute[226235]: 2026-01-31 07:49:23.167 226239 INFO nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Deleting local config drive /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c/disk.config because it was imported into RBD.#033[00m
Jan 31 02:49:23 np0005603623 systemd-machined[194379]: New machine qemu-9-instance-0000000c.
Jan 31 02:49:23 np0005603623 systemd[1]: Started Virtual Machine qemu-9-instance-0000000c.
Jan 31 02:49:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:23.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:23 np0005603623 nova_compute[226235]: 2026-01-31 07:49:23.714 226239 INFO nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 31 02:49:23 np0005603623 nova_compute[226235]: 2026-01-31 07:49:23.719 226239 DEBUG nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:49:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:24.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:24 np0005603623 nova_compute[226235]: 2026-01-31 07:49:24.679 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845764.6787572, d3f77a29-d3a7-444c-9528-ab679b9b946c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:49:24 np0005603623 nova_compute[226235]: 2026-01-31 07:49:24.680 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] VM Resumed (Lifecycle Event)
Jan 31 02:49:24 np0005603623 nova_compute[226235]: 2026-01-31 07:49:24.682 226239 DEBUG nova.compute.manager [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 02:49:24 np0005603623 nova_compute[226235]: 2026-01-31 07:49:24.682 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 02:49:24 np0005603623 nova_compute[226235]: 2026-01-31 07:49:24.685 226239 INFO nova.virt.libvirt.driver [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance spawned successfully.
Jan 31 02:49:24 np0005603623 nova_compute[226235]: 2026-01-31 07:49:24.685 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.133 226239 DEBUG nova.objects.instance [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.140 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.146 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.151 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.152 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.153 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.153 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.154 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.155 226239 DEBUG nova.virt.libvirt.driver [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.234 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.234 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845764.6802373, d3f77a29-d3a7-444c-9528-ab679b9b946c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.234 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] VM Started (Lifecycle Event)
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.318 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.322 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.355 226239 DEBUG nova.compute.manager [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.364 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 31 02:49:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:25.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.471 226239 DEBUG oslo_concurrency.lockutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.471 226239 DEBUG oslo_concurrency.lockutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.472 226239 DEBUG nova.objects.instance [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.641 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:25 np0005603623 nova_compute[226235]: 2026-01-31 07:49:25.693 226239 DEBUG oslo_concurrency.lockutils [None req-7dae6a0b-c375-4b0a-9b58-b8b489568685 cd588efdd47d4cefafe6efce9dd7d09e 2c52201ee1a6452388ca22ab06da1e56 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:26 np0005603623 nova_compute[226235]: 2026-01-31 07:49:26.526 226239 DEBUG oslo_concurrency.lockutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Acquiring lock "d3f77a29-d3a7-444c-9528-ab679b9b946c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:26 np0005603623 nova_compute[226235]: 2026-01-31 07:49:26.526 226239 DEBUG oslo_concurrency.lockutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "d3f77a29-d3a7-444c-9528-ab679b9b946c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:26 np0005603623 nova_compute[226235]: 2026-01-31 07:49:26.526 226239 DEBUG oslo_concurrency.lockutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Acquiring lock "d3f77a29-d3a7-444c-9528-ab679b9b946c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:26 np0005603623 nova_compute[226235]: 2026-01-31 07:49:26.527 226239 DEBUG oslo_concurrency.lockutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "d3f77a29-d3a7-444c-9528-ab679b9b946c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:26 np0005603623 nova_compute[226235]: 2026-01-31 07:49:26.527 226239 DEBUG oslo_concurrency.lockutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "d3f77a29-d3a7-444c-9528-ab679b9b946c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:26 np0005603623 nova_compute[226235]: 2026-01-31 07:49:26.528 226239 INFO nova.compute.manager [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Terminating instance
Jan 31 02:49:26 np0005603623 nova_compute[226235]: 2026-01-31 07:49:26.528 226239 DEBUG oslo_concurrency.lockutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Acquiring lock "refresh_cache-d3f77a29-d3a7-444c-9528-ab679b9b946c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:49:26 np0005603623 nova_compute[226235]: 2026-01-31 07:49:26.529 226239 DEBUG oslo_concurrency.lockutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Acquired lock "refresh_cache-d3f77a29-d3a7-444c-9528-ab679b9b946c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 02:49:26 np0005603623 nova_compute[226235]: 2026-01-31 07:49:26.529 226239 DEBUG nova.network.neutron [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 02:49:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:26.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:26 np0005603623 nova_compute[226235]: 2026-01-31 07:49:26.784 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:26 np0005603623 nova_compute[226235]: 2026-01-31 07:49:26.857 226239 DEBUG nova.network.neutron [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 02:49:27 np0005603623 nova_compute[226235]: 2026-01-31 07:49:27.257 226239 DEBUG nova.network.neutron [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:49:27 np0005603623 nova_compute[226235]: 2026-01-31 07:49:27.320 226239 DEBUG oslo_concurrency.lockutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Releasing lock "refresh_cache-d3f77a29-d3a7-444c-9528-ab679b9b946c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 02:49:27 np0005603623 nova_compute[226235]: 2026-01-31 07:49:27.321 226239 DEBUG nova.compute.manager [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 02:49:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:27.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:27 np0005603623 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 31 02:49:27 np0005603623 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Consumed 4.109s CPU time.
Jan 31 02:49:27 np0005603623 systemd-machined[194379]: Machine qemu-9-instance-0000000c terminated.
Jan 31 02:49:27 np0005603623 nova_compute[226235]: 2026-01-31 07:49:27.738 226239 INFO nova.virt.libvirt.driver [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance destroyed successfully.
Jan 31 02:49:27 np0005603623 nova_compute[226235]: 2026-01-31 07:49:27.739 226239 DEBUG nova.objects.instance [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lazy-loading 'resources' on Instance uuid d3f77a29-d3a7-444c-9528-ab679b9b946c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:49:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e149 e149: 3 total, 3 up, 3 in
Jan 31 02:49:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:49:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:28.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:49:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:49:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:29.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:49:29 np0005603623 podman[235702]: 2026-01-31 07:49:29.830538295 +0000 UTC m=+0.268785368 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Jan 31 02:49:30 np0005603623 podman[235702]: 2026-01-31 07:49:30.041811344 +0000 UTC m=+0.480058387 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:49:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:30.084 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:30.085 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:30.086 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:30.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:30 np0005603623 nova_compute[226235]: 2026-01-31 07:49:30.643 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:30 np0005603623 podman[235852]: 2026-01-31 07:49:30.648292795 +0000 UTC m=+0.074019258 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:49:30 np0005603623 podman[235873]: 2026-01-31 07:49:30.743703833 +0000 UTC m=+0.083433284 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:49:30 np0005603623 podman[235852]: 2026-01-31 07:49:30.749491675 +0000 UTC m=+0.175218128 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:49:30 np0005603623 podman[235915]: 2026-01-31 07:49:30.922295596 +0000 UTC m=+0.053233624 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, architecture=x86_64, name=keepalived, com.redhat.component=keepalived-container, version=2.2.4, vendor=Red Hat, Inc., description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, io.buildah.version=1.28.2)
Jan 31 02:49:30 np0005603623 podman[235915]: 2026-01-31 07:49:30.933522449 +0000 UTC m=+0.064460467 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, vcs-type=git, name=keepalived, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, io.openshift.expose-services=, version=2.2.4, release=1793, distribution-scope=public, build-date=2023-02-22T09:23:20)
Jan 31 02:49:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:49:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:31.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:49:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:31 np0005603623 nova_compute[226235]: 2026-01-31 07:49:31.786 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:32.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:49:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:49:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:49:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:49:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:49:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:49:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:33.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:49:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:34.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:35.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:35 np0005603623 nova_compute[226235]: 2026-01-31 07:49:35.645 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:35 np0005603623 nova_compute[226235]: 2026-01-31 07:49:35.843 226239 INFO nova.virt.libvirt.driver [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Deleting instance files /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c_del
Jan 31 02:49:35 np0005603623 nova_compute[226235]: 2026-01-31 07:49:35.844 226239 INFO nova.virt.libvirt.driver [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Deletion of /var/lib/nova/instances/d3f77a29-d3a7-444c-9528-ab679b9b946c_del complete
Jan 31 02:49:36 np0005603623 nova_compute[226235]: 2026-01-31 07:49:36.360 226239 INFO nova.compute.manager [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Took 9.04 seconds to destroy the instance on the hypervisor.
Jan 31 02:49:36 np0005603623 nova_compute[226235]: 2026-01-31 07:49:36.360 226239 DEBUG oslo.service.loopingcall [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 31 02:49:36 np0005603623 nova_compute[226235]: 2026-01-31 07:49:36.360 226239 DEBUG nova.compute.manager [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 31 02:49:36 np0005603623 nova_compute[226235]: 2026-01-31 07:49:36.360 226239 DEBUG nova.network.neutron [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 31 02:49:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:36.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:36 np0005603623 nova_compute[226235]: 2026-01-31 07:49:36.788 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:36 np0005603623 nova_compute[226235]: 2026-01-31 07:49:36.800 226239 DEBUG nova.network.neutron [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 02:49:36 np0005603623 nova_compute[226235]: 2026-01-31 07:49:36.859 226239 DEBUG nova.network.neutron [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:49:36 np0005603623 nova_compute[226235]: 2026-01-31 07:49:36.894 226239 INFO nova.compute.manager [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Took 0.53 seconds to deallocate network for instance.
Jan 31 02:49:36 np0005603623 nova_compute[226235]: 2026-01-31 07:49:36.955 226239 DEBUG oslo_concurrency.lockutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:36 np0005603623 nova_compute[226235]: 2026-01-31 07:49:36.955 226239 DEBUG oslo_concurrency.lockutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:37 np0005603623 nova_compute[226235]: 2026-01-31 07:49:37.035 226239 DEBUG oslo_concurrency.processutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:49:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:49:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2519221345' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:49:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:49:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:37.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:49:37 np0005603623 nova_compute[226235]: 2026-01-31 07:49:37.481 226239 DEBUG oslo_concurrency.processutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:49:37 np0005603623 nova_compute[226235]: 2026-01-31 07:49:37.486 226239 DEBUG nova.compute.provider_tree [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:49:37 np0005603623 nova_compute[226235]: 2026-01-31 07:49:37.510 226239 DEBUG nova.scheduler.client.report [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:49:37 np0005603623 nova_compute[226235]: 2026-01-31 07:49:37.540 226239 DEBUG oslo_concurrency.lockutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:37 np0005603623 nova_compute[226235]: 2026-01-31 07:49:37.579 226239 INFO nova.scheduler.client.report [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Deleted allocations for instance d3f77a29-d3a7-444c-9528-ab679b9b946c
Jan 31 02:49:37 np0005603623 nova_compute[226235]: 2026-01-31 07:49:37.657 226239 DEBUG oslo_concurrency.lockutils [None req-bd99137b-8494-4f88-8922-3bc74159bb3d 4e887d8783db44ff93a55e1ea75aa78e 2ca2f28405884a6ea92bcde9c8f91ff9 - - default default] Lock "d3f77a29-d3a7-444c-9528-ab679b9b946c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 11.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:38.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:39.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:49:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:40.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:40 np0005603623 nova_compute[226235]: 2026-01-31 07:49:40.647 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:49:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:41.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:41 np0005603623 nova_compute[226235]: 2026-01-31 07:49:41.789 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:42.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:42 np0005603623 nova_compute[226235]: 2026-01-31 07:49:42.737 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845767.7355876, d3f77a29-d3a7-444c-9528-ab679b9b946c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:49:42 np0005603623 nova_compute[226235]: 2026-01-31 07:49:42.737 226239 INFO nova.compute.manager [-] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] VM Stopped (Lifecycle Event)
Jan 31 02:49:42 np0005603623 nova_compute[226235]: 2026-01-31 07:49:42.814 226239 DEBUG nova.compute.manager [None req-815a90ba-ba07-4012-8f9f-d5650ea4a819 - - - - - -] [instance: d3f77a29-d3a7-444c-9528-ab679b9b946c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:49:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:43.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:44.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:45.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:45 np0005603623 nova_compute[226235]: 2026-01-31 07:49:45.648 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:46.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:46 np0005603623 nova_compute[226235]: 2026-01-31 07:49:46.793 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:49:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:49:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:47.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.283 226239 DEBUG nova.compute.manager [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.397 226239 DEBUG oslo_concurrency.lockutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.398 226239 DEBUG oslo_concurrency.lockutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.480 226239 DEBUG nova.objects.instance [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'pci_requests' on Instance uuid 620b3405-251d-4545-b523-faa35768224b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.532 226239 DEBUG nova.virt.hardware [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.533 226239 INFO nova.compute.claims [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Claim successful on node compute-2.ctlplane.example.com
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.533 226239 DEBUG nova.objects.instance [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'resources' on Instance uuid 620b3405-251d-4545-b523-faa35768224b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:49:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:48.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.568 226239 DEBUG nova.objects.instance [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'pci_devices' on Instance uuid 620b3405-251d-4545-b523-faa35768224b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.622 226239 INFO nova.compute.resource_tracker [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Updating resource usage from migration e0f17e23-e3f6-4dbf-9bcf-e53ad1ea7163
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.622 226239 DEBUG nova.compute.resource_tracker [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Starting to track incoming migration e0f17e23-e3f6-4dbf-9bcf-e53ad1ea7163 with flavor f75c4aee-d826-4343-a7e3-f06a4b21de52 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.676 226239 DEBUG nova.scheduler.client.report [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.701 226239 DEBUG nova.scheduler.client.report [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.702 226239 DEBUG nova.compute.provider_tree [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.728 226239 DEBUG nova.scheduler.client.report [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.761 226239 DEBUG nova.scheduler.client.report [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 31 02:49:48 np0005603623 nova_compute[226235]: 2026-01-31 07:49:48.907 226239 DEBUG oslo_concurrency.processutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.103 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.104 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.133 226239 DEBUG nova.compute.manager [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.209 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:49:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4026404739' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.392 226239 DEBUG oslo_concurrency.processutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.397 226239 DEBUG nova.compute.provider_tree [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.431 226239 DEBUG nova.scheduler.client.report [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.467 226239 DEBUG oslo_concurrency.lockutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 1.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.468 226239 INFO nova.compute.manager [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Migrating
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.474 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:49.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.482 226239 DEBUG nova.virt.hardware [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.482 226239 INFO nova.compute.claims [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Claim successful on node compute-2.ctlplane.example.com
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.526 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.527 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.560 226239 DEBUG nova.compute.manager [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.687 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:49:49 np0005603623 nova_compute[226235]: 2026-01-31 07:49:49.881 226239 DEBUG oslo_concurrency.processutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:49:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:49:50 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2876606483' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.303 226239 DEBUG oslo_concurrency.processutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.308 226239 DEBUG nova.compute.provider_tree [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.338 226239 DEBUG nova.scheduler.client.report [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.365 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.366 226239 DEBUG nova.compute.manager [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.369 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.374 226239 DEBUG nova.virt.hardware [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.374 226239 INFO nova.compute.claims [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Claim successful on node compute-2.ctlplane.example.com
Jan 31 02:49:50 np0005603623 ovn_controller[133449]: 2026-01-31T07:49:50Z|00090|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.481 226239 DEBUG nova.compute.manager [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.482 226239 DEBUG nova.network.neutron [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.510 226239 INFO nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.546 226239 DEBUG nova.compute.manager [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:49:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:49:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:50.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.627 226239 INFO nova.virt.block_device [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Booting with volume 74f8a6d0-259e-466b-a484-4c7bffded2e1 at /dev/vda#033[00m
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.650 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.662 226239 DEBUG oslo_concurrency.processutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:50 np0005603623 nova_compute[226235]: 2026-01-31 07:49:50.804 226239 DEBUG nova.policy [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '795c7f392cbc45f0885f081449883d42', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:49:50 np0005603623 podman[236277]: 2026-01-31 07:49:50.983383318 +0000 UTC m=+0.074980017 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 02:49:51 np0005603623 podman[236278]: 2026-01-31 07:49:50.999945538 +0000 UTC m=+0.091328910 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.022 226239 DEBUG os_brick.utils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.024 226239 INFO oslo.privsep.daemon [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmpjbg7gfrb/privsep.sock']#033[00m
Jan 31 02:49:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:49:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1878218249' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.091 226239 DEBUG oslo_concurrency.processutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.096 226239 DEBUG nova.compute.provider_tree [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:49:51 np0005603623 systemd[1]: Created slice User Slice of UID 42436.
Jan 31 02:49:51 np0005603623 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 31 02:49:51 np0005603623 systemd-logind[795]: New session 51 of user nova.
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.144 226239 DEBUG nova.scheduler.client.report [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:49:51 np0005603623 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 31 02:49:51 np0005603623 systemd[1]: Starting User Manager for UID 42436...
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.178 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.179 226239 DEBUG nova.compute.manager [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:49:51 np0005603623 systemd[236328]: Queued start job for default target Main User Target.
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.273 226239 DEBUG nova.compute.manager [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.274 226239 DEBUG nova.network.neutron [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:49:51 np0005603623 systemd[236328]: Created slice User Application Slice.
Jan 31 02:49:51 np0005603623 systemd[236328]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 02:49:51 np0005603623 systemd[236328]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 02:49:51 np0005603623 systemd[236328]: Reached target Paths.
Jan 31 02:49:51 np0005603623 systemd[236328]: Reached target Timers.
Jan 31 02:49:51 np0005603623 systemd[236328]: Starting D-Bus User Message Bus Socket...
Jan 31 02:49:51 np0005603623 systemd[236328]: Starting Create User's Volatile Files and Directories...
Jan 31 02:49:51 np0005603623 systemd[236328]: Listening on D-Bus User Message Bus Socket.
Jan 31 02:49:51 np0005603623 systemd[236328]: Reached target Sockets.
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.301 226239 INFO nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:49:51 np0005603623 systemd[236328]: Finished Create User's Volatile Files and Directories.
Jan 31 02:49:51 np0005603623 systemd[236328]: Reached target Basic System.
Jan 31 02:49:51 np0005603623 systemd[236328]: Reached target Main User Target.
Jan 31 02:49:51 np0005603623 systemd[236328]: Startup finished in 142ms.
Jan 31 02:49:51 np0005603623 systemd[1]: Started User Manager for UID 42436.
Jan 31 02:49:51 np0005603623 systemd[1]: Started Session 51 of User nova.
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.335 226239 DEBUG nova.compute.manager [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:49:51 np0005603623 systemd[1]: session-51.scope: Deactivated successfully.
Jan 31 02:49:51 np0005603623 systemd-logind[795]: Session 51 logged out. Waiting for processes to exit.
Jan 31 02:49:51 np0005603623 systemd-logind[795]: Removed session 51.
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.435 226239 INFO nova.virt.block_device [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Booting with volume 317b1c6b-4f89-402c-94d1-f4852844f1e2 at /dev/vda#033[00m
Jan 31 02:49:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:49:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:51.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:49:51 np0005603623 systemd-logind[795]: New session 53 of user nova.
Jan 31 02:49:51 np0005603623 systemd[1]: Started Session 53 of User nova.
Jan 31 02:49:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:51 np0005603623 systemd[1]: session-53.scope: Deactivated successfully.
Jan 31 02:49:51 np0005603623 systemd-logind[795]: Session 53 logged out. Waiting for processes to exit.
Jan 31 02:49:51 np0005603623 systemd-logind[795]: Removed session 53.
Jan 31 02:49:51 np0005603623 nova_compute[226235]: 2026-01-31 07:49:51.795 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.136 226239 INFO oslo.privsep.daemon [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.033 236401 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.036 236401 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.039 236401 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.039 236401 INFO oslo.privsep.daemon [-] privsep daemon running as pid 236401#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.138 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[ad8b42a8-80eb-484b-8b34-96ca91bed13b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.183 226239 DEBUG os_brick.utils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.235 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.236 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.260 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.261 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce36b1a-f020-4ee1-b5d4-177ed8ecce16]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.261 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.261 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[0707bdd4-8ca7-4a63-ae9e-07c4c30f9f5d]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.262 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.262 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.268 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.268 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[e16ddfd0-b6bd-472b-9d37-8eb065ffd3c7]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.270 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.272 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.273 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3b59e1-bd16-4df1-92b0-f1a17b8f6d1c]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.274 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.305 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.305 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.305 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[9708e8a5-d56a-4c41-8cc3-af4470d904ae]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.305 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[4b46857f-4ac6-4b93-8b79-1bdb44f11ecc]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.308 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[26f48d5b-745e-4cf8-9949-062e95fe295e]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.308 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[bb43cfcd-d567-4621-9a9b-96f6827981ec]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.309 226239 DEBUG oslo_concurrency.processutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.372 226239 DEBUG oslo_concurrency.processutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.388 226239 DEBUG oslo_concurrency.processutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "nvme version" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.391 226239 DEBUG oslo_concurrency.processutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "nvme version" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.391 226239 DEBUG os_brick.initiator.connectors.lightos [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.392 226239 DEBUG os_brick.initiator.connectors.lightos [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.392 226239 DEBUG os_brick.initiator.connectors.lightos [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.393 226239 DEBUG os_brick.utils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] <== get_connector_properties: return (208ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.393 226239 DEBUG nova.virt.block_device [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating existing volume attachment record: 4b9eb10a-fc66-42c4-9e17-3b8136950b54 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.398 226239 DEBUG os_brick.initiator.connectors.lightos [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.398 226239 DEBUG os_brick.initiator.connectors.lightos [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.398 226239 DEBUG os_brick.initiator.connectors.lightos [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.398 226239 DEBUG os_brick.utils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] <== get_connector_properties: return (1375ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.399 226239 DEBUG nova.virt.block_device [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating existing volume attachment record: da773656-1631-4e7b-855d-ed146c908f6b _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 02:49:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:52.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.692 226239 DEBUG nova.policy [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '37ed25cc14814a29867ac308b3cce8cf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:52.859 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:49:52 np0005603623 nova_compute[226235]: 2026-01-31 07:49:52.859 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:49:52.860 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:49:53 np0005603623 nova_compute[226235]: 2026-01-31 07:49:53.107 226239 DEBUG nova.network.neutron [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Successfully created port: 31ab3c80-791f-418d-a70b-fcb0d523a037 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:49:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:49:53 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2679277477' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:49:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:53.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:54.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:49:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:55.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:49:55 np0005603623 nova_compute[226235]: 2026-01-31 07:49:55.651 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:56.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.763 226239 DEBUG nova.compute.manager [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.764 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.765 226239 INFO nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Creating image(s)#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.765 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.765 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Ensure instance console log exists: /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.765 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.766 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.766 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.796 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.811 226239 DEBUG nova.compute.manager [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.812 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.812 226239 INFO nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Creating image(s)#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.812 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.812 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Ensure instance console log exists: /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.813 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.813 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:56 np0005603623 nova_compute[226235]: 2026-01-31 07:49:56.813 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:57 np0005603623 nova_compute[226235]: 2026-01-31 07:49:57.125 226239 DEBUG nova.network.neutron [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Successfully created port: ab24842b-0045-41e6-b6dc-51b110b51829 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:49:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:57.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:57 np0005603623 nova_compute[226235]: 2026-01-31 07:49:57.809 226239 DEBUG nova.network.neutron [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Successfully updated port: 31ab3c80-791f-418d-a70b-fcb0d523a037 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:49:57 np0005603623 nova_compute[226235]: 2026-01-31 07:49:57.832 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:49:57 np0005603623 nova_compute[226235]: 2026-01-31 07:49:57.833 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquired lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:49:57 np0005603623 nova_compute[226235]: 2026-01-31 07:49:57.833 226239 DEBUG nova.network.neutron [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:49:58 np0005603623 nova_compute[226235]: 2026-01-31 07:49:58.009 226239 DEBUG nova.compute.manager [req-f9fc88bf-3d86-4416-9490-b85bc7c489fd req-648357dc-e546-48bc-8cda-6b88e4b4896e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-changed-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:49:58 np0005603623 nova_compute[226235]: 2026-01-31 07:49:58.009 226239 DEBUG nova.compute.manager [req-f9fc88bf-3d86-4416-9490-b85bc7c489fd req-648357dc-e546-48bc-8cda-6b88e4b4896e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Refreshing instance network info cache due to event network-changed-31ab3c80-791f-418d-a70b-fcb0d523a037. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:49:58 np0005603623 nova_compute[226235]: 2026-01-31 07:49:58.009 226239 DEBUG oslo_concurrency.lockutils [req-f9fc88bf-3d86-4416-9490-b85bc7c489fd req-648357dc-e546-48bc-8cda-6b88e4b4896e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:49:58 np0005603623 nova_compute[226235]: 2026-01-31 07:49:58.186 226239 DEBUG nova.network.neutron [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:49:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:58.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:58 np0005603623 nova_compute[226235]: 2026-01-31 07:49:58.619 226239 DEBUG nova.network.neutron [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Successfully updated port: ab24842b-0045-41e6-b6dc-51b110b51829 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:49:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:49:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:49:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:59.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.731 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.731 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquired lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.731 226239 DEBUG nova.network.neutron [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.874 226239 DEBUG nova.network.neutron [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating instance_info_cache with network_info: [{"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.915 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Releasing lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.915 226239 DEBUG nova.compute.manager [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Instance network_info: |[{"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.916 226239 DEBUG oslo_concurrency.lockutils [req-f9fc88bf-3d86-4416-9490-b85bc7c489fd req-648357dc-e546-48bc-8cda-6b88e4b4896e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.916 226239 DEBUG nova.network.neutron [req-f9fc88bf-3d86-4416-9490-b85bc7c489fd req-648357dc-e546-48bc-8cda-6b88e4b4896e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Refreshing network info cache for port 31ab3c80-791f-418d-a70b-fcb0d523a037 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.920 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Start _get_guest_xml network_info=[{"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': 'da773656-1631-4e7b-855d-ed146c908f6b', 'delete_on_termination': True, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-74f8a6d0-259e-466b-a484-4c7bffded2e1', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '74f8a6d0-259e-466b-a484-4c7bffded2e1', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '4b48cc05-9edd-4e4d-a58e-84564afb0612', 'attached_at': '', 'detached_at': '', 'volume_id': '74f8a6d0-259e-466b-a484-4c7bffded2e1', 'serial': '74f8a6d0-259e-466b-a484-4c7bffded2e1'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.925 226239 WARNING nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.935 226239 DEBUG nova.virt.libvirt.host [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.936 226239 DEBUG nova.virt.libvirt.host [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.940 226239 DEBUG nova.virt.libvirt.host [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.940 226239 DEBUG nova.virt.libvirt.host [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.943 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.943 226239 DEBUG nova.virt.hardware [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.943 226239 DEBUG nova.virt.hardware [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.944 226239 DEBUG nova.virt.hardware [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.944 226239 DEBUG nova.virt.hardware [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.944 226239 DEBUG nova.virt.hardware [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.944 226239 DEBUG nova.virt.hardware [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.944 226239 DEBUG nova.virt.hardware [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.945 226239 DEBUG nova.virt.hardware [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.945 226239 DEBUG nova.virt.hardware [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.945 226239 DEBUG nova.virt.hardware [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.945 226239 DEBUG nova.virt.hardware [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.970 226239 DEBUG nova.storage.rbd_utils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] rbd image 4b48cc05-9edd-4e4d-a58e-84564afb0612_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:49:59 np0005603623 nova_compute[226235]: 2026-01-31 07:49:59.973 226239 DEBUG oslo_concurrency.processutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.212 226239 DEBUG nova.compute.manager [req-46b11eb1-863d-4c94-b2e6-f6e5521ece05 req-07e558b0-100a-458c-865d-228f38c99a31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-changed-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.213 226239 DEBUG nova.compute.manager [req-46b11eb1-863d-4c94-b2e6-f6e5521ece05 req-07e558b0-100a-458c-865d-228f38c99a31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Refreshing instance network info cache due to event network-changed-ab24842b-0045-41e6-b6dc-51b110b51829. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.213 226239 DEBUG oslo_concurrency.lockutils [req-46b11eb1-863d-4c94-b2e6-f6e5521ece05 req-07e558b0-100a-458c-865d-228f38c99a31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.293 226239 DEBUG nova.network.neutron [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:50:00 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 02:50:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:50:00 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4207730133' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.505 226239 DEBUG oslo_concurrency.processutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.506 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.506 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.507 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:00.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.653 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.839 226239 DEBUG nova.virt.libvirt.vif [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1009708622',display_name='tempest-LiveMigrationTest-server-1009708622',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1009708622',id=19,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-878znybl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-126681982',owner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:49:50Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=4b48cc05-9edd-4e4d-a58e-84564afb0612,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.840 226239 DEBUG nova.network.os_vif_util [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Converting VIF {"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.841 226239 DEBUG nova.network.os_vif_util [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.842 226239 DEBUG nova.objects.instance [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lazy-loading 'pci_devices' on Instance uuid 4b48cc05-9edd-4e4d-a58e-84564afb0612 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.878 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  <uuid>4b48cc05-9edd-4e4d-a58e-84564afb0612</uuid>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  <name>instance-00000013</name>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <nova:name>tempest-LiveMigrationTest-server-1009708622</nova:name>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:49:59</nova:creationTime>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <nova:user uuid="795c7f392cbc45f0885f081449883d42">tempest-LiveMigrationTest-126681982-project-member</nova:user>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <nova:project uuid="cbdbb7a4b22a49b68feb3e028bb62fbb">tempest-LiveMigrationTest-126681982</nova:project>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <nova:port uuid="31ab3c80-791f-418d-a70b-fcb0d523a037">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <entry name="serial">4b48cc05-9edd-4e4d-a58e-84564afb0612</entry>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <entry name="uuid">4b48cc05-9edd-4e4d-a58e-84564afb0612</entry>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4b48cc05-9edd-4e4d-a58e-84564afb0612_disk.config">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-74f8a6d0-259e-466b-a484-4c7bffded2e1">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <serial>74f8a6d0-259e-466b-a484-4c7bffded2e1</serial>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:8b:11:0b"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <target dev="tap31ab3c80-79"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612/console.log" append="off"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:50:00 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:50:00 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:50:00 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:50:00 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.879 226239 DEBUG nova.compute.manager [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Preparing to wait for external event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.879 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.879 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.879 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.880 226239 DEBUG nova.virt.libvirt.vif [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1009708622',display_name='tempest-LiveMigrationTest-server-1009708622',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1009708622',id=19,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-878znybl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-126681982',owner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:49:50Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=4b48cc05-9edd-4e4d-a58e-84564afb0612,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.880 226239 DEBUG nova.network.os_vif_util [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Converting VIF {"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.881 226239 DEBUG nova.network.os_vif_util [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.881 226239 DEBUG os_vif [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.882 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.882 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.882 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.884 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.885 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31ab3c80-79, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.885 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap31ab3c80-79, col_values=(('external_ids', {'iface-id': '31ab3c80-791f-418d-a70b-fcb0d523a037', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:11:0b', 'vm-uuid': '4b48cc05-9edd-4e4d-a58e-84564afb0612'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:00 np0005603623 NetworkManager[48970]: <info>  [1769845800.8871] manager: (tap31ab3c80-79): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.888 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.892 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:00 np0005603623 nova_compute[226235]: 2026-01-31 07:50:00.892 226239 INFO os_vif [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79')#033[00m
Jan 31 02:50:01 np0005603623 nova_compute[226235]: 2026-01-31 07:50:01.073 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:50:01 np0005603623 nova_compute[226235]: 2026-01-31 07:50:01.073 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:50:01 np0005603623 nova_compute[226235]: 2026-01-31 07:50:01.073 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] No VIF found with MAC fa:16:3e:8b:11:0b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:50:01 np0005603623 nova_compute[226235]: 2026-01-31 07:50:01.074 226239 INFO nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Using config drive#033[00m
Jan 31 02:50:01 np0005603623 nova_compute[226235]: 2026-01-31 07:50:01.097 226239 DEBUG nova.storage.rbd_utils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] rbd image 4b48cc05-9edd-4e4d-a58e-84564afb0612_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:01.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:01 np0005603623 systemd[1]: Stopping User Manager for UID 42436...
Jan 31 02:50:01 np0005603623 systemd[236328]: Activating special unit Exit the Session...
Jan 31 02:50:01 np0005603623 systemd[236328]: Stopped target Main User Target.
Jan 31 02:50:01 np0005603623 systemd[236328]: Stopped target Basic System.
Jan 31 02:50:01 np0005603623 systemd[236328]: Stopped target Paths.
Jan 31 02:50:01 np0005603623 systemd[236328]: Stopped target Sockets.
Jan 31 02:50:01 np0005603623 systemd[236328]: Stopped target Timers.
Jan 31 02:50:01 np0005603623 systemd[236328]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 31 02:50:01 np0005603623 systemd[236328]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 31 02:50:01 np0005603623 systemd[236328]: Closed D-Bus User Message Bus Socket.
Jan 31 02:50:01 np0005603623 systemd[236328]: Stopped Create User's Volatile Files and Directories.
Jan 31 02:50:01 np0005603623 systemd[236328]: Removed slice User Application Slice.
Jan 31 02:50:01 np0005603623 systemd[236328]: Reached target Shutdown.
Jan 31 02:50:01 np0005603623 systemd[236328]: Finished Exit the Session.
Jan 31 02:50:01 np0005603623 systemd[236328]: Reached target Exit the Session.
Jan 31 02:50:01 np0005603623 systemd[1]: user@42436.service: Deactivated successfully.
Jan 31 02:50:01 np0005603623 systemd[1]: Stopped User Manager for UID 42436.
Jan 31 02:50:01 np0005603623 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 31 02:50:01 np0005603623 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 31 02:50:01 np0005603623 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 31 02:50:01 np0005603623 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 31 02:50:01 np0005603623 systemd[1]: Removed slice User Slice of UID 42436.
Jan 31 02:50:01 np0005603623 nova_compute[226235]: 2026-01-31 07:50:01.774 226239 INFO nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Creating config drive at /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612/disk.config#033[00m
Jan 31 02:50:01 np0005603623 nova_compute[226235]: 2026-01-31 07:50:01.778 226239 DEBUG oslo_concurrency.processutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpq0d8hl1d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:01 np0005603623 nova_compute[226235]: 2026-01-31 07:50:01.798 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:01.862 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:01 np0005603623 nova_compute[226235]: 2026-01-31 07:50:01.896 226239 DEBUG oslo_concurrency.processutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpq0d8hl1d" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:01 np0005603623 nova_compute[226235]: 2026-01-31 07:50:01.926 226239 DEBUG nova.storage.rbd_utils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] rbd image 4b48cc05-9edd-4e4d-a58e-84564afb0612_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:01 np0005603623 nova_compute[226235]: 2026-01-31 07:50:01.931 226239 DEBUG oslo_concurrency.processutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612/disk.config 4b48cc05-9edd-4e4d-a58e-84564afb0612_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:02.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.616 226239 DEBUG nova.network.neutron [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating instance_info_cache with network_info: [{"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.676 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Releasing lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.676 226239 DEBUG nova.compute.manager [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Instance network_info: |[{"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.677 226239 DEBUG oslo_concurrency.lockutils [req-46b11eb1-863d-4c94-b2e6-f6e5521ece05 req-07e558b0-100a-458c-865d-228f38c99a31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.677 226239 DEBUG nova.network.neutron [req-46b11eb1-863d-4c94-b2e6-f6e5521ece05 req-07e558b0-100a-458c-865d-228f38c99a31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Refreshing network info cache for port ab24842b-0045-41e6-b6dc-51b110b51829 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.680 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Start _get_guest_xml network_info=[{"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': '4b9eb10a-fc66-42c4-9e17-3b8136950b54', 'delete_on_termination': True, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-317b1c6b-4f89-402c-94d1-f4852844f1e2', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '317b1c6b-4f89-402c-94d1-f4852844f1e2', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '79350fb7-3eed-4a3b-a7e9-f0ec90460ac3', 'attached_at': '', 'detached_at': '', 'volume_id': '317b1c6b-4f89-402c-94d1-f4852844f1e2', 'serial': '317b1c6b-4f89-402c-94d1-f4852844f1e2'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.683 226239 WARNING nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.687 226239 DEBUG nova.virt.libvirt.host [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.688 226239 DEBUG nova.virt.libvirt.host [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.696 226239 DEBUG nova.virt.libvirt.host [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.697 226239 DEBUG nova.virt.libvirt.host [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.699 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.700 226239 DEBUG nova.virt.hardware [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.701 226239 DEBUG nova.virt.hardware [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.701 226239 DEBUG nova.virt.hardware [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.701 226239 DEBUG nova.virt.hardware [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.701 226239 DEBUG nova.virt.hardware [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.702 226239 DEBUG nova.virt.hardware [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.702 226239 DEBUG nova.virt.hardware [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.702 226239 DEBUG nova.virt.hardware [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.702 226239 DEBUG nova.virt.hardware [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.703 226239 DEBUG nova.virt.hardware [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.703 226239 DEBUG nova.virt.hardware [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.731 226239 DEBUG nova.storage.rbd_utils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] rbd image 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:02 np0005603623 nova_compute[226235]: 2026-01-31 07:50:02.735 226239 DEBUG oslo_concurrency.processutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:50:03 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2471101319' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.145 226239 DEBUG oslo_concurrency.processutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.182 226239 DEBUG nova.virt.libvirt.vif [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:49:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1973231276',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1973231276',id=20,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-kz6k0bwy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:49:51Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=79350fb7-3eed-4a3b-a7e9-f0ec90460ac3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.183 226239 DEBUG nova.network.os_vif_util [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Converting VIF {"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.184 226239 DEBUG nova.network.os_vif_util [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.184 226239 DEBUG nova.objects.instance [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lazy-loading 'pci_devices' on Instance uuid 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.222 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  <uuid>79350fb7-3eed-4a3b-a7e9-f0ec90460ac3</uuid>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  <name>instance-00000014</name>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1973231276</nova:name>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:50:02</nova:creationTime>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <nova:user uuid="37ed25cc14814a29867ac308b3cce8cf">tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member</nova:user>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <nova:project uuid="e66a774f63ae4139a4e75c7973fbe077">tempest-LiveAutoBlockMigrationV225Test-2072827810</nova:project>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <nova:port uuid="ab24842b-0045-41e6-b6dc-51b110b51829">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <entry name="serial">79350fb7-3eed-4a3b-a7e9-f0ec90460ac3</entry>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <entry name="uuid">79350fb7-3eed-4a3b-a7e9-f0ec90460ac3</entry>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3_disk.config">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-317b1c6b-4f89-402c-94d1-f4852844f1e2">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <serial>317b1c6b-4f89-402c-94d1-f4852844f1e2</serial>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:b4:a1:9e"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <target dev="tapab24842b-00"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3/console.log" append="off"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:50:03 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:50:03 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:50:03 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:50:03 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.223 226239 DEBUG nova.compute.manager [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Preparing to wait for external event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.223 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.224 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.224 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.224 226239 DEBUG nova.virt.libvirt.vif [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:49:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1973231276',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1973231276',id=20,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-kz6k0bwy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_ce
rts=None,updated_at=2026-01-31T07:49:51Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=79350fb7-3eed-4a3b-a7e9-f0ec90460ac3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.225 226239 DEBUG nova.network.os_vif_util [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Converting VIF {"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.225 226239 DEBUG nova.network.os_vif_util [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.225 226239 DEBUG os_vif [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.226 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.226 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.226 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.254 226239 DEBUG nova.network.neutron [req-f9fc88bf-3d86-4416-9490-b85bc7c489fd req-648357dc-e546-48bc-8cda-6b88e4b4896e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updated VIF entry in instance network info cache for port 31ab3c80-791f-418d-a70b-fcb0d523a037. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.254 226239 DEBUG nova.network.neutron [req-f9fc88bf-3d86-4416-9490-b85bc7c489fd req-648357dc-e546-48bc-8cda-6b88e4b4896e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating instance_info_cache with network_info: [{"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.259 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.259 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab24842b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.259 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapab24842b-00, col_values=(('external_ids', {'iface-id': 'ab24842b-0045-41e6-b6dc-51b110b51829', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:a1:9e', 'vm-uuid': '79350fb7-3eed-4a3b-a7e9-f0ec90460ac3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.261 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:03 np0005603623 NetworkManager[48970]: <info>  [1769845803.2620] manager: (tapab24842b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.264 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.266 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.267 226239 INFO os_vif [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00')#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.282 226239 DEBUG oslo_concurrency.lockutils [req-f9fc88bf-3d86-4416-9490-b85bc7c489fd req-648357dc-e546-48bc-8cda-6b88e4b4896e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.438 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.438 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.439 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] No VIF found with MAC fa:16:3e:b4:a1:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.439 226239 INFO nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Using config drive#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.470 226239 DEBUG nova.storage.rbd_utils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] rbd image 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:50:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:03.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.945 226239 DEBUG oslo_concurrency.processutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612/disk.config 4b48cc05-9edd-4e4d-a58e-84564afb0612_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.946 226239 INFO nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Deleting local config drive /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612/disk.config because it was imported into RBD.#033[00m
Jan 31 02:50:03 np0005603623 NetworkManager[48970]: <info>  [1769845803.9757] manager: (tap31ab3c80-79): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Jan 31 02:50:03 np0005603623 kernel: tap31ab3c80-79: entered promiscuous mode
Jan 31 02:50:03 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:03Z|00091|binding|INFO|Claiming lport 31ab3c80-791f-418d-a70b-fcb0d523a037 for this chassis.
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.978 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:03 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:03Z|00092|binding|INFO|31ab3c80-791f-418d-a70b-fcb0d523a037: Claiming fa:16:3e:8b:11:0b 10.100.0.6
Jan 31 02:50:03 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:03Z|00093|binding|INFO|Setting lport 31ab3c80-791f-418d-a70b-fcb0d523a037 ovn-installed in OVS
Jan 31 02:50:03 np0005603623 nova_compute[226235]: 2026-01-31 07:50:03.989 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:03 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:03Z|00094|binding|INFO|Setting lport 31ab3c80-791f-418d-a70b-fcb0d523a037 up in Southbound
Jan 31 02:50:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:03.994 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:11:0b 10.100.0.6'], port_security=['fa:16:3e:8b:11:0b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4b48cc05-9edd-4e4d-a58e-84564afb0612', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f860fcac-4f6a-4e88-8005-0fd323fc8053, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=31ab3c80-791f-418d-a70b-fcb0d523a037) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:50:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:03.996 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 31ab3c80-791f-418d-a70b-fcb0d523a037 in datapath 850ad6ca-6166-4382-94bb-4b7c10d9a136 bound to our chassis#033[00m
Jan 31 02:50:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:03.997 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 850ad6ca-6166-4382-94bb-4b7c10d9a136#033[00m
Jan 31 02:50:04 np0005603623 systemd-udevd[236601]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:50:04 np0005603623 systemd-machined[194379]: New machine qemu-10-instance-00000013.
Jan 31 02:50:04 np0005603623 NetworkManager[48970]: <info>  [1769845804.0133] device (tap31ab3c80-79): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:50:04 np0005603623 NetworkManager[48970]: <info>  [1769845804.0138] device (tap31ab3c80-79): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.016 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba693f4-d0ad-4ff8-8828-b4096706f076]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 systemd[1]: Started Virtual Machine qemu-10-instance-00000013.
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.037 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ef3764ec-2fc3-414f-a6a1-c7bcbfbabbfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.041 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[afc9dcb7-0f17-499e-93e7-070d1d586ce4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.058 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d242c2f9-a555-4dac-9896-547453ba4a22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.070 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ac64745d-eb94-4698-8aff-9dac821bc900]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap850ad6ca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:99:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 5, 'rx_bytes': 868, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 5, 'rx_bytes': 868, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489842, 'reachable_time': 29262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236615, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.079 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[41bb499b-8e9e-4de1-b5d2-1d4d5818a5b3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap850ad6ca-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489853, 'tstamp': 489853}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236617, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap850ad6ca-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489856, 'tstamp': 489856}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236617, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.081 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap850ad6ca-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.082 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.085 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.086 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap850ad6ca-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.086 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.087 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap850ad6ca-60, col_values=(('external_ids', {'iface-id': '61b6889f-b848-4873-9650-8b2715794d29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.087 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.159 226239 INFO nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Creating config drive at /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3/disk.config#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.163 226239 DEBUG oslo_concurrency.processutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpb2fzh28e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.286 226239 DEBUG oslo_concurrency.processutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpb2fzh28e" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.309 226239 DEBUG nova.storage.rbd_utils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] rbd image 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.313 226239 DEBUG oslo_concurrency.processutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3/disk.config 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.471 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845804.4713945, 4b48cc05-9edd-4e4d-a58e-84564afb0612 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.472 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] VM Started (Lifecycle Event)#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.509 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.512 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845804.4738894, 4b48cc05-9edd-4e4d-a58e-84564afb0612 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.513 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.536 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.538 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.575 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:50:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:50:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:04.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.660 226239 DEBUG nova.compute.manager [req-4827b4aa-5808-4af7-9b21-06183a248bc6 req-73073f51-5011-4039-84b7-5d6e4ae0eb04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.660 226239 DEBUG oslo_concurrency.lockutils [req-4827b4aa-5808-4af7-9b21-06183a248bc6 req-73073f51-5011-4039-84b7-5d6e4ae0eb04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.661 226239 DEBUG oslo_concurrency.lockutils [req-4827b4aa-5808-4af7-9b21-06183a248bc6 req-73073f51-5011-4039-84b7-5d6e4ae0eb04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.661 226239 DEBUG oslo_concurrency.lockutils [req-4827b4aa-5808-4af7-9b21-06183a248bc6 req-73073f51-5011-4039-84b7-5d6e4ae0eb04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.661 226239 DEBUG nova.compute.manager [req-4827b4aa-5808-4af7-9b21-06183a248bc6 req-73073f51-5011-4039-84b7-5d6e4ae0eb04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Processing event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.662 226239 DEBUG nova.compute.manager [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.665 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845804.6651332, 4b48cc05-9edd-4e4d-a58e-84564afb0612 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.665 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.667 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.671 226239 INFO nova.virt.libvirt.driver [-] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Instance spawned successfully.#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.671 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.692 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.697 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.700 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.700 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.700 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.701 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.701 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.702 226239 DEBUG nova.virt.libvirt.driver [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.710 226239 DEBUG oslo_concurrency.processutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3/disk.config 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.710 226239 INFO nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Deleting local config drive /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3/disk.config because it was imported into RBD.#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.741 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:50:04 np0005603623 kernel: tapab24842b-00: entered promiscuous mode
Jan 31 02:50:04 np0005603623 NetworkManager[48970]: <info>  [1769845804.7628] manager: (tapab24842b-00): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Jan 31 02:50:04 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:04Z|00095|binding|INFO|Claiming lport ab24842b-0045-41e6-b6dc-51b110b51829 for this chassis.
Jan 31 02:50:04 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:04Z|00096|binding|INFO|ab24842b-0045-41e6-b6dc-51b110b51829: Claiming fa:16:3e:b4:a1:9e 10.100.0.9
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.772 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:04 np0005603623 NetworkManager[48970]: <info>  [1769845804.7780] device (tapab24842b-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:50:04 np0005603623 NetworkManager[48970]: <info>  [1769845804.7787] device (tapab24842b-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.782 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:a1:9e 10.100.0.9'], port_security=['fa:16:3e:b4:a1:9e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '79350fb7-3eed-4a3b-a7e9-f0ec90460ac3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227ca833-938d-48d2-86c8-5d09dd658c40, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ab24842b-0045-41e6-b6dc-51b110b51829) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.783 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ab24842b-0045-41e6-b6dc-51b110b51829 in datapath 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d bound to our chassis#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.786 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.791 226239 INFO nova.compute.manager [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Took 7.98 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.792 226239 DEBUG nova.compute.manager [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.796 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a4420fe2-355a-43f9-82d9-82d15fbe773f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.797 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap60bb4bea-d1 in ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.798 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap60bb4bea-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.798 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bae8008b-545a-4f17-a8c5-6d4a32d24a33]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.799 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3e7a232b-6632-4ea5-9831-41a0c9cac12d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 systemd-machined[194379]: New machine qemu-11-instance-00000014.
Jan 31 02:50:04 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:04Z|00097|binding|INFO|Setting lport ab24842b-0045-41e6-b6dc-51b110b51829 ovn-installed in OVS
Jan 31 02:50:04 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:04Z|00098|binding|INFO|Setting lport ab24842b-0045-41e6-b6dc-51b110b51829 up in Southbound
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.805 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:04 np0005603623 systemd[1]: Started Virtual Machine qemu-11-instance-00000014.
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.814 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[301cce41-ba5b-4867-bdce-0cce8e2b0c97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.826 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fcdaf9e4-fb3c-4107-aee8-a7755c54302b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.841 226239 DEBUG nova.network.neutron [req-46b11eb1-863d-4c94-b2e6-f6e5521ece05 req-07e558b0-100a-458c-865d-228f38c99a31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updated VIF entry in instance network info cache for port ab24842b-0045-41e6-b6dc-51b110b51829. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.842 226239 DEBUG nova.network.neutron [req-46b11eb1-863d-4c94-b2e6-f6e5521ece05 req-07e558b0-100a-458c-865d-228f38c99a31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating instance_info_cache with network_info: [{"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.848 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d41644d7-a646-4340-960f-6876f2aa1847]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 NetworkManager[48970]: <info>  [1769845804.8551] manager: (tap60bb4bea-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.854 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3820a11d-18a1-4cbf-ab36-0c04b814fba7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.877 226239 DEBUG oslo_concurrency.lockutils [req-46b11eb1-863d-4c94-b2e6-f6e5521ece05 req-07e558b0-100a-458c-865d-228f38c99a31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.890 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b7ba5e16-0b66-4a41-bf40-a39581dabfdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.894 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[0f6fc090-8d9a-46e5-8796-0ed0ad412f69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 NetworkManager[48970]: <info>  [1769845804.9113] device (tap60bb4bea-d0): carrier: link connected
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.914 226239 INFO nova.compute.manager [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Took 15.72 seconds to build instance.#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.915 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[85dd59db-c2d4-40ba-a4cc-24e9b47d19eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.928 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[be16beeb-da16-4bac-99d7-3ac24b6e73e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap60bb4bea-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:b1:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494534, 'reachable_time': 38776, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236748, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 nova_compute[226235]: 2026-01-31 07:50:04.941 226239 DEBUG oslo_concurrency.lockutils [None req-bb01e047-93b6-4e83-8875-b017c8afb1d4 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.838s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.942 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0b00a1a8-6ada-47bf-a619-921d326f008e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4e:b1c0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494534, 'tstamp': 494534}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236749, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.954 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fd122b6b-9549-4ab7-a137-3922a3eaa852]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap60bb4bea-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:b1:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494534, 'reachable_time': 38776, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236750, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:04.972 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4bf1a2bb-e5f3-4e28-8347-c9cf423f9bcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:05.016 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[44aa0ffd-6184-494d-aa52-fcaeed934f5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:05.017 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60bb4bea-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:05.018 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:05.018 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60bb4bea-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.020 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:05 np0005603623 NetworkManager[48970]: <info>  [1769845805.0207] manager: (tap60bb4bea-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 31 02:50:05 np0005603623 kernel: tap60bb4bea-d0: entered promiscuous mode
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.023 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:05.024 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap60bb4bea-d0, col_values=(('external_ids', {'iface-id': 'eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.025 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:05 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:05Z|00099|binding|INFO|Releasing lport eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8 from this chassis (sb_readonly=0)
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.031 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:05.033 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:05.034 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6d9179cc-fb75-47fa-99c8-936c6e042450]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:05.034 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.pid.haproxy
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:50:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:05.035 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'env', 'PROCESS_TAG=haproxy-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.181 226239 DEBUG nova.compute.manager [req-1d48391a-2a62-4dc8-a91f-443c2a66a126 req-a0f51ecc-4186-40df-83d6-77dcc850cae0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.182 226239 DEBUG oslo_concurrency.lockutils [req-1d48391a-2a62-4dc8-a91f-443c2a66a126 req-a0f51ecc-4186-40df-83d6-77dcc850cae0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.182 226239 DEBUG oslo_concurrency.lockutils [req-1d48391a-2a62-4dc8-a91f-443c2a66a126 req-a0f51ecc-4186-40df-83d6-77dcc850cae0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.182 226239 DEBUG oslo_concurrency.lockutils [req-1d48391a-2a62-4dc8-a91f-443c2a66a126 req-a0f51ecc-4186-40df-83d6-77dcc850cae0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.182 226239 DEBUG nova.compute.manager [req-1d48391a-2a62-4dc8-a91f-443c2a66a126 req-a0f51ecc-4186-40df-83d6-77dcc850cae0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Processing event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.426 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845805.4256415, 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.427 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] VM Started (Lifecycle Event)#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.429 226239 DEBUG nova.compute.manager [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.432 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.438 226239 INFO nova.virt.libvirt.driver [-] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Instance spawned successfully.#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.438 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.448 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.450 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.464 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.464 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.464 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.465 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.465 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.465 226239 DEBUG nova.virt.libvirt.driver [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:05 np0005603623 podman[236818]: 2026-01-31 07:50:05.377838403 +0000 UTC m=+0.037404836 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:50:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:05.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.518 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.519 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845805.4269404, 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.519 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.551 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.554 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845805.4325018, 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.554 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.579 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.581 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.592 226239 INFO nova.compute.manager [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Took 8.83 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.592 226239 DEBUG nova.compute.manager [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.611 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:50:05 np0005603623 podman[236818]: 2026-01-31 07:50:05.617266098 +0000 UTC m=+0.276832501 container create 921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.659 226239 INFO nova.compute.manager [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Took 16.01 seconds to build instance.#033[00m
Jan 31 02:50:05 np0005603623 nova_compute[226235]: 2026-01-31 07:50:05.695 226239 DEBUG oslo_concurrency.lockutils [None req-4917cab5-a6c3-4afd-a4bf-776811deb4c7 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:05 np0005603623 systemd[1]: Started libpod-conmon-921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e.scope.
Jan 31 02:50:05 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:50:05 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9413fcc7e89318f964d7d9a3f94dd232d40b634dbdaf623d612786dc673b78fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:50:05 np0005603623 podman[236818]: 2026-01-31 07:50:05.891064762 +0000 UTC m=+0.550631165 container init 921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 02:50:05 np0005603623 podman[236818]: 2026-01-31 07:50:05.895901784 +0000 UTC m=+0.555468187 container start 921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:50:05 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[236839]: [NOTICE]   (236843) : New worker (236845) forked
Jan 31 02:50:05 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[236839]: [NOTICE]   (236843) : Loading success.
Jan 31 02:50:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:06.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:06 np0005603623 nova_compute[226235]: 2026-01-31 07:50:06.788 226239 DEBUG nova.compute.manager [req-488eeb23-0940-4530-bf52-98ba980168c3 req-ba32e41c-d779-4088-bdd4-89cfed4f60a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:06 np0005603623 nova_compute[226235]: 2026-01-31 07:50:06.788 226239 DEBUG oslo_concurrency.lockutils [req-488eeb23-0940-4530-bf52-98ba980168c3 req-ba32e41c-d779-4088-bdd4-89cfed4f60a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:06 np0005603623 nova_compute[226235]: 2026-01-31 07:50:06.789 226239 DEBUG oslo_concurrency.lockutils [req-488eeb23-0940-4530-bf52-98ba980168c3 req-ba32e41c-d779-4088-bdd4-89cfed4f60a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:06 np0005603623 nova_compute[226235]: 2026-01-31 07:50:06.789 226239 DEBUG oslo_concurrency.lockutils [req-488eeb23-0940-4530-bf52-98ba980168c3 req-ba32e41c-d779-4088-bdd4-89cfed4f60a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:06 np0005603623 nova_compute[226235]: 2026-01-31 07:50:06.790 226239 DEBUG nova.compute.manager [req-488eeb23-0940-4530-bf52-98ba980168c3 req-ba32e41c-d779-4088-bdd4-89cfed4f60a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:06 np0005603623 nova_compute[226235]: 2026-01-31 07:50:06.790 226239 WARNING nova.compute.manager [req-488eeb23-0940-4530-bf52-98ba980168c3 req-ba32e41c-d779-4088-bdd4-89cfed4f60a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received unexpected event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with vm_state active and task_state None.#033[00m
Jan 31 02:50:06 np0005603623 nova_compute[226235]: 2026-01-31 07:50:06.801 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:07.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:08 np0005603623 nova_compute[226235]: 2026-01-31 07:50:08.182 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:08 np0005603623 nova_compute[226235]: 2026-01-31 07:50:08.182 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 02:50:08 np0005603623 nova_compute[226235]: 2026-01-31 07:50:08.262 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:50:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:08.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:50:08 np0005603623 nova_compute[226235]: 2026-01-31 07:50:08.644 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 02:50:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:09.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:09 np0005603623 nova_compute[226235]: 2026-01-31 07:50:09.617 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:09 np0005603623 nova_compute[226235]: 2026-01-31 07:50:09.617 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:09 np0005603623 nova_compute[226235]: 2026-01-31 07:50:09.617 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:10.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:11.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:11 np0005603623 nova_compute[226235]: 2026-01-31 07:50:11.804 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:11 np0005603623 nova_compute[226235]: 2026-01-31 07:50:11.904 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:11 np0005603623 nova_compute[226235]: 2026-01-31 07:50:11.904 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:11 np0005603623 nova_compute[226235]: 2026-01-31 07:50:11.905 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:11 np0005603623 nova_compute[226235]: 2026-01-31 07:50:11.905 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:50:11 np0005603623 nova_compute[226235]: 2026-01-31 07:50:11.905 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:50:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4216839064' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:50:12 np0005603623 nova_compute[226235]: 2026-01-31 07:50:12.347 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:12.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.296 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.307 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.308 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.313 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000013 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.313 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000013 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.317 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000014 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.318 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000014 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.447 226239 DEBUG oslo_concurrency.lockutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "refresh_cache-620b3405-251d-4545-b523-faa35768224b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.448 226239 DEBUG oslo_concurrency.lockutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquired lock "refresh_cache-620b3405-251d-4545-b523-faa35768224b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.448 226239 DEBUG nova.network.neutron [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.482 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.483 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4372MB free_disk=20.76026153564453GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.483 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:13 np0005603623 nova_compute[226235]: 2026-01-31 07:50:13.483 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:13.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:50:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2276360538' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:50:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:50:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2276360538' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:50:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:14.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:15 np0005603623 nova_compute[226235]: 2026-01-31 07:50:15.201 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Applying migration context for instance 620b3405-251d-4545-b523-faa35768224b as it has an incoming, in-progress migration e0f17e23-e3f6-4dbf-9bcf-e53ad1ea7163. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 31 02:50:15 np0005603623 nova_compute[226235]: 2026-01-31 07:50:15.203 226239 INFO nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] Updating resource usage from migration e0f17e23-e3f6-4dbf-9bcf-e53ad1ea7163#033[00m
Jan 31 02:50:15 np0005603623 nova_compute[226235]: 2026-01-31 07:50:15.245 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:50:15 np0005603623 nova_compute[226235]: 2026-01-31 07:50:15.245 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 620b3405-251d-4545-b523-faa35768224b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:50:15 np0005603623 nova_compute[226235]: 2026-01-31 07:50:15.245 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 4b48cc05-9edd-4e4d-a58e-84564afb0612 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:50:15 np0005603623 nova_compute[226235]: 2026-01-31 07:50:15.245 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:50:15 np0005603623 nova_compute[226235]: 2026-01-31 07:50:15.246 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:50:15 np0005603623 nova_compute[226235]: 2026-01-31 07:50:15.246 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:50:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:50:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:15.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:50:15 np0005603623 nova_compute[226235]: 2026-01-31 07:50:15.533 226239 DEBUG nova.network.neutron [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:50:15 np0005603623 nova_compute[226235]: 2026-01-31 07:50:15.560 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:15 np0005603623 nova_compute[226235]: 2026-01-31 07:50:15.860 226239 DEBUG nova.network.neutron [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:50:16 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3575360831' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:50:16 np0005603623 nova_compute[226235]: 2026-01-31 07:50:16.276 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.716s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:16 np0005603623 nova_compute[226235]: 2026-01-31 07:50:16.284 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:50:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:16.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:16 np0005603623 nova_compute[226235]: 2026-01-31 07:50:16.754 226239 DEBUG oslo_concurrency.lockutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Releasing lock "refresh_cache-620b3405-251d-4545-b523-faa35768224b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:16 np0005603623 nova_compute[226235]: 2026-01-31 07:50:16.758 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:50:16 np0005603623 nova_compute[226235]: 2026-01-31 07:50:16.773 226239 DEBUG nova.compute.manager [req-8ab29d7f-4241-4583-9e6b-35cf2df7b052 req-098b8eec-ec70-4ce6-9144-653fc9b0ac62 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:16 np0005603623 nova_compute[226235]: 2026-01-31 07:50:16.773 226239 DEBUG oslo_concurrency.lockutils [req-8ab29d7f-4241-4583-9e6b-35cf2df7b052 req-098b8eec-ec70-4ce6-9144-653fc9b0ac62 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:16 np0005603623 nova_compute[226235]: 2026-01-31 07:50:16.773 226239 DEBUG oslo_concurrency.lockutils [req-8ab29d7f-4241-4583-9e6b-35cf2df7b052 req-098b8eec-ec70-4ce6-9144-653fc9b0ac62 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:16 np0005603623 nova_compute[226235]: 2026-01-31 07:50:16.774 226239 DEBUG oslo_concurrency.lockutils [req-8ab29d7f-4241-4583-9e6b-35cf2df7b052 req-098b8eec-ec70-4ce6-9144-653fc9b0ac62 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:16 np0005603623 nova_compute[226235]: 2026-01-31 07:50:16.774 226239 DEBUG nova.compute.manager [req-8ab29d7f-4241-4583-9e6b-35cf2df7b052 req-098b8eec-ec70-4ce6-9144-653fc9b0ac62 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:16 np0005603623 nova_compute[226235]: 2026-01-31 07:50:16.774 226239 WARNING nova.compute.manager [req-8ab29d7f-4241-4583-9e6b-35cf2df7b052 req-098b8eec-ec70-4ce6-9144-653fc9b0ac62 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state None.#033[00m
Jan 31 02:50:16 np0005603623 nova_compute[226235]: 2026-01-31 07:50:16.806 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:17 np0005603623 nova_compute[226235]: 2026-01-31 07:50:17.027 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:50:17 np0005603623 nova_compute[226235]: 2026-01-31 07:50:17.028 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.545s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:17 np0005603623 nova_compute[226235]: 2026-01-31 07:50:17.029 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:17 np0005603623 nova_compute[226235]: 2026-01-31 07:50:17.321 226239 DEBUG nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 31 02:50:17 np0005603623 nova_compute[226235]: 2026-01-31 07:50:17.323 226239 DEBUG nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 02:50:17 np0005603623 nova_compute[226235]: 2026-01-31 07:50:17.324 226239 INFO nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Creating image(s)#033[00m
Jan 31 02:50:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:17.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:17 np0005603623 nova_compute[226235]: 2026-01-31 07:50:17.583 226239 DEBUG nova.storage.rbd_utils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] creating snapshot(nova-resize) on rbd image(620b3405-251d-4545-b523-faa35768224b_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 02:50:17 np0005603623 nova_compute[226235]: 2026-01-31 07:50:17.853 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:17 np0005603623 nova_compute[226235]: 2026-01-31 07:50:17.853 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:18 np0005603623 nova_compute[226235]: 2026-01-31 07:50:18.208 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:18 np0005603623 nova_compute[226235]: 2026-01-31 07:50:18.209 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:50:18 np0005603623 nova_compute[226235]: 2026-01-31 07:50:18.209 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:50:18 np0005603623 nova_compute[226235]: 2026-01-31 07:50:18.299 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:18.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:18 np0005603623 nova_compute[226235]: 2026-01-31 07:50:18.692 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-620b3405-251d-4545-b523-faa35768224b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:18 np0005603623 nova_compute[226235]: 2026-01-31 07:50:18.692 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-620b3405-251d-4545-b523-faa35768224b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:18 np0005603623 nova_compute[226235]: 2026-01-31 07:50:18.692 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:50:18 np0005603623 nova_compute[226235]: 2026-01-31 07:50:18.692 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 620b3405-251d-4545-b523-faa35768224b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:19 np0005603623 nova_compute[226235]: 2026-01-31 07:50:19.330 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:50:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:50:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:19.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:50:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e150 e150: 3 total, 3 up, 3 in
Jan 31 02:50:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:20.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:20.999 226239 DEBUG nova.objects.instance [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 620b3405-251d-4545-b523-faa35768224b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:21 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:21Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:11:0b 10.100.0.6
Jan 31 02:50:21 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:21Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:11:0b 10.100.0.6
Jan 31 02:50:21 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:21Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b4:a1:9e 10.100.0.9
Jan 31 02:50:21 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:21Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b4:a1:9e 10.100.0.9
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.353 226239 DEBUG nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.354 226239 DEBUG nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Ensure instance console log exists: /var/lib/nova/instances/620b3405-251d-4545-b523-faa35768224b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.354 226239 DEBUG oslo_concurrency.lockutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.355 226239 DEBUG oslo_concurrency.lockutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.355 226239 DEBUG oslo_concurrency.lockutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.357 226239 DEBUG nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.361 226239 WARNING nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.365 226239 DEBUG nova.virt.libvirt.host [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.366 226239 DEBUG nova.virt.libvirt.host [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.369 226239 DEBUG nova.virt.libvirt.host [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.370 226239 DEBUG nova.virt.libvirt.host [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.371 226239 DEBUG nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.372 226239 DEBUG nova.virt.hardware [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f75c4aee-d826-4343-a7e3-f06a4b21de52',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.372 226239 DEBUG nova.virt.hardware [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.372 226239 DEBUG nova.virt.hardware [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.373 226239 DEBUG nova.virt.hardware [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.373 226239 DEBUG nova.virt.hardware [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.373 226239 DEBUG nova.virt.hardware [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.373 226239 DEBUG nova.virt.hardware [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.373 226239 DEBUG nova.virt.hardware [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.374 226239 DEBUG nova.virt.hardware [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.374 226239 DEBUG nova.virt.hardware [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.374 226239 DEBUG nova.virt.hardware [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.374 226239 DEBUG nova.objects.instance [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 620b3405-251d-4545-b523-faa35768224b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.408 226239 DEBUG oslo_concurrency.processutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:21.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.538 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.556 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-620b3405-251d-4545-b523-faa35768224b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.556 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.556 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.556 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.557 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.557 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.557 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:50:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.808 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:50:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/688721480' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.849 226239 DEBUG oslo_concurrency.processutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:21 np0005603623 nova_compute[226235]: 2026-01-31 07:50:21.878 226239 DEBUG oslo_concurrency.processutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:21 np0005603623 podman[237072]: 2026-01-31 07:50:21.985646977 +0000 UTC m=+0.072898872 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller)
Jan 31 02:50:21 np0005603623 podman[237071]: 2026-01-31 07:50:21.995358383 +0000 UTC m=+0.082610888 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 02:50:22 np0005603623 nova_compute[226235]: 2026-01-31 07:50:22.260 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Check if temp file /var/lib/nova/instances/tmpkeqo2xgm exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 31 02:50:22 np0005603623 nova_compute[226235]: 2026-01-31 07:50:22.260 226239 DEBUG nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkeqo2xgm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4b48cc05-9edd-4e4d-a58e-84564afb0612',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 31 02:50:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:50:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3422845580' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:50:22 np0005603623 nova_compute[226235]: 2026-01-31 07:50:22.357 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Check if temp file /var/lib/nova/instances/tmpmr0jxxoo exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 31 02:50:22 np0005603623 nova_compute[226235]: 2026-01-31 07:50:22.358 226239 DEBUG nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmr0jxxoo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79350fb7-3eed-4a3b-a7e9-f0ec90460ac3',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 31 02:50:22 np0005603623 nova_compute[226235]: 2026-01-31 07:50:22.422 226239 DEBUG oslo_concurrency.processutils [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.543s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:22 np0005603623 nova_compute[226235]: 2026-01-31 07:50:22.425 226239 DEBUG nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  <uuid>620b3405-251d-4545-b523-faa35768224b</uuid>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  <name>instance-00000012</name>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  <memory>196608</memory>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <nova:name>tempest-MigrationsAdminTest-server-376445714</nova:name>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:50:21</nova:creationTime>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.micro">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <nova:memory>192</nova:memory>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <nova:user uuid="8a59efd78e244f44a1c70650f82a2c50">tempest-MigrationsAdminTest-1820348317-project-member</nova:user>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <nova:project uuid="1627a71b855b4032b51e234e44a9d570">tempest-MigrationsAdminTest-1820348317</nova:project>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <entry name="serial">620b3405-251d-4545-b523-faa35768224b</entry>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <entry name="uuid">620b3405-251d-4545-b523-faa35768224b</entry>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/620b3405-251d-4545-b523-faa35768224b_disk">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/620b3405-251d-4545-b523-faa35768224b_disk.config">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/620b3405-251d-4545-b523-faa35768224b/console.log" append="off"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:50:22 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:50:22 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:50:22 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:50:22 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:50:22 np0005603623 nova_compute[226235]: 2026-01-31 07:50:22.512 226239 DEBUG nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:50:22 np0005603623 nova_compute[226235]: 2026-01-31 07:50:22.512 226239 DEBUG nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:50:22 np0005603623 nova_compute[226235]: 2026-01-31 07:50:22.513 226239 INFO nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Using config drive#033[00m
Jan 31 02:50:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:50:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:22.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:50:22 np0005603623 systemd-machined[194379]: New machine qemu-12-instance-00000012.
Jan 31 02:50:22 np0005603623 systemd[1]: Started Virtual Machine qemu-12-instance-00000012.
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.300 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:23.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.720 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845823.7202184, 620b3405-251d-4545-b523-faa35768224b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.721 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.727 226239 DEBUG nova.compute.manager [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.731 226239 INFO nova.virt.libvirt.driver [-] [instance: 620b3405-251d-4545-b523-faa35768224b] Instance running successfully.#033[00m
Jan 31 02:50:23 np0005603623 virtqemud[225858]: argument unsupported: QEMU guest agent is not configured
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.734 226239 DEBUG nova.virt.libvirt.guest [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.734 226239 DEBUG nova.virt.libvirt.driver [None req-344d9d3c-aed6-43d8-84d1-0b3900219d62 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.770 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.785 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.848 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.849 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845823.7228022, 620b3405-251d-4545-b523-faa35768224b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.849 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] VM Started (Lifecycle Event)#033[00m
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.923 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:23 np0005603623 nova_compute[226235]: 2026-01-31 07:50:23.930 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:24.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:25 np0005603623 nova_compute[226235]: 2026-01-31 07:50:25.260 226239 DEBUG oslo_concurrency.lockutils [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "refresh_cache-620b3405-251d-4545-b523-faa35768224b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:25 np0005603623 nova_compute[226235]: 2026-01-31 07:50:25.260 226239 DEBUG oslo_concurrency.lockutils [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquired lock "refresh_cache-620b3405-251d-4545-b523-faa35768224b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:25 np0005603623 nova_compute[226235]: 2026-01-31 07:50:25.261 226239 DEBUG nova.network.neutron [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:50:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:25.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:25 np0005603623 nova_compute[226235]: 2026-01-31 07:50:25.939 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:26.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:26 np0005603623 nova_compute[226235]: 2026-01-31 07:50:26.810 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:27.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.870 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Triggering sync for uuid 620b3405-251d-4545-b523-faa35768224b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.871 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Triggering sync for uuid 4b48cc05-9edd-4e4d-a58e-84564afb0612 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.871 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Triggering sync for uuid 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.871 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Triggering sync for uuid 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.871 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "620b3405-251d-4545-b523-faa35768224b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.872 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "620b3405-251d-4545-b523-faa35768224b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.872 226239 INFO nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.872 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "620b3405-251d-4545-b523-faa35768224b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.872 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.873 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.873 226239 INFO nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.873 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.873 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.874 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.874 226239 INFO nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.874 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.874 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.875 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:27 np0005603623 nova_compute[226235]: 2026-01-31 07:50:27.914 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:28 np0005603623 nova_compute[226235]: 2026-01-31 07:50:28.302 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:28 np0005603623 nova_compute[226235]: 2026-01-31 07:50:28.537 226239 DEBUG nova.network.neutron [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:50:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:28.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:29 np0005603623 nova_compute[226235]: 2026-01-31 07:50:29.341 226239 DEBUG nova.network.neutron [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 620b3405-251d-4545-b523-faa35768224b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:29 np0005603623 nova_compute[226235]: 2026-01-31 07:50:29.384 226239 DEBUG oslo_concurrency.lockutils [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Releasing lock "refresh_cache-620b3405-251d-4545-b523-faa35768224b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:50:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:29.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:50:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:30.085 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:30.086 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:30.087 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:30 np0005603623 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 31 02:50:30 np0005603623 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Consumed 6.236s CPU time.
Jan 31 02:50:30 np0005603623 systemd-machined[194379]: Machine qemu-12-instance-00000012 terminated.
Jan 31 02:50:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:30.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:30 np0005603623 nova_compute[226235]: 2026-01-31 07:50:30.639 226239 INFO nova.virt.libvirt.driver [-] [instance: 620b3405-251d-4545-b523-faa35768224b] Instance destroyed successfully.#033[00m
Jan 31 02:50:30 np0005603623 nova_compute[226235]: 2026-01-31 07:50:30.640 226239 DEBUG nova.objects.instance [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'resources' on Instance uuid 620b3405-251d-4545-b523-faa35768224b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:30 np0005603623 nova_compute[226235]: 2026-01-31 07:50:30.659 226239 DEBUG oslo_concurrency.lockutils [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:30 np0005603623 nova_compute[226235]: 2026-01-31 07:50:30.660 226239 DEBUG oslo_concurrency.lockutils [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:30 np0005603623 nova_compute[226235]: 2026-01-31 07:50:30.693 226239 DEBUG nova.objects.instance [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'migration_context' on Instance uuid 620b3405-251d-4545-b523-faa35768224b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:30 np0005603623 nova_compute[226235]: 2026-01-31 07:50:30.873 226239 DEBUG oslo_concurrency.processutils [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:50:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1781718677' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:50:31 np0005603623 nova_compute[226235]: 2026-01-31 07:50:31.283 226239 DEBUG oslo_concurrency.processutils [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:31 np0005603623 nova_compute[226235]: 2026-01-31 07:50:31.289 226239 DEBUG nova.compute.provider_tree [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:50:31 np0005603623 nova_compute[226235]: 2026-01-31 07:50:31.309 226239 DEBUG nova.scheduler.client.report [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:50:31 np0005603623 nova_compute[226235]: 2026-01-31 07:50:31.390 226239 DEBUG oslo_concurrency.lockutils [None req-347f6903-1a62-4c06-805a-62d11afd512d 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:31.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:31 np0005603623 nova_compute[226235]: 2026-01-31 07:50:31.812 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:32 np0005603623 nova_compute[226235]: 2026-01-31 07:50:32.420 226239 DEBUG nova.compute.manager [req-de076765-37f3-4cd0-89c8-c76ba777de22 req-220602f0-a88b-4d64-a94e-a4653cad1e3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:32 np0005603623 nova_compute[226235]: 2026-01-31 07:50:32.421 226239 DEBUG oslo_concurrency.lockutils [req-de076765-37f3-4cd0-89c8-c76ba777de22 req-220602f0-a88b-4d64-a94e-a4653cad1e3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:32 np0005603623 nova_compute[226235]: 2026-01-31 07:50:32.421 226239 DEBUG oslo_concurrency.lockutils [req-de076765-37f3-4cd0-89c8-c76ba777de22 req-220602f0-a88b-4d64-a94e-a4653cad1e3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:32 np0005603623 nova_compute[226235]: 2026-01-31 07:50:32.421 226239 DEBUG oslo_concurrency.lockutils [req-de076765-37f3-4cd0-89c8-c76ba777de22 req-220602f0-a88b-4d64-a94e-a4653cad1e3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:32 np0005603623 nova_compute[226235]: 2026-01-31 07:50:32.422 226239 DEBUG nova.compute.manager [req-de076765-37f3-4cd0-89c8-c76ba777de22 req-220602f0-a88b-4d64-a94e-a4653cad1e3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:32 np0005603623 nova_compute[226235]: 2026-01-31 07:50:32.422 226239 DEBUG nova.compute.manager [req-de076765-37f3-4cd0-89c8-c76ba777de22 req-220602f0-a88b-4d64-a94e-a4653cad1e3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:50:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:32.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:33 np0005603623 nova_compute[226235]: 2026-01-31 07:50:33.305 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:33.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e151 e151: 3 total, 3 up, 3 in
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.398 226239 INFO nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Took 10.43 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.398 226239 DEBUG nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.453 226239 DEBUG nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmr0jxxoo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79350fb7-3eed-4a3b-a7e9-f0ec90460ac3',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=Migration(0a31c4ff-6473-43f3-9758-d88dd85b8589),old_vol_attachment_ids={317b1c6b-4f89-402c-94d1-f4852844f1e2='4b9eb10a-fc66-42c4-9e17-3b8136950b54'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.457 226239 DEBUG nova.objects.instance [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lazy-loading 'migration_context' on Instance uuid 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.458 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.460 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.460 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.493 226239 DEBUG nova.virt.libvirt.migration [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Find same serial number: pos=1, serial=317b1c6b-4f89-402c-94d1-f4852844f1e2 _update_volume_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:242#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.494 226239 DEBUG nova.virt.libvirt.vif [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:49:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1973231276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1973231276',id=20,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:05Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-kz6k0bwy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:50:05Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=79350fb7-3eed-4a3b-a7e9-f0ec90460ac3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.494 226239 DEBUG nova.network.os_vif_util [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converting VIF {"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.495 226239 DEBUG nova.network.os_vif_util [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.495 226239 DEBUG nova.virt.libvirt.migration [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating guest XML with vif config: <interface type="ethernet">
Jan 31 02:50:34 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:b4:a1:9e"/>
Jan 31 02:50:34 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 02:50:34 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:50:34 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 02:50:34 np0005603623 nova_compute[226235]:  <target dev="tapab24842b-00"/>
Jan 31 02:50:34 np0005603623 nova_compute[226235]: </interface>
Jan 31 02:50:34 np0005603623 nova_compute[226235]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.495 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.584 226239 DEBUG nova.compute.manager [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.585 226239 DEBUG oslo_concurrency.lockutils [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.585 226239 DEBUG oslo_concurrency.lockutils [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.585 226239 DEBUG oslo_concurrency.lockutils [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.585 226239 DEBUG nova.compute.manager [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.585 226239 WARNING nova.compute.manager [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.586 226239 DEBUG nova.compute.manager [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-changed-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.586 226239 DEBUG nova.compute.manager [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Refreshing instance network info cache due to event network-changed-ab24842b-0045-41e6-b6dc-51b110b51829. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.586 226239 DEBUG oslo_concurrency.lockutils [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.586 226239 DEBUG oslo_concurrency.lockutils [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.586 226239 DEBUG nova.network.neutron [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Refreshing network info cache for port ab24842b-0045-41e6-b6dc-51b110b51829 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:50:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:34.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:34 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:34Z|00100|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.962 226239 DEBUG nova.virt.libvirt.migration [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:50:34 np0005603623 nova_compute[226235]: 2026-01-31 07:50:34.963 226239 INFO nova.virt.libvirt.migration [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 31 02:50:35 np0005603623 nova_compute[226235]: 2026-01-31 07:50:35.091 226239 INFO nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 31 02:50:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:50:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:35.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:50:35 np0005603623 nova_compute[226235]: 2026-01-31 07:50:35.599 226239 DEBUG nova.virt.libvirt.migration [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:50:35 np0005603623 nova_compute[226235]: 2026-01-31 07:50:35.599 226239 DEBUG nova.virt.libvirt.migration [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:50:35 np0005603623 nova_compute[226235]: 2026-01-31 07:50:35.964 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845835.9644723, 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:35 np0005603623 nova_compute[226235]: 2026-01-31 07:50:35.965 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.003 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.008 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.058 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 31 02:50:36 np0005603623 kernel: tapab24842b-00 (unregistering): left promiscuous mode
Jan 31 02:50:36 np0005603623 NetworkManager[48970]: <info>  [1769845836.1858] device (tapab24842b-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:50:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:36Z|00101|binding|INFO|Releasing lport ab24842b-0045-41e6-b6dc-51b110b51829 from this chassis (sb_readonly=0)
Jan 31 02:50:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:36Z|00102|binding|INFO|Setting lport ab24842b-0045-41e6-b6dc-51b110b51829 down in Southbound
Jan 31 02:50:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:36Z|00103|binding|INFO|Removing iface tapab24842b-00 ovn-installed in OVS
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.192 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.194 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.207 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:a1:9e 10.100.0.9'], port_security=['fa:16:3e:b4:a1:9e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'bd097fed-e54b-4ed7-90f0-078b39b8b13a'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '79350fb7-3eed-4a3b-a7e9-f0ec90460ac3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227ca833-938d-48d2-86c8-5d09dd658c40, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ab24842b-0045-41e6-b6dc-51b110b51829) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.208 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.209 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ab24842b-0045-41e6-b6dc-51b110b51829 in datapath 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d unbound from our chassis#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.211 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.212 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[24845277-cd14-45ce-a6b5-7aba8f14af96]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.213 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d namespace which is not needed anymore#033[00m
Jan 31 02:50:36 np0005603623 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000014.scope: Deactivated successfully.
Jan 31 02:50:36 np0005603623 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000014.scope: Consumed 13.647s CPU time.
Jan 31 02:50:36 np0005603623 systemd-machined[194379]: Machine qemu-11-instance-00000014 terminated.
Jan 31 02:50:36 np0005603623 virtqemud[225858]: Unable to get XATTR trusted.libvirt.security.ref_selinux on volumes/volume-317b1c6b-4f89-402c-94d1-f4852844f1e2: No such file or directory
Jan 31 02:50:36 np0005603623 virtqemud[225858]: Unable to get XATTR trusted.libvirt.security.ref_dac on volumes/volume-317b1c6b-4f89-402c-94d1-f4852844f1e2: No such file or directory
Jan 31 02:50:36 np0005603623 kernel: tapab24842b-00: entered promiscuous mode
Jan 31 02:50:36 np0005603623 NetworkManager[48970]: <info>  [1769845836.3165] manager: (tapab24842b-00): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.316 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:36Z|00104|binding|INFO|Claiming lport ab24842b-0045-41e6-b6dc-51b110b51829 for this chassis.
Jan 31 02:50:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:36Z|00105|binding|INFO|ab24842b-0045-41e6-b6dc-51b110b51829: Claiming fa:16:3e:b4:a1:9e 10.100.0.9
Jan 31 02:50:36 np0005603623 kernel: tapab24842b-00 (unregistering): left promiscuous mode
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.325 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:a1:9e 10.100.0.9'], port_security=['fa:16:3e:b4:a1:9e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'bd097fed-e54b-4ed7-90f0-078b39b8b13a'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '79350fb7-3eed-4a3b-a7e9-f0ec90460ac3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227ca833-938d-48d2-86c8-5d09dd658c40, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ab24842b-0045-41e6-b6dc-51b110b51829) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:50:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:36Z|00106|binding|INFO|Setting lport ab24842b-0045-41e6-b6dc-51b110b51829 ovn-installed in OVS
Jan 31 02:50:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:36Z|00107|binding|INFO|Setting lport ab24842b-0045-41e6-b6dc-51b110b51829 up in Southbound
Jan 31 02:50:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:36Z|00108|binding|INFO|Releasing lport ab24842b-0045-41e6-b6dc-51b110b51829 from this chassis (sb_readonly=1)
Jan 31 02:50:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:36Z|00109|if_status|INFO|Not setting lport ab24842b-0045-41e6-b6dc-51b110b51829 down as sb is readonly
Jan 31 02:50:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:36Z|00110|binding|INFO|Removing iface tapab24842b-00 ovn-installed in OVS
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.331 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:36Z|00111|binding|INFO|Releasing lport ab24842b-0045-41e6-b6dc-51b110b51829 from this chassis (sb_readonly=0)
Jan 31 02:50:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:36Z|00112|binding|INFO|Setting lport ab24842b-0045-41e6-b6dc-51b110b51829 down in Southbound
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.342 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.346 226239 DEBUG nova.virt.libvirt.guest [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.346 226239 INFO nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migration operation has completed#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.347 226239 INFO nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] _post_live_migration() is started..#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.348 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.348 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.348 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.350 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:a1:9e 10.100.0.9'], port_security=['fa:16:3e:b4:a1:9e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'bd097fed-e54b-4ed7-90f0-078b39b8b13a'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '79350fb7-3eed-4a3b-a7e9-f0ec90460ac3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227ca833-938d-48d2-86c8-5d09dd658c40, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ab24842b-0045-41e6-b6dc-51b110b51829) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:50:36 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[236839]: [NOTICE]   (236843) : haproxy version is 2.8.14-c23fe91
Jan 31 02:50:36 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[236839]: [NOTICE]   (236843) : path to executable is /usr/sbin/haproxy
Jan 31 02:50:36 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[236839]: [WARNING]  (236843) : Exiting Master process...
Jan 31 02:50:36 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[236839]: [WARNING]  (236843) : Exiting Master process...
Jan 31 02:50:36 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[236839]: [ALERT]    (236843) : Current worker (236845) exited with code 143 (Terminated)
Jan 31 02:50:36 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[236839]: [WARNING]  (236843) : All workers exited. Exiting... (0)
Jan 31 02:50:36 np0005603623 systemd[1]: libpod-921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e.scope: Deactivated successfully.
Jan 31 02:50:36 np0005603623 podman[237323]: 2026-01-31 07:50:36.376993654 +0000 UTC m=+0.057787027 container died 921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Jan 31 02:50:36 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e-userdata-shm.mount: Deactivated successfully.
Jan 31 02:50:36 np0005603623 systemd[1]: var-lib-containers-storage-overlay-9413fcc7e89318f964d7d9a3f94dd232d40b634dbdaf623d612786dc673b78fa-merged.mount: Deactivated successfully.
Jan 31 02:50:36 np0005603623 podman[237323]: 2026-01-31 07:50:36.433484699 +0000 UTC m=+0.114278072 container cleanup 921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:50:36 np0005603623 systemd[1]: libpod-conmon-921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e.scope: Deactivated successfully.
Jan 31 02:50:36 np0005603623 podman[237357]: 2026-01-31 07:50:36.511652096 +0000 UTC m=+0.059384347 container remove 921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.516 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e4165705-3d55-410a-b09b-8b1ab6446fc6]: (4, ('Sat Jan 31 07:50:36 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d (921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e)\n921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e\nSat Jan 31 07:50:36 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d (921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e)\n921afe33de4e7c6dfbb75e1fb3cba5bcf7bba71e0f9d2905265849ab668b8c1e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.517 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4be634f1-262b-42ba-a8d0-046ad6363491]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.519 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60bb4bea-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:36 np0005603623 kernel: tap60bb4bea-d0: left promiscuous mode
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.523 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.531 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.536 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2b6ebfb6-56e1-4fdb-a33b-81ec91766b60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.558 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[31e5b2bd-9127-4354-a607-b2381c53ffd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.559 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[709f524a-6449-43ae-9188-efa1d6187930]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.572 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bafe0e4a-ec85-4c4e-be9c-039260701629]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494527, 'reachable_time': 31483, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237374, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:36 np0005603623 systemd[1]: run-netns-ovnmeta\x2d60bb4bea\x2dd9f0\x2d41fc\x2d9c0f\x2d6fcd644c255d.mount: Deactivated successfully.
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.578 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.578 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[f901a9ce-9c98-4b99-80ea-0a97ebd7d582]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.580 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ab24842b-0045-41e6-b6dc-51b110b51829 in datapath 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d unbound from our chassis#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.581 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.582 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[42971630-8671-444d-84fd-6435d405b1a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.583 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ab24842b-0045-41e6-b6dc-51b110b51829 in datapath 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d unbound from our chassis#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.585 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.585 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a88a4e10-2112-4f6a-9158-1b9928eacdf2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:50:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:36.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:50:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.813 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.814 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:50:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:36.816 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.873 226239 DEBUG nova.compute.manager [req-cd59516d-057a-4b07-bcd0-0a4183c77b31 req-a4ec2a03-8f7c-406f-b7f9-2c4a0a2c2446 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.873 226239 DEBUG oslo_concurrency.lockutils [req-cd59516d-057a-4b07-bcd0-0a4183c77b31 req-a4ec2a03-8f7c-406f-b7f9-2c4a0a2c2446 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.873 226239 DEBUG oslo_concurrency.lockutils [req-cd59516d-057a-4b07-bcd0-0a4183c77b31 req-a4ec2a03-8f7c-406f-b7f9-2c4a0a2c2446 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.874 226239 DEBUG oslo_concurrency.lockutils [req-cd59516d-057a-4b07-bcd0-0a4183c77b31 req-a4ec2a03-8f7c-406f-b7f9-2c4a0a2c2446 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.874 226239 DEBUG nova.compute.manager [req-cd59516d-057a-4b07-bcd0-0a4183c77b31 req-a4ec2a03-8f7c-406f-b7f9-2c4a0a2c2446 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:36 np0005603623 nova_compute[226235]: 2026-01-31 07:50:36.874 226239 DEBUG nova.compute.manager [req-cd59516d-057a-4b07-bcd0-0a4183c77b31 req-a4ec2a03-8f7c-406f-b7f9-2c4a0a2c2446 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:50:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:37.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:38 np0005603623 nova_compute[226235]: 2026-01-31 07:50:38.349 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:38.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:38 np0005603623 nova_compute[226235]: 2026-01-31 07:50:38.684 226239 DEBUG nova.network.neutron [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updated VIF entry in instance network info cache for port ab24842b-0045-41e6-b6dc-51b110b51829. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:50:38 np0005603623 nova_compute[226235]: 2026-01-31 07:50:38.685 226239 DEBUG nova.network.neutron [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating instance_info_cache with network_info: [{"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:38 np0005603623 nova_compute[226235]: 2026-01-31 07:50:38.741 226239 DEBUG oslo_concurrency.lockutils [req-8bc6cfc5-abe3-40fc-948e-c6a878ec46c5 req-41d16a4c-bfd6-464c-bb0c-66ea6257822a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.205 226239 DEBUG nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.205 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.206 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.206 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.206 226239 DEBUG nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.206 226239 WARNING nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.206 226239 DEBUG nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.206 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.206 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.207 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.207 226239 DEBUG nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.207 226239 WARNING nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.207 226239 DEBUG nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.207 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.207 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.208 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.208 226239 DEBUG nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.208 226239 WARNING nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.208 226239 DEBUG nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.208 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.208 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.209 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.209 226239 DEBUG nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.209 226239 DEBUG nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.209 226239 DEBUG nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.209 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.209 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.209 226239 DEBUG oslo_concurrency.lockutils [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.210 226239 DEBUG nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.210 226239 WARNING nova.compute.manager [req-4a952eb5-18ef-45b4-a4e7-a72f7ffabde8 req-b4ee0b8e-b53d-4d3d-b548-9595ee74cb82 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.336 226239 DEBUG nova.compute.manager [req-c85f095b-8017-40ac-bf77-1b2d57baa354 req-a67577b7-e0c3-4a73-80bf-732febe995a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.337 226239 DEBUG oslo_concurrency.lockutils [req-c85f095b-8017-40ac-bf77-1b2d57baa354 req-a67577b7-e0c3-4a73-80bf-732febe995a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.337 226239 DEBUG oslo_concurrency.lockutils [req-c85f095b-8017-40ac-bf77-1b2d57baa354 req-a67577b7-e0c3-4a73-80bf-732febe995a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.337 226239 DEBUG oslo_concurrency.lockutils [req-c85f095b-8017-40ac-bf77-1b2d57baa354 req-a67577b7-e0c3-4a73-80bf-732febe995a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.338 226239 DEBUG nova.compute.manager [req-c85f095b-8017-40ac-bf77-1b2d57baa354 req-a67577b7-e0c3-4a73-80bf-732febe995a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.338 226239 DEBUG nova.compute.manager [req-c85f095b-8017-40ac-bf77-1b2d57baa354 req-a67577b7-e0c3-4a73-80bf-732febe995a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:50:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:50:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:39.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.661 226239 DEBUG nova.network.neutron [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Activated binding for port ab24842b-0045-41e6-b6dc-51b110b51829 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.661 226239 DEBUG nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.662 226239 DEBUG nova.virt.libvirt.vif [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:49:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1973231276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1973231276',id=20,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:05Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-kz6k0bwy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:50:21Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=79350fb7-3eed-4a3b-a7e9-f0ec90460ac3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.663 226239 DEBUG nova.network.os_vif_util [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converting VIF {"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.663 226239 DEBUG nova.network.os_vif_util [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.664 226239 DEBUG os_vif [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.665 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.666 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab24842b-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.667 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.668 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.670 226239 INFO os_vif [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00')#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.671 226239 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.671 226239 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.671 226239 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.671 226239 DEBUG nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.672 226239 INFO nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Deleting instance files /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3_del#033[00m
Jan 31 02:50:39 np0005603623 nova_compute[226235]: 2026-01-31 07:50:39.672 226239 INFO nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Deletion of /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3_del complete#033[00m
Jan 31 02:50:39 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:39.817 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:40.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:50:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:50:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:50:40 np0005603623 nova_compute[226235]: 2026-01-31 07:50:40.979 226239 DEBUG oslo_concurrency.lockutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:40 np0005603623 nova_compute[226235]: 2026-01-31 07:50:40.981 226239 DEBUG oslo_concurrency.lockutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.040 226239 DEBUG nova.compute.manager [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.167 226239 DEBUG oslo_concurrency.lockutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.168 226239 DEBUG oslo_concurrency.lockutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.176 226239 DEBUG nova.virt.hardware [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.176 226239 INFO nova.compute.claims [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.382 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:41.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.679 226239 DEBUG nova.compute.manager [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.679 226239 DEBUG oslo_concurrency.lockutils [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.679 226239 DEBUG oslo_concurrency.lockutils [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.680 226239 DEBUG oslo_concurrency.lockutils [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.680 226239 DEBUG nova.compute.manager [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.680 226239 WARNING nova.compute.manager [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.680 226239 DEBUG nova.compute.manager [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.681 226239 DEBUG oslo_concurrency.lockutils [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.681 226239 DEBUG oslo_concurrency.lockutils [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.681 226239 DEBUG oslo_concurrency.lockutils [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.681 226239 DEBUG nova.compute.manager [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.682 226239 WARNING nova.compute.manager [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.682 226239 DEBUG nova.compute.manager [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.682 226239 DEBUG oslo_concurrency.lockutils [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.682 226239 DEBUG oslo_concurrency.lockutils [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.683 226239 DEBUG oslo_concurrency.lockutils [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.684 226239 DEBUG nova.compute.manager [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.685 226239 WARNING nova.compute.manager [req-3b3885b7-6989-4d2c-9cc1-3a184f689561 req-2b785996-89f1-4a03-b25e-057873384a8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.816 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:50:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1664782695' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.863 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.869 226239 DEBUG nova.compute.provider_tree [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.918 226239 DEBUG nova.scheduler.client.report [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.958 226239 DEBUG oslo_concurrency.lockutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.791s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:41 np0005603623 nova_compute[226235]: 2026-01-31 07:50:41.959 226239 DEBUG nova.compute.manager [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.049 226239 DEBUG nova.compute.manager [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.049 226239 DEBUG nova.network.neutron [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.132 226239 INFO nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.160 226239 DEBUG nova.compute.manager [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.322 226239 DEBUG nova.compute.manager [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.324 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.324 226239 INFO nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Creating image(s)#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.347 226239 DEBUG nova.storage.rbd_utils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.376 226239 DEBUG nova.storage.rbd_utils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.405 226239 DEBUG nova.storage.rbd_utils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.410 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.463 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.464 226239 DEBUG oslo_concurrency.lockutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.465 226239 DEBUG oslo_concurrency.lockutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.465 226239 DEBUG oslo_concurrency.lockutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.489 226239 DEBUG nova.storage.rbd_utils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.492 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:42.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.737 226239 DEBUG nova.network.neutron [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.737 226239 DEBUG nova.compute.manager [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:50:42 np0005603623 nova_compute[226235]: 2026-01-31 07:50:42.998 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.067 226239 DEBUG nova.storage.rbd_utils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] resizing rbd image 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:50:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e152 e152: 3 total, 3 up, 3 in
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.523 226239 DEBUG nova.objects.instance [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'migration_context' on Instance uuid 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.537 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.537 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Ensure instance console log exists: /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.538 226239 DEBUG oslo_concurrency.lockutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.538 226239 DEBUG oslo_concurrency.lockutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.538 226239 DEBUG oslo_concurrency.lockutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.539 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:50:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:50:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:43.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.543 226239 WARNING nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.549 226239 DEBUG nova.virt.libvirt.host [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.549 226239 DEBUG nova.virt.libvirt.host [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.553 226239 DEBUG nova.virt.libvirt.host [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.554 226239 DEBUG nova.virt.libvirt.host [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.555 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.555 226239 DEBUG nova.virt.hardware [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.556 226239 DEBUG nova.virt.hardware [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.556 226239 DEBUG nova.virt.hardware [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.557 226239 DEBUG nova.virt.hardware [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.557 226239 DEBUG nova.virt.hardware [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.557 226239 DEBUG nova.virt.hardware [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.557 226239 DEBUG nova.virt.hardware [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.557 226239 DEBUG nova.virt.hardware [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.558 226239 DEBUG nova.virt.hardware [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.558 226239 DEBUG nova.virt.hardware [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.558 226239 DEBUG nova.virt.hardware [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:50:43 np0005603623 nova_compute[226235]: 2026-01-31 07:50:43.561 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:50:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2201210873' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:50:44 np0005603623 nova_compute[226235]: 2026-01-31 07:50:44.147 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.586s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:44 np0005603623 nova_compute[226235]: 2026-01-31 07:50:44.173 226239 DEBUG nova.storage.rbd_utils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:44 np0005603623 nova_compute[226235]: 2026-01-31 07:50:44.178 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:44.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:50:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2293269338' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:50:44 np0005603623 nova_compute[226235]: 2026-01-31 07:50:44.655 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:44 np0005603623 nova_compute[226235]: 2026-01-31 07:50:44.657 226239 DEBUG nova.objects.instance [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'pci_devices' on Instance uuid 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:44 np0005603623 nova_compute[226235]: 2026-01-31 07:50:44.669 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:44 np0005603623 nova_compute[226235]: 2026-01-31 07:50:44.703 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  <uuid>866b0b10-d2ae-4e08-9efa-36b9c9c9f50d</uuid>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  <name>instance-00000016</name>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <nova:name>tempest-MigrationsAdminTest-server-1993007316</nova:name>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:50:43</nova:creationTime>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <nova:user uuid="8a59efd78e244f44a1c70650f82a2c50">tempest-MigrationsAdminTest-1820348317-project-member</nova:user>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <nova:project uuid="1627a71b855b4032b51e234e44a9d570">tempest-MigrationsAdminTest-1820348317</nova:project>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <entry name="serial">866b0b10-d2ae-4e08-9efa-36b9c9c9f50d</entry>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <entry name="uuid">866b0b10-d2ae-4e08-9efa-36b9c9c9f50d</entry>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk.config">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/console.log" append="off"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:50:44 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:50:44 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:50:44 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:50:44 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:50:44 np0005603623 nova_compute[226235]: 2026-01-31 07:50:44.802 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:50:44 np0005603623 nova_compute[226235]: 2026-01-31 07:50:44.802 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:50:44 np0005603623 nova_compute[226235]: 2026-01-31 07:50:44.803 226239 INFO nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Using config drive#033[00m
Jan 31 02:50:44 np0005603623 nova_compute[226235]: 2026-01-31 07:50:44.832 226239 DEBUG nova.storage.rbd_utils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:45.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:45 np0005603623 nova_compute[226235]: 2026-01-31 07:50:45.583 226239 INFO nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Creating config drive at /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/disk.config#033[00m
Jan 31 02:50:45 np0005603623 nova_compute[226235]: 2026-01-31 07:50:45.587 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphrss8iug execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:45 np0005603623 nova_compute[226235]: 2026-01-31 07:50:45.638 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845830.6370864, 620b3405-251d-4545-b523-faa35768224b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:45 np0005603623 nova_compute[226235]: 2026-01-31 07:50:45.639 226239 INFO nova.compute.manager [-] [instance: 620b3405-251d-4545-b523-faa35768224b] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:50:45 np0005603623 nova_compute[226235]: 2026-01-31 07:50:45.667 226239 DEBUG nova.compute.manager [None req-82dd0c77-5209-4309-8953-b20d574a38e8 - - - - - -] [instance: 620b3405-251d-4545-b523-faa35768224b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:45 np0005603623 nova_compute[226235]: 2026-01-31 07:50:45.713 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphrss8iug" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:45 np0005603623 nova_compute[226235]: 2026-01-31 07:50:45.741 226239 DEBUG nova.storage.rbd_utils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:45 np0005603623 nova_compute[226235]: 2026-01-31 07:50:45.744 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/disk.config 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:46 np0005603623 nova_compute[226235]: 2026-01-31 07:50:46.190 226239 DEBUG oslo_concurrency.processutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/disk.config 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:46 np0005603623 nova_compute[226235]: 2026-01-31 07:50:46.192 226239 INFO nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Deleting local config drive /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/disk.config because it was imported into RBD.#033[00m
Jan 31 02:50:46 np0005603623 systemd-machined[194379]: New machine qemu-13-instance-00000016.
Jan 31 02:50:46 np0005603623 systemd[1]: Started Virtual Machine qemu-13-instance-00000016.
Jan 31 02:50:46 np0005603623 nova_compute[226235]: 2026-01-31 07:50:46.616 226239 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:46 np0005603623 nova_compute[226235]: 2026-01-31 07:50:46.618 226239 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:46 np0005603623 nova_compute[226235]: 2026-01-31 07:50:46.618 226239 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:46.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:46 np0005603623 nova_compute[226235]: 2026-01-31 07:50:46.650 226239 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:46 np0005603623 nova_compute[226235]: 2026-01-31 07:50:46.651 226239 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:46 np0005603623 nova_compute[226235]: 2026-01-31 07:50:46.651 226239 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:46 np0005603623 nova_compute[226235]: 2026-01-31 07:50:46.651 226239 DEBUG nova.compute.resource_tracker [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:50:46 np0005603623 nova_compute[226235]: 2026-01-31 07:50:46.651 226239 DEBUG oslo_concurrency.processutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:46 np0005603623 nova_compute[226235]: 2026-01-31 07:50:46.817 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.015 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845847.0149124, 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.015 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.020 226239 DEBUG nova.compute.manager [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.020 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.027 226239 INFO nova.virt.libvirt.driver [-] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance spawned successfully.#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.027 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.056 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:50:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3634990103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.074 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.079 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.080 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.080 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.081 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.081 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.082 226239 DEBUG nova.virt.libvirt.driver [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.093 226239 DEBUG oslo_concurrency.processutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.112 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.113 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845847.0176785, 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.113 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] VM Started (Lifecycle Event)#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.163 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.167 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.193 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.203 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.203 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.206 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-00000013 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.206 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-00000013 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.209 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.210 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.214 226239 INFO nova.compute.manager [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Took 4.89 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.214 226239 DEBUG nova.compute.manager [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.270 226239 INFO nova.compute.manager [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Took 6.16 seconds to build instance.#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.292 226239 DEBUG oslo_concurrency.lockutils [None req-2919c65f-a1ec-419a-b19b-e7f454b8f873 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.352 226239 WARNING nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.353 226239 DEBUG nova.compute.resource_tracker [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4249MB free_disk=20.743717193603516GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.354 226239 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.354 226239 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.437 226239 DEBUG nova.compute.resource_tracker [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Migration for instance 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.462 226239 DEBUG nova.compute.resource_tracker [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.462 226239 INFO nova.compute.resource_tracker [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating resource usage from migration b50886cd-60b8-4c4f-bb91-0426bbbcc40b#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.491 226239 DEBUG nova.compute.resource_tracker [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Instance 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.491 226239 DEBUG nova.compute.resource_tracker [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Migration b50886cd-60b8-4c4f-bb91-0426bbbcc40b is active on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.491 226239 DEBUG nova.compute.resource_tracker [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Migration 0a31c4ff-6473-43f3-9758-d88dd85b8589 is active on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.491 226239 DEBUG nova.compute.resource_tracker [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Instance 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.492 226239 DEBUG nova.compute.resource_tracker [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.492 226239 DEBUG nova.compute.resource_tracker [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:50:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:47.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:47 np0005603623 nova_compute[226235]: 2026-01-31 07:50:47.607 226239 DEBUG oslo_concurrency.processutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:50:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3802651578' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:50:48 np0005603623 nova_compute[226235]: 2026-01-31 07:50:48.050 226239 DEBUG oslo_concurrency.processutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:48 np0005603623 nova_compute[226235]: 2026-01-31 07:50:48.055 226239 DEBUG nova.compute.provider_tree [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:50:48 np0005603623 nova_compute[226235]: 2026-01-31 07:50:48.094 226239 DEBUG nova.scheduler.client.report [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:50:48 np0005603623 nova_compute[226235]: 2026-01-31 07:50:48.129 226239 DEBUG nova.compute.resource_tracker [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:50:48 np0005603623 nova_compute[226235]: 2026-01-31 07:50:48.130 226239 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.775s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:48 np0005603623 nova_compute[226235]: 2026-01-31 07:50:48.136 226239 INFO nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migrating instance to compute-1.ctlplane.example.com finished successfully.#033[00m
Jan 31 02:50:48 np0005603623 nova_compute[226235]: 2026-01-31 07:50:48.237 226239 INFO nova.scheduler.client.report [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Deleted allocation for migration 0a31c4ff-6473-43f3-9758-d88dd85b8589#033[00m
Jan 31 02:50:48 np0005603623 nova_compute[226235]: 2026-01-31 07:50:48.238 226239 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 31 02:50:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:50:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:50:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:48.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:49 np0005603623 nova_compute[226235]: 2026-01-31 07:50:49.142 226239 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Creating tmpfile /var/lib/nova/instances/tmpwobuud1k to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 31 02:50:49 np0005603623 nova_compute[226235]: 2026-01-31 07:50:49.143 226239 DEBUG nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwobuud1k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 31 02:50:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:49.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:49 np0005603623 nova_compute[226235]: 2026-01-31 07:50:49.672 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:50 np0005603623 nova_compute[226235]: 2026-01-31 07:50:50.448 226239 DEBUG nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwobuud1k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79350fb7-3eed-4a3b-a7e9-f0ec90460ac3',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 31 02:50:50 np0005603623 nova_compute[226235]: 2026-01-31 07:50:50.499 226239 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:50 np0005603623 nova_compute[226235]: 2026-01-31 07:50:50.499 226239 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquired lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:50 np0005603623 nova_compute[226235]: 2026-01-31 07:50:50.500 226239 DEBUG nova.network.neutron [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:50:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:50.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:51 np0005603623 nova_compute[226235]: 2026-01-31 07:50:51.253 226239 DEBUG oslo_concurrency.lockutils [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Acquiring lock "refresh_cache-866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:51 np0005603623 nova_compute[226235]: 2026-01-31 07:50:51.254 226239 DEBUG oslo_concurrency.lockutils [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Acquired lock "refresh_cache-866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:51 np0005603623 nova_compute[226235]: 2026-01-31 07:50:51.254 226239 DEBUG nova.network.neutron [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:50:51 np0005603623 nova_compute[226235]: 2026-01-31 07:50:51.346 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845836.3437157, 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:51 np0005603623 nova_compute[226235]: 2026-01-31 07:50:51.346 226239 INFO nova.compute.manager [-] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:50:51 np0005603623 nova_compute[226235]: 2026-01-31 07:50:51.368 226239 DEBUG nova.compute.manager [None req-85a978eb-a75f-4afe-be62-15863f9f39aa - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:51 np0005603623 nova_compute[226235]: 2026-01-31 07:50:51.520 226239 DEBUG nova.network.neutron [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:50:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:51.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:51 np0005603623 nova_compute[226235]: 2026-01-31 07:50:51.818 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:52 np0005603623 podman[237999]: 2026-01-31 07:50:52.237359245 +0000 UTC m=+0.053572895 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 31 02:50:52 np0005603623 podman[238000]: 2026-01-31 07:50:52.256491826 +0000 UTC m=+0.072460138 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 31 02:50:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:52.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:52 np0005603623 nova_compute[226235]: 2026-01-31 07:50:52.763 226239 DEBUG nova.network.neutron [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:52 np0005603623 nova_compute[226235]: 2026-01-31 07:50:52.793 226239 DEBUG oslo_concurrency.lockutils [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Releasing lock "refresh_cache-866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:52 np0005603623 nova_compute[226235]: 2026-01-31 07:50:52.905 226239 DEBUG nova.compute.manager [req-b7531378-7873-4486-b8d0-344b2a0a716a req-60ffb7a9-cdf4-4c72-9d56-814ef84982a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:52 np0005603623 nova_compute[226235]: 2026-01-31 07:50:52.905 226239 DEBUG oslo_concurrency.lockutils [req-b7531378-7873-4486-b8d0-344b2a0a716a req-60ffb7a9-cdf4-4c72-9d56-814ef84982a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:52 np0005603623 nova_compute[226235]: 2026-01-31 07:50:52.905 226239 DEBUG oslo_concurrency.lockutils [req-b7531378-7873-4486-b8d0-344b2a0a716a req-60ffb7a9-cdf4-4c72-9d56-814ef84982a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:52 np0005603623 nova_compute[226235]: 2026-01-31 07:50:52.906 226239 DEBUG oslo_concurrency.lockutils [req-b7531378-7873-4486-b8d0-344b2a0a716a req-60ffb7a9-cdf4-4c72-9d56-814ef84982a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:52 np0005603623 nova_compute[226235]: 2026-01-31 07:50:52.906 226239 DEBUG nova.compute.manager [req-b7531378-7873-4486-b8d0-344b2a0a716a req-60ffb7a9-cdf4-4c72-9d56-814ef84982a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:52 np0005603623 nova_compute[226235]: 2026-01-31 07:50:52.907 226239 DEBUG nova.compute.manager [req-b7531378-7873-4486-b8d0-344b2a0a716a req-60ffb7a9-cdf4-4c72-9d56-814ef84982a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:50:52 np0005603623 nova_compute[226235]: 2026-01-31 07:50:52.934 226239 DEBUG nova.virt.libvirt.driver [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 31 02:50:52 np0005603623 nova_compute[226235]: 2026-01-31 07:50:52.935 226239 DEBUG nova.virt.libvirt.volume.remotefs [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Creating file /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/dad70b7a75a14ed9ae9a3cae15c80c84.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 31 02:50:52 np0005603623 nova_compute[226235]: 2026-01-31 07:50:52.935 226239 DEBUG oslo_concurrency.processutils [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/dad70b7a75a14ed9ae9a3cae15c80c84.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.316 226239 DEBUG oslo_concurrency.processutils [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/dad70b7a75a14ed9ae9a3cae15c80c84.tmp" returned: 1 in 0.381s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.318 226239 DEBUG oslo_concurrency.processutils [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/dad70b7a75a14ed9ae9a3cae15c80c84.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.318 226239 DEBUG nova.virt.libvirt.volume.remotefs [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Creating directory /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.318 226239 DEBUG oslo_concurrency.processutils [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.497 226239 DEBUG oslo_concurrency.processutils [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" returned: 0 in 0.179s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.503 226239 DEBUG nova.virt.libvirt.driver [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 02:50:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:53.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.593 226239 INFO nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Took 5.32 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.594 226239 DEBUG nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.626 226239 DEBUG nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkeqo2xgm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4b48cc05-9edd-4e4d-a58e-84564afb0612',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=Migration(b50886cd-60b8-4c4f-bb91-0426bbbcc40b),old_vol_attachment_ids={74f8a6d0-259e-466b-a484-4c7bffded2e1='da773656-1631-4e7b-855d-ed146c908f6b'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.629 226239 DEBUG nova.objects.instance [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lazy-loading 'migration_context' on Instance uuid 4b48cc05-9edd-4e4d-a58e-84564afb0612 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.631 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.632 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.632 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.655 226239 DEBUG nova.virt.libvirt.migration [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Find same serial number: pos=1, serial=74f8a6d0-259e-466b-a484-4c7bffded2e1 _update_volume_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:242#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.656 226239 DEBUG nova.virt.libvirt.vif [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1009708622',display_name='tempest-LiveMigrationTest-server-1009708622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1009708622',id=19,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-878znybl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-126681982',ow
ner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:50:04Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=4b48cc05-9edd-4e4d-a58e-84564afb0612,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.657 226239 DEBUG nova.network.os_vif_util [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converting VIF {"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.658 226239 DEBUG nova.network.os_vif_util [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.658 226239 DEBUG nova.virt.libvirt.migration [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating guest XML with vif config: <interface type="ethernet">
Jan 31 02:50:53 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:8b:11:0b"/>
Jan 31 02:50:53 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 02:50:53 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:50:53 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 02:50:53 np0005603623 nova_compute[226235]:  <target dev="tap31ab3c80-79"/>
Jan 31 02:50:53 np0005603623 nova_compute[226235]: </interface>
Jan 31 02:50:53 np0005603623 nova_compute[226235]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.659 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.661 226239 DEBUG nova.network.neutron [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating instance_info_cache with network_info: [{"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.682 226239 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Releasing lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.684 226239 DEBUG os_brick.utils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.685 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.696 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.696 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[0c3edf8b-5996-4b28-8092-9a0fdf6265cf]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.698 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.703 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.005s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.703 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[22737302-eb05-4f89-aa75-633c498196cb]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.705 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.709 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.005s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.709 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[865edc37-b475-4821-82a4-12d089e47ee6]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.711 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[445ec321-bbd1-4d47-ad44-0fc70a93ad9b]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.712 226239 DEBUG oslo_concurrency.processutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.729 226239 DEBUG oslo_concurrency.processutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] CMD "nvme version" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.732 226239 DEBUG os_brick.initiator.connectors.lightos [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.733 226239 DEBUG os_brick.initiator.connectors.lightos [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.733 226239 DEBUG os_brick.initiator.connectors.lightos [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 02:50:53 np0005603623 nova_compute[226235]: 2026-01-31 07:50:53.733 226239 DEBUG os_brick.utils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] <== get_connector_properties: return (49ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.135 226239 DEBUG nova.virt.libvirt.migration [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.136 226239 INFO nova.virt.libvirt.migration [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.196 226239 INFO nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 31 02:50:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:54.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.676 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.699 226239 DEBUG nova.virt.libvirt.migration [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.700 226239 DEBUG nova.virt.libvirt.migration [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.815 226239 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwobuud1k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79350fb7-3eed-4a3b-a7e9-f0ec90460ac3',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={317b1c6b-4f89-402c-94d1-f4852844f1e2='c20de688-0876-4e49-80fa-40bec74574ff'},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.816 226239 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Creating instance directory: /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.817 226239 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Ensure instance console log exists: /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.817 226239 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Connecting volumes before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10901#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.820 226239 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.821 226239 DEBUG nova.virt.libvirt.vif [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T07:49:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1973231276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1973231276',id=20,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:05Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-kz6k0bwy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveAutoBlockMigrationV225Tes
t-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:50:45Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=79350fb7-3eed-4a3b-a7e9-f0ec90460ac3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.821 226239 DEBUG nova.network.os_vif_util [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converting VIF {"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.822 226239 DEBUG nova.network.os_vif_util [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.822 226239 DEBUG os_vif [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.823 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.824 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.824 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.827 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.827 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab24842b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.828 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapab24842b-00, col_values=(('external_ids', {'iface-id': 'ab24842b-0045-41e6-b6dc-51b110b51829', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:a1:9e', 'vm-uuid': '79350fb7-3eed-4a3b-a7e9-f0ec90460ac3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.829 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:54 np0005603623 NetworkManager[48970]: <info>  [1769845854.8301] manager: (tapab24842b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.832 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.835 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.836 226239 INFO os_vif [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00')#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.839 226239 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.839 226239 DEBUG nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwobuud1k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79350fb7-3eed-4a3b-a7e9-f0ec90460ac3',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={317b1c6b-4f89-402c-94d1-f4852844f1e2='c20de688-0876-4e49-80fa-40bec74574ff'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.994 226239 DEBUG nova.compute.manager [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.995 226239 DEBUG oslo_concurrency.lockutils [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.995 226239 DEBUG oslo_concurrency.lockutils [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.996 226239 DEBUG oslo_concurrency.lockutils [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.996 226239 DEBUG nova.compute.manager [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.996 226239 WARNING nova.compute.manager [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received unexpected event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.996 226239 DEBUG nova.compute.manager [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-changed-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.997 226239 DEBUG nova.compute.manager [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Refreshing instance network info cache due to event network-changed-31ab3c80-791f-418d-a70b-fcb0d523a037. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.997 226239 DEBUG oslo_concurrency.lockutils [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.997 226239 DEBUG oslo_concurrency.lockutils [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:54 np0005603623 nova_compute[226235]: 2026-01-31 07:50:54.998 226239 DEBUG nova.network.neutron [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Refreshing network info cache for port 31ab3c80-791f-418d-a70b-fcb0d523a037 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:50:55 np0005603623 nova_compute[226235]: 2026-01-31 07:50:55.203 226239 DEBUG nova.virt.libvirt.migration [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:50:55 np0005603623 nova_compute[226235]: 2026-01-31 07:50:55.205 226239 DEBUG nova.virt.libvirt.migration [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:50:55 np0005603623 nova_compute[226235]: 2026-01-31 07:50:55.439 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845855.439384, 4b48cc05-9edd-4e4d-a58e-84564afb0612 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:55 np0005603623 nova_compute[226235]: 2026-01-31 07:50:55.440 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:50:55 np0005603623 nova_compute[226235]: 2026-01-31 07:50:55.462 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:55 np0005603623 nova_compute[226235]: 2026-01-31 07:50:55.465 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:55 np0005603623 nova_compute[226235]: 2026-01-31 07:50:55.498 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 31 02:50:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:50:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:55.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:50:55 np0005603623 kernel: tap31ab3c80-79 (unregistering): left promiscuous mode
Jan 31 02:50:55 np0005603623 NetworkManager[48970]: <info>  [1769845855.8760] device (tap31ab3c80-79): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:50:55 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:55Z|00113|binding|INFO|Releasing lport 31ab3c80-791f-418d-a70b-fcb0d523a037 from this chassis (sb_readonly=0)
Jan 31 02:50:55 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:55Z|00114|binding|INFO|Setting lport 31ab3c80-791f-418d-a70b-fcb0d523a037 down in Southbound
Jan 31 02:50:55 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:55Z|00115|binding|INFO|Removing iface tap31ab3c80-79 ovn-installed in OVS
Jan 31 02:50:55 np0005603623 nova_compute[226235]: 2026-01-31 07:50:55.884 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:55 np0005603623 nova_compute[226235]: 2026-01-31 07:50:55.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:55 np0005603623 nova_compute[226235]: 2026-01-31 07:50:55.891 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:55.896 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:11:0b 10.100.0.6'], port_security=['fa:16:3e:8b:11:0b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'bd097fed-e54b-4ed7-90f0-078b39b8b13a'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4b48cc05-9edd-4e4d-a58e-84564afb0612', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f860fcac-4f6a-4e88-8005-0fd323fc8053, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=31ab3c80-791f-418d-a70b-fcb0d523a037) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:50:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:55.897 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 31ab3c80-791f-418d-a70b-fcb0d523a037 in datapath 850ad6ca-6166-4382-94bb-4b7c10d9a136 unbound from our chassis#033[00m
Jan 31 02:50:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:55.899 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 850ad6ca-6166-4382-94bb-4b7c10d9a136#033[00m
Jan 31 02:50:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:55.919 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8bdb384d-65ba-4ee8-8831-53acfb730f23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:55.942 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6656894f-1129-4643-99fd-8bb9f6bb72ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:55.945 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1e8c4167-add0-4d20-8aed-da0f5fc603f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:55 np0005603623 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000013.scope: Deactivated successfully.
Jan 31 02:50:55 np0005603623 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000013.scope: Consumed 14.705s CPU time.
Jan 31 02:50:55 np0005603623 systemd-machined[194379]: Machine qemu-10-instance-00000013 terminated.
Jan 31 02:50:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:55.966 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f2915f25-481e-485f-9bb7-79aa6ba0cd1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:55.979 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e4833fff-7939-4a1d-9bd5-77e0ed15e13d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap850ad6ca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:99:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 16, 'tx_packets': 7, 'rx_bytes': 952, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 16, 'tx_packets': 7, 'rx_bytes': 952, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489842, 'reachable_time': 29262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238093, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:55.991 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3621f47d-c219-48f8-abe1-8c1946902ab2]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap850ad6ca-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489853, 'tstamp': 489853}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238094, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap850ad6ca-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489856, 'tstamp': 489856}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238094, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:55.993 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap850ad6ca-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:55 np0005603623 nova_compute[226235]: 2026-01-31 07:50:55.994 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.000 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:56.001 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap850ad6ca-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:56.001 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:56.001 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap850ad6ca-60, col_values=(('external_ids', {'iface-id': '61b6889f-b848-4873-9650-8b2715794d29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:56 np0005603623 virtqemud[225858]: Unable to get XATTR trusted.libvirt.security.ref_selinux on volumes/volume-74f8a6d0-259e-466b-a484-4c7bffded2e1: No such file or directory
Jan 31 02:50:56 np0005603623 virtqemud[225858]: Unable to get XATTR trusted.libvirt.security.ref_dac on volumes/volume-74f8a6d0-259e-466b-a484-4c7bffded2e1: No such file or directory
Jan 31 02:50:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:50:56.002 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:56 np0005603623 NetworkManager[48970]: <info>  [1769845856.0106] manager: (tap31ab3c80-79): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Jan 31 02:50:56 np0005603623 systemd-udevd[238084]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.030 226239 DEBUG nova.virt.libvirt.guest [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.030 226239 INFO nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migration operation has completed#033[00m
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.030 226239 INFO nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] _post_live_migration() is started..#033[00m
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.031 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.032 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.034 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 31 02:50:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:56.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.774 226239 DEBUG nova.compute.manager [req-ee3633d8-9190-4f70-b553-9a4783d3bd81 req-80306c3a-ca45-4255-b93b-0ec5b429d5df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.774 226239 DEBUG oslo_concurrency.lockutils [req-ee3633d8-9190-4f70-b553-9a4783d3bd81 req-80306c3a-ca45-4255-b93b-0ec5b429d5df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.774 226239 DEBUG oslo_concurrency.lockutils [req-ee3633d8-9190-4f70-b553-9a4783d3bd81 req-80306c3a-ca45-4255-b93b-0ec5b429d5df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.775 226239 DEBUG oslo_concurrency.lockutils [req-ee3633d8-9190-4f70-b553-9a4783d3bd81 req-80306c3a-ca45-4255-b93b-0ec5b429d5df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.775 226239 DEBUG nova.compute.manager [req-ee3633d8-9190-4f70-b553-9a4783d3bd81 req-80306c3a-ca45-4255-b93b-0ec5b429d5df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.775 226239 DEBUG nova.compute.manager [req-ee3633d8-9190-4f70-b553-9a4783d3bd81 req-80306c3a-ca45-4255-b93b-0ec5b429d5df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:50:56 np0005603623 nova_compute[226235]: 2026-01-31 07:50:56.819 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:57.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.020 226239 DEBUG nova.network.neutron [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updated VIF entry in instance network info cache for port 31ab3c80-791f-418d-a70b-fcb0d523a037. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.021 226239 DEBUG nova.network.neutron [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating instance_info_cache with network_info: [{"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.131 226239 DEBUG oslo_concurrency.lockutils [req-e427aff1-a57f-4b51-ba12-4a60798b6dd8 req-fc9062ba-8c6c-432b-bc33-6ccd9748fe09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.327 226239 DEBUG nova.network.neutron [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Activated binding for port 31ab3c80-791f-418d-a70b-fcb0d523a037 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.327 226239 DEBUG nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.328 226239 DEBUG nova.virt.libvirt.vif [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1009708622',display_name='tempest-LiveMigrationTest-server-1009708622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1009708622',id=19,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-878znybl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-126681982',ow
ner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:50:19Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=4b48cc05-9edd-4e4d-a58e-84564afb0612,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.328 226239 DEBUG nova.network.os_vif_util [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converting VIF {"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.329 226239 DEBUG nova.network.os_vif_util [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.329 226239 DEBUG os_vif [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.330 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.331 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31ab3c80-79, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.332 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.334 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.336 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.338 226239 INFO os_vif [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79')#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.339 226239 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.339 226239 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.339 226239 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.339 226239 DEBUG nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.340 226239 INFO nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Deleting instance files /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612_del#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.340 226239 INFO nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Deletion of /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612_del complete#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.561 226239 DEBUG nova.network.neutron [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Port ab24842b-0045-41e6-b6dc-51b110b51829 updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 31 02:50:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:50:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:58.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.774 226239 DEBUG nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwobuud1k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79350fb7-3eed-4a3b-a7e9-f0ec90460ac3',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={317b1c6b-4f89-402c-94d1-f4852844f1e2='c20de688-0876-4e49-80fa-40bec74574ff'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.981 226239 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.981 226239 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.982 226239 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.982 226239 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.982 226239 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.982 226239 WARNING nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received unexpected event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.982 226239 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.983 226239 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.983 226239 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.983 226239 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.983 226239 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.983 226239 WARNING nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received unexpected event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.984 226239 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.984 226239 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.984 226239 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.984 226239 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.984 226239 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:58 np0005603623 nova_compute[226235]: 2026-01-31 07:50:58.985 226239 WARNING nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received unexpected event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:59 np0005603623 kernel: tapab24842b-00: entered promiscuous mode
Jan 31 02:50:59 np0005603623 NetworkManager[48970]: <info>  [1769845859.0421] manager: (tapab24842b-00): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Jan 31 02:50:59 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:59Z|00116|binding|INFO|Claiming lport ab24842b-0045-41e6-b6dc-51b110b51829 for this additional chassis.
Jan 31 02:50:59 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:59Z|00117|binding|INFO|ab24842b-0045-41e6-b6dc-51b110b51829: Claiming fa:16:3e:b4:a1:9e 10.100.0.9
Jan 31 02:50:59 np0005603623 nova_compute[226235]: 2026-01-31 07:50:59.042 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:59 np0005603623 ovn_controller[133449]: 2026-01-31T07:50:59Z|00118|binding|INFO|Setting lport ab24842b-0045-41e6-b6dc-51b110b51829 ovn-installed in OVS
Jan 31 02:50:59 np0005603623 nova_compute[226235]: 2026-01-31 07:50:59.050 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:59 np0005603623 nova_compute[226235]: 2026-01-31 07:50:59.054 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:59 np0005603623 systemd-udevd[238121]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:50:59 np0005603623 systemd-machined[194379]: New machine qemu-14-instance-00000014.
Jan 31 02:50:59 np0005603623 NetworkManager[48970]: <info>  [1769845859.0730] device (tapab24842b-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:50:59 np0005603623 NetworkManager[48970]: <info>  [1769845859.0735] device (tapab24842b-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:50:59 np0005603623 systemd[1]: Started Virtual Machine qemu-14-instance-00000014.
Jan 31 02:50:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:50:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:50:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:59.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:00.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:01 np0005603623 nova_compute[226235]: 2026-01-31 07:51:01.395 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845861.3949785, 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:51:01 np0005603623 nova_compute[226235]: 2026-01-31 07:51:01.395 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] VM Started (Lifecycle Event)#033[00m
Jan 31 02:51:01 np0005603623 nova_compute[226235]: 2026-01-31 07:51:01.428 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:01.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:01 np0005603623 nova_compute[226235]: 2026-01-31 07:51:01.822 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:02.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:02 np0005603623 nova_compute[226235]: 2026-01-31 07:51:02.718 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845862.7186763, 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:51:02 np0005603623 nova_compute[226235]: 2026-01-31 07:51:02.719 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:51:02 np0005603623 nova_compute[226235]: 2026-01-31 07:51:02.746 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:02 np0005603623 nova_compute[226235]: 2026-01-31 07:51:02.749 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:51:02 np0005603623 nova_compute[226235]: 2026-01-31 07:51:02.779 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 31 02:51:03 np0005603623 nova_compute[226235]: 2026-01-31 07:51:03.333 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:03 np0005603623 nova_compute[226235]: 2026-01-31 07:51:03.543 226239 DEBUG nova.virt.libvirt.driver [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 02:51:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:03.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:04.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.561 226239 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.562 226239 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.562 226239 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:05.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.588 226239 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.589 226239 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.589 226239 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.589 226239 DEBUG nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.590 226239 DEBUG oslo_concurrency.processutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:05 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:05Z|00119|binding|INFO|Claiming lport ab24842b-0045-41e6-b6dc-51b110b51829 for this chassis.
Jan 31 02:51:05 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:05Z|00120|binding|INFO|ab24842b-0045-41e6-b6dc-51b110b51829: Claiming fa:16:3e:b4:a1:9e 10.100.0.9
Jan 31 02:51:05 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:05Z|00121|binding|INFO|Setting lport ab24842b-0045-41e6-b6dc-51b110b51829 up in Southbound
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.699 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:a1:9e 10.100.0.9'], port_security=['fa:16:3e:b4:a1:9e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '79350fb7-3eed-4a3b-a7e9-f0ec90460ac3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '23', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227ca833-938d-48d2-86c8-5d09dd658c40, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ab24842b-0045-41e6-b6dc-51b110b51829) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.701 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ab24842b-0045-41e6-b6dc-51b110b51829 in datapath 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d bound to our chassis#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.704 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.714 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[569484fe-370f-47ee-bd84-bb5a09bdd63a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.715 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap60bb4bea-d1 in ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.716 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap60bb4bea-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.716 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b5e12ce3-5529-4e43-91ab-33aeee232ecf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.717 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e417239d-7609-4f58-8aa2-4ffaaec6f80c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.734 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[99688c84-18f8-4ccc-8962-eec1e5a81dfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.745 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4f032212-e5d6-405a-a838-e52a4010be06]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.763 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[516e14bb-c415-416d-9ca9-950e465eca92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.767 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9653a2f4-fc91-4b24-9999-c41ce14704a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 NetworkManager[48970]: <info>  [1769845865.7702] manager: (tap60bb4bea-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Jan 31 02:51:05 np0005603623 systemd-udevd[238202]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.789 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e9269e10-4f11-4bd7-ba81-ba3bd2a6477e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.791 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4edd204c-373e-4768-b029-b24534da7575]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 NetworkManager[48970]: <info>  [1769845865.8092] device (tap60bb4bea-d0): carrier: link connected
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.811 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[7b355004-f891-47c5-a7e1-109d598b7173]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.825 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[032eedf1-c8c7-4893-b8b1-371a4223e860]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap60bb4bea-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:b1:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500623, 'reachable_time': 32376, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238221, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.826 226239 INFO nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Post operation of migration started#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.836 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a4774de6-0283-468d-922f-10afa20f3f73]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4e:b1c0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500623, 'tstamp': 500623}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238222, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.849 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac3e30e-3fdd-43da-a666-f7aa45e71544]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap60bb4bea-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:b1:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500623, 'reachable_time': 32376, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238223, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.868 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9e765ee5-f7a8-4136-8ded-9104a07bfb4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.906 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0bc4c91f-4cd7-4815-a047-f2921cc1c441]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.907 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60bb4bea-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.907 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.907 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60bb4bea-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:05 np0005603623 kernel: tap60bb4bea-d0: entered promiscuous mode
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.909 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:05 np0005603623 NetworkManager[48970]: <info>  [1769845865.9096] manager: (tap60bb4bea-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.915 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.918 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap60bb4bea-d0, col_values=(('external_ids', {'iface-id': 'eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.919 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:05 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:05Z|00122|binding|INFO|Releasing lport eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8 from this chassis (sb_readonly=0)
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.920 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.921 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.921 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6252266c-917b-4516-ba4a-9147cd259481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.922 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.pid.haproxy
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 31 02:51:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:05.923 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'env', 'PROCESS_TAG=haproxy-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 31 02:51:05 np0005603623 nova_compute[226235]: 2026-01-31 07:51:05.926 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:51:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:51:06 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3747996773' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.029 226239 DEBUG oslo_concurrency.processutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.094 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.095 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.098 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.098 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.100 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-00000014 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.101 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-00000014 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.243 226239 WARNING nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.244 226239 DEBUG nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4112MB free_disk=20.71639633178711GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": 
"0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.245 226239 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.245 226239 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:06 np0005603623 podman[238260]: 2026-01-31 07:51:06.257001818 +0000 UTC m=+0.019690090 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.494 226239 DEBUG nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Migration for instance 4b48cc05-9edd-4e4d-a58e-84564afb0612 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.494 226239 DEBUG nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Migration for instance 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.526 226239 DEBUG nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.548 226239 INFO nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating resource usage from migration c91a31e3-ac93-4843-9897-f5679755f4a7
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.549 226239 DEBUG nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Starting to track incoming migration c91a31e3-ac93-4843-9897-f5679755f4a7 with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.549 226239 INFO nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Updating resource usage from migration d5c54c4c-3045-4285-98a5-2a691842bc5d
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.587 226239 DEBUG nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Instance 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.587 226239 DEBUG nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Migration b50886cd-60b8-4c4f-bb91-0426bbbcc40b is active on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.614 226239 WARNING nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Instance 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.615 226239 DEBUG nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Migration d5c54c4c-3045-4285-98a5-2a691842bc5d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.615 226239 DEBUG nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.616 226239 DEBUG nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:51:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:06.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.682 226239 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.682 226239 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquired lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.683 226239 DEBUG nova.network.neutron [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.738 226239 DEBUG oslo_concurrency.processutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:06 np0005603623 nova_compute[226235]: 2026-01-31 07:51:06.826 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:51:06 np0005603623 podman[238260]: 2026-01-31 07:51:06.865778251 +0000 UTC m=+0.628466503 container create 1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 02:51:07 np0005603623 systemd[1]: Started libpod-conmon-1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3.scope.
Jan 31 02:51:07 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:51:07 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dec35def09d864a7d80bc9283c847af9a29af6cf6a93722f0adc75b25ac17bd3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:51:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:51:07 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4177671220' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:51:07 np0005603623 nova_compute[226235]: 2026-01-31 07:51:07.172 226239 DEBUG oslo_concurrency.processutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:07 np0005603623 nova_compute[226235]: 2026-01-31 07:51:07.176 226239 DEBUG nova.compute.provider_tree [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:51:07 np0005603623 nova_compute[226235]: 2026-01-31 07:51:07.194 226239 DEBUG nova.scheduler.client.report [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:51:07 np0005603623 podman[238260]: 2026-01-31 07:51:07.208285135 +0000 UTC m=+0.970973407 container init 1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Jan 31 02:51:07 np0005603623 podman[238260]: 2026-01-31 07:51:07.212705843 +0000 UTC m=+0.975394095 container start 1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 02:51:07 np0005603623 nova_compute[226235]: 2026-01-31 07:51:07.216 226239 DEBUG nova.compute.resource_tracker [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:51:07 np0005603623 nova_compute[226235]: 2026-01-31 07:51:07.217 226239 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.972s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:07 np0005603623 nova_compute[226235]: 2026-01-31 07:51:07.222 226239 INFO nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 31 02:51:07 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[238295]: [NOTICE]   (238301) : New worker (238303) forked
Jan 31 02:51:07 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[238295]: [NOTICE]   (238301) : Loading success.
Jan 31 02:51:07 np0005603623 nova_compute[226235]: 2026-01-31 07:51:07.299 226239 INFO nova.scheduler.client.report [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Deleted allocation for migration b50886cd-60b8-4c4f-bb91-0426bbbcc40b
Jan 31 02:51:07 np0005603623 nova_compute[226235]: 2026-01-31 07:51:07.300 226239 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 31 02:51:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:07.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:08 np0005603623 nova_compute[226235]: 2026-01-31 07:51:08.270 226239 DEBUG nova.network.neutron [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating instance_info_cache with network_info: [{"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:08 np0005603623 nova_compute[226235]: 2026-01-31 07:51:08.274 226239 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Creating tmpfile /var/lib/nova/instances/tmp5hfaw34r to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 31 02:51:08 np0005603623 nova_compute[226235]: 2026-01-31 07:51:08.275 226239 DEBUG nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5hfaw34r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 31 02:51:08 np0005603623 nova_compute[226235]: 2026-01-31 07:51:08.299 226239 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Releasing lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:51:08 np0005603623 nova_compute[226235]: 2026-01-31 07:51:08.314 226239 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:08 np0005603623 nova_compute[226235]: 2026-01-31 07:51:08.314 226239 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:08 np0005603623 nova_compute[226235]: 2026-01-31 07:51:08.314 226239 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:08 np0005603623 nova_compute[226235]: 2026-01-31 07:51:08.318 226239 INFO nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 31 02:51:08 np0005603623 virtqemud[225858]: Domain id=14 name='instance-00000014' uuid=79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 is tainted: custom-monitor
Jan 31 02:51:08 np0005603623 nova_compute[226235]: 2026-01-31 07:51:08.335 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:51:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:08.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:09 np0005603623 nova_compute[226235]: 2026-01-31 07:51:09.323 226239 INFO nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 31 02:51:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:09.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:10 np0005603623 nova_compute[226235]: 2026-01-31 07:51:10.329 226239 INFO nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 31 02:51:10 np0005603623 nova_compute[226235]: 2026-01-31 07:51:10.333 226239 DEBUG nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:10 np0005603623 nova_compute[226235]: 2026-01-31 07:51:10.354 226239 DEBUG nova.objects.instance [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 02:51:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:10.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.033 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845856.0278718, 4b48cc05-9edd-4e4d-a58e-84564afb0612 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.033 226239 INFO nova.compute.manager [-] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] VM Stopped (Lifecycle Event)
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.083 226239 DEBUG nova.compute.manager [None req-47b21a1d-d071-4037-9e86-80eddd6b817d - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.089 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.090 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.090 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.180 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.180 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.180 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.181 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.181 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:11.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:51:11 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2799178994' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.678 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.779 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.781 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.790 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.791 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.797 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000014 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.798 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000014 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.828 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.975 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.976 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4149MB free_disk=20.71560287475586GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.976 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:11 np0005603623 nova_compute[226235]: 2026-01-31 07:51:11.977 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.082 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Applying migration context for instance 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 as it has an incoming, in-progress migration c91a31e3-ac93-4843-9897-f5679755f4a7. Migration status is running _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.082 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.083 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration for instance 4b48cc05-9edd-4e4d-a58e-84564afb0612 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.084 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.112 226239 INFO nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Updating resource usage from migration d5c54c4c-3045-4285-98a5-2a691842bc5d#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.112 226239 INFO nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating resource usage from migration aad546b8-9239-4ce8-aa2b-899d02b2684e#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.112 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Starting to track incoming migration aad546b8-9239-4ce8-aa2b-899d02b2684e with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.131 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.132 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.132 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration d5c54c4c-3045-4285-98a5-2a691842bc5d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.159 226239 WARNING nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 4b48cc05-9edd-4e4d-a58e-84564afb0612 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.160 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.160 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:51:12 np0005603623 nova_compute[226235]: 2026-01-31 07:51:12.253 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:51:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:12.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.208 226239 DEBUG nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5hfaw34r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4b48cc05-9edd-4e4d-a58e-84564afb0612',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.238 226239 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.239 226239 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquired lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.239 226239 DEBUG nova.network.neutron [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.337 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.359 226239 DEBUG oslo_concurrency.lockutils [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.360 226239 DEBUG oslo_concurrency.lockutils [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.360 226239 DEBUG oslo_concurrency.lockutils [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.360 226239 DEBUG oslo_concurrency.lockutils [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.361 226239 DEBUG oslo_concurrency.lockutils [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.362 226239 INFO nova.compute.manager [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Terminating instance#033[00m
Jan 31 02:51:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:51:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/312245557' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.397 226239 DEBUG nova.compute.manager [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.413 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.418 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:51:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:13.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.629 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.661 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:51:13 np0005603623 nova_compute[226235]: 2026-01-31 07:51:13.662 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:14 np0005603623 kernel: tapab24842b-00 (unregistering): left promiscuous mode
Jan 31 02:51:14 np0005603623 NetworkManager[48970]: <info>  [1769845874.1735] device (tapab24842b-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:51:14 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:14Z|00123|binding|INFO|Releasing lport ab24842b-0045-41e6-b6dc-51b110b51829 from this chassis (sb_readonly=0)
Jan 31 02:51:14 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:14Z|00124|binding|INFO|Setting lport ab24842b-0045-41e6-b6dc-51b110b51829 down in Southbound
Jan 31 02:51:14 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:14Z|00125|binding|INFO|Removing iface tapab24842b-00 ovn-installed in OVS
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.184 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.187 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.192 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:14.193 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:a1:9e 10.100.0.9'], port_security=['fa:16:3e:b4:a1:9e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '79350fb7-3eed-4a3b-a7e9-f0ec90460ac3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '25', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227ca833-938d-48d2-86c8-5d09dd658c40, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ab24842b-0045-41e6-b6dc-51b110b51829) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:51:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:14.195 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ab24842b-0045-41e6-b6dc-51b110b51829 in datapath 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d unbound from our chassis#033[00m
Jan 31 02:51:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:14.197 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:51:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:14.198 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2caed477-d7fa-4d30-bd50-f75935e6f3f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:14.198 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d namespace which is not needed anymore#033[00m
Jan 31 02:51:14 np0005603623 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000014.scope: Deactivated successfully.
Jan 31 02:51:14 np0005603623 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000014.scope: Consumed 1.303s CPU time.
Jan 31 02:51:14 np0005603623 systemd-machined[194379]: Machine qemu-14-instance-00000014 terminated.
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.434 226239 INFO nova.virt.libvirt.driver [-] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Instance destroyed successfully.#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.435 226239 DEBUG nova.objects.instance [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lazy-loading 'resources' on Instance uuid 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.454 226239 DEBUG nova.virt.libvirt.vif [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T07:49:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1973231276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1973231276',id=20,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:05Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-kz6k0bwy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',ow
ner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:51:10Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=79350fb7-3eed-4a3b-a7e9-f0ec90460ac3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.455 226239 DEBUG nova.network.os_vif_util [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Converting VIF {"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.456 226239 DEBUG nova.network.os_vif_util [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.457 226239 DEBUG os_vif [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.460 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.460 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab24842b-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.483 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.486 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.489 226239 INFO os_vif [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00')#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.589 226239 DEBUG nova.virt.libvirt.driver [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 02:51:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:51:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:14.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.658 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.658 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.659 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.697 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.731 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9907#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.781 226239 DEBUG nova.compute.manager [req-ce5f1230-9031-41b8-a44f-1bd1990dd397 req-5344e5cc-acac-4269-b13f-5b29c41bdd48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.782 226239 DEBUG oslo_concurrency.lockutils [req-ce5f1230-9031-41b8-a44f-1bd1990dd397 req-5344e5cc-acac-4269-b13f-5b29c41bdd48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.783 226239 DEBUG oslo_concurrency.lockutils [req-ce5f1230-9031-41b8-a44f-1bd1990dd397 req-5344e5cc-acac-4269-b13f-5b29c41bdd48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.783 226239 DEBUG oslo_concurrency.lockutils [req-ce5f1230-9031-41b8-a44f-1bd1990dd397 req-5344e5cc-acac-4269-b13f-5b29c41bdd48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.783 226239 DEBUG nova.compute.manager [req-ce5f1230-9031-41b8-a44f-1bd1990dd397 req-5344e5cc-acac-4269-b13f-5b29c41bdd48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.784 226239 DEBUG nova.compute.manager [req-ce5f1230-9031-41b8-a44f-1bd1990dd397 req-5344e5cc-acac-4269-b13f-5b29c41bdd48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:51:14 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[238295]: [NOTICE]   (238301) : haproxy version is 2.8.14-c23fe91
Jan 31 02:51:14 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[238295]: [NOTICE]   (238301) : path to executable is /usr/sbin/haproxy
Jan 31 02:51:14 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[238295]: [WARNING]  (238301) : Exiting Master process...
Jan 31 02:51:14 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[238295]: [ALERT]    (238301) : Current worker (238303) exited with code 143 (Terminated)
Jan 31 02:51:14 np0005603623 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[238295]: [WARNING]  (238301) : All workers exited. Exiting... (0)
Jan 31 02:51:14 np0005603623 systemd[1]: libpod-1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3.scope: Deactivated successfully.
Jan 31 02:51:14 np0005603623 podman[238433]: 2026-01-31 07:51:14.951690418 +0000 UTC m=+0.677437830 container died 1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.965 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.966 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:51:14 np0005603623 nova_compute[226235]: 2026-01-31 07:51:14.966 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.405 226239 DEBUG nova.network.neutron [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating instance_info_cache with network_info: [{"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.421 226239 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Releasing lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.423 226239 DEBUG os_brick.utils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.425 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.434 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.435 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[d39a834d-43e8-4800-9035-e22d7c561511]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.436 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.442 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.443 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[f1f747b4-6eff-44a8-9c2f-11ed16439f6f]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.446 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.452 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.452 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[0374a0c9-a2ca-45ba-8c4d-88f7fdb87306]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.454 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[58aff611-a1be-43aa-9f64-c27eef22f3b4]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.455 226239 DEBUG oslo_concurrency.processutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.482 226239 DEBUG oslo_concurrency.processutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] CMD "nvme version" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.485 226239 DEBUG os_brick.initiator.connectors.lightos [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.486 226239 DEBUG os_brick.initiator.connectors.lightos [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.486 226239 DEBUG os_brick.initiator.connectors.lightos [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 02:51:15 np0005603623 nova_compute[226235]: 2026-01-31 07:51:15.487 226239 DEBUG os_brick.utils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] <== get_connector_properties: return (62ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 02:51:15 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3-userdata-shm.mount: Deactivated successfully.
Jan 31 02:51:15 np0005603623 systemd[1]: var-lib-containers-storage-overlay-dec35def09d864a7d80bc9283c847af9a29af6cf6a93722f0adc75b25ac17bd3-merged.mount: Deactivated successfully.
Jan 31 02:51:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:15.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:15 np0005603623 podman[238433]: 2026-01-31 07:51:15.866861039 +0000 UTC m=+1.592608431 container cleanup 1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 02:51:15 np0005603623 systemd[1]: libpod-conmon-1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3.scope: Deactivated successfully.
Jan 31 02:51:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:51:16 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3159847127' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:51:16 np0005603623 podman[238499]: 2026-01-31 07:51:16.567587431 +0000 UTC m=+0.684757561 container remove 1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 02:51:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:16.572 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[659f753a-b593-4164-8c12-72cc9f95d558]: (4, ('Sat Jan 31 07:51:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d (1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3)\n1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3\nSat Jan 31 07:51:15 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d (1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3)\n1641fefeaed6d322f8fd16878c554e27bc4ffb5e3174b63856c05f6007aa93c3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:16.574 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[30241c9e-3e53-4659-934e-41172cb497f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:16.574 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60bb4bea-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.576 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:16 np0005603623 kernel: tap60bb4bea-d0: left promiscuous mode
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.587 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:16.589 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fcf0833d-098e-4bdb-94a9-8b18b811895a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:16.613 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b277db2e-fc1d-4b7b-89b8-cd87020391a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:16.614 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2f2741-ba6e-455c-b196-c0426f937498]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.623 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updating instance_info_cache with network_info: [{"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:16.631 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fc0f16a5-ffb4-48f3-b729-6c48a571a40a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500619, 'reachable_time': 23893, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238517, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:16.634 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:51:16 np0005603623 systemd[1]: run-netns-ovnmeta\x2d60bb4bea\x2dd9f0\x2d41fc\x2d9c0f\x2d6fcd644c255d.mount: Deactivated successfully.
Jan 31 02:51:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:16.635 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd285f3-dd30-4d3e-8906-7182395bcd88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.651 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.651 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.652 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.652 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.653 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:51:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:16.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.795 226239 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5hfaw34r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4b48cc05-9edd-4e4d-a58e-84564afb0612',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={74f8a6d0-259e-466b-a484-4c7bffded2e1='e6430b41-2bd9-4a2a-92ee-fef9061b8529'},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.796 226239 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Creating instance directory: /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.797 226239 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Ensure instance console log exists: /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:51:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.797 226239 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Connecting volumes before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10901#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.800 226239 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.802 226239 DEBUG nova.virt.libvirt.vif [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T07:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1009708622',display_name='tempest-LiveMigrationTest-server-1009708622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1009708622',id=19,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-878znybl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationT
est-126681982',owner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:51:04Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=4b48cc05-9edd-4e4d-a58e-84564afb0612,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.802 226239 DEBUG nova.network.os_vif_util [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converting VIF {"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.803 226239 DEBUG nova.network.os_vif_util [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.804 226239 DEBUG os_vif [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.805 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.805 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.806 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.808 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.809 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31ab3c80-79, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.809 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap31ab3c80-79, col_values=(('external_ids', {'iface-id': '31ab3c80-791f-418d-a70b-fcb0d523a037', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:11:0b', 'vm-uuid': '4b48cc05-9edd-4e4d-a58e-84564afb0612'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.811 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:16 np0005603623 NetworkManager[48970]: <info>  [1769845876.8118] manager: (tap31ab3c80-79): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.813 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.815 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.816 226239 INFO os_vif [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79')#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.819 226239 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.819 226239 DEBUG nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5hfaw34r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4b48cc05-9edd-4e4d-a58e-84564afb0612',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={74f8a6d0-259e-466b-a484-4c7bffded2e1='e6430b41-2bd9-4a2a-92ee-fef9061b8529'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.829 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.913 226239 DEBUG nova.compute.manager [req-33158281-be6a-4fbf-b783-1ff5f5e68422 req-5d8fab62-7c4d-44fe-8297-62b32884ea11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.914 226239 DEBUG oslo_concurrency.lockutils [req-33158281-be6a-4fbf-b783-1ff5f5e68422 req-5d8fab62-7c4d-44fe-8297-62b32884ea11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.914 226239 DEBUG oslo_concurrency.lockutils [req-33158281-be6a-4fbf-b783-1ff5f5e68422 req-5d8fab62-7c4d-44fe-8297-62b32884ea11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.915 226239 DEBUG oslo_concurrency.lockutils [req-33158281-be6a-4fbf-b783-1ff5f5e68422 req-5d8fab62-7c4d-44fe-8297-62b32884ea11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.915 226239 DEBUG nova.compute.manager [req-33158281-be6a-4fbf-b783-1ff5f5e68422 req-5d8fab62-7c4d-44fe-8297-62b32884ea11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:16 np0005603623 nova_compute[226235]: 2026-01-31 07:51:16.915 226239 WARNING nova.compute.manager [req-33158281-be6a-4fbf-b783-1ff5f5e68422 req-5d8fab62-7c4d-44fe-8297-62b32884ea11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:51:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:17.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:17 np0005603623 nova_compute[226235]: 2026-01-31 07:51:17.870 226239 DEBUG nova.network.neutron [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Port 31ab3c80-791f-418d-a70b-fcb0d523a037 updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 31 02:51:18 np0005603623 nova_compute[226235]: 2026-01-31 07:51:18.064 226239 DEBUG nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5hfaw34r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4b48cc05-9edd-4e4d-a58e-84564afb0612',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={74f8a6d0-259e-466b-a484-4c7bffded2e1='e6430b41-2bd9-4a2a-92ee-fef9061b8529'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 31 02:51:18 np0005603623 kernel: tap31ab3c80-79: entered promiscuous mode
Jan 31 02:51:18 np0005603623 NetworkManager[48970]: <info>  [1769845878.2990] manager: (tap31ab3c80-79): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Jan 31 02:51:18 np0005603623 nova_compute[226235]: 2026-01-31 07:51:18.299 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:18 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:18Z|00126|binding|INFO|Claiming lport 31ab3c80-791f-418d-a70b-fcb0d523a037 for this additional chassis.
Jan 31 02:51:18 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:18Z|00127|binding|INFO|31ab3c80-791f-418d-a70b-fcb0d523a037: Claiming fa:16:3e:8b:11:0b 10.100.0.6
Jan 31 02:51:18 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:18Z|00128|binding|INFO|Setting lport 31ab3c80-791f-418d-a70b-fcb0d523a037 ovn-installed in OVS
Jan 31 02:51:18 np0005603623 nova_compute[226235]: 2026-01-31 07:51:18.312 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:18 np0005603623 nova_compute[226235]: 2026-01-31 07:51:18.313 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:18 np0005603623 systemd-udevd[238534]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:51:18 np0005603623 systemd-machined[194379]: New machine qemu-15-instance-00000013.
Jan 31 02:51:18 np0005603623 NetworkManager[48970]: <info>  [1769845878.3322] device (tap31ab3c80-79): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:51:18 np0005603623 systemd[1]: Started Virtual Machine qemu-15-instance-00000013.
Jan 31 02:51:18 np0005603623 NetworkManager[48970]: <info>  [1769845878.3328] device (tap31ab3c80-79): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:51:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:18.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:19.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:19 np0005603623 nova_compute[226235]: 2026-01-31 07:51:19.961 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845879.9615083, 4b48cc05-9edd-4e4d-a58e-84564afb0612 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:51:19 np0005603623 nova_compute[226235]: 2026-01-31 07:51:19.962 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] VM Started (Lifecycle Event)#033[00m
Jan 31 02:51:19 np0005603623 nova_compute[226235]: 2026-01-31 07:51:19.980 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:20 np0005603623 nova_compute[226235]: 2026-01-31 07:51:20.612 226239 INFO nova.virt.libvirt.driver [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance shutdown successfully after 27 seconds.#033[00m
Jan 31 02:51:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:20.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:20 np0005603623 nova_compute[226235]: 2026-01-31 07:51:20.995 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845880.9950914, 4b48cc05-9edd-4e4d-a58e-84564afb0612 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:51:20 np0005603623 nova_compute[226235]: 2026-01-31 07:51:20.996 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:51:21 np0005603623 nova_compute[226235]: 2026-01-31 07:51:21.037 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:21 np0005603623 nova_compute[226235]: 2026-01-31 07:51:21.040 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:51:21 np0005603623 nova_compute[226235]: 2026-01-31 07:51:21.066 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 31 02:51:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:21.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:21 np0005603623 nova_compute[226235]: 2026-01-31 07:51:21.811 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:21 np0005603623 nova_compute[226235]: 2026-01-31 07:51:21.831 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:22 np0005603623 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 31 02:51:22 np0005603623 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000016.scope: Consumed 13.403s CPU time.
Jan 31 02:51:22 np0005603623 systemd-machined[194379]: Machine qemu-13-instance-00000016 terminated.
Jan 31 02:51:22 np0005603623 podman[238588]: 2026-01-31 07:51:22.514173214 +0000 UTC m=+0.049439955 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 02:51:22 np0005603623 podman[238589]: 2026-01-31 07:51:22.537077664 +0000 UTC m=+0.071158887 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, 
org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Jan 31 02:51:22 np0005603623 nova_compute[226235]: 2026-01-31 07:51:22.625 226239 INFO nova.virt.libvirt.driver [-] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance destroyed successfully.#033[00m
Jan 31 02:51:22 np0005603623 nova_compute[226235]: 2026-01-31 07:51:22.628 226239 DEBUG nova.virt.libvirt.driver [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:22 np0005603623 nova_compute[226235]: 2026-01-31 07:51:22.628 226239 DEBUG nova.virt.libvirt.driver [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:51:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:22.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:51:22 np0005603623 nova_compute[226235]: 2026-01-31 07:51:22.722 226239 DEBUG oslo_concurrency.lockutils [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Acquiring lock "866b0b10-d2ae-4e08-9efa-36b9c9c9f50d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:22 np0005603623 nova_compute[226235]: 2026-01-31 07:51:22.722 226239 DEBUG oslo_concurrency.lockutils [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Lock "866b0b10-d2ae-4e08-9efa-36b9c9c9f50d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:22 np0005603623 nova_compute[226235]: 2026-01-31 07:51:22.722 226239 DEBUG oslo_concurrency.lockutils [None req-bcb652b8-8b76-44de-b8c7-72bcadae2a34 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Lock "866b0b10-d2ae-4e08-9efa-36b9c9c9f50d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:23 np0005603623 nova_compute[226235]: 2026-01-31 07:51:23.460 226239 INFO nova.virt.libvirt.driver [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Deleting instance files /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3_del#033[00m
Jan 31 02:51:23 np0005603623 nova_compute[226235]: 2026-01-31 07:51:23.461 226239 INFO nova.virt.libvirt.driver [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Deletion of /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3_del complete#033[00m
Jan 31 02:51:23 np0005603623 nova_compute[226235]: 2026-01-31 07:51:23.574 226239 INFO nova.compute.manager [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Took 10.18 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:51:23 np0005603623 nova_compute[226235]: 2026-01-31 07:51:23.574 226239 DEBUG oslo.service.loopingcall [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:51:23 np0005603623 nova_compute[226235]: 2026-01-31 07:51:23.575 226239 DEBUG nova.compute.manager [-] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:51:23 np0005603623 nova_compute[226235]: 2026-01-31 07:51:23.575 226239 DEBUG nova.network.neutron [-] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:51:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:23.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:23 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:23Z|00129|binding|INFO|Claiming lport 31ab3c80-791f-418d-a70b-fcb0d523a037 for this chassis.
Jan 31 02:51:23 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:23Z|00130|binding|INFO|31ab3c80-791f-418d-a70b-fcb0d523a037: Claiming fa:16:3e:8b:11:0b 10.100.0.6
Jan 31 02:51:23 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:23Z|00131|binding|INFO|Setting lport 31ab3c80-791f-418d-a70b-fcb0d523a037 up in Southbound
Jan 31 02:51:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:23.988 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:11:0b 10.100.0.6'], port_security=['fa:16:3e:8b:11:0b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4b48cc05-9edd-4e4d-a58e-84564afb0612', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '20', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f860fcac-4f6a-4e88-8005-0fd323fc8053, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=31ab3c80-791f-418d-a70b-fcb0d523a037) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:51:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:23.989 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 31ab3c80-791f-418d-a70b-fcb0d523a037 in datapath 850ad6ca-6166-4382-94bb-4b7c10d9a136 bound to our chassis#033[00m
Jan 31 02:51:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:23.991 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 850ad6ca-6166-4382-94bb-4b7c10d9a136#033[00m
Jan 31 02:51:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:24.009 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5f4a92d2-05b6-4190-afd2-5aba639a2539]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:24.036 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f55097b1-71f3-4f72-ab5b-d090d321d397]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:24.038 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[71585f56-1b63-4699-998a-74b501751ab5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:24.058 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[a9fd26af-69e2-4160-b5be-29c7efbf9424]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:24.072 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc71abc-0cf4-4739-93a2-3e415f90c1b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap850ad6ca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:99:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 9, 'rx_bytes': 1162, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 9, 'rx_bytes': 1162, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489842, 'reachable_time': 29262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238636, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:24.085 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[80bb80f3-fd33-46ed-b926-68cb251c0640]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap850ad6ca-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489853, 'tstamp': 489853}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238637, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap850ad6ca-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489856, 'tstamp': 489856}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238637, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:24.087 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap850ad6ca-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.090 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:24.092 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap850ad6ca-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:24.092 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:51:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:24.093 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap850ad6ca-60, col_values=(('external_ids', {'iface-id': '61b6889f-b848-4873-9650-8b2715794d29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:24.093 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.146 226239 INFO nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Post operation of migration started#033[00m
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.215 226239 DEBUG nova.network.neutron [-] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.233 226239 INFO nova.compute.manager [-] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Took 0.66 seconds to deallocate network for instance.#033[00m
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.314 226239 DEBUG nova.compute.manager [req-ad274fe9-4a59-40f2-8513-a50972b9383e req-7c890c71-1d11-4f91-8f47-01ff0cb1b605 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-deleted-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.550 226239 INFO nova.compute.manager [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Took 0.32 seconds to detach 1 volumes for instance.#033[00m
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.553 226239 DEBUG nova.compute.manager [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Deleting volume: 317b1c6b-4f89-402c-94d1-f4852844f1e2 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.590 226239 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.590 226239 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquired lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.591 226239 DEBUG nova.network.neutron [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:51:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:24.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.858 226239 DEBUG oslo_concurrency.lockutils [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.858 226239 DEBUG oslo_concurrency.lockutils [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:24 np0005603623 nova_compute[226235]: 2026-01-31 07:51:24.958 226239 DEBUG oslo_concurrency.processutils [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:25.517537) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845885517647, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2453, "num_deletes": 254, "total_data_size": 5898159, "memory_usage": 5959120, "flush_reason": "Manual Compaction"}
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3307692210' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:51:25 np0005603623 nova_compute[226235]: 2026-01-31 07:51:25.541 226239 DEBUG oslo_concurrency.processutils [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.583s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:25 np0005603623 nova_compute[226235]: 2026-01-31 07:51:25.547 226239 DEBUG nova.compute.provider_tree [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:51:25 np0005603623 nova_compute[226235]: 2026-01-31 07:51:25.563 226239 DEBUG nova.scheduler.client.report [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:51:25 np0005603623 nova_compute[226235]: 2026-01-31 07:51:25.585 226239 DEBUG oslo_concurrency.lockutils [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.727s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:25.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:25 np0005603623 nova_compute[226235]: 2026-01-31 07:51:25.620 226239 INFO nova.scheduler.client.report [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Deleted allocations for instance 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3#033[00m
Jan 31 02:51:25 np0005603623 nova_compute[226235]: 2026-01-31 07:51:25.637 226239 DEBUG nova.network.neutron [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating instance_info_cache with network_info: [{"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:25 np0005603623 nova_compute[226235]: 2026-01-31 07:51:25.678 226239 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Releasing lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:51:25 np0005603623 nova_compute[226235]: 2026-01-31 07:51:25.698 226239 DEBUG oslo_concurrency.lockutils [None req-828b0638-00c3-4748-99b6-2114c3d0133d 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 12.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:25 np0005603623 nova_compute[226235]: 2026-01-31 07:51:25.700 226239 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:25 np0005603623 nova_compute[226235]: 2026-01-31 07:51:25.701 226239 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:25 np0005603623 nova_compute[226235]: 2026-01-31 07:51:25.701 226239 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:25 np0005603623 nova_compute[226235]: 2026-01-31 07:51:25.705 226239 INFO nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 31 02:51:25 np0005603623 virtqemud[225858]: Domain id=15 name='instance-00000013' uuid=4b48cc05-9edd-4e4d-a58e-84564afb0612 is tainted: custom-monitor
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845885760347, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3868424, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23202, "largest_seqno": 25650, "table_properties": {"data_size": 3858382, "index_size": 6344, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21257, "raw_average_key_size": 20, "raw_value_size": 3838263, "raw_average_value_size": 3744, "num_data_blocks": 278, "num_entries": 1025, "num_filter_entries": 1025, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845664, "oldest_key_time": 1769845664, "file_creation_time": 1769845885, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 242861 microseconds, and 7119 cpu microseconds.
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:25.760408) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3868424 bytes OK
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:25.760427) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:25.880074) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:25.880123) EVENT_LOG_v1 {"time_micros": 1769845885880112, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:25.880148) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 5887358, prev total WAL file size 5887358, number of live WAL files 2.
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:25.881756) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(3777KB)], [48(7320KB)]
Jan 31 02:51:25 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845885881791, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 11364266, "oldest_snapshot_seqno": -1}
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 4999 keys, 9281269 bytes, temperature: kUnknown
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845886224394, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 9281269, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9246781, "index_size": 20874, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 126328, "raw_average_key_size": 25, "raw_value_size": 9155607, "raw_average_value_size": 1831, "num_data_blocks": 850, "num_entries": 4999, "num_filter_entries": 4999, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769845885, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:26.224684) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 9281269 bytes
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:26.293369) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 33.2 rd, 27.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 7.1 +0.0 blob) out(8.9 +0.0 blob), read-write-amplify(5.3) write-amplify(2.4) OK, records in: 5524, records dropped: 525 output_compression: NoCompression
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:26.293410) EVENT_LOG_v1 {"time_micros": 1769845886293395, "job": 28, "event": "compaction_finished", "compaction_time_micros": 342732, "compaction_time_cpu_micros": 16013, "output_level": 6, "num_output_files": 1, "total_output_size": 9281269, "num_input_records": 5524, "num_output_records": 4999, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845886293857, "job": 28, "event": "table_file_deletion", "file_number": 50}
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845886294592, "job": 28, "event": "table_file_deletion", "file_number": 48}
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:25.881659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:26.294696) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:26.294702) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:26.294704) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:26.294706) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:51:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:51:26.294708) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:51:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:26.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:26 np0005603623 nova_compute[226235]: 2026-01-31 07:51:26.712 226239 INFO nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 31 02:51:26 np0005603623 nova_compute[226235]: 2026-01-31 07:51:26.813 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:26 np0005603623 nova_compute[226235]: 2026-01-31 07:51:26.834 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e153 e153: 3 total, 3 up, 3 in
Jan 31 02:51:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:27.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:27 np0005603623 nova_compute[226235]: 2026-01-31 07:51:27.718 226239 INFO nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 31 02:51:27 np0005603623 nova_compute[226235]: 2026-01-31 07:51:27.722 226239 DEBUG nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:27 np0005603623 nova_compute[226235]: 2026-01-31 07:51:27.741 226239 DEBUG nova.objects.instance [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 02:51:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:28.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:29 np0005603623 nova_compute[226235]: 2026-01-31 07:51:29.433 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845874.4321284, 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:51:29 np0005603623 nova_compute[226235]: 2026-01-31 07:51:29.433 226239 INFO nova.compute.manager [-] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:51:29 np0005603623 nova_compute[226235]: 2026-01-31 07:51:29.460 226239 DEBUG nova.compute.manager [None req-f00b568a-e8ba-4a52-b538-7317b3d2757b - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:51:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:29.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:51:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:30.086 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:30.087 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:30.087 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:30.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:30 np0005603623 nova_compute[226235]: 2026-01-31 07:51:30.919 226239 DEBUG oslo_concurrency.lockutils [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:30 np0005603623 nova_compute[226235]: 2026-01-31 07:51:30.919 226239 DEBUG oslo_concurrency.lockutils [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:30 np0005603623 nova_compute[226235]: 2026-01-31 07:51:30.920 226239 DEBUG oslo_concurrency.lockutils [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:30 np0005603623 nova_compute[226235]: 2026-01-31 07:51:30.920 226239 DEBUG oslo_concurrency.lockutils [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:30 np0005603623 nova_compute[226235]: 2026-01-31 07:51:30.920 226239 DEBUG oslo_concurrency.lockutils [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:30 np0005603623 nova_compute[226235]: 2026-01-31 07:51:30.922 226239 INFO nova.compute.manager [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Terminating instance#033[00m
Jan 31 02:51:30 np0005603623 nova_compute[226235]: 2026-01-31 07:51:30.924 226239 DEBUG nova.compute.manager [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:51:31 np0005603623 kernel: tap31ab3c80-79 (unregistering): left promiscuous mode
Jan 31 02:51:31 np0005603623 NetworkManager[48970]: <info>  [1769845891.4010] device (tap31ab3c80-79): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:51:31 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:31Z|00132|binding|INFO|Releasing lport 31ab3c80-791f-418d-a70b-fcb0d523a037 from this chassis (sb_readonly=0)
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.410 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:31 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:31Z|00133|binding|INFO|Setting lport 31ab3c80-791f-418d-a70b-fcb0d523a037 down in Southbound
Jan 31 02:51:31 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:31Z|00134|binding|INFO|Removing iface tap31ab3c80-79 ovn-installed in OVS
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.415 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.419 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.427 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:11:0b 10.100.0.6'], port_security=['fa:16:3e:8b:11:0b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4b48cc05-9edd-4e4d-a58e-84564afb0612', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '22', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f860fcac-4f6a-4e88-8005-0fd323fc8053, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=31ab3c80-791f-418d-a70b-fcb0d523a037) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.429 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 31ab3c80-791f-418d-a70b-fcb0d523a037 in datapath 850ad6ca-6166-4382-94bb-4b7c10d9a136 unbound from our chassis#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.431 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 850ad6ca-6166-4382-94bb-4b7c10d9a136#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.442 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[357fedd9-fad3-4233-adee-b9c24bceb8d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:31 np0005603623 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000013.scope: Deactivated successfully.
Jan 31 02:51:31 np0005603623 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000013.scope: Consumed 1.152s CPU time.
Jan 31 02:51:31 np0005603623 systemd-machined[194379]: Machine qemu-15-instance-00000013 terminated.
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.467 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ef0c32fd-dfbf-492c-9de1-923c31554175]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.469 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ff05a44d-1748-4fa8-bdfd-342c8a01601f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.486 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c8e44eef-e960-460c-9003-091b34567b7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.497 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[348971be-23a4-4cd4-95f4-3012034cf0a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap850ad6ca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:99:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 11, 'rx_bytes': 1792, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 11, 'rx_bytes': 1792, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489842, 'reachable_time': 42839, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238675, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.510 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d332dfa7-0174-40d9-bd67-4d6e09261f6c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap850ad6ca-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489853, 'tstamp': 489853}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238676, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap850ad6ca-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489856, 'tstamp': 489856}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238676, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.511 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap850ad6ca-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.513 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.516 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.516 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap850ad6ca-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.517 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.517 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap850ad6ca-60, col_values=(('external_ids', {'iface-id': '61b6889f-b848-4873-9650-8b2715794d29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:31.517 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.551 226239 INFO nova.virt.libvirt.driver [-] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Instance destroyed successfully.#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.551 226239 DEBUG nova.objects.instance [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lazy-loading 'resources' on Instance uuid 4b48cc05-9edd-4e4d-a58e-84564afb0612 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.583 226239 DEBUG nova.virt.libvirt.vif [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T07:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1009708622',display_name='tempest-LiveMigrationTest-server-1009708622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1009708622',id=19,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-878znybl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-126681982',o
wner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:51:27Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=4b48cc05-9edd-4e4d-a58e-84564afb0612,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.583 226239 DEBUG nova.network.os_vif_util [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Converting VIF {"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.584 226239 DEBUG nova.network.os_vif_util [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.584 226239 DEBUG os_vif [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.586 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.586 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31ab3c80-79, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.587 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.589 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.591 226239 INFO os_vif [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79')#033[00m
Jan 31 02:51:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:51:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:31.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:51:31 np0005603623 nova_compute[226235]: 2026-01-31 07:51:31.835 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:32.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:32 np0005603623 nova_compute[226235]: 2026-01-31 07:51:32.913 226239 DEBUG nova.compute.manager [req-550861c9-fff5-4ecf-84d4-c4c602d79634 req-45c2e65b-8c32-4f79-a69c-f28985bfe1ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:32 np0005603623 nova_compute[226235]: 2026-01-31 07:51:32.914 226239 DEBUG oslo_concurrency.lockutils [req-550861c9-fff5-4ecf-84d4-c4c602d79634 req-45c2e65b-8c32-4f79-a69c-f28985bfe1ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:32 np0005603623 nova_compute[226235]: 2026-01-31 07:51:32.914 226239 DEBUG oslo_concurrency.lockutils [req-550861c9-fff5-4ecf-84d4-c4c602d79634 req-45c2e65b-8c32-4f79-a69c-f28985bfe1ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:32 np0005603623 nova_compute[226235]: 2026-01-31 07:51:32.914 226239 DEBUG oslo_concurrency.lockutils [req-550861c9-fff5-4ecf-84d4-c4c602d79634 req-45c2e65b-8c32-4f79-a69c-f28985bfe1ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:32 np0005603623 nova_compute[226235]: 2026-01-31 07:51:32.914 226239 DEBUG nova.compute.manager [req-550861c9-fff5-4ecf-84d4-c4c602d79634 req-45c2e65b-8c32-4f79-a69c-f28985bfe1ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:32 np0005603623 nova_compute[226235]: 2026-01-31 07:51:32.915 226239 DEBUG nova.compute.manager [req-550861c9-fff5-4ecf-84d4-c4c602d79634 req-45c2e65b-8c32-4f79-a69c-f28985bfe1ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:51:32 np0005603623 nova_compute[226235]: 2026-01-31 07:51:32.915 226239 DEBUG nova.compute.manager [req-550861c9-fff5-4ecf-84d4-c4c602d79634 req-45c2e65b-8c32-4f79-a69c-f28985bfe1ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:32 np0005603623 nova_compute[226235]: 2026-01-31 07:51:32.915 226239 DEBUG oslo_concurrency.lockutils [req-550861c9-fff5-4ecf-84d4-c4c602d79634 req-45c2e65b-8c32-4f79-a69c-f28985bfe1ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:32 np0005603623 nova_compute[226235]: 2026-01-31 07:51:32.916 226239 DEBUG oslo_concurrency.lockutils [req-550861c9-fff5-4ecf-84d4-c4c602d79634 req-45c2e65b-8c32-4f79-a69c-f28985bfe1ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:32 np0005603623 nova_compute[226235]: 2026-01-31 07:51:32.916 226239 DEBUG oslo_concurrency.lockutils [req-550861c9-fff5-4ecf-84d4-c4c602d79634 req-45c2e65b-8c32-4f79-a69c-f28985bfe1ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:32 np0005603623 nova_compute[226235]: 2026-01-31 07:51:32.916 226239 DEBUG nova.compute.manager [req-550861c9-fff5-4ecf-84d4-c4c602d79634 req-45c2e65b-8c32-4f79-a69c-f28985bfe1ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:32 np0005603623 nova_compute[226235]: 2026-01-31 07:51:32.916 226239 WARNING nova.compute.manager [req-550861c9-fff5-4ecf-84d4-c4c602d79634 req-45c2e65b-8c32-4f79-a69c-f28985bfe1ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received unexpected event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:51:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:33.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:34.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:35.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:35 np0005603623 nova_compute[226235]: 2026-01-31 07:51:35.685 226239 INFO nova.compute.manager [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Swapping old allocation on dict_keys(['492dc482-9d1e-49ca-87f3-0104a8508b72']) held by migration d5c54c4c-3045-4285-98a5-2a691842bc5d for instance#033[00m
Jan 31 02:51:35 np0005603623 nova_compute[226235]: 2026-01-31 07:51:35.715 226239 DEBUG nova.scheduler.client.report [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Overwriting current allocation {'allocations': {'d7116329-87c2-469a-b33a-1e01daf74ceb': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 15}}, 'project_id': '1627a71b855b4032b51e234e44a9d570', 'user_id': '8a59efd78e244f44a1c70650f82a2c50', 'consumer_generation': 1} on consumer 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 31 02:51:35 np0005603623 nova_compute[226235]: 2026-01-31 07:51:35.989 226239 DEBUG oslo_concurrency.lockutils [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "refresh_cache-866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:51:35 np0005603623 nova_compute[226235]: 2026-01-31 07:51:35.989 226239 DEBUG oslo_concurrency.lockutils [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquired lock "refresh_cache-866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:51:35 np0005603623 nova_compute[226235]: 2026-01-31 07:51:35.990 226239 DEBUG nova.network.neutron [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.153 226239 DEBUG nova.network.neutron [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.558 226239 INFO nova.virt.libvirt.driver [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Deleting instance files /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612_del#033[00m
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.560 226239 INFO nova.virt.libvirt.driver [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Deletion of /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612_del complete#033[00m
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.587 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.618 226239 INFO nova.compute.manager [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Took 5.69 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.619 226239 DEBUG oslo.service.loopingcall [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.619 226239 DEBUG nova.compute.manager [-] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.619 226239 DEBUG nova.network.neutron [-] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:51:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:36.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.720 226239 DEBUG nova.network.neutron [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.736 226239 DEBUG oslo_concurrency.lockutils [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Releasing lock "refresh_cache-866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.737 226239 DEBUG nova.virt.libvirt.driver [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.820 226239 DEBUG nova.storage.rbd_utils [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rolling back rbd image(866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Jan 31 02:51:36 np0005603623 nova_compute[226235]: 2026-01-31 07:51:36.836 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:37.173 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.173 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:37.174 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.251 226239 DEBUG nova.storage.rbd_utils [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] removing snapshot(nova-resize) on rbd image(866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.270 226239 DEBUG nova.network.neutron [-] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.297 226239 INFO nova.compute.manager [-] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Took 0.68 seconds to deallocate network for instance.#033[00m
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.304 226239 DEBUG nova.compute.manager [req-bed28e19-b173-44c2-a1b9-df353656abdf req-d11a029d-c493-4740-9697-868c8ba5b2e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-deleted-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.305 226239 INFO nova.compute.manager [req-bed28e19-b173-44c2-a1b9-df353656abdf req-d11a029d-c493-4740-9697-868c8ba5b2e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Neutron deleted interface 31ab3c80-791f-418d-a70b-fcb0d523a037; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.305 226239 DEBUG nova.network.neutron [req-bed28e19-b173-44c2-a1b9-df353656abdf req-d11a029d-c493-4740-9697-868c8ba5b2e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.335 226239 DEBUG nova.compute.manager [req-bed28e19-b173-44c2-a1b9-df353656abdf req-d11a029d-c493-4740-9697-868c8ba5b2e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Detach interface failed, port_id=31ab3c80-791f-418d-a70b-fcb0d523a037, reason: Instance 4b48cc05-9edd-4e4d-a58e-84564afb0612 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.570 226239 INFO nova.compute.manager [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Took 0.27 seconds to detach 1 volumes for instance.#033[00m
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.571 226239 DEBUG nova.compute.manager [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Deleting volume: 74f8a6d0-259e-466b-a484-4c7bffded2e1 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 31 02:51:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:37.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.624 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845882.6240873, 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.625 226239 INFO nova.compute.manager [-] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:51:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.716 226239 DEBUG nova.compute.manager [None req-7c9cacb5-fb23-43ed-8803-1a60322609c4 - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.720 226239 DEBUG nova.compute.manager [None req-7c9cacb5-fb23-43ed-8803-1a60322609c4 - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.800 226239 INFO nova.compute.manager [None req-7c9cacb5-fb23-43ed-8803-1a60322609c4 - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.908 226239 DEBUG oslo_concurrency.lockutils [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.908 226239 DEBUG oslo_concurrency.lockutils [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.913 226239 DEBUG oslo_concurrency.lockutils [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:51:37 np0005603623 nova_compute[226235]: 2026-01-31 07:51:37.936 226239 INFO nova.scheduler.client.report [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Deleted allocations for instance 4b48cc05-9edd-4e4d-a58e-84564afb0612
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.012 226239 DEBUG oslo_concurrency.lockutils [None req-ea269a57-69f6-47a2-b23c-1e99a4f3816a 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:51:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e154 e154: 3 total, 3 up, 3 in
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.435 226239 DEBUG nova.virt.libvirt.driver [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.438 226239 WARNING nova.virt.libvirt.driver [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.442 226239 DEBUG nova.virt.libvirt.host [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.442 226239 DEBUG nova.virt.libvirt.host [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.445 226239 DEBUG nova.virt.libvirt.host [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.445 226239 DEBUG nova.virt.libvirt.host [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.446 226239 DEBUG nova.virt.libvirt.driver [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.446 226239 DEBUG nova.virt.hardware [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.447 226239 DEBUG nova.virt.hardware [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.447 226239 DEBUG nova.virt.hardware [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.447 226239 DEBUG nova.virt.hardware [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.447 226239 DEBUG nova.virt.hardware [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.447 226239 DEBUG nova.virt.hardware [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.448 226239 DEBUG nova.virt.hardware [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.448 226239 DEBUG nova.virt.hardware [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.448 226239 DEBUG nova.virt.hardware [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.448 226239 DEBUG nova.virt.hardware [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.448 226239 DEBUG nova.virt.hardware [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.449 226239 DEBUG nova.objects.instance [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.469 226239 DEBUG oslo_concurrency.processutils [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:51:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:38.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:51:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/785528574' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.881 226239 DEBUG oslo_concurrency.processutils [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:51:38 np0005603623 nova_compute[226235]: 2026-01-31 07:51:38.921 226239 DEBUG oslo_concurrency.processutils [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:51:39 np0005603623 nova_compute[226235]: 2026-01-31 07:51:39.059 226239 DEBUG oslo_concurrency.lockutils [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:51:39 np0005603623 nova_compute[226235]: 2026-01-31 07:51:39.060 226239 DEBUG oslo_concurrency.lockutils [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:51:39 np0005603623 nova_compute[226235]: 2026-01-31 07:51:39.060 226239 DEBUG oslo_concurrency.lockutils [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:51:39 np0005603623 nova_compute[226235]: 2026-01-31 07:51:39.061 226239 DEBUG oslo_concurrency.lockutils [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:51:39 np0005603623 nova_compute[226235]: 2026-01-31 07:51:39.061 226239 DEBUG oslo_concurrency.lockutils [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:51:39 np0005603623 nova_compute[226235]: 2026-01-31 07:51:39.063 226239 INFO nova.compute.manager [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Terminating instance
Jan 31 02:51:39 np0005603623 nova_compute[226235]: 2026-01-31 07:51:39.064 226239 DEBUG nova.compute.manager [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 02:51:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:51:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3207663040' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:51:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:39.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:39 np0005603623 nova_compute[226235]: 2026-01-31 07:51:39.766 226239 DEBUG oslo_concurrency.processutils [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.845s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:51:39 np0005603623 nova_compute[226235]: 2026-01-31 07:51:39.770 226239 DEBUG nova.virt.libvirt.driver [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  <uuid>866b0b10-d2ae-4e08-9efa-36b9c9c9f50d</uuid>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  <name>instance-00000016</name>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <nova:name>tempest-MigrationsAdminTest-server-1993007316</nova:name>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:51:38</nova:creationTime>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <nova:user uuid="8a59efd78e244f44a1c70650f82a2c50">tempest-MigrationsAdminTest-1820348317-project-member</nova:user>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <nova:project uuid="1627a71b855b4032b51e234e44a9d570">tempest-MigrationsAdminTest-1820348317</nova:project>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <entry name="serial">866b0b10-d2ae-4e08-9efa-36b9c9c9f50d</entry>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <entry name="uuid">866b0b10-d2ae-4e08-9efa-36b9c9c9f50d</entry>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_disk.config">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d/console.log" append="off"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <input type="keyboard" bus="usb"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:51:39 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:51:39 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:51:39 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:51:39 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 02:51:39 np0005603623 systemd-machined[194379]: New machine qemu-16-instance-00000016.
Jan 31 02:51:39 np0005603623 systemd[1]: Started Virtual Machine qemu-16-instance-00000016.
Jan 31 02:51:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:40.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.020 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845901.0200093, 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.021 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] VM Resumed (Lifecycle Event)
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.023 226239 DEBUG nova.compute.manager [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.026 226239 INFO nova.virt.libvirt.driver [-] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance running successfully.
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.026 226239 DEBUG nova.virt.libvirt.driver [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.074 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.077 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.113 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.114 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769845901.0208838, 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.114 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] VM Started (Lifecycle Event)
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.139 226239 INFO nova.compute.manager [None req-7465cf5f-179b-4b90-958f-2202059c96ec 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Updating instance to original state: 'active'#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.143 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.149 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.179 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 31 02:51:41 np0005603623 kernel: tap3aff2339-cc (unregistering): left promiscuous mode
Jan 31 02:51:41 np0005603623 NetworkManager[48970]: <info>  [1769845901.4578] device (tap3aff2339-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:51:41 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:41Z|00135|binding|INFO|Releasing lport 3aff2339-ccc0-4845-8728-4ede26d0c11a from this chassis (sb_readonly=0)
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.463 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:41 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:41Z|00136|binding|INFO|Setting lport 3aff2339-ccc0-4845-8728-4ede26d0c11a down in Southbound
Jan 31 02:51:41 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:41Z|00137|binding|INFO|Releasing lport fc5261b7-0e3f-49d1-8fbf-8dcf40626991 from this chassis (sb_readonly=0)
Jan 31 02:51:41 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:41Z|00138|binding|INFO|Setting lport fc5261b7-0e3f-49d1-8fbf-8dcf40626991 down in Southbound
Jan 31 02:51:41 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:41Z|00139|binding|INFO|Removing iface tap3aff2339-cc ovn-installed in OVS
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.465 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:41.469 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:f2:07 10.100.0.11'], port_security=['fa:16:3e:e0:f2:07 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1970562059', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1970562059', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f860fcac-4f6a-4e88-8005-0fd323fc8053, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=3aff2339-ccc0-4845-8728-4ede26d0c11a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:51:41 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:41Z|00140|binding|INFO|Releasing lport 61b6889f-b848-4873-9650-8b2715794d29 from this chassis (sb_readonly=0)
Jan 31 02:51:41 np0005603623 ovn_controller[133449]: 2026-01-31T07:51:41Z|00141|binding|INFO|Releasing lport 9dcf2f9f-4a2b-44f0-988c-28c5222b394c from this chassis (sb_readonly=0)
Jan 31 02:51:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:41.471 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:cb:fc 19.80.0.218'], port_security=['fa:16:3e:0b:cb:fc 19.80.0.218'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['3aff2339-ccc0-4845-8728-4ede26d0c11a'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-427751920', 'neutron:cidrs': '19.80.0.218/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3613479-5299-41cd-b6dd-df1fae2ae862', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-427751920', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=0ad390ce-c29b-4af4-b946-e8404e058f9b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fc5261b7-0e3f-49d1-8fbf-8dcf40626991) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:51:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:41.473 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 3aff2339-ccc0-4845-8728-4ede26d0c11a in datapath 850ad6ca-6166-4382-94bb-4b7c10d9a136 unbound from our chassis#033[00m
Jan 31 02:51:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:41.480 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 850ad6ca-6166-4382-94bb-4b7c10d9a136, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:51:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:41.481 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[38254f90-1bed-4ccd-b864-304688266d63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:41.482 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 namespace which is not needed anymore#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.531 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:41 np0005603623 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000011.scope: Deactivated successfully.
Jan 31 02:51:41 np0005603623 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000011.scope: Consumed 8.493s CPU time.
Jan 31 02:51:41 np0005603623 systemd-machined[194379]: Machine qemu-8-instance-00000011 terminated.
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.589 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:41.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.703 226239 INFO nova.virt.libvirt.driver [-] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Instance destroyed successfully.#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.704 226239 DEBUG nova.objects.instance [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lazy-loading 'resources' on Instance uuid 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.732 226239 DEBUG nova.virt.libvirt.vif [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T07:48:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1007393486',display_name='tempest-LiveMigrationTest-server-1007393486',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1007393486',id=17,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:48:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-egj05et1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-126681982',owner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:49:25Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.732 226239 DEBUG nova.network.os_vif_util [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Converting VIF {"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.733 226239 DEBUG nova.network.os_vif_util [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.733 226239 DEBUG os_vif [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.734 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.735 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3aff2339-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.736 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.738 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.738 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.740 226239 INFO os_vif [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc')#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.838 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:41 np0005603623 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[235269]: [NOTICE]   (235296) : haproxy version is 2.8.14-c23fe91
Jan 31 02:51:41 np0005603623 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[235269]: [NOTICE]   (235296) : path to executable is /usr/sbin/haproxy
Jan 31 02:51:41 np0005603623 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[235269]: [WARNING]  (235296) : Exiting Master process...
Jan 31 02:51:41 np0005603623 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[235269]: [WARNING]  (235296) : Exiting Master process...
Jan 31 02:51:41 np0005603623 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[235269]: [ALERT]    (235296) : Current worker (235299) exited with code 143 (Terminated)
Jan 31 02:51:41 np0005603623 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[235269]: [WARNING]  (235296) : All workers exited. Exiting... (0)
Jan 31 02:51:41 np0005603623 systemd[1]: libpod-0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620.scope: Deactivated successfully.
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.956 226239 DEBUG oslo_concurrency.lockutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.957 226239 DEBUG oslo_concurrency.lockutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.957 226239 DEBUG oslo_concurrency.lockutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "866b0b10-d2ae-4e08-9efa-36b9c9c9f50d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.957 226239 DEBUG oslo_concurrency.lockutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "866b0b10-d2ae-4e08-9efa-36b9c9c9f50d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.958 226239 DEBUG oslo_concurrency.lockutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "866b0b10-d2ae-4e08-9efa-36b9c9c9f50d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.960 226239 INFO nova.compute.manager [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Terminating instance#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.961 226239 DEBUG oslo_concurrency.lockutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "refresh_cache-866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.961 226239 DEBUG oslo_concurrency.lockutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquired lock "refresh_cache-866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:51:41 np0005603623 nova_compute[226235]: 2026-01-31 07:51:41.962 226239 DEBUG nova.network.neutron [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:51:41 np0005603623 podman[238960]: 2026-01-31 07:51:41.963800209 +0000 UTC m=+0.406619080 container died 0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:51:42 np0005603623 nova_compute[226235]: 2026-01-31 07:51:42.012 226239 DEBUG nova.compute.manager [req-e725df6b-adad-43ae-a559-55e70eadcff6 req-1d99a9d1-f6c2-4c70-919b-0d9d0e91ebe1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-unplugged-3aff2339-ccc0-4845-8728-4ede26d0c11a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:42 np0005603623 nova_compute[226235]: 2026-01-31 07:51:42.012 226239 DEBUG oslo_concurrency.lockutils [req-e725df6b-adad-43ae-a559-55e70eadcff6 req-1d99a9d1-f6c2-4c70-919b-0d9d0e91ebe1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:42 np0005603623 nova_compute[226235]: 2026-01-31 07:51:42.012 226239 DEBUG oslo_concurrency.lockutils [req-e725df6b-adad-43ae-a559-55e70eadcff6 req-1d99a9d1-f6c2-4c70-919b-0d9d0e91ebe1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:42 np0005603623 nova_compute[226235]: 2026-01-31 07:51:42.013 226239 DEBUG oslo_concurrency.lockutils [req-e725df6b-adad-43ae-a559-55e70eadcff6 req-1d99a9d1-f6c2-4c70-919b-0d9d0e91ebe1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:42 np0005603623 nova_compute[226235]: 2026-01-31 07:51:42.013 226239 DEBUG nova.compute.manager [req-e725df6b-adad-43ae-a559-55e70eadcff6 req-1d99a9d1-f6c2-4c70-919b-0d9d0e91ebe1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] No waiting events found dispatching network-vif-unplugged-3aff2339-ccc0-4845-8728-4ede26d0c11a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:42 np0005603623 nova_compute[226235]: 2026-01-31 07:51:42.013 226239 DEBUG nova.compute.manager [req-e725df6b-adad-43ae-a559-55e70eadcff6 req-1d99a9d1-f6c2-4c70-919b-0d9d0e91ebe1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-unplugged-3aff2339-ccc0-4845-8728-4ede26d0c11a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:51:42 np0005603623 nova_compute[226235]: 2026-01-31 07:51:42.163 226239 DEBUG nova.network.neutron [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:51:42 np0005603623 nova_compute[226235]: 2026-01-31 07:51:42.581 226239 DEBUG nova.network.neutron [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:42 np0005603623 nova_compute[226235]: 2026-01-31 07:51:42.596 226239 DEBUG oslo_concurrency.lockutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Releasing lock "refresh_cache-866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:51:42 np0005603623 nova_compute[226235]: 2026-01-31 07:51:42.597 226239 DEBUG nova.compute.manager [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:51:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:42.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:43.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:43 np0005603623 systemd[1]: var-lib-containers-storage-overlay-d23ee60fd1830d50ca50c85cdad2a63a22a2e4a796b6a00c5b251fa1fd800a52-merged.mount: Deactivated successfully.
Jan 31 02:51:43 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620-userdata-shm.mount: Deactivated successfully.
Jan 31 02:51:44 np0005603623 nova_compute[226235]: 2026-01-31 07:51:44.112 226239 DEBUG nova.compute.manager [req-cf564f84-d91a-4055-92c1-24abb90b3dbf req-4cdde6d6-68e3-4d4a-9ff6-1c4bb394c98a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:44 np0005603623 nova_compute[226235]: 2026-01-31 07:51:44.113 226239 DEBUG oslo_concurrency.lockutils [req-cf564f84-d91a-4055-92c1-24abb90b3dbf req-4cdde6d6-68e3-4d4a-9ff6-1c4bb394c98a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:44 np0005603623 nova_compute[226235]: 2026-01-31 07:51:44.113 226239 DEBUG oslo_concurrency.lockutils [req-cf564f84-d91a-4055-92c1-24abb90b3dbf req-4cdde6d6-68e3-4d4a-9ff6-1c4bb394c98a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:44 np0005603623 nova_compute[226235]: 2026-01-31 07:51:44.113 226239 DEBUG oslo_concurrency.lockutils [req-cf564f84-d91a-4055-92c1-24abb90b3dbf req-4cdde6d6-68e3-4d4a-9ff6-1c4bb394c98a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:44 np0005603623 nova_compute[226235]: 2026-01-31 07:51:44.114 226239 DEBUG nova.compute.manager [req-cf564f84-d91a-4055-92c1-24abb90b3dbf req-4cdde6d6-68e3-4d4a-9ff6-1c4bb394c98a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] No waiting events found dispatching network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:44 np0005603623 nova_compute[226235]: 2026-01-31 07:51:44.114 226239 WARNING nova.compute.manager [req-cf564f84-d91a-4055-92c1-24abb90b3dbf req-4cdde6d6-68e3-4d4a-9ff6-1c4bb394c98a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received unexpected event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:51:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:44.176 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:44 np0005603623 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 31 02:51:44 np0005603623 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000016.scope: Consumed 2.342s CPU time.
Jan 31 02:51:44 np0005603623 systemd-machined[194379]: Machine qemu-16-instance-00000016 terminated.
Jan 31 02:51:44 np0005603623 nova_compute[226235]: 2026-01-31 07:51:44.617 226239 INFO nova.virt.libvirt.driver [-] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance destroyed successfully.#033[00m
Jan 31 02:51:44 np0005603623 nova_compute[226235]: 2026-01-31 07:51:44.617 226239 DEBUG nova.objects.instance [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'resources' on Instance uuid 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:51:44 np0005603623 podman[238960]: 2026-01-31 07:51:44.638838587 +0000 UTC m=+3.081657488 container cleanup 0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 02:51:44 np0005603623 systemd[1]: libpod-conmon-0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620.scope: Deactivated successfully.
Jan 31 02:51:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:44.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:45.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:46 np0005603623 podman[239032]: 2026-01-31 07:51:46.394662972 +0000 UTC m=+1.738671577 container remove 0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.399 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f52a88f0-7ef4-48b8-a7d9-ce3aa7c9008e]: (4, ('Sat Jan 31 07:51:41 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 (0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620)\n0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620\nSat Jan 31 07:51:44 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 (0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620)\n0eb83e95ed694d047c8d6923aa9afbab83098c697f5a78f27e007163f07b8620\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.401 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[557dcfae-6919-4c9c-b0b1-77c0566d806c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.402 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap850ad6ca-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:46 np0005603623 nova_compute[226235]: 2026-01-31 07:51:46.403 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:46 np0005603623 kernel: tap850ad6ca-60: left promiscuous mode
Jan 31 02:51:46 np0005603623 nova_compute[226235]: 2026-01-31 07:51:46.410 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:46 np0005603623 nova_compute[226235]: 2026-01-31 07:51:46.412 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.413 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3c22ce17-7682-40e7-84b3-afba8d417dce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.429 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2877f95e-821d-4e74-a2c9-ab8dfa138126]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.430 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[79c23c44-ebbf-4d6a-ac1e-785f2ae8ac27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.441 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[436870eb-12e5-426d-8bbe-26a22fc43c1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489836, 'reachable_time': 42941, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239061, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.443 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:51:46 np0005603623 systemd[1]: run-netns-ovnmeta\x2d850ad6ca\x2d6166\x2d4382\x2d94bb\x2d4b7c10d9a136.mount: Deactivated successfully.
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.443 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[848de0c4-08b8-4911-b4d2-b526646f46dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.444 143258 INFO neutron.agent.ovn.metadata.agent [-] Port fc5261b7-0e3f-49d1-8fbf-8dcf40626991 in datapath c3613479-5299-41cd-b6dd-df1fae2ae862 unbound from our chassis#033[00m
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.446 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c3613479-5299-41cd-b6dd-df1fae2ae862, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.447 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dffbbfd8-55c1-457b-be60-3bf615ca2f40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:46.447 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862 namespace which is not needed anymore#033[00m
Jan 31 02:51:46 np0005603623 nova_compute[226235]: 2026-01-31 07:51:46.549 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845891.5487034, 4b48cc05-9edd-4e4d-a58e-84564afb0612 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:51:46 np0005603623 nova_compute[226235]: 2026-01-31 07:51:46.550 226239 INFO nova.compute.manager [-] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:51:46 np0005603623 nova_compute[226235]: 2026-01-31 07:51:46.597 226239 DEBUG nova.compute.manager [None req-4e6da4b8-9d1f-49e5-bdf4-0309e5eec3da - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:46.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:46 np0005603623 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[235429]: [NOTICE]   (235433) : haproxy version is 2.8.14-c23fe91
Jan 31 02:51:46 np0005603623 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[235429]: [NOTICE]   (235433) : path to executable is /usr/sbin/haproxy
Jan 31 02:51:46 np0005603623 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[235429]: [WARNING]  (235433) : Exiting Master process...
Jan 31 02:51:46 np0005603623 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[235429]: [WARNING]  (235433) : Exiting Master process...
Jan 31 02:51:46 np0005603623 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[235429]: [ALERT]    (235433) : Current worker (235435) exited with code 143 (Terminated)
Jan 31 02:51:46 np0005603623 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[235429]: [WARNING]  (235433) : All workers exited. Exiting... (0)
Jan 31 02:51:46 np0005603623 systemd[1]: libpod-ec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5.scope: Deactivated successfully.
Jan 31 02:51:46 np0005603623 podman[239080]: 2026-01-31 07:51:46.710218869 +0000 UTC m=+0.176815123 container died ec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:51:46 np0005603623 nova_compute[226235]: 2026-01-31 07:51:46.737 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:46 np0005603623 nova_compute[226235]: 2026-01-31 07:51:46.840 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:47 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5-userdata-shm.mount: Deactivated successfully.
Jan 31 02:51:47 np0005603623 systemd[1]: var-lib-containers-storage-overlay-c79bc6a3627715558b5cdba1eaa316f64550838ab27096df54011ee4d24ae108-merged.mount: Deactivated successfully.
Jan 31 02:51:47 np0005603623 podman[239080]: 2026-01-31 07:51:47.506655973 +0000 UTC m=+0.973252197 container cleanup ec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 02:51:47 np0005603623 systemd[1]: libpod-conmon-ec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5.scope: Deactivated successfully.
Jan 31 02:51:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:47.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:48 np0005603623 podman[239111]: 2026-01-31 07:51:48.654078827 +0000 UTC m=+1.133373783 container remove ec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 02:51:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:48.661 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c68e8120-1c6b-45b4-83de-455570916e9f]: (4, ('Sat Jan 31 07:51:46 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862 (ec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5)\nec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5\nSat Jan 31 07:51:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862 (ec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5)\nec60d423e16d0005c884bdbc814527297d99910c6bd8a56563a93c3ee9d773c5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:48.664 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[af3dc1b2-6be8-42d0-bc0b-57e8cfb057b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:48.665 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3613479-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:48 np0005603623 nova_compute[226235]: 2026-01-31 07:51:48.667 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:48 np0005603623 kernel: tapc3613479-50: left promiscuous mode
Jan 31 02:51:48 np0005603623 nova_compute[226235]: 2026-01-31 07:51:48.675 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:48.678 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a35b8fc2-c01f-4a14-93a6-4ea30568a950]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:48.691 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4615f712-0872-476e-ba36-4517aa46f26b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:48.693 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7e15fe-eba7-4348-8f52-2add6ed4ccc9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:48.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:48.710 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ae68f064-cc44-4a55-ba57-d52d32a07f97]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490063, 'reachable_time': 19010, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239257, 'error': None, 'target': 'ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:48 np0005603623 systemd[1]: run-netns-ovnmeta\x2dc3613479\x2d5299\x2d41cd\x2db6dd\x2ddf1fae2ae862.mount: Deactivated successfully.
Jan 31 02:51:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:48.714 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:51:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:51:48.714 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[8cd3f123-625d-41ac-9b72-b284ceb09dc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 e155: 3 total, 3 up, 3 in
Jan 31 02:51:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:49.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:51:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:51:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:50.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:51.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:51 np0005603623 nova_compute[226235]: 2026-01-31 07:51:51.739 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:51 np0005603623 nova_compute[226235]: 2026-01-31 07:51:51.841 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:51:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:51:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:51:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:51:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:51:52 np0005603623 podman[239288]: 2026-01-31 07:51:52.61317982 +0000 UTC m=+0.049379444 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 02:51:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:52 np0005603623 podman[239289]: 2026-01-31 07:51:52.647168979 +0000 UTC m=+0.078674396 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 02:51:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:52.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:53.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:51:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:54.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:51:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:55.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:56 np0005603623 nova_compute[226235]: 2026-01-31 07:51:56.702 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845901.701905, 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:51:56 np0005603623 nova_compute[226235]: 2026-01-31 07:51:56.703 226239 INFO nova.compute.manager [-] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:51:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:56.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:56 np0005603623 nova_compute[226235]: 2026-01-31 07:51:56.740 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:56 np0005603623 nova_compute[226235]: 2026-01-31 07:51:56.844 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:57.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:51:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:58.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:51:58 np0005603623 nova_compute[226235]: 2026-01-31 07:51:58.990 226239 INFO nova.virt.libvirt.driver [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Deleting instance files /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_del#033[00m
Jan 31 02:51:58 np0005603623 nova_compute[226235]: 2026-01-31 07:51:58.991 226239 INFO nova.virt.libvirt.driver [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Deletion of /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_del complete#033[00m
Jan 31 02:51:59 np0005603623 nova_compute[226235]: 2026-01-31 07:51:59.615 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845904.613609, 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:51:59 np0005603623 nova_compute[226235]: 2026-01-31 07:51:59.615 226239 INFO nova.compute.manager [-] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:51:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:51:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:59.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:00.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:01 np0005603623 nova_compute[226235]: 2026-01-31 07:52:01.358 226239 DEBUG nova.compute.manager [None req-d9c29cd4-411a-418d-a81a-7e7d1a5472ea - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:52:01 np0005603623 nova_compute[226235]: 2026-01-31 07:52:01.362 226239 DEBUG nova.compute.manager [None req-d9c29cd4-411a-418d-a81a-7e7d1a5472ea - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: deleting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:52:01 np0005603623 nova_compute[226235]: 2026-01-31 07:52:01.377 226239 DEBUG nova.compute.manager [None req-a46602fe-218a-4bdf-ad35-614198e84daf - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:52:01 np0005603623 nova_compute[226235]: 2026-01-31 07:52:01.380 226239 DEBUG nova.compute.manager [None req-a46602fe-218a-4bdf-ad35-614198e84daf - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: deleting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:52:01 np0005603623 nova_compute[226235]: 2026-01-31 07:52:01.441 226239 INFO nova.compute.manager [None req-d9c29cd4-411a-418d-a81a-7e7d1a5472ea - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 02:52:01 np0005603623 nova_compute[226235]: 2026-01-31 07:52:01.446 226239 INFO nova.compute.manager [None req-a46602fe-218a-4bdf-ad35-614198e84daf - - - - - -] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 02:52:01 np0005603623 nova_compute[226235]: 2026-01-31 07:52:01.461 226239 INFO nova.compute.manager [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Took 22.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:52:01 np0005603623 nova_compute[226235]: 2026-01-31 07:52:01.462 226239 DEBUG oslo.service.loopingcall [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:52:01 np0005603623 nova_compute[226235]: 2026-01-31 07:52:01.462 226239 DEBUG nova.compute.manager [-] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:52:01 np0005603623 nova_compute[226235]: 2026-01-31 07:52:01.462 226239 DEBUG nova.network.neutron [-] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:52:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:52:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:01.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:52:01 np0005603623 nova_compute[226235]: 2026-01-31 07:52:01.742 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:01 np0005603623 nova_compute[226235]: 2026-01-31 07:52:01.845 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:02.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:52:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:03.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:52:04 np0005603623 nova_compute[226235]: 2026-01-31 07:52:04.711 226239 DEBUG nova.network.neutron [-] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:52:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:04.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:04 np0005603623 nova_compute[226235]: 2026-01-31 07:52:04.760 226239 INFO nova.compute.manager [-] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Took 3.30 seconds to deallocate network for instance.#033[00m
Jan 31 02:52:04 np0005603623 nova_compute[226235]: 2026-01-31 07:52:04.845 226239 DEBUG oslo_concurrency.lockutils [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:04 np0005603623 nova_compute[226235]: 2026-01-31 07:52:04.846 226239 DEBUG oslo_concurrency.lockutils [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:04 np0005603623 nova_compute[226235]: 2026-01-31 07:52:04.943 226239 DEBUG oslo_concurrency.processutils [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:52:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:52:05 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/469675131' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:52:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:52:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:05.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:52:05 np0005603623 nova_compute[226235]: 2026-01-31 07:52:05.651 226239 DEBUG oslo_concurrency.processutils [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.708s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:52:05 np0005603623 nova_compute[226235]: 2026-01-31 07:52:05.658 226239 DEBUG nova.compute.provider_tree [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:52:05 np0005603623 nova_compute[226235]: 2026-01-31 07:52:05.679 226239 DEBUG nova.scheduler.client.report [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:52:05 np0005603623 nova_compute[226235]: 2026-01-31 07:52:05.730 226239 DEBUG oslo_concurrency.lockutils [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.884s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:05 np0005603623 nova_compute[226235]: 2026-01-31 07:52:05.795 226239 INFO nova.scheduler.client.report [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Deleted allocations for instance 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7#033[00m
Jan 31 02:52:05 np0005603623 nova_compute[226235]: 2026-01-31 07:52:05.940 226239 DEBUG oslo_concurrency.lockutils [None req-9cc8e9b0-b645-4032-bed3-0a1c093c3fa1 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 26.880s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:06.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:06 np0005603623 nova_compute[226235]: 2026-01-31 07:52:06.785 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:06 np0005603623 nova_compute[226235]: 2026-01-31 07:52:06.846 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:07.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:08.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:52:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:09.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:52:10 np0005603623 nova_compute[226235]: 2026-01-31 07:52:10.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:10 np0005603623 nova_compute[226235]: 2026-01-31 07:52:10.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:52:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:10.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:52:11 np0005603623 ceph-mds[84161]: mds.beacon.cephfs.compute-2.asgtzy missed beacon ack from the monitors
Jan 31 02:52:11 np0005603623 nova_compute[226235]: 2026-01-31 07:52:11.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:11 np0005603623 nova_compute[226235]: 2026-01-31 07:52:11.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:11.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:11 np0005603623 nova_compute[226235]: 2026-01-31 07:52:11.788 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:11 np0005603623 nova_compute[226235]: 2026-01-31 07:52:11.848 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.222 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.222 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.222 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.223 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.223 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:52:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:52:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1953331099' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.654 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:52:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:12.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.824 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.824 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.957 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.959 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4775MB free_disk=20.830753326416016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.959 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:12 np0005603623 nova_compute[226235]: 2026-01-31 07:52:12.959 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:13 np0005603623 nova_compute[226235]: 2026-01-31 07:52:13.129 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:52:13 np0005603623 nova_compute[226235]: 2026-01-31 07:52:13.130 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:52:13 np0005603623 nova_compute[226235]: 2026-01-31 07:52:13.130 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:52:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:52:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:13.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:52:13 np0005603623 nova_compute[226235]: 2026-01-31 07:52:13.785 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:52:14 np0005603623 nova_compute[226235]: 2026-01-31 07:52:14.006 226239 INFO nova.virt.libvirt.driver [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Deleting instance files /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_del#033[00m
Jan 31 02:52:14 np0005603623 nova_compute[226235]: 2026-01-31 07:52:14.007 226239 INFO nova.virt.libvirt.driver [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Deletion of /var/lib/nova/instances/866b0b10-d2ae-4e08-9efa-36b9c9c9f50d_del complete#033[00m
Jan 31 02:52:14 np0005603623 nova_compute[226235]: 2026-01-31 07:52:14.237 226239 INFO nova.compute.manager [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Took 31.64 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:52:14 np0005603623 nova_compute[226235]: 2026-01-31 07:52:14.237 226239 DEBUG oslo.service.loopingcall [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:52:14 np0005603623 nova_compute[226235]: 2026-01-31 07:52:14.237 226239 DEBUG nova.compute.manager [-] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:52:14 np0005603623 nova_compute[226235]: 2026-01-31 07:52:14.237 226239 DEBUG nova.network.neutron [-] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:52:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:52:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/711308837' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:52:14 np0005603623 nova_compute[226235]: 2026-01-31 07:52:14.395 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.609s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:52:14 np0005603623 nova_compute[226235]: 2026-01-31 07:52:14.399 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:52:14 np0005603623 nova_compute[226235]: 2026-01-31 07:52:14.443 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:52:14 np0005603623 nova_compute[226235]: 2026-01-31 07:52:14.550 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:52:14 np0005603623 nova_compute[226235]: 2026-01-31 07:52:14.550 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:14.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:15 np0005603623 nova_compute[226235]: 2026-01-31 07:52:15.029 226239 DEBUG nova.network.neutron [-] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:52:15 np0005603623 nova_compute[226235]: 2026-01-31 07:52:15.066 226239 DEBUG nova.network.neutron [-] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:52:15 np0005603623 nova_compute[226235]: 2026-01-31 07:52:15.127 226239 INFO nova.compute.manager [-] [instance: 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d] Took 0.89 seconds to deallocate network for instance.#033[00m
Jan 31 02:52:15 np0005603623 nova_compute[226235]: 2026-01-31 07:52:15.199 226239 DEBUG oslo_concurrency.lockutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:15 np0005603623 nova_compute[226235]: 2026-01-31 07:52:15.200 226239 DEBUG oslo_concurrency.lockutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:15 np0005603623 nova_compute[226235]: 2026-01-31 07:52:15.270 226239 DEBUG oslo_concurrency.processutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:52:15 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:52:15 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:52:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:15.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:52:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3515060684' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:52:15 np0005603623 nova_compute[226235]: 2026-01-31 07:52:15.850 226239 DEBUG oslo_concurrency.processutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.580s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:52:15 np0005603623 nova_compute[226235]: 2026-01-31 07:52:15.855 226239 DEBUG nova.compute.provider_tree [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:52:15 np0005603623 nova_compute[226235]: 2026-01-31 07:52:15.889 226239 DEBUG nova.scheduler.client.report [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.000 226239 DEBUG oslo_concurrency.lockutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.800s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.138 226239 INFO nova.scheduler.client.report [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Deleted allocations for instance 866b0b10-d2ae-4e08-9efa-36b9c9c9f50d#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.329 226239 DEBUG oslo_concurrency.lockutils [None req-2d90a2b3-6416-42ab-8a6e-e272cbe5d05e 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "866b0b10-d2ae-4e08-9efa-36b9c9c9f50d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 34.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.547 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.548 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.591 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.592 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.592 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.617 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.618 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.618 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.618 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:52:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:16.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.790 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:16 np0005603623 nova_compute[226235]: 2026-01-31 07:52:16.850 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:17.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:18.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:19.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:20.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:52:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:21.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:52:21 np0005603623 nova_compute[226235]: 2026-01-31 07:52:21.793 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:21 np0005603623 nova_compute[226235]: 2026-01-31 07:52:21.852 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:22.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:22 np0005603623 podman[239566]: 2026-01-31 07:52:22.966413056 +0000 UTC m=+0.059087220 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:52:22 np0005603623 podman[239567]: 2026-01-31 07:52:22.981352606 +0000 UTC m=+0.074477755 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller)
Jan 31 02:52:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:52:23.529 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:52:23 np0005603623 nova_compute[226235]: 2026-01-31 07:52:23.530 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:52:23.530 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:52:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:23.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:24.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:52:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1800.0 total, 600.0 interval
Cumulative writes: 5048 writes, 26K keys, 5048 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s
Cumulative WAL: 5048 writes, 5048 syncs, 1.00 writes per sync, written: 0.05 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1489 writes, 7202 keys, 1489 commit groups, 1.0 writes per commit group, ingest: 15.41 MB, 0.03 MB/s
Interval WAL: 1489 writes, 1489 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     36.5      0.84              0.08        14    0.060       0      0       0.0       0.0
  L6      1/0    8.85 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.6     64.2     53.6      2.03              0.26        13    0.156     61K   6814       0.0       0.0
 Sum      1/0    8.85 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.6     45.5     48.6      2.87              0.35        27    0.106     61K   6814       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.0       0.0   5.2     26.9     27.7      1.89              0.13        10    0.189     26K   2530       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     64.2     53.6      2.03              0.26        13    0.156     61K   6814       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     36.6      0.83              0.08        13    0.064       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1800.0 total, 600.0 interval
Flush(GB): cumulative 0.030, interval 0.010
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.14 GB write, 0.08 MB/s write, 0.13 GB read, 0.07 MB/s read, 2.9 seconds
Interval compaction: 0.05 GB write, 0.09 MB/s write, 0.05 GB read, 0.09 MB/s read, 1.9 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 12.32 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000155 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(712,11.82 MB,3.88962%) FilterBlock(27,177.42 KB,0.0569946%) IndexBlock(27,334.36 KB,0.107409%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Jan 31 02:52:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:52:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:25.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:52:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:26.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:26 np0005603623 nova_compute[226235]: 2026-01-31 07:52:26.795 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:26 np0005603623 nova_compute[226235]: 2026-01-31 07:52:26.854 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:27 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:52:27.533 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:52:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:27.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:28.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:29.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:52:30.086 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:52:30.087 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:52:30.087 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:30.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:31.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:31 np0005603623 nova_compute[226235]: 2026-01-31 07:52:31.797 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:31 np0005603623 nova_compute[226235]: 2026-01-31 07:52:31.855 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:32.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:33.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:34.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:35.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:36.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:36 np0005603623 nova_compute[226235]: 2026-01-31 07:52:36.799 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:36 np0005603623 nova_compute[226235]: 2026-01-31 07:52:36.858 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:52:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:37.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:52:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:38.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:39.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 02:52:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:40.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:41.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:41 np0005603623 nova_compute[226235]: 2026-01-31 07:52:41.800 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:41 np0005603623 nova_compute[226235]: 2026-01-31 07:52:41.859 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:42.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:52:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/609643398' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:52:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:52:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:43.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:52:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:52:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:44.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:52:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:45.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:46.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:46 np0005603623 nova_compute[226235]: 2026-01-31 07:52:46.803 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:46 np0005603623 nova_compute[226235]: 2026-01-31 07:52:46.861 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:47.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:48.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:49.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:50.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:52:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1800.1 total, 600.0 interval
Cumulative writes: 12K writes, 57K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s
Cumulative WAL: 12K writes, 3307 syncs, 3.89 writes per sync, written: 0.05 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 6975 writes, 32K keys, 6975 commit groups, 1.0 writes per commit group, ingest: 33.45 MB, 0.06 MB/s
Interval WAL: 6975 writes, 2289 syncs, 3.05 writes per sync, written: 0.03 GB, 0.06 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 02:52:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:51.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:51 np0005603623 nova_compute[226235]: 2026-01-31 07:52:51.805 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:51 np0005603623 nova_compute[226235]: 2026-01-31 07:52:51.864 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:52.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:52:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:53.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:52:54 np0005603623 podman[239674]: 2026-01-31 07:52:54.000285921 +0000 UTC m=+0.087308448 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 02:52:54 np0005603623 podman[239675]: 2026-01-31 07:52:54.00821266 +0000 UTC m=+0.096256089 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:52:54 np0005603623 ovn_controller[133449]: 2026-01-31T07:52:54Z|00142|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 02:52:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:54.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:55.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:56.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:56 np0005603623 nova_compute[226235]: 2026-01-31 07:52:56.807 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:56 np0005603623 nova_compute[226235]: 2026-01-31 07:52:56.866 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:57 np0005603623 ceph-mgr[77391]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 02:52:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:52:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:57.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:52:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:58.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:52:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:59.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:00.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:01.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:01 np0005603623 nova_compute[226235]: 2026-01-31 07:53:01.810 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:01 np0005603623 nova_compute[226235]: 2026-01-31 07:53:01.868 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:02.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:03.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:04.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:05.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:06.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:06 np0005603623 nova_compute[226235]: 2026-01-31 07:53:06.813 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:06 np0005603623 nova_compute[226235]: 2026-01-31 07:53:06.902 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:07.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:08.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:53:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:09.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:53:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e156 e156: 3 total, 3 up, 3 in
Jan 31 02:53:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:10.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:11 np0005603623 nova_compute[226235]: 2026-01-31 07:53:11.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:53:11 np0005603623 nova_compute[226235]: 2026-01-31 07:53:11.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:53:11 np0005603623 nova_compute[226235]: 2026-01-31 07:53:11.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:53:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:11.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:11 np0005603623 nova_compute[226235]: 2026-01-31 07:53:11.815 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:11 np0005603623 nova_compute[226235]: 2026-01-31 07:53:11.905 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:12.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:13 np0005603623 nova_compute[226235]: 2026-01-31 07:53:13.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:53:13 np0005603623 nova_compute[226235]: 2026-01-31 07:53:13.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:53:13 np0005603623 nova_compute[226235]: 2026-01-31 07:53:13.198 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:13 np0005603623 nova_compute[226235]: 2026-01-31 07:53:13.199 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:13 np0005603623 nova_compute[226235]: 2026-01-31 07:53:13.199 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:13 np0005603623 nova_compute[226235]: 2026-01-31 07:53:13.199 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:53:13 np0005603623 nova_compute[226235]: 2026-01-31 07:53:13.199 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:13.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:53:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/12496036' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:53:13 np0005603623 nova_compute[226235]: 2026-01-31 07:53:13.826 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.626s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:13 np0005603623 nova_compute[226235]: 2026-01-31 07:53:13.993 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:53:13 np0005603623 nova_compute[226235]: 2026-01-31 07:53:13.996 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4820MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:53:13 np0005603623 nova_compute[226235]: 2026-01-31 07:53:13.996 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:13 np0005603623 nova_compute[226235]: 2026-01-31 07:53:13.997 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:14 np0005603623 nova_compute[226235]: 2026-01-31 07:53:14.134 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:53:14 np0005603623 nova_compute[226235]: 2026-01-31 07:53:14.134 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:53:14 np0005603623 nova_compute[226235]: 2026-01-31 07:53:14.334 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:53:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3713112901' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:53:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:53:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3713112901' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:53:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:53:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1656863756' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:53:14 np0005603623 nova_compute[226235]: 2026-01-31 07:53:14.780 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:14 np0005603623 nova_compute[226235]: 2026-01-31 07:53:14.784 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:53:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:14.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:14 np0005603623 nova_compute[226235]: 2026-01-31 07:53:14.828 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:53:14 np0005603623 nova_compute[226235]: 2026-01-31 07:53:14.857 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:53:14 np0005603623 nova_compute[226235]: 2026-01-31 07:53:14.858 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.861s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:15.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:53:15.819 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:53:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:53:15.820 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:53:15 np0005603623 nova_compute[226235]: 2026-01-31 07:53:15.820 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:16.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:16 np0005603623 nova_compute[226235]: 2026-01-31 07:53:16.817 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:16 np0005603623 nova_compute[226235]: 2026-01-31 07:53:16.855 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:53:16 np0005603623 nova_compute[226235]: 2026-01-31 07:53:16.855 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:53:16 np0005603623 nova_compute[226235]: 2026-01-31 07:53:16.855 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:53:16 np0005603623 nova_compute[226235]: 2026-01-31 07:53:16.855 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:53:16 np0005603623 nova_compute[226235]: 2026-01-31 07:53:16.873 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:53:16 np0005603623 nova_compute[226235]: 2026-01-31 07:53:16.874 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:53:16 np0005603623 nova_compute[226235]: 2026-01-31 07:53:16.874 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:53:16 np0005603623 nova_compute[226235]: 2026-01-31 07:53:16.906 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:17 np0005603623 nova_compute[226235]: 2026-01-31 07:53:17.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:53:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:17.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:53:17.823 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:53:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:18.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 e157: 3 total, 3 up, 3 in
Jan 31 02:53:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:19.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:20.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:20 np0005603623 podman[240383]: 2026-01-31 07:53:20.709518875 +0000 UTC m=+0.019004179 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:53:21 np0005603623 podman[240383]: 2026-01-31 07:53:21.218714983 +0000 UTC m=+0.528200267 container create d42a9fa51ebb4f012fe262640b841af833c42ebd4072355a2724374e7c308533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_einstein, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:53:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:21 np0005603623 systemd[1]: Started libpod-conmon-d42a9fa51ebb4f012fe262640b841af833c42ebd4072355a2724374e7c308533.scope.
Jan 31 02:53:21 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:53:21 np0005603623 podman[240383]: 2026-01-31 07:53:21.582697822 +0000 UTC m=+0.892183126 container init d42a9fa51ebb4f012fe262640b841af833c42ebd4072355a2724374e7c308533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_einstein, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:53:21 np0005603623 podman[240383]: 2026-01-31 07:53:21.5895843 +0000 UTC m=+0.899069584 container start d42a9fa51ebb4f012fe262640b841af833c42ebd4072355a2724374e7c308533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_einstein, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True)
Jan 31 02:53:21 np0005603623 suspicious_einstein[240399]: 167 167
Jan 31 02:53:21 np0005603623 systemd[1]: libpod-d42a9fa51ebb4f012fe262640b841af833c42ebd4072355a2724374e7c308533.scope: Deactivated successfully.
Jan 31 02:53:21 np0005603623 podman[240383]: 2026-01-31 07:53:21.687069476 +0000 UTC m=+0.996554850 container attach d42a9fa51ebb4f012fe262640b841af833c42ebd4072355a2724374e7c308533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_einstein, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Jan 31 02:53:21 np0005603623 podman[240383]: 2026-01-31 07:53:21.688569804 +0000 UTC m=+0.998055148 container died d42a9fa51ebb4f012fe262640b841af833c42ebd4072355a2724374e7c308533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_einstein, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Jan 31 02:53:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:21.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:21 np0005603623 nova_compute[226235]: 2026-01-31 07:53:21.820 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:21 np0005603623 nova_compute[226235]: 2026-01-31 07:53:21.908 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:22 np0005603623 systemd[1]: var-lib-containers-storage-overlay-c551e0d682a63ab56d25d7cde571cac1108d8e1ba5a95fbf1120b5fe4f05fe06-merged.mount: Deactivated successfully.
Jan 31 02:53:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:22.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:23 np0005603623 podman[240383]: 2026-01-31 07:53:23.094971785 +0000 UTC m=+2.404457109 container remove d42a9fa51ebb4f012fe262640b841af833c42ebd4072355a2724374e7c308533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_einstein, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:53:23 np0005603623 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 31 02:53:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:23 np0005603623 systemd[1]: libpod-conmon-d42a9fa51ebb4f012fe262640b841af833c42ebd4072355a2724374e7c308533.scope: Deactivated successfully.
Jan 31 02:53:23 np0005603623 podman[240424]: 2026-01-31 07:53:23.226515633 +0000 UTC m=+0.019189955 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:53:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 02:53:23 np0005603623 podman[240424]: 2026-01-31 07:53:23.522901456 +0000 UTC m=+0.315575748 container create c6ec9f40df3eda3ab2e2becdd6c60f84d5d897560c7e64aa564e3143b5af36f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_kare, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:53:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:23.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:23 np0005603623 systemd[1]: Started libpod-conmon-c6ec9f40df3eda3ab2e2becdd6c60f84d5d897560c7e64aa564e3143b5af36f5.scope.
Jan 31 02:53:23 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:53:23 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7be407020e4d96340becb3c833864a4f8ddbf3d7f555010d8f7681adf9d4e798/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:53:23 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7be407020e4d96340becb3c833864a4f8ddbf3d7f555010d8f7681adf9d4e798/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:53:23 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7be407020e4d96340becb3c833864a4f8ddbf3d7f555010d8f7681adf9d4e798/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:53:23 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7be407020e4d96340becb3c833864a4f8ddbf3d7f555010d8f7681adf9d4e798/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:53:24 np0005603623 podman[240424]: 2026-01-31 07:53:24.091902925 +0000 UTC m=+0.884577247 container init c6ec9f40df3eda3ab2e2becdd6c60f84d5d897560c7e64aa564e3143b5af36f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_kare, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True)
Jan 31 02:53:24 np0005603623 podman[240424]: 2026-01-31 07:53:24.097402068 +0000 UTC m=+0.890076360 container start c6ec9f40df3eda3ab2e2becdd6c60f84d5d897560c7e64aa564e3143b5af36f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_kare, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 31 02:53:24 np0005603623 podman[240424]: 2026-01-31 07:53:24.321508618 +0000 UTC m=+1.114182910 container attach c6ec9f40df3eda3ab2e2becdd6c60f84d5d897560c7e64aa564e3143b5af36f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_kare, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2)
Jan 31 02:53:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:24.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:24 np0005603623 podman[240452]: 2026-01-31 07:53:24.958063593 +0000 UTC m=+0.051505302 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible)
Jan 31 02:53:24 np0005603623 podman[240454]: 2026-01-31 07:53:24.984835015 +0000 UTC m=+0.077326773 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]: [
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:    {
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:        "available": false,
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:        "ceph_device": false,
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:        "lsm_data": {},
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:        "lvs": [],
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:        "path": "/dev/sr0",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:        "rejected_reasons": [
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "Has a FileSystem",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "Insufficient space (<5GB)"
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:        ],
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:        "sys_api": {
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "actuators": null,
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "device_nodes": "sr0",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "devname": "sr0",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "human_readable_size": "482.00 KB",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "id_bus": "ata",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "model": "QEMU DVD-ROM",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "nr_requests": "2",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "parent": "/dev/sr0",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "partitions": {},
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "path": "/dev/sr0",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "removable": "1",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "rev": "2.5+",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "ro": "0",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "rotational": "1",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "sas_address": "",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "sas_device_handle": "",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "scheduler_mode": "mq-deadline",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "sectors": 0,
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "sectorsize": "2048",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "size": 493568.0,
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "support_discard": "2048",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "type": "disk",
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:            "vendor": "QEMU"
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:        }
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]:    }
Jan 31 02:53:25 np0005603623 mystifying_kare[240441]: ]
Jan 31 02:53:25 np0005603623 systemd[1]: libpod-c6ec9f40df3eda3ab2e2becdd6c60f84d5d897560c7e64aa564e3143b5af36f5.scope: Deactivated successfully.
Jan 31 02:53:25 np0005603623 systemd[1]: libpod-c6ec9f40df3eda3ab2e2becdd6c60f84d5d897560c7e64aa564e3143b5af36f5.scope: Consumed 1.096s CPU time.
Jan 31 02:53:25 np0005603623 podman[240424]: 2026-01-31 07:53:25.248973134 +0000 UTC m=+2.041647426 container died c6ec9f40df3eda3ab2e2becdd6c60f84d5d897560c7e64aa564e3143b5af36f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_kare, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 31 02:53:25 np0005603623 systemd[1]: var-lib-containers-storage-overlay-7be407020e4d96340becb3c833864a4f8ddbf3d7f555010d8f7681adf9d4e798-merged.mount: Deactivated successfully.
Jan 31 02:53:25 np0005603623 podman[240424]: 2026-01-31 07:53:25.601128092 +0000 UTC m=+2.393802384 container remove c6ec9f40df3eda3ab2e2becdd6c60f84d5d897560c7e64aa564e3143b5af36f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_kare, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:53:25 np0005603623 systemd[1]: libpod-conmon-c6ec9f40df3eda3ab2e2becdd6c60f84d5d897560c7e64aa564e3143b5af36f5.scope: Deactivated successfully.
Jan 31 02:53:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:25.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 02:53:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:26.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:26 np0005603623 nova_compute[226235]: 2026-01-31 07:53:26.824 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:26 np0005603623 nova_compute[226235]: 2026-01-31 07:53:26.911 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 02:53:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:53:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:53:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:27.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:28 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 31 02:53:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:28.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:28 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 31 02:53:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:29.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:53:30.087 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:53:30.087 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:53:30.087 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:30.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:31 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 31 02:53:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 31 02:53:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:31.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 31 02:53:31 np0005603623 nova_compute[226235]: 2026-01-31 07:53:31.827 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:31 np0005603623 nova_compute[226235]: 2026-01-31 07:53:31.913 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:32.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:33.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:34 np0005603623 ovn_controller[133449]: 2026-01-31T07:53:34Z|00143|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 02:53:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:34.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.112 226239 DEBUG oslo_concurrency.lockutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Acquiring lock "e6feb62a-1d4c-435a-a6c7-054e8da05258" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.113 226239 DEBUG oslo_concurrency.lockutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "e6feb62a-1d4c-435a-a6c7-054e8da05258" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.139 226239 DEBUG nova.compute.manager [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.293 226239 DEBUG oslo_concurrency.lockutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.294 226239 DEBUG oslo_concurrency.lockutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.302 226239 DEBUG nova.virt.hardware [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.303 226239 INFO nova.compute.claims [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.467 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:35.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:53:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/610503336' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.894 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.899 226239 DEBUG nova.compute.provider_tree [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.920 226239 DEBUG nova.scheduler.client.report [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.952 226239 DEBUG oslo_concurrency.lockutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:53:35 np0005603623 nova_compute[226235]: 2026-01-31 07:53:35.953 226239 DEBUG nova.compute.manager [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 02:53:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.031 226239 DEBUG nova.compute.manager [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.031 226239 DEBUG nova.network.neutron [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.061 226239 INFO nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.096 226239 DEBUG nova.compute.manager [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.228 226239 DEBUG nova.compute.manager [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.230 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.230 226239 INFO nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Creating image(s)
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.253 226239 DEBUG nova.storage.rbd_utils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] rbd image e6feb62a-1d4c-435a-a6c7-054e8da05258_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.367 226239 DEBUG nova.storage.rbd_utils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] rbd image e6feb62a-1d4c-435a-a6c7-054e8da05258_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.395 226239 DEBUG nova.storage.rbd_utils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] rbd image e6feb62a-1d4c-435a-a6c7-054e8da05258_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.398 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.448 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
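Nova runs `qemu-img info` under `oslo_concurrency.prlimit` (the `--as`/`--cpu` flags cap address space and CPU time, guarding against malformed images) and then parses the JSON it prints. A sketch of the parsing step; the payload below is hypothetical and trimmed, though the field names match qemu-img's JSON output:

```python
import json

# Hypothetical, trimmed `qemu-img info --output=json` payload for the cached
# base image; values are illustrative only.
sample = """{
  "virtual-size": 117440512,
  "format": "raw",
  "actual-size": 22151168,
  "filename": "/var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6"
}"""

info = json.loads(sample)
virtual_mib = info["virtual-size"] // (1024 * 1024)  # bytes -> MiB
```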
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.449 226239 DEBUG oslo_concurrency.lockutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.449 226239 DEBUG oslo_concurrency.lockutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.450 226239 DEBUG oslo_concurrency.lockutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.515 226239 DEBUG nova.storage.rbd_utils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] rbd image e6feb62a-1d4c-435a-a6c7-054e8da05258_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.521 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e6feb62a-1d4c-435a-a6c7-054e8da05258_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.647 226239 DEBUG nova.network.neutron [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.647 226239 DEBUG nova.compute.manager [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 02:53:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:36.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.829 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:36 np0005603623 nova_compute[226235]: 2026-01-31 07:53:36.914 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:37.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:38 np0005603623 nova_compute[226235]: 2026-01-31 07:53:38.264 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e6feb62a-1d4c-435a-a6c7-054e8da05258_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.744s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:53:38 np0005603623 nova_compute[226235]: 2026-01-31 07:53:38.375 226239 DEBUG nova.storage.rbd_utils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] resizing rbd image e6feb62a-1d4c-435a-a6c7-054e8da05258_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
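The resize target logged here, 1073741824 bytes, is the flavor's root disk (`root_gb=1` for m1.nano, visible in the flavor dump later in this log) converted to bytes: after importing the base image into the `vms` pool, Nova grows the RBD image to the flavor size. A sketch of that sizing decision (the helper name is hypothetical):

```python
GiB = 1024 ** 3

def target_disk_bytes(root_gb, base_virtual_size):
    """Grow the imported image to the flavor's root disk in bytes,
    never shrinking below the base image's virtual size."""
    return max(root_gb * GiB, base_virtual_size)
```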
Jan 31 02:53:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:38.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.093 226239 DEBUG nova.objects.instance [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lazy-loading 'migration_context' on Instance uuid e6feb62a-1d4c-435a-a6c7-054e8da05258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.120 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.120 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Ensure instance console log exists: /var/lib/nova/instances/e6feb62a-1d4c-435a-a6c7-054e8da05258/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.121 226239 DEBUG oslo_concurrency.lockutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.122 226239 DEBUG oslo_concurrency.lockutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.122 226239 DEBUG oslo_concurrency.lockutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.123 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.128 226239 WARNING nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.133 226239 DEBUG nova.virt.libvirt.host [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.134 226239 DEBUG nova.virt.libvirt.host [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.137 226239 DEBUG nova.virt.libvirt.host [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.138 226239 DEBUG nova.virt.libvirt.host [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.140 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.140 226239 DEBUG nova.virt.hardware [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.140 226239 DEBUG nova.virt.hardware [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.141 226239 DEBUG nova.virt.hardware [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.141 226239 DEBUG nova.virt.hardware [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.141 226239 DEBUG nova.virt.hardware [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.141 226239 DEBUG nova.virt.hardware [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.141 226239 DEBUG nova.virt.hardware [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.141 226239 DEBUG nova.virt.hardware [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.142 226239 DEBUG nova.virt.hardware [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.142 226239 DEBUG nova.virt.hardware [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.142 226239 DEBUG nova.virt.hardware [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
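The "possible topologies" step above enumerates sockets/cores/threads triples whose product equals the flavor's vCPU count, within the (here unconstrained) 65536 limits; with one vCPU only 1:1:1 survives. A simplified sketch of that enumeration (the real logic in `nova.virt.hardware` applies more constraints and preference sorting):

```python
# Enumerate (sockets, cores, threads) triples whose product equals the
# vCPU count, subject to per-dimension maxima.
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    topos.append((s, c, t))
    return topos
```

For `vcpus=1` this yields exactly one topology, matching the logged `VirtCPUTopology(cores=1,sockets=1,threads=1)`.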
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.145 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:53:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:53:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1234293031' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:53:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:39.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.801 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.655s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.834 226239 DEBUG nova.storage.rbd_utils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] rbd image e6feb62a-1d4c-435a-a6c7-054e8da05258_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:53:39 np0005603623 nova_compute[226235]: 2026-01-31 07:53:39.839 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:53:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:53:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1572721269' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:53:40 np0005603623 nova_compute[226235]: 2026-01-31 07:53:40.510 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.671s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:53:40 np0005603623 nova_compute[226235]: 2026-01-31 07:53:40.513 226239 DEBUG nova.objects.instance [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lazy-loading 'pci_devices' on Instance uuid e6feb62a-1d4c-435a-a6c7-054e8da05258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
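The `ceph mon dump --format=json` output fetched above is where the three monitor addresses come from that appear as `<host>` elements in the RBD disk sections of the guest XML that follows. A sketch of extracting those (ip, port) pairs; the payload here is hypothetical and heavily trimmed relative to a real mon dump:

```python
import json

# Hypothetical, trimmed `ceph mon dump --format=json` payload; a real dump
# carries an fsid, quorum info, and more fields per monitor.
mon_dump = json.loads("""
{"epoch": 3,
 "mons": [
   {"name": "compute-0", "public_addr": "192.168.122.100:6789/0"},
   {"name": "compute-2", "public_addr": "192.168.122.102:6789/0"},
   {"name": "compute-1", "public_addr": "192.168.122.101:6789/0"}]}
""")

def mon_hosts(dump):
    # split each "ip:port/nonce" address into an (ip, port) pair,
    # the shape needed for libvirt's <host name=... port=...> list
    pairs = []
    for mon in dump["mons"]:
        addr = mon["public_addr"].split("/")[0]
        ip, port = addr.rsplit(":", 1)
        pairs.append((ip, port))
    return pairs

hosts = mon_hosts(mon_dump)
```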
Jan 31 02:53:40 np0005603623 nova_compute[226235]: 2026-01-31 07:53:40.540 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  <uuid>e6feb62a-1d4c-435a-a6c7-054e8da05258</uuid>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  <name>instance-00000019</name>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerDiagnosticsTest-server-295987811</nova:name>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:53:39</nova:creationTime>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <nova:user uuid="5baed7ec51044fc6859b3df3fe9c4bdd">tempest-ServerDiagnosticsTest-690697880-project-member</nova:user>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <nova:project uuid="9b5dbf38b98c4018b86f5dccf80a9f30">tempest-ServerDiagnosticsTest-690697880</nova:project>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <entry name="serial">e6feb62a-1d4c-435a-a6c7-054e8da05258</entry>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <entry name="uuid">e6feb62a-1d4c-435a-a6c7-054e8da05258</entry>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/e6feb62a-1d4c-435a-a6c7-054e8da05258_disk">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/e6feb62a-1d4c-435a-a6c7-054e8da05258_disk.config">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/e6feb62a-1d4c-435a-a6c7-054e8da05258/console.log" append="off"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:53:40 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:53:40 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:53:40 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:53:40 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 02:53:40 np0005603623 nova_compute[226235]: 2026-01-31 07:53:40.707 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:53:40 np0005603623 nova_compute[226235]: 2026-01-31 07:53:40.708 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:53:40 np0005603623 nova_compute[226235]: 2026-01-31 07:53:40.708 226239 INFO nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Using config drive
Jan 31 02:53:40 np0005603623 nova_compute[226235]: 2026-01-31 07:53:40.736 226239 DEBUG nova.storage.rbd_utils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] rbd image e6feb62a-1d4c-435a-a6c7-054e8da05258_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:53:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:40.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:41 np0005603623 nova_compute[226235]: 2026-01-31 07:53:41.061 226239 INFO nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Creating config drive at /var/lib/nova/instances/e6feb62a-1d4c-435a-a6c7-054e8da05258/disk.config
Jan 31 02:53:41 np0005603623 nova_compute[226235]: 2026-01-31 07:53:41.065 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e6feb62a-1d4c-435a-a6c7-054e8da05258/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp_xtx81g_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:53:41 np0005603623 nova_compute[226235]: 2026-01-31 07:53:41.184 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e6feb62a-1d4c-435a-a6c7-054e8da05258/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp_xtx81g_" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:53:41 np0005603623 nova_compute[226235]: 2026-01-31 07:53:41.214 226239 DEBUG nova.storage.rbd_utils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] rbd image e6feb62a-1d4c-435a-a6c7-054e8da05258_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:53:41 np0005603623 nova_compute[226235]: 2026-01-31 07:53:41.218 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e6feb62a-1d4c-435a-a6c7-054e8da05258/disk.config e6feb62a-1d4c-435a-a6c7-054e8da05258_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:53:41 np0005603623 nova_compute[226235]: 2026-01-31 07:53:41.727 226239 DEBUG oslo_concurrency.processutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e6feb62a-1d4c-435a-a6c7-054e8da05258/disk.config e6feb62a-1d4c-435a-a6c7-054e8da05258_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:53:41 np0005603623 nova_compute[226235]: 2026-01-31 07:53:41.727 226239 INFO nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Deleting local config drive /var/lib/nova/instances/e6feb62a-1d4c-435a-a6c7-054e8da05258/disk.config because it was imported into RBD.
Jan 31 02:53:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:41.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:41 np0005603623 systemd-machined[194379]: New machine qemu-17-instance-00000019.
Jan 31 02:53:41 np0005603623 systemd[1]: Started Virtual Machine qemu-17-instance-00000019.
Jan 31 02:53:41 np0005603623 nova_compute[226235]: 2026-01-31 07:53:41.832 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:41 np0005603623 nova_compute[226235]: 2026-01-31 07:53:41.917 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.518 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846022.5177774, e6feb62a-1d4c-435a-a6c7-054e8da05258 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.518 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] VM Resumed (Lifecycle Event)
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.521 226239 DEBUG nova.compute.manager [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.522 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.526 226239 INFO nova.virt.libvirt.driver [-] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Instance spawned successfully.
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.526 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.552 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.566 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.571 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.571 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.572 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.572 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.572 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.573 226239 DEBUG nova.virt.libvirt.driver [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.615 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.616 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846022.5207613, e6feb62a-1d4c-435a-a6c7-054e8da05258 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.616 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] VM Started (Lifecycle Event)
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.650 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.654 226239 INFO nova.compute.manager [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Took 6.42 seconds to spawn the instance on the hypervisor.
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.654 226239 DEBUG nova.compute.manager [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.655 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.692 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.740 226239 INFO nova.compute.manager [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Took 7.49 seconds to build instance.
Jan 31 02:53:42 np0005603623 nova_compute[226235]: 2026-01-31 07:53:42.777 226239 DEBUG oslo_concurrency.lockutils [None req-4902b5c1-ba9f-49b7-a16f-28d1457a297f 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "e6feb62a-1d4c-435a-a6c7-054e8da05258" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:53:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:42.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:43.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:44.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:45 np0005603623 nova_compute[226235]: 2026-01-31 07:53:45.107 226239 DEBUG nova.compute.manager [None req-2d7109e6-e072-459b-b287-13bd664a75e2 31d932b5f41944cca2bc3a967b02a4ce 3a9450c40fc14e10acd79c5e479c398e - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:53:45 np0005603623 nova_compute[226235]: 2026-01-31 07:53:45.113 226239 INFO nova.compute.manager [None req-2d7109e6-e072-459b-b287-13bd664a75e2 31d932b5f41944cca2bc3a967b02a4ce 3a9450c40fc14e10acd79c5e479c398e - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Retrieving diagnostics
Jan 31 02:53:45 np0005603623 nova_compute[226235]: 2026-01-31 07:53:45.520 226239 DEBUG oslo_concurrency.lockutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Acquiring lock "e6feb62a-1d4c-435a-a6c7-054e8da05258" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:53:45 np0005603623 nova_compute[226235]: 2026-01-31 07:53:45.520 226239 DEBUG oslo_concurrency.lockutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "e6feb62a-1d4c-435a-a6c7-054e8da05258" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:53:45 np0005603623 nova_compute[226235]: 2026-01-31 07:53:45.521 226239 DEBUG oslo_concurrency.lockutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Acquiring lock "e6feb62a-1d4c-435a-a6c7-054e8da05258-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:53:45 np0005603623 nova_compute[226235]: 2026-01-31 07:53:45.521 226239 DEBUG oslo_concurrency.lockutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "e6feb62a-1d4c-435a-a6c7-054e8da05258-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:53:45 np0005603623 nova_compute[226235]: 2026-01-31 07:53:45.521 226239 DEBUG oslo_concurrency.lockutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "e6feb62a-1d4c-435a-a6c7-054e8da05258-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:53:45 np0005603623 nova_compute[226235]: 2026-01-31 07:53:45.522 226239 INFO nova.compute.manager [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Terminating instance
Jan 31 02:53:45 np0005603623 nova_compute[226235]: 2026-01-31 07:53:45.523 226239 DEBUG oslo_concurrency.lockutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Acquiring lock "refresh_cache-e6feb62a-1d4c-435a-a6c7-054e8da05258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:53:45 np0005603623 nova_compute[226235]: 2026-01-31 07:53:45.523 226239 DEBUG oslo_concurrency.lockutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Acquired lock "refresh_cache-e6feb62a-1d4c-435a-a6c7-054e8da05258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 02:53:45 np0005603623 nova_compute[226235]: 2026-01-31 07:53:45.523 226239 DEBUG nova.network.neutron [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 02:53:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:45.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:45 np0005603623 nova_compute[226235]: 2026-01-31 07:53:45.795 226239 DEBUG nova.network.neutron [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 02:53:46 np0005603623 nova_compute[226235]: 2026-01-31 07:53:46.397 226239 DEBUG nova.network.neutron [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:53:46 np0005603623 nova_compute[226235]: 2026-01-31 07:53:46.425 226239 DEBUG oslo_concurrency.lockutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Releasing lock "refresh_cache-e6feb62a-1d4c-435a-a6c7-054e8da05258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 02:53:46 np0005603623 nova_compute[226235]: 2026-01-31 07:53:46.426 226239 DEBUG nova.compute.manager [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 02:53:46 np0005603623 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 31 02:53:46 np0005603623 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000019.scope: Consumed 4.472s CPU time.
Jan 31 02:53:46 np0005603623 systemd-machined[194379]: Machine qemu-17-instance-00000019 terminated.
Jan 31 02:53:46 np0005603623 nova_compute[226235]: 2026-01-31 07:53:46.640 226239 INFO nova.virt.libvirt.driver [-] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Instance destroyed successfully.
Jan 31 02:53:46 np0005603623 nova_compute[226235]: 2026-01-31 07:53:46.641 226239 DEBUG nova.objects.instance [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lazy-loading 'resources' on Instance uuid e6feb62a-1d4c-435a-a6c7-054e8da05258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:53:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:46.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:46 np0005603623 nova_compute[226235]: 2026-01-31 07:53:46.834 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:46 np0005603623 nova_compute[226235]: 2026-01-31 07:53:46.919 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:47.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:48 np0005603623 nova_compute[226235]: 2026-01-31 07:53:48.372 226239 INFO nova.virt.libvirt.driver [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Deleting instance files /var/lib/nova/instances/e6feb62a-1d4c-435a-a6c7-054e8da05258_del#033[00m
Jan 31 02:53:48 np0005603623 nova_compute[226235]: 2026-01-31 07:53:48.373 226239 INFO nova.virt.libvirt.driver [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Deletion of /var/lib/nova/instances/e6feb62a-1d4c-435a-a6c7-054e8da05258_del complete#033[00m
Jan 31 02:53:48 np0005603623 nova_compute[226235]: 2026-01-31 07:53:48.432 226239 INFO nova.compute.manager [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Took 2.01 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:53:48 np0005603623 nova_compute[226235]: 2026-01-31 07:53:48.433 226239 DEBUG oslo.service.loopingcall [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:53:48 np0005603623 nova_compute[226235]: 2026-01-31 07:53:48.433 226239 DEBUG nova.compute.manager [-] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:53:48 np0005603623 nova_compute[226235]: 2026-01-31 07:53:48.433 226239 DEBUG nova.network.neutron [-] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:53:48 np0005603623 nova_compute[226235]: 2026-01-31 07:53:48.778 226239 DEBUG nova.network.neutron [-] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:53:48 np0005603623 nova_compute[226235]: 2026-01-31 07:53:48.812 226239 DEBUG nova.network.neutron [-] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:53:48 np0005603623 nova_compute[226235]: 2026-01-31 07:53:48.831 226239 INFO nova.compute.manager [-] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Took 0.40 seconds to deallocate network for instance.#033[00m
Jan 31 02:53:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:48.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:48 np0005603623 nova_compute[226235]: 2026-01-31 07:53:48.876 226239 DEBUG oslo_concurrency.lockutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:48 np0005603623 nova_compute[226235]: 2026-01-31 07:53:48.876 226239 DEBUG oslo_concurrency.lockutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:48 np0005603623 nova_compute[226235]: 2026-01-31 07:53:48.960 226239 DEBUG oslo_concurrency.processutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:53:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3539145662' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:53:49 np0005603623 nova_compute[226235]: 2026-01-31 07:53:49.469 226239 DEBUG oslo_concurrency.processutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:49 np0005603623 nova_compute[226235]: 2026-01-31 07:53:49.474 226239 DEBUG nova.compute.provider_tree [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:53:49 np0005603623 nova_compute[226235]: 2026-01-31 07:53:49.510 226239 DEBUG nova.scheduler.client.report [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:53:49 np0005603623 nova_compute[226235]: 2026-01-31 07:53:49.542 226239 DEBUG oslo_concurrency.lockutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:49 np0005603623 nova_compute[226235]: 2026-01-31 07:53:49.577 226239 INFO nova.scheduler.client.report [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Deleted allocations for instance e6feb62a-1d4c-435a-a6c7-054e8da05258#033[00m
Jan 31 02:53:49 np0005603623 nova_compute[226235]: 2026-01-31 07:53:49.676 226239 DEBUG oslo_concurrency.lockutils [None req-3dfbb5f4-5c98-4ac2-b436-c198673761d2 5baed7ec51044fc6859b3df3fe9c4bdd 9b5dbf38b98c4018b86f5dccf80a9f30 - - default default] Lock "e6feb62a-1d4c-435a-a6c7-054e8da05258" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:49.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:50.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:51.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:51 np0005603623 nova_compute[226235]: 2026-01-31 07:53:51.836 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:51 np0005603623 nova_compute[226235]: 2026-01-31 07:53:51.920 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:52.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:53 np0005603623 nova_compute[226235]: 2026-01-31 07:53:53.434 226239 DEBUG oslo_concurrency.lockutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Acquiring lock "33d8d2ee-7d2a-4973-a7da-7f86f14f5f87" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:53 np0005603623 nova_compute[226235]: 2026-01-31 07:53:53.434 226239 DEBUG oslo_concurrency.lockutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "33d8d2ee-7d2a-4973-a7da-7f86f14f5f87" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:53 np0005603623 nova_compute[226235]: 2026-01-31 07:53:53.456 226239 DEBUG nova.compute.manager [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:53:53 np0005603623 nova_compute[226235]: 2026-01-31 07:53:53.580 226239 DEBUG oslo_concurrency.lockutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:53 np0005603623 nova_compute[226235]: 2026-01-31 07:53:53.580 226239 DEBUG oslo_concurrency.lockutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:53 np0005603623 nova_compute[226235]: 2026-01-31 07:53:53.590 226239 DEBUG nova.virt.hardware [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:53:53 np0005603623 nova_compute[226235]: 2026-01-31 07:53:53.591 226239 INFO nova.compute.claims [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:53:53 np0005603623 nova_compute[226235]: 2026-01-31 07:53:53.703 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.003000095s ======
Jan 31 02:53:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:53.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 31 02:53:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:53:54 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2335444646' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.184 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.190 226239 DEBUG nova.compute.provider_tree [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.213 226239 DEBUG nova.scheduler.client.report [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.249 226239 DEBUG oslo_concurrency.lockutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.250 226239 DEBUG nova.compute.manager [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.349 226239 DEBUG nova.compute.manager [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.349 226239 DEBUG nova.network.neutron [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.425 226239 INFO nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.471 226239 DEBUG nova.compute.manager [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.629 226239 DEBUG nova.compute.manager [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.630 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.631 226239 INFO nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Creating image(s)#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.661 226239 DEBUG nova.storage.rbd_utils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] rbd image 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.692 226239 DEBUG nova.storage.rbd_utils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] rbd image 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.723 226239 DEBUG nova.storage.rbd_utils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] rbd image 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.727 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.778 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.779 226239 DEBUG oslo_concurrency.lockutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.780 226239 DEBUG oslo_concurrency.lockutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.780 226239 DEBUG oslo_concurrency.lockutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.805 226239 DEBUG nova.storage.rbd_utils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] rbd image 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:54 np0005603623 nova_compute[226235]: 2026-01-31 07:53:54.810 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:54.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:55 np0005603623 podman[242353]: 2026-01-31 07:53:55.239593771 +0000 UTC m=+0.061768454 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 02:53:55 np0005603623 nova_compute[226235]: 2026-01-31 07:53:55.265 226239 DEBUG nova.network.neutron [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 31 02:53:55 np0005603623 nova_compute[226235]: 2026-01-31 07:53:55.267 226239 DEBUG nova.compute.manager [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:53:55 np0005603623 podman[242354]: 2026-01-31 07:53:55.278342129 +0000 UTC m=+0.100016786 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 31 02:53:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:55.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.555 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.745s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.633 226239 DEBUG nova.storage.rbd_utils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] resizing rbd image 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.798 226239 DEBUG nova.objects.instance [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lazy-loading 'migration_context' on Instance uuid 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.836 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.836 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Ensure instance console log exists: /var/lib/nova/instances/33d8d2ee-7d2a-4973-a7da-7f86f14f5f87/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.837 226239 DEBUG oslo_concurrency.lockutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.837 226239 DEBUG oslo_concurrency.lockutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.837 226239 DEBUG oslo_concurrency.lockutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.839 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.839 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:56.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.844 226239 WARNING nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.863 226239 DEBUG nova.virt.libvirt.host [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.864 226239 DEBUG nova.virt.libvirt.host [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.868 226239 DEBUG nova.virt.libvirt.host [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.868 226239 DEBUG nova.virt.libvirt.host [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.869 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.870 226239 DEBUG nova.virt.hardware [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.870 226239 DEBUG nova.virt.hardware [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.870 226239 DEBUG nova.virt.hardware [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.871 226239 DEBUG nova.virt.hardware [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.871 226239 DEBUG nova.virt.hardware [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.871 226239 DEBUG nova.virt.hardware [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.871 226239 DEBUG nova.virt.hardware [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.872 226239 DEBUG nova.virt.hardware [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.872 226239 DEBUG nova.virt.hardware [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.872 226239 DEBUG nova.virt.hardware [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.872 226239 DEBUG nova.virt.hardware [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.875 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:56 np0005603623 nova_compute[226235]: 2026-01-31 07:53:56.923 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:53:57 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2807047324' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:53:57 np0005603623 nova_compute[226235]: 2026-01-31 07:53:57.558 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.683s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:57 np0005603623 nova_compute[226235]: 2026-01-31 07:53:57.582 226239 DEBUG nova.storage.rbd_utils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] rbd image 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:57 np0005603623 nova_compute[226235]: 2026-01-31 07:53:57.586 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:53:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:57.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:53:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:53:58 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2237304547' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:53:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:58 np0005603623 nova_compute[226235]: 2026-01-31 07:53:58.133 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.547s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:58 np0005603623 nova_compute[226235]: 2026-01-31 07:53:58.135 226239 DEBUG nova.objects.instance [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lazy-loading 'pci_devices' on Instance uuid 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:53:58 np0005603623 nova_compute[226235]: 2026-01-31 07:53:58.161 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  <uuid>33d8d2ee-7d2a-4973-a7da-7f86f14f5f87</uuid>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  <name>instance-0000001c</name>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersAdminNegativeTestJSON-server-940384285</nova:name>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:53:56</nova:creationTime>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <nova:user uuid="0ee689c4c14744fb8b4e1d54f6831626">tempest-ServersAdminNegativeTestJSON-1352127112-project-member</nova:user>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <nova:project uuid="f63afec818164f31a848360151f96a68">tempest-ServersAdminNegativeTestJSON-1352127112</nova:project>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <entry name="serial">33d8d2ee-7d2a-4973-a7da-7f86f14f5f87</entry>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <entry name="uuid">33d8d2ee-7d2a-4973-a7da-7f86f14f5f87</entry>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk.config">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/33d8d2ee-7d2a-4973-a7da-7f86f14f5f87/console.log" append="off"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:53:58 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:53:58 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:53:58 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:53:58 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:53:58 np0005603623 nova_compute[226235]: 2026-01-31 07:53:58.249 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:53:58 np0005603623 nova_compute[226235]: 2026-01-31 07:53:58.250 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:53:58 np0005603623 nova_compute[226235]: 2026-01-31 07:53:58.250 226239 INFO nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Using config drive#033[00m
Jan 31 02:53:58 np0005603623 nova_compute[226235]: 2026-01-31 07:53:58.275 226239 DEBUG nova.storage.rbd_utils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] rbd image 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:58 np0005603623 nova_compute[226235]: 2026-01-31 07:53:58.558 226239 INFO nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Creating config drive at /var/lib/nova/instances/33d8d2ee-7d2a-4973-a7da-7f86f14f5f87/disk.config#033[00m
Jan 31 02:53:58 np0005603623 nova_compute[226235]: 2026-01-31 07:53:58.562 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/33d8d2ee-7d2a-4973-a7da-7f86f14f5f87/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpmrdb58cm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:58 np0005603623 nova_compute[226235]: 2026-01-31 07:53:58.680 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/33d8d2ee-7d2a-4973-a7da-7f86f14f5f87/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpmrdb58cm" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:58 np0005603623 nova_compute[226235]: 2026-01-31 07:53:58.802 226239 DEBUG nova.storage.rbd_utils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] rbd image 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:58 np0005603623 nova_compute[226235]: 2026-01-31 07:53:58.806 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/33d8d2ee-7d2a-4973-a7da-7f86f14f5f87/disk.config 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:58.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:53:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:59.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:00.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:01 np0005603623 nova_compute[226235]: 2026-01-31 07:54:01.640 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846026.6383953, e6feb62a-1d4c-435a-a6c7-054e8da05258 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:54:01 np0005603623 nova_compute[226235]: 2026-01-31 07:54:01.640 226239 INFO nova.compute.manager [-] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] VM Stopped (Lifecycle Event)
Jan 31 02:54:01 np0005603623 nova_compute[226235]: 2026-01-31 07:54:01.662 226239 DEBUG nova.compute.manager [None req-f49bfdfb-6939-4aff-a5fa-e2406c8a5a21 - - - - - -] [instance: e6feb62a-1d4c-435a-a6c7-054e8da05258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:54:01 np0005603623 nova_compute[226235]: 2026-01-31 07:54:01.841 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:54:01 np0005603623 nova_compute[226235]: 2026-01-31 07:54:01.871 226239 DEBUG oslo_concurrency.processutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/33d8d2ee-7d2a-4973-a7da-7f86f14f5f87/disk.config 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:54:01 np0005603623 nova_compute[226235]: 2026-01-31 07:54:01.872 226239 INFO nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Deleting local config drive /var/lib/nova/instances/33d8d2ee-7d2a-4973-a7da-7f86f14f5f87/disk.config because it was imported into RBD.
Jan 31 02:54:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:01.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:01 np0005603623 systemd-machined[194379]: New machine qemu-18-instance-0000001c.
Jan 31 02:54:01 np0005603623 nova_compute[226235]: 2026-01-31 07:54:01.924 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:54:01 np0005603623 systemd[1]: Started Virtual Machine qemu-18-instance-0000001c.
Jan 31 02:54:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:02.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:02.887 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 02:54:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:02.887 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 02:54:02 np0005603623 nova_compute[226235]: 2026-01-31 07:54:02.920 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:54:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.286 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846043.2858694, 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.286 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] VM Resumed (Lifecycle Event)
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.288 226239 DEBUG nova.compute.manager [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.288 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.291 226239 INFO nova.virt.libvirt.driver [-] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Instance spawned successfully.
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.292 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.328 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.335 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.342 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.343 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.343 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.344 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.344 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.345 226239 DEBUG nova.virt.libvirt.driver [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.355 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.356 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846043.2880266, 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.357 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] VM Started (Lifecycle Event)
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.382 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.387 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.412 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.419 226239 INFO nova.compute.manager [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Took 8.79 seconds to spawn the instance on the hypervisor.
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.420 226239 DEBUG nova.compute.manager [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.487 226239 INFO nova.compute.manager [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Took 9.98 seconds to build instance.
Jan 31 02:54:03 np0005603623 nova_compute[226235]: 2026-01-31 07:54:03.520 226239 DEBUG oslo_concurrency.lockutils [None req-4d4b02d9-0380-4fa9-8bc3-788f61e3aff2 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "33d8d2ee-7d2a-4973-a7da-7f86f14f5f87" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:54:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:03.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:04.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:05 np0005603623 nova_compute[226235]: 2026-01-31 07:54:05.347 226239 DEBUG nova.objects.instance [None req-63a41afc-546c-40ff-8e1d-fe5145dafb0f 9565dfd51dc34930a3f05eaa4c60c7ee 9d8a708699004a25bfb9d13c5a8fe481 - - default default] Lazy-loading 'pci_devices' on Instance uuid 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:54:05 np0005603623 nova_compute[226235]: 2026-01-31 07:54:05.373 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846045.3722975, 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:54:05 np0005603623 nova_compute[226235]: 2026-01-31 07:54:05.374 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] VM Paused (Lifecycle Event)
Jan 31 02:54:05 np0005603623 nova_compute[226235]: 2026-01-31 07:54:05.399 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:54:05 np0005603623 nova_compute[226235]: 2026-01-31 07:54:05.404 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:54:05 np0005603623 nova_compute[226235]: 2026-01-31 07:54:05.437 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 31 02:54:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:05.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:06 np0005603623 nova_compute[226235]: 2026-01-31 07:54:06.843 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:54:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:06.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:06 np0005603623 nova_compute[226235]: 2026-01-31 07:54:06.926 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:54:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:07.890 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:54:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:54:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:07.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:54:08 np0005603623 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Jan 31 02:54:08 np0005603623 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001c.scope: Consumed 2.628s CPU time.
Jan 31 02:54:08 np0005603623 systemd-machined[194379]: Machine qemu-18-instance-0000001c terminated.
Jan 31 02:54:08 np0005603623 nova_compute[226235]: 2026-01-31 07:54:08.437 226239 DEBUG nova.compute.manager [None req-63a41afc-546c-40ff-8e1d-fe5145dafb0f 9565dfd51dc34930a3f05eaa4c60c7ee 9d8a708699004a25bfb9d13c5a8fe481 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:54:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:08.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:09 np0005603623 nova_compute[226235]: 2026-01-31 07:54:09.329 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Acquiring lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:54:09 np0005603623 nova_compute[226235]: 2026-01-31 07:54:09.329 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:54:09 np0005603623 nova_compute[226235]: 2026-01-31 07:54:09.350 226239 DEBUG nova.compute.manager [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 02:54:09 np0005603623 nova_compute[226235]: 2026-01-31 07:54:09.463 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:54:09 np0005603623 nova_compute[226235]: 2026-01-31 07:54:09.464 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:54:09 np0005603623 nova_compute[226235]: 2026-01-31 07:54:09.470 226239 DEBUG nova.virt.hardware [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 02:54:09 np0005603623 nova_compute[226235]: 2026-01-31 07:54:09.470 226239 INFO nova.compute.claims [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Claim successful on node compute-2.ctlplane.example.com
Jan 31 02:54:09 np0005603623 nova_compute[226235]: 2026-01-31 07:54:09.650 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:54:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:09.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:54:10 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3587040041' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.387 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.737s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.393 226239 DEBUG nova.compute.provider_tree [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.437 226239 DEBUG nova.scheduler.client.report [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.460 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.996s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.461 226239 DEBUG nova.compute.manager [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.514 226239 DEBUG nova.compute.manager [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.514 226239 DEBUG nova.network.neutron [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.551 226239 INFO nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.574 226239 DEBUG nova.compute.manager [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.695 226239 DEBUG nova.compute.manager [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.696 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.697 226239 INFO nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Creating image(s)
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.722 226239 DEBUG nova.storage.rbd_utils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] rbd image 19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.752 226239 DEBUG nova.storage.rbd_utils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] rbd image 19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.801 226239 DEBUG nova.storage.rbd_utils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] rbd image 19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.805 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.826 226239 DEBUG nova.policy [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0384d6fc8c0b4f66bf382009760ab9f5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a2b8f01b0d74b8588a3f97400e9fbff', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 02:54:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:10.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.861 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.862 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.862 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.862 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.887 226239 DEBUG nova.storage.rbd_utils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] rbd image 19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:54:10 np0005603623 nova_compute[226235]: 2026-01-31 07:54:10.891 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:11 np0005603623 nova_compute[226235]: 2026-01-31 07:54:11.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:11 np0005603623 nova_compute[226235]: 2026-01-31 07:54:11.806 226239 DEBUG nova.network.neutron [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Successfully created port: 2c75734d-f6b4-43fd-8b27-35fc421580dd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:54:11 np0005603623 nova_compute[226235]: 2026-01-31 07:54:11.860 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:11.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:11 np0005603623 nova_compute[226235]: 2026-01-31 07:54:11.927 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:12 np0005603623 nova_compute[226235]: 2026-01-31 07:54:12.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:12 np0005603623 nova_compute[226235]: 2026-01-31 07:54:12.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:12.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.157 226239 DEBUG oslo_concurrency.lockutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Acquiring lock "33d8d2ee-7d2a-4973-a7da-7f86f14f5f87" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.158 226239 DEBUG oslo_concurrency.lockutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "33d8d2ee-7d2a-4973-a7da-7f86f14f5f87" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.158 226239 DEBUG oslo_concurrency.lockutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Acquiring lock "33d8d2ee-7d2a-4973-a7da-7f86f14f5f87-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.158 226239 DEBUG oslo_concurrency.lockutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "33d8d2ee-7d2a-4973-a7da-7f86f14f5f87-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.159 226239 DEBUG oslo_concurrency.lockutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "33d8d2ee-7d2a-4973-a7da-7f86f14f5f87-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.160 226239 INFO nova.compute.manager [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Terminating instance#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.160 226239 DEBUG oslo_concurrency.lockutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Acquiring lock "refresh_cache-33d8d2ee-7d2a-4973-a7da-7f86f14f5f87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.161 226239 DEBUG oslo_concurrency.lockutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Acquired lock "refresh_cache-33d8d2ee-7d2a-4973-a7da-7f86f14f5f87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.161 226239 DEBUG nova.network.neutron [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.197 226239 DEBUG nova.network.neutron [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Successfully updated port: 2c75734d-f6b4-43fd-8b27-35fc421580dd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.215 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Acquiring lock "refresh_cache-19f9b64a-5ffd-4930-8b5d-427c62ff87a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.215 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Acquired lock "refresh_cache-19f9b64a-5ffd-4930-8b5d-427c62ff87a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.215 226239 DEBUG nova.network.neutron [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.329 226239 DEBUG nova.compute.manager [req-71dc7ea8-f1c0-4793-b130-48613a27bb50 req-0e70f81c-9179-42e0-9267-8bdbc9ae97ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Received event network-changed-2c75734d-f6b4-43fd-8b27-35fc421580dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.330 226239 DEBUG nova.compute.manager [req-71dc7ea8-f1c0-4793-b130-48613a27bb50 req-0e70f81c-9179-42e0-9267-8bdbc9ae97ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Refreshing instance network info cache due to event network-changed-2c75734d-f6b4-43fd-8b27-35fc421580dd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.330 226239 DEBUG oslo_concurrency.lockutils [req-71dc7ea8-f1c0-4793-b130-48613a27bb50 req-0e70f81c-9179-42e0-9267-8bdbc9ae97ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-19f9b64a-5ffd-4930-8b5d-427c62ff87a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.373 226239 DEBUG nova.network.neutron [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.467 226239 DEBUG nova.network.neutron [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.823 226239 DEBUG nova.network.neutron [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.853 226239 DEBUG oslo_concurrency.lockutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Releasing lock "refresh_cache-33d8d2ee-7d2a-4973-a7da-7f86f14f5f87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.854 226239 DEBUG nova.compute.manager [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.860 226239 INFO nova.virt.libvirt.driver [-] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Instance destroyed successfully.#033[00m
Jan 31 02:54:13 np0005603623 nova_compute[226235]: 2026-01-31 07:54:13.861 226239 DEBUG nova.objects.instance [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lazy-loading 'resources' on Instance uuid 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:54:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:13.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.171 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.192 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.193 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.193 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.193 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.194 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:54:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/657114258' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:54:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:54:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/657114258' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:54:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:54:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2821086047' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.721 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.789 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000001c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.789 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000001c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:54:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:14.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.904 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.905 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4637MB free_disk=20.922611236572266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.906 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.906 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.991 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.991 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 19f9b64a-5ffd-4930-8b5d-427c62ff87a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.992 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:54:14 np0005603623 nova_compute[226235]: 2026-01-31 07:54:14.992 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:54:15 np0005603623 nova_compute[226235]: 2026-01-31 07:54:15.072 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:15 np0005603623 nova_compute[226235]: 2026-01-31 07:54:15.245 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.354s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:15 np0005603623 nova_compute[226235]: 2026-01-31 07:54:15.336 226239 DEBUG nova.storage.rbd_utils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] resizing rbd image 19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:54:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:54:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/296120198' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:54:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:15.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:15 np0005603623 nova_compute[226235]: 2026-01-31 07:54:15.984 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.912s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:15 np0005603623 nova_compute[226235]: 2026-01-31 07:54:15.992 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:54:16 np0005603623 nova_compute[226235]: 2026-01-31 07:54:16.011 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:54:16 np0005603623 nova_compute[226235]: 2026-01-31 07:54:16.049 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:54:16 np0005603623 nova_compute[226235]: 2026-01-31 07:54:16.050 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:16 np0005603623 nova_compute[226235]: 2026-01-31 07:54:16.291 226239 DEBUG nova.network.neutron [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Updating instance_info_cache with network_info: [{"id": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "address": "fa:16:3e:8f:0b:55", "network": {"id": "7aba0eda-87fd-423d-bbf9-46f52ee12dad", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-733650969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a2b8f01b0d74b8588a3f97400e9fbff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c75734d-f6", "ovs_interfaceid": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:54:16 np0005603623 nova_compute[226235]: 2026-01-31 07:54:16.314 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Releasing lock "refresh_cache-19f9b64a-5ffd-4930-8b5d-427c62ff87a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:54:16 np0005603623 nova_compute[226235]: 2026-01-31 07:54:16.315 226239 DEBUG nova.compute.manager [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Instance network_info: |[{"id": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "address": "fa:16:3e:8f:0b:55", "network": {"id": "7aba0eda-87fd-423d-bbf9-46f52ee12dad", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-733650969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a2b8f01b0d74b8588a3f97400e9fbff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c75734d-f6", "ovs_interfaceid": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:54:16 np0005603623 nova_compute[226235]: 2026-01-31 07:54:16.315 226239 DEBUG oslo_concurrency.lockutils [req-71dc7ea8-f1c0-4793-b130-48613a27bb50 req-0e70f81c-9179-42e0-9267-8bdbc9ae97ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-19f9b64a-5ffd-4930-8b5d-427c62ff87a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:54:16 np0005603623 nova_compute[226235]: 2026-01-31 07:54:16.316 226239 DEBUG nova.network.neutron [req-71dc7ea8-f1c0-4793-b130-48613a27bb50 req-0e70f81c-9179-42e0-9267-8bdbc9ae97ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Refreshing network info cache for port 2c75734d-f6b4-43fd-8b27-35fc421580dd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:54:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:16.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:16 np0005603623 nova_compute[226235]: 2026-01-31 07:54:16.862 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:16 np0005603623 nova_compute[226235]: 2026-01-31 07:54:16.930 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:17 np0005603623 nova_compute[226235]: 2026-01-31 07:54:17.033 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:17 np0005603623 nova_compute[226235]: 2026-01-31 07:54:17.033 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:17 np0005603623 nova_compute[226235]: 2026-01-31 07:54:17.033 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:54:17 np0005603623 nova_compute[226235]: 2026-01-31 07:54:17.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:17.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.156 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.176 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.177 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.177 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.512 226239 DEBUG nova.objects.instance [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lazy-loading 'migration_context' on Instance uuid 19f9b64a-5ffd-4930-8b5d-427c62ff87a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.533 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.533 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Ensure instance console log exists: /var/lib/nova/instances/19f9b64a-5ffd-4930-8b5d-427c62ff87a4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.534 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.534 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.534 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.537 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Start _get_guest_xml network_info=[{"id": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "address": "fa:16:3e:8f:0b:55", "network": {"id": "7aba0eda-87fd-423d-bbf9-46f52ee12dad", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-733650969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a2b8f01b0d74b8588a3f97400e9fbff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c75734d-f6", "ovs_interfaceid": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.541 226239 WARNING nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.546 226239 DEBUG nova.virt.libvirt.host [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.546 226239 DEBUG nova.virt.libvirt.host [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.548 226239 DEBUG nova.virt.libvirt.host [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.549 226239 DEBUG nova.virt.libvirt.host [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.550 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.551 226239 DEBUG nova.virt.hardware [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.551 226239 DEBUG nova.virt.hardware [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.551 226239 DEBUG nova.virt.hardware [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.552 226239 DEBUG nova.virt.hardware [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.552 226239 DEBUG nova.virt.hardware [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.552 226239 DEBUG nova.virt.hardware [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.553 226239 DEBUG nova.virt.hardware [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.553 226239 DEBUG nova.virt.hardware [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.553 226239 DEBUG nova.virt.hardware [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.554 226239 DEBUG nova.virt.hardware [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.554 226239 DEBUG nova.virt.hardware [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:54:18 np0005603623 nova_compute[226235]: 2026-01-31 07:54:18.557 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:18.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:54:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3643346606' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:54:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:19 np0005603623 nova_compute[226235]: 2026-01-31 07:54:19.089 226239 DEBUG nova.network.neutron [req-71dc7ea8-f1c0-4793-b130-48613a27bb50 req-0e70f81c-9179-42e0-9267-8bdbc9ae97ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Updated VIF entry in instance network info cache for port 2c75734d-f6b4-43fd-8b27-35fc421580dd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:54:19 np0005603623 nova_compute[226235]: 2026-01-31 07:54:19.090 226239 DEBUG nova.network.neutron [req-71dc7ea8-f1c0-4793-b130-48613a27bb50 req-0e70f81c-9179-42e0-9267-8bdbc9ae97ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Updating instance_info_cache with network_info: [{"id": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "address": "fa:16:3e:8f:0b:55", "network": {"id": "7aba0eda-87fd-423d-bbf9-46f52ee12dad", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-733650969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a2b8f01b0d74b8588a3f97400e9fbff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c75734d-f6", "ovs_interfaceid": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:54:19 np0005603623 nova_compute[226235]: 2026-01-31 07:54:19.109 226239 DEBUG oslo_concurrency.lockutils [req-71dc7ea8-f1c0-4793-b130-48613a27bb50 req-0e70f81c-9179-42e0-9267-8bdbc9ae97ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-19f9b64a-5ffd-4930-8b5d-427c62ff87a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:54:19 np0005603623 nova_compute[226235]: 2026-01-31 07:54:19.612 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:19 np0005603623 nova_compute[226235]: 2026-01-31 07:54:19.655 226239 DEBUG nova.storage.rbd_utils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] rbd image 19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:54:19 np0005603623 nova_compute[226235]: 2026-01-31 07:54:19.662 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:19.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:54:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2248364042' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.095 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.098 226239 DEBUG nova.virt.libvirt.vif [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:54:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-1178765642',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-1178765642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-117876564',id=29,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a2b8f01b0d74b8588a3f97400e9fbff',ramdisk_id='',reservation_id='r-dngyx2m2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegati
veTestJSON-407725813',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-407725813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:54:10Z,user_data=None,user_id='0384d6fc8c0b4f66bf382009760ab9f5',uuid=19f9b64a-5ffd-4930-8b5d-427c62ff87a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "address": "fa:16:3e:8f:0b:55", "network": {"id": "7aba0eda-87fd-423d-bbf9-46f52ee12dad", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-733650969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a2b8f01b0d74b8588a3f97400e9fbff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c75734d-f6", "ovs_interfaceid": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.099 226239 DEBUG nova.network.os_vif_util [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Converting VIF {"id": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "address": "fa:16:3e:8f:0b:55", "network": {"id": "7aba0eda-87fd-423d-bbf9-46f52ee12dad", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-733650969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a2b8f01b0d74b8588a3f97400e9fbff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c75734d-f6", "ovs_interfaceid": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.100 226239 DEBUG nova.network.os_vif_util [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:0b:55,bridge_name='br-int',has_traffic_filtering=True,id=2c75734d-f6b4-43fd-8b27-35fc421580dd,network=Network(7aba0eda-87fd-423d-bbf9-46f52ee12dad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c75734d-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.102 226239 DEBUG nova.objects.instance [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lazy-loading 'pci_devices' on Instance uuid 19f9b64a-5ffd-4930-8b5d-427c62ff87a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.128 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  <uuid>19f9b64a-5ffd-4930-8b5d-427c62ff87a4</uuid>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  <name>instance-0000001d</name>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <nova:name>tempest-FloatingIPsAssociationNegativeTestJSON-server-1178765642</nova:name>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:54:18</nova:creationTime>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <nova:user uuid="0384d6fc8c0b4f66bf382009760ab9f5">tempest-FloatingIPsAssociationNegativeTestJSON-407725813-project-member</nova:user>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <nova:project uuid="8a2b8f01b0d74b8588a3f97400e9fbff">tempest-FloatingIPsAssociationNegativeTestJSON-407725813</nova:project>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <nova:port uuid="2c75734d-f6b4-43fd-8b27-35fc421580dd">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <entry name="serial">19f9b64a-5ffd-4930-8b5d-427c62ff87a4</entry>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <entry name="uuid">19f9b64a-5ffd-4930-8b5d-427c62ff87a4</entry>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk.config">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:8f:0b:55"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <target dev="tap2c75734d-f6"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/19f9b64a-5ffd-4930-8b5d-427c62ff87a4/console.log" append="off"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:54:20 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:54:20 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:54:20 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:54:20 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.130 226239 DEBUG nova.compute.manager [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Preparing to wait for external event network-vif-plugged-2c75734d-f6b4-43fd-8b27-35fc421580dd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.131 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Acquiring lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.131 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.131 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.132 226239 DEBUG nova.virt.libvirt.vif [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:54:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-1178765642',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-1178765642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-117876564',id=29,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a2b8f01b0d74b8588a3f97400e9fbff',ramdisk_id='',reservation_id='r-dngyx2m2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-407725813',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-407725813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:54:10Z,user_data=None,user_id='0384d6fc8c0b4f66bf382009760ab9f5',uuid=19f9b64a-5ffd-4930-8b5d-427c62ff87a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "address": "fa:16:3e:8f:0b:55", "network": {"id": "7aba0eda-87fd-423d-bbf9-46f52ee12dad", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-733650969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a2b8f01b0d74b8588a3f97400e9fbff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c75734d-f6", "ovs_interfaceid": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.133 226239 DEBUG nova.network.os_vif_util [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Converting VIF {"id": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "address": "fa:16:3e:8f:0b:55", "network": {"id": "7aba0eda-87fd-423d-bbf9-46f52ee12dad", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-733650969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a2b8f01b0d74b8588a3f97400e9fbff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c75734d-f6", "ovs_interfaceid": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.133 226239 DEBUG nova.network.os_vif_util [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:0b:55,bridge_name='br-int',has_traffic_filtering=True,id=2c75734d-f6b4-43fd-8b27-35fc421580dd,network=Network(7aba0eda-87fd-423d-bbf9-46f52ee12dad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c75734d-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.133 226239 DEBUG os_vif [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:0b:55,bridge_name='br-int',has_traffic_filtering=True,id=2c75734d-f6b4-43fd-8b27-35fc421580dd,network=Network(7aba0eda-87fd-423d-bbf9-46f52ee12dad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c75734d-f6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.134 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.135 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.135 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.139 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.139 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c75734d-f6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.140 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2c75734d-f6, col_values=(('external_ids', {'iface-id': '2c75734d-f6b4-43fd-8b27-35fc421580dd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:0b:55', 'vm-uuid': '19f9b64a-5ffd-4930-8b5d-427c62ff87a4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.165 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:20 np0005603623 NetworkManager[48970]: <info>  [1769846060.1671] manager: (tap2c75734d-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.170 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.174 226239 INFO os_vif [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:0b:55,bridge_name='br-int',has_traffic_filtering=True,id=2c75734d-f6b4-43fd-8b27-35fc421580dd,network=Network(7aba0eda-87fd-423d-bbf9-46f52ee12dad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c75734d-f6')#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.430 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.431 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.431 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] No VIF found with MAC fa:16:3e:8f:0b:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.432 226239 INFO nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Using config drive#033[00m
Jan 31 02:54:20 np0005603623 nova_compute[226235]: 2026-01-31 07:54:20.579 226239 DEBUG nova.storage.rbd_utils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] rbd image 19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:54:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:54:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:20.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:54:21 np0005603623 nova_compute[226235]: 2026-01-31 07:54:21.154 226239 INFO nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Creating config drive at /var/lib/nova/instances/19f9b64a-5ffd-4930-8b5d-427c62ff87a4/disk.config#033[00m
Jan 31 02:54:21 np0005603623 nova_compute[226235]: 2026-01-31 07:54:21.160 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/19f9b64a-5ffd-4930-8b5d-427c62ff87a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpzq2kumi6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:21 np0005603623 nova_compute[226235]: 2026-01-31 07:54:21.280 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/19f9b64a-5ffd-4930-8b5d-427c62ff87a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpzq2kumi6" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:21 np0005603623 nova_compute[226235]: 2026-01-31 07:54:21.308 226239 DEBUG nova.storage.rbd_utils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] rbd image 19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:54:21 np0005603623 nova_compute[226235]: 2026-01-31 07:54:21.311 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/19f9b64a-5ffd-4930-8b5d-427c62ff87a4/disk.config 19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:21.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:21 np0005603623 nova_compute[226235]: 2026-01-31 07:54:21.932 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:54:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:22.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.440 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846048.4385698, 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.440 226239 INFO nova.compute.manager [-] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.465 226239 DEBUG nova.compute.manager [None req-3900ba27-9eb2-449b-9908-817894764bfa - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.468 226239 DEBUG nova.compute.manager [None req-3900ba27-9eb2-449b-9908-817894764bfa - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: suspended, current task_state: deleting, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.495 226239 INFO nova.compute.manager [None req-3900ba27-9eb2-449b-9908-817894764bfa - - - - - -] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.589 226239 DEBUG oslo_concurrency.processutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/19f9b64a-5ffd-4930-8b5d-427c62ff87a4/disk.config 19f9b64a-5ffd-4930-8b5d-427c62ff87a4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.278s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.589 226239 INFO nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Deleting local config drive /var/lib/nova/instances/19f9b64a-5ffd-4930-8b5d-427c62ff87a4/disk.config because it was imported into RBD.#033[00m
Jan 31 02:54:23 np0005603623 kernel: tap2c75734d-f6: entered promiscuous mode
Jan 31 02:54:23 np0005603623 ovn_controller[133449]: 2026-01-31T07:54:23Z|00144|binding|INFO|Claiming lport 2c75734d-f6b4-43fd-8b27-35fc421580dd for this chassis.
Jan 31 02:54:23 np0005603623 ovn_controller[133449]: 2026-01-31T07:54:23Z|00145|binding|INFO|2c75734d-f6b4-43fd-8b27-35fc421580dd: Claiming fa:16:3e:8f:0b:55 10.100.0.5
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.632 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:23 np0005603623 NetworkManager[48970]: <info>  [1769846063.6334] manager: (tap2c75734d-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.636 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:23 np0005603623 ovn_controller[133449]: 2026-01-31T07:54:23Z|00146|binding|INFO|Setting lport 2c75734d-f6b4-43fd-8b27-35fc421580dd ovn-installed in OVS
Jan 31 02:54:23 np0005603623 ovn_controller[133449]: 2026-01-31T07:54:23Z|00147|binding|INFO|Setting lport 2c75734d-f6b4-43fd-8b27-35fc421580dd up in Southbound
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.659 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.670 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.670 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:0b:55 10.100.0.5'], port_security=['fa:16:3e:8f:0b:55 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '19f9b64a-5ffd-4930-8b5d-427c62ff87a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7aba0eda-87fd-423d-bbf9-46f52ee12dad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a2b8f01b0d74b8588a3f97400e9fbff', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8615b5bd-1e93-4e34-b4af-645e552086ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d7c665d-38bd-4d97-833a-10a3636faeeb, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=2c75734d-f6b4-43fd-8b27-35fc421580dd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.672 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 2c75734d-f6b4-43fd-8b27-35fc421580dd in datapath 7aba0eda-87fd-423d-bbf9-46f52ee12dad bound to our chassis#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.673 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7aba0eda-87fd-423d-bbf9-46f52ee12dad#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.684 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9cfeccf2-0c3d-4ef6-b6e5-99b771ca08dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.685 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7aba0eda-81 in ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.686 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7aba0eda-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.686 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bc34207d-768a-4d49-90ee-b5b6bbf84525]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.687 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[29778325-ec9c-43b2-b3b8-e3410a05cbe3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 systemd-machined[194379]: New machine qemu-19-instance-0000001d.
Jan 31 02:54:23 np0005603623 systemd-udevd[243134]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.698 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a4ab32-6cef-4c7c-99e4-d50c2a1baaed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 systemd[1]: Started Virtual Machine qemu-19-instance-0000001d.
Jan 31 02:54:23 np0005603623 NetworkManager[48970]: <info>  [1769846063.7024] device (tap2c75734d-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:54:23 np0005603623 NetworkManager[48970]: <info>  [1769846063.7032] device (tap2c75734d-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.709 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[65c468af-3a3b-45d1-9694-3095b0dea5fc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.733 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ed0761a1-cdfd-42a9-ba54-438dfbb4b640]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 NetworkManager[48970]: <info>  [1769846063.7382] manager: (tap7aba0eda-80): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.738 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3f87e4b0-2498-4ee9-825e-afea27fa0e20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.767 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd91dfd-3283-48f1-9c4b-b42cc2645cf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.770 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c686b0e2-1919-4243-978e-2895e2d87a98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 NetworkManager[48970]: <info>  [1769846063.7819] device (tap7aba0eda-80): carrier: link connected
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.783 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ebc04966-af62-425b-8778-ca7804b1b6da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.794 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[214e82e4-cb84-4655-98b4-f4379edf3f0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7aba0eda-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:e8:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520421, 'reachable_time': 38781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243166, 'error': None, 'target': 'ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.804 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[37bbf9c5-87f0-4d14-94b5-3b0c2a58aa29]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:e8c2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 520421, 'tstamp': 520421}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243167, 'error': None, 'target': 'ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.818 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[78f33d35-db2f-4e00-a0d0-724baf0972ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7aba0eda-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:e8:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520421, 'reachable_time': 38781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243168, 'error': None, 'target': 'ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.836 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bb9248f0-2d1b-4f00-b778-9b08ae8fb2b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.874 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f9279962-26da-4582-8c14-6a154554af18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.875 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7aba0eda-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.876 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.876 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7aba0eda-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:54:23 np0005603623 kernel: tap7aba0eda-80: entered promiscuous mode
Jan 31 02:54:23 np0005603623 NetworkManager[48970]: <info>  [1769846063.8787] manager: (tap7aba0eda-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.877 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.879 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.883 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7aba0eda-80, col_values=(('external_ids', {'iface-id': '4ef54508-7037-41f0-9e61-2e89866320c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:54:23 np0005603623 ovn_controller[133449]: 2026-01-31T07:54:23Z|00148|binding|INFO|Releasing lport 4ef54508-7037-41f0-9e61-2e89866320c2 from this chassis (sb_readonly=0)
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.884 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.884 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.887 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7aba0eda-87fd-423d-bbf9-46f52ee12dad.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7aba0eda-87fd-423d-bbf9-46f52ee12dad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.888 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed86b26-56ef-4bf9-aadb-466d15c354f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:54:23 np0005603623 nova_compute[226235]: 2026-01-31 07:54:23.889 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.889 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-7aba0eda-87fd-423d-bbf9-46f52ee12dad
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/7aba0eda-87fd-423d-bbf9-46f52ee12dad.pid.haproxy
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 7aba0eda-87fd-423d-bbf9-46f52ee12dad
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:54:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:23.890 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad', 'env', 'PROCESS_TAG=haproxy-7aba0eda-87fd-423d-bbf9-46f52ee12dad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7aba0eda-87fd-423d-bbf9-46f52ee12dad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:54:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:23.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.143 226239 INFO nova.virt.libvirt.driver [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Deleting instance files /var/lib/nova/instances/33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_del#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.145 226239 INFO nova.virt.libvirt.driver [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Deletion of /var/lib/nova/instances/33d8d2ee-7d2a-4973-a7da-7f86f14f5f87_del complete#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.220 226239 INFO nova.compute.manager [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Took 10.37 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.221 226239 DEBUG oslo.service.loopingcall [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.222 226239 DEBUG nova.compute.manager [-] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.222 226239 DEBUG nova.network.neutron [-] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.240 226239 DEBUG nova.compute.manager [req-2d7f2ce0-ceed-4647-978e-a4dce1897c58 req-2298874b-84d6-4ba5-b268-f87356948d6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Received event network-vif-plugged-2c75734d-f6b4-43fd-8b27-35fc421580dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.240 226239 DEBUG oslo_concurrency.lockutils [req-2d7f2ce0-ceed-4647-978e-a4dce1897c58 req-2298874b-84d6-4ba5-b268-f87356948d6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.240 226239 DEBUG oslo_concurrency.lockutils [req-2d7f2ce0-ceed-4647-978e-a4dce1897c58 req-2298874b-84d6-4ba5-b268-f87356948d6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.241 226239 DEBUG oslo_concurrency.lockutils [req-2d7f2ce0-ceed-4647-978e-a4dce1897c58 req-2298874b-84d6-4ba5-b268-f87356948d6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.241 226239 DEBUG nova.compute.manager [req-2d7f2ce0-ceed-4647-978e-a4dce1897c58 req-2298874b-84d6-4ba5-b268-f87356948d6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Processing event network-vif-plugged-2c75734d-f6b4-43fd-8b27-35fc421580dd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:54:24 np0005603623 podman[243218]: 2026-01-31 07:54:24.199229035 +0000 UTC m=+0.024727699 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.533 226239 DEBUG nova.network.neutron [-] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.548 226239 DEBUG nova.network.neutron [-] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.574 226239 INFO nova.compute.manager [-] [instance: 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87] Took 0.35 seconds to deallocate network for instance.#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.649 226239 DEBUG oslo_concurrency.lockutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.650 226239 DEBUG oslo_concurrency.lockutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.766 226239 DEBUG oslo_concurrency.processutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.781 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846064.781308, 19f9b64a-5ffd-4930-8b5d-427c62ff87a4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.782 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] VM Started (Lifecycle Event)#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.784 226239 DEBUG nova.compute.manager [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.788 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.792 226239 INFO nova.virt.libvirt.driver [-] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Instance spawned successfully.#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.793 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.812 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.817 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.822 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.823 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.823 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.824 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.824 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.824 226239 DEBUG nova.virt.libvirt.driver [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.857 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.857 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846064.7814372, 19f9b64a-5ffd-4930-8b5d-427c62ff87a4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.858 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:54:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:24.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.885 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.888 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846064.7870967, 19f9b64a-5ffd-4930-8b5d-427c62ff87a4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.888 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.894 226239 INFO nova.compute.manager [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Took 14.20 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.894 226239 DEBUG nova.compute.manager [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.919 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.922 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.958 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:54:24 np0005603623 nova_compute[226235]: 2026-01-31 07:54:24.980 226239 INFO nova.compute.manager [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Took 15.57 seconds to build instance.#033[00m
Jan 31 02:54:25 np0005603623 nova_compute[226235]: 2026-01-31 07:54:25.003 226239 DEBUG oslo_concurrency.lockutils [None req-7df74bb7-2dd2-40fa-984c-16b4397789c2 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:54:25 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2389806235' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:54:25 np0005603623 nova_compute[226235]: 2026-01-31 07:54:25.164 226239 DEBUG oslo_concurrency.processutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:25 np0005603623 nova_compute[226235]: 2026-01-31 07:54:25.165 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:25 np0005603623 nova_compute[226235]: 2026-01-31 07:54:25.168 226239 DEBUG nova.compute.provider_tree [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:54:25 np0005603623 nova_compute[226235]: 2026-01-31 07:54:25.190 226239 DEBUG nova.scheduler.client.report [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:54:25 np0005603623 nova_compute[226235]: 2026-01-31 07:54:25.215 226239 DEBUG oslo_concurrency.lockutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:25 np0005603623 podman[243218]: 2026-01-31 07:54:25.222582996 +0000 UTC m=+1.048081620 container create 222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 02:54:25 np0005603623 nova_compute[226235]: 2026-01-31 07:54:25.304 226239 INFO nova.scheduler.client.report [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Deleted allocations for instance 33d8d2ee-7d2a-4973-a7da-7f86f14f5f87#033[00m
Jan 31 02:54:25 np0005603623 nova_compute[226235]: 2026-01-31 07:54:25.398 226239 DEBUG oslo_concurrency.lockutils [None req-db449be9-71a8-4e68-9698-05fb6fbccc5f 0ee689c4c14744fb8b4e1d54f6831626 f63afec818164f31a848360151f96a68 - - default default] Lock "33d8d2ee-7d2a-4973-a7da-7f86f14f5f87" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 12.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:25 np0005603623 systemd[1]: Started libpod-conmon-222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917.scope.
Jan 31 02:54:25 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:54:25 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e18d919a43087b3b27bf09d3366756592cb166e285f8aed914992ea996aef85e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:54:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:25.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:26 np0005603623 podman[243218]: 2026-01-31 07:54:26.180701897 +0000 UTC m=+2.006200551 container init 222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:54:26 np0005603623 podman[243218]: 2026-01-31 07:54:26.189287127 +0000 UTC m=+2.014785741 container start 222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 02:54:26 np0005603623 neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad[243281]: [NOTICE]   (243308) : New worker (243310) forked
Jan 31 02:54:26 np0005603623 neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad[243281]: [NOTICE]   (243308) : Loading success.
Jan 31 02:54:26 np0005603623 nova_compute[226235]: 2026-01-31 07:54:26.412 226239 DEBUG nova.compute.manager [req-187d858b-390f-41e1-b9e7-26819d0ea721 req-c7b92ba1-15df-46ee-b828-c8ee5a21b0dc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Received event network-vif-plugged-2c75734d-f6b4-43fd-8b27-35fc421580dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:54:26 np0005603623 nova_compute[226235]: 2026-01-31 07:54:26.413 226239 DEBUG oslo_concurrency.lockutils [req-187d858b-390f-41e1-b9e7-26819d0ea721 req-c7b92ba1-15df-46ee-b828-c8ee5a21b0dc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:26 np0005603623 nova_compute[226235]: 2026-01-31 07:54:26.413 226239 DEBUG oslo_concurrency.lockutils [req-187d858b-390f-41e1-b9e7-26819d0ea721 req-c7b92ba1-15df-46ee-b828-c8ee5a21b0dc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:26 np0005603623 nova_compute[226235]: 2026-01-31 07:54:26.414 226239 DEBUG oslo_concurrency.lockutils [req-187d858b-390f-41e1-b9e7-26819d0ea721 req-c7b92ba1-15df-46ee-b828-c8ee5a21b0dc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:26 np0005603623 nova_compute[226235]: 2026-01-31 07:54:26.414 226239 DEBUG nova.compute.manager [req-187d858b-390f-41e1-b9e7-26819d0ea721 req-c7b92ba1-15df-46ee-b828-c8ee5a21b0dc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] No waiting events found dispatching network-vif-plugged-2c75734d-f6b4-43fd-8b27-35fc421580dd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:54:26 np0005603623 nova_compute[226235]: 2026-01-31 07:54:26.414 226239 WARNING nova.compute.manager [req-187d858b-390f-41e1-b9e7-26819d0ea721 req-c7b92ba1-15df-46ee-b828-c8ee5a21b0dc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Received unexpected event network-vif-plugged-2c75734d-f6b4-43fd-8b27-35fc421580dd for instance with vm_state active and task_state None.#033[00m
Jan 31 02:54:26 np0005603623 podman[243283]: 2026-01-31 07:54:26.728837619 +0000 UTC m=+1.082694649 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 02:54:26 np0005603623 podman[243284]: 2026-01-31 07:54:26.753220347 +0000 UTC m=+1.105513407 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 02:54:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:26.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:26 np0005603623 nova_compute[226235]: 2026-01-31 07:54:26.935 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:27.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:28.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:29.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:30.088 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:30.088 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:30.089 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:30 np0005603623 nova_compute[226235]: 2026-01-31 07:54:30.169 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 31 02:54:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:30.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 31 02:54:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:31.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:31 np0005603623 nova_compute[226235]: 2026-01-31 07:54:31.937 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:32.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:33.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:33 np0005603623 NetworkManager[48970]: <info>  [1769846073.9737] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Jan 31 02:54:33 np0005603623 NetworkManager[48970]: <info>  [1769846073.9746] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Jan 31 02:54:33 np0005603623 nova_compute[226235]: 2026-01-31 07:54:33.975 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:33 np0005603623 nova_compute[226235]: 2026-01-31 07:54:33.999 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:34 np0005603623 ovn_controller[133449]: 2026-01-31T07:54:34Z|00149|binding|INFO|Releasing lport 4ef54508-7037-41f0-9e61-2e89866320c2 from this chassis (sb_readonly=0)
Jan 31 02:54:34 np0005603623 nova_compute[226235]: 2026-01-31 07:54:34.013 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:34 np0005603623 nova_compute[226235]: 2026-01-31 07:54:34.458 226239 DEBUG nova.compute.manager [req-3b8e296e-04da-493e-9914-568f7111e34c req-93ee2732-86fe-41f2-bb34-1963181ebd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Received event network-changed-2c75734d-f6b4-43fd-8b27-35fc421580dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:54:34 np0005603623 nova_compute[226235]: 2026-01-31 07:54:34.458 226239 DEBUG nova.compute.manager [req-3b8e296e-04da-493e-9914-568f7111e34c req-93ee2732-86fe-41f2-bb34-1963181ebd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Refreshing instance network info cache due to event network-changed-2c75734d-f6b4-43fd-8b27-35fc421580dd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:54:34 np0005603623 nova_compute[226235]: 2026-01-31 07:54:34.459 226239 DEBUG oslo_concurrency.lockutils [req-3b8e296e-04da-493e-9914-568f7111e34c req-93ee2732-86fe-41f2-bb34-1963181ebd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-19f9b64a-5ffd-4930-8b5d-427c62ff87a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:54:34 np0005603623 nova_compute[226235]: 2026-01-31 07:54:34.459 226239 DEBUG oslo_concurrency.lockutils [req-3b8e296e-04da-493e-9914-568f7111e34c req-93ee2732-86fe-41f2-bb34-1963181ebd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-19f9b64a-5ffd-4930-8b5d-427c62ff87a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:54:34 np0005603623 nova_compute[226235]: 2026-01-31 07:54:34.459 226239 DEBUG nova.network.neutron [req-3b8e296e-04da-493e-9914-568f7111e34c req-93ee2732-86fe-41f2-bb34-1963181ebd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Refreshing network info cache for port 2c75734d-f6b4-43fd-8b27-35fc421580dd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:54:34 np0005603623 nova_compute[226235]: 2026-01-31 07:54:34.508 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:34.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:35 np0005603623 nova_compute[226235]: 2026-01-31 07:54:35.181 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:35.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:36 np0005603623 nova_compute[226235]: 2026-01-31 07:54:36.885 226239 DEBUG nova.network.neutron [req-3b8e296e-04da-493e-9914-568f7111e34c req-93ee2732-86fe-41f2-bb34-1963181ebd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Updated VIF entry in instance network info cache for port 2c75734d-f6b4-43fd-8b27-35fc421580dd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:54:36 np0005603623 nova_compute[226235]: 2026-01-31 07:54:36.886 226239 DEBUG nova.network.neutron [req-3b8e296e-04da-493e-9914-568f7111e34c req-93ee2732-86fe-41f2-bb34-1963181ebd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Updating instance_info_cache with network_info: [{"id": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "address": "fa:16:3e:8f:0b:55", "network": {"id": "7aba0eda-87fd-423d-bbf9-46f52ee12dad", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-733650969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a2b8f01b0d74b8588a3f97400e9fbff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c75734d-f6", "ovs_interfaceid": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:54:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:36.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:36 np0005603623 nova_compute[226235]: 2026-01-31 07:54:36.906 226239 DEBUG oslo_concurrency.lockutils [req-3b8e296e-04da-493e-9914-568f7111e34c req-93ee2732-86fe-41f2-bb34-1963181ebd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-19f9b64a-5ffd-4930-8b5d-427c62ff87a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:54:36 np0005603623 nova_compute[226235]: 2026-01-31 07:54:36.982 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:54:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:37.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:54:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:54:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:38.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:39.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:40 np0005603623 nova_compute[226235]: 2026-01-31 07:54:40.184 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:40 np0005603623 nova_compute[226235]: 2026-01-31 07:54:40.591 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:40.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:41.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:41 np0005603623 nova_compute[226235]: 2026-01-31 07:54:41.983 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:42.152 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:54:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:42.153 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:54:42 np0005603623 nova_compute[226235]: 2026-01-31 07:54:42.153 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:42.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:43.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:44.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:45 np0005603623 nova_compute[226235]: 2026-01-31 07:54:45.186 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:45.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:46.417245) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846086417310, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2259, "num_deletes": 256, "total_data_size": 5674236, "memory_usage": 5744864, "flush_reason": "Manual Compaction"}
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846086621049, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 3661247, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25655, "largest_seqno": 27909, "table_properties": {"data_size": 3651829, "index_size": 5912, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19635, "raw_average_key_size": 20, "raw_value_size": 3632825, "raw_average_value_size": 3741, "num_data_blocks": 261, "num_entries": 971, "num_filter_entries": 971, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845886, "oldest_key_time": 1769845886, "file_creation_time": 1769846086, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 203856 microseconds, and 5896 cpu microseconds.
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:54:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:54:46Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8f:0b:55 10.100.0.5
Jan 31 02:54:46 np0005603623 ovn_controller[133449]: 2026-01-31T07:54:46Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8f:0b:55 10.100.0.5
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:46.621096) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 3661247 bytes OK
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:46.621117) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:46.850512) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:46.850553) EVENT_LOG_v1 {"time_micros": 1769846086850544, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:46.850574) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 5664071, prev total WAL file size 5664071, number of live WAL files 2.
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:46.851498) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353032' seq:72057594037927935, type:22 .. '6C6F676D00373533' seq:0, type:0; will stop at (end)
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3575KB)], [51(9063KB)]
Jan 31 02:54:46 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846086851584, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 12942516, "oldest_snapshot_seqno": -1}
Jan 31 02:54:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:46.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:46 np0005603623 nova_compute[226235]: 2026-01-31 07:54:46.985 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5437 keys, 12817826 bytes, temperature: kUnknown
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846087261069, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 12817826, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12776814, "index_size": 26310, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 136910, "raw_average_key_size": 25, "raw_value_size": 12674565, "raw_average_value_size": 2331, "num_data_blocks": 1085, "num_entries": 5437, "num_filter_entries": 5437, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769846086, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:47.261276) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 12817826 bytes
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:47.329111) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 31.6 rd, 31.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 8.9 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(7.0) write-amplify(3.5) OK, records in: 5970, records dropped: 533 output_compression: NoCompression
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:47.329148) EVENT_LOG_v1 {"time_micros": 1769846087329133, "job": 30, "event": "compaction_finished", "compaction_time_micros": 409544, "compaction_time_cpu_micros": 20243, "output_level": 6, "num_output_files": 1, "total_output_size": 12817826, "num_input_records": 5970, "num_output_records": 5437, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846087329604, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846087330498, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:46.851336) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:47.330573) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:47.330579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:47.330582) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:47.330584) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:54:47.330586) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:54:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:47.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:48.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:49 np0005603623 nova_compute[226235]: 2026-01-31 07:54:49.696 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:49.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:50 np0005603623 nova_compute[226235]: 2026-01-31 07:54:50.191 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:50.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:51.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:51 np0005603623 nova_compute[226235]: 2026-01-31 07:54:51.987 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:54:52.156 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:54:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:52.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:53.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:54 np0005603623 nova_compute[226235]: 2026-01-31 07:54:54.342 226239 DEBUG nova.compute.manager [req-268a291c-bfe8-4e6b-b463-978a57b20a75 req-da2b1b53-8cda-4d78-8b3e-ca7332f89f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Received event network-changed-2c75734d-f6b4-43fd-8b27-35fc421580dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:54:54 np0005603623 nova_compute[226235]: 2026-01-31 07:54:54.342 226239 DEBUG nova.compute.manager [req-268a291c-bfe8-4e6b-b463-978a57b20a75 req-da2b1b53-8cda-4d78-8b3e-ca7332f89f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Refreshing instance network info cache due to event network-changed-2c75734d-f6b4-43fd-8b27-35fc421580dd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:54:54 np0005603623 nova_compute[226235]: 2026-01-31 07:54:54.342 226239 DEBUG oslo_concurrency.lockutils [req-268a291c-bfe8-4e6b-b463-978a57b20a75 req-da2b1b53-8cda-4d78-8b3e-ca7332f89f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-19f9b64a-5ffd-4930-8b5d-427c62ff87a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:54:54 np0005603623 nova_compute[226235]: 2026-01-31 07:54:54.343 226239 DEBUG oslo_concurrency.lockutils [req-268a291c-bfe8-4e6b-b463-978a57b20a75 req-da2b1b53-8cda-4d78-8b3e-ca7332f89f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-19f9b64a-5ffd-4930-8b5d-427c62ff87a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:54:54 np0005603623 nova_compute[226235]: 2026-01-31 07:54:54.343 226239 DEBUG nova.network.neutron [req-268a291c-bfe8-4e6b-b463-978a57b20a75 req-da2b1b53-8cda-4d78-8b3e-ca7332f89f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Refreshing network info cache for port 2c75734d-f6b4-43fd-8b27-35fc421580dd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:54:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:54:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:54.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:54:55 np0005603623 nova_compute[226235]: 2026-01-31 07:54:55.194 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:54:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:54:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:55.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:56.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:56 np0005603623 podman[243641]: 2026-01-31 07:54:56.983235994 +0000 UTC m=+0.076653913 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 02:54:56 np0005603623 nova_compute[226235]: 2026-01-31 07:54:56.989 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:57 np0005603623 podman[243642]: 2026-01-31 07:54:57.005301037 +0000 UTC m=+0.098383186 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:54:57 np0005603623 nova_compute[226235]: 2026-01-31 07:54:57.482 226239 DEBUG nova.network.neutron [req-268a291c-bfe8-4e6b-b463-978a57b20a75 req-da2b1b53-8cda-4d78-8b3e-ca7332f89f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Updated VIF entry in instance network info cache for port 2c75734d-f6b4-43fd-8b27-35fc421580dd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:54:57 np0005603623 nova_compute[226235]: 2026-01-31 07:54:57.483 226239 DEBUG nova.network.neutron [req-268a291c-bfe8-4e6b-b463-978a57b20a75 req-da2b1b53-8cda-4d78-8b3e-ca7332f89f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Updating instance_info_cache with network_info: [{"id": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "address": "fa:16:3e:8f:0b:55", "network": {"id": "7aba0eda-87fd-423d-bbf9-46f52ee12dad", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-733650969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a2b8f01b0d74b8588a3f97400e9fbff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c75734d-f6", "ovs_interfaceid": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:54:57 np0005603623 nova_compute[226235]: 2026-01-31 07:54:57.573 226239 DEBUG oslo_concurrency.lockutils [req-268a291c-bfe8-4e6b-b463-978a57b20a75 req-da2b1b53-8cda-4d78-8b3e-ca7332f89f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-19f9b64a-5ffd-4930-8b5d-427c62ff87a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:54:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:57.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:58.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:54:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:59.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:00 np0005603623 nova_compute[226235]: 2026-01-31 07:55:00.198 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:00.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:00 np0005603623 nova_compute[226235]: 2026-01-31 07:55:00.981 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:01.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:01 np0005603623 nova_compute[226235]: 2026-01-31 07:55:01.992 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:02 np0005603623 nova_compute[226235]: 2026-01-31 07:55:02.542 226239 DEBUG oslo_concurrency.lockutils [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Acquiring lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:02 np0005603623 nova_compute[226235]: 2026-01-31 07:55:02.543 226239 DEBUG oslo_concurrency.lockutils [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:02 np0005603623 nova_compute[226235]: 2026-01-31 07:55:02.543 226239 DEBUG oslo_concurrency.lockutils [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Acquiring lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:02 np0005603623 nova_compute[226235]: 2026-01-31 07:55:02.543 226239 DEBUG oslo_concurrency.lockutils [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:02 np0005603623 nova_compute[226235]: 2026-01-31 07:55:02.543 226239 DEBUG oslo_concurrency.lockutils [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:02 np0005603623 nova_compute[226235]: 2026-01-31 07:55:02.544 226239 INFO nova.compute.manager [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Terminating instance#033[00m
Jan 31 02:55:02 np0005603623 nova_compute[226235]: 2026-01-31 07:55:02.545 226239 DEBUG nova.compute.manager [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:55:02 np0005603623 kernel: tap2c75734d-f6 (unregistering): left promiscuous mode
Jan 31 02:55:02 np0005603623 NetworkManager[48970]: <info>  [1769846102.8808] device (tap2c75734d-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:55:02 np0005603623 ovn_controller[133449]: 2026-01-31T07:55:02Z|00150|binding|INFO|Releasing lport 2c75734d-f6b4-43fd-8b27-35fc421580dd from this chassis (sb_readonly=0)
Jan 31 02:55:02 np0005603623 ovn_controller[133449]: 2026-01-31T07:55:02Z|00151|binding|INFO|Setting lport 2c75734d-f6b4-43fd-8b27-35fc421580dd down in Southbound
Jan 31 02:55:02 np0005603623 nova_compute[226235]: 2026-01-31 07:55:02.887 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:02 np0005603623 ovn_controller[133449]: 2026-01-31T07:55:02Z|00152|binding|INFO|Removing iface tap2c75734d-f6 ovn-installed in OVS
Jan 31 02:55:02 np0005603623 nova_compute[226235]: 2026-01-31 07:55:02.896 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:55:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:02.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:55:02 np0005603623 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 31 02:55:02 np0005603623 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001d.scope: Consumed 12.876s CPU time.
Jan 31 02:55:02 np0005603623 systemd-machined[194379]: Machine qemu-19-instance-0000001d terminated.
Jan 31 02:55:02 np0005603623 nova_compute[226235]: 2026-01-31 07:55:02.983 226239 INFO nova.virt.libvirt.driver [-] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Instance destroyed successfully.#033[00m
Jan 31 02:55:02 np0005603623 nova_compute[226235]: 2026-01-31 07:55:02.983 226239 DEBUG nova.objects.instance [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lazy-loading 'resources' on Instance uuid 19f9b64a-5ffd-4930-8b5d-427c62ff87a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:55:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:03.002 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:0b:55 10.100.0.5'], port_security=['fa:16:3e:8f:0b:55 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '19f9b64a-5ffd-4930-8b5d-427c62ff87a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7aba0eda-87fd-423d-bbf9-46f52ee12dad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a2b8f01b0d74b8588a3f97400e9fbff', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8615b5bd-1e93-4e34-b4af-645e552086ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d7c665d-38bd-4d97-833a-10a3636faeeb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=2c75734d-f6b4-43fd-8b27-35fc421580dd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:55:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:03.003 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 2c75734d-f6b4-43fd-8b27-35fc421580dd in datapath 7aba0eda-87fd-423d-bbf9-46f52ee12dad unbound from our chassis#033[00m
Jan 31 02:55:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:03.005 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7aba0eda-87fd-423d-bbf9-46f52ee12dad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:55:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:03.006 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d81c0a99-07cf-4d42-bbda-dc0c6111fe8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:03.007 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad namespace which is not needed anymore#033[00m
Jan 31 02:55:03 np0005603623 nova_compute[226235]: 2026-01-31 07:55:03.015 226239 DEBUG nova.virt.libvirt.vif [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:54:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-1178765642',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-1178765642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-117876564',id=29,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:54:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8a2b8f01b0d74b8588a3f97400e9fbff',ramdisk_id='',reservation_id='r-dngyx2m2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video
_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-407725813',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-407725813-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:54:24Z,user_data=None,user_id='0384d6fc8c0b4f66bf382009760ab9f5',uuid=19f9b64a-5ffd-4930-8b5d-427c62ff87a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "address": "fa:16:3e:8f:0b:55", "network": {"id": "7aba0eda-87fd-423d-bbf9-46f52ee12dad", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-733650969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a2b8f01b0d74b8588a3f97400e9fbff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c75734d-f6", "ovs_interfaceid": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:55:03 np0005603623 nova_compute[226235]: 2026-01-31 07:55:03.015 226239 DEBUG nova.network.os_vif_util [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Converting VIF {"id": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "address": "fa:16:3e:8f:0b:55", "network": {"id": "7aba0eda-87fd-423d-bbf9-46f52ee12dad", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-733650969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a2b8f01b0d74b8588a3f97400e9fbff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c75734d-f6", "ovs_interfaceid": "2c75734d-f6b4-43fd-8b27-35fc421580dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:55:03 np0005603623 nova_compute[226235]: 2026-01-31 07:55:03.016 226239 DEBUG nova.network.os_vif_util [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8f:0b:55,bridge_name='br-int',has_traffic_filtering=True,id=2c75734d-f6b4-43fd-8b27-35fc421580dd,network=Network(7aba0eda-87fd-423d-bbf9-46f52ee12dad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c75734d-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:55:03 np0005603623 nova_compute[226235]: 2026-01-31 07:55:03.017 226239 DEBUG os_vif [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:0b:55,bridge_name='br-int',has_traffic_filtering=True,id=2c75734d-f6b4-43fd-8b27-35fc421580dd,network=Network(7aba0eda-87fd-423d-bbf9-46f52ee12dad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c75734d-f6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:55:03 np0005603623 nova_compute[226235]: 2026-01-31 07:55:03.018 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:03 np0005603623 nova_compute[226235]: 2026-01-31 07:55:03.018 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c75734d-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:03 np0005603623 nova_compute[226235]: 2026-01-31 07:55:03.020 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:03 np0005603623 nova_compute[226235]: 2026-01-31 07:55:03.022 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:03 np0005603623 nova_compute[226235]: 2026-01-31 07:55:03.025 226239 INFO os_vif [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:0b:55,bridge_name='br-int',has_traffic_filtering=True,id=2c75734d-f6b4-43fd-8b27-35fc421580dd,network=Network(7aba0eda-87fd-423d-bbf9-46f52ee12dad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c75734d-f6')#033[00m
Jan 31 02:55:03 np0005603623 neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad[243281]: [NOTICE]   (243308) : haproxy version is 2.8.14-c23fe91
Jan 31 02:55:03 np0005603623 neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad[243281]: [NOTICE]   (243308) : path to executable is /usr/sbin/haproxy
Jan 31 02:55:03 np0005603623 neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad[243281]: [WARNING]  (243308) : Exiting Master process...
Jan 31 02:55:03 np0005603623 neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad[243281]: [ALERT]    (243308) : Current worker (243310) exited with code 143 (Terminated)
Jan 31 02:55:03 np0005603623 neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad[243281]: [WARNING]  (243308) : All workers exited. Exiting... (0)
Jan 31 02:55:03 np0005603623 systemd[1]: libpod-222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917.scope: Deactivated successfully.
Jan 31 02:55:03 np0005603623 podman[243745]: 2026-01-31 07:55:03.125024498 +0000 UTC m=+0.045656967 container died 222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 02:55:03 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917-userdata-shm.mount: Deactivated successfully.
Jan 31 02:55:03 np0005603623 systemd[1]: var-lib-containers-storage-overlay-e18d919a43087b3b27bf09d3366756592cb166e285f8aed914992ea996aef85e-merged.mount: Deactivated successfully.
Jan 31 02:55:03 np0005603623 podman[243745]: 2026-01-31 07:55:03.172600105 +0000 UTC m=+0.093232564 container cleanup 222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 02:55:03 np0005603623 systemd[1]: libpod-conmon-222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917.scope: Deactivated successfully.
Jan 31 02:55:03 np0005603623 podman[243776]: 2026-01-31 07:55:03.231955292 +0000 UTC m=+0.043072486 container remove 222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:55:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:03.236 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[22b52809-da84-4bcc-a411-b90dcd8bcbac]: (4, ('Sat Jan 31 07:55:03 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad (222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917)\n222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917\nSat Jan 31 07:55:03 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad (222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917)\n222ef25b03220b43e60b08e1443980d29c51d73217ce7d288206cea7b9bb9917\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:03.238 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[98bfacbe-de5a-402d-be5e-ea529b725433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:03.239 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7aba0eda-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:03 np0005603623 nova_compute[226235]: 2026-01-31 07:55:03.262 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:03 np0005603623 kernel: tap7aba0eda-80: left promiscuous mode
Jan 31 02:55:03 np0005603623 nova_compute[226235]: 2026-01-31 07:55:03.268 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:04.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:04.473 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[80e3aed4-9685-4dcb-af8c-554bfc8e563e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:04.483 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[29b3fa85-e6f9-430a-a34a-e049adb8c911]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:04.485 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ef4091f5-0d80-4176-a14b-2e1c2c8921df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:04.496 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[13cab71b-c20d-4da6-8d13-16800e42f487]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520415, 'reachable_time': 30487, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243788, 'error': None, 'target': 'ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:04 np0005603623 systemd[1]: run-netns-ovnmeta\x2d7aba0eda\x2d87fd\x2d423d\x2dbbf9\x2d46f52ee12dad.mount: Deactivated successfully.
Jan 31 02:55:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:04.500 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7aba0eda-87fd-423d-bbf9-46f52ee12dad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:55:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:04.500 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee1c6d0-4a14-4077-a885-85a5e2c39088]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:04.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:06.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:06.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:06 np0005603623 nova_compute[226235]: 2026-01-31 07:55:06.993 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:07 np0005603623 nova_compute[226235]: 2026-01-31 07:55:07.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:07 np0005603623 nova_compute[226235]: 2026-01-31 07:55:07.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 02:55:07 np0005603623 nova_compute[226235]: 2026-01-31 07:55:07.592 226239 DEBUG nova.compute.manager [req-10ebd12c-8957-4953-9ca4-2b8957b21826 req-5e16ca5f-4202-4cac-aa34-e3072b0b5142 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Received event network-vif-unplugged-2c75734d-f6b4-43fd-8b27-35fc421580dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:55:07 np0005603623 nova_compute[226235]: 2026-01-31 07:55:07.593 226239 DEBUG oslo_concurrency.lockutils [req-10ebd12c-8957-4953-9ca4-2b8957b21826 req-5e16ca5f-4202-4cac-aa34-e3072b0b5142 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:07 np0005603623 nova_compute[226235]: 2026-01-31 07:55:07.593 226239 DEBUG oslo_concurrency.lockutils [req-10ebd12c-8957-4953-9ca4-2b8957b21826 req-5e16ca5f-4202-4cac-aa34-e3072b0b5142 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:07 np0005603623 nova_compute[226235]: 2026-01-31 07:55:07.593 226239 DEBUG oslo_concurrency.lockutils [req-10ebd12c-8957-4953-9ca4-2b8957b21826 req-5e16ca5f-4202-4cac-aa34-e3072b0b5142 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:07 np0005603623 nova_compute[226235]: 2026-01-31 07:55:07.593 226239 DEBUG nova.compute.manager [req-10ebd12c-8957-4953-9ca4-2b8957b21826 req-5e16ca5f-4202-4cac-aa34-e3072b0b5142 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] No waiting events found dispatching network-vif-unplugged-2c75734d-f6b4-43fd-8b27-35fc421580dd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:55:07 np0005603623 nova_compute[226235]: 2026-01-31 07:55:07.593 226239 DEBUG nova.compute.manager [req-10ebd12c-8957-4953-9ca4-2b8957b21826 req-5e16ca5f-4202-4cac-aa34-e3072b0b5142 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Received event network-vif-unplugged-2c75734d-f6b4-43fd-8b27-35fc421580dd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:55:08 np0005603623 nova_compute[226235]: 2026-01-31 07:55:08.021 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:08.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:08.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:09 np0005603623 nova_compute[226235]: 2026-01-31 07:55:09.437 226239 INFO nova.virt.libvirt.driver [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Deleting instance files /var/lib/nova/instances/19f9b64a-5ffd-4930-8b5d-427c62ff87a4_del#033[00m
Jan 31 02:55:09 np0005603623 nova_compute[226235]: 2026-01-31 07:55:09.438 226239 INFO nova.virt.libvirt.driver [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Deletion of /var/lib/nova/instances/19f9b64a-5ffd-4930-8b5d-427c62ff87a4_del complete#033[00m
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.539740) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846109539772, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 474, "num_deletes": 251, "total_data_size": 721120, "memory_usage": 731368, "flush_reason": "Manual Compaction"}
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846109543758, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 476388, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27914, "largest_seqno": 28383, "table_properties": {"data_size": 473718, "index_size": 770, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6271, "raw_average_key_size": 18, "raw_value_size": 468442, "raw_average_value_size": 1415, "num_data_blocks": 34, "num_entries": 331, "num_filter_entries": 331, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846087, "oldest_key_time": 1769846087, "file_creation_time": 1769846109, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 4053 microseconds, and 1449 cpu microseconds.
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.543790) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 476388 bytes OK
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.543812) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.548064) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.548096) EVENT_LOG_v1 {"time_micros": 1769846109548089, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.548114) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 718247, prev total WAL file size 718247, number of live WAL files 2.
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.548558) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(465KB)], [54(12MB)]
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846109548669, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 13294214, "oldest_snapshot_seqno": -1}
Jan 31 02:55:09 np0005603623 nova_compute[226235]: 2026-01-31 07:55:09.653 226239 INFO nova.compute.manager [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Took 7.11 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:55:09 np0005603623 nova_compute[226235]: 2026-01-31 07:55:09.654 226239 DEBUG oslo.service.loopingcall [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:55:09 np0005603623 nova_compute[226235]: 2026-01-31 07:55:09.654 226239 DEBUG nova.compute.manager [-] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:55:09 np0005603623 nova_compute[226235]: 2026-01-31 07:55:09.654 226239 DEBUG nova.network.neutron [-] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5255 keys, 11325220 bytes, temperature: kUnknown
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846109726579, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 11325220, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11286701, "index_size": 24265, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13189, "raw_key_size": 133794, "raw_average_key_size": 25, "raw_value_size": 11188810, "raw_average_value_size": 2129, "num_data_blocks": 995, "num_entries": 5255, "num_filter_entries": 5255, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769846109, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.726805) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 11325220 bytes
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.727972) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 74.7 rd, 63.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 12.2 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(51.7) write-amplify(23.8) OK, records in: 5768, records dropped: 513 output_compression: NoCompression
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.727992) EVENT_LOG_v1 {"time_micros": 1769846109727983, "job": 32, "event": "compaction_finished", "compaction_time_micros": 177978, "compaction_time_cpu_micros": 19067, "output_level": 6, "num_output_files": 1, "total_output_size": 11325220, "num_input_records": 5768, "num_output_records": 5255, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846109728129, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846109729016, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.548488) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.729148) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.729153) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.729155) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.729157) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:55:09.729159) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:55:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:09 np0005603623 nova_compute[226235]: 2026-01-31 07:55:09.751 226239 DEBUG nova.compute.manager [req-dbedd85b-a961-407c-9cc1-343d1e1c1ccc req-03beef21-bcc1-4e78-a9e6-e6a9de43b894 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Received event network-vif-plugged-2c75734d-f6b4-43fd-8b27-35fc421580dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:55:09 np0005603623 nova_compute[226235]: 2026-01-31 07:55:09.752 226239 DEBUG oslo_concurrency.lockutils [req-dbedd85b-a961-407c-9cc1-343d1e1c1ccc req-03beef21-bcc1-4e78-a9e6-e6a9de43b894 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:09 np0005603623 nova_compute[226235]: 2026-01-31 07:55:09.752 226239 DEBUG oslo_concurrency.lockutils [req-dbedd85b-a961-407c-9cc1-343d1e1c1ccc req-03beef21-bcc1-4e78-a9e6-e6a9de43b894 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:09 np0005603623 nova_compute[226235]: 2026-01-31 07:55:09.753 226239 DEBUG oslo_concurrency.lockutils [req-dbedd85b-a961-407c-9cc1-343d1e1c1ccc req-03beef21-bcc1-4e78-a9e6-e6a9de43b894 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:09 np0005603623 nova_compute[226235]: 2026-01-31 07:55:09.753 226239 DEBUG nova.compute.manager [req-dbedd85b-a961-407c-9cc1-343d1e1c1ccc req-03beef21-bcc1-4e78-a9e6-e6a9de43b894 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] No waiting events found dispatching network-vif-plugged-2c75734d-f6b4-43fd-8b27-35fc421580dd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:55:09 np0005603623 nova_compute[226235]: 2026-01-31 07:55:09.753 226239 WARNING nova.compute.manager [req-dbedd85b-a961-407c-9cc1-343d1e1c1ccc req-03beef21-bcc1-4e78-a9e6-e6a9de43b894 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Received unexpected event network-vif-plugged-2c75734d-f6b4-43fd-8b27-35fc421580dd for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:55:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:10.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:10.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:11 np0005603623 nova_compute[226235]: 2026-01-31 07:55:11.544 226239 DEBUG nova.network.neutron [-] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:55:11 np0005603623 nova_compute[226235]: 2026-01-31 07:55:11.839 226239 INFO nova.compute.manager [-] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Took 2.18 seconds to deallocate network for instance.#033[00m
Jan 31 02:55:11 np0005603623 nova_compute[226235]: 2026-01-31 07:55:11.996 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:12 np0005603623 nova_compute[226235]: 2026-01-31 07:55:12.140 226239 DEBUG nova.compute.manager [req-4ff0764e-684a-4fa2-95f2-6536450725ba req-cc623c0d-03a6-44d1-be0a-d28546158038 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Received event network-vif-deleted-2c75734d-f6b4-43fd-8b27-35fc421580dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:55:12 np0005603623 nova_compute[226235]: 2026-01-31 07:55:12.282 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:12 np0005603623 nova_compute[226235]: 2026-01-31 07:55:12.386 226239 DEBUG oslo_concurrency.lockutils [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:12 np0005603623 nova_compute[226235]: 2026-01-31 07:55:12.387 226239 DEBUG oslo_concurrency.lockutils [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:12 np0005603623 nova_compute[226235]: 2026-01-31 07:55:12.421 226239 DEBUG nova.scheduler.client.report [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 02:55:12 np0005603623 nova_compute[226235]: 2026-01-31 07:55:12.476 226239 DEBUG nova.scheduler.client.report [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 02:55:12 np0005603623 nova_compute[226235]: 2026-01-31 07:55:12.477 226239 DEBUG nova.compute.provider_tree [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 02:55:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:12.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:12 np0005603623 nova_compute[226235]: 2026-01-31 07:55:12.514 226239 DEBUG nova.scheduler.client.report [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 02:55:12 np0005603623 nova_compute[226235]: 2026-01-31 07:55:12.572 226239 DEBUG nova.scheduler.client.report [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 02:55:12 np0005603623 nova_compute[226235]: 2026-01-31 07:55:12.611 226239 DEBUG oslo_concurrency.processutils [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:12.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:13 np0005603623 nova_compute[226235]: 2026-01-31 07:55:13.024 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:55:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1713937433' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:55:13 np0005603623 nova_compute[226235]: 2026-01-31 07:55:13.068 226239 DEBUG oslo_concurrency.processutils [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:13 np0005603623 nova_compute[226235]: 2026-01-31 07:55:13.073 226239 DEBUG nova.compute.provider_tree [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:55:13 np0005603623 nova_compute[226235]: 2026-01-31 07:55:13.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:14 np0005603623 nova_compute[226235]: 2026-01-31 07:55:14.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:14 np0005603623 nova_compute[226235]: 2026-01-31 07:55:14.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:14 np0005603623 nova_compute[226235]: 2026-01-31 07:55:14.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:55:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:14.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:55:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:14.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:15 np0005603623 nova_compute[226235]: 2026-01-31 07:55:15.614 226239 DEBUG nova.scheduler.client.report [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:55:15 np0005603623 nova_compute[226235]: 2026-01-31 07:55:15.862 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.057 226239 DEBUG oslo_concurrency.lockutils [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 3.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.060 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.060 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.060 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.060 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.316 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquiring lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.316 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.366 226239 INFO nova.scheduler.client.report [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Deleted allocations for instance 19f9b64a-5ffd-4930-8b5d-427c62ff87a4#033[00m
Jan 31 02:55:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:16.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:55:16 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/337920987' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.517 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.674 226239 DEBUG nova.compute.manager [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.717 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.718 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4737MB free_disk=20.911182403564453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.719 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.719 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:16.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:16 np0005603623 nova_compute[226235]: 2026-01-31 07:55:16.998 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:17 np0005603623 nova_compute[226235]: 2026-01-31 07:55:17.226 226239 DEBUG oslo_concurrency.lockutils [None req-029db93a-d9ac-4a3c-83a2-cde3853c713b 0384d6fc8c0b4f66bf382009760ab9f5 8a2b8f01b0d74b8588a3f97400e9fbff - - default default] Lock "19f9b64a-5ffd-4930-8b5d-427c62ff87a4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 14.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:17 np0005603623 nova_compute[226235]: 2026-01-31 07:55:17.368 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:17 np0005603623 nova_compute[226235]: 2026-01-31 07:55:17.454 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0eb7d937-6381-4fca-88d8-57be8d3f0a29 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Jan 31 02:55:17 np0005603623 nova_compute[226235]: 2026-01-31 07:55:17.455 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:55:17 np0005603623 nova_compute[226235]: 2026-01-31 07:55:17.455 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:55:17 np0005603623 nova_compute[226235]: 2026-01-31 07:55:17.538 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:17 np0005603623 nova_compute[226235]: 2026-01-31 07:55:17.981 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846102.980758, 19f9b64a-5ffd-4930-8b5d-427c62ff87a4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:55:17 np0005603623 nova_compute[226235]: 2026-01-31 07:55:17.982 226239 INFO nova.compute.manager [-] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:55:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:55:17 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2092718706' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:55:18 np0005603623 nova_compute[226235]: 2026-01-31 07:55:18.003 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:18 np0005603623 nova_compute[226235]: 2026-01-31 07:55:18.008 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:55:18 np0005603623 nova_compute[226235]: 2026-01-31 07:55:18.027 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:18 np0005603623 nova_compute[226235]: 2026-01-31 07:55:18.247 226239 DEBUG nova.compute.manager [None req-5189171e-1e25-4ea6-9b3c-fbf6009b6792 - - - - - -] [instance: 19f9b64a-5ffd-4930-8b5d-427c62ff87a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:55:18 np0005603623 nova_compute[226235]: 2026-01-31 07:55:18.253 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:55:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:18.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:18 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Jan 31 02:55:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:18.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:20.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:55:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:20.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:55:21 np0005603623 nova_compute[226235]: 2026-01-31 07:55:21.841 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:55:21 np0005603623 nova_compute[226235]: 2026-01-31 07:55:21.841 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:21 np0005603623 nova_compute[226235]: 2026-01-31 07:55:21.841 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 4.473s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:21 np0005603623 nova_compute[226235]: 2026-01-31 07:55:21.842 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:21 np0005603623 nova_compute[226235]: 2026-01-31 07:55:21.843 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 02:55:21 np0005603623 nova_compute[226235]: 2026-01-31 07:55:21.852 226239 DEBUG nova.virt.hardware [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:55:21 np0005603623 nova_compute[226235]: 2026-01-31 07:55:21.852 226239 INFO nova.compute.claims [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:55:22 np0005603623 nova_compute[226235]: 2026-01-31 07:55:22.000 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:22.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:22.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:23 np0005603623 nova_compute[226235]: 2026-01-31 07:55:23.031 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:23 np0005603623 nova_compute[226235]: 2026-01-31 07:55:23.864 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 02:55:23 np0005603623 nova_compute[226235]: 2026-01-31 07:55:23.865 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:24.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:24.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:25 np0005603623 nova_compute[226235]: 2026-01-31 07:55:25.483 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:25 np0005603623 nova_compute[226235]: 2026-01-31 07:55:25.483 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:25 np0005603623 nova_compute[226235]: 2026-01-31 07:55:25.483 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:55:25 np0005603623 nova_compute[226235]: 2026-01-31 07:55:25.484 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:55:25 np0005603623 nova_compute[226235]: 2026-01-31 07:55:25.609 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 02:55:25 np0005603623 nova_compute[226235]: 2026-01-31 07:55:25.609 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:55:25 np0005603623 nova_compute[226235]: 2026-01-31 07:55:25.610 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:25 np0005603623 nova_compute[226235]: 2026-01-31 07:55:25.616 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:25 np0005603623 nova_compute[226235]: 2026-01-31 07:55:25.616 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:55:25 np0005603623 nova_compute[226235]: 2026-01-31 07:55:25.811 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:55:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1934266441' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:55:26 np0005603623 nova_compute[226235]: 2026-01-31 07:55:26.272 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:26 np0005603623 nova_compute[226235]: 2026-01-31 07:55:26.279 226239 DEBUG nova.compute.provider_tree [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:55:26 np0005603623 nova_compute[226235]: 2026-01-31 07:55:26.307 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "3399eab2-419d-4742-b204-ab806dcda151" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:26 np0005603623 nova_compute[226235]: 2026-01-31 07:55:26.307 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "3399eab2-419d-4742-b204-ab806dcda151" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:26 np0005603623 nova_compute[226235]: 2026-01-31 07:55:26.428 226239 DEBUG nova.scheduler.client.report [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:55:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:26.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:26 np0005603623 nova_compute[226235]: 2026-01-31 07:55:26.627 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:55:26 np0005603623 nova_compute[226235]: 2026-01-31 07:55:26.891 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 5.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:26 np0005603623 nova_compute[226235]: 2026-01-31 07:55:26.892 226239 DEBUG nova.compute.manager [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:55:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:26.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:27 np0005603623 nova_compute[226235]: 2026-01-31 07:55:27.001 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:27 np0005603623 nova_compute[226235]: 2026-01-31 07:55:27.076 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:27 np0005603623 nova_compute[226235]: 2026-01-31 07:55:27.077 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:27 np0005603623 nova_compute[226235]: 2026-01-31 07:55:27.083 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:55:27 np0005603623 nova_compute[226235]: 2026-01-31 07:55:27.083 226239 INFO nova.compute.claims [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:55:27 np0005603623 nova_compute[226235]: 2026-01-31 07:55:27.487 226239 DEBUG nova.compute.manager [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:55:27 np0005603623 nova_compute[226235]: 2026-01-31 07:55:27.487 226239 DEBUG nova.network.neutron [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:55:27 np0005603623 nova_compute[226235]: 2026-01-31 07:55:27.611 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "bec297f5-8e63-412e-9cd3-8e859f89a123" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:27 np0005603623 nova_compute[226235]: 2026-01-31 07:55:27.611 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "bec297f5-8e63-412e-9cd3-8e859f89a123" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:27 np0005603623 nova_compute[226235]: 2026-01-31 07:55:27.676 226239 DEBUG nova.policy [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd6078cfaadaa45ae9256245554f784fe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd9f0c923b994b0295e72b111f661de1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:55:27 np0005603623 podman[243941]: 2026-01-31 07:55:27.94916554 +0000 UTC m=+0.047192787 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.028 226239 INFO nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:55:28 np0005603623 podman[243942]: 2026-01-31 07:55:28.031297343 +0000 UTC m=+0.129423843 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.032 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.061 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.100 226239 DEBUG nova.compute.manager [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.172 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.218 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.350 226239 DEBUG nova.compute.manager [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.351 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.352 226239 INFO nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Creating image(s)#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.388 226239 DEBUG nova.storage.rbd_utils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] rbd image 0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.427 226239 DEBUG nova.storage.rbd_utils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] rbd image 0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.459 226239 DEBUG nova.storage.rbd_utils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] rbd image 0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.465 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:28.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.527 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.528 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.529 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.530 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.567 226239 DEBUG nova.storage.rbd_utils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] rbd image 0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.572 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:55:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1181961811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.657 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.662 226239 DEBUG nova.compute.provider_tree [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.719 226239 DEBUG nova.scheduler.client.report [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.782 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.784 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.818 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.818 226239 INFO nova.compute.claims [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.826 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "c98f7f86-da83-4c5a-8456-904c2a427aba" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.827 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "c98f7f86-da83-4c5a-8456-904c2a427aba" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.882 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "c98f7f86-da83-4c5a-8456-904c2a427aba" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.883 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:55:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:28.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:28 np0005603623 nova_compute[226235]: 2026-01-31 07:55:28.999 226239 DEBUG nova.network.neutron [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Successfully created port: 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.021 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.021 226239 DEBUG nova.network.neutron [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.068 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.106 226239 INFO nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.169 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.181 226239 DEBUG nova.storage.rbd_utils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] resizing rbd image 0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.232 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.436 226239 DEBUG nova.network.neutron [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.437 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.446 226239 DEBUG nova.objects.instance [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lazy-loading 'migration_context' on Instance uuid 0eb7d937-6381-4fca-88d8-57be8d3f0a29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.558 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.559 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Ensure instance console log exists: /var/lib/nova/instances/0eb7d937-6381-4fca-88d8-57be8d3f0a29/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.559 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.559 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.559 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.667 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.669 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.670 226239 INFO nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Creating image(s)#033[00m
Jan 31 02:55:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:55:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2089867896' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.703 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 3399eab2-419d-4742-b204-ab806dcda151_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.741 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 3399eab2-419d-4742-b204-ab806dcda151_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.780 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 3399eab2-419d-4742-b204-ab806dcda151_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.787 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.804 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.572s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.813 226239 DEBUG nova.compute.provider_tree [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.851 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.851 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.852 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.852 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.878 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 3399eab2-419d-4742-b204-ab806dcda151_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:29 np0005603623 nova_compute[226235]: 2026-01-31 07:55:29.884 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 3399eab2-419d-4742-b204-ab806dcda151_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:30.089 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:30.090 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:30.090 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:55:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:30.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:55:30 np0005603623 nova_compute[226235]: 2026-01-31 07:55:30.555 226239 DEBUG nova.scheduler.client.report [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:55:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:30.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:32 np0005603623 nova_compute[226235]: 2026-01-31 07:55:32.004 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:32 np0005603623 nova_compute[226235]: 2026-01-31 07:55:32.307 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.524s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:32 np0005603623 nova_compute[226235]: 2026-01-31 07:55:32.354 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "c98f7f86-da83-4c5a-8456-904c2a427aba" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:32 np0005603623 nova_compute[226235]: 2026-01-31 07:55:32.355 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "c98f7f86-da83-4c5a-8456-904c2a427aba" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:32 np0005603623 nova_compute[226235]: 2026-01-31 07:55:32.392 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "c98f7f86-da83-4c5a-8456-904c2a427aba" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:32 np0005603623 nova_compute[226235]: 2026-01-31 07:55:32.392 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:55:32 np0005603623 nova_compute[226235]: 2026-01-31 07:55:32.477 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:55:32 np0005603623 nova_compute[226235]: 2026-01-31 07:55:32.477 226239 DEBUG nova.network.neutron [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:55:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:32.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:32 np0005603623 nova_compute[226235]: 2026-01-31 07:55:32.548 226239 INFO nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:55:32 np0005603623 nova_compute[226235]: 2026-01-31 07:55:32.686 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:55:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:32.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.034 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.169 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.170 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.171 226239 INFO nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Creating image(s)#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.194 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image bec297f5-8e63-412e-9cd3-8e859f89a123_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.218 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image bec297f5-8e63-412e-9cd3-8e859f89a123_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.248 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image bec297f5-8e63-412e-9cd3-8e859f89a123_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.252 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.303 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.304 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.305 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.305 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.335 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image bec297f5-8e63-412e-9cd3-8e859f89a123_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.339 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 bec297f5-8e63-412e-9cd3-8e859f89a123_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.669 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 3399eab2-419d-4742-b204-ab806dcda151_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.786s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:33 np0005603623 nova_compute[226235]: 2026-01-31 07:55:33.900 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] resizing rbd image 3399eab2-419d-4742-b204-ab806dcda151_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:55:34 np0005603623 nova_compute[226235]: 2026-01-31 07:55:34.410 226239 DEBUG nova.network.neutron [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 31 02:55:34 np0005603623 nova_compute[226235]: 2026-01-31 07:55:34.411 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:55:34 np0005603623 nova_compute[226235]: 2026-01-31 07:55:34.412 226239 DEBUG nova.network.neutron [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Successfully updated port: 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:55:34 np0005603623 nova_compute[226235]: 2026-01-31 07:55:34.414 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:34 np0005603623 nova_compute[226235]: 2026-01-31 07:55:34.417 226239 DEBUG nova.compute.manager [req-fcb2d90e-a6c0-467f-879e-0f703b8e851d req-a3f3c9cb-ef9d-44a2-b3a6-ea0efde494a2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Received event network-changed-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:55:34 np0005603623 nova_compute[226235]: 2026-01-31 07:55:34.418 226239 DEBUG nova.compute.manager [req-fcb2d90e-a6c0-467f-879e-0f703b8e851d req-a3f3c9cb-ef9d-44a2-b3a6-ea0efde494a2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Refreshing instance network info cache due to event network-changed-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:55:34 np0005603623 nova_compute[226235]: 2026-01-31 07:55:34.418 226239 DEBUG oslo_concurrency.lockutils [req-fcb2d90e-a6c0-467f-879e-0f703b8e851d req-a3f3c9cb-ef9d-44a2-b3a6-ea0efde494a2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-0eb7d937-6381-4fca-88d8-57be8d3f0a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:55:34 np0005603623 nova_compute[226235]: 2026-01-31 07:55:34.418 226239 DEBUG oslo_concurrency.lockutils [req-fcb2d90e-a6c0-467f-879e-0f703b8e851d req-a3f3c9cb-ef9d-44a2-b3a6-ea0efde494a2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-0eb7d937-6381-4fca-88d8-57be8d3f0a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:55:34 np0005603623 nova_compute[226235]: 2026-01-31 07:55:34.419 226239 DEBUG nova.network.neutron [req-fcb2d90e-a6c0-467f-879e-0f703b8e851d req-a3f3c9cb-ef9d-44a2-b3a6-ea0efde494a2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Refreshing network info cache for port 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:55:34 np0005603623 nova_compute[226235]: 2026-01-31 07:55:34.436 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:55:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:34.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:55:34 np0005603623 nova_compute[226235]: 2026-01-31 07:55:34.560 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquiring lock "refresh_cache-0eb7d937-6381-4fca-88d8-57be8d3f0a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:55:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:34.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:34 np0005603623 nova_compute[226235]: 2026-01-31 07:55:34.995 226239 DEBUG nova.network.neutron [req-fcb2d90e-a6c0-467f-879e-0f703b8e851d req-a3f3c9cb-ef9d-44a2-b3a6-ea0efde494a2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:55:36 np0005603623 nova_compute[226235]: 2026-01-31 07:55:36.075 226239 DEBUG nova.network.neutron [req-fcb2d90e-a6c0-467f-879e-0f703b8e851d req-a3f3c9cb-ef9d-44a2-b3a6-ea0efde494a2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:55:36 np0005603623 nova_compute[226235]: 2026-01-31 07:55:36.264 226239 DEBUG oslo_concurrency.lockutils [req-fcb2d90e-a6c0-467f-879e-0f703b8e851d req-a3f3c9cb-ef9d-44a2-b3a6-ea0efde494a2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-0eb7d937-6381-4fca-88d8-57be8d3f0a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:55:36 np0005603623 nova_compute[226235]: 2026-01-31 07:55:36.265 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquired lock "refresh_cache-0eb7d937-6381-4fca-88d8-57be8d3f0a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:55:36 np0005603623 nova_compute[226235]: 2026-01-31 07:55:36.265 226239 DEBUG nova.network.neutron [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:55:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:36.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:36 np0005603623 nova_compute[226235]: 2026-01-31 07:55:36.791 226239 DEBUG nova.objects.instance [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lazy-loading 'migration_context' on Instance uuid 3399eab2-419d-4742-b204-ab806dcda151 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:55:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:36.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.005 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.029 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.029 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Ensure instance console log exists: /var/lib/nova/instances/3399eab2-419d-4742-b204-ab806dcda151/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.030 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.030 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.030 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.032 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.035 226239 WARNING nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.041 226239 DEBUG nova.virt.libvirt.host [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.042 226239 DEBUG nova.virt.libvirt.host [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.046 226239 DEBUG nova.virt.libvirt.host [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.047 226239 DEBUG nova.virt.libvirt.host [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.048 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.048 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.049 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.049 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.049 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.049 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.049 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.050 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.050 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.050 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.050 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.050 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.053 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.475 226239 DEBUG nova.network.neutron [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:55:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:55:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/399831852' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.694 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 bec297f5-8e63-412e-9cd3-8e859f89a123_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.355s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:37 np0005603623 nova_compute[226235]: 2026-01-31 07:55:37.914 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.861s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:38 np0005603623 nova_compute[226235]: 2026-01-31 07:55:38.135 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 3399eab2-419d-4742-b204-ab806dcda151_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:38 np0005603623 nova_compute[226235]: 2026-01-31 07:55:38.139 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:38 np0005603623 nova_compute[226235]: 2026-01-31 07:55:38.158 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:38 np0005603623 nova_compute[226235]: 2026-01-31 07:55:38.201 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] resizing rbd image bec297f5-8e63-412e-9cd3-8e859f89a123_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:55:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:38.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:55:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/770437823' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:55:38 np0005603623 nova_compute[226235]: 2026-01-31 07:55:38.581 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:38 np0005603623 nova_compute[226235]: 2026-01-31 07:55:38.583 226239 DEBUG nova.objects.instance [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3399eab2-419d-4742-b204-ab806dcda151 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:55:38 np0005603623 nova_compute[226235]: 2026-01-31 07:55:38.743 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  <uuid>3399eab2-419d-4742-b204-ab806dcda151</uuid>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  <name>instance-00000023</name>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersOnMultiNodesTest-server-1567887052-1</nova:name>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:55:37</nova:creationTime>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <nova:user uuid="d4307bc8a2224140b78ba248cecefe55">tempest-ServersOnMultiNodesTest-1827677275-project-member</nova:user>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <nova:project uuid="b6dca32431594e2682c5d2acb448bbf4">tempest-ServersOnMultiNodesTest-1827677275</nova:project>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <entry name="serial">3399eab2-419d-4742-b204-ab806dcda151</entry>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <entry name="uuid">3399eab2-419d-4742-b204-ab806dcda151</entry>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/3399eab2-419d-4742-b204-ab806dcda151_disk">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/3399eab2-419d-4742-b204-ab806dcda151_disk.config">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/3399eab2-419d-4742-b204-ab806dcda151/console.log" append="off"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:55:38 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:55:38 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:55:38 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:55:38 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:55:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:38.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:39 np0005603623 nova_compute[226235]: 2026-01-31 07:55:39.357 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:55:39 np0005603623 nova_compute[226235]: 2026-01-31 07:55:39.357 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:55:39 np0005603623 nova_compute[226235]: 2026-01-31 07:55:39.358 226239 INFO nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Using config drive#033[00m
Jan 31 02:55:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:39 np0005603623 nova_compute[226235]: 2026-01-31 07:55:39.819 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 3399eab2-419d-4742-b204-ab806dcda151_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:39 np0005603623 nova_compute[226235]: 2026-01-31 07:55:39.824 226239 DEBUG nova.network.neutron [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Updating instance_info_cache with network_info: [{"id": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "address": "fa:16:3e:fa:91:f1", "network": {"id": "80d90d51-335c-4f74-8a61-143d47d84f22", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-991561978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd9f0c923b994b0295e72b111f661de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e38f6ff-37", "ovs_interfaceid": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.021 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Releasing lock "refresh_cache-0eb7d937-6381-4fca-88d8-57be8d3f0a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.022 226239 DEBUG nova.compute.manager [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Instance network_info: |[{"id": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "address": "fa:16:3e:fa:91:f1", "network": {"id": "80d90d51-335c-4f74-8a61-143d47d84f22", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-991561978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd9f0c923b994b0295e72b111f661de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e38f6ff-37", "ovs_interfaceid": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.026 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Start _get_guest_xml network_info=[{"id": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "address": "fa:16:3e:fa:91:f1", "network": {"id": "80d90d51-335c-4f74-8a61-143d47d84f22", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-991561978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd9f0c923b994b0295e72b111f661de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e38f6ff-37", "ovs_interfaceid": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.031 226239 WARNING nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.037 226239 DEBUG nova.virt.libvirt.host [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.038 226239 DEBUG nova.virt.libvirt.host [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.043 226239 DEBUG nova.virt.libvirt.host [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.044 226239 DEBUG nova.virt.libvirt.host [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.046 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.046 226239 DEBUG nova.virt.hardware [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.047 226239 DEBUG nova.virt.hardware [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.047 226239 DEBUG nova.virt.hardware [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.047 226239 DEBUG nova.virt.hardware [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.048 226239 DEBUG nova.virt.hardware [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.048 226239 DEBUG nova.virt.hardware [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.048 226239 DEBUG nova.virt.hardware [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.049 226239 DEBUG nova.virt.hardware [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.049 226239 DEBUG nova.virt.hardware [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.049 226239 DEBUG nova.virt.hardware [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.049 226239 DEBUG nova.virt.hardware [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.052 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.112 226239 INFO nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Creating config drive at /var/lib/nova/instances/3399eab2-419d-4742-b204-ab806dcda151/disk.config#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.116 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3399eab2-419d-4742-b204-ab806dcda151/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjzanrc3a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.236 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3399eab2-419d-4742-b204-ab806dcda151/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjzanrc3a" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.260 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 3399eab2-419d-4742-b204-ab806dcda151_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:40 np0005603623 nova_compute[226235]: 2026-01-31 07:55:40.263 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3399eab2-419d-4742-b204-ab806dcda151/disk.config 3399eab2-419d-4742-b204-ab806dcda151_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:40.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:55:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1447467291' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:55:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:40.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:41 np0005603623 nova_compute[226235]: 2026-01-31 07:55:41.897 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.845s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:42 np0005603623 nova_compute[226235]: 2026-01-31 07:55:42.460 226239 DEBUG nova.storage.rbd_utils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] rbd image 0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:42 np0005603623 nova_compute[226235]: 2026-01-31 07:55:42.464 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:42 np0005603623 nova_compute[226235]: 2026-01-31 07:55:42.477 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:42.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:42 np0005603623 nova_compute[226235]: 2026-01-31 07:55:42.936 226239 DEBUG nova.objects.instance [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lazy-loading 'migration_context' on Instance uuid bec297f5-8e63-412e-9cd3-8e859f89a123 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:55:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:42.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.141 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.142 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Ensure instance console log exists: /var/lib/nova/instances/bec297f5-8e63-412e-9cd3-8e859f89a123/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.142 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.143 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.143 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.144 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.150 226239 WARNING nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.156 226239 DEBUG nova.virt.libvirt.host [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.158 226239 DEBUG nova.virt.libvirt.host [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.160 226239 DEBUG nova.virt.libvirt.host [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.162 226239 DEBUG nova.virt.libvirt.host [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.164 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.164 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.165 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.165 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.166 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.166 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.166 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.167 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.167 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.167 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.167 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.168 226239 DEBUG nova.virt.hardware [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.170 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.206 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:55:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4232692933' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.858 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.395s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.861 226239 DEBUG nova.virt.libvirt.vif [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:55:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-UpdateMultiattachVolumeNegativeTest-server-356500433',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-updatemultiattachvolumenegativetest-server-356500433',id=34,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYVdi1LnaHZ5r6xMfeklqfzjDViAexljM9P3M0Fy5FZ3Xolf4vxCOKTYu0NFlJGf4EcZe3GteIpoGaJZuwWfVMuKuQVsr/qX8LdXn5NJVOqUqTS1m1sSlyZl2teCw6PaQ==',key_name='tempest-keypair-1101838222',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd9f0c923b994b0295e72b111f661de1',ramdisk_id='',reservation_id='r-3c4eit9j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-UpdateMultiattachVolumeNegativeTest-860437657',owner_user_name='tempest-UpdateMultiattachVolumeNegativeTest-860437657-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:55:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d6078cfaadaa45ae9256245554f784fe',uuid=0eb7d937-6381-4fca-88d8-57be8d3f0a29,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "address": "fa:16:3e:fa:91:f1", "network": {"id": "80d90d51-335c-4f74-8a61-143d47d84f22", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-991561978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd9f0c923b994b0295e72b111f661de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e38f6ff-37", "ovs_interfaceid": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.863 226239 DEBUG nova.network.os_vif_util [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Converting VIF {"id": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "address": "fa:16:3e:fa:91:f1", "network": {"id": "80d90d51-335c-4f74-8a61-143d47d84f22", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-991561978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd9f0c923b994b0295e72b111f661de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e38f6ff-37", "ovs_interfaceid": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.864 226239 DEBUG nova.network.os_vif_util [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:f1,bridge_name='br-int',has_traffic_filtering=True,id=6e38f6ff-3729-4d12-9f54-6c01e6aae5aa,network=Network(80d90d51-335c-4f74-8a61-143d47d84f22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e38f6ff-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:55:43 np0005603623 nova_compute[226235]: 2026-01-31 07:55:43.866 226239 DEBUG nova.objects.instance [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0eb7d937-6381-4fca-88d8-57be8d3f0a29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:55:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:55:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1500264976' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.006 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  <uuid>0eb7d937-6381-4fca-88d8-57be8d3f0a29</uuid>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  <name>instance-00000022</name>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <nova:name>tempest-UpdateMultiattachVolumeNegativeTest-server-356500433</nova:name>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:55:40</nova:creationTime>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <nova:user uuid="d6078cfaadaa45ae9256245554f784fe">tempest-UpdateMultiattachVolumeNegativeTest-860437657-project-member</nova:user>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <nova:project uuid="fd9f0c923b994b0295e72b111f661de1">tempest-UpdateMultiattachVolumeNegativeTest-860437657</nova:project>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <nova:port uuid="6e38f6ff-3729-4d12-9f54-6c01e6aae5aa">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <entry name="serial">0eb7d937-6381-4fca-88d8-57be8d3f0a29</entry>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <entry name="uuid">0eb7d937-6381-4fca-88d8-57be8d3f0a29</entry>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk.config">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:fa:91:f1"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <target dev="tap6e38f6ff-37"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/0eb7d937-6381-4fca-88d8-57be8d3f0a29/console.log" append="off"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:55:44 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:55:44 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:55:44 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:55:44 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.006 226239 DEBUG nova.compute.manager [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Preparing to wait for external event network-vif-plugged-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.006 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquiring lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.007 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.007 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.007 226239 DEBUG nova.virt.libvirt.vif [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:55:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-UpdateMultiattachVolumeNegativeTest-server-356500433',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-updatemultiattachvolumenegativetest-server-356500433',id=34,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYVdi1LnaHZ5r6xMfeklqfzjDViAexljM9P3M0Fy5FZ3Xolf4vxCOKTYu0NFlJGf4EcZe3GteIpoGaJZuwWfVMuKuQVsr/qX8LdXn5NJVOqUqTS1m1sSlyZl2teCw6PaQ==',key_name='tempest-keypair-1101838222',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd9f0c923b994b0295e72b111f661de1',ramdisk_id='',reservation_id='r-3c4eit9j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-UpdateMultiattachVolumeNegativeTest-860437657',owner_user_name='tempest-UpdateMultiattachVolumeNegativeTest-860437657-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:55:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d6078cfaadaa45ae9256245554f784fe',uuid=0eb7d937-6381-4fca-88d8-57be8d3f0a29,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "address": "fa:16:3e:fa:91:f1", "network": {"id": "80d90d51-335c-4f74-8a61-143d47d84f22", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-991561978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd9f0c923b994b0295e72b111f661de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e38f6ff-37", "ovs_interfaceid": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.008 226239 DEBUG nova.network.os_vif_util [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Converting VIF {"id": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "address": "fa:16:3e:fa:91:f1", "network": {"id": "80d90d51-335c-4f74-8a61-143d47d84f22", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-991561978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd9f0c923b994b0295e72b111f661de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e38f6ff-37", "ovs_interfaceid": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.008 226239 DEBUG nova.network.os_vif_util [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:f1,bridge_name='br-int',has_traffic_filtering=True,id=6e38f6ff-3729-4d12-9f54-6c01e6aae5aa,network=Network(80d90d51-335c-4f74-8a61-143d47d84f22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e38f6ff-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.009 226239 DEBUG os_vif [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:f1,bridge_name='br-int',has_traffic_filtering=True,id=6e38f6ff-3729-4d12-9f54-6c01e6aae5aa,network=Network(80d90d51-335c-4f74-8a61-143d47d84f22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e38f6ff-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.010 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.010 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.011 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.013 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.013 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e38f6ff-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.014 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e38f6ff-37, col_values=(('external_ids', {'iface-id': '6e38f6ff-3729-4d12-9f54-6c01e6aae5aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:91:f1', 'vm-uuid': '0eb7d937-6381-4fca-88d8-57be8d3f0a29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.015 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:44 np0005603623 NetworkManager[48970]: <info>  [1769846144.0164] manager: (tap6e38f6ff-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.017 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.019 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.021 226239 INFO os_vif [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:f1,bridge_name='br-int',has_traffic_filtering=True,id=6e38f6ff-3729-4d12-9f54-6c01e6aae5aa,network=Network(80d90d51-335c-4f74-8a61-143d47d84f22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e38f6ff-37')#033[00m
Jan 31 02:55:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:44.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.654 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.689 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image bec297f5-8e63-412e-9cd3-8e859f89a123_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.693 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.890 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.890 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.891 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] No VIF found with MAC fa:16:3e:fa:91:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.891 226239 INFO nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Using config drive#033[00m
Jan 31 02:55:44 np0005603623 nova_compute[226235]: 2026-01-31 07:55:44.916 226239 DEBUG nova.storage.rbd_utils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] rbd image 0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:44.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:55:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/893141744' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:55:45 np0005603623 nova_compute[226235]: 2026-01-31 07:55:45.179 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:45 np0005603623 nova_compute[226235]: 2026-01-31 07:55:45.181 226239 DEBUG nova.objects.instance [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lazy-loading 'pci_devices' on Instance uuid bec297f5-8e63-412e-9cd3-8e859f89a123 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:55:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:46.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:46.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:47 np0005603623 nova_compute[226235]: 2026-01-31 07:55:47.019 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:47 np0005603623 nova_compute[226235]: 2026-01-31 07:55:47.095 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3399eab2-419d-4742-b204-ab806dcda151/disk.config 3399eab2-419d-4742-b204-ab806dcda151_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 6.832s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:47 np0005603623 nova_compute[226235]: 2026-01-31 07:55:47.096 226239 INFO nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Deleting local config drive /var/lib/nova/instances/3399eab2-419d-4742-b204-ab806dcda151/disk.config because it was imported into RBD.#033[00m
Jan 31 02:55:47 np0005603623 systemd-machined[194379]: New machine qemu-20-instance-00000023.
Jan 31 02:55:47 np0005603623 systemd[1]: Started Virtual Machine qemu-20-instance-00000023.
Jan 31 02:55:47 np0005603623 nova_compute[226235]: 2026-01-31 07:55:47.653 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846147.6534343, 3399eab2-419d-4742-b204-ab806dcda151 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:55:47 np0005603623 nova_compute[226235]: 2026-01-31 07:55:47.654 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3399eab2-419d-4742-b204-ab806dcda151] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:55:47 np0005603623 nova_compute[226235]: 2026-01-31 07:55:47.656 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:55:47 np0005603623 nova_compute[226235]: 2026-01-31 07:55:47.656 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:55:47 np0005603623 nova_compute[226235]: 2026-01-31 07:55:47.661 226239 INFO nova.virt.libvirt.driver [-] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Instance spawned successfully.#033[00m
Jan 31 02:55:47 np0005603623 nova_compute[226235]: 2026-01-31 07:55:47.662 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:55:48 np0005603623 nova_compute[226235]: 2026-01-31 07:55:48.216 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  <uuid>bec297f5-8e63-412e-9cd3-8e859f89a123</uuid>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  <name>instance-00000024</name>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersOnMultiNodesTest-server-1567887052-2</nova:name>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:55:43</nova:creationTime>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <nova:user uuid="d4307bc8a2224140b78ba248cecefe55">tempest-ServersOnMultiNodesTest-1827677275-project-member</nova:user>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <nova:project uuid="b6dca32431594e2682c5d2acb448bbf4">tempest-ServersOnMultiNodesTest-1827677275</nova:project>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <entry name="serial">bec297f5-8e63-412e-9cd3-8e859f89a123</entry>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <entry name="uuid">bec297f5-8e63-412e-9cd3-8e859f89a123</entry>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/bec297f5-8e63-412e-9cd3-8e859f89a123_disk">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/bec297f5-8e63-412e-9cd3-8e859f89a123_disk.config">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/bec297f5-8e63-412e-9cd3-8e859f89a123/console.log" append="off"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:55:48 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:55:48 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:55:48 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:55:48 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:55:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:48.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:48.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:49 np0005603623 nova_compute[226235]: 2026-01-31 07:55:49.015 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e158 e158: 3 total, 3 up, 3 in
Jan 31 02:55:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.035 226239 INFO nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Creating config drive at /var/lib/nova/instances/0eb7d937-6381-4fca-88d8-57be8d3f0a29/disk.config#033[00m
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.041 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0eb7d937-6381-4fca-88d8-57be8d3f0a29/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpo7o27eqn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.163 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0eb7d937-6381-4fca-88d8-57be8d3f0a29/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpo7o27eqn" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.194 226239 DEBUG nova.storage.rbd_utils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] rbd image 0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.199 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0eb7d937-6381-4fca-88d8-57be8d3f0a29/disk.config 0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:50.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.921 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.930 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.933 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.934 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.934 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.935 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.935 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:55:50 np0005603623 nova_compute[226235]: 2026-01-31 07:55:50.935 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:55:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:50.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:52 np0005603623 nova_compute[226235]: 2026-01-31 07:55:52.055 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:52.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:52.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:53 np0005603623 nova_compute[226235]: 2026-01-31 07:55:53.066 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3399eab2-419d-4742-b204-ab806dcda151] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:55:53 np0005603623 nova_compute[226235]: 2026-01-31 07:55:53.067 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846147.6561213, 3399eab2-419d-4742-b204-ab806dcda151 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:55:53 np0005603623 nova_compute[226235]: 2026-01-31 07:55:53.067 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3399eab2-419d-4742-b204-ab806dcda151] VM Started (Lifecycle Event)#033[00m
Jan 31 02:55:53 np0005603623 nova_compute[226235]: 2026-01-31 07:55:53.186 226239 DEBUG oslo_concurrency.processutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0eb7d937-6381-4fca-88d8-57be8d3f0a29/disk.config 0eb7d937-6381-4fca-88d8-57be8d3f0a29_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.988s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:53 np0005603623 nova_compute[226235]: 2026-01-31 07:55:53.187 226239 INFO nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Deleting local config drive /var/lib/nova/instances/0eb7d937-6381-4fca-88d8-57be8d3f0a29/disk.config because it was imported into RBD.#033[00m
Jan 31 02:55:53 np0005603623 kernel: tap6e38f6ff-37: entered promiscuous mode
Jan 31 02:55:53 np0005603623 NetworkManager[48970]: <info>  [1769846153.2435] manager: (tap6e38f6ff-37): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Jan 31 02:55:53 np0005603623 ovn_controller[133449]: 2026-01-31T07:55:53Z|00153|binding|INFO|Claiming lport 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa for this chassis.
Jan 31 02:55:53 np0005603623 ovn_controller[133449]: 2026-01-31T07:55:53Z|00154|binding|INFO|6e38f6ff-3729-4d12-9f54-6c01e6aae5aa: Claiming fa:16:3e:fa:91:f1 10.100.0.13
Jan 31 02:55:53 np0005603623 nova_compute[226235]: 2026-01-31 07:55:53.271 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:53 np0005603623 systemd-udevd[244965]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:55:53 np0005603623 nova_compute[226235]: 2026-01-31 07:55:53.278 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:53 np0005603623 nova_compute[226235]: 2026-01-31 07:55:53.291 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:53 np0005603623 NetworkManager[48970]: <info>  [1769846153.2963] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 31 02:55:53 np0005603623 NetworkManager[48970]: <info>  [1769846153.2984] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Jan 31 02:55:53 np0005603623 NetworkManager[48970]: <info>  [1769846153.2994] device (tap6e38f6ff-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:55:53 np0005603623 NetworkManager[48970]: <info>  [1769846153.3014] device (tap6e38f6ff-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:55:53 np0005603623 systemd-machined[194379]: New machine qemu-21-instance-00000022.
Jan 31 02:55:53 np0005603623 systemd[1]: Started Virtual Machine qemu-21-instance-00000022.
Jan 31 02:55:53 np0005603623 nova_compute[226235]: 2026-01-31 07:55:53.382 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:53 np0005603623 nova_compute[226235]: 2026-01-31 07:55:53.400 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:53 np0005603623 nova_compute[226235]: 2026-01-31 07:55:53.619 226239 INFO nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Took 23.95 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:55:53 np0005603623 nova_compute[226235]: 2026-01-31 07:55:53.620 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:55:54 np0005603623 nova_compute[226235]: 2026-01-31 07:55:54.017 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.390 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:91:f1 10.100.0.13'], port_security=['fa:16:3e:fa:91:f1 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0eb7d937-6381-4fca-88d8-57be8d3f0a29', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80d90d51-335c-4f74-8a61-143d47d84f22', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd9f0c923b994b0295e72b111f661de1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '026daea3-1ff6-4616-9656-065604061a00', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd84a3ff-b232-4c39-928c-e2cb3c0840e0, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=6e38f6ff-3729-4d12-9f54-6c01e6aae5aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.391 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa in datapath 80d90d51-335c-4f74-8a61-143d47d84f22 bound to our chassis#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.393 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 80d90d51-335c-4f74-8a61-143d47d84f22#033[00m
Jan 31 02:55:54 np0005603623 ovn_controller[133449]: 2026-01-31T07:55:54Z|00155|binding|INFO|Setting lport 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa ovn-installed in OVS
Jan 31 02:55:54 np0005603623 ovn_controller[133449]: 2026-01-31T07:55:54Z|00156|binding|INFO|Setting lport 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa up in Southbound
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.402 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fcb73d96-5ed7-4b7f-9737-2759b64732b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.403 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap80d90d51-31 in ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.404 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap80d90d51-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.404 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7de0a5be-044a-4708-9578-a6757143f17a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.405 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[51f3278e-9eba-4ba8-917a-b2e7d965a3bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 nova_compute[226235]: 2026-01-31 07:55:54.415 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.415 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[d239774f-f6fc-431f-8eff-68c5be4acd9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.424 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ba677d37-e217-4b4a-a3cb-3fb156bf6726]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.449 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ce174dbe-b051-445a-9bca-2f01863f069a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 NetworkManager[48970]: <info>  [1769846154.4565] manager: (tap80d90d51-30): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.455 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ce9be14d-6757-4865-bb9d-5a30016921ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 systemd-udevd[244970]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.488 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d037927b-d0f3-4f08-88c3-8ff7cfb25e32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.492 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[dfbf0a90-6fd1-4d67-bfeb-34deb5ef8c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 NetworkManager[48970]: <info>  [1769846154.5115] device (tap80d90d51-30): carrier: link connected
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.515 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[26dc01d3-ef3f-45b2-93d1-2d36926bcf21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.530 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[07d118cb-6f44-4278-936b-92871420f524]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap80d90d51-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:7c:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529494, 'reachable_time': 35118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245047, 'error': None, 'target': 'ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:54.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.543 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c0c4ee7d-62b5-4a9a-ac70-ac09dcb3056d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef9:7c63'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529494, 'tstamp': 529494}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245048, 'error': None, 'target': 'ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.557 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[854545b0-c1c4-4185-b1b9-fca84d5b2b58]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap80d90d51-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:7c:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529494, 'reachable_time': 35118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245049, 'error': None, 'target': 'ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.582 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[75da7431-5491-40e7-809c-057177963084]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.626 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d30c400d-3568-467a-8a0c-cf8942890762]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.628 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80d90d51-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.628 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.629 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap80d90d51-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:54 np0005603623 NetworkManager[48970]: <info>  [1769846154.6312] manager: (tap80d90d51-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Jan 31 02:55:54 np0005603623 kernel: tap80d90d51-30: entered promiscuous mode
Jan 31 02:55:54 np0005603623 nova_compute[226235]: 2026-01-31 07:55:54.632 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.634 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap80d90d51-30, col_values=(('external_ids', {'iface-id': '8d9a016f-907b-4797-b88c-cdfc5c832335'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:54 np0005603623 ovn_controller[133449]: 2026-01-31T07:55:54Z|00157|binding|INFO|Releasing lport 8d9a016f-907b-4797-b88c-cdfc5c832335 from this chassis (sb_readonly=1)
Jan 31 02:55:54 np0005603623 nova_compute[226235]: 2026-01-31 07:55:54.635 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:54 np0005603623 nova_compute[226235]: 2026-01-31 07:55:54.642 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.643 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/80d90d51-335c-4f74-8a61-143d47d84f22.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/80d90d51-335c-4f74-8a61-143d47d84f22.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.644 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1419ace0-92ad-4592-a29d-af27745d8c9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.646 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-80d90d51-335c-4f74-8a61-143d47d84f22
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/80d90d51-335c-4f74-8a61-143d47d84f22.pid.haproxy
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 80d90d51-335c-4f74-8a61-143d47d84f22
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:55:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:55:54.646 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22', 'env', 'PROCESS_TAG=haproxy-80d90d51-335c-4f74-8a61-143d47d84f22', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/80d90d51-335c-4f74-8a61-143d47d84f22.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:55:54 np0005603623 nova_compute[226235]: 2026-01-31 07:55:54.677 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:55:54 np0005603623 nova_compute[226235]: 2026-01-31 07:55:54.677 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:55:54 np0005603623 nova_compute[226235]: 2026-01-31 07:55:54.678 226239 INFO nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Using config drive#033[00m
Jan 31 02:55:54 np0005603623 nova_compute[226235]: 2026-01-31 07:55:54.702 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image bec297f5-8e63-412e-9cd3-8e859f89a123_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:54.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:54 np0005603623 podman[245100]: 2026-01-31 07:55:54.974348511 +0000 UTC m=+0.048766824 container create 7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 02:55:55 np0005603623 systemd[1]: Started libpod-conmon-7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938.scope.
Jan 31 02:55:55 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:55:55 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3da8f8953511c3e0812eb3b8cccef1b2415a69cc7f989ccb91b4a19db974fc6b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:55:55 np0005603623 podman[245100]: 2026-01-31 07:55:54.948020314 +0000 UTC m=+0.022438657 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:55:55 np0005603623 podman[245100]: 2026-01-31 07:55:55.053167491 +0000 UTC m=+0.127585824 container init 7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 02:55:55 np0005603623 podman[245100]: 2026-01-31 07:55:55.058389485 +0000 UTC m=+0.132807788 container start 7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 02:55:55 np0005603623 neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22[245116]: [NOTICE]   (245120) : New worker (245122) forked
Jan 31 02:55:55 np0005603623 neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22[245116]: [NOTICE]   (245120) : Loading success.
Jan 31 02:55:55 np0005603623 nova_compute[226235]: 2026-01-31 07:55:55.723 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:55:55 np0005603623 nova_compute[226235]: 2026-01-31 07:55:55.730 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:55:56 np0005603623 nova_compute[226235]: 2026-01-31 07:55:56.445 226239 INFO nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Took 29.39 seconds to build instance.#033[00m
Jan 31 02:55:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:56.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:56 np0005603623 nova_compute[226235]: 2026-01-31 07:55:56.702 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846154.2099898, 0eb7d937-6381-4fca-88d8-57be8d3f0a29 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:55:56 np0005603623 nova_compute[226235]: 2026-01-31 07:55:56.703 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] VM Started (Lifecycle Event)#033[00m
Jan 31 02:55:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:56.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:57 np0005603623 nova_compute[226235]: 2026-01-31 07:55:57.057 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:58.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:58 np0005603623 nova_compute[226235]: 2026-01-31 07:55:58.541 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "3399eab2-419d-4742-b204-ab806dcda151" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 32.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:55:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:55:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:58.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:55:58 np0005603623 podman[245314]: 2026-01-31 07:55:58.99558246 +0000 UTC m=+0.086913995 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 02:55:58 np0005603623 podman[245313]: 2026-01-31 07:55:58.99558418 +0000 UTC m=+0.085581734 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:55:59 np0005603623 nova_compute[226235]: 2026-01-31 07:55:59.019 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:59 np0005603623 nova_compute[226235]: 2026-01-31 07:55:59.057 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:55:59 np0005603623 nova_compute[226235]: 2026-01-31 07:55:59.062 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846154.2100768, 0eb7d937-6381-4fca-88d8-57be8d3f0a29 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:55:59 np0005603623 nova_compute[226235]: 2026-01-31 07:55:59.062 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:55:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:55:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:55:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:55:59 np0005603623 nova_compute[226235]: 2026-01-31 07:55:59.403 226239 INFO nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Creating config drive at /var/lib/nova/instances/bec297f5-8e63-412e-9cd3-8e859f89a123/disk.config#033[00m
Jan 31 02:55:59 np0005603623 nova_compute[226235]: 2026-01-31 07:55:59.407 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bec297f5-8e63-412e-9cd3-8e859f89a123/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphg3h31dg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:59 np0005603623 nova_compute[226235]: 2026-01-31 07:55:59.531 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bec297f5-8e63-412e-9cd3-8e859f89a123/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphg3h31dg" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:59 np0005603623 nova_compute[226235]: 2026-01-31 07:55:59.564 226239 DEBUG nova.storage.rbd_utils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image bec297f5-8e63-412e-9cd3-8e859f89a123_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:59 np0005603623 nova_compute[226235]: 2026-01-31 07:55:59.568 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bec297f5-8e63-412e-9cd3-8e859f89a123/disk.config bec297f5-8e63-412e-9cd3-8e859f89a123_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:59 np0005603623 nova_compute[226235]: 2026-01-31 07:55:59.857 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:55:59 np0005603623 nova_compute[226235]: 2026-01-31 07:55:59.862 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:55:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:00 np0005603623 nova_compute[226235]: 2026-01-31 07:56:00.011 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:56:00 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:56:00 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:56:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:00.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:00.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:01 np0005603623 nova_compute[226235]: 2026-01-31 07:56:01.344 226239 DEBUG oslo_concurrency.processutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bec297f5-8e63-412e-9cd3-8e859f89a123/disk.config bec297f5-8e63-412e-9cd3-8e859f89a123_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.775s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:01 np0005603623 nova_compute[226235]: 2026-01-31 07:56:01.344 226239 INFO nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Deleting local config drive /var/lib/nova/instances/bec297f5-8e63-412e-9cd3-8e859f89a123/disk.config because it was imported into RBD.#033[00m
Jan 31 02:56:01 np0005603623 systemd-machined[194379]: New machine qemu-22-instance-00000024.
Jan 31 02:56:01 np0005603623 systemd[1]: Started Virtual Machine qemu-22-instance-00000024.
Jan 31 02:56:01 np0005603623 nova_compute[226235]: 2026-01-31 07:56:01.897 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846161.8973827, bec297f5-8e63-412e-9cd3-8e859f89a123 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:56:01 np0005603623 nova_compute[226235]: 2026-01-31 07:56:01.898 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:56:01 np0005603623 nova_compute[226235]: 2026-01-31 07:56:01.901 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:56:01 np0005603623 nova_compute[226235]: 2026-01-31 07:56:01.901 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:56:01 np0005603623 nova_compute[226235]: 2026-01-31 07:56:01.904 226239 INFO nova.virt.libvirt.driver [-] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Instance spawned successfully.#033[00m
Jan 31 02:56:01 np0005603623 nova_compute[226235]: 2026-01-31 07:56:01.905 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.049 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.053 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.058 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.263 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.264 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.264 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.265 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.265 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.265 226239 DEBUG nova.virt.libvirt.driver [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:02.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.562 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.562 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846161.8983655, bec297f5-8e63-412e-9cd3-8e859f89a123 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.562 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] VM Started (Lifecycle Event)#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.626 226239 DEBUG nova.compute.manager [req-0ff18910-a740-46f4-ad78-ae447ca53a17 req-d2108186-cfaa-4e1a-ba17-2eda6f8b7f3a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Received event network-vif-plugged-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.627 226239 DEBUG oslo_concurrency.lockutils [req-0ff18910-a740-46f4-ad78-ae447ca53a17 req-d2108186-cfaa-4e1a-ba17-2eda6f8b7f3a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.627 226239 DEBUG oslo_concurrency.lockutils [req-0ff18910-a740-46f4-ad78-ae447ca53a17 req-d2108186-cfaa-4e1a-ba17-2eda6f8b7f3a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.627 226239 DEBUG oslo_concurrency.lockutils [req-0ff18910-a740-46f4-ad78-ae447ca53a17 req-d2108186-cfaa-4e1a-ba17-2eda6f8b7f3a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.628 226239 DEBUG nova.compute.manager [req-0ff18910-a740-46f4-ad78-ae447ca53a17 req-d2108186-cfaa-4e1a-ba17-2eda6f8b7f3a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Processing event network-vif-plugged-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.629 226239 DEBUG nova.compute.manager [req-0ff18910-a740-46f4-ad78-ae447ca53a17 req-d2108186-cfaa-4e1a-ba17-2eda6f8b7f3a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Received event network-vif-plugged-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.630 226239 DEBUG oslo_concurrency.lockutils [req-0ff18910-a740-46f4-ad78-ae447ca53a17 req-d2108186-cfaa-4e1a-ba17-2eda6f8b7f3a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.630 226239 DEBUG oslo_concurrency.lockutils [req-0ff18910-a740-46f4-ad78-ae447ca53a17 req-d2108186-cfaa-4e1a-ba17-2eda6f8b7f3a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.631 226239 DEBUG oslo_concurrency.lockutils [req-0ff18910-a740-46f4-ad78-ae447ca53a17 req-d2108186-cfaa-4e1a-ba17-2eda6f8b7f3a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.631 226239 DEBUG nova.compute.manager [req-0ff18910-a740-46f4-ad78-ae447ca53a17 req-d2108186-cfaa-4e1a-ba17-2eda6f8b7f3a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] No waiting events found dispatching network-vif-plugged-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.631 226239 WARNING nova.compute.manager [req-0ff18910-a740-46f4-ad78-ae447ca53a17 req-d2108186-cfaa-4e1a-ba17-2eda6f8b7f3a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Received unexpected event network-vif-plugged-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa for instance with vm_state building and task_state spawning.#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.633 226239 DEBUG nova.compute.manager [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.638 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.642 226239 INFO nova.virt.libvirt.driver [-] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Instance spawned successfully.#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.642 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.687 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:56:02 np0005603623 nova_compute[226235]: 2026-01-31 07:56:02.691 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:56:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:02.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.610 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.612 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846162.6369243, 0eb7d937-6381-4fca-88d8-57be8d3f0a29 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.612 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.616 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.616 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.616 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.617 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.617 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.617 226239 DEBUG nova.virt.libvirt.driver [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.639 226239 INFO nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Took 30.47 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.640 226239 DEBUG nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.830 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.832 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:56:03 np0005603623 nova_compute[226235]: 2026-01-31 07:56:03.984 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:56:04 np0005603623 nova_compute[226235]: 2026-01-31 07:56:04.011 226239 INFO nova.compute.manager [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Took 35.66 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:56:04 np0005603623 nova_compute[226235]: 2026-01-31 07:56:04.011 226239 DEBUG nova.compute.manager [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:56:04 np0005603623 nova_compute[226235]: 2026-01-31 07:56:04.020 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:04 np0005603623 nova_compute[226235]: 2026-01-31 07:56:04.046 226239 INFO nova.compute.manager [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Took 35.85 seconds to build instance.#033[00m
Jan 31 02:56:04 np0005603623 nova_compute[226235]: 2026-01-31 07:56:04.151 226239 DEBUG oslo_concurrency.lockutils [None req-da74f3c8-4fa8-467d-b52e-cfee18664c84 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "bec297f5-8e63-412e-9cd3-8e859f89a123" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 36.540s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:04 np0005603623 nova_compute[226235]: 2026-01-31 07:56:04.220 226239 INFO nova.compute.manager [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Took 46.98 seconds to build instance.#033[00m
Jan 31 02:56:04 np0005603623 nova_compute[226235]: 2026-01-31 07:56:04.322 226239 DEBUG oslo_concurrency.lockutils [None req-4dea3830-2c44-42b0-ad70-94a9ca7f3fa1 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 48.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:04.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:04.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:06.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:06.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:07 np0005603623 nova_compute[226235]: 2026-01-31 07:56:07.061 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:56:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:08.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:56:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:08.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:09 np0005603623 nova_compute[226235]: 2026-01-31 07:56:09.023 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:10.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:10.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:11.459 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:56:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:11.461 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:56:11 np0005603623 nova_compute[226235]: 2026-01-31 07:56:11.461 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:12 np0005603623 nova_compute[226235]: 2026-01-31 07:56:12.062 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:12 np0005603623 nova_compute[226235]: 2026-01-31 07:56:12.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:12 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:56:12 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:56:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:12.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:12.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:13.465 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:56:13 np0005603623 nova_compute[226235]: 2026-01-31 07:56:13.480 226239 DEBUG nova.compute.manager [req-a09bacb6-2bab-4137-a0d3-1df0771ecbd2 req-b5ae9e88-ec9e-4cfc-bd1a-c1b8fa75aef5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Received event network-changed-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:56:13 np0005603623 nova_compute[226235]: 2026-01-31 07:56:13.481 226239 DEBUG nova.compute.manager [req-a09bacb6-2bab-4137-a0d3-1df0771ecbd2 req-b5ae9e88-ec9e-4cfc-bd1a-c1b8fa75aef5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Refreshing instance network info cache due to event network-changed-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:56:13 np0005603623 nova_compute[226235]: 2026-01-31 07:56:13.482 226239 DEBUG oslo_concurrency.lockutils [req-a09bacb6-2bab-4137-a0d3-1df0771ecbd2 req-b5ae9e88-ec9e-4cfc-bd1a-c1b8fa75aef5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-0eb7d937-6381-4fca-88d8-57be8d3f0a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:56:13 np0005603623 nova_compute[226235]: 2026-01-31 07:56:13.482 226239 DEBUG oslo_concurrency.lockutils [req-a09bacb6-2bab-4137-a0d3-1df0771ecbd2 req-b5ae9e88-ec9e-4cfc-bd1a-c1b8fa75aef5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-0eb7d937-6381-4fca-88d8-57be8d3f0a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:56:13 np0005603623 nova_compute[226235]: 2026-01-31 07:56:13.483 226239 DEBUG nova.network.neutron [req-a09bacb6-2bab-4137-a0d3-1df0771ecbd2 req-b5ae9e88-ec9e-4cfc-bd1a-c1b8fa75aef5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Refreshing network info cache for port 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:56:14 np0005603623 nova_compute[226235]: 2026-01-31 07:56:14.027 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:14 np0005603623 nova_compute[226235]: 2026-01-31 07:56:14.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:14.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e159 e159: 3 total, 3 up, 3 in
Jan 31 02:56:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:14.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:15 np0005603623 nova_compute[226235]: 2026-01-31 07:56:15.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:15 np0005603623 nova_compute[226235]: 2026-01-31 07:56:15.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:15 np0005603623 nova_compute[226235]: 2026-01-31 07:56:15.307 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:15 np0005603623 nova_compute[226235]: 2026-01-31 07:56:15.308 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:15 np0005603623 nova_compute[226235]: 2026-01-31 07:56:15.308 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:15 np0005603623 nova_compute[226235]: 2026-01-31 07:56:15.308 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:56:15 np0005603623 nova_compute[226235]: 2026-01-31 07:56:15.308 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:56:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2354465643' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:56:15 np0005603623 nova_compute[226235]: 2026-01-31 07:56:15.730 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:16 np0005603623 nova_compute[226235]: 2026-01-31 07:56:16.103 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000023 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:56:16 np0005603623 nova_compute[226235]: 2026-01-31 07:56:16.104 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000023 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:56:16 np0005603623 nova_compute[226235]: 2026-01-31 07:56:16.107 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000022 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:56:16 np0005603623 nova_compute[226235]: 2026-01-31 07:56:16.107 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000022 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:56:16 np0005603623 nova_compute[226235]: 2026-01-31 07:56:16.110 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000024 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:56:16 np0005603623 nova_compute[226235]: 2026-01-31 07:56:16.110 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000024 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:56:16 np0005603623 nova_compute[226235]: 2026-01-31 07:56:16.265 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:56:16 np0005603623 nova_compute[226235]: 2026-01-31 07:56:16.266 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4154MB free_disk=20.714214324951172GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:56:16 np0005603623 nova_compute[226235]: 2026-01-31 07:56:16.266 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:16 np0005603623 nova_compute[226235]: 2026-01-31 07:56:16.267 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:16.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:16.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:17 np0005603623 nova_compute[226235]: 2026-01-31 07:56:17.011 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0eb7d937-6381-4fca-88d8-57be8d3f0a29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:56:17 np0005603623 nova_compute[226235]: 2026-01-31 07:56:17.011 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 3399eab2-419d-4742-b204-ab806dcda151 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:56:17 np0005603623 nova_compute[226235]: 2026-01-31 07:56:17.012 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance bec297f5-8e63-412e-9cd3-8e859f89a123 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:56:17 np0005603623 nova_compute[226235]: 2026-01-31 07:56:17.012 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:56:17 np0005603623 nova_compute[226235]: 2026-01-31 07:56:17.012 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:56:17 np0005603623 nova_compute[226235]: 2026-01-31 07:56:17.065 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:17 np0005603623 nova_compute[226235]: 2026-01-31 07:56:17.102 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:56:17 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3874693495' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:56:17 np0005603623 nova_compute[226235]: 2026-01-31 07:56:17.518 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:17 np0005603623 nova_compute[226235]: 2026-01-31 07:56:17.523 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:56:17 np0005603623 nova_compute[226235]: 2026-01-31 07:56:17.595 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:56:17 np0005603623 nova_compute[226235]: 2026-01-31 07:56:17.832 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:56:17 np0005603623 nova_compute[226235]: 2026-01-31 07:56:17.833 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:18 np0005603623 nova_compute[226235]: 2026-01-31 07:56:18.113 226239 DEBUG nova.network.neutron [req-a09bacb6-2bab-4137-a0d3-1df0771ecbd2 req-b5ae9e88-ec9e-4cfc-bd1a-c1b8fa75aef5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Updated VIF entry in instance network info cache for port 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:56:18 np0005603623 nova_compute[226235]: 2026-01-31 07:56:18.114 226239 DEBUG nova.network.neutron [req-a09bacb6-2bab-4137-a0d3-1df0771ecbd2 req-b5ae9e88-ec9e-4cfc-bd1a-c1b8fa75aef5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Updating instance_info_cache with network_info: [{"id": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "address": "fa:16:3e:fa:91:f1", "network": {"id": "80d90d51-335c-4f74-8a61-143d47d84f22", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-991561978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd9f0c923b994b0295e72b111f661de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e38f6ff-37", "ovs_interfaceid": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:56:18 np0005603623 nova_compute[226235]: 2026-01-31 07:56:18.291 226239 DEBUG oslo_concurrency.lockutils [req-a09bacb6-2bab-4137-a0d3-1df0771ecbd2 req-b5ae9e88-ec9e-4cfc-bd1a-c1b8fa75aef5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-0eb7d937-6381-4fca-88d8-57be8d3f0a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:56:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:18.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:18 np0005603623 nova_compute[226235]: 2026-01-31 07:56:18.834 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:18 np0005603623 nova_compute[226235]: 2026-01-31 07:56:18.834 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:18 np0005603623 nova_compute[226235]: 2026-01-31 07:56:18.835 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:18 np0005603623 nova_compute[226235]: 2026-01-31 07:56:18.835 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:56:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:18.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:19 np0005603623 nova_compute[226235]: 2026-01-31 07:56:19.028 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:19 np0005603623 nova_compute[226235]: 2026-01-31 07:56:19.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:19 np0005603623 nova_compute[226235]: 2026-01-31 07:56:19.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:19 np0005603623 nova_compute[226235]: 2026-01-31 07:56:19.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:56:19 np0005603623 nova_compute[226235]: 2026-01-31 07:56:19.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:56:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:20 np0005603623 nova_compute[226235]: 2026-01-31 07:56:20.020 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-0eb7d937-6381-4fca-88d8-57be8d3f0a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:56:20 np0005603623 nova_compute[226235]: 2026-01-31 07:56:20.020 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-0eb7d937-6381-4fca-88d8-57be8d3f0a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:56:20 np0005603623 nova_compute[226235]: 2026-01-31 07:56:20.020 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:56:20 np0005603623 nova_compute[226235]: 2026-01-31 07:56:20.020 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0eb7d937-6381-4fca-88d8-57be8d3f0a29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:56:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:20.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:20.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:21 np0005603623 nova_compute[226235]: 2026-01-31 07:56:21.337 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Updating instance_info_cache with network_info: [{"id": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "address": "fa:16:3e:fa:91:f1", "network": {"id": "80d90d51-335c-4f74-8a61-143d47d84f22", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-991561978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd9f0c923b994b0295e72b111f661de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e38f6ff-37", "ovs_interfaceid": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:56:21 np0005603623 nova_compute[226235]: 2026-01-31 07:56:21.356 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-0eb7d937-6381-4fca-88d8-57be8d3f0a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:56:21 np0005603623 nova_compute[226235]: 2026-01-31 07:56:21.357 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:56:21 np0005603623 nova_compute[226235]: 2026-01-31 07:56:21.357 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:22 np0005603623 nova_compute[226235]: 2026-01-31 07:56:22.068 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:22 np0005603623 ovn_controller[133449]: 2026-01-31T07:56:22Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fa:91:f1 10.100.0.13
Jan 31 02:56:22 np0005603623 ovn_controller[133449]: 2026-01-31T07:56:22Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fa:91:f1 10.100.0.13
Jan 31 02:56:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:22.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:23.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:24 np0005603623 nova_compute[226235]: 2026-01-31 07:56:24.040 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:24.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:25.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 e160: 3 total, 3 up, 3 in
Jan 31 02:56:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:26.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:56:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:27.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:56:27 np0005603623 nova_compute[226235]: 2026-01-31 07:56:27.068 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:28.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:29.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:29 np0005603623 nova_compute[226235]: 2026-01-31 07:56:29.042 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:29 np0005603623 podman[245616]: 2026-01-31 07:56:29.96753297 +0000 UTC m=+0.056337859 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:56:29 np0005603623 podman[245617]: 2026-01-31 07:56:29.986361071 +0000 UTC m=+0.075166520 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 02:56:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:30.090 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:30.091 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:30.091 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:56:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:30.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:56:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:31.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:32 np0005603623 nova_compute[226235]: 2026-01-31 07:56:32.071 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:32 np0005603623 nova_compute[226235]: 2026-01-31 07:56:32.474 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "a6c36818-897d-4417-bad1-5d1546fa7497" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:32 np0005603623 nova_compute[226235]: 2026-01-31 07:56:32.474 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:32 np0005603623 nova_compute[226235]: 2026-01-31 07:56:32.573 226239 DEBUG nova.compute.manager [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:56:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:32.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:33.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:33 np0005603623 nova_compute[226235]: 2026-01-31 07:56:33.063 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:33 np0005603623 nova_compute[226235]: 2026-01-31 07:56:33.064 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:33 np0005603623 nova_compute[226235]: 2026-01-31 07:56:33.070 226239 DEBUG nova.virt.hardware [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:56:33 np0005603623 nova_compute[226235]: 2026-01-31 07:56:33.071 226239 INFO nova.compute.claims [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:56:33 np0005603623 nova_compute[226235]: 2026-01-31 07:56:33.965 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:34 np0005603623 nova_compute[226235]: 2026-01-31 07:56:34.103 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:34.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:35.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:56:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1128079976' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:56:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:35 np0005603623 nova_compute[226235]: 2026-01-31 07:56:35.242 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.277s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:35 np0005603623 nova_compute[226235]: 2026-01-31 07:56:35.246 226239 DEBUG nova.compute.provider_tree [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:56:35 np0005603623 nova_compute[226235]: 2026-01-31 07:56:35.347 226239 DEBUG nova.scheduler.client.report [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:56:35 np0005603623 nova_compute[226235]: 2026-01-31 07:56:35.441 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:35 np0005603623 nova_compute[226235]: 2026-01-31 07:56:35.442 226239 DEBUG nova.compute.manager [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:56:35 np0005603623 nova_compute[226235]: 2026-01-31 07:56:35.643 226239 DEBUG nova.compute.manager [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:56:35 np0005603623 nova_compute[226235]: 2026-01-31 07:56:35.643 226239 DEBUG nova.network.neutron [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:56:35 np0005603623 nova_compute[226235]: 2026-01-31 07:56:35.720 226239 INFO nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:56:35 np0005603623 nova_compute[226235]: 2026-01-31 07:56:35.808 226239 DEBUG nova.compute.manager [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:35.999 226239 DEBUG nova.compute.manager [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.000 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.001 226239 INFO nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Creating image(s)#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.022 226239 DEBUG nova.storage.rbd_utils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image a6c36818-897d-4417-bad1-5d1546fa7497_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.049 226239 DEBUG nova.storage.rbd_utils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image a6c36818-897d-4417-bad1-5d1546fa7497_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.102 226239 DEBUG nova.storage.rbd_utils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image a6c36818-897d-4417-bad1-5d1546fa7497_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.106 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.155 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.156 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.157 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.157 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.257 226239 DEBUG nova.storage.rbd_utils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image a6c36818-897d-4417-bad1-5d1546fa7497_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.260 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 a6c36818-897d-4417-bad1-5d1546fa7497_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:36 np0005603623 nova_compute[226235]: 2026-01-31 07:56:36.276 226239 DEBUG nova.policy [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93973daeb08c453e90372a79b54b9ede', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8033316fc42c4926bfd1f8a34b02fa97', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:56:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:36.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:37.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:37 np0005603623 nova_compute[226235]: 2026-01-31 07:56:37.072 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:37 np0005603623 nova_compute[226235]: 2026-01-31 07:56:37.506 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 a6c36818-897d-4417-bad1-5d1546fa7497_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.246s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:37 np0005603623 nova_compute[226235]: 2026-01-31 07:56:37.600 226239 DEBUG nova.storage.rbd_utils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] resizing rbd image a6c36818-897d-4417-bad1-5d1546fa7497_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:56:37 np0005603623 nova_compute[226235]: 2026-01-31 07:56:37.761 226239 DEBUG nova.objects.instance [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lazy-loading 'migration_context' on Instance uuid a6c36818-897d-4417-bad1-5d1546fa7497 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:56:37 np0005603623 nova_compute[226235]: 2026-01-31 07:56:37.834 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:56:37 np0005603623 nova_compute[226235]: 2026-01-31 07:56:37.835 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Ensure instance console log exists: /var/lib/nova/instances/a6c36818-897d-4417-bad1-5d1546fa7497/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:56:37 np0005603623 nova_compute[226235]: 2026-01-31 07:56:37.835 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:37 np0005603623 nova_compute[226235]: 2026-01-31 07:56:37.835 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:37 np0005603623 nova_compute[226235]: 2026-01-31 07:56:37.836 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:37 np0005603623 nova_compute[226235]: 2026-01-31 07:56:37.951 226239 DEBUG nova.network.neutron [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Successfully created port: bb69cccb-97dc-4472-af6c-0ed0b0324779 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:56:38 np0005603623 nova_compute[226235]: 2026-01-31 07:56:38.306 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:38.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:39.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:39 np0005603623 nova_compute[226235]: 2026-01-31 07:56:39.149 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:39 np0005603623 nova_compute[226235]: 2026-01-31 07:56:39.967 226239 DEBUG nova.network.neutron [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Successfully updated port: bb69cccb-97dc-4472-af6c-0ed0b0324779 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:56:40 np0005603623 nova_compute[226235]: 2026-01-31 07:56:40.019 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "refresh_cache-a6c36818-897d-4417-bad1-5d1546fa7497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:56:40 np0005603623 nova_compute[226235]: 2026-01-31 07:56:40.019 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquired lock "refresh_cache-a6c36818-897d-4417-bad1-5d1546fa7497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:56:40 np0005603623 nova_compute[226235]: 2026-01-31 07:56:40.019 226239 DEBUG nova.network.neutron [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:56:40 np0005603623 nova_compute[226235]: 2026-01-31 07:56:40.222 226239 DEBUG nova.compute.manager [req-3d33ae42-4830-4eea-8ba4-a18c4ccf3342 req-1a62adb6-35d5-463d-b57a-a76338eac6d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Received event network-changed-bb69cccb-97dc-4472-af6c-0ed0b0324779 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:56:40 np0005603623 nova_compute[226235]: 2026-01-31 07:56:40.222 226239 DEBUG nova.compute.manager [req-3d33ae42-4830-4eea-8ba4-a18c4ccf3342 req-1a62adb6-35d5-463d-b57a-a76338eac6d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Refreshing instance network info cache due to event network-changed-bb69cccb-97dc-4472-af6c-0ed0b0324779. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:56:40 np0005603623 nova_compute[226235]: 2026-01-31 07:56:40.222 226239 DEBUG oslo_concurrency.lockutils [req-3d33ae42-4830-4eea-8ba4-a18c4ccf3342 req-1a62adb6-35d5-463d-b57a-a76338eac6d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-a6c36818-897d-4417-bad1-5d1546fa7497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:56:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:40 np0005603623 nova_compute[226235]: 2026-01-31 07:56:40.365 226239 DEBUG nova.network.neutron [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:56:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:56:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:40.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:56:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:41.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:42 np0005603623 nova_compute[226235]: 2026-01-31 07:56:42.075 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:42.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:43.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.132 226239 DEBUG nova.network.neutron [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Updating instance_info_cache with network_info: [{"id": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "address": "fa:16:3e:f5:be:30", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb69cccb-97", "ovs_interfaceid": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.151 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:44.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.844 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Releasing lock "refresh_cache-a6c36818-897d-4417-bad1-5d1546fa7497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.845 226239 DEBUG nova.compute.manager [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Instance network_info: |[{"id": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "address": "fa:16:3e:f5:be:30", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb69cccb-97", "ovs_interfaceid": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.845 226239 DEBUG oslo_concurrency.lockutils [req-3d33ae42-4830-4eea-8ba4-a18c4ccf3342 req-1a62adb6-35d5-463d-b57a-a76338eac6d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-a6c36818-897d-4417-bad1-5d1546fa7497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.846 226239 DEBUG nova.network.neutron [req-3d33ae42-4830-4eea-8ba4-a18c4ccf3342 req-1a62adb6-35d5-463d-b57a-a76338eac6d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Refreshing network info cache for port bb69cccb-97dc-4472-af6c-0ed0b0324779 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.849 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Start _get_guest_xml network_info=[{"id": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "address": "fa:16:3e:f5:be:30", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb69cccb-97", "ovs_interfaceid": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.857 226239 WARNING nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.867 226239 DEBUG nova.virt.libvirt.host [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.867 226239 DEBUG nova.virt.libvirt.host [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.870 226239 DEBUG nova.virt.libvirt.host [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.870 226239 DEBUG nova.virt.libvirt.host [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.871 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.872 226239 DEBUG nova.virt.hardware [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.872 226239 DEBUG nova.virt.hardware [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.872 226239 DEBUG nova.virt.hardware [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.872 226239 DEBUG nova.virt.hardware [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.873 226239 DEBUG nova.virt.hardware [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.873 226239 DEBUG nova.virt.hardware [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.873 226239 DEBUG nova.virt.hardware [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.873 226239 DEBUG nova.virt.hardware [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.873 226239 DEBUG nova.virt.hardware [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.873 226239 DEBUG nova.virt.hardware [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.874 226239 DEBUG nova.virt.hardware [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:56:44 np0005603623 nova_compute[226235]: 2026-01-31 07:56:44.876 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:45.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:56:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1665479017' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.280 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.311 226239 DEBUG nova.storage.rbd_utils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image a6c36818-897d-4417-bad1-5d1546fa7497_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.315 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:56:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/51207697' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.850 226239 DEBUG oslo_concurrency.lockutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "3399eab2-419d-4742-b204-ab806dcda151" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.851 226239 DEBUG oslo_concurrency.lockutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "3399eab2-419d-4742-b204-ab806dcda151" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.851 226239 DEBUG oslo_concurrency.lockutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "3399eab2-419d-4742-b204-ab806dcda151-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.851 226239 DEBUG oslo_concurrency.lockutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "3399eab2-419d-4742-b204-ab806dcda151-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.851 226239 DEBUG oslo_concurrency.lockutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "3399eab2-419d-4742-b204-ab806dcda151-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.853 226239 INFO nova.compute.manager [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Terminating instance#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.853 226239 DEBUG oslo_concurrency.lockutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "refresh_cache-3399eab2-419d-4742-b204-ab806dcda151" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.853 226239 DEBUG oslo_concurrency.lockutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquired lock "refresh_cache-3399eab2-419d-4742-b204-ab806dcda151" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.854 226239 DEBUG nova.network.neutron [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.897 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.583s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.899 226239 DEBUG nova.virt.libvirt.vif [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:56:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1883505973',display_name='tempest-ServersAdminTestJSON-server-1883505973',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1883505973',id=40,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8033316fc42c4926bfd1f8a34b02fa97',ramdisk_id='',reservation_id='r-u30dpo8c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-784933461',owner_user_name='tempest-ServersAdminTestJSON-784
933461-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:56:35Z,user_data=None,user_id='93973daeb08c453e90372a79b54b9ede',uuid=a6c36818-897d-4417-bad1-5d1546fa7497,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "address": "fa:16:3e:f5:be:30", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb69cccb-97", "ovs_interfaceid": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.899 226239 DEBUG nova.network.os_vif_util [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Converting VIF {"id": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "address": "fa:16:3e:f5:be:30", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb69cccb-97", "ovs_interfaceid": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.900 226239 DEBUG nova.network.os_vif_util [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:be:30,bridge_name='br-int',has_traffic_filtering=True,id=bb69cccb-97dc-4472-af6c-0ed0b0324779,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb69cccb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:56:45 np0005603623 nova_compute[226235]: 2026-01-31 07:56:45.901 226239 DEBUG nova.objects.instance [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lazy-loading 'pci_devices' on Instance uuid a6c36818-897d-4417-bad1-5d1546fa7497 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.104 226239 DEBUG nova.network.neutron [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:56:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:46.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.667 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  <uuid>a6c36818-897d-4417-bad1-5d1546fa7497</uuid>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  <name>instance-00000028</name>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersAdminTestJSON-server-1883505973</nova:name>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:56:44</nova:creationTime>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <nova:user uuid="93973daeb08c453e90372a79b54b9ede">tempest-ServersAdminTestJSON-784933461-project-member</nova:user>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <nova:project uuid="8033316fc42c4926bfd1f8a34b02fa97">tempest-ServersAdminTestJSON-784933461</nova:project>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <nova:port uuid="bb69cccb-97dc-4472-af6c-0ed0b0324779">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <entry name="serial">a6c36818-897d-4417-bad1-5d1546fa7497</entry>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <entry name="uuid">a6c36818-897d-4417-bad1-5d1546fa7497</entry>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/a6c36818-897d-4417-bad1-5d1546fa7497_disk">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/a6c36818-897d-4417-bad1-5d1546fa7497_disk.config">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:f5:be:30"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <target dev="tapbb69cccb-97"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/a6c36818-897d-4417-bad1-5d1546fa7497/console.log" append="off"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:56:46 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:56:46 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:56:46 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:56:46 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.667 226239 DEBUG nova.compute.manager [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Preparing to wait for external event network-vif-plugged-bb69cccb-97dc-4472-af6c-0ed0b0324779 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.668 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.668 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.668 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.669 226239 DEBUG nova.virt.libvirt.vif [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:56:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1883505973',display_name='tempest-ServersAdminTestJSON-server-1883505973',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1883505973',id=40,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8033316fc42c4926bfd1f8a34b02fa97',ramdisk_id='',reservation_id='r-u30dpo8c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-784933461',owner_user_name='tempest-ServersAdminTestJSON-784933461-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:56:35Z,user_data=None,user_id='93973daeb08c453e90372a79b54b9ede',uuid=a6c36818-897d-4417-bad1-5d1546fa7497,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "address": "fa:16:3e:f5:be:30", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb69cccb-97", "ovs_interfaceid": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.669 226239 DEBUG nova.network.os_vif_util [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Converting VIF {"id": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "address": "fa:16:3e:f5:be:30", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb69cccb-97", "ovs_interfaceid": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.669 226239 DEBUG nova.network.os_vif_util [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:be:30,bridge_name='br-int',has_traffic_filtering=True,id=bb69cccb-97dc-4472-af6c-0ed0b0324779,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb69cccb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.670 226239 DEBUG os_vif [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:be:30,bridge_name='br-int',has_traffic_filtering=True,id=bb69cccb-97dc-4472-af6c-0ed0b0324779,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb69cccb-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.670 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.670 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.671 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.673 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.673 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbb69cccb-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.674 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbb69cccb-97, col_values=(('external_ids', {'iface-id': 'bb69cccb-97dc-4472-af6c-0ed0b0324779', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f5:be:30', 'vm-uuid': 'a6c36818-897d-4417-bad1-5d1546fa7497'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:56:46 np0005603623 NetworkManager[48970]: <info>  [1769846206.6757] manager: (tapbb69cccb-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.677 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.680 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.681 226239 INFO os_vif [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:be:30,bridge_name='br-int',has_traffic_filtering=True,id=bb69cccb-97dc-4472-af6c-0ed0b0324779,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb69cccb-97')#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.751 226239 DEBUG nova.network.neutron [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.843 226239 DEBUG oslo_concurrency.lockutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "bec297f5-8e63-412e-9cd3-8e859f89a123" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.844 226239 DEBUG oslo_concurrency.lockutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "bec297f5-8e63-412e-9cd3-8e859f89a123" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.845 226239 DEBUG oslo_concurrency.lockutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "bec297f5-8e63-412e-9cd3-8e859f89a123-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.845 226239 DEBUG oslo_concurrency.lockutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "bec297f5-8e63-412e-9cd3-8e859f89a123-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.845 226239 DEBUG oslo_concurrency.lockutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "bec297f5-8e63-412e-9cd3-8e859f89a123-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.846 226239 INFO nova.compute.manager [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Terminating instance#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.847 226239 DEBUG oslo_concurrency.lockutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "refresh_cache-bec297f5-8e63-412e-9cd3-8e859f89a123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.847 226239 DEBUG oslo_concurrency.lockutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquired lock "refresh_cache-bec297f5-8e63-412e-9cd3-8e859f89a123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.848 226239 DEBUG nova.network.neutron [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.850 226239 DEBUG oslo_concurrency.lockutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Releasing lock "refresh_cache-3399eab2-419d-4742-b204-ab806dcda151" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:56:46 np0005603623 nova_compute[226235]: 2026-01-31 07:56:46.850 226239 DEBUG nova.compute.manager [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.003 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.003 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.004 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] No VIF found with MAC fa:16:3e:f5:be:30, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.004 226239 INFO nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Using config drive#033[00m
Jan 31 02:56:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:47.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.086 226239 DEBUG nova.storage.rbd_utils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image a6c36818-897d-4417-bad1-5d1546fa7497_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.091 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.110 226239 DEBUG nova.network.neutron [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.156 226239 DEBUG nova.network.neutron [req-3d33ae42-4830-4eea-8ba4-a18c4ccf3342 req-1a62adb6-35d5-463d-b57a-a76338eac6d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Updated VIF entry in instance network info cache for port bb69cccb-97dc-4472-af6c-0ed0b0324779. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.157 226239 DEBUG nova.network.neutron [req-3d33ae42-4830-4eea-8ba4-a18c4ccf3342 req-1a62adb6-35d5-463d-b57a-a76338eac6d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Updating instance_info_cache with network_info: [{"id": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "address": "fa:16:3e:f5:be:30", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb69cccb-97", "ovs_interfaceid": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.223 226239 DEBUG oslo_concurrency.lockutils [req-3d33ae42-4830-4eea-8ba4-a18c4ccf3342 req-1a62adb6-35d5-463d-b57a-a76338eac6d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-a6c36818-897d-4417-bad1-5d1546fa7497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.340 226239 DEBUG nova.network.neutron [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.381 226239 DEBUG oslo_concurrency.lockutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Releasing lock "refresh_cache-bec297f5-8e63-412e-9cd3-8e859f89a123" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.382 226239 DEBUG nova.compute.manager [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.844 226239 INFO nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Creating config drive at /var/lib/nova/instances/a6c36818-897d-4417-bad1-5d1546fa7497/disk.config#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.848 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a6c36818-897d-4417-bad1-5d1546fa7497/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqx6swx8q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:47 np0005603623 nova_compute[226235]: 2026-01-31 07:56:47.968 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a6c36818-897d-4417-bad1-5d1546fa7497/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqx6swx8q" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:48 np0005603623 nova_compute[226235]: 2026-01-31 07:56:48.003 226239 DEBUG nova.storage.rbd_utils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image a6c36818-897d-4417-bad1-5d1546fa7497_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:48 np0005603623 nova_compute[226235]: 2026-01-31 07:56:48.007 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a6c36818-897d-4417-bad1-5d1546fa7497/disk.config a6c36818-897d-4417-bad1-5d1546fa7497_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:48 np0005603623 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000023.scope: Deactivated successfully.
Jan 31 02:56:48 np0005603623 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000023.scope: Consumed 14.047s CPU time.
Jan 31 02:56:48 np0005603623 systemd-machined[194379]: Machine qemu-20-instance-00000023 terminated.
Jan 31 02:56:48 np0005603623 nova_compute[226235]: 2026-01-31 07:56:48.467 226239 INFO nova.virt.libvirt.driver [-] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Instance destroyed successfully.#033[00m
Jan 31 02:56:48 np0005603623 nova_compute[226235]: 2026-01-31 07:56:48.468 226239 DEBUG nova.objects.instance [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lazy-loading 'resources' on Instance uuid 3399eab2-419d-4742-b204-ab806dcda151 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:56:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:48.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:48 np0005603623 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000024.scope: Deactivated successfully.
Jan 31 02:56:48 np0005603623 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000024.scope: Consumed 13.172s CPU time.
Jan 31 02:56:48 np0005603623 systemd-machined[194379]: Machine qemu-22-instance-00000024 terminated.
Jan 31 02:56:48 np0005603623 nova_compute[226235]: 2026-01-31 07:56:48.884 226239 INFO nova.virt.libvirt.driver [-] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Instance destroyed successfully.#033[00m
Jan 31 02:56:48 np0005603623 nova_compute[226235]: 2026-01-31 07:56:48.885 226239 DEBUG nova.objects.instance [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lazy-loading 'resources' on Instance uuid bec297f5-8e63-412e-9cd3-8e859f89a123 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:56:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:56:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:49.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:56:49 np0005603623 nova_compute[226235]: 2026-01-31 07:56:49.336 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:49 np0005603623 nova_compute[226235]: 2026-01-31 07:56:49.776 226239 DEBUG oslo_concurrency.processutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a6c36818-897d-4417-bad1-5d1546fa7497/disk.config a6c36818-897d-4417-bad1-5d1546fa7497_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.769s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:49 np0005603623 nova_compute[226235]: 2026-01-31 07:56:49.777 226239 INFO nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Deleting local config drive /var/lib/nova/instances/a6c36818-897d-4417-bad1-5d1546fa7497/disk.config because it was imported into RBD.#033[00m
Jan 31 02:56:49 np0005603623 kernel: tapbb69cccb-97: entered promiscuous mode
Jan 31 02:56:49 np0005603623 systemd-udevd[246028]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:56:49 np0005603623 NetworkManager[48970]: <info>  [1769846209.8219] manager: (tapbb69cccb-97): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Jan 31 02:56:49 np0005603623 nova_compute[226235]: 2026-01-31 07:56:49.822 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:49 np0005603623 ovn_controller[133449]: 2026-01-31T07:56:49Z|00158|binding|INFO|Claiming lport bb69cccb-97dc-4472-af6c-0ed0b0324779 for this chassis.
Jan 31 02:56:49 np0005603623 ovn_controller[133449]: 2026-01-31T07:56:49Z|00159|binding|INFO|bb69cccb-97dc-4472-af6c-0ed0b0324779: Claiming fa:16:3e:f5:be:30 10.100.0.6
Jan 31 02:56:49 np0005603623 ovn_controller[133449]: 2026-01-31T07:56:49Z|00160|binding|INFO|Setting lport bb69cccb-97dc-4472-af6c-0ed0b0324779 ovn-installed in OVS
Jan 31 02:56:49 np0005603623 nova_compute[226235]: 2026-01-31 07:56:49.831 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:49 np0005603623 NetworkManager[48970]: <info>  [1769846209.8346] device (tapbb69cccb-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:56:49 np0005603623 NetworkManager[48970]: <info>  [1769846209.8363] device (tapbb69cccb-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:56:49 np0005603623 nova_compute[226235]: 2026-01-31 07:56:49.835 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:49 np0005603623 systemd-machined[194379]: New machine qemu-23-instance-00000028.
Jan 31 02:56:49 np0005603623 systemd[1]: Started Virtual Machine qemu-23-instance-00000028.
Jan 31 02:56:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:49.901 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:be:30 10.100.0.6'], port_security=['fa:16:3e:f5:be:30 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'a6c36818-897d-4417-bad1-5d1546fa7497', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c58eaedf-202a-428a-acfb-f0b1291517f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8033316fc42c4926bfd1f8a34b02fa97', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4b3d9baf-bd3e-457e-a5c2-9addbc71d588', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=189b55ef-8e14-4c6c-870a-5dba85715c4a, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=bb69cccb-97dc-4472-af6c-0ed0b0324779) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:56:49 np0005603623 ovn_controller[133449]: 2026-01-31T07:56:49Z|00161|binding|INFO|Setting lport bb69cccb-97dc-4472-af6c-0ed0b0324779 up in Southbound
Jan 31 02:56:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:49.902 143258 INFO neutron.agent.ovn.metadata.agent [-] Port bb69cccb-97dc-4472-af6c-0ed0b0324779 in datapath c58eaedf-202a-428a-acfb-f0b1291517f1 bound to our chassis#033[00m
Jan 31 02:56:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:49.905 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c58eaedf-202a-428a-acfb-f0b1291517f1#033[00m
Jan 31 02:56:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:49.916 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2df81722-9b0d-4dc1-a83b-239419eff32c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:49.917 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc58eaedf-21 in ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:56:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:49.919 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc58eaedf-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:56:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:49.919 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd6bb05-0af9-4b8f-9bdc-f7269c8d77ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:49.919 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c25c1aa5-6b04-4f7c-8154-f86d27e4115b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:49.930 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[5e993dea-dd50-44c2-ba92-0b18b94b7db0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:49.939 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ed1b92e9-e12f-42a9-b90f-6f1e186efe1b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:49.970 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[205569bf-3e1c-41f3-865f-9e5fc9281f45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:49 np0005603623 NetworkManager[48970]: <info>  [1769846209.9762] manager: (tapc58eaedf-20): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Jan 31 02:56:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:49.977 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[98323847-1e5b-41ab-a63a-e85edd420b81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.004 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1947684e-28fc-4c0b-be45-fcf18e30fa5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.007 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[9406675e-5767-40a9-bc5a-2e84dbf825eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:50 np0005603623 NetworkManager[48970]: <info>  [1769846210.0247] device (tapc58eaedf-20): carrier: link connected
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.030 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3e8c9395-be2e-441c-b148-201387b59d75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.044 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[10f72370-c8b3-4e45-acfd-798d0940357e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc58eaedf-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:11:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535045, 'reachable_time': 20177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246118, 'error': None, 'target': 'ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.057 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[25d60cc0-12db-4077-a100-a97e5736fccf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe41:11bf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 535045, 'tstamp': 535045}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246119, 'error': None, 'target': 'ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.070 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3688c08e-f4d0-4c94-8fb9-bab7b45c69db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc58eaedf-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:11:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535045, 'reachable_time': 20177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246120, 'error': None, 'target': 'ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.091 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6e339de0-6697-4e02-9872-3e4a32be3abc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.131 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c8e047b0-b2b7-40e8-ace0-c2d8fcea88d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.133 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc58eaedf-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.133 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.134 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc58eaedf-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:56:50 np0005603623 nova_compute[226235]: 2026-01-31 07:56:50.135 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:50 np0005603623 kernel: tapc58eaedf-20: entered promiscuous mode
Jan 31 02:56:50 np0005603623 NetworkManager[48970]: <info>  [1769846210.1363] manager: (tapc58eaedf-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Jan 31 02:56:50 np0005603623 nova_compute[226235]: 2026-01-31 07:56:50.138 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.138 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc58eaedf-20, col_values=(('external_ids', {'iface-id': '8c531a0f-deeb-4de0-880b-b07ec1cf9103'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:56:50 np0005603623 nova_compute[226235]: 2026-01-31 07:56:50.139 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:50 np0005603623 nova_compute[226235]: 2026-01-31 07:56:50.141 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.141 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c58eaedf-202a-428a-acfb-f0b1291517f1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c58eaedf-202a-428a-acfb-f0b1291517f1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.142 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb7fb9c-1fd1-410b-a4c3-e3b0828ae0c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.143 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-c58eaedf-202a-428a-acfb-f0b1291517f1
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/c58eaedf-202a-428a-acfb-f0b1291517f1.pid.haproxy
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID c58eaedf-202a-428a-acfb-f0b1291517f1
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:56:50 np0005603623 ovn_controller[133449]: 2026-01-31T07:56:50Z|00162|binding|INFO|Releasing lport 8c531a0f-deeb-4de0-880b-b07ec1cf9103 from this chassis (sb_readonly=0)
Jan 31 02:56:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:56:50.143 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1', 'env', 'PROCESS_TAG=haproxy-c58eaedf-202a-428a-acfb-f0b1291517f1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c58eaedf-202a-428a-acfb-f0b1291517f1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:56:50 np0005603623 nova_compute[226235]: 2026-01-31 07:56:50.148 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:50 np0005603623 podman[246169]: 2026-01-31 07:56:50.46886745 +0000 UTC m=+0.023674925 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:56:50 np0005603623 nova_compute[226235]: 2026-01-31 07:56:50.597 226239 DEBUG oslo_concurrency.lockutils [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquiring lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:50 np0005603623 nova_compute[226235]: 2026-01-31 07:56:50.597 226239 DEBUG oslo_concurrency.lockutils [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:50.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:50 np0005603623 nova_compute[226235]: 2026-01-31 07:56:50.766 226239 DEBUG nova.objects.instance [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lazy-loading 'flavor' on Instance uuid 0eb7d937-6381-4fca-88d8-57be8d3f0a29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:56:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:51.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:51 np0005603623 nova_compute[226235]: 2026-01-31 07:56:51.052 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846211.0517676, a6c36818-897d-4417-bad1-5d1546fa7497 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:56:51 np0005603623 nova_compute[226235]: 2026-01-31 07:56:51.052 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] VM Started (Lifecycle Event)#033[00m
Jan 31 02:56:51 np0005603623 nova_compute[226235]: 2026-01-31 07:56:51.261 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:56:51 np0005603623 nova_compute[226235]: 2026-01-31 07:56:51.265 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846211.051869, a6c36818-897d-4417-bad1-5d1546fa7497 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:56:51 np0005603623 nova_compute[226235]: 2026-01-31 07:56:51.265 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:56:51 np0005603623 podman[246169]: 2026-01-31 07:56:51.45869119 +0000 UTC m=+1.013498625 container create 0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 02:56:51 np0005603623 nova_compute[226235]: 2026-01-31 07:56:51.579 226239 DEBUG oslo_concurrency.lockutils [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.982s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:51 np0005603623 nova_compute[226235]: 2026-01-31 07:56:51.675 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:51 np0005603623 systemd[1]: Started libpod-conmon-0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2.scope.
Jan 31 02:56:51 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:56:51 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fc5448f3f4255cec1f50b49977303692b5ef974976839ccde36793b77055730/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:56:51 np0005603623 nova_compute[226235]: 2026-01-31 07:56:51.853 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:56:51 np0005603623 nova_compute[226235]: 2026-01-31 07:56:51.878 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:56:51 np0005603623 podman[246169]: 2026-01-31 07:56:51.997352712 +0000 UTC m=+1.552160167 container init 0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 02:56:52 np0005603623 podman[246169]: 2026-01-31 07:56:52.001414219 +0000 UTC m=+1.556221664 container start 0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:56:52 np0005603623 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[246208]: [NOTICE]   (246212) : New worker (246214) forked
Jan 31 02:56:52 np0005603623 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[246208]: [NOTICE]   (246212) : Loading success.
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.079 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.152 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.495 226239 DEBUG oslo_concurrency.lockutils [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquiring lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.496 226239 DEBUG oslo_concurrency.lockutils [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.496 226239 INFO nova.compute.manager [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Attaching volume bd6a018f-eaef-43b9-b78d-313bb85c40fa to /dev/vdb
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.502 226239 DEBUG nova.compute.manager [req-6d83d3e6-56ba-49f6-af65-33bb0b27a941 req-ed6448c9-1ff3-40db-9e45-af9ffb0e4a86 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Received event network-vif-plugged-bb69cccb-97dc-4472-af6c-0ed0b0324779 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.502 226239 DEBUG oslo_concurrency.lockutils [req-6d83d3e6-56ba-49f6-af65-33bb0b27a941 req-ed6448c9-1ff3-40db-9e45-af9ffb0e4a86 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.502 226239 DEBUG oslo_concurrency.lockutils [req-6d83d3e6-56ba-49f6-af65-33bb0b27a941 req-ed6448c9-1ff3-40db-9e45-af9ffb0e4a86 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.503 226239 DEBUG oslo_concurrency.lockutils [req-6d83d3e6-56ba-49f6-af65-33bb0b27a941 req-ed6448c9-1ff3-40db-9e45-af9ffb0e4a86 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.503 226239 DEBUG nova.compute.manager [req-6d83d3e6-56ba-49f6-af65-33bb0b27a941 req-ed6448c9-1ff3-40db-9e45-af9ffb0e4a86 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Processing event network-vif-plugged-bb69cccb-97dc-4472-af6c-0ed0b0324779 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.504 226239 DEBUG nova.compute.manager [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.507 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846212.50765, a6c36818-897d-4417-bad1-5d1546fa7497 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.508 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] VM Resumed (Lifecycle Event)
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.511 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.517 226239 INFO nova.virt.libvirt.driver [-] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Instance spawned successfully.
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.518 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 02:56:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:52.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.722 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.728 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.729 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.729 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.729 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.730 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.730 226239 DEBUG nova.virt.libvirt.driver [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.734 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.832 226239 DEBUG os_brick.utils [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.837 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.849 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.849 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[6914c732-aa81-4d6a-bff8-46eee023a8b6]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.851 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.857 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.857 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[6c953a35-9c3e-47a2-a3f6-ba79142f6e03]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.859 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.863 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.005s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.863 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[50685e6a-1432-4cd4-b239-7881a75ae079]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.865 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[633fc73e-c7da-4523-a71e-9ef90cbce697]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.866 226239 DEBUG oslo_concurrency.processutils [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.883 226239 DEBUG oslo_concurrency.processutils [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] CMD "nvme version" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.885 226239 DEBUG os_brick.initiator.connectors.lightos [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.886 226239 DEBUG os_brick.initiator.connectors.lightos [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.886 226239 DEBUG os_brick.initiator.connectors.lightos [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.886 226239 DEBUG os_brick.utils [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] <== get_connector_properties: return (53ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Jan 31 02:56:52 np0005603623 nova_compute[226235]: 2026-01-31 07:56:52.887 226239 DEBUG nova.virt.block_device [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Updating existing volume attachment record: 97209046-c543-451c-84de-69e19e0a38d6 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Jan 31 02:56:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:53.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:53 np0005603623 nova_compute[226235]: 2026-01-31 07:56:53.146 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:56:53 np0005603623 nova_compute[226235]: 2026-01-31 07:56:53.797 226239 INFO nova.compute.manager [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Took 17.80 seconds to spawn the instance on the hypervisor.
Jan 31 02:56:53 np0005603623 nova_compute[226235]: 2026-01-31 07:56:53.798 226239 DEBUG nova.compute.manager [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:56:54 np0005603623 nova_compute[226235]: 2026-01-31 07:56:54.350 226239 INFO nova.compute.manager [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Took 21.32 seconds to build instance.
Jan 31 02:56:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:54.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:54 np0005603623 nova_compute[226235]: 2026-01-31 07:56:54.729 226239 DEBUG nova.compute.manager [req-61729654-1bb4-4b13-b17f-37e4d49857b0 req-8f71889d-8006-4174-9243-ab41ed3cdc95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Received event network-vif-plugged-bb69cccb-97dc-4472-af6c-0ed0b0324779 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:56:54 np0005603623 nova_compute[226235]: 2026-01-31 07:56:54.730 226239 DEBUG oslo_concurrency.lockutils [req-61729654-1bb4-4b13-b17f-37e4d49857b0 req-8f71889d-8006-4174-9243-ab41ed3cdc95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:56:54 np0005603623 nova_compute[226235]: 2026-01-31 07:56:54.731 226239 DEBUG oslo_concurrency.lockutils [req-61729654-1bb4-4b13-b17f-37e4d49857b0 req-8f71889d-8006-4174-9243-ab41ed3cdc95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:56:54 np0005603623 nova_compute[226235]: 2026-01-31 07:56:54.731 226239 DEBUG oslo_concurrency.lockutils [req-61729654-1bb4-4b13-b17f-37e4d49857b0 req-8f71889d-8006-4174-9243-ab41ed3cdc95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:56:54 np0005603623 nova_compute[226235]: 2026-01-31 07:56:54.731 226239 DEBUG nova.compute.manager [req-61729654-1bb4-4b13-b17f-37e4d49857b0 req-8f71889d-8006-4174-9243-ab41ed3cdc95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] No waiting events found dispatching network-vif-plugged-bb69cccb-97dc-4472-af6c-0ed0b0324779 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 02:56:54 np0005603623 nova_compute[226235]: 2026-01-31 07:56:54.731 226239 WARNING nova.compute.manager [req-61729654-1bb4-4b13-b17f-37e4d49857b0 req-8f71889d-8006-4174-9243-ab41ed3cdc95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Received unexpected event network-vif-plugged-bb69cccb-97dc-4472-af6c-0ed0b0324779 for instance with vm_state active and task_state None.
Jan 31 02:56:55 np0005603623 nova_compute[226235]: 2026-01-31 07:56:54.999 226239 DEBUG oslo_concurrency.lockutils [None req-6d859b7d-0997-4b59-9d70-b34f23252ac7 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.525s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:56:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:55.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:55 np0005603623 nova_compute[226235]: 2026-01-31 07:56:55.074 226239 DEBUG nova.objects.instance [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lazy-loading 'flavor' on Instance uuid 0eb7d937-6381-4fca-88d8-57be8d3f0a29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:56:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:55 np0005603623 nova_compute[226235]: 2026-01-31 07:56:55.258 226239 DEBUG nova.virt.libvirt.driver [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Attempting to attach volume bd6a018f-eaef-43b9-b78d-313bb85c40fa with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168
Jan 31 02:56:55 np0005603623 nova_compute[226235]: 2026-01-31 07:56:55.260 226239 DEBUG nova.virt.libvirt.guest [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 02:56:55 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 02:56:55 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-bd6a018f-eaef-43b9-b78d-313bb85c40fa">
Jan 31 02:56:55 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 02:56:55 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 02:56:55 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 02:56:55 np0005603623 nova_compute[226235]:  </source>
Jan 31 02:56:55 np0005603623 nova_compute[226235]:  <auth username="openstack">
Jan 31 02:56:55 np0005603623 nova_compute[226235]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:56:55 np0005603623 nova_compute[226235]:  </auth>
Jan 31 02:56:55 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 02:56:55 np0005603623 nova_compute[226235]:  <serial>bd6a018f-eaef-43b9-b78d-313bb85c40fa</serial>
Jan 31 02:56:55 np0005603623 nova_compute[226235]:  <shareable/>
Jan 31 02:56:55 np0005603623 nova_compute[226235]: </disk>
Jan 31 02:56:55 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 31 02:56:55 np0005603623 nova_compute[226235]: 2026-01-31 07:56:55.687 226239 DEBUG oslo_concurrency.lockutils [None req-05712e6e-8ba5-4abe-afc6-42f7452eb974 f604ab8ce514415199c9c6743e4c5883 49463c3785ca451ea25e79210872c961 - - default default] Acquiring lock "refresh_cache-a6c36818-897d-4417-bad1-5d1546fa7497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:56:55 np0005603623 nova_compute[226235]: 2026-01-31 07:56:55.688 226239 DEBUG oslo_concurrency.lockutils [None req-05712e6e-8ba5-4abe-afc6-42f7452eb974 f604ab8ce514415199c9c6743e4c5883 49463c3785ca451ea25e79210872c961 - - default default] Acquired lock "refresh_cache-a6c36818-897d-4417-bad1-5d1546fa7497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 02:56:55 np0005603623 nova_compute[226235]: 2026-01-31 07:56:55.688 226239 DEBUG nova.network.neutron [None req-05712e6e-8ba5-4abe-afc6-42f7452eb974 f604ab8ce514415199c9c6743e4c5883 49463c3785ca451ea25e79210872c961 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 02:56:56 np0005603623 nova_compute[226235]: 2026-01-31 07:56:56.253 226239 DEBUG nova.virt.libvirt.driver [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:56:56 np0005603623 nova_compute[226235]: 2026-01-31 07:56:56.255 226239 DEBUG nova.virt.libvirt.driver [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:56:56 np0005603623 nova_compute[226235]: 2026-01-31 07:56:56.255 226239 DEBUG nova.virt.libvirt.driver [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:56:56 np0005603623 nova_compute[226235]: 2026-01-31 07:56:56.255 226239 DEBUG nova.virt.libvirt.driver [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] No VIF found with MAC fa:16:3e:fa:91:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 31 02:56:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:56.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:56 np0005603623 nova_compute[226235]: 2026-01-31 07:56:56.677 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:56:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:56:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:57.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:56:57 np0005603623 nova_compute[226235]: 2026-01-31 07:56:57.082 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:57 np0005603623 nova_compute[226235]: 2026-01-31 07:56:57.272 226239 DEBUG oslo_concurrency.lockutils [None req-439c9268-871e-42bc-bb74-506dd12224a5 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 4.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:57 np0005603623 nova_compute[226235]: 2026-01-31 07:56:57.376 226239 DEBUG nova.network.neutron [None req-05712e6e-8ba5-4abe-afc6-42f7452eb974 f604ab8ce514415199c9c6743e4c5883 49463c3785ca451ea25e79210872c961 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Updating instance_info_cache with network_info: [{"id": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "address": "fa:16:3e:f5:be:30", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb69cccb-97", "ovs_interfaceid": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:56:57 np0005603623 nova_compute[226235]: 2026-01-31 07:56:57.747 226239 DEBUG oslo_concurrency.lockutils [None req-05712e6e-8ba5-4abe-afc6-42f7452eb974 f604ab8ce514415199c9c6743e4c5883 49463c3785ca451ea25e79210872c961 - - default default] Releasing lock "refresh_cache-a6c36818-897d-4417-bad1-5d1546fa7497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:56:57 np0005603623 nova_compute[226235]: 2026-01-31 07:56:57.748 226239 DEBUG nova.compute.manager [None req-05712e6e-8ba5-4abe-afc6-42f7452eb974 f604ab8ce514415199c9c6743e4c5883 49463c3785ca451ea25e79210872c961 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Jan 31 02:56:57 np0005603623 nova_compute[226235]: 2026-01-31 07:56:57.748 226239 DEBUG nova.compute.manager [None req-05712e6e-8ba5-4abe-afc6-42f7452eb974 f604ab8ce514415199c9c6743e4c5883 49463c3785ca451ea25e79210872c961 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] network_info to inject: |[{"id": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "address": "fa:16:3e:f5:be:30", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb69cccb-97", "ovs_interfaceid": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Jan 31 02:56:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:56:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:58.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:56:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:56:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:59.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:00.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:00 np0005603623 podman[246307]: 2026-01-31 07:57:00.957570885 +0000 UTC m=+0.048265195 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:57:01 np0005603623 podman[246308]: 2026-01-31 07:57:01.007012357 +0000 UTC m=+0.096994975 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 02:57:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:01.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:01 np0005603623 nova_compute[226235]: 2026-01-31 07:57:01.680 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:02 np0005603623 nova_compute[226235]: 2026-01-31 07:57:02.083 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:02 np0005603623 nova_compute[226235]: 2026-01-31 07:57:02.583 226239 DEBUG oslo_concurrency.lockutils [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquiring lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:02 np0005603623 nova_compute[226235]: 2026-01-31 07:57:02.583 226239 DEBUG oslo_concurrency.lockutils [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:02.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:02 np0005603623 nova_compute[226235]: 2026-01-31 07:57:02.864 226239 INFO nova.virt.libvirt.driver [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Deleting instance files /var/lib/nova/instances/3399eab2-419d-4742-b204-ab806dcda151_del#033[00m
Jan 31 02:57:02 np0005603623 nova_compute[226235]: 2026-01-31 07:57:02.867 226239 INFO nova.virt.libvirt.driver [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Deletion of /var/lib/nova/instances/3399eab2-419d-4742-b204-ab806dcda151_del complete#033[00m
Jan 31 02:57:02 np0005603623 nova_compute[226235]: 2026-01-31 07:57:02.923 226239 INFO nova.compute.manager [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Detaching volume bd6a018f-eaef-43b9-b78d-313bb85c40fa#033[00m
Jan 31 02:57:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:03.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.102 226239 INFO nova.virt.block_device [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Attempting to driver detach volume bd6a018f-eaef-43b9-b78d-313bb85c40fa from mountpoint /dev/vdb#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.110 226239 DEBUG nova.virt.libvirt.driver [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Attempting to detach device vdb from instance 0eb7d937-6381-4fca-88d8-57be8d3f0a29 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.111 226239 DEBUG nova.virt.libvirt.guest [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-bd6a018f-eaef-43b9-b78d-313bb85c40fa">
Jan 31 02:57:03 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  </source>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  <serial>bd6a018f-eaef-43b9-b78d-313bb85c40fa</serial>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  <shareable/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]: </disk>
Jan 31 02:57:03 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.245 226239 INFO nova.compute.manager [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Took 16.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.246 226239 DEBUG oslo.service.loopingcall [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.247 226239 DEBUG nova.compute.manager [-] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.247 226239 DEBUG nova.network.neutron [-] [instance: 3399eab2-419d-4742-b204-ab806dcda151] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.315 226239 INFO nova.virt.libvirt.driver [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Successfully detached device vdb from instance 0eb7d937-6381-4fca-88d8-57be8d3f0a29 from the persistent domain config.#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.316 226239 DEBUG nova.virt.libvirt.driver [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 0eb7d937-6381-4fca-88d8-57be8d3f0a29 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.317 226239 DEBUG nova.virt.libvirt.guest [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-bd6a018f-eaef-43b9-b78d-313bb85c40fa">
Jan 31 02:57:03 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  </source>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  <serial>bd6a018f-eaef-43b9-b78d-313bb85c40fa</serial>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  <shareable/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 02:57:03 np0005603623 nova_compute[226235]: </disk>
Jan 31 02:57:03 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.465 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846208.4642637, 3399eab2-419d-4742-b204-ab806dcda151 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.466 226239 INFO nova.compute.manager [-] [instance: 3399eab2-419d-4742-b204-ab806dcda151] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.492 226239 DEBUG nova.compute.manager [None req-c7fda391-1ea1-4a8e-b66a-dceb9a2d44ee - - - - - -] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.523 226239 DEBUG nova.virt.libvirt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Received event <DeviceRemovedEvent: 1769846223.5232887, 0eb7d937-6381-4fca-88d8-57be8d3f0a29 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.526 226239 DEBUG nova.virt.libvirt.driver [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 0eb7d937-6381-4fca-88d8-57be8d3f0a29 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.528 226239 INFO nova.virt.libvirt.driver [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Successfully detached device vdb from instance 0eb7d937-6381-4fca-88d8-57be8d3f0a29 from the live domain config.#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.882 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846208.7973325, bec297f5-8e63-412e-9cd3-8e859f89a123 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.883 226239 INFO nova.compute.manager [-] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.918 226239 DEBUG nova.compute.manager [None req-4c09bd14-3dc6-43de-ac28-fee5ac613310 - - - - - -] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.921 226239 DEBUG nova.compute.manager [None req-4c09bd14-3dc6-43de-ac28-fee5ac613310 - - - - - -] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: deleting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:57:03 np0005603623 nova_compute[226235]: 2026-01-31 07:57:03.947 226239 INFO nova.compute.manager [None req-4c09bd14-3dc6-43de-ac28-fee5ac613310 - - - - - -] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 02:57:04 np0005603623 nova_compute[226235]: 2026-01-31 07:57:04.056 226239 DEBUG nova.network.neutron [-] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:57:04 np0005603623 nova_compute[226235]: 2026-01-31 07:57:04.069 226239 DEBUG nova.objects.instance [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lazy-loading 'flavor' on Instance uuid 0eb7d937-6381-4fca-88d8-57be8d3f0a29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:57:04 np0005603623 nova_compute[226235]: 2026-01-31 07:57:04.206 226239 DEBUG nova.network.neutron [-] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:57:04 np0005603623 nova_compute[226235]: 2026-01-31 07:57:04.429 226239 INFO nova.compute.manager [-] [instance: 3399eab2-419d-4742-b204-ab806dcda151] Took 1.18 seconds to deallocate network for instance.#033[00m
Jan 31 02:57:04 np0005603623 nova_compute[226235]: 2026-01-31 07:57:04.499 226239 DEBUG oslo_concurrency.lockutils [None req-5eb49be6-0faf-469c-a51a-d51d03ff8d6e d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.916s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:04 np0005603623 nova_compute[226235]: 2026-01-31 07:57:04.622 226239 DEBUG oslo_concurrency.lockutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:04 np0005603623 nova_compute[226235]: 2026-01-31 07:57:04.623 226239 DEBUG oslo_concurrency.lockutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:04.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:04 np0005603623 nova_compute[226235]: 2026-01-31 07:57:04.716 226239 DEBUG oslo_concurrency.processutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:05.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:57:05 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3831908728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:57:05 np0005603623 nova_compute[226235]: 2026-01-31 07:57:05.193 226239 DEBUG oslo_concurrency.processutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:05 np0005603623 nova_compute[226235]: 2026-01-31 07:57:05.199 226239 DEBUG nova.compute.provider_tree [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:57:05 np0005603623 nova_compute[226235]: 2026-01-31 07:57:05.231 226239 DEBUG nova.scheduler.client.report [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:57:05 np0005603623 nova_compute[226235]: 2026-01-31 07:57:05.294 226239 DEBUG oslo_concurrency.lockutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:05 np0005603623 nova_compute[226235]: 2026-01-31 07:57:05.382 226239 INFO nova.scheduler.client.report [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Deleted allocations for instance 3399eab2-419d-4742-b204-ab806dcda151#033[00m
Jan 31 02:57:05 np0005603623 nova_compute[226235]: 2026-01-31 07:57:05.474 226239 INFO nova.virt.libvirt.driver [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Deleting instance files /var/lib/nova/instances/bec297f5-8e63-412e-9cd3-8e859f89a123_del#033[00m
Jan 31 02:57:05 np0005603623 nova_compute[226235]: 2026-01-31 07:57:05.475 226239 INFO nova.virt.libvirt.driver [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Deletion of /var/lib/nova/instances/bec297f5-8e63-412e-9cd3-8e859f89a123_del complete#033[00m
Jan 31 02:57:05 np0005603623 nova_compute[226235]: 2026-01-31 07:57:05.559 226239 DEBUG oslo_concurrency.lockutils [None req-84932552-c8b2-4052-a951-17de6342fb63 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "3399eab2-419d-4742-b204-ab806dcda151" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 19.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:05 np0005603623 nova_compute[226235]: 2026-01-31 07:57:05.672 226239 INFO nova.compute.manager [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Took 18.29 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:57:05 np0005603623 nova_compute[226235]: 2026-01-31 07:57:05.673 226239 DEBUG oslo.service.loopingcall [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:57:05 np0005603623 nova_compute[226235]: 2026-01-31 07:57:05.673 226239 DEBUG nova.compute.manager [-] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:57:05 np0005603623 nova_compute[226235]: 2026-01-31 07:57:05.673 226239 DEBUG nova.network.neutron [-] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:57:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:06 np0005603623 nova_compute[226235]: 2026-01-31 07:57:06.062 226239 DEBUG nova.network.neutron [-] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:57:06 np0005603623 nova_compute[226235]: 2026-01-31 07:57:06.219 226239 DEBUG nova.network.neutron [-] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:57:06 np0005603623 nova_compute[226235]: 2026-01-31 07:57:06.273 226239 INFO nova.compute.manager [-] [instance: bec297f5-8e63-412e-9cd3-8e859f89a123] Took 0.60 seconds to deallocate network for instance.#033[00m
Jan 31 02:57:06 np0005603623 nova_compute[226235]: 2026-01-31 07:57:06.459 226239 DEBUG oslo_concurrency.lockutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:06 np0005603623 nova_compute[226235]: 2026-01-31 07:57:06.460 226239 DEBUG oslo_concurrency.lockutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:06 np0005603623 nova_compute[226235]: 2026-01-31 07:57:06.574 226239 DEBUG oslo_concurrency.processutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:06.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:06 np0005603623 nova_compute[226235]: 2026-01-31 07:57:06.683 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:57:07 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2709742983' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:57:07 np0005603623 nova_compute[226235]: 2026-01-31 07:57:07.019 226239 DEBUG oslo_concurrency.processutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:07 np0005603623 nova_compute[226235]: 2026-01-31 07:57:07.025 226239 DEBUG nova.compute.provider_tree [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:57:07 np0005603623 ovn_controller[133449]: 2026-01-31T07:57:07Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f5:be:30 10.100.0.6
Jan 31 02:57:07 np0005603623 ovn_controller[133449]: 2026-01-31T07:57:07Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f5:be:30 10.100.0.6
Jan 31 02:57:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:07.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:07 np0005603623 nova_compute[226235]: 2026-01-31 07:57:07.085 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:07 np0005603623 nova_compute[226235]: 2026-01-31 07:57:07.125 226239 DEBUG nova.scheduler.client.report [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:57:07 np0005603623 nova_compute[226235]: 2026-01-31 07:57:07.290 226239 DEBUG oslo_concurrency.lockutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:07 np0005603623 nova_compute[226235]: 2026-01-31 07:57:07.345 226239 INFO nova.scheduler.client.report [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Deleted allocations for instance bec297f5-8e63-412e-9cd3-8e859f89a123#033[00m
Jan 31 02:57:07 np0005603623 nova_compute[226235]: 2026-01-31 07:57:07.492 226239 DEBUG oslo_concurrency.lockutils [None req-20d9eb79-0b42-4cda-9975-50b79883a468 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "bec297f5-8e63-412e-9cd3-8e859f89a123" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 20.648s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:08.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:08 np0005603623 nova_compute[226235]: 2026-01-31 07:57:08.671 226239 DEBUG oslo_concurrency.lockutils [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquiring lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:08 np0005603623 nova_compute[226235]: 2026-01-31 07:57:08.671 226239 DEBUG oslo_concurrency.lockutils [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:08 np0005603623 nova_compute[226235]: 2026-01-31 07:57:08.671 226239 DEBUG oslo_concurrency.lockutils [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquiring lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:08 np0005603623 nova_compute[226235]: 2026-01-31 07:57:08.672 226239 DEBUG oslo_concurrency.lockutils [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:08 np0005603623 nova_compute[226235]: 2026-01-31 07:57:08.672 226239 DEBUG oslo_concurrency.lockutils [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:08 np0005603623 nova_compute[226235]: 2026-01-31 07:57:08.673 226239 INFO nova.compute.manager [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Terminating instance#033[00m
Jan 31 02:57:08 np0005603623 nova_compute[226235]: 2026-01-31 07:57:08.674 226239 DEBUG nova.compute.manager [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:57:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:09.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:09 np0005603623 kernel: tap6e38f6ff-37 (unregistering): left promiscuous mode
Jan 31 02:57:09 np0005603623 NetworkManager[48970]: <info>  [1769846229.1144] device (tap6e38f6ff-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.125 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:09 np0005603623 ovn_controller[133449]: 2026-01-31T07:57:09Z|00163|binding|INFO|Releasing lport 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa from this chassis (sb_readonly=0)
Jan 31 02:57:09 np0005603623 ovn_controller[133449]: 2026-01-31T07:57:09Z|00164|binding|INFO|Setting lport 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa down in Southbound
Jan 31 02:57:09 np0005603623 ovn_controller[133449]: 2026-01-31T07:57:09Z|00165|binding|INFO|Removing iface tap6e38f6ff-37 ovn-installed in OVS
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.128 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.134 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:09 np0005603623 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000022.scope: Deactivated successfully.
Jan 31 02:57:09 np0005603623 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000022.scope: Consumed 14.696s CPU time.
Jan 31 02:57:09 np0005603623 systemd-machined[194379]: Machine qemu-21-instance-00000022 terminated.
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.308 226239 INFO nova.virt.libvirt.driver [-] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Instance destroyed successfully.#033[00m
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.309 226239 DEBUG nova.objects.instance [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lazy-loading 'resources' on Instance uuid 0eb7d937-6381-4fca-88d8-57be8d3f0a29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:57:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:09.437 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:57:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:09.438 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.438 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.552 226239 DEBUG nova.virt.libvirt.vif [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:55:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-UpdateMultiattachVolumeNegativeTest-server-356500433',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-updatemultiattachvolumenegativetest-server-356500433',id=34,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYVdi1LnaHZ5r6xMfeklqfzjDViAexljM9P3M0Fy5FZ3Xolf4vxCOKTYu0NFlJGf4EcZe3GteIpoGaJZuwWfVMuKuQVsr/qX8LdXn5NJVOqUqTS1m1sSlyZl2teCw6PaQ==',key_name='tempest-keypair-1101838222',keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:56:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd9f0c923b994b0295e72b111f661de1',ramdisk_id='',reservation_id='r-3c4eit9j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-UpdateMultiattachVolumeNegativeTest-860437657',owner_user_name='tempest-UpdateMultiattachVolumeNegativeTest-860437657-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:56:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d6078cfaadaa45ae9256245554f784fe',uuid=0eb7d937-6381-4fca-88d8-57be8d3f0a29,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "address": "fa:16:3e:fa:91:f1", "network": {"id": "80d90d51-335c-4f74-8a61-143d47d84f22", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-991561978-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd9f0c923b994b0295e72b111f661de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e38f6ff-37", "ovs_interfaceid": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.553 226239 DEBUG nova.network.os_vif_util [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Converting VIF {"id": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "address": "fa:16:3e:fa:91:f1", "network": {"id": "80d90d51-335c-4f74-8a61-143d47d84f22", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-991561978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd9f0c923b994b0295e72b111f661de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e38f6ff-37", "ovs_interfaceid": "6e38f6ff-3729-4d12-9f54-6c01e6aae5aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.554 226239 DEBUG nova.network.os_vif_util [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fa:91:f1,bridge_name='br-int',has_traffic_filtering=True,id=6e38f6ff-3729-4d12-9f54-6c01e6aae5aa,network=Network(80d90d51-335c-4f74-8a61-143d47d84f22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e38f6ff-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.554 226239 DEBUG os_vif [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:91:f1,bridge_name='br-int',has_traffic_filtering=True,id=6e38f6ff-3729-4d12-9f54-6c01e6aae5aa,network=Network(80d90d51-335c-4f74-8a61-143d47d84f22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e38f6ff-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.556 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.556 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e38f6ff-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.557 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.559 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:09 np0005603623 nova_compute[226235]: 2026-01-31 07:57:09.563 226239 INFO os_vif [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:91:f1,bridge_name='br-int',has_traffic_filtering=True,id=6e38f6ff-3729-4d12-9f54-6c01e6aae5aa,network=Network(80d90d51-335c-4f74-8a61-143d47d84f22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e38f6ff-37')#033[00m
Jan 31 02:57:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:09.630 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:91:f1 10.100.0.13'], port_security=['fa:16:3e:fa:91:f1 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0eb7d937-6381-4fca-88d8-57be8d3f0a29', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80d90d51-335c-4f74-8a61-143d47d84f22', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd9f0c923b994b0295e72b111f661de1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '026daea3-1ff6-4616-9656-065604061a00', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.181'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd84a3ff-b232-4c39-928c-e2cb3c0840e0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=6e38f6ff-3729-4d12-9f54-6c01e6aae5aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:57:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:09.631 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa in datapath 80d90d51-335c-4f74-8a61-143d47d84f22 unbound from our chassis#033[00m
Jan 31 02:57:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:09.632 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 80d90d51-335c-4f74-8a61-143d47d84f22, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:57:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:09.634 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cbd6504d-1793-4c4c-af66-acf1d7ff3d9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:09.634 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22 namespace which is not needed anymore#033[00m
Jan 31 02:57:09 np0005603623 neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22[245116]: [NOTICE]   (245120) : haproxy version is 2.8.14-c23fe91
Jan 31 02:57:09 np0005603623 neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22[245116]: [NOTICE]   (245120) : path to executable is /usr/sbin/haproxy
Jan 31 02:57:09 np0005603623 neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22[245116]: [WARNING]  (245120) : Exiting Master process...
Jan 31 02:57:09 np0005603623 neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22[245116]: [WARNING]  (245120) : Exiting Master process...
Jan 31 02:57:09 np0005603623 neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22[245116]: [ALERT]    (245120) : Current worker (245122) exited with code 143 (Terminated)
Jan 31 02:57:09 np0005603623 neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22[245116]: [WARNING]  (245120) : All workers exited. Exiting... (0)
Jan 31 02:57:09 np0005603623 systemd[1]: libpod-7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938.scope: Deactivated successfully.
Jan 31 02:57:09 np0005603623 podman[246454]: 2026-01-31 07:57:09.840013181 +0000 UTC m=+0.123186987 container died 7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 02:57:09 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938-userdata-shm.mount: Deactivated successfully.
Jan 31 02:57:09 np0005603623 systemd[1]: var-lib-containers-storage-overlay-3da8f8953511c3e0812eb3b8cccef1b2415a69cc7f989ccb91b4a19db974fc6b-merged.mount: Deactivated successfully.
Jan 31 02:57:09 np0005603623 podman[246454]: 2026-01-31 07:57:09.962124433 +0000 UTC m=+0.245298239 container cleanup 7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 02:57:09 np0005603623 systemd[1]: libpod-conmon-7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938.scope: Deactivated successfully.
Jan 31 02:57:10 np0005603623 podman[246483]: 2026-01-31 07:57:10.097667866 +0000 UTC m=+0.114638748 container remove 7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 02:57:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:10.103 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[99da0dd9-51f2-4d4c-9b4b-72378302a11a]: (4, ('Sat Jan 31 07:57:09 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22 (7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938)\n7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938\nSat Jan 31 07:57:09 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22 (7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938)\n7b91c8756b00ddcf3f2ca80de63bdc29b89e24187b31e658f661511e667b4938\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:10.105 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea0688b-2c65-4043-85ad-988a3e033e56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:10.107 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80d90d51-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:57:10 np0005603623 nova_compute[226235]: 2026-01-31 07:57:10.109 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:10 np0005603623 kernel: tap80d90d51-30: left promiscuous mode
Jan 31 02:57:10 np0005603623 nova_compute[226235]: 2026-01-31 07:57:10.111 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:10.116 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd4aa31-7949-4346-b07e-35ff69bcde97]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:10 np0005603623 nova_compute[226235]: 2026-01-31 07:57:10.117 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:10.135 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f99f34-04bb-4bab-bf54-306727a58a61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:10.136 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[019c8eba-4357-4115-b029-1ca14bea9a31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:10.151 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[09ba2d25-38b6-47b1-9fd2-8a8b741548e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529487, 'reachable_time': 29047, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246498, 'error': None, 'target': 'ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:10 np0005603623 systemd[1]: run-netns-ovnmeta\x2d80d90d51\x2d335c\x2d4f74\x2d8a61\x2d143d47d84f22.mount: Deactivated successfully.
Jan 31 02:57:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:10.155 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-80d90d51-335c-4f74-8a61-143d47d84f22 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:57:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:10.155 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[6c23c35d-6edd-4956-b840-54c947634dfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:10.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:11.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:11 np0005603623 nova_compute[226235]: 2026-01-31 07:57:11.329 226239 INFO nova.virt.libvirt.driver [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Deleting instance files /var/lib/nova/instances/0eb7d937-6381-4fca-88d8-57be8d3f0a29_del#033[00m
Jan 31 02:57:11 np0005603623 nova_compute[226235]: 2026-01-31 07:57:11.330 226239 INFO nova.virt.libvirt.driver [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Deletion of /var/lib/nova/instances/0eb7d937-6381-4fca-88d8-57be8d3f0a29_del complete#033[00m
Jan 31 02:57:11 np0005603623 nova_compute[226235]: 2026-01-31 07:57:11.522 226239 INFO nova.compute.manager [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Took 2.85 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:57:11 np0005603623 nova_compute[226235]: 2026-01-31 07:57:11.522 226239 DEBUG oslo.service.loopingcall [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:57:11 np0005603623 nova_compute[226235]: 2026-01-31 07:57:11.523 226239 DEBUG nova.compute.manager [-] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:57:11 np0005603623 nova_compute[226235]: 2026-01-31 07:57:11.523 226239 DEBUG nova.network.neutron [-] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:57:12 np0005603623 nova_compute[226235]: 2026-01-31 07:57:12.088 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:12.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:13 np0005603623 nova_compute[226235]: 2026-01-31 07:57:13.006 226239 DEBUG nova.compute.manager [req-bcebf036-b2ad-4f7f-9628-035633dc69fe req-f0ad3d03-34aa-4489-890e-28b4df9705ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Received event network-vif-unplugged-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:57:13 np0005603623 nova_compute[226235]: 2026-01-31 07:57:13.007 226239 DEBUG oslo_concurrency.lockutils [req-bcebf036-b2ad-4f7f-9628-035633dc69fe req-f0ad3d03-34aa-4489-890e-28b4df9705ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:13 np0005603623 nova_compute[226235]: 2026-01-31 07:57:13.007 226239 DEBUG oslo_concurrency.lockutils [req-bcebf036-b2ad-4f7f-9628-035633dc69fe req-f0ad3d03-34aa-4489-890e-28b4df9705ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:13 np0005603623 nova_compute[226235]: 2026-01-31 07:57:13.007 226239 DEBUG oslo_concurrency.lockutils [req-bcebf036-b2ad-4f7f-9628-035633dc69fe req-f0ad3d03-34aa-4489-890e-28b4df9705ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:13 np0005603623 nova_compute[226235]: 2026-01-31 07:57:13.007 226239 DEBUG nova.compute.manager [req-bcebf036-b2ad-4f7f-9628-035633dc69fe req-f0ad3d03-34aa-4489-890e-28b4df9705ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] No waiting events found dispatching network-vif-unplugged-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:57:13 np0005603623 nova_compute[226235]: 2026-01-31 07:57:13.008 226239 DEBUG nova.compute.manager [req-bcebf036-b2ad-4f7f-9628-035633dc69fe req-f0ad3d03-34aa-4489-890e-28b4df9705ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Received event network-vif-unplugged-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:57:13 np0005603623 nova_compute[226235]: 2026-01-31 07:57:13.008 226239 DEBUG nova.compute.manager [req-bcebf036-b2ad-4f7f-9628-035633dc69fe req-f0ad3d03-34aa-4489-890e-28b4df9705ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Received event network-vif-plugged-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:57:13 np0005603623 nova_compute[226235]: 2026-01-31 07:57:13.008 226239 DEBUG oslo_concurrency.lockutils [req-bcebf036-b2ad-4f7f-9628-035633dc69fe req-f0ad3d03-34aa-4489-890e-28b4df9705ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:13 np0005603623 nova_compute[226235]: 2026-01-31 07:57:13.008 226239 DEBUG oslo_concurrency.lockutils [req-bcebf036-b2ad-4f7f-9628-035633dc69fe req-f0ad3d03-34aa-4489-890e-28b4df9705ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:13 np0005603623 nova_compute[226235]: 2026-01-31 07:57:13.008 226239 DEBUG oslo_concurrency.lockutils [req-bcebf036-b2ad-4f7f-9628-035633dc69fe req-f0ad3d03-34aa-4489-890e-28b4df9705ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:13 np0005603623 nova_compute[226235]: 2026-01-31 07:57:13.009 226239 DEBUG nova.compute.manager [req-bcebf036-b2ad-4f7f-9628-035633dc69fe req-f0ad3d03-34aa-4489-890e-28b4df9705ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] No waiting events found dispatching network-vif-plugged-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:57:13 np0005603623 nova_compute[226235]: 2026-01-31 07:57:13.009 226239 WARNING nova.compute.manager [req-bcebf036-b2ad-4f7f-9628-035633dc69fe req-f0ad3d03-34aa-4489-890e-28b4df9705ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Received unexpected event network-vif-plugged-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:57:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:13.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:14 np0005603623 nova_compute[226235]: 2026-01-31 07:57:14.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:14 np0005603623 nova_compute[226235]: 2026-01-31 07:57:14.539 226239 DEBUG nova.compute.manager [req-79e842dd-e56d-4c07-ae4e-32a464920689 req-13d9d62a-a134-4336-a561-5ca733fa8645 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Received event network-vif-deleted-6e38f6ff-3729-4d12-9f54-6c01e6aae5aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:57:14 np0005603623 nova_compute[226235]: 2026-01-31 07:57:14.540 226239 INFO nova.compute.manager [req-79e842dd-e56d-4c07-ae4e-32a464920689 req-13d9d62a-a134-4336-a561-5ca733fa8645 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Neutron deleted interface 6e38f6ff-3729-4d12-9f54-6c01e6aae5aa; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 02:57:14 np0005603623 nova_compute[226235]: 2026-01-31 07:57:14.540 226239 DEBUG nova.network.neutron [req-79e842dd-e56d-4c07-ae4e-32a464920689 req-13d9d62a-a134-4336-a561-5ca733fa8645 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:57:14 np0005603623 nova_compute[226235]: 2026-01-31 07:57:14.559 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:14.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:14 np0005603623 nova_compute[226235]: 2026-01-31 07:57:14.676 226239 DEBUG nova.network.neutron [-] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:57:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:15.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:15 np0005603623 nova_compute[226235]: 2026-01-31 07:57:15.158 226239 DEBUG nova.compute.manager [req-79e842dd-e56d-4c07-ae4e-32a464920689 req-13d9d62a-a134-4336-a561-5ca733fa8645 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Detach interface failed, port_id=6e38f6ff-3729-4d12-9f54-6c01e6aae5aa, reason: Instance 0eb7d937-6381-4fca-88d8-57be8d3f0a29 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 02:57:15 np0005603623 nova_compute[226235]: 2026-01-31 07:57:15.184 226239 INFO nova.compute.manager [-] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Took 3.66 seconds to deallocate network for instance.#033[00m
Jan 31 02:57:15 np0005603623 nova_compute[226235]: 2026-01-31 07:57:15.377 226239 DEBUG oslo_concurrency.lockutils [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:15 np0005603623 nova_compute[226235]: 2026-01-31 07:57:15.377 226239 DEBUG oslo_concurrency.lockutils [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:15 np0005603623 nova_compute[226235]: 2026-01-31 07:57:15.633 226239 DEBUG oslo_concurrency.processutils [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:16 np0005603623 nova_compute[226235]: 2026-01-31 07:57:16.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:16 np0005603623 nova_compute[226235]: 2026-01-31 07:57:16.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:57:16 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/326972663' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:57:16 np0005603623 nova_compute[226235]: 2026-01-31 07:57:16.374 226239 DEBUG oslo_concurrency.processutils [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.741s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:16 np0005603623 nova_compute[226235]: 2026-01-31 07:57:16.383 226239 DEBUG nova.compute.provider_tree [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:57:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:16.440 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:57:16 np0005603623 nova_compute[226235]: 2026-01-31 07:57:16.505 226239 DEBUG nova.scheduler.client.report [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:57:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:16.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:16 np0005603623 nova_compute[226235]: 2026-01-31 07:57:16.889 226239 DEBUG oslo_concurrency.lockutils [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.511s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:17.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.087 226239 INFO nova.scheduler.client.report [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Deleted allocations for instance 0eb7d937-6381-4fca-88d8-57be8d3f0a29#033[00m
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.089 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.239 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.239 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.239 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.240 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.240 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:57:17 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1070020695' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.717 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.757 226239 DEBUG oslo_concurrency.lockutils [None req-b2b4f876-29e2-432f-93bf-ec66023574e2 d6078cfaadaa45ae9256245554f784fe fd9f0c923b994b0295e72b111f661de1 - - default default] Lock "0eb7d937-6381-4fca-88d8-57be8d3f0a29" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:57:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.914 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000028 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:57:17 np0005603623 nova_compute[226235]: 2026-01-31 07:57:17.915 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000028 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:57:18 np0005603623 nova_compute[226235]: 2026-01-31 07:57:18.063 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:57:18 np0005603623 nova_compute[226235]: 2026-01-31 07:57:18.064 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4507MB free_disk=20.756263732910156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:57:18 np0005603623 nova_compute[226235]: 2026-01-31 07:57:18.064 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:18 np0005603623 nova_compute[226235]: 2026-01-31 07:57:18.065 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:18 np0005603623 nova_compute[226235]: 2026-01-31 07:57:18.321 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance a6c36818-897d-4417-bad1-5d1546fa7497 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:57:18 np0005603623 nova_compute[226235]: 2026-01-31 07:57:18.321 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:57:18 np0005603623 nova_compute[226235]: 2026-01-31 07:57:18.322 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:57:18 np0005603623 nova_compute[226235]: 2026-01-31 07:57:18.358 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:18.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:57:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3230320152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:57:18 np0005603623 nova_compute[226235]: 2026-01-31 07:57:18.792 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:18 np0005603623 nova_compute[226235]: 2026-01-31 07:57:18.797 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:57:18 np0005603623 nova_compute[226235]: 2026-01-31 07:57:18.995 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:57:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:19.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:57:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:57:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:57:19 np0005603623 nova_compute[226235]: 2026-01-31 07:57:19.338 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:57:19 np0005603623 nova_compute[226235]: 2026-01-31 07:57:19.338 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:19 np0005603623 nova_compute[226235]: 2026-01-31 07:57:19.560 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:20.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:21.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:21 np0005603623 nova_compute[226235]: 2026-01-31 07:57:21.336 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:21 np0005603623 nova_compute[226235]: 2026-01-31 07:57:21.336 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:21 np0005603623 nova_compute[226235]: 2026-01-31 07:57:21.337 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:57:21 np0005603623 nova_compute[226235]: 2026-01-31 07:57:21.538 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:57:21 np0005603623 nova_compute[226235]: 2026-01-31 07:57:21.539 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:21 np0005603623 nova_compute[226235]: 2026-01-31 07:57:21.539 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:21 np0005603623 nova_compute[226235]: 2026-01-31 07:57:21.539 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:57:22 np0005603623 nova_compute[226235]: 2026-01-31 07:57:22.092 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:22.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:23.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:24 np0005603623 nova_compute[226235]: 2026-01-31 07:57:24.306 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846229.304092, 0eb7d937-6381-4fca-88d8-57be8d3f0a29 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:57:24 np0005603623 nova_compute[226235]: 2026-01-31 07:57:24.307 226239 INFO nova.compute.manager [-] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:57:24 np0005603623 nova_compute[226235]: 2026-01-31 07:57:24.426 226239 DEBUG nova.compute.manager [None req-b41489c0-389d-4a3d-80cd-f48dde09ab40 - - - - - -] [instance: 0eb7d937-6381-4fca-88d8-57be8d3f0a29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:57:24 np0005603623 nova_compute[226235]: 2026-01-31 07:57:24.562 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:24.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:25.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:26.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:27.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:27 np0005603623 nova_compute[226235]: 2026-01-31 07:57:27.094 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:28.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:29.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:29 np0005603623 nova_compute[226235]: 2026-01-31 07:57:29.565 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:30.092 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:30.092 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:57:30.093 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:30.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:31.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:57:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:57:31 np0005603623 podman[246808]: 2026-01-31 07:57:31.950949415 +0000 UTC m=+0.041252555 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 02:57:32 np0005603623 podman[246809]: 2026-01-31 07:57:32.002237175 +0000 UTC m=+0.093179686 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, container_name=ovn_controller)
Jan 31 02:57:32 np0005603623 nova_compute[226235]: 2026-01-31 07:57:32.095 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:32.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:33.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:34 np0005603623 nova_compute[226235]: 2026-01-31 07:57:34.611 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:34.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:35.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:57:36Z|00166|binding|INFO|Releasing lport 8c531a0f-deeb-4de0-880b-b07ec1cf9103 from this chassis (sb_readonly=0)
Jan 31 02:57:36 np0005603623 nova_compute[226235]: 2026-01-31 07:57:36.245 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:36 np0005603623 ovn_controller[133449]: 2026-01-31T07:57:36Z|00167|binding|INFO|Releasing lport 8c531a0f-deeb-4de0-880b-b07ec1cf9103 from this chassis (sb_readonly=0)
Jan 31 02:57:36 np0005603623 nova_compute[226235]: 2026-01-31 07:57:36.320 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:36.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:37.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:37 np0005603623 nova_compute[226235]: 2026-01-31 07:57:37.097 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:57:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:38.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:57:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:39.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:39 np0005603623 nova_compute[226235]: 2026-01-31 07:57:39.612 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:40.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:41.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:42 np0005603623 nova_compute[226235]: 2026-01-31 07:57:42.098 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:42.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:43.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:44 np0005603623 nova_compute[226235]: 2026-01-31 07:57:44.614 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:44.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:45.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:46.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:47.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:47 np0005603623 nova_compute[226235]: 2026-01-31 07:57:47.099 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:48.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:49.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:49 np0005603623 nova_compute[226235]: 2026-01-31 07:57:49.615 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:50.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:51.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:52 np0005603623 nova_compute[226235]: 2026-01-31 07:57:52.100 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:52.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:53.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:54 np0005603623 nova_compute[226235]: 2026-01-31 07:57:54.617 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:54.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:57:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:55.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:57:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:57:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:56.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:57:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:57:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:57.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:57:57 np0005603623 nova_compute[226235]: 2026-01-31 07:57:57.102 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:58.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:57:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:59.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:59 np0005603623 nova_compute[226235]: 2026-01-31 07:57:59.619 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:00.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:01.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:02 np0005603623 nova_compute[226235]: 2026-01-31 07:58:02.104 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:02.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:02 np0005603623 podman[246972]: 2026-01-31 07:58:02.955585516 +0000 UTC m=+0.051359613 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127)
Jan 31 02:58:02 np0005603623 podman[246973]: 2026-01-31 07:58:02.972186276 +0000 UTC m=+0.067248421 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:58:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:58:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:03.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:58:04 np0005603623 nova_compute[226235]: 2026-01-31 07:58:04.621 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:58:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:04.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:58:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:05.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:06.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:06 np0005603623 nova_compute[226235]: 2026-01-31 07:58:06.802 226239 DEBUG oslo_concurrency.lockutils [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "a6c36818-897d-4417-bad1-5d1546fa7497" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:06 np0005603623 nova_compute[226235]: 2026-01-31 07:58:06.802 226239 DEBUG oslo_concurrency.lockutils [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:06 np0005603623 nova_compute[226235]: 2026-01-31 07:58:06.802 226239 DEBUG oslo_concurrency.lockutils [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:06 np0005603623 nova_compute[226235]: 2026-01-31 07:58:06.803 226239 DEBUG oslo_concurrency.lockutils [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:06 np0005603623 nova_compute[226235]: 2026-01-31 07:58:06.803 226239 DEBUG oslo_concurrency.lockutils [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:06 np0005603623 nova_compute[226235]: 2026-01-31 07:58:06.804 226239 INFO nova.compute.manager [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Terminating instance#033[00m
Jan 31 02:58:06 np0005603623 nova_compute[226235]: 2026-01-31 07:58:06.805 226239 DEBUG nova.compute.manager [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:58:06 np0005603623 kernel: tapbb69cccb-97 (unregistering): left promiscuous mode
Jan 31 02:58:06 np0005603623 NetworkManager[48970]: <info>  [1769846286.8725] device (tapbb69cccb-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:58:06 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:06Z|00168|binding|INFO|Releasing lport bb69cccb-97dc-4472-af6c-0ed0b0324779 from this chassis (sb_readonly=0)
Jan 31 02:58:06 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:06Z|00169|binding|INFO|Setting lport bb69cccb-97dc-4472-af6c-0ed0b0324779 down in Southbound
Jan 31 02:58:06 np0005603623 nova_compute[226235]: 2026-01-31 07:58:06.881 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:06 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:06Z|00170|binding|INFO|Removing iface tapbb69cccb-97 ovn-installed in OVS
Jan 31 02:58:06 np0005603623 nova_compute[226235]: 2026-01-31 07:58:06.883 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:06 np0005603623 nova_compute[226235]: 2026-01-31 07:58:06.892 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:06 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:06.920 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:be:30 10.100.0.6'], port_security=['fa:16:3e:f5:be:30 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'a6c36818-897d-4417-bad1-5d1546fa7497', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c58eaedf-202a-428a-acfb-f0b1291517f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8033316fc42c4926bfd1f8a34b02fa97', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4b3d9baf-bd3e-457e-a5c2-9addbc71d588', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=189b55ef-8e14-4c6c-870a-5dba85715c4a, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=bb69cccb-97dc-4472-af6c-0ed0b0324779) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:58:06 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:06.921 143258 INFO neutron.agent.ovn.metadata.agent [-] Port bb69cccb-97dc-4472-af6c-0ed0b0324779 in datapath c58eaedf-202a-428a-acfb-f0b1291517f1 unbound from our chassis#033[00m
Jan 31 02:58:06 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:06.923 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c58eaedf-202a-428a-acfb-f0b1291517f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:58:06 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:06.924 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[328489bb-7371-4d8c-8088-50741c8905a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:06 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:06.925 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1 namespace which is not needed anymore#033[00m
Jan 31 02:58:06 np0005603623 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000028.scope: Deactivated successfully.
Jan 31 02:58:06 np0005603623 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000028.scope: Consumed 15.333s CPU time.
Jan 31 02:58:06 np0005603623 systemd-machined[194379]: Machine qemu-23-instance-00000028 terminated.
Jan 31 02:58:07 np0005603623 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[246208]: [NOTICE]   (246212) : haproxy version is 2.8.14-c23fe91
Jan 31 02:58:07 np0005603623 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[246208]: [NOTICE]   (246212) : path to executable is /usr/sbin/haproxy
Jan 31 02:58:07 np0005603623 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[246208]: [WARNING]  (246212) : Exiting Master process...
Jan 31 02:58:07 np0005603623 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[246208]: [ALERT]    (246212) : Current worker (246214) exited with code 143 (Terminated)
Jan 31 02:58:07 np0005603623 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[246208]: [WARNING]  (246212) : All workers exited. Exiting... (0)
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.038 226239 INFO nova.virt.libvirt.driver [-] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Instance destroyed successfully.#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.039 226239 DEBUG nova.objects.instance [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lazy-loading 'resources' on Instance uuid a6c36818-897d-4417-bad1-5d1546fa7497 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:58:07 np0005603623 systemd[1]: libpod-0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2.scope: Deactivated successfully.
Jan 31 02:58:07 np0005603623 podman[247046]: 2026-01-31 07:58:07.048651023 +0000 UTC m=+0.055411671 container died 0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 02:58:07 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2-userdata-shm.mount: Deactivated successfully.
Jan 31 02:58:07 np0005603623 systemd[1]: var-lib-containers-storage-overlay-2fc5448f3f4255cec1f50b49977303692b5ef974976839ccde36793b77055730-merged.mount: Deactivated successfully.
Jan 31 02:58:07 np0005603623 podman[247046]: 2026-01-31 07:58:07.087660237 +0000 UTC m=+0.094420865 container cleanup 0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 02:58:07 np0005603623 systemd[1]: libpod-conmon-0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2.scope: Deactivated successfully.
Jan 31 02:58:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:07.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.106 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:07 np0005603623 podman[247085]: 2026-01-31 07:58:07.149640092 +0000 UTC m=+0.046524732 container remove 0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 02:58:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:07.153 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dc7ef237-8878-42de-ac59-97216788f36a]: (4, ('Sat Jan 31 07:58:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1 (0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2)\n0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2\nSat Jan 31 07:58:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1 (0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2)\n0967c212d89d2325dfd01b6ebf08257a6f79fd0dcbc80249d4de065407ff2ca2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:07.156 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[66e12606-4036-4d18-b827-2f1a33b93af4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:07.157 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc58eaedf-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.159 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:07 np0005603623 kernel: tapc58eaedf-20: left promiscuous mode
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.163 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.168 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:07.170 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0e0161c6-ab36-4d0d-9eb8-17b8359cc7f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:07.189 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8445ee58-c10f-4965-a24f-da312879a326]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:07.190 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[207a0d5e-bd39-4b92-85ea-117e052bde30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:07.201 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2c334f07-f756-439e-8017-9573b1795ee8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535039, 'reachable_time': 37841, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247104, 'error': None, 'target': 'ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:07 np0005603623 systemd[1]: run-netns-ovnmeta\x2dc58eaedf\x2d202a\x2d428a\x2dacfb\x2df0b1291517f1.mount: Deactivated successfully.
Jan 31 02:58:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:07.205 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:58:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:07.205 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[3a833170-22b7-4167-ae77-a14041ea0ac5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.214 226239 DEBUG nova.virt.libvirt.vif [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:56:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1883505973',display_name='tempest-ServersAdminTestJSON-server-1883505973',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1883505973',id=40,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:56:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8033316fc42c4926bfd1f8a34b02fa97',ramdisk_id='',reservation_id='r-u30dpo8c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-784933461',owner_user_name='tempest-ServersAdminTestJSON-784933461-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:56:54Z,user_data=None,user_id='93973daeb08c453e90372a79b54b9ede',uuid=a6c36818-897d-4417-bad1-5d1546fa7497,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "address": "fa:16:3e:f5:be:30", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb69cccb-97", "ovs_interfaceid": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.215 226239 DEBUG nova.network.os_vif_util [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Converting VIF {"id": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "address": "fa:16:3e:f5:be:30", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb69cccb-97", "ovs_interfaceid": "bb69cccb-97dc-4472-af6c-0ed0b0324779", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.215 226239 DEBUG nova.network.os_vif_util [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f5:be:30,bridge_name='br-int',has_traffic_filtering=True,id=bb69cccb-97dc-4472-af6c-0ed0b0324779,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb69cccb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.216 226239 DEBUG os_vif [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:be:30,bridge_name='br-int',has_traffic_filtering=True,id=bb69cccb-97dc-4472-af6c-0ed0b0324779,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb69cccb-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.217 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.217 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb69cccb-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.220 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.222 226239 INFO os_vif [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:be:30,bridge_name='br-int',has_traffic_filtering=True,id=bb69cccb-97dc-4472-af6c-0ed0b0324779,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb69cccb-97')#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.740 226239 DEBUG nova.compute.manager [req-70368d18-dac2-479f-a79f-aac9e5201074 req-6c277302-65b6-4a92-89cd-9168811913d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Received event network-vif-unplugged-bb69cccb-97dc-4472-af6c-0ed0b0324779 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.741 226239 DEBUG oslo_concurrency.lockutils [req-70368d18-dac2-479f-a79f-aac9e5201074 req-6c277302-65b6-4a92-89cd-9168811913d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.741 226239 DEBUG oslo_concurrency.lockutils [req-70368d18-dac2-479f-a79f-aac9e5201074 req-6c277302-65b6-4a92-89cd-9168811913d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.742 226239 DEBUG oslo_concurrency.lockutils [req-70368d18-dac2-479f-a79f-aac9e5201074 req-6c277302-65b6-4a92-89cd-9168811913d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.742 226239 DEBUG nova.compute.manager [req-70368d18-dac2-479f-a79f-aac9e5201074 req-6c277302-65b6-4a92-89cd-9168811913d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] No waiting events found dispatching network-vif-unplugged-bb69cccb-97dc-4472-af6c-0ed0b0324779 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:58:07 np0005603623 nova_compute[226235]: 2026-01-31 07:58:07.742 226239 DEBUG nova.compute.manager [req-70368d18-dac2-479f-a79f-aac9e5201074 req-6c277302-65b6-4a92-89cd-9168811913d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Received event network-vif-unplugged-bb69cccb-97dc-4472-af6c-0ed0b0324779 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:58:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:58:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:08.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:58:08 np0005603623 nova_compute[226235]: 2026-01-31 07:58:08.939 226239 INFO nova.virt.libvirt.driver [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Deleting instance files /var/lib/nova/instances/a6c36818-897d-4417-bad1-5d1546fa7497_del#033[00m
Jan 31 02:58:08 np0005603623 nova_compute[226235]: 2026-01-31 07:58:08.940 226239 INFO nova.virt.libvirt.driver [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Deletion of /var/lib/nova/instances/a6c36818-897d-4417-bad1-5d1546fa7497_del complete#033[00m
Jan 31 02:58:09 np0005603623 nova_compute[226235]: 2026-01-31 07:58:09.024 226239 INFO nova.compute.manager [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Took 2.22 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:58:09 np0005603623 nova_compute[226235]: 2026-01-31 07:58:09.025 226239 DEBUG oslo.service.loopingcall [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:58:09 np0005603623 nova_compute[226235]: 2026-01-31 07:58:09.026 226239 DEBUG nova.compute.manager [-] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:58:09 np0005603623 nova_compute[226235]: 2026-01-31 07:58:09.027 226239 DEBUG nova.network.neutron [-] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:58:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:09.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:10 np0005603623 nova_compute[226235]: 2026-01-31 07:58:10.278 226239 DEBUG nova.compute.manager [req-2e1094f7-249f-4080-b787-9a30ca58da78 req-23e6c908-3ed2-4b44-9fde-b8412ff6131a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Received event network-vif-plugged-bb69cccb-97dc-4472-af6c-0ed0b0324779 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:10 np0005603623 nova_compute[226235]: 2026-01-31 07:58:10.278 226239 DEBUG oslo_concurrency.lockutils [req-2e1094f7-249f-4080-b787-9a30ca58da78 req-23e6c908-3ed2-4b44-9fde-b8412ff6131a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:10 np0005603623 nova_compute[226235]: 2026-01-31 07:58:10.279 226239 DEBUG oslo_concurrency.lockutils [req-2e1094f7-249f-4080-b787-9a30ca58da78 req-23e6c908-3ed2-4b44-9fde-b8412ff6131a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:10 np0005603623 nova_compute[226235]: 2026-01-31 07:58:10.279 226239 DEBUG oslo_concurrency.lockutils [req-2e1094f7-249f-4080-b787-9a30ca58da78 req-23e6c908-3ed2-4b44-9fde-b8412ff6131a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:10 np0005603623 nova_compute[226235]: 2026-01-31 07:58:10.279 226239 DEBUG nova.compute.manager [req-2e1094f7-249f-4080-b787-9a30ca58da78 req-23e6c908-3ed2-4b44-9fde-b8412ff6131a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] No waiting events found dispatching network-vif-plugged-bb69cccb-97dc-4472-af6c-0ed0b0324779 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:58:10 np0005603623 nova_compute[226235]: 2026-01-31 07:58:10.279 226239 WARNING nova.compute.manager [req-2e1094f7-249f-4080-b787-9a30ca58da78 req-23e6c908-3ed2-4b44-9fde-b8412ff6131a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Received unexpected event network-vif-plugged-bb69cccb-97dc-4472-af6c-0ed0b0324779 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:58:10 np0005603623 nova_compute[226235]: 2026-01-31 07:58:10.594 226239 DEBUG nova.network.neutron [-] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:58:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:10.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:10 np0005603623 nova_compute[226235]: 2026-01-31 07:58:10.831 226239 DEBUG nova.compute.manager [req-7d9c049b-b03c-4cd0-ba69-d2b042513e15 req-aa2a87ca-e2c0-4806-babc-37400bb183d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Received event network-vif-deleted-bb69cccb-97dc-4472-af6c-0ed0b0324779 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:10 np0005603623 nova_compute[226235]: 2026-01-31 07:58:10.831 226239 INFO nova.compute.manager [req-7d9c049b-b03c-4cd0-ba69-d2b042513e15 req-aa2a87ca-e2c0-4806-babc-37400bb183d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Neutron deleted interface bb69cccb-97dc-4472-af6c-0ed0b0324779; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 02:58:10 np0005603623 nova_compute[226235]: 2026-01-31 07:58:10.832 226239 DEBUG nova.network.neutron [req-7d9c049b-b03c-4cd0-ba69-d2b042513e15 req-aa2a87ca-e2c0-4806-babc-37400bb183d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:58:10 np0005603623 nova_compute[226235]: 2026-01-31 07:58:10.965 226239 INFO nova.compute.manager [-] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Took 1.94 seconds to deallocate network for instance.#033[00m
Jan 31 02:58:10 np0005603623 nova_compute[226235]: 2026-01-31 07:58:10.971 226239 DEBUG nova.compute.manager [req-7d9c049b-b03c-4cd0-ba69-d2b042513e15 req-aa2a87ca-e2c0-4806-babc-37400bb183d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Detach interface failed, port_id=bb69cccb-97dc-4472-af6c-0ed0b0324779, reason: Instance a6c36818-897d-4417-bad1-5d1546fa7497 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 02:58:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:11 np0005603623 nova_compute[226235]: 2026-01-31 07:58:11.095 226239 DEBUG oslo_concurrency.lockutils [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:11 np0005603623 nova_compute[226235]: 2026-01-31 07:58:11.096 226239 DEBUG oslo_concurrency.lockutils [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:11.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:11 np0005603623 nova_compute[226235]: 2026-01-31 07:58:11.181 226239 DEBUG oslo_concurrency.processutils [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:58:11 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3012667903' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:58:11 np0005603623 nova_compute[226235]: 2026-01-31 07:58:11.660 226239 DEBUG oslo_concurrency.processutils [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:11 np0005603623 nova_compute[226235]: 2026-01-31 07:58:11.665 226239 DEBUG nova.compute.provider_tree [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:58:11 np0005603623 nova_compute[226235]: 2026-01-31 07:58:11.750 226239 DEBUG nova.scheduler.client.report [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:58:11 np0005603623 nova_compute[226235]: 2026-01-31 07:58:11.871 226239 DEBUG oslo_concurrency.lockutils [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:11 np0005603623 nova_compute[226235]: 2026-01-31 07:58:11.909 226239 INFO nova.scheduler.client.report [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Deleted allocations for instance a6c36818-897d-4417-bad1-5d1546fa7497#033[00m
Jan 31 02:58:12 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Jan 31 02:58:12 np0005603623 nova_compute[226235]: 2026-01-31 07:58:12.108 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:12 np0005603623 nova_compute[226235]: 2026-01-31 07:58:12.155 226239 DEBUG oslo_concurrency.lockutils [None req-922570ae-8083-46ac-8887-5f0b307ff9d5 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "a6c36818-897d-4417-bad1-5d1546fa7497" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.353s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:12 np0005603623 nova_compute[226235]: 2026-01-31 07:58:12.219 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:12.286 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:58:12 np0005603623 nova_compute[226235]: 2026-01-31 07:58:12.287 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:12.288 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:58:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:12.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:13.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:14 np0005603623 nova_compute[226235]: 2026-01-31 07:58:14.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:58:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/704945848' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:58:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:58:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/704945848' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:58:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:14.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:15.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:15 np0005603623 nova_compute[226235]: 2026-01-31 07:58:15.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:15.290 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:16.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:17 np0005603623 nova_compute[226235]: 2026-01-31 07:58:17.111 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:17.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:17 np0005603623 nova_compute[226235]: 2026-01-31 07:58:17.220 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.177 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.178 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.178 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.178 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.179 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:58:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3367759284' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.633 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:18.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.775 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.776 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4763MB free_disk=20.879932403564453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.777 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.777 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.870 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.871 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:58:18 np0005603623 nova_compute[226235]: 2026-01-31 07:58:18.889 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:58:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:19.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:58:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:58:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4273994335' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:58:19 np0005603623 nova_compute[226235]: 2026-01-31 07:58:19.314 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:19 np0005603623 nova_compute[226235]: 2026-01-31 07:58:19.320 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:58:19 np0005603623 nova_compute[226235]: 2026-01-31 07:58:19.337 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:58:19 np0005603623 nova_compute[226235]: 2026-01-31 07:58:19.361 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:58:19 np0005603623 nova_compute[226235]: 2026-01-31 07:58:19.361 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:20 np0005603623 nova_compute[226235]: 2026-01-31 07:58:20.357 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:20 np0005603623 nova_compute[226235]: 2026-01-31 07:58:20.358 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:20 np0005603623 nova_compute[226235]: 2026-01-31 07:58:20.358 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:20 np0005603623 nova_compute[226235]: 2026-01-31 07:58:20.358 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:58:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:58:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:20.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:58:20 np0005603623 nova_compute[226235]: 2026-01-31 07:58:20.832 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:20 np0005603623 nova_compute[226235]: 2026-01-31 07:58:20.832 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:20 np0005603623 nova_compute[226235]: 2026-01-31 07:58:20.864 226239 DEBUG nova.compute.manager [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:58:20 np0005603623 nova_compute[226235]: 2026-01-31 07:58:20.968 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:20 np0005603623 nova_compute[226235]: 2026-01-31 07:58:20.968 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:20 np0005603623 nova_compute[226235]: 2026-01-31 07:58:20.975 226239 DEBUG nova.virt.hardware [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:58:20 np0005603623 nova_compute[226235]: 2026-01-31 07:58:20.976 226239 INFO nova.compute.claims [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:58:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:58:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:21.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.188 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.209 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.210 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:58:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:58:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4230973850' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.606 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.611 226239 DEBUG nova.compute.provider_tree [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.630 226239 DEBUG nova.scheduler.client.report [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.657 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.689s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.658 226239 DEBUG nova.compute.manager [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.714 226239 DEBUG nova.compute.manager [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.714 226239 DEBUG nova.network.neutron [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.741 226239 INFO nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.763 226239 DEBUG nova.compute.manager [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.887 226239 DEBUG nova.compute.manager [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.888 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.889 226239 INFO nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Creating image(s)#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.912 226239 DEBUG nova.storage.rbd_utils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.937 226239 DEBUG nova.storage.rbd_utils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.962 226239 DEBUG nova.storage.rbd_utils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:21 np0005603623 nova_compute[226235]: 2026-01-31 07:58:21.966 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:22 np0005603623 nova_compute[226235]: 2026-01-31 07:58:22.012 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:22 np0005603623 nova_compute[226235]: 2026-01-31 07:58:22.013 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:22 np0005603623 nova_compute[226235]: 2026-01-31 07:58:22.014 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:22 np0005603623 nova_compute[226235]: 2026-01-31 07:58:22.015 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:22 np0005603623 nova_compute[226235]: 2026-01-31 07:58:22.042 226239 DEBUG nova.storage.rbd_utils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:22 np0005603623 nova_compute[226235]: 2026-01-31 07:58:22.045 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:22 np0005603623 nova_compute[226235]: 2026-01-31 07:58:22.060 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846287.0355747, a6c36818-897d-4417-bad1-5d1546fa7497 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:58:22 np0005603623 nova_compute[226235]: 2026-01-31 07:58:22.061 226239 INFO nova.compute.manager [-] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:58:22 np0005603623 nova_compute[226235]: 2026-01-31 07:58:22.094 226239 DEBUG nova.compute.manager [None req-d77a1f69-7358-4030-b25c-d8d0384fd69d - - - - - -] [instance: a6c36818-897d-4417-bad1-5d1546fa7497] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:22 np0005603623 nova_compute[226235]: 2026-01-31 07:58:22.113 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:22 np0005603623 nova_compute[226235]: 2026-01-31 07:58:22.175 226239 DEBUG nova.policy [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b034a039074641a7b7c872e8b715ca4c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '273fe485cc184dd8bf86440d8d1e05f3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:58:22 np0005603623 nova_compute[226235]: 2026-01-31 07:58:22.221 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:58:22 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:58:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:22.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:58:22 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:58:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:23.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:23 np0005603623 nova_compute[226235]: 2026-01-31 07:58:23.271 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:23 np0005603623 nova_compute[226235]: 2026-01-31 07:58:23.349 226239 DEBUG nova.storage.rbd_utils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] resizing rbd image ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:58:23 np0005603623 nova_compute[226235]: 2026-01-31 07:58:23.507 226239 DEBUG nova.objects.instance [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lazy-loading 'migration_context' on Instance uuid ba72fe35-90dd-4806-9195-8a8ff81ae9f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:58:23 np0005603623 nova_compute[226235]: 2026-01-31 07:58:23.538 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:58:23 np0005603623 nova_compute[226235]: 2026-01-31 07:58:23.539 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Ensure instance console log exists: /var/lib/nova/instances/ba72fe35-90dd-4806-9195-8a8ff81ae9f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:58:23 np0005603623 nova_compute[226235]: 2026-01-31 07:58:23.539 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:23 np0005603623 nova_compute[226235]: 2026-01-31 07:58:23.539 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:23 np0005603623 nova_compute[226235]: 2026-01-31 07:58:23.539 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:24 np0005603623 nova_compute[226235]: 2026-01-31 07:58:24.273 226239 DEBUG nova.network.neutron [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Successfully created port: cdac9e6d-0972-4cab-a922-cf4d3f769c6c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:58:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:24.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:25.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:25 np0005603623 nova_compute[226235]: 2026-01-31 07:58:25.716 226239 DEBUG nova.network.neutron [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Successfully updated port: cdac9e6d-0972-4cab-a922-cf4d3f769c6c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:58:25 np0005603623 nova_compute[226235]: 2026-01-31 07:58:25.750 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "refresh_cache-ba72fe35-90dd-4806-9195-8a8ff81ae9f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:58:25 np0005603623 nova_compute[226235]: 2026-01-31 07:58:25.750 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquired lock "refresh_cache-ba72fe35-90dd-4806-9195-8a8ff81ae9f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:58:25 np0005603623 nova_compute[226235]: 2026-01-31 07:58:25.750 226239 DEBUG nova.network.neutron [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:58:25 np0005603623 nova_compute[226235]: 2026-01-31 07:58:25.836 226239 DEBUG nova.compute.manager [req-6ef78bfb-aabd-4dc1-ae35-3573b617f80a req-351f9dff-0cb3-432f-aa51-f1a6afd4a0ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Received event network-changed-cdac9e6d-0972-4cab-a922-cf4d3f769c6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:25 np0005603623 nova_compute[226235]: 2026-01-31 07:58:25.837 226239 DEBUG nova.compute.manager [req-6ef78bfb-aabd-4dc1-ae35-3573b617f80a req-351f9dff-0cb3-432f-aa51-f1a6afd4a0ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Refreshing instance network info cache due to event network-changed-cdac9e6d-0972-4cab-a922-cf4d3f769c6c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:58:25 np0005603623 nova_compute[226235]: 2026-01-31 07:58:25.837 226239 DEBUG oslo_concurrency.lockutils [req-6ef78bfb-aabd-4dc1-ae35-3573b617f80a req-351f9dff-0cb3-432f-aa51-f1a6afd4a0ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-ba72fe35-90dd-4806-9195-8a8ff81ae9f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:58:26 np0005603623 nova_compute[226235]: 2026-01-31 07:58:26.008 226239 DEBUG nova.network.neutron [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:58:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:58:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:26.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.115 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:27.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.222 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.358 226239 DEBUG nova.network.neutron [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Updating instance_info_cache with network_info: [{"id": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "address": "fa:16:3e:97:e5:c9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdac9e6d-09", "ovs_interfaceid": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.381 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Releasing lock "refresh_cache-ba72fe35-90dd-4806-9195-8a8ff81ae9f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.382 226239 DEBUG nova.compute.manager [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Instance network_info: |[{"id": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "address": "fa:16:3e:97:e5:c9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdac9e6d-09", "ovs_interfaceid": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.382 226239 DEBUG oslo_concurrency.lockutils [req-6ef78bfb-aabd-4dc1-ae35-3573b617f80a req-351f9dff-0cb3-432f-aa51-f1a6afd4a0ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-ba72fe35-90dd-4806-9195-8a8ff81ae9f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.383 226239 DEBUG nova.network.neutron [req-6ef78bfb-aabd-4dc1-ae35-3573b617f80a req-351f9dff-0cb3-432f-aa51-f1a6afd4a0ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Refreshing network info cache for port cdac9e6d-0972-4cab-a922-cf4d3f769c6c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.386 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Start _get_guest_xml network_info=[{"id": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "address": "fa:16:3e:97:e5:c9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdac9e6d-09", "ovs_interfaceid": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.391 226239 WARNING nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.396 226239 DEBUG nova.virt.libvirt.host [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.396 226239 DEBUG nova.virt.libvirt.host [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.402 226239 DEBUG nova.virt.libvirt.host [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.402 226239 DEBUG nova.virt.libvirt.host [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.403 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.404 226239 DEBUG nova.virt.hardware [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.404 226239 DEBUG nova.virt.hardware [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.404 226239 DEBUG nova.virt.hardware [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.404 226239 DEBUG nova.virt.hardware [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.405 226239 DEBUG nova.virt.hardware [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.405 226239 DEBUG nova.virt.hardware [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.405 226239 DEBUG nova.virt.hardware [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.405 226239 DEBUG nova.virt.hardware [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.405 226239 DEBUG nova.virt.hardware [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.406 226239 DEBUG nova.virt.hardware [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.406 226239 DEBUG nova.virt.hardware [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.408 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:58:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3458197171' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.867 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.889 226239 DEBUG nova.storage.rbd_utils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:27 np0005603623 nova_compute[226235]: 2026-01-31 07:58:27.892 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:58:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1879360094' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.323 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.325 226239 DEBUG nova.virt.libvirt.vif [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:58:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-709128910',display_name='tempest-VolumesAdminNegativeTest-server-709128910',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-volumesadminnegativetest-server-709128910',id=42,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJi1q2vjU3oXwmY2ffjVGLPXOBEh8qUUJE5r4ONrjGH3XHm6dlUTSzVpJpxExvl6X7xbQyEMzqjUiwuQRK3OBD1eEX65+Q6/e+BZWXbmhA9FXHDtzew/pw1jU9p7vxh+uA==',key_name='tempest-keypair-941200395',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='273fe485cc184dd8bf86440d8d1e05f3',ramdisk_id='',reservation_id='r-oipxgz7v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1346674431',owner_user_name='tempest-VolumesAdminNegativeTest-1346674431-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:58:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b034a039074641a7b7c872e8b715ca4c',uuid=ba72fe35-90dd-4806-9195-8a8ff81ae9f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "address": "fa:16:3e:97:e5:c9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdac9e6d-09", "ovs_interfaceid": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.325 226239 DEBUG nova.network.os_vif_util [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Converting VIF {"id": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "address": "fa:16:3e:97:e5:c9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdac9e6d-09", "ovs_interfaceid": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.326 226239 DEBUG nova.network.os_vif_util [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:e5:c9,bridge_name='br-int',has_traffic_filtering=True,id=cdac9e6d-0972-4cab-a922-cf4d3f769c6c,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdac9e6d-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.327 226239 DEBUG nova.objects.instance [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid ba72fe35-90dd-4806-9195-8a8ff81ae9f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.352 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  <uuid>ba72fe35-90dd-4806-9195-8a8ff81ae9f0</uuid>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  <name>instance-0000002a</name>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <nova:name>tempest-VolumesAdminNegativeTest-server-709128910</nova:name>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:58:27</nova:creationTime>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <nova:user uuid="b034a039074641a7b7c872e8b715ca4c">tempest-VolumesAdminNegativeTest-1346674431-project-member</nova:user>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <nova:project uuid="273fe485cc184dd8bf86440d8d1e05f3">tempest-VolumesAdminNegativeTest-1346674431</nova:project>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <nova:port uuid="cdac9e6d-0972-4cab-a922-cf4d3f769c6c">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <entry name="serial">ba72fe35-90dd-4806-9195-8a8ff81ae9f0</entry>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <entry name="uuid">ba72fe35-90dd-4806-9195-8a8ff81ae9f0</entry>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk.config">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:97:e5:c9"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <target dev="tapcdac9e6d-09"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/ba72fe35-90dd-4806-9195-8a8ff81ae9f0/console.log" append="off"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:58:28 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:58:28 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:58:28 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:58:28 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.353 226239 DEBUG nova.compute.manager [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Preparing to wait for external event network-vif-plugged-cdac9e6d-0972-4cab-a922-cf4d3f769c6c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.353 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.353 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.354 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.354 226239 DEBUG nova.virt.libvirt.vif [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:58:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-709128910',display_name='tempest-VolumesAdminNegativeTest-server-709128910',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-volumesadminnegativetest-server-709128910',id=42,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJi1q2vjU3oXwmY2ffjVGLPXOBEh8qUUJE5r4ONrjGH3XHm6dlUTSzVpJpxExvl6X7xbQyEMzqjUiwuQRK3OBD1eEX65+Q6/e+BZWXbmhA9FXHDtzew/pw1jU9p7vxh+uA==',key_name='tempest-keypair-941200395',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='273fe485cc184dd8bf86440d8d1e05f3',ramdisk_id='',reservation_id='r-oipxgz7v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1346674431',owner_user_name='tempest-VolumesAdminNegativeTest-1346674431-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:58:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b034a039074641a7b7c872e8b715ca4c',uuid=ba72fe35-90dd-4806-9195-8a8ff81ae9f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "address": "fa:16:3e:97:e5:c9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdac9e6d-09", "ovs_interfaceid": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.355 226239 DEBUG nova.network.os_vif_util [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Converting VIF {"id": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "address": "fa:16:3e:97:e5:c9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdac9e6d-09", "ovs_interfaceid": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.356 226239 DEBUG nova.network.os_vif_util [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:e5:c9,bridge_name='br-int',has_traffic_filtering=True,id=cdac9e6d-0972-4cab-a922-cf4d3f769c6c,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdac9e6d-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.357 226239 DEBUG os_vif [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:e5:c9,bridge_name='br-int',has_traffic_filtering=True,id=cdac9e6d-0972-4cab-a922-cf4d3f769c6c,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdac9e6d-09') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.357 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.358 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.358 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.362 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.363 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcdac9e6d-09, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.363 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcdac9e6d-09, col_values=(('external_ids', {'iface-id': 'cdac9e6d-0972-4cab-a922-cf4d3f769c6c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:e5:c9', 'vm-uuid': 'ba72fe35-90dd-4806-9195-8a8ff81ae9f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.365 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:28 np0005603623 NetworkManager[48970]: <info>  [1769846308.3657] manager: (tapcdac9e6d-09): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.367 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.369 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.370 226239 INFO os_vif [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:e5:c9,bridge_name='br-int',has_traffic_filtering=True,id=cdac9e6d-0972-4cab-a922-cf4d3f769c6c,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdac9e6d-09')#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.430 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.430 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.431 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] No VIF found with MAC fa:16:3e:97:e5:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.431 226239 INFO nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Using config drive#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.453 226239 DEBUG nova.storage.rbd_utils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:28.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.794 226239 INFO nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Creating config drive at /var/lib/nova/instances/ba72fe35-90dd-4806-9195-8a8ff81ae9f0/disk.config#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.799 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ba72fe35-90dd-4806-9195-8a8ff81ae9f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpdc5qh0gz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.845 226239 DEBUG nova.network.neutron [req-6ef78bfb-aabd-4dc1-ae35-3573b617f80a req-351f9dff-0cb3-432f-aa51-f1a6afd4a0ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Updated VIF entry in instance network info cache for port cdac9e6d-0972-4cab-a922-cf4d3f769c6c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.846 226239 DEBUG nova.network.neutron [req-6ef78bfb-aabd-4dc1-ae35-3573b617f80a req-351f9dff-0cb3-432f-aa51-f1a6afd4a0ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Updating instance_info_cache with network_info: [{"id": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "address": "fa:16:3e:97:e5:c9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdac9e6d-09", "ovs_interfaceid": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.881 226239 DEBUG oslo_concurrency.lockutils [req-6ef78bfb-aabd-4dc1-ae35-3573b617f80a req-351f9dff-0cb3-432f-aa51-f1a6afd4a0ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-ba72fe35-90dd-4806-9195-8a8ff81ae9f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.921 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ba72fe35-90dd-4806-9195-8a8ff81ae9f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpdc5qh0gz" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.950 226239 DEBUG nova.storage.rbd_utils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:28 np0005603623 nova_compute[226235]: 2026-01-31 07:58:28.955 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ba72fe35-90dd-4806-9195-8a8ff81ae9f0/disk.config ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:29.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.137 226239 DEBUG oslo_concurrency.processutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ba72fe35-90dd-4806-9195-8a8ff81ae9f0/disk.config ba72fe35-90dd-4806-9195-8a8ff81ae9f0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.137 226239 INFO nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Deleting local config drive /var/lib/nova/instances/ba72fe35-90dd-4806-9195-8a8ff81ae9f0/disk.config because it was imported into RBD.#033[00m
Jan 31 02:58:29 np0005603623 kernel: tapcdac9e6d-09: entered promiscuous mode
Jan 31 02:58:29 np0005603623 NetworkManager[48970]: <info>  [1769846309.1801] manager: (tapcdac9e6d-09): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Jan 31 02:58:29 np0005603623 systemd-udevd[247572]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:58:29 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:29Z|00171|binding|INFO|Claiming lport cdac9e6d-0972-4cab-a922-cf4d3f769c6c for this chassis.
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.213 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:29 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:29Z|00172|binding|INFO|cdac9e6d-0972-4cab-a922-cf4d3f769c6c: Claiming fa:16:3e:97:e5:c9 10.100.0.4
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.218 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:29 np0005603623 NetworkManager[48970]: <info>  [1769846309.2236] device (tapcdac9e6d-09): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:58:29 np0005603623 NetworkManager[48970]: <info>  [1769846309.2245] device (tapcdac9e6d-09): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:58:29 np0005603623 systemd-machined[194379]: New machine qemu-24-instance-0000002a.
Jan 31 02:58:29 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:29Z|00173|binding|INFO|Setting lport cdac9e6d-0972-4cab-a922-cf4d3f769c6c ovn-installed in OVS
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.242 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:29 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:29Z|00174|binding|INFO|Setting lport cdac9e6d-0972-4cab-a922-cf4d3f769c6c up in Southbound
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.245 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:e5:c9 10.100.0.4'], port_security=['fa:16:3e:97:e5:c9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ba72fe35-90dd-4806-9195-8a8ff81ae9f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '273fe485cc184dd8bf86440d8d1e05f3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8e774ed5-8b8d-44d7-ba02-981099294fdd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c7c7cc1-6459-40e3-9ba1-aa227a70d0ae, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=cdac9e6d-0972-4cab-a922-cf4d3f769c6c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.246 143258 INFO neutron.agent.ovn.metadata.agent [-] Port cdac9e6d-0972-4cab-a922-cf4d3f769c6c in datapath 81f9575e-f3c4-4956-9a5c-9cb1776ab08e bound to our chassis#033[00m
Jan 31 02:58:29 np0005603623 systemd[1]: Started Virtual Machine qemu-24-instance-0000002a.
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.248 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81f9575e-f3c4-4956-9a5c-9cb1776ab08e#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.255 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3e38a915-09e4-45eb-a7f1-0dd7c033062d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.255 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap81f9575e-f1 in ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.257 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap81f9575e-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.257 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a17a99a3-5094-4882-b041-0538bea16245]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.257 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a22ef3-4930-4699-8b9d-cbc3725e348c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.268 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[ec5c5f6b-e5be-4e02-8402-5c99a576303a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.280 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d8bb35ac-ad2c-46e6-a4bf-8c1755036417]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.302 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[9e6c80e3-2be3-487d-9241-ef289ec51945]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.306 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4114ff10-392b-4d78-88d6-cbf0cb6c5ee4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 NetworkManager[48970]: <info>  [1769846309.3083] manager: (tap81f9575e-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/85)
Jan 31 02:58:29 np0005603623 systemd-udevd[247576]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.334 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d8a2c391-c421-41e9-a907-63af9d3b8770]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.337 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6370378c-a99b-4cbc-898c-ce15eb515bad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 NetworkManager[48970]: <info>  [1769846309.3578] device (tap81f9575e-f0): carrier: link connected
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.362 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c6cd25c8-9bc0-4bc4-9b8f-5bea953c0130]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.374 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7dce7013-3335-443b-8338-715b4e477c42]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81f9575e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:70:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544978, 'reachable_time': 40615, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247608, 'error': None, 'target': 'ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.386 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f11a56b7-42b3-4786-9ee4-6c1d5fbf13fa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4f:7017'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 544978, 'tstamp': 544978}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247609, 'error': None, 'target': 'ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.396 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[21f3774f-d29f-4580-8822-1ae804e4311f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81f9575e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:70:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544978, 'reachable_time': 40615, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247610, 'error': None, 'target': 'ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.421 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[538cda77-b821-4b79-a0d1-71fa1ab2c85e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.458 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7d0acbcb-d220-4577-9772-2d3b362822fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.459 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81f9575e-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.460 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.460 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81f9575e-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.462 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:29 np0005603623 kernel: tap81f9575e-f0: entered promiscuous mode
Jan 31 02:58:29 np0005603623 NetworkManager[48970]: <info>  [1769846309.4643] manager: (tap81f9575e-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.466 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.467 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81f9575e-f0, col_values=(('external_ids', {'iface-id': 'fdf58350-21cf-49b6-9dfa-eef283a74bd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.469 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:29 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:29Z|00175|binding|INFO|Releasing lport fdf58350-21cf-49b6-9dfa-eef283a74bd5 from this chassis (sb_readonly=0)
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.470 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.471 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/81f9575e-f3c4-4956-9a5c-9cb1776ab08e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/81f9575e-f3c4-4956-9a5c-9cb1776ab08e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.471 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2644e051-aa24-4c37-831e-a463465ab5d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.472 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-81f9575e-f3c4-4956-9a5c-9cb1776ab08e
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/81f9575e-f3c4-4956-9a5c-9cb1776ab08e.pid.haproxy
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 81f9575e-f3c4-4956-9a5c-9cb1776ab08e
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:58:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:29.473 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'env', 'PROCESS_TAG=haproxy-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/81f9575e-f3c4-4956-9a5c-9cb1776ab08e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.475 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.502 226239 DEBUG nova.compute.manager [req-428ddbf2-e3f4-4e9c-ad9c-2658a39665cf req-70bd37a6-145a-4618-b490-1c3b882cbada fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Received event network-vif-plugged-cdac9e6d-0972-4cab-a922-cf4d3f769c6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.503 226239 DEBUG oslo_concurrency.lockutils [req-428ddbf2-e3f4-4e9c-ad9c-2658a39665cf req-70bd37a6-145a-4618-b490-1c3b882cbada fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.503 226239 DEBUG oslo_concurrency.lockutils [req-428ddbf2-e3f4-4e9c-ad9c-2658a39665cf req-70bd37a6-145a-4618-b490-1c3b882cbada fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.503 226239 DEBUG oslo_concurrency.lockutils [req-428ddbf2-e3f4-4e9c-ad9c-2658a39665cf req-70bd37a6-145a-4618-b490-1c3b882cbada fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.504 226239 DEBUG nova.compute.manager [req-428ddbf2-e3f4-4e9c-ad9c-2658a39665cf req-70bd37a6-145a-4618-b490-1c3b882cbada fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Processing event network-vif-plugged-cdac9e6d-0972-4cab-a922-cf4d3f769c6c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.733 226239 DEBUG nova.compute.manager [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.734 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846309.733612, ba72fe35-90dd-4806-9195-8a8ff81ae9f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.735 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] VM Started (Lifecycle Event)#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.737 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.741 226239 INFO nova.virt.libvirt.driver [-] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Instance spawned successfully.#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.742 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:58:29 np0005603623 podman[247683]: 2026-01-31 07:58:29.821355984 +0000 UTC m=+0.050844636 container create eb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.829 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.836 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.839 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.839 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.840 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.840 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.841 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.841 226239 DEBUG nova.virt.libvirt.driver [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:29 np0005603623 systemd[1]: Started libpod-conmon-eb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063.scope.
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.861 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.861 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846309.733725, ba72fe35-90dd-4806-9195-8a8ff81ae9f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.861 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:58:29 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:58:29 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/758912f774509dd6a7150078e5836d6b7db4868057269d1d7ccf9bdec9b71693/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:58:29 np0005603623 podman[247683]: 2026-01-31 07:58:29.788748942 +0000 UTC m=+0.018237624 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.904 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.908 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846309.737008, ba72fe35-90dd-4806-9195-8a8ff81ae9f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.908 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.947 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.950 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:58:29 np0005603623 podman[247683]: 2026-01-31 07:58:29.960080498 +0000 UTC m=+0.189569170 container init eb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 02:58:29 np0005603623 podman[247683]: 2026-01-31 07:58:29.964448675 +0000 UTC m=+0.193937327 container start eb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 02:58:29 np0005603623 neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e[247698]: [NOTICE]   (247702) : New worker (247704) forked
Jan 31 02:58:29 np0005603623 neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e[247698]: [NOTICE]   (247702) : Loading success.
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.984 226239 INFO nova.compute.manager [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Took 8.10 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.985 226239 DEBUG nova.compute.manager [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:29 np0005603623 nova_compute[226235]: 2026-01-31 07:58:29.993 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:58:30 np0005603623 nova_compute[226235]: 2026-01-31 07:58:30.077 226239 INFO nova.compute.manager [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Took 9.14 seconds to build instance.#033[00m
Jan 31 02:58:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:30.092 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:30.093 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:58:30.093 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:30 np0005603623 nova_compute[226235]: 2026-01-31 07:58:30.130 226239 DEBUG oslo_concurrency.lockutils [None req-7742d1e4-24b6-4655-a3c6-b98a0fe7a5cf b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:30.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:31.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:31 np0005603623 nova_compute[226235]: 2026-01-31 07:58:31.627 226239 DEBUG nova.compute.manager [req-46734190-407c-424b-8a2b-f2c4b14e385f req-d65d43cc-face-4b1e-9c50-234274b6d752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Received event network-vif-plugged-cdac9e6d-0972-4cab-a922-cf4d3f769c6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:31 np0005603623 nova_compute[226235]: 2026-01-31 07:58:31.628 226239 DEBUG oslo_concurrency.lockutils [req-46734190-407c-424b-8a2b-f2c4b14e385f req-d65d43cc-face-4b1e-9c50-234274b6d752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:31 np0005603623 nova_compute[226235]: 2026-01-31 07:58:31.629 226239 DEBUG oslo_concurrency.lockutils [req-46734190-407c-424b-8a2b-f2c4b14e385f req-d65d43cc-face-4b1e-9c50-234274b6d752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:31 np0005603623 nova_compute[226235]: 2026-01-31 07:58:31.629 226239 DEBUG oslo_concurrency.lockutils [req-46734190-407c-424b-8a2b-f2c4b14e385f req-d65d43cc-face-4b1e-9c50-234274b6d752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:31 np0005603623 nova_compute[226235]: 2026-01-31 07:58:31.629 226239 DEBUG nova.compute.manager [req-46734190-407c-424b-8a2b-f2c4b14e385f req-d65d43cc-face-4b1e-9c50-234274b6d752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] No waiting events found dispatching network-vif-plugged-cdac9e6d-0972-4cab-a922-cf4d3f769c6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:58:31 np0005603623 nova_compute[226235]: 2026-01-31 07:58:31.629 226239 WARNING nova.compute.manager [req-46734190-407c-424b-8a2b-f2c4b14e385f req-d65d43cc-face-4b1e-9c50-234274b6d752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Received unexpected event network-vif-plugged-cdac9e6d-0972-4cab-a922-cf4d3f769c6c for instance with vm_state active and task_state None.#033[00m
Jan 31 02:58:32 np0005603623 nova_compute[226235]: 2026-01-31 07:58:32.117 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:58:32 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:32Z|00176|binding|INFO|Releasing lport fdf58350-21cf-49b6-9dfa-eef283a74bd5 from this chassis (sb_readonly=0)
Jan 31 02:58:32 np0005603623 nova_compute[226235]: 2026-01-31 07:58:32.401 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:32.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:32 np0005603623 nova_compute[226235]: 2026-01-31 07:58:32.940 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:32 np0005603623 NetworkManager[48970]: <info>  [1769846312.9419] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 31 02:58:32 np0005603623 NetworkManager[48970]: <info>  [1769846312.9430] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Jan 31 02:58:33 np0005603623 nova_compute[226235]: 2026-01-31 07:58:33.012 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:33 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:33Z|00177|binding|INFO|Releasing lport fdf58350-21cf-49b6-9dfa-eef283a74bd5 from this chassis (sb_readonly=0)
Jan 31 02:58:33 np0005603623 nova_compute[226235]: 2026-01-31 07:58:33.030 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:58:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:33.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:58:33 np0005603623 nova_compute[226235]: 2026-01-31 07:58:33.365 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:33 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:58:33 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:58:33 np0005603623 nova_compute[226235]: 2026-01-31 07:58:33.688 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:33 np0005603623 nova_compute[226235]: 2026-01-31 07:58:33.776 226239 DEBUG nova.compute.manager [req-5e4c8033-3ebb-454c-8be9-7129ec791982 req-dba1deb0-7580-4114-b486-e4132cc93c11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Received event network-changed-cdac9e6d-0972-4cab-a922-cf4d3f769c6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:33 np0005603623 nova_compute[226235]: 2026-01-31 07:58:33.776 226239 DEBUG nova.compute.manager [req-5e4c8033-3ebb-454c-8be9-7129ec791982 req-dba1deb0-7580-4114-b486-e4132cc93c11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Refreshing instance network info cache due to event network-changed-cdac9e6d-0972-4cab-a922-cf4d3f769c6c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:58:33 np0005603623 nova_compute[226235]: 2026-01-31 07:58:33.777 226239 DEBUG oslo_concurrency.lockutils [req-5e4c8033-3ebb-454c-8be9-7129ec791982 req-dba1deb0-7580-4114-b486-e4132cc93c11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-ba72fe35-90dd-4806-9195-8a8ff81ae9f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:58:33 np0005603623 nova_compute[226235]: 2026-01-31 07:58:33.777 226239 DEBUG oslo_concurrency.lockutils [req-5e4c8033-3ebb-454c-8be9-7129ec791982 req-dba1deb0-7580-4114-b486-e4132cc93c11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-ba72fe35-90dd-4806-9195-8a8ff81ae9f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:58:33 np0005603623 nova_compute[226235]: 2026-01-31 07:58:33.777 226239 DEBUG nova.network.neutron [req-5e4c8033-3ebb-454c-8be9-7129ec791982 req-dba1deb0-7580-4114-b486-e4132cc93c11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Refreshing network info cache for port cdac9e6d-0972-4cab-a922-cf4d3f769c6c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:58:33 np0005603623 podman[247845]: 2026-01-31 07:58:33.955421549 +0000 UTC m=+0.046211042 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 31 02:58:33 np0005603623 podman[247846]: 2026-01-31 07:58:33.983440198 +0000 UTC m=+0.074338065 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:58:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:34.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:35.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:35 np0005603623 nova_compute[226235]: 2026-01-31 07:58:35.796 226239 DEBUG nova.network.neutron [req-5e4c8033-3ebb-454c-8be9-7129ec791982 req-dba1deb0-7580-4114-b486-e4132cc93c11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Updated VIF entry in instance network info cache for port cdac9e6d-0972-4cab-a922-cf4d3f769c6c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:58:35 np0005603623 nova_compute[226235]: 2026-01-31 07:58:35.797 226239 DEBUG nova.network.neutron [req-5e4c8033-3ebb-454c-8be9-7129ec791982 req-dba1deb0-7580-4114-b486-e4132cc93c11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Updating instance_info_cache with network_info: [{"id": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "address": "fa:16:3e:97:e5:c9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdac9e6d-09", "ovs_interfaceid": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:58:35 np0005603623 nova_compute[226235]: 2026-01-31 07:58:35.888 226239 DEBUG oslo_concurrency.lockutils [req-5e4c8033-3ebb-454c-8be9-7129ec791982 req-dba1deb0-7580-4114-b486-e4132cc93c11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-ba72fe35-90dd-4806-9195-8a8ff81ae9f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:58:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:58:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:36.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:58:37 np0005603623 nova_compute[226235]: 2026-01-31 07:58:37.119 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:58:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000064s ======
Jan 31 02:58:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:37.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 31 02:58:38 np0005603623 nova_compute[226235]: 2026-01-31 07:58:38.370 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:58:38 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:38Z|00178|binding|INFO|Releasing lport fdf58350-21cf-49b6-9dfa-eef283a74bd5 from this chassis (sb_readonly=0)
Jan 31 02:58:38 np0005603623 nova_compute[226235]: 2026-01-31 07:58:38.500 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:58:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:38.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:39.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:58:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:58:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:40.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:58:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:41.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:58:42 np0005603623 nova_compute[226235]: 2026-01-31 07:58:42.121 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:58:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:42.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:43.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:43 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:43Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:97:e5:c9 10.100.0.4
Jan 31 02:58:43 np0005603623 ovn_controller[133449]: 2026-01-31T07:58:43Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:97:e5:c9 10.100.0.4
Jan 31 02:58:43 np0005603623 nova_compute[226235]: 2026-01-31 07:58:43.372 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:58:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:44.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:58:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:45.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.728851) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846325728948, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2684, "num_deletes": 502, "total_data_size": 5806932, "memory_usage": 5886528, "flush_reason": "Manual Compaction"}
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846325746690, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 3656389, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28389, "largest_seqno": 31067, "table_properties": {"data_size": 3646130, "index_size": 5985, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3205, "raw_key_size": 25689, "raw_average_key_size": 20, "raw_value_size": 3623274, "raw_average_value_size": 2875, "num_data_blocks": 260, "num_entries": 1260, "num_filter_entries": 1260, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846110, "oldest_key_time": 1769846110, "file_creation_time": 1769846325, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 17895 microseconds, and 7803 cpu microseconds.
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.746744) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 3656389 bytes OK
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.746763) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.748870) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.748895) EVENT_LOG_v1 {"time_micros": 1769846325748888, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.748913) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 5794218, prev total WAL file size 5794218, number of live WAL files 2.
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.749715) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(3570KB)], [57(10MB)]
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846325749773, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 14981609, "oldest_snapshot_seqno": -1}
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5500 keys, 9230157 bytes, temperature: kUnknown
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846325913369, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 9230157, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9193402, "index_size": 21925, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 140350, "raw_average_key_size": 25, "raw_value_size": 9094587, "raw_average_value_size": 1653, "num_data_blocks": 886, "num_entries": 5500, "num_filter_entries": 5500, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769846325, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.913909) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 9230157 bytes
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.989866) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 91.5 rd, 56.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 10.8 +0.0 blob) out(8.8 +0.0 blob), read-write-amplify(6.6) write-amplify(2.5) OK, records in: 6515, records dropped: 1015 output_compression: NoCompression
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.989903) EVENT_LOG_v1 {"time_micros": 1769846325989888, "job": 34, "event": "compaction_finished", "compaction_time_micros": 163711, "compaction_time_cpu_micros": 19856, "output_level": 6, "num_output_files": 1, "total_output_size": 9230157, "num_input_records": 6515, "num_output_records": 5500, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846325991328, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846325992649, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.749611) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.992704) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.992710) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.992711) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.992713) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:58:45 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-07:58:45.992715) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:58:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:46.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:47 np0005603623 nova_compute[226235]: 2026-01-31 07:58:47.122 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:58:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:47.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:48 np0005603623 nova_compute[226235]: 2026-01-31 07:58:48.375 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:58:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:58:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:48.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:58:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:49.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:50.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:51.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:51 np0005603623 nova_compute[226235]: 2026-01-31 07:58:51.673 226239 DEBUG oslo_concurrency.lockutils [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:58:51 np0005603623 nova_compute[226235]: 2026-01-31 07:58:51.673 226239 DEBUG oslo_concurrency.lockutils [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:58:51 np0005603623 nova_compute[226235]: 2026-01-31 07:58:51.711 226239 DEBUG nova.objects.instance [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lazy-loading 'flavor' on Instance uuid ba72fe35-90dd-4806-9195-8a8ff81ae9f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:58:51 np0005603623 nova_compute[226235]: 2026-01-31 07:58:51.777 226239 DEBUG oslo_concurrency.lockutils [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.017 226239 DEBUG oslo_concurrency.lockutils [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.017 226239 DEBUG oslo_concurrency.lockutils [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.018 226239 INFO nova.compute.manager [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Attaching volume db1124e5-1a30-4118-a44b-369e05e566a9 to /dev/vdb
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.125 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.235 226239 DEBUG os_brick.utils [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.237 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.247 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.248 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[9e585486-570c-4bfc-8904-04ad8173ce4a]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.249 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.256 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.256 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[3b779089-145b-4375-a346-fb48ef783080]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.258 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.264 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.265 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[438ce162-8b17-4577-9b59-b4d73bfdeea6]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.266 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[c6a1d784-9652-4bb9-8370-90eeaf988152]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.267 226239 DEBUG oslo_concurrency.processutils [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.287 226239 DEBUG oslo_concurrency.processutils [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "nvme version" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.289 226239 DEBUG os_brick.initiator.connectors.lightos [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.290 226239 DEBUG os_brick.initiator.connectors.lightos [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.290 226239 DEBUG os_brick.initiator.connectors.lightos [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.290 226239 DEBUG os_brick.utils [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] <== get_connector_properties: return (54ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Jan 31 02:58:52 np0005603623 nova_compute[226235]: 2026-01-31 07:58:52.291 226239 DEBUG nova.virt.block_device [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Updating existing volume attachment record: e3151b1e-a19d-4cef-b944-30b818e6b6f5 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Jan 31 02:58:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:52.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:53 np0005603623 nova_compute[226235]: 2026-01-31 07:58:53.133 226239 DEBUG nova.objects.instance [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lazy-loading 'flavor' on Instance uuid ba72fe35-90dd-4806-9195-8a8ff81ae9f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:58:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:58:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:53.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:58:53 np0005603623 nova_compute[226235]: 2026-01-31 07:58:53.171 226239 DEBUG nova.virt.libvirt.driver [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Attempting to attach volume db1124e5-1a30-4118-a44b-369e05e566a9 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168
Jan 31 02:58:53 np0005603623 nova_compute[226235]: 2026-01-31 07:58:53.174 226239 DEBUG nova.virt.libvirt.guest [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 02:58:53 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 02:58:53 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-db1124e5-1a30-4118-a44b-369e05e566a9">
Jan 31 02:58:53 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 02:58:53 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 02:58:53 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 02:58:53 np0005603623 nova_compute[226235]:  </source>
Jan 31 02:58:53 np0005603623 nova_compute[226235]:  <auth username="openstack">
Jan 31 02:58:53 np0005603623 nova_compute[226235]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:58:53 np0005603623 nova_compute[226235]:  </auth>
Jan 31 02:58:53 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 02:58:53 np0005603623 nova_compute[226235]:  <serial>db1124e5-1a30-4118-a44b-369e05e566a9</serial>
Jan 31 02:58:53 np0005603623 nova_compute[226235]: </disk>
Jan 31 02:58:53 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
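The `<disk>` document that `nova.virt.libvirt.guest` logs above can be reproduced in a few lines; a minimal sketch using the standard library (the monitor addresses and volume UUID are taken from the log, but the builder itself is illustrative — Nova's real generator lives in `nova.virt.libvirt.config`):

```python
# Illustrative reconstruction of the RBD disk XML logged above,
# built with xml.etree.ElementTree. Not Nova's actual config code.
import xml.etree.ElementTree as ET

def rbd_disk_xml(volume_id, monitors, target_dev="vdb"):
    disk = ET.Element("disk", type="network", device="disk")
    ET.SubElement(disk, "driver", name="qemu", type="raw",
                  cache="none", discard="unmap")
    source = ET.SubElement(disk, "source", protocol="rbd",
                           name=f"volumes/volume-{volume_id}")
    for host in monitors:
        # Ceph monitor endpoints, as in the <host .../> elements above.
        ET.SubElement(source, "host", name=host, port="6789")
    ET.SubElement(disk, "target", dev=target_dev, bus="virtio")
    ET.SubElement(disk, "serial").text = volume_id
    return ET.tostring(disk, encoding="unicode")

disk_xml = rbd_disk_xml("db1124e5-1a30-4118-a44b-369e05e566a9",
                        ["192.168.122.100", "192.168.122.102",
                         "192.168.122.101"])
```

Note the log's earlier warning at driver.py:2168: with `target_bus = virtio` (as opposed to virtio-scsi), `discard="unmap"` is present in the XML but trim commands are not actually issued to the device on this configuration.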
Jan 31 02:58:53 np0005603623 nova_compute[226235]: 2026-01-31 07:58:53.308 226239 DEBUG nova.virt.libvirt.driver [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:58:53 np0005603623 nova_compute[226235]: 2026-01-31 07:58:53.309 226239 DEBUG nova.virt.libvirt.driver [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:58:53 np0005603623 nova_compute[226235]: 2026-01-31 07:58:53.309 226239 DEBUG nova.virt.libvirt.driver [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:58:53 np0005603623 nova_compute[226235]: 2026-01-31 07:58:53.309 226239 DEBUG nova.virt.libvirt.driver [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] No VIF found with MAC fa:16:3e:97:e5:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:58:53 np0005603623 nova_compute[226235]: 2026-01-31 07:58:53.380 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:53 np0005603623 nova_compute[226235]: 2026-01-31 07:58:53.625 226239 DEBUG oslo_concurrency.lockutils [None req-d11de506-af81-4e4e-8ed8-c65ec30ad426 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:58:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:54.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:58:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:55.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:55 np0005603623 nova_compute[226235]: 2026-01-31 07:58:55.616 226239 DEBUG oslo_concurrency.lockutils [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:55 np0005603623 nova_compute[226235]: 2026-01-31 07:58:55.617 226239 DEBUG oslo_concurrency.lockutils [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:55 np0005603623 nova_compute[226235]: 2026-01-31 07:58:55.636 226239 INFO nova.compute.manager [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Detaching volume db1124e5-1a30-4118-a44b-369e05e566a9#033[00m
Jan 31 02:58:55 np0005603623 nova_compute[226235]: 2026-01-31 07:58:55.776 226239 INFO nova.virt.block_device [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Attempting to driver detach volume db1124e5-1a30-4118-a44b-369e05e566a9 from mountpoint /dev/vdb#033[00m
Jan 31 02:58:55 np0005603623 nova_compute[226235]: 2026-01-31 07:58:55.784 226239 DEBUG nova.virt.libvirt.driver [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Attempting to detach device vdb from instance ba72fe35-90dd-4806-9195-8a8ff81ae9f0 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 02:58:55 np0005603623 nova_compute[226235]: 2026-01-31 07:58:55.784 226239 DEBUG nova.virt.libvirt.guest [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 02:58:55 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-db1124e5-1a30-4118-a44b-369e05e566a9">
Jan 31 02:58:55 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:  </source>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:  <serial>db1124e5-1a30-4118-a44b-369e05e566a9</serial>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 02:58:55 np0005603623 nova_compute[226235]: </disk>
Jan 31 02:58:55 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 02:58:55 np0005603623 nova_compute[226235]: 2026-01-31 07:58:55.798 226239 INFO nova.virt.libvirt.driver [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Successfully detached device vdb from instance ba72fe35-90dd-4806-9195-8a8ff81ae9f0 from the persistent domain config.#033[00m
Jan 31 02:58:55 np0005603623 nova_compute[226235]: 2026-01-31 07:58:55.799 226239 DEBUG nova.virt.libvirt.driver [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance ba72fe35-90dd-4806-9195-8a8ff81ae9f0 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 02:58:55 np0005603623 nova_compute[226235]: 2026-01-31 07:58:55.799 226239 DEBUG nova.virt.libvirt.guest [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 02:58:55 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-db1124e5-1a30-4118-a44b-369e05e566a9">
Jan 31 02:58:55 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:  </source>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:  <serial>db1124e5-1a30-4118-a44b-369e05e566a9</serial>
Jan 31 02:58:55 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 02:58:55 np0005603623 nova_compute[226235]: </disk>
Jan 31 02:58:55 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 02:58:55 np0005603623 nova_compute[226235]: 2026-01-31 07:58:55.899 226239 DEBUG nova.virt.libvirt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Received event <DeviceRemovedEvent: 1769846335.8996453, ba72fe35-90dd-4806-9195-8a8ff81ae9f0 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 02:58:55 np0005603623 nova_compute[226235]: 2026-01-31 07:58:55.901 226239 DEBUG nova.virt.libvirt.driver [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance ba72fe35-90dd-4806-9195-8a8ff81ae9f0 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 02:58:55 np0005603623 nova_compute[226235]: 2026-01-31 07:58:55.902 226239 INFO nova.virt.libvirt.driver [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Successfully detached device vdb from instance ba72fe35-90dd-4806-9195-8a8ff81ae9f0 from the live domain config.#033[00m
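The live-domain detach above is asynchronous: Nova issues the detach, then blocks until libvirt delivers a `DeviceRemovedEvent` for the device alias (`virtio-disk1`), retrying up to the `(1/8)` attempts logged at driver.py:2523. The wait half of that pattern can be sketched with a `threading.Event` standing in for the libvirt callback machinery (simplified; the real code is in `nova/virt/libvirt/driver.py`):

```python
# Simplified "detach and wait for the device-removed event" pattern.
# A threading.Event per device alias substitutes for Nova's libvirt
# lifecycle-event plumbing.
import threading

class DetachWaiter:
    def __init__(self):
        self._events = {}                     # device alias -> Event

    def expect(self, alias):
        self._events[alias] = threading.Event()

    def on_device_removed(self, alias):
        # Invoked from the (simulated) libvirt event-loop thread.
        if alias in self._events:
            self._events[alias].set()

    def wait(self, alias, timeout=20.0):
        ok = self._events[alias].wait(timeout)
        del self._events[alias]
        return ok                             # False -> retry the detach

waiter = DetachWaiter()
waiter.expect("virtio-disk1")
# Simulate libvirt delivering DeviceRemovedEvent shortly after the call:
threading.Timer(0.05, waiter.on_device_removed,
                args=("virtio-disk1",)).start()
detached = waiter.wait("virtio-disk1")
```

In the log the event actually arrives (07:58:55.899) a couple of milliseconds before the waiter starts waiting (07:58:55.901); the real implementation handles that race by recording events it is "waiting for" before dispatch, which is why the event at driver.py:2370 is logged as "while the driver is waiting for it; dispatched."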
Jan 31 02:58:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:56 np0005603623 nova_compute[226235]: 2026-01-31 07:58:56.108 226239 DEBUG nova.objects.instance [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lazy-loading 'flavor' on Instance uuid ba72fe35-90dd-4806-9195-8a8ff81ae9f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:58:56 np0005603623 nova_compute[226235]: 2026-01-31 07:58:56.158 226239 DEBUG oslo_concurrency.lockutils [None req-0d367ad7-8795-4d81-9414-3592683f6a94 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:56.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:57 np0005603623 nova_compute[226235]: 2026-01-31 07:58:57.126 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:57.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:58 np0005603623 nova_compute[226235]: 2026-01-31 07:58:58.383 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:58:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:58.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:58:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:58:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:59.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:00 np0005603623 nova_compute[226235]: 2026-01-31 07:59:00.712 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:00 np0005603623 nova_compute[226235]: 2026-01-31 07:59:00.713 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:59:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:00.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:59:00 np0005603623 nova_compute[226235]: 2026-01-31 07:59:00.785 226239 DEBUG nova.compute.manager [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:59:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:01 np0005603623 nova_compute[226235]: 2026-01-31 07:59:01.119 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:01 np0005603623 nova_compute[226235]: 2026-01-31 07:59:01.119 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:01 np0005603623 nova_compute[226235]: 2026-01-31 07:59:01.128 226239 DEBUG nova.virt.hardware [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:59:01 np0005603623 nova_compute[226235]: 2026-01-31 07:59:01.129 226239 INFO nova.compute.claims [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:59:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:01.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:01 np0005603623 nova_compute[226235]: 2026-01-31 07:59:01.345 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:59:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1874427172' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:59:01 np0005603623 nova_compute[226235]: 2026-01-31 07:59:01.951 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.606s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:01 np0005603623 nova_compute[226235]: 2026-01-31 07:59:01.963 226239 DEBUG nova.compute.provider_tree [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:59:01 np0005603623 nova_compute[226235]: 2026-01-31 07:59:01.985 226239 DEBUG nova.scheduler.client.report [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
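The `ceph df --format=json` subprocess at 07:59:01 feeds the `DISK_GB` figure in the inventory above (total 20, reserved 1): with RBD-backed ephemeral storage, the compute node reports the Ceph cluster's capacity rather than local disk. A hedged sketch of the parsing step — the sample JSON below is fabricated for illustration and real `ceph df` output carries per-pool stats and many more fields:

```python
# Illustrative: derive DISK_GB totals from `ceph df --format=json`
# output. The sample document is made up; real output includes a
# "pools" array and additional cluster-wide counters.
import json

sample = json.dumps({
    "stats": {
        "total_bytes": 21474836480,        # 20 GiB cluster capacity
        "total_avail_bytes": 20401094656,  # 19 GiB still available
    }
})

def disk_gb_from_ceph_df(doc):
    stats = json.loads(doc)["stats"]
    gib = 1024 ** 3
    return (stats["total_bytes"] // gib,
            stats["total_avail_bytes"] // gib)

total_gb, avail_gb = disk_gb_from_ceph_df(sample)
```

The placement report then layers `reserved` and the 0.9 `allocation_ratio` from the inventory dict on top of the raw total when deciding whether the claim at 07:59:01.129 fits.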
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.047 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.049 226239 DEBUG nova.compute.manager [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.128 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.157 226239 DEBUG nova.compute.manager [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.159 226239 DEBUG nova.network.neutron [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.221 226239 INFO nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.283 226239 DEBUG nova.compute.manager [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.444 226239 DEBUG nova.compute.manager [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.445 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.446 226239 INFO nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Creating image(s)#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.506 226239 DEBUG nova.storage.rbd_utils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.541 226239 DEBUG nova.storage.rbd_utils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.578 226239 DEBUG nova.storage.rbd_utils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.584 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.648 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.649 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.650 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.650 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.684 226239 DEBUG nova.storage.rbd_utils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:59:02 np0005603623 nova_compute[226235]: 2026-01-31 07:59:02.689 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:59:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:02.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:59:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:03.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:03 np0005603623 nova_compute[226235]: 2026-01-31 07:59:03.196 226239 DEBUG nova.policy [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b034a039074641a7b7c872e8b715ca4c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '273fe485cc184dd8bf86440d8d1e05f3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:59:03 np0005603623 nova_compute[226235]: 2026-01-31 07:59:03.205 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:03 np0005603623 nova_compute[226235]: 2026-01-31 07:59:03.279 226239 DEBUG nova.storage.rbd_utils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] resizing rbd image 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:59:03 np0005603623 nova_compute[226235]: 2026-01-31 07:59:03.385 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:03 np0005603623 nova_compute[226235]: 2026-01-31 07:59:03.547 226239 DEBUG nova.objects.instance [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:03 np0005603623 nova_compute[226235]: 2026-01-31 07:59:03.578 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:59:03 np0005603623 nova_compute[226235]: 2026-01-31 07:59:03.579 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Ensure instance console log exists: /var/lib/nova/instances/34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:59:03 np0005603623 nova_compute[226235]: 2026-01-31 07:59:03.579 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:03 np0005603623 nova_compute[226235]: 2026-01-31 07:59:03.579 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:03 np0005603623 nova_compute[226235]: 2026-01-31 07:59:03.580 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:04.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:04 np0005603623 podman[248275]: 2026-01-31 07:59:04.957145789 +0000 UTC m=+0.046890942 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 31 02:59:05 np0005603623 podman[248276]: 2026-01-31 07:59:05.001515651 +0000 UTC m=+0.091260254 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller)
Jan 31 02:59:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:05.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:06.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:07 np0005603623 nova_compute[226235]: 2026-01-31 07:59:07.131 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:07.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:07 np0005603623 nova_compute[226235]: 2026-01-31 07:59:07.201 226239 DEBUG nova.network.neutron [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Successfully created port: ab16b512-c763-4a6a-9d29-6911ff7bb9bc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:59:08 np0005603623 nova_compute[226235]: 2026-01-31 07:59:08.388 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:08.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:09.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e161 e161: 3 total, 3 up, 3 in
Jan 31 02:59:10 np0005603623 nova_compute[226235]: 2026-01-31 07:59:10.243 226239 DEBUG nova.network.neutron [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Successfully updated port: ab16b512-c763-4a6a-9d29-6911ff7bb9bc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:59:10 np0005603623 nova_compute[226235]: 2026-01-31 07:59:10.387 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "refresh_cache-34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:59:10 np0005603623 nova_compute[226235]: 2026-01-31 07:59:10.388 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquired lock "refresh_cache-34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:59:10 np0005603623 nova_compute[226235]: 2026-01-31 07:59:10.388 226239 DEBUG nova.network.neutron [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:59:10 np0005603623 nova_compute[226235]: 2026-01-31 07:59:10.593 226239 DEBUG nova.compute.manager [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Received event network-changed-ab16b512-c763-4a6a-9d29-6911ff7bb9bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:10 np0005603623 nova_compute[226235]: 2026-01-31 07:59:10.593 226239 DEBUG nova.compute.manager [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Refreshing instance network info cache due to event network-changed-ab16b512-c763-4a6a-9d29-6911ff7bb9bc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:59:10 np0005603623 nova_compute[226235]: 2026-01-31 07:59:10.593 226239 DEBUG oslo_concurrency.lockutils [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:59:10 np0005603623 nova_compute[226235]: 2026-01-31 07:59:10.625 226239 DEBUG nova.network.neutron [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:59:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e162 e162: 3 total, 3 up, 3 in
Jan 31 02:59:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:10.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:11.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e163 e163: 3 total, 3 up, 3 in
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.132 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.250 226239 DEBUG nova.network.neutron [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Updating instance_info_cache with network_info: [{"id": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "address": "fa:16:3e:43:a3:d9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab16b512-c7", "ovs_interfaceid": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.399 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Releasing lock "refresh_cache-34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.399 226239 DEBUG nova.compute.manager [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Instance network_info: |[{"id": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "address": "fa:16:3e:43:a3:d9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab16b512-c7", "ovs_interfaceid": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.400 226239 DEBUG oslo_concurrency.lockutils [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.400 226239 DEBUG nova.network.neutron [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Refreshing network info cache for port ab16b512-c763-4a6a-9d29-6911ff7bb9bc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.403 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Start _get_guest_xml network_info=[{"id": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "address": "fa:16:3e:43:a3:d9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab16b512-c7", "ovs_interfaceid": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.406 226239 WARNING nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.417 226239 DEBUG nova.virt.libvirt.host [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.417 226239 DEBUG nova.virt.libvirt.host [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.422 226239 DEBUG nova.virt.libvirt.host [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.423 226239 DEBUG nova.virt.libvirt.host [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.424 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.424 226239 DEBUG nova.virt.hardware [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.424 226239 DEBUG nova.virt.hardware [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.425 226239 DEBUG nova.virt.hardware [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.425 226239 DEBUG nova.virt.hardware [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.425 226239 DEBUG nova.virt.hardware [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.425 226239 DEBUG nova.virt.hardware [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.425 226239 DEBUG nova.virt.hardware [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.426 226239 DEBUG nova.virt.hardware [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.426 226239 DEBUG nova.virt.hardware [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.426 226239 DEBUG nova.virt.hardware [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.426 226239 DEBUG nova.virt.hardware [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.429 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:12.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:59:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1100007511' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.944 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.976 226239 DEBUG nova.storage.rbd_utils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:59:12 np0005603623 nova_compute[226235]: 2026-01-31 07:59:12.983 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:59:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:13.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.338 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.338 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.392 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:59:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2544111389' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.495 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.497 226239 DEBUG nova.virt.libvirt.vif [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:58:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1580535877',display_name='tempest-VolumesAdminNegativeTest-server-1580535877',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-volumesadminnegativetest-server-1580535877',id=47,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='273fe485cc184dd8bf86440d8d1e05f3',ramdisk_id='',reservation_id='r-bnd59map',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1346674431',owner_user_name='tempest-VolumesAdminNegativeTest-1346674431-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:59:02Z,user_data=None,user_id='b034a039074641a7b7c872e8b715ca4c',uuid=34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "address": "fa:16:3e:43:a3:d9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab16b512-c7", "ovs_interfaceid": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.497 226239 DEBUG nova.network.os_vif_util [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Converting VIF {"id": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "address": "fa:16:3e:43:a3:d9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab16b512-c7", "ovs_interfaceid": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.498 226239 DEBUG nova.network.os_vif_util [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:a3:d9,bridge_name='br-int',has_traffic_filtering=True,id=ab16b512-c763-4a6a-9d29-6911ff7bb9bc,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab16b512-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.499 226239 DEBUG nova.objects.instance [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.671 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  <uuid>34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a</uuid>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  <name>instance-0000002f</name>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <nova:name>tempest-VolumesAdminNegativeTest-server-1580535877</nova:name>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:59:12</nova:creationTime>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <nova:user uuid="b034a039074641a7b7c872e8b715ca4c">tempest-VolumesAdminNegativeTest-1346674431-project-member</nova:user>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <nova:project uuid="273fe485cc184dd8bf86440d8d1e05f3">tempest-VolumesAdminNegativeTest-1346674431</nova:project>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <nova:port uuid="ab16b512-c763-4a6a-9d29-6911ff7bb9bc">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <entry name="serial">34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a</entry>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <entry name="uuid">34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a</entry>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk.config">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:43:a3:d9"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <target dev="tapab16b512-c7"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a/console.log" append="off"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:59:13 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:59:13 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:59:13 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:59:13 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.672 226239 DEBUG nova.compute.manager [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Preparing to wait for external event network-vif-plugged-ab16b512-c763-4a6a-9d29-6911ff7bb9bc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.672 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.672 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.673 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.674 226239 DEBUG nova.virt.libvirt.vif [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:58:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1580535877',display_name='tempest-VolumesAdminNegativeTest-server-1580535877',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-volumesadminnegativetest-server-1580535877',id=47,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='273fe485cc184dd8bf86440d8d1e05f3',ramdisk_id='',reservation_id='r-bnd59map',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1346674431',owner_user_name='tempest-VolumesAdminNegativeTest-1346674431-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:59:02Z,user_data=None,user_id='b034a039074641a7b7c872e8b715ca4c',uuid=34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "address": "fa:16:3e:43:a3:d9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab16b512-c7", "ovs_interfaceid": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.674 226239 DEBUG nova.network.os_vif_util [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Converting VIF {"id": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "address": "fa:16:3e:43:a3:d9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab16b512-c7", "ovs_interfaceid": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.675 226239 DEBUG nova.network.os_vif_util [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:a3:d9,bridge_name='br-int',has_traffic_filtering=True,id=ab16b512-c763-4a6a-9d29-6911ff7bb9bc,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab16b512-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.675 226239 DEBUG os_vif [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:a3:d9,bridge_name='br-int',has_traffic_filtering=True,id=ab16b512-c763-4a6a-9d29-6911ff7bb9bc,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab16b512-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.676 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.676 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.677 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.681 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.681 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab16b512-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.682 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapab16b512-c7, col_values=(('external_ids', {'iface-id': 'ab16b512-c763-4a6a-9d29-6911ff7bb9bc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:43:a3:d9', 'vm-uuid': '34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.683 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:13 np0005603623 NetworkManager[48970]: <info>  [1769846353.6846] manager: (tapab16b512-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.685 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.689 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.691 226239 INFO os_vif [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:a3:d9,bridge_name='br-int',has_traffic_filtering=True,id=ab16b512-c763-4a6a-9d29-6911ff7bb9bc,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab16b512-c7')#033[00m
Jan 31 02:59:13 np0005603623 nova_compute[226235]: 2026-01-31 07:59:13.930 226239 DEBUG nova.compute.manager [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.295 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.296 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.304 226239 DEBUG nova.virt.hardware [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.304 226239 INFO nova.compute.claims [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.312 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.312 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.312 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] No VIF found with MAC fa:16:3e:43:a3:d9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.313 226239 INFO nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Using config drive#033[00m
Jan 31 02:59:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:59:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/734104115' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:59:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:59:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/734104115' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.336 226239 DEBUG nova.storage.rbd_utils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.519 226239 DEBUG nova.network.neutron [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Updated VIF entry in instance network info cache for port ab16b512-c763-4a6a-9d29-6911ff7bb9bc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.520 226239 DEBUG nova.network.neutron [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Updating instance_info_cache with network_info: [{"id": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "address": "fa:16:3e:43:a3:d9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab16b512-c7", "ovs_interfaceid": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.609 226239 DEBUG oslo_concurrency.lockutils [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:59:14 np0005603623 nova_compute[226235]: 2026-01-31 07:59:14.687 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:14.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:59:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1332816857' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:59:15 np0005603623 nova_compute[226235]: 2026-01-31 07:59:15.106 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:15 np0005603623 nova_compute[226235]: 2026-01-31 07:59:15.111 226239 DEBUG nova.compute.provider_tree [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:59:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:15.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:15 np0005603623 nova_compute[226235]: 2026-01-31 07:59:15.504 226239 DEBUG nova.scheduler.client.report [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:59:15 np0005603623 nova_compute[226235]: 2026-01-31 07:59:15.926 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:15 np0005603623 nova_compute[226235]: 2026-01-31 07:59:15.926 226239 DEBUG nova.compute.manager [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:59:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:16 np0005603623 nova_compute[226235]: 2026-01-31 07:59:16.260 226239 DEBUG nova.compute.manager [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:59:16 np0005603623 nova_compute[226235]: 2026-01-31 07:59:16.261 226239 DEBUG nova.network.neutron [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:59:16 np0005603623 nova_compute[226235]: 2026-01-31 07:59:16.413 226239 INFO nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:59:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e164 e164: 3 total, 3 up, 3 in
Jan 31 02:59:16 np0005603623 nova_compute[226235]: 2026-01-31 07:59:16.787 226239 DEBUG nova.compute.manager [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:59:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:16.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.054 226239 INFO nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Creating config drive at /var/lib/nova/instances/34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a/disk.config#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.058 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpv1_p6mb7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.134 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.147 226239 DEBUG nova.policy [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0e402088c09448e1a6f0cd61b11e0816', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f31b0319126848a5b8fd9521dc509172', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:59:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:17.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.181 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpv1_p6mb7" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.210 226239 DEBUG nova.storage.rbd_utils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] rbd image 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.214 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a/disk.config 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.376 226239 DEBUG nova.compute.manager [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.378 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.378 226239 INFO nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Creating image(s)#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.407 226239 DEBUG nova.storage.rbd_utils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.440 226239 DEBUG nova.storage.rbd_utils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.475 226239 DEBUG nova.storage.rbd_utils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.481 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.500 226239 DEBUG oslo_concurrency.processutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a/disk.config 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.286s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.501 226239 INFO nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Deleting local config drive /var/lib/nova/instances/34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a/disk.config because it was imported into RBD.#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.532 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.534 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.535 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.536 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:17 np0005603623 kernel: tapab16b512-c7: entered promiscuous mode
Jan 31 02:59:17 np0005603623 NetworkManager[48970]: <info>  [1769846357.5418] manager: (tapab16b512-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Jan 31 02:59:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:17Z|00179|binding|INFO|Claiming lport ab16b512-c763-4a6a-9d29-6911ff7bb9bc for this chassis.
Jan 31 02:59:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:17Z|00180|binding|INFO|ab16b512-c763-4a6a-9d29-6911ff7bb9bc: Claiming fa:16:3e:43:a3:d9 10.100.0.10
Jan 31 02:59:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:17Z|00181|binding|INFO|Setting lport ab16b512-c763-4a6a-9d29-6911ff7bb9bc ovn-installed in OVS
Jan 31 02:59:17 np0005603623 systemd-udevd[248604]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:59:17 np0005603623 systemd-machined[194379]: New machine qemu-25-instance-0000002f.
Jan 31 02:59:17 np0005603623 NetworkManager[48970]: <info>  [1769846357.5787] device (tapab16b512-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:59:17 np0005603623 NetworkManager[48970]: <info>  [1769846357.5794] device (tapab16b512-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:59:17 np0005603623 systemd[1]: Started Virtual Machine qemu-25-instance-0000002f.
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.582 226239 DEBUG nova.storage.rbd_utils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.586 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.600 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:17 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:17Z|00182|binding|INFO|Setting lport ab16b512-c763-4a6a-9d29-6911ff7bb9bc up in Southbound
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.637 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:a3:d9 10.100.0.10'], port_security=['fa:16:3e:43:a3:d9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '273fe485cc184dd8bf86440d8d1e05f3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3843f467-3db3-41b2-9b0b-057a8193c058', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c7c7cc1-6459-40e3-9ba1-aa227a70d0ae, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ab16b512-c763-4a6a-9d29-6911ff7bb9bc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.638 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ab16b512-c763-4a6a-9d29-6911ff7bb9bc in datapath 81f9575e-f3c4-4956-9a5c-9cb1776ab08e bound to our chassis#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.640 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81f9575e-f3c4-4956-9a5c-9cb1776ab08e#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.651 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[47b77073-0e26-4512-9ae4-d1d4843c7f85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.672 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[5052f05a-b9c5-47c3-b8cd-f3d518415dbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.675 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[993ad6ea-6832-4ea2-92b8-41b5f2ce23ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.693 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f3e899cf-8579-4a6f-9a8a-29e994d72603]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.706 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[61a2edf7-260d-4b26-bc57-ba3f353dffa3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81f9575e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:70:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544978, 'reachable_time': 40615, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248639, 'error': None, 'target': 'ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.719 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e4d982-d339-4148-88fa-1d7c22a5eca7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap81f9575e-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 544986, 'tstamp': 544986}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248642, 'error': None, 'target': 'ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap81f9575e-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 544988, 'tstamp': 544988}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248642, 'error': None, 'target': 'ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.721 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81f9575e-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.722 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.723 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.725 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81f9575e-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.726 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.726 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81f9575e-f0, col_values=(('external_ids', {'iface-id': 'fdf58350-21cf-49b6-9dfa-eef283a74bd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:17.726 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:59:17 np0005603623 nova_compute[226235]: 2026-01-31 07:59:17.938 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.353s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.012 226239 DEBUG nova.storage.rbd_utils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] resizing rbd image add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.051 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846358.0228016, 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.052 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] VM Started (Lifecycle Event)#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.119 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.123 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846358.0229018, 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.124 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.204 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.206 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.213 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.214 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.214 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.214 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.214 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.267 226239 DEBUG nova.objects.instance [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lazy-loading 'migration_context' on Instance uuid add62d62-47fb-454d-aec1-d5bb6be9f1e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.283 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.333 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.334 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Ensure instance console log exists: /var/lib/nova/instances/add62d62-47fb-454d-aec1-d5bb6be9f1e6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.334 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.334 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.335 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:59:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3542624482' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.640 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.684 226239 DEBUG nova.network.neutron [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Successfully created port: 392355e0-133c-4f74-b0f0-dc74eb8c8416 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.686 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.689 226239 DEBUG nova.compute.manager [req-b9e0227e-bd6c-4041-bb90-056dd93779d9 req-9da13130-72af-4dfb-88ed-68a00b510831 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Received event network-vif-plugged-ab16b512-c763-4a6a-9d29-6911ff7bb9bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.690 226239 DEBUG oslo_concurrency.lockutils [req-b9e0227e-bd6c-4041-bb90-056dd93779d9 req-9da13130-72af-4dfb-88ed-68a00b510831 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.690 226239 DEBUG oslo_concurrency.lockutils [req-b9e0227e-bd6c-4041-bb90-056dd93779d9 req-9da13130-72af-4dfb-88ed-68a00b510831 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.690 226239 DEBUG oslo_concurrency.lockutils [req-b9e0227e-bd6c-4041-bb90-056dd93779d9 req-9da13130-72af-4dfb-88ed-68a00b510831 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.690 226239 DEBUG nova.compute.manager [req-b9e0227e-bd6c-4041-bb90-056dd93779d9 req-9da13130-72af-4dfb-88ed-68a00b510831 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Processing event network-vif-plugged-ab16b512-c763-4a6a-9d29-6911ff7bb9bc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.691 226239 DEBUG nova.compute.manager [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.694 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.694 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846358.6941698, 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.695 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.698 226239 INFO nova.virt.libvirt.driver [-] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Instance spawned successfully.#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.699 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.769 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.775 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.779 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.780 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.780 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.780 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.781 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.781 226239 DEBUG nova.virt.libvirt.driver [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:59:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:18.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.889 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.893 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000002f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.893 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000002f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.897 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000002a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:59:18 np0005603623 nova_compute[226235]: 2026-01-31 07:59:18.898 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000002a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.037 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.038 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4484MB free_disk=20.773712158203125GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.038 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.038 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.058 226239 INFO nova.compute.manager [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Took 16.61 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.058 226239 DEBUG nova.compute.manager [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:19.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.427 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance ba72fe35-90dd-4806-9195-8a8ff81ae9f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.427 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.427 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance add62d62-47fb-454d-aec1-d5bb6be9f1e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.427 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.428 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.504 226239 INFO nova.compute.manager [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Took 18.44 seconds to build instance.#033[00m
Jan 31 02:59:19 np0005603623 nova_compute[226235]: 2026-01-31 07:59:19.515 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:59:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/178026402' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:59:20 np0005603623 nova_compute[226235]: 2026-01-31 07:59:20.039 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:20 np0005603623 nova_compute[226235]: 2026-01-31 07:59:20.045 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:59:20 np0005603623 nova_compute[226235]: 2026-01-31 07:59:20.175 226239 DEBUG oslo_concurrency.lockutils [None req-5afce8db-9043-4e80-b0e3-cfff5b84912b b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:20 np0005603623 nova_compute[226235]: 2026-01-31 07:59:20.231 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:59:20 np0005603623 nova_compute[226235]: 2026-01-31 07:59:20.334 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:59:20 np0005603623 nova_compute[226235]: 2026-01-31 07:59:20.335 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:20 np0005603623 nova_compute[226235]: 2026-01-31 07:59:20.472 226239 DEBUG nova.network.neutron [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Successfully updated port: 392355e0-133c-4f74-b0f0-dc74eb8c8416 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:59:20 np0005603623 nova_compute[226235]: 2026-01-31 07:59:20.544 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:59:20 np0005603623 nova_compute[226235]: 2026-01-31 07:59:20.544 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquired lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:59:20 np0005603623 nova_compute[226235]: 2026-01-31 07:59:20.544 226239 DEBUG nova.network.neutron [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:59:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:20.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.176 226239 DEBUG nova.network.neutron [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:59:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:21.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:21.320 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:59:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:21.321 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.322 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.335 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.336 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.336 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.337 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.425 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.516 226239 DEBUG nova.compute.manager [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Received event network-vif-plugged-ab16b512-c763-4a6a-9d29-6911ff7bb9bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.517 226239 DEBUG oslo_concurrency.lockutils [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.517 226239 DEBUG oslo_concurrency.lockutils [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.517 226239 DEBUG oslo_concurrency.lockutils [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.518 226239 DEBUG nova.compute.manager [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] No waiting events found dispatching network-vif-plugged-ab16b512-c763-4a6a-9d29-6911ff7bb9bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.518 226239 WARNING nova.compute.manager [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Received unexpected event network-vif-plugged-ab16b512-c763-4a6a-9d29-6911ff7bb9bc for instance with vm_state active and task_state None.#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.518 226239 DEBUG nova.compute.manager [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-changed-392355e0-133c-4f74-b0f0-dc74eb8c8416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.518 226239 DEBUG nova.compute.manager [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Refreshing instance network info cache due to event network-changed-392355e0-133c-4f74-b0f0-dc74eb8c8416. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.519 226239 DEBUG oslo_concurrency.lockutils [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.628 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-ba72fe35-90dd-4806-9195-8a8ff81ae9f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.629 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-ba72fe35-90dd-4806-9195-8a8ff81ae9f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.630 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:59:21 np0005603623 nova_compute[226235]: 2026-01-31 07:59:21.630 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ba72fe35-90dd-4806-9195-8a8ff81ae9f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.135 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.472 226239 DEBUG nova.network.neutron [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Updating instance_info_cache with network_info: [{"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.703 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Releasing lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.704 226239 DEBUG nova.compute.manager [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Instance network_info: |[{"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.704 226239 DEBUG oslo_concurrency.lockutils [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.704 226239 DEBUG nova.network.neutron [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Refreshing network info cache for port 392355e0-133c-4f74-b0f0-dc74eb8c8416 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.706 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Start _get_guest_xml network_info=[{"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.712 226239 WARNING nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.715 226239 DEBUG nova.virt.libvirt.host [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.716 226239 DEBUG nova.virt.libvirt.host [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.718 226239 DEBUG nova.virt.libvirt.host [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.719 226239 DEBUG nova.virt.libvirt.host [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.720 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.720 226239 DEBUG nova.virt.hardware [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.720 226239 DEBUG nova.virt.hardware [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.720 226239 DEBUG nova.virt.hardware [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.721 226239 DEBUG nova.virt.hardware [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.721 226239 DEBUG nova.virt.hardware [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.721 226239 DEBUG nova.virt.hardware [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.721 226239 DEBUG nova.virt.hardware [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.721 226239 DEBUG nova.virt.hardware [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.722 226239 DEBUG nova.virt.hardware [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.722 226239 DEBUG nova.virt.hardware [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.722 226239 DEBUG nova.virt.hardware [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:59:22 np0005603623 nova_compute[226235]: 2026-01-31 07:59:22.725 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:22.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:59:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2255188733' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:59:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:23.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.187 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.216 226239 DEBUG nova.storage.rbd_utils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.221 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.529 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Updating instance_info_cache with network_info: [{"id": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "address": "fa:16:3e:97:e5:c9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdac9e6d-09", "ovs_interfaceid": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.615 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-ba72fe35-90dd-4806-9195-8a8ff81ae9f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.616 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.617 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.617 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.618 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.618 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.620 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.621 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:59:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:59:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4264370898' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.687 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.689 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.690 226239 DEBUG nova.virt.libvirt.vif [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:59:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1136604277',display_name='tempest-SecurityGroupsTestJSON-server-1136604277',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1136604277',id=48,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f31b0319126848a5b8fd9521dc509172',ramdisk_id='',reservation_id='r-n0w7f0zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-648078268',owner_user_name='tempest-SecurityGroupsTestJSON-648078268-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:59:16Z,user_data=None,user_id='0e402088c09448e1a6f0cd61b11e0816',uuid=add62d62-47fb-454d-aec1-d5bb6be9f1e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.691 226239 DEBUG nova.network.os_vif_util [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converting VIF {"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.692 226239 DEBUG nova.network.os_vif_util [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.693 226239 DEBUG nova.objects.instance [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lazy-loading 'pci_devices' on Instance uuid add62d62-47fb-454d-aec1-d5bb6be9f1e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.723 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  <uuid>add62d62-47fb-454d-aec1-d5bb6be9f1e6</uuid>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  <name>instance-00000030</name>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <nova:name>tempest-SecurityGroupsTestJSON-server-1136604277</nova:name>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:59:22</nova:creationTime>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <nova:user uuid="0e402088c09448e1a6f0cd61b11e0816">tempest-SecurityGroupsTestJSON-648078268-project-member</nova:user>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <nova:project uuid="f31b0319126848a5b8fd9521dc509172">tempest-SecurityGroupsTestJSON-648078268</nova:project>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <nova:port uuid="392355e0-133c-4f74-b0f0-dc74eb8c8416">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <entry name="serial">add62d62-47fb-454d-aec1-d5bb6be9f1e6</entry>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <entry name="uuid">add62d62-47fb-454d-aec1-d5bb6be9f1e6</entry>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk.config">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:98:a5:39"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <target dev="tap392355e0-13"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/add62d62-47fb-454d-aec1-d5bb6be9f1e6/console.log" append="off"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:59:23 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:59:23 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:59:23 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:59:23 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.729 226239 DEBUG nova.compute.manager [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Preparing to wait for external event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.730 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.730 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.730 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.731 226239 DEBUG nova.virt.libvirt.vif [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:59:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1136604277',display_name='tempest-SecurityGroupsTestJSON-server-1136604277',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1136604277',id=48,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f31b0319126848a5b8fd9521dc509172',ramdisk_id='',reservation_id='r-n0w7f0zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-648078268',owner_user_name='tempest-SecurityGroupsTestJSON-648078268-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:59:16Z,user_data=None,user_id='0e402088c09448e1a6f0cd61b11e0816',uuid=add62d62-47fb-454d-aec1-d5bb6be9f1e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.732 226239 DEBUG nova.network.os_vif_util [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converting VIF {"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.733 226239 DEBUG nova.network.os_vif_util [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.733 226239 DEBUG os_vif [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.734 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.735 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.735 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.740 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.741 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap392355e0-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.742 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap392355e0-13, col_values=(('external_ids', {'iface-id': '392355e0-133c-4f74-b0f0-dc74eb8c8416', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:a5:39', 'vm-uuid': 'add62d62-47fb-454d-aec1-d5bb6be9f1e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.743 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:59:23 np0005603623 NetworkManager[48970]: <info>  [1769846363.7445] manager: (tap392355e0-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.746 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.750 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:59:23 np0005603623 nova_compute[226235]: 2026-01-31 07:59:23.751 226239 INFO os_vif [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13')
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.165 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.165 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.166 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] No VIF found with MAC fa:16:3e:98:a5:39, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.166 226239 INFO nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Using config drive
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.193 226239 DEBUG nova.storage.rbd_utils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.694 226239 DEBUG nova.network.neutron [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Updated VIF entry in instance network info cache for port 392355e0-133c-4f74-b0f0-dc74eb8c8416. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.694 226239 DEBUG nova.network.neutron [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Updating instance_info_cache with network_info: [{"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.752 226239 INFO nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Creating config drive at /var/lib/nova/instances/add62d62-47fb-454d-aec1-d5bb6be9f1e6/disk.config
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.756 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/add62d62-47fb-454d-aec1-d5bb6be9f1e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpkrstv71d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:59:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:24.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.818 226239 DEBUG oslo_concurrency.lockutils [req-336d43aa-5849-4781-b646-c360fc398449 req-cbe4121f-216b-4e41-a470-51b6cdb235b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.880 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/add62d62-47fb-454d-aec1-d5bb6be9f1e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpkrstv71d" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.907 226239 DEBUG nova.storage.rbd_utils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:59:24 np0005603623 nova_compute[226235]: 2026-01-31 07:59:24.911 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/add62d62-47fb-454d-aec1-d5bb6be9f1e6/disk.config add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:59:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:25.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.222 226239 DEBUG oslo_concurrency.processutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/add62d62-47fb-454d-aec1-d5bb6be9f1e6/disk.config add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.311s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.223 226239 INFO nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Deleting local config drive /var/lib/nova/instances/add62d62-47fb-454d-aec1-d5bb6be9f1e6/disk.config because it was imported into RBD.
Jan 31 02:59:25 np0005603623 kernel: tap392355e0-13: entered promiscuous mode
Jan 31 02:59:25 np0005603623 NetworkManager[48970]: <info>  [1769846365.2618] manager: (tap392355e0-13): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Jan 31 02:59:25 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:25Z|00183|binding|INFO|Claiming lport 392355e0-133c-4f74-b0f0-dc74eb8c8416 for this chassis.
Jan 31 02:59:25 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:25Z|00184|binding|INFO|392355e0-133c-4f74-b0f0-dc74eb8c8416: Claiming fa:16:3e:98:a5:39 10.100.0.7
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.262 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:59:25 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:25Z|00185|binding|INFO|Setting lport 392355e0-133c-4f74-b0f0-dc74eb8c8416 ovn-installed in OVS
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.269 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.272 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:59:25 np0005603623 systemd-udevd[248940]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:59:25 np0005603623 systemd-machined[194379]: New machine qemu-26-instance-00000030.
Jan 31 02:59:25 np0005603623 NetworkManager[48970]: <info>  [1769846365.3028] device (tap392355e0-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:59:25 np0005603623 NetworkManager[48970]: <info>  [1769846365.3039] device (tap392355e0-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:59:25 np0005603623 systemd[1]: Started Virtual Machine qemu-26-instance-00000030.
Jan 31 02:59:25 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:25Z|00186|binding|INFO|Setting lport 392355e0-133c-4f74-b0f0-dc74eb8c8416 up in Southbound
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.314 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:a5:39 10.100.0.7'], port_security=['fa:16:3e:98:a5:39 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'add62d62-47fb-454d-aec1-d5bb6be9f1e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f31b0319126848a5b8fd9521dc509172', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2814a167-00ed-4304-830b-a99e04552970', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29a935fc-1163-43c6-97c6-acf0f9c4194f, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=392355e0-133c-4f74-b0f0-dc74eb8c8416) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.315 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 392355e0-133c-4f74-b0f0-dc74eb8c8416 in datapath 92b7a3d2-99de-4036-b28b-98f77dab6a25 bound to our chassis
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.318 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 92b7a3d2-99de-4036-b28b-98f77dab6a25
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.329 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f3409356-2f63-433d-8c20-48f8e28afd6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.330 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap92b7a3d2-91 in ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.331 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap92b7a3d2-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.331 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[555185c5-09df-49ca-86bb-1385e26353ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.333 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0da79336-1a5e-4b35-ac37-ec7b309ca66c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.348 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[79fabed4-a725-41ee-acf5-5afa9748e535]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.370 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[425451d9-94ab-4cf8-aca3-49992f9ba61c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.393 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[074055df-4bec-45e0-96ec-e5380ddf25ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:59:25 np0005603623 NetworkManager[48970]: <info>  [1769846365.4019] manager: (tap92b7a3d2-90): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Jan 31 02:59:25 np0005603623 systemd-udevd[248943]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.402 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f46e057c-a1b4-48ec-959b-7ee29206ce5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.430 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[59bd4e23-2964-4bef-811e-9e7eedc7b323]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.434 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[5d9551c3-d604-4dfa-afb8-e28b58699f5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:59:25 np0005603623 NetworkManager[48970]: <info>  [1769846365.4552] device (tap92b7a3d2-90): carrier: link connected
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.459 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[85bc244b-4fe0-490a-ace8-0550cc5c1085]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.474 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[59db659c-f028-4bbc-b679-97d727fad67a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92b7a3d2-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:d6:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550588, 'reachable_time': 22419, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248974, 'error': None, 'target': 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.488 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f92c2a43-36a5-47e3-adef-0c2b3f428515]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:d681'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550588, 'tstamp': 550588}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248975, 'error': None, 'target': 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.501 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[39805df6-4e71-4b12-b841-fed567f5d0fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92b7a3d2-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:d6:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550588, 'reachable_time': 22419, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248976, 'error': None, 'target': 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.523 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c7408b43-102b-41c2-84aa-02b1d3800974]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.564 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[74898df5-81e0-4b01-be21-709042abfb92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.566 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92b7a3d2-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.567 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.567 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap92b7a3d2-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.569 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:59:25 np0005603623 NetworkManager[48970]: <info>  [1769846365.5703] manager: (tap92b7a3d2-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Jan 31 02:59:25 np0005603623 kernel: tap92b7a3d2-90: entered promiscuous mode
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.572 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.574 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap92b7a3d2-90, col_values=(('external_ids', {'iface-id': 'b33af60b-01fb-4204-b1d7-f9b1d79e127d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.576 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:25 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:25Z|00187|binding|INFO|Releasing lport b33af60b-01fb-4204-b1d7-f9b1d79e127d from this chassis (sb_readonly=0)
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.577 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.580 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/92b7a3d2-99de-4036-b28b-98f77dab6a25.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/92b7a3d2-99de-4036-b28b-98f77dab6a25.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.581 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c310ed54-af45-49c1-a06e-0b8cf5597772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.582 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-92b7a3d2-99de-4036-b28b-98f77dab6a25
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/92b7a3d2-99de-4036-b28b-98f77dab6a25.pid.haproxy
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 92b7a3d2-99de-4036-b28b-98f77dab6a25
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:59:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:25.583 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'env', 'PROCESS_TAG=haproxy-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/92b7a3d2-99de-4036-b28b-98f77dab6a25.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.584 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:59:25 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1576973755' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:59:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:59:25 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1576973755' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.786 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846365.7851832, add62d62-47fb-454d-aec1-d5bb6be9f1e6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.786 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] VM Started (Lifecycle Event)#033[00m
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.915 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.920 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846365.7854772, add62d62-47fb-454d-aec1-d5bb6be9f1e6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:59:25 np0005603623 nova_compute[226235]: 2026-01-31 07:59:25.920 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:59:25 np0005603623 podman[249051]: 2026-01-31 07:59:25.98066438 +0000 UTC m=+0.054583103 container create ca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 02:59:26 np0005603623 systemd[1]: Started libpod-conmon-ca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679.scope.
Jan 31 02:59:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:26 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:59:26 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a121f07ac7d2976ecc560bea0ff697a838423cd4a9151b22763e8c4dece8f6c3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.040 226239 DEBUG nova.compute.manager [req-a4526f78-d539-445c-8962-ea94915f0dc7 req-0d8617b0-c2ca-4a40-9923-51166dd7bfab fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.041 226239 DEBUG oslo_concurrency.lockutils [req-a4526f78-d539-445c-8962-ea94915f0dc7 req-0d8617b0-c2ca-4a40-9923-51166dd7bfab fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.042 226239 DEBUG oslo_concurrency.lockutils [req-a4526f78-d539-445c-8962-ea94915f0dc7 req-0d8617b0-c2ca-4a40-9923-51166dd7bfab fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.042 226239 DEBUG oslo_concurrency.lockutils [req-a4526f78-d539-445c-8962-ea94915f0dc7 req-0d8617b0-c2ca-4a40-9923-51166dd7bfab fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.043 226239 DEBUG nova.compute.manager [req-a4526f78-d539-445c-8962-ea94915f0dc7 req-0d8617b0-c2ca-4a40-9923-51166dd7bfab fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Processing event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:59:26 np0005603623 podman[249051]: 2026-01-31 07:59:25.947309784 +0000 UTC m=+0.021228537 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.044 226239 DEBUG nova.compute.manager [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.048 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.051 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.053 226239 INFO nova.virt.libvirt.driver [-] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Instance spawned successfully.#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.054 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.057 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846366.0485399, add62d62-47fb-454d-aec1-d5bb6be9f1e6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.057 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:59:26 np0005603623 podman[249051]: 2026-01-31 07:59:26.072809951 +0000 UTC m=+0.146728694 container init ca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:59:26 np0005603623 podman[249051]: 2026-01-31 07:59:26.076915001 +0000 UTC m=+0.150833724 container start ca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 02:59:26 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[249067]: [NOTICE]   (249071) : New worker (249073) forked
Jan 31 02:59:26 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[249067]: [NOTICE]   (249071) : Loading success.
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.172 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.177 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.180 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.181 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.181 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.182 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.182 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.182 226239 DEBUG nova.virt.libvirt.driver [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.283 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.324 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.452 226239 INFO nova.compute.manager [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Took 9.08 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.453 226239 DEBUG nova.compute.manager [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.744 226239 DEBUG oslo_concurrency.lockutils [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.745 226239 DEBUG oslo_concurrency.lockutils [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.746 226239 DEBUG oslo_concurrency.lockutils [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.746 226239 DEBUG oslo_concurrency.lockutils [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.746 226239 DEBUG oslo_concurrency.lockutils [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.747 226239 INFO nova.compute.manager [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Terminating instance#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.748 226239 DEBUG nova.compute.manager [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.780 226239 INFO nova.compute.manager [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Took 12.51 seconds to build instance.#033[00m
Jan 31 02:59:26 np0005603623 kernel: tapab16b512-c7 (unregistering): left promiscuous mode
Jan 31 02:59:26 np0005603623 NetworkManager[48970]: <info>  [1769846366.7892] device (tapab16b512-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.793 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:26 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:26Z|00188|binding|INFO|Releasing lport ab16b512-c763-4a6a-9d29-6911ff7bb9bc from this chassis (sb_readonly=0)
Jan 31 02:59:26 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:26Z|00189|binding|INFO|Setting lport ab16b512-c763-4a6a-9d29-6911ff7bb9bc down in Southbound
Jan 31 02:59:26 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:26Z|00190|binding|INFO|Removing iface tapab16b512-c7 ovn-installed in OVS
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.801 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:26.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:26 np0005603623 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Jan 31 02:59:26 np0005603623 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000002f.scope: Consumed 8.565s CPU time.
Jan 31 02:59:26 np0005603623 systemd-machined[194379]: Machine qemu-25-instance-0000002f terminated.
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.859 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:a3:d9 10.100.0.10'], port_security=['fa:16:3e:43:a3:d9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '273fe485cc184dd8bf86440d8d1e05f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3843f467-3db3-41b2-9b0b-057a8193c058', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c7c7cc1-6459-40e3-9ba1-aa227a70d0ae, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ab16b512-c763-4a6a-9d29-6911ff7bb9bc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.860 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ab16b512-c763-4a6a-9d29-6911ff7bb9bc in datapath 81f9575e-f3c4-4956-9a5c-9cb1776ab08e unbound from our chassis#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.862 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81f9575e-f3c4-4956-9a5c-9cb1776ab08e#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.875 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[08f51e14-254c-45b8-b68e-d44b9d9be7f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.897 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ccbe74-982c-4e36-965f-59722ab0ab55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.900 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[00f24a19-4c85-4322-908d-cf93dbb003ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.924 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6f749735-4916-4cb7-bbfa-7218a43d5e39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.936 226239 DEBUG oslo_concurrency.lockutils [None req-6303371d-48ad-4540-b970-a66750e08679 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.939 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a764f5ec-2420-4ba2-972a-1631659ec903]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81f9575e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:70:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544978, 'reachable_time': 40615, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249092, 'error': None, 'target': 'ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.951 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b697f9ef-8a2e-4399-a3d7-65a0869bd717]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap81f9575e-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 544986, 'tstamp': 544986}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249093, 'error': None, 'target': 'ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap81f9575e-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 544988, 'tstamp': 544988}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249093, 'error': None, 'target': 'ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.954 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81f9575e-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.958 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.963 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.964 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81f9575e-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.965 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.965 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81f9575e-f0, col_values=(('external_ids', {'iface-id': 'fdf58350-21cf-49b6-9dfa-eef283a74bd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:26.967 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.977 226239 INFO nova.virt.libvirt.driver [-] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Instance destroyed successfully.#033[00m
Jan 31 02:59:26 np0005603623 nova_compute[226235]: 2026-01-31 07:59:26.978 226239 DEBUG nova.objects.instance [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lazy-loading 'resources' on Instance uuid 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.051 226239 DEBUG nova.virt.libvirt.vif [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:58:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1580535877',display_name='tempest-VolumesAdminNegativeTest-server-1580535877',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-volumesadminnegativetest-server-1580535877',id=47,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:59:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='273fe485cc184dd8bf86440d8d1e05f3',ramdisk_id='',reservation_id='r-bnd59map',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virti
o',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-VolumesAdminNegativeTest-1346674431',owner_user_name='tempest-VolumesAdminNegativeTest-1346674431-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:59:19Z,user_data=None,user_id='b034a039074641a7b7c872e8b715ca4c',uuid=34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "address": "fa:16:3e:43:a3:d9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab16b512-c7", "ovs_interfaceid": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.052 226239 DEBUG nova.network.os_vif_util [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Converting VIF {"id": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "address": "fa:16:3e:43:a3:d9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab16b512-c7", "ovs_interfaceid": "ab16b512-c763-4a6a-9d29-6911ff7bb9bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.053 226239 DEBUG nova.network.os_vif_util [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:a3:d9,bridge_name='br-int',has_traffic_filtering=True,id=ab16b512-c763-4a6a-9d29-6911ff7bb9bc,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab16b512-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.053 226239 DEBUG os_vif [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:a3:d9,bridge_name='br-int',has_traffic_filtering=True,id=ab16b512-c763-4a6a-9d29-6911ff7bb9bc,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab16b512-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.055 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.055 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab16b512-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.057 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.059 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.062 226239 INFO os_vif [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:a3:d9,bridge_name='br-int',has_traffic_filtering=True,id=ab16b512-c763-4a6a-9d29-6911ff7bb9bc,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab16b512-c7')#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.137 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:27.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.645 226239 INFO nova.virt.libvirt.driver [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Deleting instance files /var/lib/nova/instances/34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_del#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.645 226239 INFO nova.virt.libvirt.driver [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Deletion of /var/lib/nova/instances/34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a_del complete#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.851 226239 INFO nova.compute.manager [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Took 1.10 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.852 226239 DEBUG oslo.service.loopingcall [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.853 226239 DEBUG nova.compute.manager [-] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:59:27 np0005603623 nova_compute[226235]: 2026-01-31 07:59:27.853 226239 DEBUG nova.network.neutron [-] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:59:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:28.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:29.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.379 226239 DEBUG nova.compute.manager [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.380 226239 DEBUG oslo_concurrency.lockutils [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.380 226239 DEBUG oslo_concurrency.lockutils [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.380 226239 DEBUG oslo_concurrency.lockutils [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.380 226239 DEBUG nova.compute.manager [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] No waiting events found dispatching network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.381 226239 WARNING nova.compute.manager [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received unexpected event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 for instance with vm_state active and task_state None.#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.381 226239 DEBUG nova.compute.manager [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Received event network-vif-unplugged-ab16b512-c763-4a6a-9d29-6911ff7bb9bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.381 226239 DEBUG oslo_concurrency.lockutils [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.381 226239 DEBUG oslo_concurrency.lockutils [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.381 226239 DEBUG oslo_concurrency.lockutils [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.382 226239 DEBUG nova.compute.manager [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] No waiting events found dispatching network-vif-unplugged-ab16b512-c763-4a6a-9d29-6911ff7bb9bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.382 226239 DEBUG nova.compute.manager [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Received event network-vif-unplugged-ab16b512-c763-4a6a-9d29-6911ff7bb9bc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.382 226239 DEBUG nova.compute.manager [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Received event network-vif-plugged-ab16b512-c763-4a6a-9d29-6911ff7bb9bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.382 226239 DEBUG oslo_concurrency.lockutils [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.382 226239 DEBUG oslo_concurrency.lockutils [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.382 226239 DEBUG oslo_concurrency.lockutils [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.383 226239 DEBUG nova.compute.manager [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] No waiting events found dispatching network-vif-plugged-ab16b512-c763-4a6a-9d29-6911ff7bb9bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:29 np0005603623 nova_compute[226235]: 2026-01-31 07:59:29.383 226239 WARNING nova.compute.manager [req-7483f6bb-69f9-4758-8106-17ce25f64bfb req-305483f3-8f24-4004-a10f-8cc630a289e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Received unexpected event network-vif-plugged-ab16b512-c763-4a6a-9d29-6911ff7bb9bc for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:59:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e165 e165: 3 total, 3 up, 3 in
Jan 31 02:59:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:30.094 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:30.094 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:30.095 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:30.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:31.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:31 np0005603623 nova_compute[226235]: 2026-01-31 07:59:31.852 226239 DEBUG nova.network.neutron [-] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:32 np0005603623 nova_compute[226235]: 2026-01-31 07:59:32.057 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:32 np0005603623 nova_compute[226235]: 2026-01-31 07:59:32.068 226239 DEBUG nova.compute.manager [req-f845961c-eed9-414c-8b72-3e2f6091595a req-1f6cebb6-f187-4c64-99a6-7b89918e1226 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Received event network-vif-deleted-ab16b512-c763-4a6a-9d29-6911ff7bb9bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:32 np0005603623 nova_compute[226235]: 2026-01-31 07:59:32.069 226239 INFO nova.compute.manager [req-f845961c-eed9-414c-8b72-3e2f6091595a req-1f6cebb6-f187-4c64-99a6-7b89918e1226 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Neutron deleted interface ab16b512-c763-4a6a-9d29-6911ff7bb9bc; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 02:59:32 np0005603623 nova_compute[226235]: 2026-01-31 07:59:32.069 226239 DEBUG nova.network.neutron [req-f845961c-eed9-414c-8b72-3e2f6091595a req-1f6cebb6-f187-4c64-99a6-7b89918e1226 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:32 np0005603623 nova_compute[226235]: 2026-01-31 07:59:32.139 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:32 np0005603623 nova_compute[226235]: 2026-01-31 07:59:32.824 226239 INFO nova.compute.manager [-] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Took 4.97 seconds to deallocate network for instance.#033[00m
Jan 31 02:59:32 np0005603623 nova_compute[226235]: 2026-01-31 07:59:32.832 226239 DEBUG nova.compute.manager [req-f845961c-eed9-414c-8b72-3e2f6091595a req-1f6cebb6-f187-4c64-99a6-7b89918e1226 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Detach interface failed, port_id=ab16b512-c763-4a6a-9d29-6911ff7bb9bc, reason: Instance 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 02:59:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:32.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:33 np0005603623 nova_compute[226235]: 2026-01-31 07:59:33.047 226239 DEBUG oslo_concurrency.lockutils [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:33 np0005603623 nova_compute[226235]: 2026-01-31 07:59:33.048 226239 DEBUG oslo_concurrency.lockutils [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:33.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:33 np0005603623 nova_compute[226235]: 2026-01-31 07:59:33.203 226239 DEBUG oslo_concurrency.processutils [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:59:33 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2857997029' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:59:33 np0005603623 nova_compute[226235]: 2026-01-31 07:59:33.628 226239 DEBUG oslo_concurrency.processutils [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:33 np0005603623 nova_compute[226235]: 2026-01-31 07:59:33.634 226239 DEBUG nova.compute.provider_tree [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:59:33 np0005603623 nova_compute[226235]: 2026-01-31 07:59:33.915 226239 DEBUG nova.scheduler.client.report [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:59:34 np0005603623 nova_compute[226235]: 2026-01-31 07:59:34.017 226239 DEBUG oslo_concurrency.lockutils [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.969s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:34 np0005603623 nova_compute[226235]: 2026-01-31 07:59:34.089 226239 INFO nova.scheduler.client.report [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Deleted allocations for instance 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a#033[00m
Jan 31 02:59:34 np0005603623 nova_compute[226235]: 2026-01-31 07:59:34.316 226239 DEBUG nova.compute.manager [req-b085b8f8-71c7-43e7-ac14-0b9e62c2a1fa req-5001cc71-6600-48f4-826e-dbedd59dc49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-changed-392355e0-133c-4f74-b0f0-dc74eb8c8416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:34 np0005603623 nova_compute[226235]: 2026-01-31 07:59:34.318 226239 DEBUG nova.compute.manager [req-b085b8f8-71c7-43e7-ac14-0b9e62c2a1fa req-5001cc71-6600-48f4-826e-dbedd59dc49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Refreshing instance network info cache due to event network-changed-392355e0-133c-4f74-b0f0-dc74eb8c8416. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:59:34 np0005603623 nova_compute[226235]: 2026-01-31 07:59:34.319 226239 DEBUG oslo_concurrency.lockutils [req-b085b8f8-71c7-43e7-ac14-0b9e62c2a1fa req-5001cc71-6600-48f4-826e-dbedd59dc49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:59:34 np0005603623 nova_compute[226235]: 2026-01-31 07:59:34.319 226239 DEBUG oslo_concurrency.lockutils [req-b085b8f8-71c7-43e7-ac14-0b9e62c2a1fa req-5001cc71-6600-48f4-826e-dbedd59dc49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:59:34 np0005603623 nova_compute[226235]: 2026-01-31 07:59:34.319 226239 DEBUG nova.network.neutron [req-b085b8f8-71c7-43e7-ac14-0b9e62c2a1fa req-5001cc71-6600-48f4-826e-dbedd59dc49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Refreshing network info cache for port 392355e0-133c-4f74-b0f0-dc74eb8c8416 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:59:34 np0005603623 nova_compute[226235]: 2026-01-31 07:59:34.448 226239 DEBUG oslo_concurrency.lockutils [None req-9d16b850-b615-416f-8b70-f4160c2c24f0 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:34.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:35.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:35 np0005603623 podman[249152]: 2026-01-31 07:59:35.987676042 +0000 UTC m=+0.074070016 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 02:59:35 np0005603623 podman[249151]: 2026-01-31 07:59:35.997301753 +0000 UTC m=+0.083695247 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 31 02:59:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:36 np0005603623 nova_compute[226235]: 2026-01-31 07:59:36.282 226239 DEBUG oslo_concurrency.lockutils [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:36 np0005603623 nova_compute[226235]: 2026-01-31 07:59:36.284 226239 DEBUG oslo_concurrency.lockutils [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:36 np0005603623 nova_compute[226235]: 2026-01-31 07:59:36.284 226239 INFO nova.compute.manager [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Rebooting instance#033[00m
Jan 31 02:59:36 np0005603623 nova_compute[226235]: 2026-01-31 07:59:36.311 226239 DEBUG oslo_concurrency.lockutils [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:59:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e166 e166: 3 total, 3 up, 3 in
Jan 31 02:59:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:59:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:36.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:59:37 np0005603623 nova_compute[226235]: 2026-01-31 07:59:37.058 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:37 np0005603623 nova_compute[226235]: 2026-01-31 07:59:37.140 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:37 np0005603623 nova_compute[226235]: 2026-01-31 07:59:37.167 226239 DEBUG nova.network.neutron [req-b085b8f8-71c7-43e7-ac14-0b9e62c2a1fa req-5001cc71-6600-48f4-826e-dbedd59dc49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Updated VIF entry in instance network info cache for port 392355e0-133c-4f74-b0f0-dc74eb8c8416. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:59:37 np0005603623 nova_compute[226235]: 2026-01-31 07:59:37.167 226239 DEBUG nova.network.neutron [req-b085b8f8-71c7-43e7-ac14-0b9e62c2a1fa req-5001cc71-6600-48f4-826e-dbedd59dc49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Updating instance_info_cache with network_info: [{"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:37.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:37 np0005603623 nova_compute[226235]: 2026-01-31 07:59:37.408 226239 DEBUG oslo_concurrency.lockutils [req-b085b8f8-71c7-43e7-ac14-0b9e62c2a1fa req-5001cc71-6600-48f4-826e-dbedd59dc49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:59:37 np0005603623 nova_compute[226235]: 2026-01-31 07:59:37.409 226239 DEBUG oslo_concurrency.lockutils [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquired lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:59:37 np0005603623 nova_compute[226235]: 2026-01-31 07:59:37.409 226239 DEBUG nova.network.neutron [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:59:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:38.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:39.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:39 np0005603623 nova_compute[226235]: 2026-01-31 07:59:39.387 226239 DEBUG nova.network.neutron [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Updating instance_info_cache with network_info: [{"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:39 np0005603623 nova_compute[226235]: 2026-01-31 07:59:39.474 226239 DEBUG oslo_concurrency.lockutils [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Releasing lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:59:39 np0005603623 nova_compute[226235]: 2026-01-31 07:59:39.476 226239 DEBUG nova.compute.manager [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:39 np0005603623 kernel: tap392355e0-13 (unregistering): left promiscuous mode
Jan 31 02:59:39 np0005603623 NetworkManager[48970]: <info>  [1769846379.8910] device (tap392355e0-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:59:39 np0005603623 nova_compute[226235]: 2026-01-31 07:59:39.900 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:39 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:39Z|00191|binding|INFO|Releasing lport 392355e0-133c-4f74-b0f0-dc74eb8c8416 from this chassis (sb_readonly=0)
Jan 31 02:59:39 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:39Z|00192|binding|INFO|Setting lport 392355e0-133c-4f74-b0f0-dc74eb8c8416 down in Southbound
Jan 31 02:59:39 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:39Z|00193|binding|INFO|Removing iface tap392355e0-13 ovn-installed in OVS
Jan 31 02:59:39 np0005603623 nova_compute[226235]: 2026-01-31 07:59:39.901 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:39 np0005603623 nova_compute[226235]: 2026-01-31 07:59:39.922 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:39 np0005603623 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000030.scope: Deactivated successfully.
Jan 31 02:59:39 np0005603623 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000030.scope: Consumed 12.257s CPU time.
Jan 31 02:59:39 np0005603623 systemd-machined[194379]: Machine qemu-26-instance-00000030 terminated.
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.009 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:a5:39 10.100.0.7'], port_security=['fa:16:3e:98:a5:39 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'add62d62-47fb-454d-aec1-d5bb6be9f1e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f31b0319126848a5b8fd9521dc509172', 'neutron:revision_number': '5', 'neutron:security_group_ids': '2814a167-00ed-4304-830b-a99e04552970 e0bb74dc-5295-43f2-8b09-451ae80c7a58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29a935fc-1163-43c6-97c6-acf0f9c4194f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=392355e0-133c-4f74-b0f0-dc74eb8c8416) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.011 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 392355e0-133c-4f74-b0f0-dc74eb8c8416 in datapath 92b7a3d2-99de-4036-b28b-98f77dab6a25 unbound from our chassis#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.013 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 92b7a3d2-99de-4036-b28b-98f77dab6a25, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.014 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[182a6873-c3d8-4008-84a3-a89f97189db6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.014 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 namespace which is not needed anymore#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.085 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.092 226239 INFO nova.virt.libvirt.driver [-] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Instance destroyed successfully.#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.093 226239 DEBUG nova.objects.instance [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lazy-loading 'resources' on Instance uuid add62d62-47fb-454d-aec1-d5bb6be9f1e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:40 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[249067]: [NOTICE]   (249071) : haproxy version is 2.8.14-c23fe91
Jan 31 02:59:40 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[249067]: [NOTICE]   (249071) : path to executable is /usr/sbin/haproxy
Jan 31 02:59:40 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[249067]: [WARNING]  (249071) : Exiting Master process...
Jan 31 02:59:40 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[249067]: [WARNING]  (249071) : Exiting Master process...
Jan 31 02:59:40 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[249067]: [ALERT]    (249071) : Current worker (249073) exited with code 143 (Terminated)
Jan 31 02:59:40 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[249067]: [WARNING]  (249071) : All workers exited. Exiting... (0)
Jan 31 02:59:40 np0005603623 systemd[1]: libpod-ca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679.scope: Deactivated successfully.
Jan 31 02:59:40 np0005603623 podman[249419]: 2026-01-31 07:59:40.168660957 +0000 UTC m=+0.043593449 container died ca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.175 226239 DEBUG nova.virt.libvirt.vif [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:59:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1136604277',display_name='tempest-SecurityGroupsTestJSON-server-1136604277',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1136604277',id=48,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:59:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f31b0319126848a5b8fd9521dc509172',ramdisk_id='',reservation_id='r-n0w7f0zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-648078268',owner_user_name='tempest-SecurityGroupsTestJSON-648078268-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:59:39Z,user_data=None,user_id='0e402088c09448e1a6f0cd61b11e0816',uuid=add62d62-47fb-454d-aec1-d5bb6be9f1e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.176 226239 DEBUG nova.network.os_vif_util [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converting VIF {"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.177 226239 DEBUG nova.network.os_vif_util [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.177 226239 DEBUG os_vif [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.179 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.180 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap392355e0-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:40 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679-userdata-shm.mount: Deactivated successfully.
Jan 31 02:59:40 np0005603623 systemd[1]: var-lib-containers-storage-overlay-a121f07ac7d2976ecc560bea0ff697a838423cd4a9151b22763e8c4dece8f6c3-merged.mount: Deactivated successfully.
Jan 31 02:59:40 np0005603623 podman[249419]: 2026-01-31 07:59:40.21238131 +0000 UTC m=+0.087313772 container cleanup ca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:59:40 np0005603623 systemd[1]: libpod-conmon-ca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679.scope: Deactivated successfully.
Jan 31 02:59:40 np0005603623 podman[249463]: 2026-01-31 07:59:40.272834156 +0000 UTC m=+0.044007111 container remove ca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.277 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dc371676-30c5-4f39-83d0-18c8f34b06db]: (4, ('Sat Jan 31 07:59:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 (ca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679)\nca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679\nSat Jan 31 07:59:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 (ca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679)\nca4706c1e1c260631197c39397f5704f6c7fcc31a2d70b009dd09365a9854679\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.279 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[16441ffe-a9bd-4d66-b05c-d6478c386f41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.279 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92b7a3d2-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:40 np0005603623 kernel: tap92b7a3d2-90: left promiscuous mode
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.292 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5d9d06ae-c376-4745-a613-59376a96865a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.308 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[717927fa-cf7c-4cfa-a49d-325f0f6b07c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.310 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0b75926a-e1f5-4637-a47c-9b653ba2c9ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.321 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f02c98-8f3c-431b-80c8-6e2804088169]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550581, 'reachable_time': 20311, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249500, 'error': None, 'target': 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 systemd[1]: run-netns-ovnmeta\x2d92b7a3d2\x2d99de\x2d4036\x2db28b\x2d98f77dab6a25.mount: Deactivated successfully.
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.326 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.326 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[6b20fc09-c3f5-4bfe-90ae-d56abcca8c91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.353 226239 DEBUG oslo_concurrency.lockutils [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.354 226239 DEBUG oslo_concurrency.lockutils [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.354 226239 DEBUG oslo_concurrency.lockutils [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.354 226239 DEBUG oslo_concurrency.lockutils [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.354 226239 DEBUG oslo_concurrency.lockutils [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:40 np0005603623 podman[249486]: 2026-01-31 07:59:40.354867281 +0000 UTC m=+0.061377097 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.355 226239 INFO nova.compute.manager [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Terminating instance#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.356 226239 DEBUG nova.compute.manager [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.357 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.360 226239 INFO os_vif [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13')#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.367 226239 DEBUG nova.virt.libvirt.driver [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Start _get_guest_xml network_info=[{"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.372 226239 WARNING nova.virt.libvirt.driver [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.377 226239 DEBUG nova.virt.libvirt.host [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.378 226239 DEBUG nova.virt.libvirt.host [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.380 226239 DEBUG nova.virt.libvirt.host [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.381 226239 DEBUG nova.virt.libvirt.host [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.382 226239 DEBUG nova.virt.libvirt.driver [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.382 226239 DEBUG nova.virt.hardware [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.382 226239 DEBUG nova.virt.hardware [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.383 226239 DEBUG nova.virt.hardware [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.383 226239 DEBUG nova.virt.hardware [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.383 226239 DEBUG nova.virt.hardware [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.383 226239 DEBUG nova.virt.hardware [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.383 226239 DEBUG nova.virt.hardware [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.383 226239 DEBUG nova.virt.hardware [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.384 226239 DEBUG nova.virt.hardware [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.384 226239 DEBUG nova.virt.hardware [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.384 226239 DEBUG nova.virt.hardware [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.384 226239 DEBUG nova.objects.instance [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lazy-loading 'vcpu_model' on Instance uuid add62d62-47fb-454d-aec1-d5bb6be9f1e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:40 np0005603623 kernel: tapcdac9e6d-09 (unregistering): left promiscuous mode
Jan 31 02:59:40 np0005603623 NetworkManager[48970]: <info>  [1769846380.4123] device (tapcdac9e6d-09): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.419 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:40 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:40Z|00194|binding|INFO|Releasing lport cdac9e6d-0972-4cab-a922-cf4d3f769c6c from this chassis (sb_readonly=0)
Jan 31 02:59:40 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:40Z|00195|binding|INFO|Setting lport cdac9e6d-0972-4cab-a922-cf4d3f769c6c down in Southbound
Jan 31 02:59:40 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:40Z|00196|binding|INFO|Removing iface tapcdac9e6d-09 ovn-installed in OVS
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.421 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.438 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:40 np0005603623 podman[249486]: 2026-01-31 07:59:40.44887952 +0000 UTC m=+0.155389336 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 31 02:59:40 np0005603623 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Jan 31 02:59:40 np0005603623 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000002a.scope: Consumed 15.723s CPU time.
Jan 31 02:59:40 np0005603623 systemd-machined[194379]: Machine qemu-24-instance-0000002a terminated.
Jan 31 02:59:40 np0005603623 NetworkManager[48970]: <info>  [1769846380.5762] manager: (tapcdac9e6d-09): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.592 226239 INFO nova.virt.libvirt.driver [-] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Instance destroyed successfully.#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.592 226239 DEBUG nova.objects.instance [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lazy-loading 'resources' on Instance uuid ba72fe35-90dd-4806-9195-8a8ff81ae9f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.640 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:e5:c9 10.100.0.4'], port_security=['fa:16:3e:97:e5:c9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ba72fe35-90dd-4806-9195-8a8ff81ae9f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '273fe485cc184dd8bf86440d8d1e05f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8e774ed5-8b8d-44d7-ba02-981099294fdd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.211'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c7c7cc1-6459-40e3-9ba1-aa227a70d0ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=cdac9e6d-0972-4cab-a922-cf4d3f769c6c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.642 143258 INFO neutron.agent.ovn.metadata.agent [-] Port cdac9e6d-0972-4cab-a922-cf4d3f769c6c in datapath 81f9575e-f3c4-4956-9a5c-9cb1776ab08e unbound from our chassis#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.643 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81f9575e-f3c4-4956-9a5c-9cb1776ab08e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.644 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[045d4809-87cb-4bab-97a4-aeccf44b6b98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.645 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e namespace which is not needed anymore#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.717 226239 DEBUG oslo_concurrency.processutils [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.736 226239 DEBUG nova.virt.libvirt.vif [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:58:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-709128910',display_name='tempest-VolumesAdminNegativeTest-server-709128910',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-volumesadminnegativetest-server-709128910',id=42,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJi1q2vjU3oXwmY2ffjVGLPXOBEh8qUUJE5r4ONrjGH3XHm6dlUTSzVpJpxExvl6X7xbQyEMzqjUiwuQRK3OBD1eEX65+Q6/e+BZWXbmhA9FXHDtzew/pw1jU9p7vxh+uA==',key_name='tempest-keypair-941200395',keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:58:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='273fe485cc184dd8bf86440d8d1e05f3',ramdisk_id='',reservation_id='r-oipxgz7v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-VolumesAdminNegativeTest-1346674431',owner_user_name='tempest-VolumesAdminNegativeTest-1346674431-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:58:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b034a039074641a7b7c872e8b715ca4c',uuid=ba72fe35-90dd-4806-9195-8a8ff81ae9f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "address": "fa:16:3e:97:e5:c9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdac9e6d-09", "ovs_interfaceid": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.737 226239 DEBUG nova.network.os_vif_util [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Converting VIF {"id": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "address": "fa:16:3e:97:e5:c9", "network": {"id": "81f9575e-f3c4-4956-9a5c-9cb1776ab08e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1802464445-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "273fe485cc184dd8bf86440d8d1e05f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdac9e6d-09", "ovs_interfaceid": "cdac9e6d-0972-4cab-a922-cf4d3f769c6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.738 226239 DEBUG nova.network.os_vif_util [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:97:e5:c9,bridge_name='br-int',has_traffic_filtering=True,id=cdac9e6d-0972-4cab-a922-cf4d3f769c6c,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdac9e6d-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.738 226239 DEBUG os_vif [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:97:e5:c9,bridge_name='br-int',has_traffic_filtering=True,id=cdac9e6d-0972-4cab-a922-cf4d3f769c6c,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdac9e6d-09') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.743 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.744 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcdac9e6d-09, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.746 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.747 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.749 226239 INFO os_vif [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:97:e5:c9,bridge_name='br-int',has_traffic_filtering=True,id=cdac9e6d-0972-4cab-a922-cf4d3f769c6c,network=Network(81f9575e-f3c4-4956-9a5c-9cb1776ab08e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdac9e6d-09')#033[00m
Jan 31 02:59:40 np0005603623 neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e[247698]: [NOTICE]   (247702) : haproxy version is 2.8.14-c23fe91
Jan 31 02:59:40 np0005603623 neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e[247698]: [NOTICE]   (247702) : path to executable is /usr/sbin/haproxy
Jan 31 02:59:40 np0005603623 neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e[247698]: [WARNING]  (247702) : Exiting Master process...
Jan 31 02:59:40 np0005603623 neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e[247698]: [WARNING]  (247702) : Exiting Master process...
Jan 31 02:59:40 np0005603623 neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e[247698]: [ALERT]    (247702) : Current worker (247704) exited with code 143 (Terminated)
Jan 31 02:59:40 np0005603623 neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e[247698]: [WARNING]  (247702) : All workers exited. Exiting... (0)
Jan 31 02:59:40 np0005603623 systemd[1]: libpod-eb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063.scope: Deactivated successfully.
Jan 31 02:59:40 np0005603623 podman[249605]: 2026-01-31 07:59:40.763160523 +0000 UTC m=+0.049557936 container died eb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 31 02:59:40 np0005603623 systemd[1]: var-lib-containers-storage-overlay-758912f774509dd6a7150078e5836d6b7db4868057269d1d7ccf9bdec9b71693-merged.mount: Deactivated successfully.
Jan 31 02:59:40 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063-userdata-shm.mount: Deactivated successfully.
Jan 31 02:59:40 np0005603623 podman[249605]: 2026-01-31 07:59:40.806443181 +0000 UTC m=+0.092840574 container cleanup eb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 02:59:40 np0005603623 systemd[1]: libpod-conmon-eb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063.scope: Deactivated successfully.
Jan 31 02:59:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:40.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.860 226239 DEBUG nova.compute.manager [req-01042820-5915-4a25-a1a5-e9dc0ac7a441 req-b1bfd2e8-c289-4e0f-a2e5-2bf0d5e5ec9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-vif-unplugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.861 226239 DEBUG oslo_concurrency.lockutils [req-01042820-5915-4a25-a1a5-e9dc0ac7a441 req-b1bfd2e8-c289-4e0f-a2e5-2bf0d5e5ec9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.861 226239 DEBUG oslo_concurrency.lockutils [req-01042820-5915-4a25-a1a5-e9dc0ac7a441 req-b1bfd2e8-c289-4e0f-a2e5-2bf0d5e5ec9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.862 226239 DEBUG oslo_concurrency.lockutils [req-01042820-5915-4a25-a1a5-e9dc0ac7a441 req-b1bfd2e8-c289-4e0f-a2e5-2bf0d5e5ec9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.862 226239 DEBUG nova.compute.manager [req-01042820-5915-4a25-a1a5-e9dc0ac7a441 req-b1bfd2e8-c289-4e0f-a2e5-2bf0d5e5ec9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] No waiting events found dispatching network-vif-unplugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.862 226239 WARNING nova.compute.manager [req-01042820-5915-4a25-a1a5-e9dc0ac7a441 req-b1bfd2e8-c289-4e0f-a2e5-2bf0d5e5ec9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received unexpected event network-vif-unplugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Jan 31 02:59:40 np0005603623 podman[249668]: 2026-01-31 07:59:40.873903287 +0000 UTC m=+0.046900982 container remove eb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.878 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7790ec86-b832-4469-a05d-7cae49eaf270]: (4, ('Sat Jan 31 07:59:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e (eb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063)\neb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063\nSat Jan 31 07:59:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e (eb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063)\neb5cebfb7ad0341fafec9f4e01aaf9a8576bfb966cbffd3230bbfa2384497063\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.879 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[42c2f9ce-dcfa-4f27-8d2d-1f44fa0274fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.880 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81f9575e-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.881 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:40 np0005603623 kernel: tap81f9575e-f0: left promiscuous mode
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.884 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.888 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a80822cc-2ecf-4a1a-a5d7-92e3c8f0d0ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 nova_compute[226235]: 2026-01-31 07:59:40.891 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.899 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[23c63120-880a-4bf3-ab9f-e6d31bed4408]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.900 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e7103f71-7baa-483d-a6dd-9507689c15f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.914 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fb7eeded-70d1-4d2c-a422-479fad99468d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544972, 'reachable_time': 31963, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249718, 'error': None, 'target': 'ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.916 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-81f9575e-f3c4-4956-9a5c-9cb1776ab08e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:59:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:40.916 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[c94f600d-194d-44f0-883c-3958d79abe01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:41 np0005603623 podman[249748]: 2026-01-31 07:59:41.092700103 +0000 UTC m=+0.068119248 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:59:41 np0005603623 podman[249748]: 2026-01-31 07:59:41.105066521 +0000 UTC m=+0.080485646 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 02:59:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:41.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:41 np0005603623 systemd[1]: run-netns-ovnmeta\x2d81f9575e\x2df3c4\x2d4956\x2d9a5c\x2d9cb1776ab08e.mount: Deactivated successfully.
Jan 31 02:59:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:59:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3303983514' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.229 226239 DEBUG oslo_concurrency.processutils [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.274 226239 DEBUG oslo_concurrency.processutils [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:41 np0005603623 podman[249831]: 2026-01-31 07:59:41.320943496 +0000 UTC m=+0.059903522 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, io.buildah.version=1.28.2, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, release=1793, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, description=keepalived for Ceph, name=keepalived, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64)
Jan 31 02:59:41 np0005603623 podman[249853]: 2026-01-31 07:59:41.429790631 +0000 UTC m=+0.091583925 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=1793, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git)
Jan 31 02:59:41 np0005603623 podman[249831]: 2026-01-31 07:59:41.564824409 +0000 UTC m=+0.303784425 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, vcs-type=git, io.buildah.version=1.28.2, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, com.redhat.component=keepalived-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, description=keepalived for Ceph, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, io.openshift.tags=Ceph keepalived)
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.602 226239 INFO nova.virt.libvirt.driver [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Deleting instance files /var/lib/nova/instances/ba72fe35-90dd-4806-9195-8a8ff81ae9f0_del#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.604 226239 INFO nova.virt.libvirt.driver [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Deletion of /var/lib/nova/instances/ba72fe35-90dd-4806-9195-8a8ff81ae9f0_del complete#033[00m
Jan 31 02:59:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:59:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2804424046' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.771 226239 DEBUG oslo_concurrency.processutils [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.773 226239 DEBUG nova.virt.libvirt.vif [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:59:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1136604277',display_name='tempest-SecurityGroupsTestJSON-server-1136604277',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1136604277',id=48,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:59:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f31b0319126848a5b8fd9521dc509172',ramdisk_id='',reservation_id='r-n0w7f0zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-648078268',owner_user_name='tempest-SecurityGroupsTestJSON-648078268-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:59:39Z,user_data=None,user_id='0e402088c09448e1a6f0cd61b11e0816',uuid=add62d62-47fb-454d-aec1-d5bb6be9f1e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.773 226239 DEBUG nova.network.os_vif_util [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converting VIF {"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.774 226239 DEBUG nova.network.os_vif_util [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.775 226239 DEBUG nova.objects.instance [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lazy-loading 'pci_devices' on Instance uuid add62d62-47fb-454d-aec1-d5bb6be9f1e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.940 226239 DEBUG nova.compute.manager [req-e532f908-8030-452a-8a2c-9c4d931f8c50 req-3d031aa1-9312-48dd-a49b-06ae50cd2c71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Received event network-vif-unplugged-cdac9e6d-0972-4cab-a922-cf4d3f769c6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.941 226239 DEBUG oslo_concurrency.lockutils [req-e532f908-8030-452a-8a2c-9c4d931f8c50 req-3d031aa1-9312-48dd-a49b-06ae50cd2c71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.942 226239 DEBUG oslo_concurrency.lockutils [req-e532f908-8030-452a-8a2c-9c4d931f8c50 req-3d031aa1-9312-48dd-a49b-06ae50cd2c71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.942 226239 DEBUG oslo_concurrency.lockutils [req-e532f908-8030-452a-8a2c-9c4d931f8c50 req-3d031aa1-9312-48dd-a49b-06ae50cd2c71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.942 226239 DEBUG nova.compute.manager [req-e532f908-8030-452a-8a2c-9c4d931f8c50 req-3d031aa1-9312-48dd-a49b-06ae50cd2c71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] No waiting events found dispatching network-vif-unplugged-cdac9e6d-0972-4cab-a922-cf4d3f769c6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.942 226239 DEBUG nova.compute.manager [req-e532f908-8030-452a-8a2c-9c4d931f8c50 req-3d031aa1-9312-48dd-a49b-06ae50cd2c71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Received event network-vif-unplugged-cdac9e6d-0972-4cab-a922-cf4d3f769c6c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.950 226239 DEBUG nova.virt.libvirt.driver [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  <uuid>add62d62-47fb-454d-aec1-d5bb6be9f1e6</uuid>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  <name>instance-00000030</name>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <nova:name>tempest-SecurityGroupsTestJSON-server-1136604277</nova:name>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 07:59:40</nova:creationTime>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <nova:user uuid="0e402088c09448e1a6f0cd61b11e0816">tempest-SecurityGroupsTestJSON-648078268-project-member</nova:user>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <nova:project uuid="f31b0319126848a5b8fd9521dc509172">tempest-SecurityGroupsTestJSON-648078268</nova:project>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <nova:port uuid="392355e0-133c-4f74-b0f0-dc74eb8c8416">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <system>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <entry name="serial">add62d62-47fb-454d-aec1-d5bb6be9f1e6</entry>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <entry name="uuid">add62d62-47fb-454d-aec1-d5bb6be9f1e6</entry>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    </system>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  <os>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  </os>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  <features>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  </features>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  </clock>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  <devices>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/add62d62-47fb-454d-aec1-d5bb6be9f1e6_disk.config">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      </source>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      </auth>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    </disk>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:98:a5:39"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <target dev="tap392355e0-13"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    </interface>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/add62d62-47fb-454d-aec1-d5bb6be9f1e6/console.log" append="off"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    </serial>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <video>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    </video>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <input type="keyboard" bus="usb"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    </rng>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 02:59:41 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 02:59:41 np0005603623 nova_compute[226235]:  </devices>
Jan 31 02:59:41 np0005603623 nova_compute[226235]: </domain>
Jan 31 02:59:41 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.952 226239 DEBUG nova.virt.libvirt.driver [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] skipping disk for instance-00000030 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.952 226239 DEBUG nova.virt.libvirt.driver [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] skipping disk for instance-00000030 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.953 226239 DEBUG nova.virt.libvirt.vif [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:59:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1136604277',display_name='tempest-SecurityGroupsTestJSON-server-1136604277',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1136604277',id=48,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:59:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='f31b0319126848a5b8fd9521dc509172',ramdisk_id='',reservation_id='r-n0w7f0zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-648078268',owner_user_name='tempest-SecurityGroupsTestJSON-648078268-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:59:39Z,user_data=None,user_id='0e402088c09448e1a6f0cd61b11e0816',uuid=add62d62-47fb-454d-aec1-d5bb6be9f1e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.953 226239 DEBUG nova.network.os_vif_util [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converting VIF {"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.954 226239 DEBUG nova.network.os_vif_util [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.954 226239 DEBUG os_vif [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.955 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.956 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.956 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.960 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.960 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap392355e0-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.961 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap392355e0-13, col_values=(('external_ids', {'iface-id': '392355e0-133c-4f74-b0f0-dc74eb8c8416', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:a5:39', 'vm-uuid': 'add62d62-47fb-454d-aec1-d5bb6be9f1e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.962 226239 INFO nova.compute.manager [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Took 1.61 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.962 226239 DEBUG oslo.service.loopingcall [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.963 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.964 226239 DEBUG nova.compute.manager [-] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.964 226239 DEBUG nova.network.neutron [-] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:59:41 np0005603623 NetworkManager[48970]: <info>  [1769846381.9642] manager: (tap392355e0-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.967 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.969 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.970 226239 INFO os_vif [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13')#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.975 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846366.9748542, 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:59:41 np0005603623 nova_compute[226235]: 2026-01-31 07:59:41.977 226239 INFO nova.compute.manager [-] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:59:42 np0005603623 nova_compute[226235]: 2026-01-31 07:59:42.001 226239 DEBUG nova.compute.manager [None req-e9b4d3c2-c762-4384-b4f5-b7b5f0eefaa7 - - - - - -] [instance: 34a6aede-cc00-4cfe-a62b-9bb9d59e8e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:42 np0005603623 kernel: tap392355e0-13: entered promiscuous mode
Jan 31 02:59:42 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:42Z|00197|binding|INFO|Claiming lport 392355e0-133c-4f74-b0f0-dc74eb8c8416 for this chassis.
Jan 31 02:59:42 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:42Z|00198|binding|INFO|392355e0-133c-4f74-b0f0-dc74eb8c8416: Claiming fa:16:3e:98:a5:39 10.100.0.7
Jan 31 02:59:42 np0005603623 NetworkManager[48970]: <info>  [1769846382.1227] manager: (tap392355e0-13): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Jan 31 02:59:42 np0005603623 nova_compute[226235]: 2026-01-31 07:59:42.122 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:42 np0005603623 systemd-udevd[249349]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:59:42 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:42Z|00199|binding|INFO|Setting lport 392355e0-133c-4f74-b0f0-dc74eb8c8416 ovn-installed in OVS
Jan 31 02:59:42 np0005603623 nova_compute[226235]: 2026-01-31 07:59:42.131 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:42 np0005603623 nova_compute[226235]: 2026-01-31 07:59:42.133 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:42 np0005603623 NetworkManager[48970]: <info>  [1769846382.1385] device (tap392355e0-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:59:42 np0005603623 NetworkManager[48970]: <info>  [1769846382.1402] device (tap392355e0-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:59:42 np0005603623 nova_compute[226235]: 2026-01-31 07:59:42.143 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:42 np0005603623 systemd-machined[194379]: New machine qemu-27-instance-00000030.
Jan 31 02:59:42 np0005603623 systemd[1]: Started Virtual Machine qemu-27-instance-00000030.
Jan 31 02:59:42 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:42Z|00200|binding|INFO|Setting lport 392355e0-133c-4f74-b0f0-dc74eb8c8416 up in Southbound
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.164 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:a5:39 10.100.0.7'], port_security=['fa:16:3e:98:a5:39 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'add62d62-47fb-454d-aec1-d5bb6be9f1e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f31b0319126848a5b8fd9521dc509172', 'neutron:revision_number': '6', 'neutron:security_group_ids': '2814a167-00ed-4304-830b-a99e04552970 e0bb74dc-5295-43f2-8b09-451ae80c7a58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29a935fc-1163-43c6-97c6-acf0f9c4194f, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=392355e0-133c-4f74-b0f0-dc74eb8c8416) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.165 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 392355e0-133c-4f74-b0f0-dc74eb8c8416 in datapath 92b7a3d2-99de-4036-b28b-98f77dab6a25 bound to our chassis#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.167 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 92b7a3d2-99de-4036-b28b-98f77dab6a25#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.176 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[99d68b63-d682-4e8c-aeac-e606c5113d05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.177 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap92b7a3d2-91 in ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.180 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap92b7a3d2-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.180 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[18c0bf61-91d4-4306-8a18-5d34e5362653]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.181 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d61a5c2e-60af-4698-8eb9-5318ce3e3e00]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.191 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[332292f4-8645-46ba-84f7-75992a3f5932]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.211 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e4d0b8e1-ea5c-4ee2-972f-d9164ffcf22e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.230 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c58cbf-8a14-4f20-979b-948508168c70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.237 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a6900259-ea0c-40a9-9adc-ada440291cca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 NetworkManager[48970]: <info>  [1769846382.2390] manager: (tap92b7a3d2-90): new Veth device (/org/freedesktop/NetworkManager/Devices/98)
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.262 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1b09216b-cca3-48bb-85fc-15207e81fd88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.265 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ee3863-ebb0-4222-a5e2-55a8c06397db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 NetworkManager[48970]: <info>  [1769846382.2834] device (tap92b7a3d2-90): carrier: link connected
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.286 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[8b17ebd3-0670-4849-bdd5-d4e765b75695]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.300 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0b51f2a6-2906-4eaa-a9a7-30d4dfc8e12b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92b7a3d2-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:d6:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552271, 'reachable_time': 33499, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250048, 'error': None, 'target': 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.315 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[411b3764-33d3-4ffe-a698-18ea091d6604]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:d681'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552271, 'tstamp': 552271}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250049, 'error': None, 'target': 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.327 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[18fe23c1-9e28-4199-aab0-2fafbb2647b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92b7a3d2-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:d6:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552271, 'reachable_time': 33499, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250050, 'error': None, 'target': 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.347 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0fc5c42f-24c1-4e79-b727-f7d98296efe6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.393 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fa6c0acc-b70f-4a80-b251-6f0a47fccbd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.395 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92b7a3d2-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.395 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.396 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap92b7a3d2-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:42 np0005603623 NetworkManager[48970]: <info>  [1769846382.3986] manager: (tap92b7a3d2-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Jan 31 02:59:42 np0005603623 kernel: tap92b7a3d2-90: entered promiscuous mode
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.400 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap92b7a3d2-90, col_values=(('external_ids', {'iface-id': 'b33af60b-01fb-4204-b1d7-f9b1d79e127d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:42 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:42Z|00201|binding|INFO|Releasing lport b33af60b-01fb-4204-b1d7-f9b1d79e127d from this chassis (sb_readonly=0)
Jan 31 02:59:42 np0005603623 nova_compute[226235]: 2026-01-31 07:59:42.408 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.410 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/92b7a3d2-99de-4036-b28b-98f77dab6a25.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/92b7a3d2-99de-4036-b28b-98f77dab6a25.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.412 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0c7bc114-2398-4a87-bd78-7c3093acc755]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.413 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-92b7a3d2-99de-4036-b28b-98f77dab6a25
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/92b7a3d2-99de-4036-b28b-98f77dab6a25.pid.haproxy
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 92b7a3d2-99de-4036-b28b-98f77dab6a25
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:59:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:42.414 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'env', 'PROCESS_TAG=haproxy-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/92b7a3d2-99de-4036-b28b-98f77dab6a25.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:59:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:59:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:59:42 np0005603623 podman[250097]: 2026-01-31 07:59:42.719129509 +0000 UTC m=+0.021196585 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:59:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:59:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:42.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:59:42 np0005603623 nova_compute[226235]: 2026-01-31 07:59:42.963 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Removed pending event for add62d62-47fb-454d-aec1-d5bb6be9f1e6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 02:59:42 np0005603623 nova_compute[226235]: 2026-01-31 07:59:42.963 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846382.9626567, add62d62-47fb-454d-aec1-d5bb6be9f1e6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:59:42 np0005603623 nova_compute[226235]: 2026-01-31 07:59:42.964 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:59:42 np0005603623 nova_compute[226235]: 2026-01-31 07:59:42.966 226239 DEBUG nova.compute.manager [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:59:42 np0005603623 nova_compute[226235]: 2026-01-31 07:59:42.968 226239 INFO nova.virt.libvirt.driver [-] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Instance rebooted successfully.#033[00m
Jan 31 02:59:42 np0005603623 nova_compute[226235]: 2026-01-31 07:59:42.968 226239 DEBUG nova.compute.manager [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:43 np0005603623 podman[250097]: 2026-01-31 07:59:43.113394422 +0000 UTC m=+0.415461488 container create 9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 02:59:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:43.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:43 np0005603623 systemd[1]: Started libpod-conmon-9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0.scope.
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.316 226239 DEBUG nova.compute.manager [req-803c07ee-f582-47f1-8766-ae9f1fad09a6 req-5ec83375-e4d1-4c39-bf67-bb3f523cf0ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.316 226239 DEBUG oslo_concurrency.lockutils [req-803c07ee-f582-47f1-8766-ae9f1fad09a6 req-5ec83375-e4d1-4c39-bf67-bb3f523cf0ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.317 226239 DEBUG oslo_concurrency.lockutils [req-803c07ee-f582-47f1-8766-ae9f1fad09a6 req-5ec83375-e4d1-4c39-bf67-bb3f523cf0ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.317 226239 DEBUG oslo_concurrency.lockutils [req-803c07ee-f582-47f1-8766-ae9f1fad09a6 req-5ec83375-e4d1-4c39-bf67-bb3f523cf0ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.317 226239 DEBUG nova.compute.manager [req-803c07ee-f582-47f1-8766-ae9f1fad09a6 req-5ec83375-e4d1-4c39-bf67-bb3f523cf0ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] No waiting events found dispatching network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.317 226239 WARNING nova.compute.manager [req-803c07ee-f582-47f1-8766-ae9f1fad09a6 req-5ec83375-e4d1-4c39-bf67-bb3f523cf0ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received unexpected event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.318 226239 DEBUG nova.compute.manager [req-803c07ee-f582-47f1-8766-ae9f1fad09a6 req-5ec83375-e4d1-4c39-bf67-bb3f523cf0ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.318 226239 DEBUG oslo_concurrency.lockutils [req-803c07ee-f582-47f1-8766-ae9f1fad09a6 req-5ec83375-e4d1-4c39-bf67-bb3f523cf0ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.318 226239 DEBUG oslo_concurrency.lockutils [req-803c07ee-f582-47f1-8766-ae9f1fad09a6 req-5ec83375-e4d1-4c39-bf67-bb3f523cf0ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.318 226239 DEBUG oslo_concurrency.lockutils [req-803c07ee-f582-47f1-8766-ae9f1fad09a6 req-5ec83375-e4d1-4c39-bf67-bb3f523cf0ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.319 226239 DEBUG nova.compute.manager [req-803c07ee-f582-47f1-8766-ae9f1fad09a6 req-5ec83375-e4d1-4c39-bf67-bb3f523cf0ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] No waiting events found dispatching network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.319 226239 WARNING nova.compute.manager [req-803c07ee-f582-47f1-8766-ae9f1fad09a6 req-5ec83375-e4d1-4c39-bf67-bb3f523cf0ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received unexpected event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Jan 31 02:59:43 np0005603623 systemd[1]: Started libcrun container.
Jan 31 02:59:43 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c2c79f1a2af293e75ccf2b77428b6a4eb56e13e6d6232d63bc1964c08a338a9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.377 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.381 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.422 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.422 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846382.965624, add62d62-47fb-454d-aec1-d5bb6be9f1e6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.422 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] VM Started (Lifecycle Event)#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.431 226239 DEBUG nova.network.neutron [-] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:43 np0005603623 podman[250097]: 2026-01-31 07:59:43.440654581 +0000 UTC m=+0.742721667 container init 9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:59:43 np0005603623 podman[250097]: 2026-01-31 07:59:43.446324289 +0000 UTC m=+0.748391375 container start 9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.455 226239 DEBUG oslo_concurrency.lockutils [None req-6e7076b9-fe9b-417b-847f-6b54d3bcbcf3 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 7.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:43 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[250155]: [NOTICE]   (250159) : New worker (250161) forked
Jan 31 02:59:43 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[250155]: [NOTICE]   (250159) : Loading success.
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.488 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.491 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.561 226239 INFO nova.compute.manager [-] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Took 1.60 seconds to deallocate network for instance.#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.708 226239 DEBUG oslo_concurrency.lockutils [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.709 226239 DEBUG oslo_concurrency.lockutils [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:59:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:59:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:59:43 np0005603623 nova_compute[226235]: 2026-01-31 07:59:43.810 226239 DEBUG oslo_concurrency.processutils [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.108 226239 DEBUG nova.compute.manager [req-8285e1e1-8c43-4c76-9db0-44c6599c531e req-00883c8f-631d-44b5-b2b7-331e29917252 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Received event network-vif-plugged-cdac9e6d-0972-4cab-a922-cf4d3f769c6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.109 226239 DEBUG oslo_concurrency.lockutils [req-8285e1e1-8c43-4c76-9db0-44c6599c531e req-00883c8f-631d-44b5-b2b7-331e29917252 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.109 226239 DEBUG oslo_concurrency.lockutils [req-8285e1e1-8c43-4c76-9db0-44c6599c531e req-00883c8f-631d-44b5-b2b7-331e29917252 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.109 226239 DEBUG oslo_concurrency.lockutils [req-8285e1e1-8c43-4c76-9db0-44c6599c531e req-00883c8f-631d-44b5-b2b7-331e29917252 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.110 226239 DEBUG nova.compute.manager [req-8285e1e1-8c43-4c76-9db0-44c6599c531e req-00883c8f-631d-44b5-b2b7-331e29917252 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] No waiting events found dispatching network-vif-plugged-cdac9e6d-0972-4cab-a922-cf4d3f769c6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.110 226239 WARNING nova.compute.manager [req-8285e1e1-8c43-4c76-9db0-44c6599c531e req-00883c8f-631d-44b5-b2b7-331e29917252 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Received unexpected event network-vif-plugged-cdac9e6d-0972-4cab-a922-cf4d3f769c6c for instance with vm_state deleted and task_state None.#033[00m
Jan 31 02:59:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:59:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3249708180' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.255 226239 DEBUG oslo_concurrency.processutils [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.262 226239 DEBUG nova.compute.provider_tree [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.349 226239 DEBUG nova.scheduler.client.report [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.424 226239 DEBUG oslo_concurrency.lockutils [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.716s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.514 226239 INFO nova.scheduler.client.report [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Deleted allocations for instance ba72fe35-90dd-4806-9195-8a8ff81ae9f0#033[00m
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.673 226239 DEBUG oslo_concurrency.lockutils [None req-9680fe5f-f9a0-440d-907d-16cad4c5ccc2 b034a039074641a7b7c872e8b715ca4c 273fe485cc184dd8bf86440d8d1e05f3 - - default default] Lock "ba72fe35-90dd-4806-9195-8a8ff81ae9f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:44.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:44 np0005603623 nova_compute[226235]: 2026-01-31 07:59:44.931 226239 DEBUG nova.compute.manager [req-a17f56fd-b68e-4a81-b18d-04d3cc8c1415 req-fbaf0f6e-3e95-434c-b68d-1428d121d7e1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Received event network-vif-deleted-cdac9e6d-0972-4cab-a922-cf4d3f769c6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:45.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:45 np0005603623 nova_compute[226235]: 2026-01-31 07:59:45.435 226239 DEBUG nova.compute.manager [req-b975b27a-0add-48a6-9266-5467b042955b req-744ac706-8827-488a-8955-725621fde602 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:45 np0005603623 nova_compute[226235]: 2026-01-31 07:59:45.435 226239 DEBUG oslo_concurrency.lockutils [req-b975b27a-0add-48a6-9266-5467b042955b req-744ac706-8827-488a-8955-725621fde602 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:45 np0005603623 nova_compute[226235]: 2026-01-31 07:59:45.436 226239 DEBUG oslo_concurrency.lockutils [req-b975b27a-0add-48a6-9266-5467b042955b req-744ac706-8827-488a-8955-725621fde602 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:45 np0005603623 nova_compute[226235]: 2026-01-31 07:59:45.436 226239 DEBUG oslo_concurrency.lockutils [req-b975b27a-0add-48a6-9266-5467b042955b req-744ac706-8827-488a-8955-725621fde602 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:45 np0005603623 nova_compute[226235]: 2026-01-31 07:59:45.436 226239 DEBUG nova.compute.manager [req-b975b27a-0add-48a6-9266-5467b042955b req-744ac706-8827-488a-8955-725621fde602 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] No waiting events found dispatching network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:45 np0005603623 nova_compute[226235]: 2026-01-31 07:59:45.436 226239 WARNING nova.compute.manager [req-b975b27a-0add-48a6-9266-5467b042955b req-744ac706-8827-488a-8955-725621fde602 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received unexpected event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 for instance with vm_state active and task_state None.#033[00m
Jan 31 02:59:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:46.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:46 np0005603623 nova_compute[226235]: 2026-01-31 07:59:46.963 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:47 np0005603623 nova_compute[226235]: 2026-01-31 07:59:47.144 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:47.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:48 np0005603623 nova_compute[226235]: 2026-01-31 07:59:48.501 226239 DEBUG nova.compute.manager [req-83e6490f-ba5b-41ee-988e-ec6a44e0a82b req-d7b159ed-b915-40d7-80ef-c2d23e5a2dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-changed-392355e0-133c-4f74-b0f0-dc74eb8c8416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:48 np0005603623 nova_compute[226235]: 2026-01-31 07:59:48.502 226239 DEBUG nova.compute.manager [req-83e6490f-ba5b-41ee-988e-ec6a44e0a82b req-d7b159ed-b915-40d7-80ef-c2d23e5a2dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Refreshing instance network info cache due to event network-changed-392355e0-133c-4f74-b0f0-dc74eb8c8416. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:59:48 np0005603623 nova_compute[226235]: 2026-01-31 07:59:48.502 226239 DEBUG oslo_concurrency.lockutils [req-83e6490f-ba5b-41ee-988e-ec6a44e0a82b req-d7b159ed-b915-40d7-80ef-c2d23e5a2dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:59:48 np0005603623 nova_compute[226235]: 2026-01-31 07:59:48.502 226239 DEBUG oslo_concurrency.lockutils [req-83e6490f-ba5b-41ee-988e-ec6a44e0a82b req-d7b159ed-b915-40d7-80ef-c2d23e5a2dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:59:48 np0005603623 nova_compute[226235]: 2026-01-31 07:59:48.502 226239 DEBUG nova.network.neutron [req-83e6490f-ba5b-41ee-988e-ec6a44e0a82b req-d7b159ed-b915-40d7-80ef-c2d23e5a2dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Refreshing network info cache for port 392355e0-133c-4f74-b0f0-dc74eb8c8416 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:59:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:59:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:48.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:59:48 np0005603623 nova_compute[226235]: 2026-01-31 07:59:48.948 226239 DEBUG oslo_concurrency.lockutils [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:48 np0005603623 nova_compute[226235]: 2026-01-31 07:59:48.949 226239 DEBUG oslo_concurrency.lockutils [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:48 np0005603623 nova_compute[226235]: 2026-01-31 07:59:48.949 226239 DEBUG oslo_concurrency.lockutils [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:48 np0005603623 nova_compute[226235]: 2026-01-31 07:59:48.949 226239 DEBUG oslo_concurrency.lockutils [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:48 np0005603623 nova_compute[226235]: 2026-01-31 07:59:48.949 226239 DEBUG oslo_concurrency.lockutils [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:48 np0005603623 nova_compute[226235]: 2026-01-31 07:59:48.950 226239 INFO nova.compute.manager [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Terminating instance#033[00m
Jan 31 02:59:48 np0005603623 nova_compute[226235]: 2026-01-31 07:59:48.951 226239 DEBUG nova.compute.manager [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:59:48 np0005603623 kernel: tap392355e0-13 (unregistering): left promiscuous mode
Jan 31 02:59:48 np0005603623 NetworkManager[48970]: <info>  [1769846388.9953] device (tap392355e0-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.012 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:49Z|00202|binding|INFO|Releasing lport 392355e0-133c-4f74-b0f0-dc74eb8c8416 from this chassis (sb_readonly=0)
Jan 31 02:59:49 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:49Z|00203|binding|INFO|Setting lport 392355e0-133c-4f74-b0f0-dc74eb8c8416 down in Southbound
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.021 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:49Z|00204|binding|INFO|Removing iface tap392355e0-13 ovn-installed in OVS
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.023 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.026 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:49.043 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:a5:39 10.100.0.7'], port_security=['fa:16:3e:98:a5:39 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'add62d62-47fb-454d-aec1-d5bb6be9f1e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f31b0319126848a5b8fd9521dc509172', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2814a167-00ed-4304-830b-a99e04552970 a0373e95-04ab-4354-87ae-68169beb9f7b e0bb74dc-5295-43f2-8b09-451ae80c7a58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29a935fc-1163-43c6-97c6-acf0f9c4194f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=392355e0-133c-4f74-b0f0-dc74eb8c8416) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:59:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:49.044 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 392355e0-133c-4f74-b0f0-dc74eb8c8416 in datapath 92b7a3d2-99de-4036-b28b-98f77dab6a25 unbound from our chassis#033[00m
Jan 31 02:59:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:49.045 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 92b7a3d2-99de-4036-b28b-98f77dab6a25, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:59:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:49.046 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[60f448d0-bf71-43dc-b29d-803c4e099023]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:49.047 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 namespace which is not needed anymore#033[00m
Jan 31 02:59:49 np0005603623 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000030.scope: Deactivated successfully.
Jan 31 02:59:49 np0005603623 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000030.scope: Consumed 6.996s CPU time.
Jan 31 02:59:49 np0005603623 systemd-machined[194379]: Machine qemu-27-instance-00000030 terminated.
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.168 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.171 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.182 226239 INFO nova.virt.libvirt.driver [-] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Instance destroyed successfully.#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.183 226239 DEBUG nova.objects.instance [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lazy-loading 'resources' on Instance uuid add62d62-47fb-454d-aec1-d5bb6be9f1e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:49 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[250155]: [NOTICE]   (250159) : haproxy version is 2.8.14-c23fe91
Jan 31 02:59:49 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[250155]: [NOTICE]   (250159) : path to executable is /usr/sbin/haproxy
Jan 31 02:59:49 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[250155]: [WARNING]  (250159) : Exiting Master process...
Jan 31 02:59:49 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[250155]: [WARNING]  (250159) : Exiting Master process...
Jan 31 02:59:49 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[250155]: [ALERT]    (250159) : Current worker (250161) exited with code 143 (Terminated)
Jan 31 02:59:49 np0005603623 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[250155]: [WARNING]  (250159) : All workers exited. Exiting... (0)
Jan 31 02:59:49 np0005603623 systemd[1]: libpod-9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0.scope: Deactivated successfully.
Jan 31 02:59:49 np0005603623 podman[250217]: 2026-01-31 07:59:49.197188015 +0000 UTC m=+0.084325356 container died 9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 02:59:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:49.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.218 226239 DEBUG nova.virt.libvirt.vif [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:59:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1136604277',display_name='tempest-SecurityGroupsTestJSON-server-1136604277',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1136604277',id=48,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:59:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f31b0319126848a5b8fd9521dc509172',ramdisk_id='',reservation_id='r-n0w7f0zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-648078268',owner_user_name='tempest-SecurityGroupsTestJSON-648078268-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:59:43Z,user_data=None,user_id='0e402088c09448e1a6f0cd61b11e0816',uuid=add62d62-47fb-454d-aec1-d5bb6be9f1e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.219 226239 DEBUG nova.network.os_vif_util [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converting VIF {"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.220 226239 DEBUG nova.network.os_vif_util [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.220 226239 DEBUG os_vif [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.222 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.222 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap392355e0-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.223 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.224 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.227 226239 INFO os_vif [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:a5:39,bridge_name='br-int',has_traffic_filtering=True,id=392355e0-133c-4f74-b0f0-dc74eb8c8416,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392355e0-13')#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.336 226239 DEBUG nova.compute.manager [req-44bac796-118b-4938-b4a1-ecd4105d830d req-0c5d1145-a49e-45b5-9d20-cbe2149c57e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-vif-unplugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.337 226239 DEBUG oslo_concurrency.lockutils [req-44bac796-118b-4938-b4a1-ecd4105d830d req-0c5d1145-a49e-45b5-9d20-cbe2149c57e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.337 226239 DEBUG oslo_concurrency.lockutils [req-44bac796-118b-4938-b4a1-ecd4105d830d req-0c5d1145-a49e-45b5-9d20-cbe2149c57e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.338 226239 DEBUG oslo_concurrency.lockutils [req-44bac796-118b-4938-b4a1-ecd4105d830d req-0c5d1145-a49e-45b5-9d20-cbe2149c57e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.338 226239 DEBUG nova.compute.manager [req-44bac796-118b-4938-b4a1-ecd4105d830d req-0c5d1145-a49e-45b5-9d20-cbe2149c57e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] No waiting events found dispatching network-vif-unplugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.338 226239 DEBUG nova.compute.manager [req-44bac796-118b-4938-b4a1-ecd4105d830d req-0c5d1145-a49e-45b5-9d20-cbe2149c57e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-vif-unplugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:59:49 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:49Z|00205|binding|INFO|Releasing lport b33af60b-01fb-4204-b1d7-f9b1d79e127d from this chassis (sb_readonly=0)
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.658 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0-userdata-shm.mount: Deactivated successfully.
Jan 31 02:59:49 np0005603623 systemd[1]: var-lib-containers-storage-overlay-9c2c79f1a2af293e75ccf2b77428b6a4eb56e13e6d6232d63bc1964c08a338a9-merged.mount: Deactivated successfully.
Jan 31 02:59:49 np0005603623 ovn_controller[133449]: 2026-01-31T07:59:49Z|00206|binding|INFO|Releasing lport b33af60b-01fb-4204-b1d7-f9b1d79e127d from this chassis (sb_readonly=0)
Jan 31 02:59:49 np0005603623 nova_compute[226235]: 2026-01-31 07:59:49.722 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603623 podman[250217]: 2026-01-31 07:59:49.987730643 +0000 UTC m=+0.874867984 container cleanup 9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:59:49 np0005603623 systemd[1]: libpod-conmon-9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0.scope: Deactivated successfully.
Jan 31 02:59:50 np0005603623 nova_compute[226235]: 2026-01-31 07:59:50.153 226239 DEBUG nova.network.neutron [req-83e6490f-ba5b-41ee-988e-ec6a44e0a82b req-d7b159ed-b915-40d7-80ef-c2d23e5a2dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Updated VIF entry in instance network info cache for port 392355e0-133c-4f74-b0f0-dc74eb8c8416. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:59:50 np0005603623 nova_compute[226235]: 2026-01-31 07:59:50.153 226239 DEBUG nova.network.neutron [req-83e6490f-ba5b-41ee-988e-ec6a44e0a82b req-d7b159ed-b915-40d7-80ef-c2d23e5a2dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Updating instance_info_cache with network_info: [{"id": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "address": "fa:16:3e:98:a5:39", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392355e0-13", "ovs_interfaceid": "392355e0-133c-4f74-b0f0-dc74eb8c8416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:50 np0005603623 nova_compute[226235]: 2026-01-31 07:59:50.182 226239 DEBUG oslo_concurrency.lockutils [req-83e6490f-ba5b-41ee-988e-ec6a44e0a82b req-d7b159ed-b915-40d7-80ef-c2d23e5a2dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-add62d62-47fb-454d-aec1-d5bb6be9f1e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:59:50 np0005603623 podman[250278]: 2026-01-31 07:59:50.492245664 +0000 UTC m=+0.485881778 container remove 9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 02:59:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:50.496 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[81d51226-0a07-40e8-8eef-133c9d227357]: (4, ('Sat Jan 31 07:59:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 (9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0)\n9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0\nSat Jan 31 07:59:49 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 (9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0)\n9ce13ef9de7ee23e1a2d419a65c83e33087d32ea23234be5f6bcefcbf1477db0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:50.498 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[72e289d2-bf0b-4e06-b798-f6733b8cd88d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:50.499 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92b7a3d2-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:50 np0005603623 nova_compute[226235]: 2026-01-31 07:59:50.500 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:50 np0005603623 kernel: tap92b7a3d2-90: left promiscuous mode
Jan 31 02:59:50 np0005603623 nova_compute[226235]: 2026-01-31 07:59:50.506 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:50.508 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c06c3876-0de9-45c6-bf5e-6cb80fa8c1c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:50.526 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8f78ec57-a554-490b-ba31-acfa16a61719]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:50.527 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[22fa9ef0-b760-46b1-a51a-7620c16a7595]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:50.541 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f580cd70-25da-46f7-b9e8-ef56e2182c43]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552265, 'reachable_time': 25563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250294, 'error': None, 'target': 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:50.543 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:59:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 07:59:50.544 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[72598599-f468-49ee-9616-99b245322aea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:50 np0005603623 systemd[1]: run-netns-ovnmeta\x2d92b7a3d2\x2d99de\x2d4036\x2db28b\x2d98f77dab6a25.mount: Deactivated successfully.
Jan 31 02:59:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:50.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:51.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:51 np0005603623 nova_compute[226235]: 2026-01-31 07:59:51.496 226239 DEBUG nova.compute.manager [req-e0c2b364-c0f1-4a89-ba65-907ed10ae3f8 req-49604f4a-1a03-4d76-9924-703fe50d2d74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:51 np0005603623 nova_compute[226235]: 2026-01-31 07:59:51.496 226239 DEBUG oslo_concurrency.lockutils [req-e0c2b364-c0f1-4a89-ba65-907ed10ae3f8 req-49604f4a-1a03-4d76-9924-703fe50d2d74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:51 np0005603623 nova_compute[226235]: 2026-01-31 07:59:51.497 226239 DEBUG oslo_concurrency.lockutils [req-e0c2b364-c0f1-4a89-ba65-907ed10ae3f8 req-49604f4a-1a03-4d76-9924-703fe50d2d74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:51 np0005603623 nova_compute[226235]: 2026-01-31 07:59:51.497 226239 DEBUG oslo_concurrency.lockutils [req-e0c2b364-c0f1-4a89-ba65-907ed10ae3f8 req-49604f4a-1a03-4d76-9924-703fe50d2d74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:51 np0005603623 nova_compute[226235]: 2026-01-31 07:59:51.497 226239 DEBUG nova.compute.manager [req-e0c2b364-c0f1-4a89-ba65-907ed10ae3f8 req-49604f4a-1a03-4d76-9924-703fe50d2d74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] No waiting events found dispatching network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:51 np0005603623 nova_compute[226235]: 2026-01-31 07:59:51.497 226239 WARNING nova.compute.manager [req-e0c2b364-c0f1-4a89-ba65-907ed10ae3f8 req-49604f4a-1a03-4d76-9924-703fe50d2d74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received unexpected event network-vif-plugged-392355e0-133c-4f74-b0f0-dc74eb8c8416 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:59:52 np0005603623 nova_compute[226235]: 2026-01-31 07:59:52.146 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:52.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:59:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:53.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:59:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e167 e167: 3 total, 3 up, 3 in
Jan 31 02:59:54 np0005603623 nova_compute[226235]: 2026-01-31 07:59:54.223 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:59:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:54.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:59:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:55.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:55 np0005603623 nova_compute[226235]: 2026-01-31 07:59:55.590 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846380.5884314, ba72fe35-90dd-4806-9195-8a8ff81ae9f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:59:55 np0005603623 nova_compute[226235]: 2026-01-31 07:59:55.591 226239 INFO nova.compute.manager [-] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:59:55 np0005603623 nova_compute[226235]: 2026-01-31 07:59:55.658 226239 DEBUG nova.compute.manager [None req-6b240b61-b469-4f47-a4be-783cc9331149 - - - - - -] [instance: ba72fe35-90dd-4806-9195-8a8ff81ae9f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:55 np0005603623 nova_compute[226235]: 2026-01-31 07:59:55.877 226239 INFO nova.virt.libvirt.driver [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Deleting instance files /var/lib/nova/instances/add62d62-47fb-454d-aec1-d5bb6be9f1e6_del#033[00m
Jan 31 02:59:55 np0005603623 nova_compute[226235]: 2026-01-31 07:59:55.878 226239 INFO nova.virt.libvirt.driver [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Deletion of /var/lib/nova/instances/add62d62-47fb-454d-aec1-d5bb6be9f1e6_del complete#033[00m
Jan 31 02:59:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:56 np0005603623 nova_compute[226235]: 2026-01-31 07:59:56.085 226239 INFO nova.compute.manager [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Took 7.13 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:59:56 np0005603623 nova_compute[226235]: 2026-01-31 07:59:56.086 226239 DEBUG oslo.service.loopingcall [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:59:56 np0005603623 nova_compute[226235]: 2026-01-31 07:59:56.086 226239 DEBUG nova.compute.manager [-] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:59:56 np0005603623 nova_compute[226235]: 2026-01-31 07:59:56.086 226239 DEBUG nova.network.neutron [-] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:59:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:56.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:57 np0005603623 nova_compute[226235]: 2026-01-31 07:59:57.148 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:57 np0005603623 nova_compute[226235]: 2026-01-31 07:59:57.169 226239 DEBUG nova.network.neutron [-] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:57.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:57 np0005603623 nova_compute[226235]: 2026-01-31 07:59:57.240 226239 INFO nova.compute.manager [-] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Took 1.15 seconds to deallocate network for instance.#033[00m
Jan 31 02:59:57 np0005603623 nova_compute[226235]: 2026-01-31 07:59:57.354 226239 DEBUG nova.compute.manager [req-e0ffb47c-7592-47a8-8b5c-7e9f799606ef req-6b521e3d-5d33-4a05-9672-7f8d6516a7ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Received event network-vif-deleted-392355e0-133c-4f74-b0f0-dc74eb8c8416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:57 np0005603623 nova_compute[226235]: 2026-01-31 07:59:57.356 226239 DEBUG oslo_concurrency.lockutils [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:57 np0005603623 nova_compute[226235]: 2026-01-31 07:59:57.356 226239 DEBUG oslo_concurrency.lockutils [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:57 np0005603623 nova_compute[226235]: 2026-01-31 07:59:57.426 226239 DEBUG oslo_concurrency.processutils [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:59:57 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3617929998' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:59:57 np0005603623 nova_compute[226235]: 2026-01-31 07:59:57.986 226239 DEBUG oslo_concurrency.processutils [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:57 np0005603623 nova_compute[226235]: 2026-01-31 07:59:57.991 226239 DEBUG nova.compute.provider_tree [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:59:58 np0005603623 nova_compute[226235]: 2026-01-31 07:59:58.073 226239 DEBUG nova.scheduler.client.report [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:59:58 np0005603623 nova_compute[226235]: 2026-01-31 07:59:58.112 226239 DEBUG oslo_concurrency.lockutils [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.756s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:58 np0005603623 nova_compute[226235]: 2026-01-31 07:59:58.221 226239 INFO nova.scheduler.client.report [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Deleted allocations for instance add62d62-47fb-454d-aec1-d5bb6be9f1e6#033[00m
Jan 31 02:59:58 np0005603623 nova_compute[226235]: 2026-01-31 07:59:58.455 226239 DEBUG oslo_concurrency.lockutils [None req-0c54b1c7-78c0-45f3-85a2-356aded6fbb5 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "add62d62-47fb-454d-aec1-d5bb6be9f1e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.506s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:58.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 02:59:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:59.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:59 np0005603623 nova_compute[226235]: 2026-01-31 07:59:59.224 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:00 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:00:00 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 03:00:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:00.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:00:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:01.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:00:01 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:00:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e168 e168: 3 total, 3 up, 3 in
Jan 31 03:00:02 np0005603623 nova_compute[226235]: 2026-01-31 08:00:02.150 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e169 e169: 3 total, 3 up, 3 in
Jan 31 03:00:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:00:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:02.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:00:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:03.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:04 np0005603623 nova_compute[226235]: 2026-01-31 08:00:04.181 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846389.1801372, add62d62-47fb-454d-aec1-d5bb6be9f1e6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:00:04 np0005603623 nova_compute[226235]: 2026-01-31 08:00:04.181 226239 INFO nova.compute.manager [-] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:00:04 np0005603623 nova_compute[226235]: 2026-01-31 08:00:04.225 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:04 np0005603623 nova_compute[226235]: 2026-01-31 08:00:04.348 226239 DEBUG nova.compute.manager [None req-002a72af-9e3d-434e-8d3a-ff4b1e7845ff - - - - - -] [instance: add62d62-47fb-454d-aec1-d5bb6be9f1e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:00:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:04.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:00:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:05.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:00:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e170 e170: 3 total, 3 up, 3 in
Jan 31 03:00:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:06.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:06 np0005603623 podman[250426]: 2026-01-31 08:00:06.96231439 +0000 UTC m=+0.046915162 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:00:06 np0005603623 podman[250427]: 2026-01-31 08:00:06.981194943 +0000 UTC m=+0.066091245 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 03:00:07 np0005603623 nova_compute[226235]: 2026-01-31 08:00:07.151 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:07.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:08.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e171 e171: 3 total, 3 up, 3 in
Jan 31 03:00:09 np0005603623 nova_compute[226235]: 2026-01-31 08:00:09.227 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:00:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:09.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:00:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:00:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:10.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:00:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:11.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e172 e172: 3 total, 3 up, 3 in
Jan 31 03:00:12 np0005603623 nova_compute[226235]: 2026-01-31 08:00:12.152 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:12.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:13.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:14 np0005603623 nova_compute[226235]: 2026-01-31 08:00:14.229 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:00:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3099610238' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:00:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:00:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3099610238' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:00:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:14.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:15 np0005603623 nova_compute[226235]: 2026-01-31 08:00:15.157 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:15.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e172 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:00:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:16.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:00:17 np0005603623 nova_compute[226235]: 2026-01-31 08:00:17.155 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e173 e173: 3 total, 3 up, 3 in
Jan 31 03:00:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:17.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:18.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:19 np0005603623 nova_compute[226235]: 2026-01-31 08:00:19.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:19 np0005603623 nova_compute[226235]: 2026-01-31 08:00:19.167 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:19 np0005603623 nova_compute[226235]: 2026-01-31 08:00:19.167 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:19 np0005603623 nova_compute[226235]: 2026-01-31 08:00:19.231 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:19.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:20 np0005603623 nova_compute[226235]: 2026-01-31 08:00:20.168 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:20 np0005603623 nova_compute[226235]: 2026-01-31 08:00:20.169 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:20 np0005603623 nova_compute[226235]: 2026-01-31 08:00:20.214 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:20 np0005603623 nova_compute[226235]: 2026-01-31 08:00:20.215 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:20 np0005603623 nova_compute[226235]: 2026-01-31 08:00:20.215 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:20 np0005603623 nova_compute[226235]: 2026-01-31 08:00:20.216 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:00:20 np0005603623 nova_compute[226235]: 2026-01-31 08:00:20.216 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:00:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:20.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:00:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/929286127' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:00:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:21.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:21 np0005603623 nova_compute[226235]: 2026-01-31 08:00:21.246 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:00:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:21 np0005603623 nova_compute[226235]: 2026-01-31 08:00:21.371 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:00:21 np0005603623 nova_compute[226235]: 2026-01-31 08:00:21.372 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4712MB free_disk=20.988269805908203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:00:21 np0005603623 nova_compute[226235]: 2026-01-31 08:00:21.373 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:21 np0005603623 nova_compute[226235]: 2026-01-31 08:00:21.374 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:21 np0005603623 nova_compute[226235]: 2026-01-31 08:00:21.804 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:00:21 np0005603623 nova_compute[226235]: 2026-01-31 08:00:21.805 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:00:21 np0005603623 nova_compute[226235]: 2026-01-31 08:00:21.994 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.012 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.013 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.033 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.063 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.095 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.157 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:00:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3081557015' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.862 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.768s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.866 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.881 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:00:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:00:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:22.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.922 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.922 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.923 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:22 np0005603623 nova_compute[226235]: 2026-01-31 08:00:22.923 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:00:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:23.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:23 np0005603623 nova_compute[226235]: 2026-01-31 08:00:23.924 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:23 np0005603623 nova_compute[226235]: 2026-01-31 08:00:23.925 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:23 np0005603623 nova_compute[226235]: 2026-01-31 08:00:23.925 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:00:24 np0005603623 nova_compute[226235]: 2026-01-31 08:00:24.123 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:00:24 np0005603623 nova_compute[226235]: 2026-01-31 08:00:24.124 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:24 np0005603623 nova_compute[226235]: 2026-01-31 08:00:24.125 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:24 np0005603623 nova_compute[226235]: 2026-01-31 08:00:24.125 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:24 np0005603623 nova_compute[226235]: 2026-01-31 08:00:24.126 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:00:24 np0005603623 nova_compute[226235]: 2026-01-31 08:00:24.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:24 np0005603623 nova_compute[226235]: 2026-01-31 08:00:24.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:00:24 np0005603623 nova_compute[226235]: 2026-01-31 08:00:24.232 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:24 np0005603623 nova_compute[226235]: 2026-01-31 08:00:24.260 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:00:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:24.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:25.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:00:26.033 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:00:26 np0005603623 nova_compute[226235]: 2026-01-31 08:00:26.034 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:00:26.035 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:00:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:00:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:26.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:00:27 np0005603623 nova_compute[226235]: 2026-01-31 08:00:27.158 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:27.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:00:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:28.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:00:29 np0005603623 nova_compute[226235]: 2026-01-31 08:00:29.236 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:29.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e174 e174: 3 total, 3 up, 3 in
Jan 31 03:00:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:00:30.038 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:00:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:00:30.094 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:00:30.095 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:00:30.095 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:00:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:30.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:00:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:31.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:32 np0005603623 nova_compute[226235]: 2026-01-31 08:00:32.159 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:00:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:32.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:00:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:33.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:34 np0005603623 nova_compute[226235]: 2026-01-31 08:00:34.239 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:34.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:35.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e175 e175: 3 total, 3 up, 3 in
Jan 31 03:00:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:36.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:37 np0005603623 nova_compute[226235]: 2026-01-31 08:00:37.160 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e176 e176: 3 total, 3 up, 3 in
Jan 31 03:00:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:00:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:37.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:00:37 np0005603623 podman[250602]: 2026-01-31 08:00:37.555307919 +0000 UTC m=+0.045311992 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Jan 31 03:00:37 np0005603623 podman[250603]: 2026-01-31 08:00:37.581032666 +0000 UTC m=+0.066298649 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:00:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:38.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:39 np0005603623 nova_compute[226235]: 2026-01-31 08:00:39.243 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:39.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e177 e177: 3 total, 3 up, 3 in
Jan 31 03:00:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:40.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:41.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:42 np0005603623 nova_compute[226235]: 2026-01-31 08:00:42.162 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e178 e178: 3 total, 3 up, 3 in
Jan 31 03:00:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:42.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:43.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:44 np0005603623 nova_compute[226235]: 2026-01-31 08:00:44.246 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:44.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:45.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:00:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:46.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:00:47 np0005603623 nova_compute[226235]: 2026-01-31 08:00:47.045 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:47 np0005603623 nova_compute[226235]: 2026-01-31 08:00:47.164 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:00:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:47.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:00:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e179 e179: 3 total, 3 up, 3 in
Jan 31 03:00:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:48.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:49 np0005603623 nova_compute[226235]: 2026-01-31 08:00:49.249 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:49.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e180 e180: 3 total, 3 up, 3 in
Jan 31 03:00:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:00:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:50.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:00:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:51.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:52 np0005603623 nova_compute[226235]: 2026-01-31 08:00:52.204 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:52.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:53.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:00:53Z|00207|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:00:54 np0005603623 nova_compute[226235]: 2026-01-31 08:00:54.253 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:54.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:55.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:00:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:56.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:00:57 np0005603623 nova_compute[226235]: 2026-01-31 08:00:57.206 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e181 e181: 3 total, 3 up, 3 in
Jan 31 03:00:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:57.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:58.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:59 np0005603623 nova_compute[226235]: 2026-01-31 08:00:59.256 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:00:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:59.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:00.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:01.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:02 np0005603623 nova_compute[226235]: 2026-01-31 08:01:02.208 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:03.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:03.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:01:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:01:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:01:04 np0005603623 nova_compute[226235]: 2026-01-31 08:01:04.260 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:05.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:05.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:07 np0005603623 nova_compute[226235]: 2026-01-31 08:01:07.209 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:01:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:07.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:01:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:07.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:07 np0005603623 podman[250877]: 2026-01-31 08:01:07.960118475 +0000 UTC m=+0.053927693 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:01:07 np0005603623 podman[250878]: 2026-01-31 08:01:07.980795052 +0000 UTC m=+0.073864746 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:01:09 np0005603623 nova_compute[226235]: 2026-01-31 08:01:09.262 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:09.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:09.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:11.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:11.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:12 np0005603623 nova_compute[226235]: 2026-01-31 08:01:12.210 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:13.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:13.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:14 np0005603623 nova_compute[226235]: 2026-01-31 08:01:14.265 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:15 np0005603623 nova_compute[226235]: 2026-01-31 08:01:15.252 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:15.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:15.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:01:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:17 np0005603623 nova_compute[226235]: 2026-01-31 08:01:17.212 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:17.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:17.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:01:19 np0005603623 nova_compute[226235]: 2026-01-31 08:01:19.269 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:19.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:19.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.156 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.218 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.219 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.219 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.219 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.219 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:01:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:01:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4071406169' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.672 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.791 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.792 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4731MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.792 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.793 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.874 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.875 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:01:20 np0005603623 nova_compute[226235]: 2026-01-31 08:01:20.895 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:01:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:01:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1230746788' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:01:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:21.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:21 np0005603623 nova_compute[226235]: 2026-01-31 08:01:21.307 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:01:21 np0005603623 nova_compute[226235]: 2026-01-31 08:01:21.311 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:01:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:21 np0005603623 nova_compute[226235]: 2026-01-31 08:01:21.347 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:01:21 np0005603623 nova_compute[226235]: 2026-01-31 08:01:21.349 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:01:21 np0005603623 nova_compute[226235]: 2026-01-31 08:01:21.349 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:21.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:22 np0005603623 nova_compute[226235]: 2026-01-31 08:01:22.216 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:22 np0005603623 nova_compute[226235]: 2026-01-31 08:01:22.348 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:22 np0005603623 nova_compute[226235]: 2026-01-31 08:01:22.348 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:22 np0005603623 nova_compute[226235]: 2026-01-31 08:01:22.348 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:01:22 np0005603623 nova_compute[226235]: 2026-01-31 08:01:22.349 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:01:22 np0005603623 nova_compute[226235]: 2026-01-31 08:01:22.367 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:01:22 np0005603623 nova_compute[226235]: 2026-01-31 08:01:22.368 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:22 np0005603623 nova_compute[226235]: 2026-01-31 08:01:22.368 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:22 np0005603623 nova_compute[226235]: 2026-01-31 08:01:22.368 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:22 np0005603623 nova_compute[226235]: 2026-01-31 08:01:22.369 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:01:23 np0005603623 nova_compute[226235]: 2026-01-31 08:01:23.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:23.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:23.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:24 np0005603623 nova_compute[226235]: 2026-01-31 08:01:24.273 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:25.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:01:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:25.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:01:26.148 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:01:26.149 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:01:26 np0005603623 nova_compute[226235]: 2026-01-31 08:01:26.200 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:27 np0005603623 nova_compute[226235]: 2026-01-31 08:01:27.217 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:27.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:27.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:01:28.152 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:01:29 np0005603623 nova_compute[226235]: 2026-01-31 08:01:29.277 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:29.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:29.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:01:30.095 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:01:30.095 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:01:30.095 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:31.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:31.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:32 np0005603623 nova_compute[226235]: 2026-01-31 08:01:32.218 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:33.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:33.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:34 np0005603623 nova_compute[226235]: 2026-01-31 08:01:34.282 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:35.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:35.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:37 np0005603623 nova_compute[226235]: 2026-01-31 08:01:37.220 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:37.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:01:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:37.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:01:38 np0005603623 podman[251127]: 2026-01-31 08:01:38.954342022 +0000 UTC m=+0.053408065 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Jan 31 03:01:38 np0005603623 podman[251128]: 2026-01-31 08:01:38.977272071 +0000 UTC m=+0.074350132 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 03:01:39 np0005603623 nova_compute[226235]: 2026-01-31 08:01:39.284 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:01:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:39.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:01:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:39.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:41.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:41.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:42 np0005603623 nova_compute[226235]: 2026-01-31 08:01:42.222 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:01:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:43.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:01:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:43.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:44 np0005603623 nova_compute[226235]: 2026-01-31 08:01:44.288 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:45.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:45.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e182 e182: 3 total, 3 up, 3 in
Jan 31 03:01:47 np0005603623 nova_compute[226235]: 2026-01-31 08:01:47.224 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:47.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:47.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e183 e183: 3 total, 3 up, 3 in
Jan 31 03:01:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e184 e184: 3 total, 3 up, 3 in
Jan 31 03:01:49 np0005603623 nova_compute[226235]: 2026-01-31 08:01:49.291 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:49.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:49.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:51.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:51.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:52 np0005603623 nova_compute[226235]: 2026-01-31 08:01:52.226 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:53.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:01:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:53.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:01:54 np0005603623 nova_compute[226235]: 2026-01-31 08:01:54.294 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:55.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:55.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:56 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #47. Immutable memtables: 4.
Jan 31 03:01:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:57 np0005603623 nova_compute[226235]: 2026-01-31 08:01:57.226 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e185 e185: 3 total, 3 up, 3 in
Jan 31 03:01:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:57.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:01:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:57.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:01:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e186 e186: 3 total, 3 up, 3 in
Jan 31 03:01:59 np0005603623 nova_compute[226235]: 2026-01-31 08:01:59.297 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:01:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:59.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:01:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:01:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:59.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:01.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:01.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:02 np0005603623 nova_compute[226235]: 2026-01-31 08:02:02.229 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.519738) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846522519859, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2787, "num_deletes": 516, "total_data_size": 5778726, "memory_usage": 5848272, "flush_reason": "Manual Compaction"}
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846522560685, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 3794221, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31072, "largest_seqno": 33854, "table_properties": {"data_size": 3783169, "index_size": 6714, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3269, "raw_key_size": 26079, "raw_average_key_size": 20, "raw_value_size": 3758898, "raw_average_value_size": 2887, "num_data_blocks": 289, "num_entries": 1302, "num_filter_entries": 1302, "num_deletions": 516, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846327, "oldest_key_time": 1769846327, "file_creation_time": 1769846522, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 40984 microseconds, and 5893 cpu microseconds.
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.560733) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 3794221 bytes OK
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.560754) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.563441) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.563472) EVENT_LOG_v1 {"time_micros": 1769846522563467, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.563490) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 5765493, prev total WAL file size 5765493, number of live WAL files 2.
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.564258) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3705KB)], [60(9013KB)]
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846522564310, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 13024378, "oldest_snapshot_seqno": -1}
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 5753 keys, 10745464 bytes, temperature: kUnknown
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846522681664, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 10745464, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10704853, "index_size": 25141, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14405, "raw_key_size": 147896, "raw_average_key_size": 25, "raw_value_size": 10599427, "raw_average_value_size": 1842, "num_data_blocks": 1012, "num_entries": 5753, "num_filter_entries": 5753, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769846522, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.681887) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 10745464 bytes
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.683160) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 110.9 rd, 91.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 8.8 +0.0 blob) out(10.2 +0.0 blob), read-write-amplify(6.3) write-amplify(2.8) OK, records in: 6802, records dropped: 1049 output_compression: NoCompression
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.683178) EVENT_LOG_v1 {"time_micros": 1769846522683170, "job": 36, "event": "compaction_finished", "compaction_time_micros": 117421, "compaction_time_cpu_micros": 19703, "output_level": 6, "num_output_files": 1, "total_output_size": 10745464, "num_input_records": 6802, "num_output_records": 5753, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846522683662, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846522684625, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.564190) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.684770) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.684860) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.684863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.684865) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:02:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:02:02.684867) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:02:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:03.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:03.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:04 np0005603623 nova_compute[226235]: 2026-01-31 08:02:04.300 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:05.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:05.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:07 np0005603623 nova_compute[226235]: 2026-01-31 08:02:07.230 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e187 e187: 3 total, 3 up, 3 in
Jan 31 03:02:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:07.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:02:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:07.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:02:09 np0005603623 nova_compute[226235]: 2026-01-31 08:02:09.303 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:09.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:09.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:09 np0005603623 podman[251235]: 2026-01-31 08:02:09.968174764 +0000 UTC m=+0.060585620 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 31 03:02:09 np0005603623 podman[251234]: 2026-01-31 08:02:09.980296605 +0000 UTC m=+0.074782566 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 03:02:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:11.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:11.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:11.674 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:02:11 np0005603623 nova_compute[226235]: 2026-01-31 08:02:11.675 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:11.675 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:02:12 np0005603623 nova_compute[226235]: 2026-01-31 08:02:12.232 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:13.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:02:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:13.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:02:14 np0005603623 nova_compute[226235]: 2026-01-31 08:02:14.306 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:15.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:15.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:16.678 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:02:17 np0005603623 nova_compute[226235]: 2026-01-31 08:02:17.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:17 np0005603623 nova_compute[226235]: 2026-01-31 08:02:17.267 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:17.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:17.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:02:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:02:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:02:19 np0005603623 nova_compute[226235]: 2026-01-31 08:02:19.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:19 np0005603623 nova_compute[226235]: 2026-01-31 08:02:19.310 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:19.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:19.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.205 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.205 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.206 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.206 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.206 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:02:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/479867212' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.631 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.753 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.755 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4736MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.755 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.755 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.857 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.857 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:02:20 np0005603623 nova_compute[226235]: 2026-01-31 08:02:20.878 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:02:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2913698432' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:02:21 np0005603623 nova_compute[226235]: 2026-01-31 08:02:21.310 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:21 np0005603623 nova_compute[226235]: 2026-01-31 08:02:21.314 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:02:21 np0005603623 nova_compute[226235]: 2026-01-31 08:02:21.342 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:02:21 np0005603623 nova_compute[226235]: 2026-01-31 08:02:21.344 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:02:21 np0005603623 nova_compute[226235]: 2026-01-31 08:02:21.344 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:21.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:21.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:22 np0005603623 nova_compute[226235]: 2026-01-31 08:02:22.270 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:22 np0005603623 nova_compute[226235]: 2026-01-31 08:02:22.344 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:22 np0005603623 nova_compute[226235]: 2026-01-31 08:02:22.345 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:22 np0005603623 nova_compute[226235]: 2026-01-31 08:02:22.345 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:02:23 np0005603623 nova_compute[226235]: 2026-01-31 08:02:23.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:23 np0005603623 nova_compute[226235]: 2026-01-31 08:02:23.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:02:23 np0005603623 nova_compute[226235]: 2026-01-31 08:02:23.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:02:23 np0005603623 nova_compute[226235]: 2026-01-31 08:02:23.187 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:02:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:23.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:23.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:24 np0005603623 nova_compute[226235]: 2026-01-31 08:02:24.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:24 np0005603623 nova_compute[226235]: 2026-01-31 08:02:24.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:24 np0005603623 nova_compute[226235]: 2026-01-31 08:02:24.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:24 np0005603623 nova_compute[226235]: 2026-01-31 08:02:24.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:24 np0005603623 nova_compute[226235]: 2026-01-31 08:02:24.313 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:25.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:02:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6645 writes, 34K keys, 6645 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.03 MB/s#012Cumulative WAL: 6645 writes, 6645 syncs, 1.00 writes per sync, written: 0.07 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1597 writes, 7910 keys, 1597 commit groups, 1.0 writes per commit group, ingest: 16.21 MB, 0.03 MB/s#012Interval WAL: 1597 writes, 1597 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     37.7      1.10              0.10        18    0.061       0      0       0.0       0.0#012  L6      1/0   10.25 MB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   3.6     62.8     52.0      2.90              0.34        17    0.171     86K   9924       0.0       0.0#012 Sum      1/0   10.25 MB   0.0      0.2     0.0      0.1       0.2      0.1       0.0   4.6     45.5     48.1      4.00              0.45        35    0.114     86K   9924       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.8     45.6     46.8      1.14              0.10         8    0.142     25K   3110       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   0.0     62.8     52.0      2.90              0.34        17    0.171     86K   9924       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     37.8      1.10              0.10        17    0.065       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.041, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.19 GB write, 0.08 MB/s write, 0.18 GB read, 0.08 MB/s read, 4.0 seconds#012Interval compaction: 0.05 GB write, 0.09 MB/s write, 0.05 GB read, 0.09 MB/s read, 1.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 20.26 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.00018 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1154,19.57 MB,6.43839%) FilterBlock(35,248.55 KB,0.0798426%) IndexBlock(35,455.86 KB,0.146439%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 03:02:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:25.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:02:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:02:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e188 e188: 3 total, 3 up, 3 in
Jan 31 03:02:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e189 e189: 3 total, 3 up, 3 in
Jan 31 03:02:27 np0005603623 nova_compute[226235]: 2026-01-31 08:02:27.272 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:27.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:27.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e190 e190: 3 total, 3 up, 3 in
Jan 31 03:02:29 np0005603623 nova_compute[226235]: 2026-01-31 08:02:29.316 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:29.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:29.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:30.096 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:30.096 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:30.096 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:31.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:31.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:32 np0005603623 nova_compute[226235]: 2026-01-31 08:02:32.273 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:33.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:33.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:34 np0005603623 nova_compute[226235]: 2026-01-31 08:02:34.320 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:35.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:35.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e191 e191: 3 total, 3 up, 3 in
Jan 31 03:02:37 np0005603623 nova_compute[226235]: 2026-01-31 08:02:37.275 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e192 e192: 3 total, 3 up, 3 in
Jan 31 03:02:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:37.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:37.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:38 np0005603623 nova_compute[226235]: 2026-01-31 08:02:38.460 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Acquiring lock "4470f2e9-7e12-43b5-a9a5-f690f2509954" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:38 np0005603623 nova_compute[226235]: 2026-01-31 08:02:38.461 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:38 np0005603623 nova_compute[226235]: 2026-01-31 08:02:38.609 226239 DEBUG nova.compute.manager [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:02:38 np0005603623 nova_compute[226235]: 2026-01-31 08:02:38.994 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:38 np0005603623 nova_compute[226235]: 2026-01-31 08:02:38.995 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:39 np0005603623 nova_compute[226235]: 2026-01-31 08:02:39.004 226239 DEBUG nova.virt.hardware [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:02:39 np0005603623 nova_compute[226235]: 2026-01-31 08:02:39.005 226239 INFO nova.compute.claims [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:02:39 np0005603623 nova_compute[226235]: 2026-01-31 08:02:39.323 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:39.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:39 np0005603623 nova_compute[226235]: 2026-01-31 08:02:39.532 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:39.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 03:02:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:02:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1822226992' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:02:39 np0005603623 nova_compute[226235]: 2026-01-31 08:02:39.950 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:39 np0005603623 nova_compute[226235]: 2026-01-31 08:02:39.955 226239 DEBUG nova.compute.provider_tree [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.115 226239 DEBUG nova.scheduler.client.report [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.221 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.222 226239 DEBUG nova.compute.manager [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.425 226239 DEBUG nova.compute.manager [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.426 226239 DEBUG nova.network.neutron [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.466 226239 INFO nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.492 226239 DEBUG nova.compute.manager [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.588 226239 DEBUG nova.compute.manager [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.590 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.591 226239 INFO nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Creating image(s)#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.620 226239 DEBUG nova.storage.rbd_utils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] rbd image 4470f2e9-7e12-43b5-a9a5-f690f2509954_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.651 226239 DEBUG nova.storage.rbd_utils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] rbd image 4470f2e9-7e12-43b5-a9a5-f690f2509954_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.686 226239 DEBUG nova.storage.rbd_utils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] rbd image 4470f2e9-7e12-43b5-a9a5-f690f2509954_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.730 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.787 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.788 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.788 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.788 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.811 226239 DEBUG nova.storage.rbd_utils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] rbd image 4470f2e9-7e12-43b5-a9a5-f690f2509954_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.814 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4470f2e9-7e12-43b5-a9a5-f690f2509954_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:40 np0005603623 nova_compute[226235]: 2026-01-31 08:02:40.830 226239 DEBUG nova.policy [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97778ff629964356819ef34be55ca5a6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '03f24e162c6d454aa9e31d60b478001d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:02:40 np0005603623 podman[251733]: 2026-01-31 08:02:40.949104395 +0000 UTC m=+0.045265911 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:02:40 np0005603623 podman[251734]: 2026-01-31 08:02:40.993176146 +0000 UTC m=+0.087670629 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:02:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:41.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:41 np0005603623 nova_compute[226235]: 2026-01-31 08:02:41.597 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4470f2e9-7e12-43b5-a9a5-f690f2509954_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.783s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:41.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:41 np0005603623 nova_compute[226235]: 2026-01-31 08:02:41.685 226239 DEBUG nova.storage.rbd_utils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] resizing rbd image 4470f2e9-7e12-43b5-a9a5-f690f2509954_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:02:42 np0005603623 nova_compute[226235]: 2026-01-31 08:02:42.269 226239 DEBUG nova.network.neutron [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Successfully created port: 2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:02:42 np0005603623 nova_compute[226235]: 2026-01-31 08:02:42.277 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:42 np0005603623 nova_compute[226235]: 2026-01-31 08:02:42.977 226239 DEBUG nova.objects.instance [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lazy-loading 'migration_context' on Instance uuid 4470f2e9-7e12-43b5-a9a5-f690f2509954 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:02:43 np0005603623 nova_compute[226235]: 2026-01-31 08:02:43.010 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:02:43 np0005603623 nova_compute[226235]: 2026-01-31 08:02:43.011 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Ensure instance console log exists: /var/lib/nova/instances/4470f2e9-7e12-43b5-a9a5-f690f2509954/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:02:43 np0005603623 nova_compute[226235]: 2026-01-31 08:02:43.011 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:43 np0005603623 nova_compute[226235]: 2026-01-31 08:02:43.011 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:43 np0005603623 nova_compute[226235]: 2026-01-31 08:02:43.012 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:43.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:02:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:43.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:02:44 np0005603623 nova_compute[226235]: 2026-01-31 08:02:44.312 226239 DEBUG nova.network.neutron [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Successfully updated port: 2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:02:44 np0005603623 nova_compute[226235]: 2026-01-31 08:02:44.327 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:44 np0005603623 nova_compute[226235]: 2026-01-31 08:02:44.334 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Acquiring lock "refresh_cache-4470f2e9-7e12-43b5-a9a5-f690f2509954" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:02:44 np0005603623 nova_compute[226235]: 2026-01-31 08:02:44.334 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Acquired lock "refresh_cache-4470f2e9-7e12-43b5-a9a5-f690f2509954" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:02:44 np0005603623 nova_compute[226235]: 2026-01-31 08:02:44.335 226239 DEBUG nova.network.neutron [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:02:44 np0005603623 nova_compute[226235]: 2026-01-31 08:02:44.523 226239 DEBUG nova.compute.manager [req-9ebb2619-a90b-41a8-908e-90585b056968 req-ffc513eb-c428-4361-8dd1-1d6c29595214 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Received event network-changed-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:02:44 np0005603623 nova_compute[226235]: 2026-01-31 08:02:44.524 226239 DEBUG nova.compute.manager [req-9ebb2619-a90b-41a8-908e-90585b056968 req-ffc513eb-c428-4361-8dd1-1d6c29595214 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Refreshing instance network info cache due to event network-changed-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:02:44 np0005603623 nova_compute[226235]: 2026-01-31 08:02:44.524 226239 DEBUG oslo_concurrency.lockutils [req-9ebb2619-a90b-41a8-908e-90585b056968 req-ffc513eb-c428-4361-8dd1-1d6c29595214 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4470f2e9-7e12-43b5-a9a5-f690f2509954" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:02:44 np0005603623 nova_compute[226235]: 2026-01-31 08:02:44.686 226239 DEBUG nova.network.neutron [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:02:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:45.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:45.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.002 226239 DEBUG nova.network.neutron [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Updating instance_info_cache with network_info: [{"id": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "address": "fa:16:3e:7a:9d:eb", "network": {"id": "8f76a919-8420-45e2-a6e8-56654f40ec08", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-57295873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "03f24e162c6d454aa9e31d60b478001d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cb7cbe0-77", "ovs_interfaceid": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.131 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Releasing lock "refresh_cache-4470f2e9-7e12-43b5-a9a5-f690f2509954" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.131 226239 DEBUG nova.compute.manager [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Instance network_info: |[{"id": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "address": "fa:16:3e:7a:9d:eb", "network": {"id": "8f76a919-8420-45e2-a6e8-56654f40ec08", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-57295873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "03f24e162c6d454aa9e31d60b478001d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cb7cbe0-77", "ovs_interfaceid": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.132 226239 DEBUG oslo_concurrency.lockutils [req-9ebb2619-a90b-41a8-908e-90585b056968 req-ffc513eb-c428-4361-8dd1-1d6c29595214 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4470f2e9-7e12-43b5-a9a5-f690f2509954" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.132 226239 DEBUG nova.network.neutron [req-9ebb2619-a90b-41a8-908e-90585b056968 req-ffc513eb-c428-4361-8dd1-1d6c29595214 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Refreshing network info cache for port 2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.134 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Start _get_guest_xml network_info=[{"id": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "address": "fa:16:3e:7a:9d:eb", "network": {"id": "8f76a919-8420-45e2-a6e8-56654f40ec08", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-57295873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "03f24e162c6d454aa9e31d60b478001d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cb7cbe0-77", "ovs_interfaceid": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.138 226239 WARNING nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.146 226239 DEBUG nova.virt.libvirt.host [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.147 226239 DEBUG nova.virt.libvirt.host [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.153 226239 DEBUG nova.virt.libvirt.host [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.153 226239 DEBUG nova.virt.libvirt.host [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.155 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.155 226239 DEBUG nova.virt.hardware [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.155 226239 DEBUG nova.virt.hardware [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.155 226239 DEBUG nova.virt.hardware [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.156 226239 DEBUG nova.virt.hardware [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.156 226239 DEBUG nova.virt.hardware [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.156 226239 DEBUG nova.virt.hardware [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.156 226239 DEBUG nova.virt.hardware [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.157 226239 DEBUG nova.virt.hardware [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.157 226239 DEBUG nova.virt.hardware [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.157 226239 DEBUG nova.virt.hardware [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.157 226239 DEBUG nova.virt.hardware [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.160 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.279 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e193 e193: 3 total, 3 up, 3 in
Jan 31 03:02:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:47.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:02:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2056823326' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.610 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:47.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.634 226239 DEBUG nova.storage.rbd_utils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] rbd image 4470f2e9-7e12-43b5-a9a5-f690f2509954_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:02:47 np0005603623 nova_compute[226235]: 2026-01-31 08:02:47.637 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:02:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/272352408' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.042 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.044 226239 DEBUG nova.virt.libvirt.vif [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:02:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1556480914',display_name='tempest-ImagesOneServerTestJSON-server-1556480914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1556480914',id=53,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='03f24e162c6d454aa9e31d60b478001d',ramdisk_id='',reservation_id='r-i9hgf1b5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-77816411',owner_user_name='tempest-ImagesOneServ
erTestJSON-77816411-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:02:40Z,user_data=None,user_id='97778ff629964356819ef34be55ca5a6',uuid=4470f2e9-7e12-43b5-a9a5-f690f2509954,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "address": "fa:16:3e:7a:9d:eb", "network": {"id": "8f76a919-8420-45e2-a6e8-56654f40ec08", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-57295873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "03f24e162c6d454aa9e31d60b478001d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cb7cbe0-77", "ovs_interfaceid": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.044 226239 DEBUG nova.network.os_vif_util [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Converting VIF {"id": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "address": "fa:16:3e:7a:9d:eb", "network": {"id": "8f76a919-8420-45e2-a6e8-56654f40ec08", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-57295873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "03f24e162c6d454aa9e31d60b478001d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cb7cbe0-77", "ovs_interfaceid": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.045 226239 DEBUG nova.network.os_vif_util [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:9d:eb,bridge_name='br-int',has_traffic_filtering=True,id=2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0,network=Network(8f76a919-8420-45e2-a6e8-56654f40ec08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cb7cbe0-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.046 226239 DEBUG nova.objects.instance [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lazy-loading 'pci_devices' on Instance uuid 4470f2e9-7e12-43b5-a9a5-f690f2509954 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.072 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  <uuid>4470f2e9-7e12-43b5-a9a5-f690f2509954</uuid>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  <name>instance-00000035</name>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <nova:name>tempest-ImagesOneServerTestJSON-server-1556480914</nova:name>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:02:47</nova:creationTime>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <nova:user uuid="97778ff629964356819ef34be55ca5a6">tempest-ImagesOneServerTestJSON-77816411-project-member</nova:user>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <nova:project uuid="03f24e162c6d454aa9e31d60b478001d">tempest-ImagesOneServerTestJSON-77816411</nova:project>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <nova:port uuid="2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <entry name="serial">4470f2e9-7e12-43b5-a9a5-f690f2509954</entry>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <entry name="uuid">4470f2e9-7e12-43b5-a9a5-f690f2509954</entry>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4470f2e9-7e12-43b5-a9a5-f690f2509954_disk">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4470f2e9-7e12-43b5-a9a5-f690f2509954_disk.config">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:7a:9d:eb"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <target dev="tap2cb7cbe0-77"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/4470f2e9-7e12-43b5-a9a5-f690f2509954/console.log" append="off"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:02:48 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:02:48 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:02:48 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:02:48 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.073 226239 DEBUG nova.compute.manager [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Preparing to wait for external event network-vif-plugged-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.074 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Acquiring lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.074 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.074 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.075 226239 DEBUG nova.virt.libvirt.vif [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:02:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1556480914',display_name='tempest-ImagesOneServerTestJSON-server-1556480914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1556480914',id=53,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='03f24e162c6d454aa9e31d60b478001d',ramdisk_id='',reservation_id='r-i9hgf1b5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-77816411',owner_user_name='tempest-ImagesOneServerTestJSON-77816411-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:02:40Z,user_data=None,user_id='97778ff629964356819ef34be55ca5a6',uuid=4470f2e9-7e12-43b5-a9a5-f690f2509954,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "address": "fa:16:3e:7a:9d:eb", "network": {"id": "8f76a919-8420-45e2-a6e8-56654f40ec08", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-57295873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "03f24e162c6d454aa9e31d60b478001d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cb7cbe0-77", "ovs_interfaceid": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.075 226239 DEBUG nova.network.os_vif_util [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Converting VIF {"id": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "address": "fa:16:3e:7a:9d:eb", "network": {"id": "8f76a919-8420-45e2-a6e8-56654f40ec08", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-57295873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "03f24e162c6d454aa9e31d60b478001d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cb7cbe0-77", "ovs_interfaceid": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.076 226239 DEBUG nova.network.os_vif_util [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:9d:eb,bridge_name='br-int',has_traffic_filtering=True,id=2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0,network=Network(8f76a919-8420-45e2-a6e8-56654f40ec08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cb7cbe0-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.076 226239 DEBUG os_vif [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:9d:eb,bridge_name='br-int',has_traffic_filtering=True,id=2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0,network=Network(8f76a919-8420-45e2-a6e8-56654f40ec08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cb7cbe0-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.077 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.077 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.078 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.080 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.080 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2cb7cbe0-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.081 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2cb7cbe0-77, col_values=(('external_ids', {'iface-id': '2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7a:9d:eb', 'vm-uuid': '4470f2e9-7e12-43b5-a9a5-f690f2509954'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.082 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:48 np0005603623 NetworkManager[48970]: <info>  [1769846568.0839] manager: (tap2cb7cbe0-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.085 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.088 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.089 226239 INFO os_vif [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:9d:eb,bridge_name='br-int',has_traffic_filtering=True,id=2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0,network=Network(8f76a919-8420-45e2-a6e8-56654f40ec08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cb7cbe0-77')#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.169 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.170 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.170 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] No VIF found with MAC fa:16:3e:7a:9d:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.171 226239 INFO nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Using config drive#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.201 226239 DEBUG nova.storage.rbd_utils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] rbd image 4470f2e9-7e12-43b5-a9a5-f690f2509954_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.761 226239 INFO nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Creating config drive at /var/lib/nova/instances/4470f2e9-7e12-43b5-a9a5-f690f2509954/disk.config#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.765 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4470f2e9-7e12-43b5-a9a5-f690f2509954/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpuldcwuju execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.885 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4470f2e9-7e12-43b5-a9a5-f690f2509954/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpuldcwuju" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.913 226239 DEBUG nova.storage.rbd_utils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] rbd image 4470f2e9-7e12-43b5-a9a5-f690f2509954_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:02:48 np0005603623 nova_compute[226235]: 2026-01-31 08:02:48.917 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4470f2e9-7e12-43b5-a9a5-f690f2509954/disk.config 4470f2e9-7e12-43b5-a9a5-f690f2509954_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:49.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:49.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:49 np0005603623 nova_compute[226235]: 2026-01-31 08:02:49.860 226239 DEBUG nova.network.neutron [req-9ebb2619-a90b-41a8-908e-90585b056968 req-ffc513eb-c428-4361-8dd1-1d6c29595214 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Updated VIF entry in instance network info cache for port 2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:02:49 np0005603623 nova_compute[226235]: 2026-01-31 08:02:49.860 226239 DEBUG nova.network.neutron [req-9ebb2619-a90b-41a8-908e-90585b056968 req-ffc513eb-c428-4361-8dd1-1d6c29595214 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Updating instance_info_cache with network_info: [{"id": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "address": "fa:16:3e:7a:9d:eb", "network": {"id": "8f76a919-8420-45e2-a6e8-56654f40ec08", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-57295873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "03f24e162c6d454aa9e31d60b478001d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cb7cbe0-77", "ovs_interfaceid": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:02:49 np0005603623 nova_compute[226235]: 2026-01-31 08:02:49.895 226239 DEBUG oslo_concurrency.lockutils [req-9ebb2619-a90b-41a8-908e-90585b056968 req-ffc513eb-c428-4361-8dd1-1d6c29595214 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4470f2e9-7e12-43b5-a9a5-f690f2509954" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:02:50 np0005603623 nova_compute[226235]: 2026-01-31 08:02:50.345 226239 DEBUG oslo_concurrency.processutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4470f2e9-7e12-43b5-a9a5-f690f2509954/disk.config 4470f2e9-7e12-43b5-a9a5-f690f2509954_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:50 np0005603623 nova_compute[226235]: 2026-01-31 08:02:50.345 226239 INFO nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Deleting local config drive /var/lib/nova/instances/4470f2e9-7e12-43b5-a9a5-f690f2509954/disk.config because it was imported into RBD.#033[00m
Jan 31 03:02:50 np0005603623 kernel: tap2cb7cbe0-77: entered promiscuous mode
Jan 31 03:02:50 np0005603623 NetworkManager[48970]: <info>  [1769846570.3834] manager: (tap2cb7cbe0-77): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Jan 31 03:02:50 np0005603623 nova_compute[226235]: 2026-01-31 08:02:50.383 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:02:50Z|00208|binding|INFO|Claiming lport 2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 for this chassis.
Jan 31 03:02:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:02:50Z|00209|binding|INFO|2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0: Claiming fa:16:3e:7a:9d:eb 10.100.0.6
Jan 31 03:02:50 np0005603623 nova_compute[226235]: 2026-01-31 08:02:50.386 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:50 np0005603623 nova_compute[226235]: 2026-01-31 08:02:50.388 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:50 np0005603623 nova_compute[226235]: 2026-01-31 08:02:50.391 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.401 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:9d:eb 10.100.0.6'], port_security=['fa:16:3e:7a:9d:eb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4470f2e9-7e12-43b5-a9a5-f690f2509954', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f76a919-8420-45e2-a6e8-56654f40ec08', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '03f24e162c6d454aa9e31d60b478001d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9a69a75b-5ce1-4bb0-8555-2e322ef267ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=043eb9e9-d351-4602-ba73-ce358ef795c2, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.402 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 in datapath 8f76a919-8420-45e2-a6e8-56654f40ec08 bound to our chassis#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.403 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8f76a919-8420-45e2-a6e8-56654f40ec08#033[00m
Jan 31 03:02:50 np0005603623 systemd-machined[194379]: New machine qemu-28-instance-00000035.
Jan 31 03:02:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:02:50Z|00210|binding|INFO|Setting lport 2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 ovn-installed in OVS
Jan 31 03:02:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:02:50Z|00211|binding|INFO|Setting lport 2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 up in Southbound
Jan 31 03:02:50 np0005603623 nova_compute[226235]: 2026-01-31 08:02:50.413 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:50 np0005603623 systemd[1]: Started Virtual Machine qemu-28-instance-00000035.
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.412 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d2be86-d12c-4def-86cf-c62d0a074d16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.414 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8f76a919-81 in ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.415 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8f76a919-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.415 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dd71c9ed-1986-4f07-8d82-071515b3e197]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 systemd-udevd[251994]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.416 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0a0295c0-2914-4bda-ba24-d69a936fc9bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 NetworkManager[48970]: <info>  [1769846570.4276] device (tap2cb7cbe0-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.426 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[c17f8d65-6ca1-4cd9-a5df-0a24d36e61de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 NetworkManager[48970]: <info>  [1769846570.4286] device (tap2cb7cbe0-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.436 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[de40363d-8888-41c6-9112-2f954c2ad967]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.457 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3418dcc5-23a0-487d-ac44-1263912e5461]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.462 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3491ac8a-8c01-47fc-ae63-e966b3bb00ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 NetworkManager[48970]: <info>  [1769846570.4634] manager: (tap8f76a919-80): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.487 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9675f9-48a7-43a5-995e-b7cc03025430]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.489 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[be8668d7-6567-45db-bd87-b6edbd291e09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 NetworkManager[48970]: <info>  [1769846570.5052] device (tap8f76a919-80): carrier: link connected
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.509 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[26304c7c-63d2-4891-9f18-c46001a5b772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.521 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[100a1dca-9c59-4cd9-9e15-a7603934540e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f76a919-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:f7:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571093, 'reachable_time': 43892, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252026, 'error': None, 'target': 'ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.532 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[00725e9f-ed8a-4bc6-a0cc-434956c1fbcb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe89:f773'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 571093, 'tstamp': 571093}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252027, 'error': None, 'target': 'ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.544 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[550d8dc5-c1cc-41ac-9e0f-5b51d13bb4de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f76a919-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:f7:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571093, 'reachable_time': 43892, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252028, 'error': None, 'target': 'ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.562 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6a4dcd30-5f05-409f-8ef4-129cf8d7d4c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.591 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4f2fffb2-3bed-4e75-928a-2c2cd27fd196]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.593 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f76a919-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.593 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.593 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f76a919-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:02:50 np0005603623 nova_compute[226235]: 2026-01-31 08:02:50.595 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:50 np0005603623 NetworkManager[48970]: <info>  [1769846570.5958] manager: (tap8f76a919-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Jan 31 03:02:50 np0005603623 kernel: tap8f76a919-80: entered promiscuous mode
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.599 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8f76a919-80, col_values=(('external_ids', {'iface-id': '80264831-feba-4158-a461-cbdcb057822c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:02:50 np0005603623 nova_compute[226235]: 2026-01-31 08:02:50.600 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:02:50Z|00212|binding|INFO|Releasing lport 80264831-feba-4158-a461-cbdcb057822c from this chassis (sb_readonly=0)
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.602 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8f76a919-8420-45e2-a6e8-56654f40ec08.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8f76a919-8420-45e2-a6e8-56654f40ec08.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.602 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a5369d-6bcf-42f3-bddb-66672f4e7c5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.603 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-8f76a919-8420-45e2-a6e8-56654f40ec08
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/8f76a919-8420-45e2-a6e8-56654f40ec08.pid.haproxy
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 8f76a919-8420-45e2-a6e8-56654f40ec08
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:02:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:02:50.604 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08', 'env', 'PROCESS_TAG=haproxy-8f76a919-8420-45e2-a6e8-56654f40ec08', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8f76a919-8420-45e2-a6e8-56654f40ec08.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:02:50 np0005603623 nova_compute[226235]: 2026-01-31 08:02:50.606 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:50 np0005603623 podman[252093]: 2026-01-31 08:02:50.885487147 +0000 UTC m=+0.022374662 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:02:51 np0005603623 nova_compute[226235]: 2026-01-31 08:02:51.026 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846571.0261152, 4470f2e9-7e12-43b5-a9a5-f690f2509954 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:02:51 np0005603623 nova_compute[226235]: 2026-01-31 08:02:51.026 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] VM Started (Lifecycle Event)#033[00m
Jan 31 03:02:51 np0005603623 nova_compute[226235]: 2026-01-31 08:02:51.072 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:02:51 np0005603623 nova_compute[226235]: 2026-01-31 08:02:51.075 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846571.0262144, 4470f2e9-7e12-43b5-a9a5-f690f2509954 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:02:51 np0005603623 nova_compute[226235]: 2026-01-31 08:02:51.076 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:02:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:02:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 22K writes, 100K keys, 22K commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.04 MB/s#012Cumulative WAL: 22K writes, 6619 syncs, 3.41 writes per sync, written: 0.10 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 9723 writes, 42K keys, 9723 commit groups, 1.0 writes per commit group, ingest: 48.75 MB, 0.08 MB/s#012Interval WAL: 9723 writes, 3312 syncs, 2.94 writes per sync, written: 0.05 GB, 0.08 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:02:51 np0005603623 nova_compute[226235]: 2026-01-31 08:02:51.161 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:02:51 np0005603623 nova_compute[226235]: 2026-01-31 08:02:51.164 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:02:51 np0005603623 nova_compute[226235]: 2026-01-31 08:02:51.229 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:02:51 np0005603623 podman[252093]: 2026-01-31 08:02:51.277804229 +0000 UTC m=+0.414691724 container create f49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 03:02:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:51.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:51 np0005603623 systemd[1]: Started libpod-conmon-f49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05.scope.
Jan 31 03:02:51 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:02:51 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01ce683dee7e76791635b6183f2b0875cfb819d5ffdcdf197e35c54d73df2e84/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:02:51 np0005603623 podman[252093]: 2026-01-31 08:02:51.551062318 +0000 UTC m=+0.687949833 container init f49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:02:51 np0005603623 podman[252093]: 2026-01-31 08:02:51.555748705 +0000 UTC m=+0.692636200 container start f49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 03:02:51 np0005603623 neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08[252114]: [NOTICE]   (252118) : New worker (252120) forked
Jan 31 03:02:51 np0005603623 neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08[252114]: [NOTICE]   (252118) : Loading success.
Jan 31 03:02:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:51.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.281 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.292 226239 DEBUG nova.compute.manager [req-c66b9976-76f8-47f4-b1aa-03029e7a7504 req-eff6894b-2512-416b-b2fc-56c84a15b4c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Received event network-vif-plugged-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.292 226239 DEBUG oslo_concurrency.lockutils [req-c66b9976-76f8-47f4-b1aa-03029e7a7504 req-eff6894b-2512-416b-b2fc-56c84a15b4c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.292 226239 DEBUG oslo_concurrency.lockutils [req-c66b9976-76f8-47f4-b1aa-03029e7a7504 req-eff6894b-2512-416b-b2fc-56c84a15b4c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.292 226239 DEBUG oslo_concurrency.lockutils [req-c66b9976-76f8-47f4-b1aa-03029e7a7504 req-eff6894b-2512-416b-b2fc-56c84a15b4c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.293 226239 DEBUG nova.compute.manager [req-c66b9976-76f8-47f4-b1aa-03029e7a7504 req-eff6894b-2512-416b-b2fc-56c84a15b4c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Processing event network-vif-plugged-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.293 226239 DEBUG nova.compute.manager [req-c66b9976-76f8-47f4-b1aa-03029e7a7504 req-eff6894b-2512-416b-b2fc-56c84a15b4c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Received event network-vif-plugged-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.293 226239 DEBUG oslo_concurrency.lockutils [req-c66b9976-76f8-47f4-b1aa-03029e7a7504 req-eff6894b-2512-416b-b2fc-56c84a15b4c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.293 226239 DEBUG oslo_concurrency.lockutils [req-c66b9976-76f8-47f4-b1aa-03029e7a7504 req-eff6894b-2512-416b-b2fc-56c84a15b4c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.293 226239 DEBUG oslo_concurrency.lockutils [req-c66b9976-76f8-47f4-b1aa-03029e7a7504 req-eff6894b-2512-416b-b2fc-56c84a15b4c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.294 226239 DEBUG nova.compute.manager [req-c66b9976-76f8-47f4-b1aa-03029e7a7504 req-eff6894b-2512-416b-b2fc-56c84a15b4c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] No waiting events found dispatching network-vif-plugged-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.294 226239 WARNING nova.compute.manager [req-c66b9976-76f8-47f4-b1aa-03029e7a7504 req-eff6894b-2512-416b-b2fc-56c84a15b4c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Received unexpected event network-vif-plugged-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.294 226239 DEBUG nova.compute.manager [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.298 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846572.2979872, 4470f2e9-7e12-43b5-a9a5-f690f2509954 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.298 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.300 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.303 226239 INFO nova.virt.libvirt.driver [-] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Instance spawned successfully.#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.303 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.341 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.343 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.349 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.350 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.350 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.350 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.351 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.351 226239 DEBUG nova.virt.libvirt.driver [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.380 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.479 226239 INFO nova.compute.manager [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Took 11.89 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.479 226239 DEBUG nova.compute.manager [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.577 226239 INFO nova.compute.manager [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Took 13.63 seconds to build instance.#033[00m
Jan 31 03:02:52 np0005603623 nova_compute[226235]: 2026-01-31 08:02:52.627 226239 DEBUG oslo_concurrency.lockutils [None req-bdb0c09c-17b1-41a7-900d-6c16cdf89b52 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:53 np0005603623 nova_compute[226235]: 2026-01-31 08:02:53.083 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:53.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:53.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:55.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:55.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:56 np0005603623 nova_compute[226235]: 2026-01-31 08:02:56.040 226239 DEBUG nova.compute.manager [None req-32d301d7-4f62-403e-8df5-aae32761c8b2 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:02:56 np0005603623 nova_compute[226235]: 2026-01-31 08:02:56.322 226239 INFO nova.compute.manager [None req-32d301d7-4f62-403e-8df5-aae32761c8b2 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] instance snapshotting#033[00m
Jan 31 03:02:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:57 np0005603623 nova_compute[226235]: 2026-01-31 08:02:57.050 226239 INFO nova.virt.libvirt.driver [None req-32d301d7-4f62-403e-8df5-aae32761c8b2 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Beginning live snapshot process#033[00m
Jan 31 03:02:57 np0005603623 nova_compute[226235]: 2026-01-31 08:02:57.275 226239 DEBUG nova.virt.libvirt.imagebackend [None req-32d301d7-4f62-403e-8df5-aae32761c8b2 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 31 03:02:57 np0005603623 nova_compute[226235]: 2026-01-31 08:02:57.282 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:57.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:57.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:02:57 np0005603623 nova_compute[226235]: 2026-01-31 08:02:57.671 226239 DEBUG nova.storage.rbd_utils [None req-32d301d7-4f62-403e-8df5-aae32761c8b2 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] creating snapshot(fcc08767ae79484f966b79543e70df51) on rbd image(4470f2e9-7e12-43b5-a9a5-f690f2509954_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:02:58 np0005603623 nova_compute[226235]: 2026-01-31 08:02:58.085 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e194 e194: 3 total, 3 up, 3 in
Jan 31 03:02:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:59.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:02:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:02:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:59.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:00 np0005603623 nova_compute[226235]: 2026-01-31 08:03:00.356 226239 DEBUG nova.storage.rbd_utils [None req-32d301d7-4f62-403e-8df5-aae32761c8b2 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] cloning vms/4470f2e9-7e12-43b5-a9a5-f690f2509954_disk@fcc08767ae79484f966b79543e70df51 to images/3676ca5a-2e55-4575-bc3f-672af49a103b clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:03:00 np0005603623 nova_compute[226235]: 2026-01-31 08:03:00.849 226239 DEBUG nova.storage.rbd_utils [None req-32d301d7-4f62-403e-8df5-aae32761c8b2 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] flattening images/3676ca5a-2e55-4575-bc3f-672af49a103b flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 03:03:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:01.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:03:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:01.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:03:02 np0005603623 nova_compute[226235]: 2026-01-31 08:03:02.314 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:03 np0005603623 nova_compute[226235]: 2026-01-31 08:03:03.087 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:03 np0005603623 nova_compute[226235]: 2026-01-31 08:03:03.247 226239 DEBUG nova.storage.rbd_utils [None req-32d301d7-4f62-403e-8df5-aae32761c8b2 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] removing snapshot(fcc08767ae79484f966b79543e70df51) on rbd image(4470f2e9-7e12-43b5-a9a5-f690f2509954_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:03:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:03.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:03.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e195 e195: 3 total, 3 up, 3 in
Jan 31 03:03:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:05.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:05.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e195 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e196 e196: 3 total, 3 up, 3 in
Jan 31 03:03:07 np0005603623 nova_compute[226235]: 2026-01-31 08:03:07.285 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:07.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:07.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:07 np0005603623 nova_compute[226235]: 2026-01-31 08:03:07.815 226239 DEBUG nova.storage.rbd_utils [None req-32d301d7-4f62-403e-8df5-aae32761c8b2 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] creating snapshot(snap) on rbd image(3676ca5a-2e55-4575-bc3f-672af49a103b) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:03:08 np0005603623 nova_compute[226235]: 2026-01-31 08:03:08.416 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:09.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:09.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e197 e197: 3 total, 3 up, 3 in
Jan 31 03:03:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:11.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:11.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e198 e198: 3 total, 3 up, 3 in
Jan 31 03:03:11 np0005603623 podman[252331]: 2026-01-31 08:03:11.981468127 +0000 UTC m=+0.053948032 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:03:12 np0005603623 podman[252332]: 2026-01-31 08:03:12.026194979 +0000 UTC m=+0.099033406 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:03:12 np0005603623 nova_compute[226235]: 2026-01-31 08:03:12.288 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:13.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:13 np0005603623 nova_compute[226235]: 2026-01-31 08:03:13.418 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:13.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e199 e199: 3 total, 3 up, 3 in
Jan 31 03:03:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:15.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:15.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:16 np0005603623 nova_compute[226235]: 2026-01-31 08:03:16.028 226239 INFO nova.virt.libvirt.driver [None req-32d301d7-4f62-403e-8df5-aae32761c8b2 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Snapshot image upload complete#033[00m
Jan 31 03:03:16 np0005603623 nova_compute[226235]: 2026-01-31 08:03:16.028 226239 INFO nova.compute.manager [None req-32d301d7-4f62-403e-8df5-aae32761c8b2 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Took 19.70 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 31 03:03:16 np0005603623 ovn_controller[133449]: 2026-01-31T08:03:16Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7a:9d:eb 10.100.0.6
Jan 31 03:03:16 np0005603623 ovn_controller[133449]: 2026-01-31T08:03:16Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7a:9d:eb 10.100.0.6
Jan 31 03:03:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:17 np0005603623 nova_compute[226235]: 2026-01-31 08:03:17.290 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:17.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e200 e200: 3 total, 3 up, 3 in
Jan 31 03:03:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:17.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:18 np0005603623 nova_compute[226235]: 2026-01-31 08:03:18.421 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e201 e201: 3 total, 3 up, 3 in
Jan 31 03:03:19 np0005603623 nova_compute[226235]: 2026-01-31 08:03:19.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:19.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:03:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:19.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.209 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.209 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.209 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.209 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.210 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.394 226239 DEBUG nova.compute.manager [None req-086af459-02f1-4c06-b6dd-a285c4518054 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:03:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:21.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:03:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.474 226239 INFO nova.compute.manager [None req-086af459-02f1-4c06-b6dd-a285c4518054 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] instance snapshotting#033[00m
Jan 31 03:03:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:03:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3634561380' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.641 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:21.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.741 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000035 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.741 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000035 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.873 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.875 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4516MB free_disk=20.87781524658203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.875 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.876 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.993 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 4470f2e9-7e12-43b5-a9a5-f690f2509954 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.993 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:03:21 np0005603623 nova_compute[226235]: 2026-01-31 08:03:21.994 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:03:22 np0005603623 nova_compute[226235]: 2026-01-31 08:03:22.058 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:22 np0005603623 nova_compute[226235]: 2026-01-31 08:03:22.122 226239 INFO nova.virt.libvirt.driver [None req-086af459-02f1-4c06-b6dd-a285c4518054 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Beginning live snapshot process#033[00m
Jan 31 03:03:22 np0005603623 nova_compute[226235]: 2026-01-31 08:03:22.319 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:22 np0005603623 nova_compute[226235]: 2026-01-31 08:03:22.327 226239 DEBUG nova.virt.libvirt.imagebackend [None req-086af459-02f1-4c06-b6dd-a285c4518054 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 31 03:03:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:03:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/70999273' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:03:22 np0005603623 nova_compute[226235]: 2026-01-31 08:03:22.468 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:22 np0005603623 nova_compute[226235]: 2026-01-31 08:03:22.473 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:03:22 np0005603623 nova_compute[226235]: 2026-01-31 08:03:22.515 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:03:22 np0005603623 nova_compute[226235]: 2026-01-31 08:03:22.584 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:03:22 np0005603623 nova_compute[226235]: 2026-01-31 08:03:22.585 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e202 e202: 3 total, 3 up, 3 in
Jan 31 03:03:22 np0005603623 nova_compute[226235]: 2026-01-31 08:03:22.838 226239 DEBUG nova.storage.rbd_utils [None req-086af459-02f1-4c06-b6dd-a285c4518054 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] creating snapshot(b7dd0c278cce4fc49fd5898db54a5b83) on rbd image(4470f2e9-7e12-43b5-a9a5-f690f2509954_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:03:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:23.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:23 np0005603623 nova_compute[226235]: 2026-01-31 08:03:23.424 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:23 np0005603623 nova_compute[226235]: 2026-01-31 08:03:23.586 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:23 np0005603623 nova_compute[226235]: 2026-01-31 08:03:23.586 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:03:23 np0005603623 nova_compute[226235]: 2026-01-31 08:03:23.586 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:03:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:23.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:24 np0005603623 nova_compute[226235]: 2026-01-31 08:03:24.794 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-4470f2e9-7e12-43b5-a9a5-f690f2509954" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:03:24 np0005603623 nova_compute[226235]: 2026-01-31 08:03:24.795 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-4470f2e9-7e12-43b5-a9a5-f690f2509954" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:03:24 np0005603623 nova_compute[226235]: 2026-01-31 08:03:24.795 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:03:24 np0005603623 nova_compute[226235]: 2026-01-31 08:03:24.795 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4470f2e9-7e12-43b5-a9a5-f690f2509954 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:03:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e203 e203: 3 total, 3 up, 3 in
Jan 31 03:03:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:25.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:25.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e204 e204: 3 total, 3 up, 3 in
Jan 31 03:03:27 np0005603623 nova_compute[226235]: 2026-01-31 08:03:27.293 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:27.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:27 np0005603623 nova_compute[226235]: 2026-01-31 08:03:27.473 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Updating instance_info_cache with network_info: [{"id": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "address": "fa:16:3e:7a:9d:eb", "network": {"id": "8f76a919-8420-45e2-a6e8-56654f40ec08", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-57295873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "03f24e162c6d454aa9e31d60b478001d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cb7cbe0-77", "ovs_interfaceid": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:03:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:27 np0005603623 nova_compute[226235]: 2026-01-31 08:03:27.635 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-4470f2e9-7e12-43b5-a9a5-f690f2509954" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:03:27 np0005603623 nova_compute[226235]: 2026-01-31 08:03:27.635 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:03:27 np0005603623 nova_compute[226235]: 2026-01-31 08:03:27.636 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:27 np0005603623 nova_compute[226235]: 2026-01-31 08:03:27.636 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:27 np0005603623 nova_compute[226235]: 2026-01-31 08:03:27.636 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:27 np0005603623 nova_compute[226235]: 2026-01-31 08:03:27.636 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:27 np0005603623 nova_compute[226235]: 2026-01-31 08:03:27.637 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:27 np0005603623 nova_compute[226235]: 2026-01-31 08:03:27.637 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:03:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:27.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:27 np0005603623 nova_compute[226235]: 2026-01-31 08:03:27.699 226239 DEBUG nova.storage.rbd_utils [None req-086af459-02f1-4c06-b6dd-a285c4518054 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] cloning vms/4470f2e9-7e12-43b5-a9a5-f690f2509954_disk@b7dd0c278cce4fc49fd5898db54a5b83 to images/0329a887-4ad2-4163-bd3c-ee2a3ded1631 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:03:27 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:27.820 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:03:27 np0005603623 nova_compute[226235]: 2026-01-31 08:03:27.822 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:27 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:27.823 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:03:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:03:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1658537607' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:03:28 np0005603623 nova_compute[226235]: 2026-01-31 08:03:28.425 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e205 e205: 3 total, 3 up, 3 in
Jan 31 03:03:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:03:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:29.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:03:29 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 31 03:03:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:29.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:29.825 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:30.097 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:30.098 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:30.098 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:30 np0005603623 nova_compute[226235]: 2026-01-31 08:03:30.200 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 03:03:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 03:03:31 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 31 03:03:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:31.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:31 np0005603623 nova_compute[226235]: 2026-01-31 08:03:31.474 226239 DEBUG nova.storage.rbd_utils [None req-086af459-02f1-4c06-b6dd-a285c4518054 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] flattening images/0329a887-4ad2-4163-bd3c-ee2a3ded1631 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 03:03:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 03:03:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:03:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:03:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:31.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:32 np0005603623 nova_compute[226235]: 2026-01-31 08:03:32.297 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:33 np0005603623 nova_compute[226235]: 2026-01-31 08:03:33.428 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:33.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:33.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:35.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:35.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:37 np0005603623 nova_compute[226235]: 2026-01-31 08:03:37.299 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:37.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:03:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:37.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:03:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e206 e206: 3 total, 3 up, 3 in
Jan 31 03:03:37 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 31 03:03:37 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 31 03:03:38 np0005603623 nova_compute[226235]: 2026-01-31 08:03:38.031 226239 DEBUG nova.storage.rbd_utils [None req-086af459-02f1-4c06-b6dd-a285c4518054 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] removing snapshot(b7dd0c278cce4fc49fd5898db54a5b83) on rbd image(4470f2e9-7e12-43b5-a9a5-f690f2509954_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:03:38 np0005603623 nova_compute[226235]: 2026-01-31 08:03:38.430 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e207 e207: 3 total, 3 up, 3 in
Jan 31 03:03:38 np0005603623 nova_compute[226235]: 2026-01-31 08:03:38.855 226239 DEBUG nova.storage.rbd_utils [None req-086af459-02f1-4c06-b6dd-a285c4518054 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] creating snapshot(snap) on rbd image(0329a887-4ad2-4163-bd3c-ee2a3ded1631) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:03:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:39.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:39.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e208 e208: 3 total, 3 up, 3 in
Jan 31 03:03:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:41.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:41.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:03:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2176022897' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:03:41 np0005603623 ovn_controller[133449]: 2026-01-31T08:03:41Z|00213|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 31 03:03:42 np0005603623 nova_compute[226235]: 2026-01-31 08:03:42.301 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:42 np0005603623 podman[252951]: 2026-01-31 08:03:42.587746438 +0000 UTC m=+0.051791145 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 03:03:42 np0005603623 podman[252952]: 2026-01-31 08:03:42.609309874 +0000 UTC m=+0.074073363 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true)
Jan 31 03:03:43 np0005603623 nova_compute[226235]: 2026-01-31 08:03:43.040 226239 INFO nova.virt.libvirt.driver [None req-086af459-02f1-4c06-b6dd-a285c4518054 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Snapshot image upload complete#033[00m
Jan 31 03:03:43 np0005603623 nova_compute[226235]: 2026-01-31 08:03:43.040 226239 INFO nova.compute.manager [None req-086af459-02f1-4c06-b6dd-a285c4518054 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Took 21.56 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 31 03:03:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:43 np0005603623 nova_compute[226235]: 2026-01-31 08:03:43.433 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:43.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:43.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:44 np0005603623 nova_compute[226235]: 2026-01-31 08:03:44.696 226239 DEBUG oslo_concurrency.lockutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Acquiring lock "7d051fe7-956a-4c8c-9f91-47c8c057964c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:44 np0005603623 nova_compute[226235]: 2026-01-31 08:03:44.696 226239 DEBUG oslo_concurrency.lockutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "7d051fe7-956a-4c8c-9f91-47c8c057964c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:44 np0005603623 nova_compute[226235]: 2026-01-31 08:03:44.740 226239 DEBUG nova.compute.manager [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:03:44 np0005603623 nova_compute[226235]: 2026-01-31 08:03:44.829 226239 DEBUG oslo_concurrency.lockutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:44 np0005603623 nova_compute[226235]: 2026-01-31 08:03:44.829 226239 DEBUG oslo_concurrency.lockutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:44 np0005603623 nova_compute[226235]: 2026-01-31 08:03:44.837 226239 DEBUG nova.virt.hardware [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:03:44 np0005603623 nova_compute[226235]: 2026-01-31 08:03:44.837 226239 INFO nova.compute.claims [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:03:44 np0005603623 nova_compute[226235]: 2026-01-31 08:03:44.981 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:03:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4093485850' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.423 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.430 226239 DEBUG nova.compute.provider_tree [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:03:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:45.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.449 226239 DEBUG nova.scheduler.client.report [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.484 226239 DEBUG oslo_concurrency.lockutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.485 226239 DEBUG nova.compute.manager [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.541 226239 DEBUG nova.compute.manager [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.542 226239 DEBUG nova.network.neutron [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.572 226239 INFO nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.594 226239 DEBUG nova.compute.manager [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:03:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:45.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.713 226239 DEBUG nova.compute.manager [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.714 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.715 226239 INFO nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Creating image(s)
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.742 226239 DEBUG nova.storage.rbd_utils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] rbd image 7d051fe7-956a-4c8c-9f91-47c8c057964c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.781 226239 DEBUG nova.storage.rbd_utils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] rbd image 7d051fe7-956a-4c8c-9f91-47c8c057964c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.821 226239 DEBUG nova.storage.rbd_utils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] rbd image 7d051fe7-956a-4c8c-9f91-47c8c057964c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.826 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.884 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.885 226239 DEBUG oslo_concurrency.lockutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.885 226239 DEBUG oslo_concurrency.lockutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.886 226239 DEBUG oslo_concurrency.lockutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.909 226239 DEBUG nova.storage.rbd_utils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] rbd image 7d051fe7-956a-4c8c-9f91-47c8c057964c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:03:45 np0005603623 nova_compute[226235]: 2026-01-31 08:03:45.913 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 7d051fe7-956a-4c8c-9f91-47c8c057964c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:03:46 np0005603623 nova_compute[226235]: 2026-01-31 08:03:46.339 226239 DEBUG nova.network.neutron [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 31 03:03:46 np0005603623 nova_compute[226235]: 2026-01-31 08:03:46.339 226239 DEBUG nova.compute.manager [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 03:03:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e209 e209: 3 total, 3 up, 3 in
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.186 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 7d051fe7-956a-4c8c-9f91-47c8c057964c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.273s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.256 226239 DEBUG nova.storage.rbd_utils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] resizing rbd image 7d051fe7-956a-4c8c-9f91-47c8c057964c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.302 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:03:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:47.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:47.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.893 226239 DEBUG nova.objects.instance [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lazy-loading 'migration_context' on Instance uuid 7d051fe7-956a-4c8c-9f91-47c8c057964c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.914 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.914 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Ensure instance console log exists: /var/lib/nova/instances/7d051fe7-956a-4c8c-9f91-47c8c057964c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.915 226239 DEBUG oslo_concurrency.lockutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.915 226239 DEBUG oslo_concurrency.lockutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.915 226239 DEBUG oslo_concurrency.lockutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.916 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.921 226239 WARNING nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.927 226239 DEBUG nova.virt.libvirt.host [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.927 226239 DEBUG nova.virt.libvirt.host [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.933 226239 DEBUG nova.virt.libvirt.host [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.934 226239 DEBUG nova.virt.libvirt.host [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.935 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.935 226239 DEBUG nova.virt.hardware [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.936 226239 DEBUG nova.virt.hardware [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.936 226239 DEBUG nova.virt.hardware [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.936 226239 DEBUG nova.virt.hardware [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.936 226239 DEBUG nova.virt.hardware [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.936 226239 DEBUG nova.virt.hardware [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.937 226239 DEBUG nova.virt.hardware [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.937 226239 DEBUG nova.virt.hardware [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.937 226239 DEBUG nova.virt.hardware [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.938 226239 DEBUG nova.virt.hardware [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.938 226239 DEBUG nova.virt.hardware [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 03:03:47 np0005603623 nova_compute[226235]: 2026-01-31 08:03:47.940 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:03:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e210 e210: 3 total, 3 up, 3 in
Jan 31 03:03:48 np0005603623 nova_compute[226235]: 2026-01-31 08:03:48.435 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:03:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:03:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2138781035' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:03:48 np0005603623 nova_compute[226235]: 2026-01-31 08:03:48.540 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.600s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:03:48 np0005603623 nova_compute[226235]: 2026-01-31 08:03:48.570 226239 DEBUG nova.storage.rbd_utils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] rbd image 7d051fe7-956a-4c8c-9f91-47c8c057964c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:03:48 np0005603623 nova_compute[226235]: 2026-01-31 08:03:48.575 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:03:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:03:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/993692824' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.022 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.025 226239 DEBUG nova.objects.instance [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7d051fe7-956a-4c8c-9f91-47c8c057964c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.045 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  <uuid>7d051fe7-956a-4c8c-9f91-47c8c057964c</uuid>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  <name>instance-0000003a</name>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <nova:name>tempest-ListImageFiltersTestJSON-server-1834037861</nova:name>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:03:47</nova:creationTime>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <nova:user uuid="fd3d70d97c394edaa70e32807d7a96ca">tempest-ListImageFiltersTestJSON-1012419265-project-member</nova:user>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <nova:project uuid="3d28270b439f4cb1aa201d46b9f8a843">tempest-ListImageFiltersTestJSON-1012419265</nova:project>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <entry name="serial">7d051fe7-956a-4c8c-9f91-47c8c057964c</entry>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <entry name="uuid">7d051fe7-956a-4c8c-9f91-47c8c057964c</entry>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/7d051fe7-956a-4c8c-9f91-47c8c057964c_disk">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/7d051fe7-956a-4c8c-9f91-47c8c057964c_disk.config">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/7d051fe7-956a-4c8c-9f91-47c8c057964c/console.log" append="off"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:03:49 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:03:49 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:03:49 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:03:49 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.103 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.103 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.104 226239 INFO nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Using config drive#033[00m
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.132 226239 DEBUG nova.storage.rbd_utils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] rbd image 7d051fe7-956a-4c8c-9f91-47c8c057964c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.337 226239 INFO nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Creating config drive at /var/lib/nova/instances/7d051fe7-956a-4c8c-9f91-47c8c057964c/disk.config#033[00m
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.343 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7d051fe7-956a-4c8c-9f91-47c8c057964c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp_h6c906s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:03:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:49.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.467 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7d051fe7-956a-4c8c-9f91-47c8c057964c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp_h6c906s" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.499 226239 DEBUG nova.storage.rbd_utils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] rbd image 7d051fe7-956a-4c8c-9f91-47c8c057964c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.503 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7d051fe7-956a-4c8c-9f91-47c8c057964c/disk.config 7d051fe7-956a-4c8c-9f91-47c8c057964c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:49.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.858 226239 DEBUG oslo_concurrency.processutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7d051fe7-956a-4c8c-9f91-47c8c057964c/disk.config 7d051fe7-956a-4c8c-9f91-47c8c057964c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.354s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:49 np0005603623 nova_compute[226235]: 2026-01-31 08:03:49.859 226239 INFO nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Deleting local config drive /var/lib/nova/instances/7d051fe7-956a-4c8c-9f91-47c8c057964c/disk.config because it was imported into RBD.#033[00m
Jan 31 03:03:49 np0005603623 systemd-machined[194379]: New machine qemu-29-instance-0000003a.
Jan 31 03:03:49 np0005603623 systemd[1]: Started Virtual Machine qemu-29-instance-0000003a.
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.271 226239 DEBUG oslo_concurrency.lockutils [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Acquiring lock "4470f2e9-7e12-43b5-a9a5-f690f2509954" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.272 226239 DEBUG oslo_concurrency.lockutils [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.272 226239 DEBUG oslo_concurrency.lockutils [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Acquiring lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.273 226239 DEBUG oslo_concurrency.lockutils [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.273 226239 DEBUG oslo_concurrency.lockutils [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.274 226239 INFO nova.compute.manager [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Terminating instance#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.275 226239 DEBUG nova.compute.manager [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:03:50 np0005603623 kernel: tap2cb7cbe0-77 (unregistering): left promiscuous mode
Jan 31 03:03:50 np0005603623 NetworkManager[48970]: <info>  [1769846630.3483] device (tap2cb7cbe0-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:03:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:03:50Z|00214|binding|INFO|Releasing lport 2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 from this chassis (sb_readonly=0)
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.360 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:03:50Z|00215|binding|INFO|Setting lport 2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 down in Southbound
Jan 31 03:03:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:03:50Z|00216|binding|INFO|Removing iface tap2cb7cbe0-77 ovn-installed in OVS
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.364 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.372 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:9d:eb 10.100.0.6'], port_security=['fa:16:3e:7a:9d:eb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4470f2e9-7e12-43b5-a9a5-f690f2509954', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f76a919-8420-45e2-a6e8-56654f40ec08', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '03f24e162c6d454aa9e31d60b478001d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9a69a75b-5ce1-4bb0-8555-2e322ef267ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=043eb9e9-d351-4602-ba73-ce358ef795c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.373 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 in datapath 8f76a919-8420-45e2-a6e8-56654f40ec08 unbound from our chassis#033[00m
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.375 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8f76a919-8420-45e2-a6e8-56654f40ec08, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.376 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1ab42bb9-2d58-47af-8866-71eb15ad7f3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.380 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08 namespace which is not needed anymore#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.388 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:50 np0005603623 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000035.scope: Deactivated successfully.
Jan 31 03:03:50 np0005603623 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000035.scope: Consumed 14.162s CPU time.
Jan 31 03:03:50 np0005603623 systemd-machined[194379]: Machine qemu-28-instance-00000035 terminated.
Jan 31 03:03:50 np0005603623 neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08[252114]: [NOTICE]   (252118) : haproxy version is 2.8.14-c23fe91
Jan 31 03:03:50 np0005603623 neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08[252114]: [NOTICE]   (252118) : path to executable is /usr/sbin/haproxy
Jan 31 03:03:50 np0005603623 neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08[252114]: [WARNING]  (252118) : Exiting Master process...
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.508 226239 INFO nova.virt.libvirt.driver [-] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Instance destroyed successfully.#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.509 226239 DEBUG nova.objects.instance [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lazy-loading 'resources' on Instance uuid 4470f2e9-7e12-43b5-a9a5-f690f2509954 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:03:50 np0005603623 neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08[252114]: [ALERT]    (252118) : Current worker (252120) exited with code 143 (Terminated)
Jan 31 03:03:50 np0005603623 neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08[252114]: [WARNING]  (252118) : All workers exited. Exiting... (0)
Jan 31 03:03:50 np0005603623 systemd[1]: libpod-f49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05.scope: Deactivated successfully.
Jan 31 03:03:50 np0005603623 podman[253402]: 2026-01-31 08:03:50.522666291 +0000 UTC m=+0.065110743 container died f49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.529 226239 DEBUG nova.virt.libvirt.vif [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:02:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1556480914',display_name='tempest-ImagesOneServerTestJSON-server-1556480914',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1556480914',id=53,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:02:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='03f24e162c6d454aa9e31d60b478001d',ramdisk_id='',reservation_id='r-i9hgf1b5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerTestJSON-77816411',owner_user_name='tempest-ImagesOneServerTestJSON-77816411-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:03:43Z,user_data=None,user_id='97778ff629964356819ef34be55ca5a6',uuid=4470f2e9-7e12-43b5-a9a5-f690f2509954,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "address": "fa:16:3e:7a:9d:eb", "network": {"id": "8f76a919-8420-45e2-a6e8-56654f40ec08", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-57295873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "03f24e162c6d454aa9e31d60b478001d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cb7cbe0-77", "ovs_interfaceid": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.530 226239 DEBUG nova.network.os_vif_util [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Converting VIF {"id": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "address": "fa:16:3e:7a:9d:eb", "network": {"id": "8f76a919-8420-45e2-a6e8-56654f40ec08", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-57295873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "03f24e162c6d454aa9e31d60b478001d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cb7cbe0-77", "ovs_interfaceid": "2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.531 226239 DEBUG nova.network.os_vif_util [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7a:9d:eb,bridge_name='br-int',has_traffic_filtering=True,id=2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0,network=Network(8f76a919-8420-45e2-a6e8-56654f40ec08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cb7cbe0-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.531 226239 DEBUG os_vif [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:9d:eb,bridge_name='br-int',has_traffic_filtering=True,id=2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0,network=Network(8f76a919-8420-45e2-a6e8-56654f40ec08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cb7cbe0-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.533 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.533 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb7cbe0-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.535 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.538 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.540 226239 INFO os_vif [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:9d:eb,bridge_name='br-int',has_traffic_filtering=True,id=2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0,network=Network(8f76a919-8420-45e2-a6e8-56654f40ec08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cb7cbe0-77')#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.588 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846630.5885315, 7d051fe7-956a-4c8c-9f91-47c8c057964c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.589 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.592 226239 DEBUG nova.compute.manager [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.592 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.595 226239 INFO nova.virt.libvirt.driver [-] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Instance spawned successfully.#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.595 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:03:50 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05-userdata-shm.mount: Deactivated successfully.
Jan 31 03:03:50 np0005603623 systemd[1]: var-lib-containers-storage-overlay-01ce683dee7e76791635b6183f2b0875cfb819d5ffdcdf197e35c54d73df2e84-merged.mount: Deactivated successfully.
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.647 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.652 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.653 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.653 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.654 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.654 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.654 226239 DEBUG nova.virt.libvirt.driver [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.660 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:03:50 np0005603623 podman[253402]: 2026-01-31 08:03:50.692092803 +0000 UTC m=+0.234537215 container cleanup f49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 03:03:50 np0005603623 systemd[1]: libpod-conmon-f49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05.scope: Deactivated successfully.
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.722 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.724 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846630.590749, 7d051fe7-956a-4c8c-9f91-47c8c057964c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.724 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] VM Started (Lifecycle Event)#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.758 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.765 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.770 226239 INFO nova.compute.manager [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Took 5.06 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.770 226239 DEBUG nova.compute.manager [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.796 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:03:50 np0005603623 podman[253475]: 2026-01-31 08:03:50.852256956 +0000 UTC m=+0.131986580 container remove f49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.857 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[804176e5-3be0-4a18-b724-302a6f0f6d18]: (4, ('Sat Jan 31 08:03:50 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08 (f49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05)\nf49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05\nSat Jan 31 08:03:50 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08 (f49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05)\nf49795642e1b00d730f84f3a3126cc89f505d3eebbf48bd260a6e5ee34be5a05\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.859 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a7d123c6-3dcd-4fdd-a3a7-0db4eec21243]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.861 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f76a919-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.864 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:50 np0005603623 kernel: tap8f76a919-80: left promiscuous mode
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.870 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.872 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7d4167ed-47e3-436f-8843-3f6d7e509dd9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.889 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[50aa2c4c-31cf-4610-a452-615b84604534]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.892 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9061b83e-4c16-41b2-9161-70905f1ba903]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.904 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[faebfe5b-7bbb-43ab-87a0-bb284f29dc59]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571088, 'reachable_time': 30581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253491, 'error': None, 'target': 'ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.909 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8f76a919-8420-45e2-a6e8-56654f40ec08 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:03:50 np0005603623 systemd[1]: run-netns-ovnmeta\x2d8f76a919\x2d8420\x2d45e2\x2da6e8\x2d56654f40ec08.mount: Deactivated successfully.
Jan 31 03:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:03:50.909 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[66cdcba5-2675-4bb9-bc56-2e0be7e48840]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:50 np0005603623 nova_compute[226235]: 2026-01-31 08:03:50.914 226239 INFO nova.compute.manager [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Took 6.11 seconds to build instance.#033[00m
Jan 31 03:03:51 np0005603623 nova_compute[226235]: 2026-01-31 08:03:51.012 226239 DEBUG oslo_concurrency.lockutils [None req-a8ee8b1d-c7e8-478c-aae3-e0ee489d3a13 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "7d051fe7-956a-4c8c-9f91-47c8c057964c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:51.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:51.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:52 np0005603623 nova_compute[226235]: 2026-01-31 08:03:52.305 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.364 226239 DEBUG nova.compute.manager [req-063161ff-6c2e-43c3-adf9-8e52640db2d5 req-b527951e-e4ef-43f6-b6ba-1fe505550584 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Received event network-vif-unplugged-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.365 226239 DEBUG oslo_concurrency.lockutils [req-063161ff-6c2e-43c3-adf9-8e52640db2d5 req-b527951e-e4ef-43f6-b6ba-1fe505550584 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.365 226239 DEBUG oslo_concurrency.lockutils [req-063161ff-6c2e-43c3-adf9-8e52640db2d5 req-b527951e-e4ef-43f6-b6ba-1fe505550584 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.365 226239 DEBUG oslo_concurrency.lockutils [req-063161ff-6c2e-43c3-adf9-8e52640db2d5 req-b527951e-e4ef-43f6-b6ba-1fe505550584 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.366 226239 DEBUG nova.compute.manager [req-063161ff-6c2e-43c3-adf9-8e52640db2d5 req-b527951e-e4ef-43f6-b6ba-1fe505550584 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] No waiting events found dispatching network-vif-unplugged-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.366 226239 DEBUG nova.compute.manager [req-063161ff-6c2e-43c3-adf9-8e52640db2d5 req-b527951e-e4ef-43f6-b6ba-1fe505550584 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Received event network-vif-unplugged-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.366 226239 DEBUG nova.compute.manager [req-063161ff-6c2e-43c3-adf9-8e52640db2d5 req-b527951e-e4ef-43f6-b6ba-1fe505550584 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Received event network-vif-plugged-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.366 226239 DEBUG oslo_concurrency.lockutils [req-063161ff-6c2e-43c3-adf9-8e52640db2d5 req-b527951e-e4ef-43f6-b6ba-1fe505550584 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.367 226239 DEBUG oslo_concurrency.lockutils [req-063161ff-6c2e-43c3-adf9-8e52640db2d5 req-b527951e-e4ef-43f6-b6ba-1fe505550584 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.367 226239 DEBUG oslo_concurrency.lockutils [req-063161ff-6c2e-43c3-adf9-8e52640db2d5 req-b527951e-e4ef-43f6-b6ba-1fe505550584 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.367 226239 DEBUG nova.compute.manager [req-063161ff-6c2e-43c3-adf9-8e52640db2d5 req-b527951e-e4ef-43f6-b6ba-1fe505550584 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] No waiting events found dispatching network-vif-plugged-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.367 226239 WARNING nova.compute.manager [req-063161ff-6c2e-43c3-adf9-8e52640db2d5 req-b527951e-e4ef-43f6-b6ba-1fe505550584 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Received unexpected event network-vif-plugged-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:03:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:53.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.467 226239 INFO nova.virt.libvirt.driver [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Deleting instance files /var/lib/nova/instances/4470f2e9-7e12-43b5-a9a5-f690f2509954_del#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.468 226239 INFO nova.virt.libvirt.driver [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Deletion of /var/lib/nova/instances/4470f2e9-7e12-43b5-a9a5-f690f2509954_del complete#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.538 226239 INFO nova.compute.manager [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Took 3.26 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.539 226239 DEBUG oslo.service.loopingcall [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.539 226239 DEBUG nova.compute.manager [-] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:03:53 np0005603623 nova_compute[226235]: 2026-01-31 08:03:53.540 226239 DEBUG nova.network.neutron [-] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:03:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:53.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:54 np0005603623 nova_compute[226235]: 2026-01-31 08:03:54.834 226239 DEBUG nova.network.neutron [-] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:03:54 np0005603623 nova_compute[226235]: 2026-01-31 08:03:54.867 226239 INFO nova.compute.manager [-] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Took 1.33 seconds to deallocate network for instance.#033[00m
Jan 31 03:03:54 np0005603623 nova_compute[226235]: 2026-01-31 08:03:54.930 226239 DEBUG oslo_concurrency.lockutils [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:54 np0005603623 nova_compute[226235]: 2026-01-31 08:03:54.931 226239 DEBUG oslo_concurrency.lockutils [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:54 np0005603623 nova_compute[226235]: 2026-01-31 08:03:54.955 226239 DEBUG nova.compute.manager [req-4df4fa81-d0ee-436f-8e66-e18b8777b625 req-02e0c742-1756-4b46-b44c-e8e9073ee71b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Received event network-vif-deleted-2cb7cbe0-7766-4ff0-b64f-1dd352ef32d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:55 np0005603623 nova_compute[226235]: 2026-01-31 08:03:55.014 226239 DEBUG oslo_concurrency.processutils [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:03:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:55.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:03:55 np0005603623 nova_compute[226235]: 2026-01-31 08:03:55.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:03:55 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/48164792' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:03:55 np0005603623 nova_compute[226235]: 2026-01-31 08:03:55.556 226239 DEBUG oslo_concurrency.processutils [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.541s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:55 np0005603623 nova_compute[226235]: 2026-01-31 08:03:55.561 226239 DEBUG nova.compute.provider_tree [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:03:55 np0005603623 nova_compute[226235]: 2026-01-31 08:03:55.582 226239 DEBUG nova.scheduler.client.report [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:03:55 np0005603623 nova_compute[226235]: 2026-01-31 08:03:55.615 226239 DEBUG oslo_concurrency.lockutils [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:55 np0005603623 nova_compute[226235]: 2026-01-31 08:03:55.677 226239 INFO nova.scheduler.client.report [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Deleted allocations for instance 4470f2e9-7e12-43b5-a9a5-f690f2509954#033[00m
Jan 31 03:03:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:55.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:56 np0005603623 nova_compute[226235]: 2026-01-31 08:03:56.026 226239 DEBUG oslo_concurrency.lockutils [None req-6fadf0c3-3e1d-458a-ad0e-4c26e81ccef6 97778ff629964356819ef34be55ca5a6 03f24e162c6d454aa9e31d60b478001d - - default default] Lock "4470f2e9-7e12-43b5-a9a5-f690f2509954" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e211 e211: 3 total, 3 up, 3 in
Jan 31 03:03:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:57 np0005603623 nova_compute[226235]: 2026-01-31 08:03:57.307 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:57.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:57.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e212 e212: 3 total, 3 up, 3 in
Jan 31 03:03:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:59.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:03:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:59.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:00 np0005603623 nova_compute[226235]: 2026-01-31 08:04:00.541 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:01 np0005603623 nova_compute[226235]: 2026-01-31 08:04:01.207 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:01.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:01.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:02 np0005603623 nova_compute[226235]: 2026-01-31 08:04:02.308 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:04:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:03.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:04:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:03.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:04:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:05.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:04:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e213 e213: 3 total, 3 up, 3 in
Jan 31 03:04:05 np0005603623 nova_compute[226235]: 2026-01-31 08:04:05.504 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846630.502592, 4470f2e9-7e12-43b5-a9a5-f690f2509954 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:04:05 np0005603623 nova_compute[226235]: 2026-01-31 08:04:05.505 226239 INFO nova.compute.manager [-] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:04:05 np0005603623 nova_compute[226235]: 2026-01-31 08:04:05.523 226239 DEBUG nova.compute.manager [None req-095da816-f83e-4d08-ac2c-1d4d29ee6611 - - - - - -] [instance: 4470f2e9-7e12-43b5-a9a5-f690f2509954] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:04:05 np0005603623 nova_compute[226235]: 2026-01-31 08:04:05.544 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:04:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:05.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:04:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e214 e214: 3 total, 3 up, 3 in
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:07 np0005603623 nova_compute[226235]: 2026-01-31 08:04:07.310 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:04:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:07.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:04:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:07.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:07.820916) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846647821539, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 1878, "num_deletes": 262, "total_data_size": 4214693, "memory_usage": 4273008, "flush_reason": "Manual Compaction"}
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846647860325, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 1881148, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33859, "largest_seqno": 35732, "table_properties": {"data_size": 1874307, "index_size": 3787, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 17237, "raw_average_key_size": 22, "raw_value_size": 1859739, "raw_average_value_size": 2375, "num_data_blocks": 164, "num_entries": 783, "num_filter_entries": 783, "num_deletions": 262, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846523, "oldest_key_time": 1769846523, "file_creation_time": 1769846647, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 39471 microseconds, and 4013 cpu microseconds.
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:07.860381) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 1881148 bytes OK
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:07.860405) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:07.866272) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:07.866322) EVENT_LOG_v1 {"time_micros": 1769846647866310, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:07.866347) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 4205977, prev total WAL file size 4205977, number of live WAL files 2.
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:07.867315) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303030' seq:72057594037927935, type:22 .. '6D6772737461740031323533' seq:0, type:0; will stop at (end)
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(1837KB)], [63(10MB)]
Jan 31 03:04:07 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846647867379, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 12626612, "oldest_snapshot_seqno": -1}
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6055 keys, 9605103 bytes, temperature: kUnknown
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846648040074, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 9605103, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9564652, "index_size": 24213, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15173, "raw_key_size": 154830, "raw_average_key_size": 25, "raw_value_size": 9456118, "raw_average_value_size": 1561, "num_data_blocks": 977, "num_entries": 6055, "num_filter_entries": 6055, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769846647, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:08.040421) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 9605103 bytes
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:08.045677) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 73.1 rd, 55.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 10.2 +0.0 blob) out(9.2 +0.0 blob), read-write-amplify(11.8) write-amplify(5.1) OK, records in: 6536, records dropped: 481 output_compression: NoCompression
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:08.045730) EVENT_LOG_v1 {"time_micros": 1769846648045713, "job": 38, "event": "compaction_finished", "compaction_time_micros": 172740, "compaction_time_cpu_micros": 21025, "output_level": 6, "num_output_files": 1, "total_output_size": 9605103, "num_input_records": 6536, "num_output_records": 6055, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846648046276, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846648047590, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:07.867193) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:08.047724) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:08.047729) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:08.047731) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:08.047733) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:08 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:08.047735) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:09.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:09.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:10 np0005603623 nova_compute[226235]: 2026-01-31 08:04:10.547 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:04:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:11.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:04:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e215 e215: 3 total, 3 up, 3 in
Jan 31 03:04:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:04:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:11.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:04:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:12 np0005603623 nova_compute[226235]: 2026-01-31 08:04:12.314 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:12 np0005603623 podman[253579]: 2026-01-31 08:04:12.990134727 +0000 UTC m=+0.077149870 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 31 03:04:12 np0005603623 podman[253578]: 2026-01-31 08:04:12.989031843 +0000 UTC m=+0.076277083 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 31 03:04:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e216 e216: 3 total, 3 up, 3 in
Jan 31 03:04:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:13.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:13.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:14 np0005603623 nova_compute[226235]: 2026-01-31 08:04:14.319 226239 DEBUG nova.compute.manager [None req-af5bee04-f51c-4ebe-9c78-0b1e0687295c fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:04:14 np0005603623 nova_compute[226235]: 2026-01-31 08:04:14.371 226239 INFO nova.compute.manager [None req-af5bee04-f51c-4ebe-9c78-0b1e0687295c fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] instance snapshotting#033[00m
Jan 31 03:04:14 np0005603623 nova_compute[226235]: 2026-01-31 08:04:14.740 226239 INFO nova.virt.libvirt.driver [None req-af5bee04-f51c-4ebe-9c78-0b1e0687295c fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Beginning live snapshot process#033[00m
Jan 31 03:04:14 np0005603623 nova_compute[226235]: 2026-01-31 08:04:14.911 226239 DEBUG nova.virt.libvirt.imagebackend [None req-af5bee04-f51c-4ebe-9c78-0b1e0687295c fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 31 03:04:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e217 e217: 3 total, 3 up, 3 in
Jan 31 03:04:15 np0005603623 nova_compute[226235]: 2026-01-31 08:04:15.299 226239 DEBUG nova.storage.rbd_utils [None req-af5bee04-f51c-4ebe-9c78-0b1e0687295c fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] creating snapshot(00a320afaa874a20845e103a46af2dbe) on rbd image(7d051fe7-956a-4c8c-9f91-47c8c057964c_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:04:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:15.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:15 np0005603623 nova_compute[226235]: 2026-01-31 08:04:15.549 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:15.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e218 e218: 3 total, 3 up, 3 in
Jan 31 03:04:16 np0005603623 nova_compute[226235]: 2026-01-31 08:04:16.246 226239 DEBUG nova.storage.rbd_utils [None req-af5bee04-f51c-4ebe-9c78-0b1e0687295c fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] cloning vms/7d051fe7-956a-4c8c-9f91-47c8c057964c_disk@00a320afaa874a20845e103a46af2dbe to images/c60105d3-836b-48ac-bea6-55acfd972195 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:04:16 np0005603623 nova_compute[226235]: 2026-01-31 08:04:16.612 226239 DEBUG nova.storage.rbd_utils [None req-af5bee04-f51c-4ebe-9c78-0b1e0687295c fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] flattening images/c60105d3-836b-48ac-bea6-55acfd972195 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 03:04:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e219 e219: 3 total, 3 up, 3 in
Jan 31 03:04:17 np0005603623 nova_compute[226235]: 2026-01-31 08:04:17.316 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:17.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:04:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:17.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:04:18 np0005603623 nova_compute[226235]: 2026-01-31 08:04:18.251 226239 DEBUG nova.storage.rbd_utils [None req-af5bee04-f51c-4ebe-9c78-0b1e0687295c fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] removing snapshot(00a320afaa874a20845e103a46af2dbe) on rbd image(7d051fe7-956a-4c8c-9f91-47c8c057964c_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:04:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e220 e220: 3 total, 3 up, 3 in
Jan 31 03:04:19 np0005603623 nova_compute[226235]: 2026-01-31 08:04:19.311 226239 DEBUG nova.storage.rbd_utils [None req-af5bee04-f51c-4ebe-9c78-0b1e0687295c fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] creating snapshot(snap) on rbd image(c60105d3-836b-48ac-bea6-55acfd972195) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:04:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:19.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:19.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e221 e221: 3 total, 3 up, 3 in
Jan 31 03:04:20 np0005603623 nova_compute[226235]: 2026-01-31 08:04:20.552 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:04:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1125588561' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:04:21 np0005603623 nova_compute[226235]: 2026-01-31 08:04:21.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:21.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:21.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.176 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.222 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.222 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.222 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.223 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.223 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.317 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:04:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1014558163' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.684 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.760 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000003a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.761 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000003a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.875 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.876 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4423MB free_disk=20.835636138916016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.877 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.877 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.883 226239 INFO nova.virt.libvirt.driver [None req-af5bee04-f51c-4ebe-9c78-0b1e0687295c fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Snapshot image upload complete#033[00m
Jan 31 03:04:22 np0005603623 nova_compute[226235]: 2026-01-31 08:04:22.884 226239 INFO nova.compute.manager [None req-af5bee04-f51c-4ebe-9c78-0b1e0687295c fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Took 8.51 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 31 03:04:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e222 e222: 3 total, 3 up, 3 in
Jan 31 03:04:23 np0005603623 nova_compute[226235]: 2026-01-31 08:04:23.020 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 7d051fe7-956a-4c8c-9f91-47c8c057964c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:04:23 np0005603623 nova_compute[226235]: 2026-01-31 08:04:23.020 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:04:23 np0005603623 nova_compute[226235]: 2026-01-31 08:04:23.021 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:04:23 np0005603623 nova_compute[226235]: 2026-01-31 08:04:23.063 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:04:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1315898641' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:04:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:23.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:23 np0005603623 nova_compute[226235]: 2026-01-31 08:04:23.492 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:23 np0005603623 nova_compute[226235]: 2026-01-31 08:04:23.496 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:04:23 np0005603623 nova_compute[226235]: 2026-01-31 08:04:23.514 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:04:23 np0005603623 nova_compute[226235]: 2026-01-31 08:04:23.607 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:04:23 np0005603623 nova_compute[226235]: 2026-01-31 08:04:23.608 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:23.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:24 np0005603623 nova_compute[226235]: 2026-01-31 08:04:24.587 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:24 np0005603623 nova_compute[226235]: 2026-01-31 08:04:24.588 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:04:24 np0005603623 nova_compute[226235]: 2026-01-31 08:04:24.589 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:04:24 np0005603623 nova_compute[226235]: 2026-01-31 08:04:24.920 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-7d051fe7-956a-4c8c-9f91-47c8c057964c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:04:24 np0005603623 nova_compute[226235]: 2026-01-31 08:04:24.920 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-7d051fe7-956a-4c8c-9f91-47c8c057964c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:04:24 np0005603623 nova_compute[226235]: 2026-01-31 08:04:24.920 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:04:24 np0005603623 nova_compute[226235]: 2026-01-31 08:04:24.921 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7d051fe7-956a-4c8c-9f91-47c8c057964c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:04:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:25.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:25 np0005603623 nova_compute[226235]: 2026-01-31 08:04:25.556 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:25 np0005603623 nova_compute[226235]: 2026-01-31 08:04:25.612 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:04:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:25.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:26 np0005603623 nova_compute[226235]: 2026-01-31 08:04:26.056 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:04:26 np0005603623 nova_compute[226235]: 2026-01-31 08:04:26.077 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-7d051fe7-956a-4c8c-9f91-47c8c057964c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:04:26 np0005603623 nova_compute[226235]: 2026-01-31 08:04:26.077 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:04:26 np0005603623 nova_compute[226235]: 2026-01-31 08:04:26.078 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:26 np0005603623 nova_compute[226235]: 2026-01-31 08:04:26.078 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:26 np0005603623 nova_compute[226235]: 2026-01-31 08:04:26.079 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:26 np0005603623 nova_compute[226235]: 2026-01-31 08:04:26.079 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:26 np0005603623 nova_compute[226235]: 2026-01-31 08:04:26.079 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:04:27 np0005603623 nova_compute[226235]: 2026-01-31 08:04:27.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:27 np0005603623 nova_compute[226235]: 2026-01-31 08:04:27.319 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:27.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e223 e223: 3 total, 3 up, 3 in
Jan 31 03:04:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:27.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e224 e224: 3 total, 3 up, 3 in
Jan 31 03:04:28 np0005603623 nova_compute[226235]: 2026-01-31 08:04:28.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:29.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:29.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e225 e225: 3 total, 3 up, 3 in
Jan 31 03:04:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:04:30.098 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:04:30.098 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:04:30.098 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:30 np0005603623 nova_compute[226235]: 2026-01-31 08:04:30.559 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e226 e226: 3 total, 3 up, 3 in
Jan 31 03:04:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:04:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:31.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:04:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:04:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:31.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:04:32 np0005603623 nova_compute[226235]: 2026-01-31 08:04:32.320 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e226 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:33.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:33.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:04:35.033 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:04:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:04:35.034 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:04:35 np0005603623 nova_compute[226235]: 2026-01-31 08:04:35.035 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:35.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:35 np0005603623 nova_compute[226235]: 2026-01-31 08:04:35.561 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:04:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:35.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:04:36 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #48. Immutable memtables: 5.
Jan 31 03:04:37 np0005603623 nova_compute[226235]: 2026-01-31 08:04:37.322 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e226 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:37.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:37.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e227 e227: 3 total, 3 up, 3 in
Jan 31 03:04:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:04:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:39.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:04:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:39.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:40 np0005603623 nova_compute[226235]: 2026-01-31 08:04:40.564 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:04:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:41.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:04:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:41.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:04:42.037 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:42 np0005603623 nova_compute[226235]: 2026-01-31 08:04:42.358 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:04:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:04:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:04:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:04:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:43.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:04:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:43.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:04:43Z|00217|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:04:43 np0005603623 podman[254057]: 2026-01-31 08:04:43.948966003 +0000 UTC m=+0.044872506 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:04:43 np0005603623 podman[254058]: 2026-01-31 08:04:43.972437902 +0000 UTC m=+0.066648722 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:04:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:45.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:45 np0005603623 nova_compute[226235]: 2026-01-31 08:04:45.593 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:45.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e228 e228: 3 total, 3 up, 3 in
Jan 31 03:04:47 np0005603623 nova_compute[226235]: 2026-01-31 08:04:47.359 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:47.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:47.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e229 e229: 3 total, 3 up, 3 in
Jan 31 03:04:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:04:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:49.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:04:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:49.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:04:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:04:50 np0005603623 nova_compute[226235]: 2026-01-31 08:04:50.595 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e230 e230: 3 total, 3 up, 3 in
Jan 31 03:04:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:51.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:51.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:51 np0005603623 nova_compute[226235]: 2026-01-31 08:04:51.928 226239 DEBUG oslo_concurrency.lockutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Acquiring lock "7d051fe7-956a-4c8c-9f91-47c8c057964c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:51 np0005603623 nova_compute[226235]: 2026-01-31 08:04:51.928 226239 DEBUG oslo_concurrency.lockutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "7d051fe7-956a-4c8c-9f91-47c8c057964c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:51 np0005603623 nova_compute[226235]: 2026-01-31 08:04:51.929 226239 DEBUG oslo_concurrency.lockutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Acquiring lock "7d051fe7-956a-4c8c-9f91-47c8c057964c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:51 np0005603623 nova_compute[226235]: 2026-01-31 08:04:51.929 226239 DEBUG oslo_concurrency.lockutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "7d051fe7-956a-4c8c-9f91-47c8c057964c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:51 np0005603623 nova_compute[226235]: 2026-01-31 08:04:51.929 226239 DEBUG oslo_concurrency.lockutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "7d051fe7-956a-4c8c-9f91-47c8c057964c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:51 np0005603623 nova_compute[226235]: 2026-01-31 08:04:51.930 226239 INFO nova.compute.manager [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Terminating instance#033[00m
Jan 31 03:04:51 np0005603623 nova_compute[226235]: 2026-01-31 08:04:51.931 226239 DEBUG oslo_concurrency.lockutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Acquiring lock "refresh_cache-7d051fe7-956a-4c8c-9f91-47c8c057964c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:04:51 np0005603623 nova_compute[226235]: 2026-01-31 08:04:51.931 226239 DEBUG oslo_concurrency.lockutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Acquired lock "refresh_cache-7d051fe7-956a-4c8c-9f91-47c8c057964c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:04:51 np0005603623 nova_compute[226235]: 2026-01-31 08:04:51.931 226239 DEBUG nova.network.neutron [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:04:52 np0005603623 nova_compute[226235]: 2026-01-31 08:04:52.361 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:52 np0005603623 nova_compute[226235]: 2026-01-31 08:04:52.525 226239 DEBUG nova.network.neutron [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:04:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e231 e231: 3 total, 3 up, 3 in
Jan 31 03:04:53 np0005603623 nova_compute[226235]: 2026-01-31 08:04:53.211 226239 DEBUG nova.network.neutron [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:04:53 np0005603623 nova_compute[226235]: 2026-01-31 08:04:53.232 226239 DEBUG oslo_concurrency.lockutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Releasing lock "refresh_cache-7d051fe7-956a-4c8c-9f91-47c8c057964c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:04:53 np0005603623 nova_compute[226235]: 2026-01-31 08:04:53.232 226239 DEBUG nova.compute.manager [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:04:53 np0005603623 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003a.scope: Deactivated successfully.
Jan 31 03:04:53 np0005603623 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003a.scope: Consumed 14.439s CPU time.
Jan 31 03:04:53 np0005603623 systemd-machined[194379]: Machine qemu-29-instance-0000003a terminated.
Jan 31 03:04:53 np0005603623 nova_compute[226235]: 2026-01-31 08:04:53.449 226239 INFO nova.virt.libvirt.driver [-] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Instance destroyed successfully.#033[00m
Jan 31 03:04:53 np0005603623 nova_compute[226235]: 2026-01-31 08:04:53.450 226239 DEBUG nova.objects.instance [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lazy-loading 'resources' on Instance uuid 7d051fe7-956a-4c8c-9f91-47c8c057964c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:04:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:53.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:53.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:53 np0005603623 nova_compute[226235]: 2026-01-31 08:04:53.976 226239 INFO nova.virt.libvirt.driver [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Deleting instance files /var/lib/nova/instances/7d051fe7-956a-4c8c-9f91-47c8c057964c_del#033[00m
Jan 31 03:04:53 np0005603623 nova_compute[226235]: 2026-01-31 08:04:53.978 226239 INFO nova.virt.libvirt.driver [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Deletion of /var/lib/nova/instances/7d051fe7-956a-4c8c-9f91-47c8c057964c_del complete#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.044 226239 INFO nova.compute.manager [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Took 0.81 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.045 226239 DEBUG oslo.service.loopingcall [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.045 226239 DEBUG nova.compute.manager [-] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.046 226239 DEBUG nova.network.neutron [-] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.247 226239 DEBUG nova.network.neutron [-] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.267 226239 DEBUG nova.network.neutron [-] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.293 226239 INFO nova.compute.manager [-] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Took 0.25 seconds to deallocate network for instance.#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.355 226239 DEBUG oslo_concurrency.lockutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.356 226239 DEBUG oslo_concurrency.lockutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.420 226239 DEBUG oslo_concurrency.processutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:04:54 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1474987970' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.862 226239 DEBUG oslo_concurrency.processutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.869 226239 DEBUG nova.compute.provider_tree [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.895 226239 DEBUG nova.scheduler.client.report [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.920 226239 DEBUG oslo_concurrency.lockutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:54 np0005603623 nova_compute[226235]: 2026-01-31 08:04:54.968 226239 INFO nova.scheduler.client.report [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Deleted allocations for instance 7d051fe7-956a-4c8c-9f91-47c8c057964c#033[00m
Jan 31 03:04:55 np0005603623 nova_compute[226235]: 2026-01-31 08:04:55.053 226239 DEBUG oslo_concurrency.lockutils [None req-7d24bffa-b0b4-49f3-87b9-dde056238e81 fd3d70d97c394edaa70e32807d7a96ca 3d28270b439f4cb1aa201d46b9f8a843 - - default default] Lock "7d051fe7-956a-4c8c-9f91-47c8c057964c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:55.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:55 np0005603623 nova_compute[226235]: 2026-01-31 08:04:55.597 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:55.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:57 np0005603623 nova_compute[226235]: 2026-01-31 08:04:57.363 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:57.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:57.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e232 e232: 3 total, 3 up, 3 in
Jan 31 03:04:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:59.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.646671) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846699646783, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 1043, "num_deletes": 259, "total_data_size": 1840675, "memory_usage": 1874624, "flush_reason": "Manual Compaction"}
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846699654493, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 1210284, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35737, "largest_seqno": 36775, "table_properties": {"data_size": 1205423, "index_size": 2385, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11182, "raw_average_key_size": 20, "raw_value_size": 1195517, "raw_average_value_size": 2213, "num_data_blocks": 104, "num_entries": 540, "num_filter_entries": 540, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846648, "oldest_key_time": 1769846648, "file_creation_time": 1769846699, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 7844 microseconds, and 4174 cpu microseconds.
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.654533) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 1210284 bytes OK
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.654555) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.656155) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.656168) EVENT_LOG_v1 {"time_micros": 1769846699656164, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.656183) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 1835419, prev total WAL file size 1835419, number of live WAL files 2.
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.656671) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(1181KB)], [66(9379KB)]
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846699656747, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 10815387, "oldest_snapshot_seqno": -1}
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6065 keys, 8861826 bytes, temperature: kUnknown
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846699726635, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 8861826, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8821616, "index_size": 23972, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15173, "raw_key_size": 155925, "raw_average_key_size": 25, "raw_value_size": 8713230, "raw_average_value_size": 1436, "num_data_blocks": 958, "num_entries": 6065, "num_filter_entries": 6065, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769846699, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.726859) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 8861826 bytes
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.737958) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 154.6 rd, 126.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 9.2 +0.0 blob) out(8.5 +0.0 blob), read-write-amplify(16.3) write-amplify(7.3) OK, records in: 6595, records dropped: 530 output_compression: NoCompression
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.737995) EVENT_LOG_v1 {"time_micros": 1769846699737981, "job": 40, "event": "compaction_finished", "compaction_time_micros": 69945, "compaction_time_cpu_micros": 18629, "output_level": 6, "num_output_files": 1, "total_output_size": 8861826, "num_input_records": 6595, "num_output_records": 6065, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846699738254, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846699738910, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.656572) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.739008) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.739018) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.739021) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.739024) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:59 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:04:59.739027) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:04:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:59.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:00 np0005603623 nova_compute[226235]: 2026-01-31 08:05:00.599 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:01.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:01.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e233 e233: 3 total, 3 up, 3 in
Jan 31 03:05:02 np0005603623 nova_compute[226235]: 2026-01-31 08:05:02.364 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e233 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:03.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:03.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e234 e234: 3 total, 3 up, 3 in
Jan 31 03:05:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:05.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:05 np0005603623 nova_compute[226235]: 2026-01-31 08:05:05.635 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:05.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e235 e235: 3 total, 3 up, 3 in
Jan 31 03:05:07 np0005603623 nova_compute[226235]: 2026-01-31 08:05:07.367 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:07.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:07.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:08 np0005603623 nova_compute[226235]: 2026-01-31 08:05:08.449 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846693.4470525, 7d051fe7-956a-4c8c-9f91-47c8c057964c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:05:08 np0005603623 nova_compute[226235]: 2026-01-31 08:05:08.450 226239 INFO nova.compute.manager [-] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:05:08 np0005603623 nova_compute[226235]: 2026-01-31 08:05:08.485 226239 DEBUG nova.compute.manager [None req-58d12c14-e242-4ef2-a820-341c46beb525 - - - - - -] [instance: 7d051fe7-956a-4c8c-9f91-47c8c057964c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:05:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:09.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:09.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:10 np0005603623 nova_compute[226235]: 2026-01-31 08:05:10.638 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:11.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:11.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:12 np0005603623 nova_compute[226235]: 2026-01-31 08:05:12.368 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e236 e236: 3 total, 3 up, 3 in
Jan 31 03:05:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:13.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:13.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:14 np0005603623 podman[254260]: 2026-01-31 08:05:14.072302697 +0000 UTC m=+0.054873811 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:05:14 np0005603623 podman[254261]: 2026-01-31 08:05:14.136180081 +0000 UTC m=+0.112001502 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:05:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:15.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:15 np0005603623 nova_compute[226235]: 2026-01-31 08:05:15.641 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:15.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:17 np0005603623 nova_compute[226235]: 2026-01-31 08:05:17.370 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:17.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:17.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e237 e237: 3 total, 3 up, 3 in
Jan 31 03:05:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:19.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:19.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:20 np0005603623 nova_compute[226235]: 2026-01-31 08:05:20.688 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:21.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:21.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:22 np0005603623 nova_compute[226235]: 2026-01-31 08:05:22.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:22 np0005603623 nova_compute[226235]: 2026-01-31 08:05:22.371 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:23.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e238 e238: 3 total, 3 up, 3 in
Jan 31 03:05:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:23.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.191 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.191 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.246 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.246 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.247 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.247 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.247 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:05:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3372879179' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.651 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.788 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.789 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4653MB free_disk=20.897323608398438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.789 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:24 np0005603623 nova_compute[226235]: 2026-01-31 08:05:24.790 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:25.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:25 np0005603623 nova_compute[226235]: 2026-01-31 08:05:25.722 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:25.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:26 np0005603623 nova_compute[226235]: 2026-01-31 08:05:26.073 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:05:26 np0005603623 nova_compute[226235]: 2026-01-31 08:05:26.073 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:05:26 np0005603623 nova_compute[226235]: 2026-01-31 08:05:26.865 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.025 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.026 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.046 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.072 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.088 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e239 e239: 3 total, 3 up, 3 in
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.372 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:05:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2408052312' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:05:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:27.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.556 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.563 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.595 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.666 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.667 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.877s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.667 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:27 np0005603623 nova_compute[226235]: 2026-01-31 08:05:27.667 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:05:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:27.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:28 np0005603623 nova_compute[226235]: 2026-01-31 08:05:28.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:28 np0005603623 nova_compute[226235]: 2026-01-31 08:05:28.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:28 np0005603623 nova_compute[226235]: 2026-01-31 08:05:28.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:28 np0005603623 nova_compute[226235]: 2026-01-31 08:05:28.156 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:28 np0005603623 nova_compute[226235]: 2026-01-31 08:05:28.156 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:05:28 np0005603623 nova_compute[226235]: 2026-01-31 08:05:28.156 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e240 e240: 3 total, 3 up, 3 in
Jan 31 03:05:29 np0005603623 nova_compute[226235]: 2026-01-31 08:05:29.179 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:29 np0005603623 nova_compute[226235]: 2026-01-31 08:05:29.180 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e241 e241: 3 total, 3 up, 3 in
Jan 31 03:05:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:29.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:29.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:05:30.098 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:05:30.099 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:05:30.099 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:30 np0005603623 nova_compute[226235]: 2026-01-31 08:05:30.726 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:05:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:31.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:05:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:31.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:32 np0005603623 nova_compute[226235]: 2026-01-31 08:05:32.375 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:33.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:33.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:35.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:35 np0005603623 nova_compute[226235]: 2026-01-31 08:05:35.729 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:35.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:36 np0005603623 nova_compute[226235]: 2026-01-31 08:05:36.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:36 np0005603623 nova_compute[226235]: 2026-01-31 08:05:36.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:05:36 np0005603623 nova_compute[226235]: 2026-01-31 08:05:36.192 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:05:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:05:37.209 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:05:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:05:37.210 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:05:37 np0005603623 nova_compute[226235]: 2026-01-31 08:05:37.228 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:37 np0005603623 nova_compute[226235]: 2026-01-31 08:05:37.376 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:37.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:37.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e242 e242: 3 total, 3 up, 3 in
Jan 31 03:05:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:39.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:39.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:05:40.211 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:40 np0005603623 nova_compute[226235]: 2026-01-31 08:05:40.731 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:41.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:41.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:42 np0005603623 nova_compute[226235]: 2026-01-31 08:05:42.377 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:05:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:43.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:05:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:43.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:45 np0005603623 podman[254466]: 2026-01-31 08:05:45.017151447 +0000 UTC m=+0.109636146 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 31 03:05:45 np0005603623 podman[254465]: 2026-01-31 08:05:45.032151071 +0000 UTC m=+0.124083993 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 03:05:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:45.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:45 np0005603623 nova_compute[226235]: 2026-01-31 08:05:45.733 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:45.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:47 np0005603623 nova_compute[226235]: 2026-01-31 08:05:47.379 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:47.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:05:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:47.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:05:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:49.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:49.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:50 np0005603623 nova_compute[226235]: 2026-01-31 08:05:50.571 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:50 np0005603623 nova_compute[226235]: 2026-01-31 08:05:50.572 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:50 np0005603623 nova_compute[226235]: 2026-01-31 08:05:50.597 226239 DEBUG nova.compute.manager [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:05:50 np0005603623 nova_compute[226235]: 2026-01-31 08:05:50.700 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:50 np0005603623 nova_compute[226235]: 2026-01-31 08:05:50.700 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:50 np0005603623 nova_compute[226235]: 2026-01-31 08:05:50.714 226239 DEBUG nova.virt.hardware [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:05:50 np0005603623 nova_compute[226235]: 2026-01-31 08:05:50.715 226239 INFO nova.compute.claims [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:05:50 np0005603623 nova_compute[226235]: 2026-01-31 08:05:50.736 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:50 np0005603623 nova_compute[226235]: 2026-01-31 08:05:50.886 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:05:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/651217459' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.286 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.290 226239 DEBUG nova.compute.provider_tree [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.319 226239 DEBUG nova.scheduler.client.report [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.360 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.361 226239 DEBUG nova.compute.manager [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.432 226239 DEBUG nova.compute.manager [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.432 226239 DEBUG nova.network.neutron [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.475 226239 INFO nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.520 226239 DEBUG nova.compute.manager [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:05:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:51.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.684 226239 DEBUG nova.compute.manager [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.685 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.686 226239 INFO nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Creating image(s)#033[00m
Jan 31 03:05:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:05:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:05:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.726 226239 DEBUG nova.storage.rbd_utils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 4582dbf2-09cd-4a26-84dd-28adcb24011e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.757 226239 DEBUG nova.storage.rbd_utils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 4582dbf2-09cd-4a26-84dd-28adcb24011e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.779 226239 DEBUG nova.storage.rbd_utils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 4582dbf2-09cd-4a26-84dd-28adcb24011e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.783 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.799 226239 DEBUG nova.policy [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60f2b878669c4c529b35e04860cc6d64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c73212dc7c84914b6c934d45b6826f7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.832 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.833 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.834 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.834 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:51.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.856 226239 DEBUG nova.storage.rbd_utils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 4582dbf2-09cd-4a26-84dd-28adcb24011e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:05:51 np0005603623 nova_compute[226235]: 2026-01-31 08:05:51.859 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4582dbf2-09cd-4a26-84dd-28adcb24011e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:52 np0005603623 nova_compute[226235]: 2026-01-31 08:05:52.052 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4582dbf2-09cd-4a26-84dd-28adcb24011e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.192s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:52 np0005603623 nova_compute[226235]: 2026-01-31 08:05:52.114 226239 DEBUG nova.storage.rbd_utils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] resizing rbd image 4582dbf2-09cd-4a26-84dd-28adcb24011e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:05:52 np0005603623 nova_compute[226235]: 2026-01-31 08:05:52.197 226239 DEBUG nova.objects.instance [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:05:52 np0005603623 nova_compute[226235]: 2026-01-31 08:05:52.222 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:05:52 np0005603623 nova_compute[226235]: 2026-01-31 08:05:52.223 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Ensure instance console log exists: /var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:05:52 np0005603623 nova_compute[226235]: 2026-01-31 08:05:52.223 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:52 np0005603623 nova_compute[226235]: 2026-01-31 08:05:52.223 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:52 np0005603623 nova_compute[226235]: 2026-01-31 08:05:52.224 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:52 np0005603623 nova_compute[226235]: 2026-01-31 08:05:52.380 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:53.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:53.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:54 np0005603623 nova_compute[226235]: 2026-01-31 08:05:54.178 226239 DEBUG nova.network.neutron [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Successfully created port: d3f10293-a2fb-49cc-a81c-f5fee53bb74f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:05:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:55.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:55 np0005603623 nova_compute[226235]: 2026-01-31 08:05:55.739 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:55.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:56 np0005603623 nova_compute[226235]: 2026-01-31 08:05:56.140 226239 DEBUG nova.network.neutron [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Successfully updated port: d3f10293-a2fb-49cc-a81c-f5fee53bb74f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:05:56 np0005603623 nova_compute[226235]: 2026-01-31 08:05:56.176 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:05:56 np0005603623 nova_compute[226235]: 2026-01-31 08:05:56.176 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:05:56 np0005603623 nova_compute[226235]: 2026-01-31 08:05:56.176 226239 DEBUG nova.network.neutron [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:05:56 np0005603623 nova_compute[226235]: 2026-01-31 08:05:56.308 226239 DEBUG nova.compute.manager [req-55d90bd2-679f-4e7a-83be-eba1773ba755 req-6c294aa0-45f7-4c93-843b-071ba884b073 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-changed-d3f10293-a2fb-49cc-a81c-f5fee53bb74f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:56 np0005603623 nova_compute[226235]: 2026-01-31 08:05:56.309 226239 DEBUG nova.compute.manager [req-55d90bd2-679f-4e7a-83be-eba1773ba755 req-6c294aa0-45f7-4c93-843b-071ba884b073 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Refreshing instance network info cache due to event network-changed-d3f10293-a2fb-49cc-a81c-f5fee53bb74f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:05:56 np0005603623 nova_compute[226235]: 2026-01-31 08:05:56.309 226239 DEBUG oslo_concurrency.lockutils [req-55d90bd2-679f-4e7a-83be-eba1773ba755 req-6c294aa0-45f7-4c93-843b-071ba884b073 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:05:56 np0005603623 nova_compute[226235]: 2026-01-31 08:05:56.507 226239 DEBUG nova.network.neutron [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:05:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:05:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:05:57 np0005603623 nova_compute[226235]: 2026-01-31 08:05:57.423 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:57.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:57.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.772 226239 DEBUG nova.network.neutron [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.903 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.904 226239 DEBUG nova.compute.manager [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Instance network_info: |[{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.904 226239 DEBUG oslo_concurrency.lockutils [req-55d90bd2-679f-4e7a-83be-eba1773ba755 req-6c294aa0-45f7-4c93-843b-071ba884b073 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.904 226239 DEBUG nova.network.neutron [req-55d90bd2-679f-4e7a-83be-eba1773ba755 req-6c294aa0-45f7-4c93-843b-071ba884b073 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Refreshing network info cache for port d3f10293-a2fb-49cc-a81c-f5fee53bb74f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.907 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Start _get_guest_xml network_info=[{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.910 226239 WARNING nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.921 226239 DEBUG nova.virt.libvirt.host [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.922 226239 DEBUG nova.virt.libvirt.host [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.937 226239 DEBUG nova.virt.libvirt.host [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.937 226239 DEBUG nova.virt.libvirt.host [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.939 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.939 226239 DEBUG nova.virt.hardware [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.940 226239 DEBUG nova.virt.hardware [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.940 226239 DEBUG nova.virt.hardware [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.940 226239 DEBUG nova.virt.hardware [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.940 226239 DEBUG nova.virt.hardware [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.941 226239 DEBUG nova.virt.hardware [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.941 226239 DEBUG nova.virt.hardware [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.941 226239 DEBUG nova.virt.hardware [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.941 226239 DEBUG nova.virt.hardware [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.942 226239 DEBUG nova.virt.hardware [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.942 226239 DEBUG nova.virt.hardware [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:05:58 np0005603623 nova_compute[226235]: 2026-01-31 08:05:58.944 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:05:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/19721126' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.349 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.376 226239 DEBUG nova.storage.rbd_utils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 4582dbf2-09cd-4a26-84dd-28adcb24011e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.380 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:05:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:59.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:05:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:05:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/655326019' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.838 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.840 226239 DEBUG nova.virt.libvirt.vif [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:05:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.840 226239 DEBUG nova.network.os_vif_util [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.841 226239 DEBUG nova.network.os_vif_util [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:76:91,bridge_name='br-int',has_traffic_filtering=True,id=d3f10293-a2fb-49cc-a81c-f5fee53bb74f,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f10293-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.843 226239 DEBUG nova.objects.instance [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:05:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:05:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:59.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.867 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  <uuid>4582dbf2-09cd-4a26-84dd-28adcb24011e</uuid>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  <name>instance-00000040</name>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <nova:name>tempest-AttachInterfacesTestJSON-server-401952099</nova:name>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:05:58</nova:creationTime>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <nova:port uuid="d3f10293-a2fb-49cc-a81c-f5fee53bb74f">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <entry name="serial">4582dbf2-09cd-4a26-84dd-28adcb24011e</entry>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <entry name="uuid">4582dbf2-09cd-4a26-84dd-28adcb24011e</entry>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4582dbf2-09cd-4a26-84dd-28adcb24011e_disk">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4582dbf2-09cd-4a26-84dd-28adcb24011e_disk.config">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:72:76:91"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <target dev="tapd3f10293-a2"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/console.log" append="off"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:05:59 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:05:59 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:05:59 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:05:59 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.868 226239 DEBUG nova.compute.manager [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Preparing to wait for external event network-vif-plugged-d3f10293-a2fb-49cc-a81c-f5fee53bb74f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.869 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.869 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.869 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.870 226239 DEBUG nova.virt.libvirt.vif [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:05:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.870 226239 DEBUG nova.network.os_vif_util [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.871 226239 DEBUG nova.network.os_vif_util [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:76:91,bridge_name='br-int',has_traffic_filtering=True,id=d3f10293-a2fb-49cc-a81c-f5fee53bb74f,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f10293-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.871 226239 DEBUG os_vif [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:76:91,bridge_name='br-int',has_traffic_filtering=True,id=d3f10293-a2fb-49cc-a81c-f5fee53bb74f,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f10293-a2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.872 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.872 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.872 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.876 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.876 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3f10293-a2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.876 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3f10293-a2, col_values=(('external_ids', {'iface-id': 'd3f10293-a2fb-49cc-a81c-f5fee53bb74f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:72:76:91', 'vm-uuid': '4582dbf2-09cd-4a26-84dd-28adcb24011e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.878 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:59 np0005603623 NetworkManager[48970]: <info>  [1769846759.8787] manager: (tapd3f10293-a2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.880 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.882 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.883 226239 INFO os_vif [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:76:91,bridge_name='br-int',has_traffic_filtering=True,id=d3f10293-a2fb-49cc-a81c-f5fee53bb74f,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f10293-a2')#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.939 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.940 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.940 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:72:76:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.940 226239 INFO nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Using config drive#033[00m
Jan 31 03:05:59 np0005603623 nova_compute[226235]: 2026-01-31 08:05:59.959 226239 DEBUG nova.storage.rbd_utils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 4582dbf2-09cd-4a26-84dd-28adcb24011e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:00 np0005603623 nova_compute[226235]: 2026-01-31 08:06:00.609 226239 INFO nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Creating config drive at /var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/disk.config#033[00m
Jan 31 03:06:00 np0005603623 nova_compute[226235]: 2026-01-31 08:06:00.613 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpnwo6hm2l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:00 np0005603623 nova_compute[226235]: 2026-01-31 08:06:00.731 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpnwo6hm2l" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:00 np0005603623 nova_compute[226235]: 2026-01-31 08:06:00.755 226239 DEBUG nova.storage.rbd_utils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 4582dbf2-09cd-4a26-84dd-28adcb24011e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:00 np0005603623 nova_compute[226235]: 2026-01-31 08:06:00.758 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/disk.config 4582dbf2-09cd-4a26-84dd-28adcb24011e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:00 np0005603623 nova_compute[226235]: 2026-01-31 08:06:00.927 226239 DEBUG oslo_concurrency.processutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/disk.config 4582dbf2-09cd-4a26-84dd-28adcb24011e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:00 np0005603623 nova_compute[226235]: 2026-01-31 08:06:00.928 226239 INFO nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Deleting local config drive /var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/disk.config because it was imported into RBD.#033[00m
Jan 31 03:06:00 np0005603623 kernel: tapd3f10293-a2: entered promiscuous mode
Jan 31 03:06:00 np0005603623 NetworkManager[48970]: <info>  [1769846760.9818] manager: (tapd3f10293-a2): new Tun device (/org/freedesktop/NetworkManager/Devices/105)
Jan 31 03:06:00 np0005603623 nova_compute[226235]: 2026-01-31 08:06:00.982 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:00 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:00Z|00218|binding|INFO|Claiming lport d3f10293-a2fb-49cc-a81c-f5fee53bb74f for this chassis.
Jan 31 03:06:00 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:00Z|00219|binding|INFO|d3f10293-a2fb-49cc-a81c-f5fee53bb74f: Claiming fa:16:3e:72:76:91 10.100.0.7
Jan 31 03:06:00 np0005603623 nova_compute[226235]: 2026-01-31 08:06:00.988 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:00 np0005603623 nova_compute[226235]: 2026-01-31 08:06:00.993 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:01 np0005603623 systemd-udevd[255075]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:06:01 np0005603623 systemd-machined[194379]: New machine qemu-30-instance-00000040.
Jan 31 03:06:01 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:01Z|00220|binding|INFO|Setting lport d3f10293-a2fb-49cc-a81c-f5fee53bb74f ovn-installed in OVS
Jan 31 03:06:01 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:01Z|00221|binding|INFO|Setting lport d3f10293-a2fb-49cc-a81c-f5fee53bb74f up in Southbound
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.013 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.016 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:76:91 10.100.0.7'], port_security=['fa:16:3e:72:76:91 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4582dbf2-09cd-4a26-84dd-28adcb24011e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0eb19bce-cce0-4cee-80b2-e44224af388b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=d3f10293-a2fb-49cc-a81c-f5fee53bb74f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:06:01 np0005603623 NetworkManager[48970]: <info>  [1769846761.0184] device (tapd3f10293-a2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.017 143258 INFO neutron.agent.ovn.metadata.agent [-] Port d3f10293-a2fb-49cc-a81c-f5fee53bb74f in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f bound to our chassis#033[00m
Jan 31 03:06:01 np0005603623 NetworkManager[48970]: <info>  [1769846761.0189] device (tapd3f10293-a2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.019 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:06:01 np0005603623 systemd[1]: Started Virtual Machine qemu-30-instance-00000040.
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.027 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b496888d-5365-4b09-9fee-7133ee2f1f42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.027 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap455fab34-b1 in ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.030 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap455fab34-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.030 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f5f7617d-0d3d-463d-8b87-6a7769a3a55f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.031 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d6dd2b2f-cadf-4f8f-8921-32fd1faaa6a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.039 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[c4f266d2-6f82-4507-9e26-2a70bb33c4f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.049 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c755663b-a0de-456b-aaa1-719c3ec5cd6c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.067 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[39eb31f5-e4f7-4b4a-9814-a5bbdaf05166]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 NetworkManager[48970]: <info>  [1769846761.0723] manager: (tap455fab34-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/106)
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.071 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a2467a11-e8c1-47fd-8a07-5b0841981579]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.097 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f44f9499-dfef-49fc-83e8-88eaa2b273a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.100 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[5193607a-f976-4345-96c4-ea6e23f4266f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 NetworkManager[48970]: <info>  [1769846761.1147] device (tap455fab34-b0): carrier: link connected
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.117 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[2d965492-a9d2-4174-b329-4d4d24f4fc11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.126 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[225ddbfa-553a-4c43-bbf8-998a2e1c2e9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590154, 'reachable_time': 21770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255108, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.134 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[133430f0-d30d-46c5-87e8-7015b01bb6a5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:8f98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590154, 'tstamp': 590154}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255109, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.142 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee3a4da-ac5f-411b-b769-feaf20bdca45]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590154, 'reachable_time': 21770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255110, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.160 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[575dd438-f58d-41df-bffd-cf2d002f7dba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.198 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[23e962be-5373-4b8b-b802-2ef9e9d39841]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.200 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.200 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.201 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.202 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:01 np0005603623 NetworkManager[48970]: <info>  [1769846761.2031] manager: (tap455fab34-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Jan 31 03:06:01 np0005603623 kernel: tap455fab34-b0: entered promiscuous mode
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.204 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.208 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.209 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:01 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:01Z|00222|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.209 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.212 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/455fab34-b015-4d97-a96d-f7ebd7f7555f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/455fab34-b015-4d97-a96d-f7ebd7f7555f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.213 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.213 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d774b013-62b2-4c34-954a-d09d65617e4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.215 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-455fab34-b015-4d97-a96d-f7ebd7f7555f
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/455fab34-b015-4d97-a96d-f7ebd7f7555f.pid.haproxy
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 455fab34-b015-4d97-a96d-f7ebd7f7555f
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:06:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:01.215 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'env', 'PROCESS_TAG=haproxy-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/455fab34-b015-4d97-a96d-f7ebd7f7555f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:06:01 np0005603623 podman[255156]: 2026-01-31 08:06:01.541849387 +0000 UTC m=+0.061347755 container create 341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.544 226239 DEBUG nova.compute.manager [req-64bf8647-1d43-4fe3-9a2d-fac50abf06db req-79ef1faf-20d4-4626-a3d5-23ea9b0570d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-plugged-d3f10293-a2fb-49cc-a81c-f5fee53bb74f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.544 226239 DEBUG oslo_concurrency.lockutils [req-64bf8647-1d43-4fe3-9a2d-fac50abf06db req-79ef1faf-20d4-4626-a3d5-23ea9b0570d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.545 226239 DEBUG oslo_concurrency.lockutils [req-64bf8647-1d43-4fe3-9a2d-fac50abf06db req-79ef1faf-20d4-4626-a3d5-23ea9b0570d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.545 226239 DEBUG oslo_concurrency.lockutils [req-64bf8647-1d43-4fe3-9a2d-fac50abf06db req-79ef1faf-20d4-4626-a3d5-23ea9b0570d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.546 226239 DEBUG nova.compute.manager [req-64bf8647-1d43-4fe3-9a2d-fac50abf06db req-79ef1faf-20d4-4626-a3d5-23ea9b0570d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Processing event network-vif-plugged-d3f10293-a2fb-49cc-a81c-f5fee53bb74f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.547 226239 DEBUG nova.network.neutron [req-55d90bd2-679f-4e7a-83be-eba1773ba755 req-6c294aa0-45f7-4c93-843b-071ba884b073 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updated VIF entry in instance network info cache for port d3f10293-a2fb-49cc-a81c-f5fee53bb74f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.547 226239 DEBUG nova.network.neutron [req-55d90bd2-679f-4e7a-83be-eba1773ba755 req-6c294aa0-45f7-4c93-843b-071ba884b073 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:01 np0005603623 systemd[1]: Started libpod-conmon-341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7.scope.
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.566 226239 DEBUG oslo_concurrency.lockutils [req-55d90bd2-679f-4e7a-83be-eba1773ba755 req-6c294aa0-45f7-4c93-843b-071ba884b073 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:06:01 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:06:01 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8fecf1af0abb036f027c5a440e729c77f98ca9b88e942efe84885e698d4b361/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:06:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:01.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:01 np0005603623 podman[255156]: 2026-01-31 08:06:01.502874268 +0000 UTC m=+0.022372646 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:06:01 np0005603623 podman[255156]: 2026-01-31 08:06:01.600159545 +0000 UTC m=+0.119657903 container init 341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:06:01 np0005603623 podman[255156]: 2026-01-31 08:06:01.604409088 +0000 UTC m=+0.123907446 container start 341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:06:01 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[255198]: [NOTICE]   (255203) : New worker (255206) forked
Jan 31 03:06:01 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[255198]: [NOTICE]   (255203) : Loading success.
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.655 226239 DEBUG nova.compute.manager [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.656 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846761.6560366, 4582dbf2-09cd-4a26-84dd-28adcb24011e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.657 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] VM Started (Lifecycle Event)#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.664 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.670 226239 INFO nova.virt.libvirt.driver [-] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Instance spawned successfully.#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.670 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.684 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.688 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.697 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.698 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.699 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.699 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.699 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.700 226239 DEBUG nova.virt.libvirt.driver [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.773 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.774 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846761.6567457, 4582dbf2-09cd-4a26-84dd-28adcb24011e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.774 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.805 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.810 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846761.6587684, 4582dbf2-09cd-4a26-84dd-28adcb24011e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.811 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.845 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.848 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.854 226239 INFO nova.compute.manager [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Took 10.17 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.855 226239 DEBUG nova.compute.manager [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:01.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.886 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.964 226239 INFO nova.compute.manager [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Took 11.29 seconds to build instance.#033[00m
Jan 31 03:06:01 np0005603623 nova_compute[226235]: 2026-01-31 08:06:01.995 226239 DEBUG oslo_concurrency.lockutils [None req-7852b8e3-bf7e-4917-a9e9-0e8126589c76 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:02 np0005603623 nova_compute[226235]: 2026-01-31 08:06:02.425 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:03.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:03 np0005603623 nova_compute[226235]: 2026-01-31 08:06:03.689 226239 DEBUG nova.compute.manager [req-26072601-d102-410f-a87e-2b9f2a5b0db2 req-1a7edba3-03c2-4eba-ade2-44bf0410dabb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-plugged-d3f10293-a2fb-49cc-a81c-f5fee53bb74f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:03 np0005603623 nova_compute[226235]: 2026-01-31 08:06:03.689 226239 DEBUG oslo_concurrency.lockutils [req-26072601-d102-410f-a87e-2b9f2a5b0db2 req-1a7edba3-03c2-4eba-ade2-44bf0410dabb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:03 np0005603623 nova_compute[226235]: 2026-01-31 08:06:03.690 226239 DEBUG oslo_concurrency.lockutils [req-26072601-d102-410f-a87e-2b9f2a5b0db2 req-1a7edba3-03c2-4eba-ade2-44bf0410dabb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:03 np0005603623 nova_compute[226235]: 2026-01-31 08:06:03.690 226239 DEBUG oslo_concurrency.lockutils [req-26072601-d102-410f-a87e-2b9f2a5b0db2 req-1a7edba3-03c2-4eba-ade2-44bf0410dabb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:03 np0005603623 nova_compute[226235]: 2026-01-31 08:06:03.690 226239 DEBUG nova.compute.manager [req-26072601-d102-410f-a87e-2b9f2a5b0db2 req-1a7edba3-03c2-4eba-ade2-44bf0410dabb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-plugged-d3f10293-a2fb-49cc-a81c-f5fee53bb74f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:06:03 np0005603623 nova_compute[226235]: 2026-01-31 08:06:03.690 226239 WARNING nova.compute.manager [req-26072601-d102-410f-a87e-2b9f2a5b0db2 req-1a7edba3-03c2-4eba-ade2-44bf0410dabb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received unexpected event network-vif-plugged-d3f10293-a2fb-49cc-a81c-f5fee53bb74f for instance with vm_state active and task_state None.#033[00m
Jan 31 03:06:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:03.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:04 np0005603623 nova_compute[226235]: 2026-01-31 08:06:04.879 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:05.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:05.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:05 np0005603623 nova_compute[226235]: 2026-01-31 08:06:05.974 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:05 np0005603623 NetworkManager[48970]: <info>  [1769846765.9753] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 31 03:06:05 np0005603623 NetworkManager[48970]: <info>  [1769846765.9763] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Jan 31 03:06:06 np0005603623 nova_compute[226235]: 2026-01-31 08:06:06.017 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:06Z|00223|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:06:06 np0005603623 nova_compute[226235]: 2026-01-31 08:06:06.032 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:07 np0005603623 nova_compute[226235]: 2026-01-31 08:06:07.459 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:07.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:07 np0005603623 nova_compute[226235]: 2026-01-31 08:06:07.631 226239 DEBUG nova.compute.manager [req-4b4f660a-cf79-4f65-9dc0-c3f85550f014 req-206e5b05-22ae-49de-a840-f572e48f1094 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-changed-d3f10293-a2fb-49cc-a81c-f5fee53bb74f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:07 np0005603623 nova_compute[226235]: 2026-01-31 08:06:07.631 226239 DEBUG nova.compute.manager [req-4b4f660a-cf79-4f65-9dc0-c3f85550f014 req-206e5b05-22ae-49de-a840-f572e48f1094 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Refreshing instance network info cache due to event network-changed-d3f10293-a2fb-49cc-a81c-f5fee53bb74f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:06:07 np0005603623 nova_compute[226235]: 2026-01-31 08:06:07.631 226239 DEBUG oslo_concurrency.lockutils [req-4b4f660a-cf79-4f65-9dc0-c3f85550f014 req-206e5b05-22ae-49de-a840-f572e48f1094 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:06:07 np0005603623 nova_compute[226235]: 2026-01-31 08:06:07.631 226239 DEBUG oslo_concurrency.lockutils [req-4b4f660a-cf79-4f65-9dc0-c3f85550f014 req-206e5b05-22ae-49de-a840-f572e48f1094 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:06:07 np0005603623 nova_compute[226235]: 2026-01-31 08:06:07.631 226239 DEBUG nova.network.neutron [req-4b4f660a-cf79-4f65-9dc0-c3f85550f014 req-206e5b05-22ae-49de-a840-f572e48f1094 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Refreshing network info cache for port d3f10293-a2fb-49cc-a81c-f5fee53bb74f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:06:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:07.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e243 e243: 3 total, 3 up, 3 in
Jan 31 03:06:09 np0005603623 nova_compute[226235]: 2026-01-31 08:06:09.554 226239 DEBUG nova.network.neutron [req-4b4f660a-cf79-4f65-9dc0-c3f85550f014 req-206e5b05-22ae-49de-a840-f572e48f1094 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updated VIF entry in instance network info cache for port d3f10293-a2fb-49cc-a81c-f5fee53bb74f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:06:09 np0005603623 nova_compute[226235]: 2026-01-31 08:06:09.554 226239 DEBUG nova.network.neutron [req-4b4f660a-cf79-4f65-9dc0-c3f85550f014 req-206e5b05-22ae-49de-a840-f572e48f1094 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:09 np0005603623 nova_compute[226235]: 2026-01-31 08:06:09.580 226239 DEBUG oslo_concurrency.lockutils [req-4b4f660a-cf79-4f65-9dc0-c3f85550f014 req-206e5b05-22ae-49de-a840-f572e48f1094 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:06:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:09.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e244 e244: 3 total, 3 up, 3 in
Jan 31 03:06:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:09.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:09 np0005603623 nova_compute[226235]: 2026-01-31 08:06:09.882 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e245 e245: 3 total, 3 up, 3 in
Jan 31 03:06:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:11.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:11.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e246 e246: 3 total, 3 up, 3 in
Jan 31 03:06:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e246 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:12 np0005603623 nova_compute[226235]: 2026-01-31 08:06:12.460 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:13.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:13.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:13 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:13Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:72:76:91 10.100.0.7
Jan 31 03:06:13 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:13Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:72:76:91 10.100.0.7
Jan 31 03:06:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:06:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4133849122' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:06:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:06:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4133849122' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:06:14 np0005603623 nova_compute[226235]: 2026-01-31 08:06:14.884 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:15.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:06:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:15.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:06:15 np0005603623 podman[255224]: 2026-01-31 08:06:15.963472397 +0000 UTC m=+0.053451955 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 03:06:15 np0005603623 podman[255225]: 2026-01-31 08:06:15.983028103 +0000 UTC m=+0.073007731 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:06:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e246 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:17 np0005603623 nova_compute[226235]: 2026-01-31 08:06:17.463 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:17.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:17.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 e247: 3 total, 3 up, 3 in
Jan 31 03:06:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:19.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:19 np0005603623 nova_compute[226235]: 2026-01-31 08:06:19.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:19.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:21.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:06:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:21.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:06:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:22 np0005603623 nova_compute[226235]: 2026-01-31 08:06:22.465 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:23 np0005603623 nova_compute[226235]: 2026-01-31 08:06:23.192 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:23.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:23 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:23Z|00224|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:06:23 np0005603623 nova_compute[226235]: 2026-01-31 08:06:23.815 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:23.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.156 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.157 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.655 226239 DEBUG oslo_concurrency.lockutils [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "interface-4582dbf2-09cd-4a26-84dd-28adcb24011e-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.657 226239 DEBUG oslo_concurrency.lockutils [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-4582dbf2-09cd-4a26-84dd-28adcb24011e-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.657 226239 DEBUG nova.objects.instance [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'flavor' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.705 226239 DEBUG nova.objects.instance [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.725 226239 DEBUG nova.network.neutron [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.749 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.750 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.750 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.750 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:24 np0005603623 nova_compute[226235]: 2026-01-31 08:06:24.888 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:25 np0005603623 nova_compute[226235]: 2026-01-31 08:06:25.508 226239 DEBUG nova.policy [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60f2b878669c4c529b35e04860cc6d64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c73212dc7c84914b6c934d45b6826f7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:06:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:25.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:25.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:27 np0005603623 nova_compute[226235]: 2026-01-31 08:06:27.143 226239 DEBUG nova.network.neutron [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Successfully created port: 52038e6d-42cd-444a-959a-ce24f4c9bb50 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:06:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:27 np0005603623 nova_compute[226235]: 2026-01-31 08:06:27.468 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:27.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:27.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:27 np0005603623 nova_compute[226235]: 2026-01-31 08:06:27.970 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.012 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.013 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.013 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.013 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.013 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.014 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.014 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.050 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.051 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.051 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.051 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.051 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:06:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1938862965' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.471 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.555 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000040 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.555 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000040 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.688 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.689 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4462MB free_disk=20.94271469116211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.690 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.690 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.841 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 4582dbf2-09cd-4a26-84dd-28adcb24011e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.841 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.841 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:06:28 np0005603623 nova_compute[226235]: 2026-01-31 08:06:28.923 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.046 226239 DEBUG nova.network.neutron [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Successfully updated port: 52038e6d-42cd-444a-959a-ce24f4c9bb50 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.066 226239 DEBUG oslo_concurrency.lockutils [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.067 226239 DEBUG oslo_concurrency.lockutils [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.067 226239 DEBUG nova.network.neutron [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:06:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:06:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/632434907' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.348 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.352 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.362 226239 WARNING nova.network.neutron [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] 455fab34-b015-4d97-a96d-f7ebd7f7555f already exists in list: networks containing: ['455fab34-b015-4d97-a96d-f7ebd7f7555f']. ignoring it#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.376 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.409 226239 DEBUG nova.compute.manager [req-ba2e8feb-c45a-453d-abb9-c5731c723689 req-3c26bbee-89f0-47f7-9e38-62ac15b18104 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-changed-52038e6d-42cd-444a-959a-ce24f4c9bb50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.409 226239 DEBUG nova.compute.manager [req-ba2e8feb-c45a-453d-abb9-c5731c723689 req-3c26bbee-89f0-47f7-9e38-62ac15b18104 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Refreshing instance network info cache due to event network-changed-52038e6d-42cd-444a-959a-ce24f4c9bb50. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.410 226239 DEBUG oslo_concurrency.lockutils [req-ba2e8feb-c45a-453d-abb9-c5731c723689 req-3c26bbee-89f0-47f7-9e38-62ac15b18104 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.411 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.411 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:06:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1012566620' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.552 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.595 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:29.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:29 np0005603623 nova_compute[226235]: 2026-01-31 08:06:29.890 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:29.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:30.100 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:30.100 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:30.101 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:31 np0005603623 nova_compute[226235]: 2026-01-31 08:06:31.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:31 np0005603623 nova_compute[226235]: 2026-01-31 08:06:31.156 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:06:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:31.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:06:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:31.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.469 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.953 226239 DEBUG nova.network.neutron [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.980 226239 DEBUG oslo_concurrency.lockutils [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.981 226239 DEBUG oslo_concurrency.lockutils [req-ba2e8feb-c45a-453d-abb9-c5731c723689 req-3c26bbee-89f0-47f7-9e38-62ac15b18104 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.981 226239 DEBUG nova.network.neutron [req-ba2e8feb-c45a-453d-abb9-c5731c723689 req-3c26bbee-89f0-47f7-9e38-62ac15b18104 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Refreshing network info cache for port 52038e6d-42cd-444a-959a-ce24f4c9bb50 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.984 226239 DEBUG nova.virt.libvirt.vif [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.984 226239 DEBUG nova.network.os_vif_util [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.985 226239 DEBUG nova.network.os_vif_util [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5c:c8:27,bridge_name='br-int',has_traffic_filtering=True,id=52038e6d-42cd-444a-959a-ce24f4c9bb50,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52038e6d-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.985 226239 DEBUG os_vif [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:c8:27,bridge_name='br-int',has_traffic_filtering=True,id=52038e6d-42cd-444a-959a-ce24f4c9bb50,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52038e6d-42') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.986 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.986 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.986 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.988 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.989 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52038e6d-42, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.989 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap52038e6d-42, col_values=(('external_ids', {'iface-id': '52038e6d-42cd-444a-959a-ce24f4c9bb50', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5c:c8:27', 'vm-uuid': '4582dbf2-09cd-4a26-84dd-28adcb24011e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.990 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:32 np0005603623 NetworkManager[48970]: <info>  [1769846792.9912] manager: (tap52038e6d-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.991 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.996 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.997 226239 INFO os_vif [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:c8:27,bridge_name='br-int',has_traffic_filtering=True,id=52038e6d-42cd-444a-959a-ce24f4c9bb50,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52038e6d-42')#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.997 226239 DEBUG nova.virt.libvirt.vif [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.997 226239 DEBUG nova.network.os_vif_util [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:06:32 np0005603623 nova_compute[226235]: 2026-01-31 08:06:32.998 226239 DEBUG nova.network.os_vif_util [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5c:c8:27,bridge_name='br-int',has_traffic_filtering=True,id=52038e6d-42cd-444a-959a-ce24f4c9bb50,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52038e6d-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:06:33 np0005603623 nova_compute[226235]: 2026-01-31 08:06:33.001 226239 DEBUG nova.virt.libvirt.guest [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] attach device xml: <interface type="ethernet">
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:5c:c8:27"/>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  <target dev="tap52038e6d-42"/>
Jan 31 03:06:33 np0005603623 nova_compute[226235]: </interface>
Jan 31 03:06:33 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:06:33 np0005603623 NetworkManager[48970]: <info>  [1769846793.0101] manager: (tap52038e6d-42): new Tun device (/org/freedesktop/NetworkManager/Devices/111)
Jan 31 03:06:33 np0005603623 kernel: tap52038e6d-42: entered promiscuous mode
Jan 31 03:06:33 np0005603623 nova_compute[226235]: 2026-01-31 08:06:33.011 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:33 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:33Z|00225|binding|INFO|Claiming lport 52038e6d-42cd-444a-959a-ce24f4c9bb50 for this chassis.
Jan 31 03:06:33 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:33Z|00226|binding|INFO|52038e6d-42cd-444a-959a-ce24f4c9bb50: Claiming fa:16:3e:5c:c8:27 10.100.0.12
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.017 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5c:c8:27 10.100.0.12'], port_security=['fa:16:3e:5c:c8:27 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4582dbf2-09cd-4a26-84dd-28adcb24011e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fdcf3a61-8bd1-47a3-8e6c-d6fed17d2331', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=52038e6d-42cd-444a-959a-ce24f4c9bb50) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.018 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 52038e6d-42cd-444a-959a-ce24f4c9bb50 in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f bound to our chassis#033[00m
Jan 31 03:06:33 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:33Z|00227|binding|INFO|Setting lport 52038e6d-42cd-444a-959a-ce24f4c9bb50 ovn-installed in OVS
Jan 31 03:06:33 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:33Z|00228|binding|INFO|Setting lport 52038e6d-42cd-444a-959a-ce24f4c9bb50 up in Southbound
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.020 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:06:33 np0005603623 nova_compute[226235]: 2026-01-31 08:06:33.021 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.030 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[eadaedd5-7545-4a0b-b50e-c0a04476c163]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:33 np0005603623 systemd-udevd[255378]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:06:33 np0005603623 NetworkManager[48970]: <info>  [1769846793.0454] device (tap52038e6d-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:06:33 np0005603623 NetworkManager[48970]: <info>  [1769846793.0459] device (tap52038e6d-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.050 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3aa4ef29-20dc-4c21-83c8-625ac9bafa45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.053 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[0a4a6ba7-ad75-4cc6-bdfd-d91408029827]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.068 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d646d097-7ce2-470e-acc3-eb402a16d677]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.078 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[43dc4791-005b-4ca0-ae9b-2d973ab3d3d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590154, 'reachable_time': 21770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255385, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.086 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b07ce8-ed28-4b07-92a8-ac8a15211a5b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590160, 'tstamp': 590160}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255386, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590162, 'tstamp': 590162}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255386, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.088 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:33 np0005603623 nova_compute[226235]: 2026-01-31 08:06:33.089 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:33 np0005603623 nova_compute[226235]: 2026-01-31 08:06:33.090 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.090 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.091 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.091 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:33.092 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:33 np0005603623 nova_compute[226235]: 2026-01-31 08:06:33.105 226239 DEBUG nova.virt.libvirt.driver [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:33 np0005603623 nova_compute[226235]: 2026-01-31 08:06:33.105 226239 DEBUG nova.virt.libvirt.driver [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:33 np0005603623 nova_compute[226235]: 2026-01-31 08:06:33.105 226239 DEBUG nova.virt.libvirt.driver [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:72:76:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:06:33 np0005603623 nova_compute[226235]: 2026-01-31 08:06:33.105 226239 DEBUG nova.virt.libvirt.driver [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:5c:c8:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:06:33 np0005603623 nova_compute[226235]: 2026-01-31 08:06:33.137 226239 DEBUG nova.virt.libvirt.guest [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  <nova:name>tempest-AttachInterfacesTestJSON-server-401952099</nova:name>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:06:33</nova:creationTime>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:06:33 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:    <nova:port uuid="d3f10293-a2fb-49cc-a81c-f5fee53bb74f">
Jan 31 03:06:33 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:    <nova:port uuid="52038e6d-42cd-444a-959a-ce24f4c9bb50">
Jan 31 03:06:33 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:06:33 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:06:33 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:06:33 np0005603623 nova_compute[226235]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:06:33 np0005603623 nova_compute[226235]: 2026-01-31 08:06:33.160 226239 DEBUG oslo_concurrency.lockutils [None req-4cac9697-34b3-41b1-8311-02d6c8921edd 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-4582dbf2-09cd-4a26-84dd-28adcb24011e-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.504s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:33.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:33.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:34 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:34Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5c:c8:27 10.100.0.12
Jan 31 03:06:34 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:34Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5c:c8:27 10.100.0.12
Jan 31 03:06:35 np0005603623 nova_compute[226235]: 2026-01-31 08:06:35.582 226239 DEBUG nova.compute.manager [req-302afb60-e6d0-4336-87ed-70af0ba73298 req-8bbbf5b3-17e5-43b9-af66-8f36ec039055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-plugged-52038e6d-42cd-444a-959a-ce24f4c9bb50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:35 np0005603623 nova_compute[226235]: 2026-01-31 08:06:35.583 226239 DEBUG oslo_concurrency.lockutils [req-302afb60-e6d0-4336-87ed-70af0ba73298 req-8bbbf5b3-17e5-43b9-af66-8f36ec039055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:35 np0005603623 nova_compute[226235]: 2026-01-31 08:06:35.583 226239 DEBUG oslo_concurrency.lockutils [req-302afb60-e6d0-4336-87ed-70af0ba73298 req-8bbbf5b3-17e5-43b9-af66-8f36ec039055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:35 np0005603623 nova_compute[226235]: 2026-01-31 08:06:35.583 226239 DEBUG oslo_concurrency.lockutils [req-302afb60-e6d0-4336-87ed-70af0ba73298 req-8bbbf5b3-17e5-43b9-af66-8f36ec039055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:35 np0005603623 nova_compute[226235]: 2026-01-31 08:06:35.583 226239 DEBUG nova.compute.manager [req-302afb60-e6d0-4336-87ed-70af0ba73298 req-8bbbf5b3-17e5-43b9-af66-8f36ec039055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-plugged-52038e6d-42cd-444a-959a-ce24f4c9bb50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:06:35 np0005603623 nova_compute[226235]: 2026-01-31 08:06:35.583 226239 WARNING nova.compute.manager [req-302afb60-e6d0-4336-87ed-70af0ba73298 req-8bbbf5b3-17e5-43b9-af66-8f36ec039055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received unexpected event network-vif-plugged-52038e6d-42cd-444a-959a-ce24f4c9bb50 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:06:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:35.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:35.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:36 np0005603623 nova_compute[226235]: 2026-01-31 08:06:36.804 226239 DEBUG nova.network.neutron [req-ba2e8feb-c45a-453d-abb9-c5731c723689 req-3c26bbee-89f0-47f7-9e38-62ac15b18104 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updated VIF entry in instance network info cache for port 52038e6d-42cd-444a-959a-ce24f4c9bb50. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:06:36 np0005603623 nova_compute[226235]: 2026-01-31 08:06:36.805 226239 DEBUG nova.network.neutron [req-ba2e8feb-c45a-453d-abb9-c5731c723689 req-3c26bbee-89f0-47f7-9e38-62ac15b18104 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:36 np0005603623 nova_compute[226235]: 2026-01-31 08:06:36.827 226239 DEBUG oslo_concurrency.lockutils [req-ba2e8feb-c45a-453d-abb9-c5731c723689 req-3c26bbee-89f0-47f7-9e38-62ac15b18104 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.165 226239 DEBUG oslo_concurrency.lockutils [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "interface-4582dbf2-09cd-4a26-84dd-28adcb24011e-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.166 226239 DEBUG oslo_concurrency.lockutils [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-4582dbf2-09cd-4a26-84dd-28adcb24011e-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.166 226239 DEBUG nova.objects.instance [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'flavor' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.470 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.573 226239 DEBUG nova.objects.instance [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.587 226239 DEBUG nova.network.neutron [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:06:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:37.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.668 226239 DEBUG nova.compute.manager [req-f262b79e-850a-4d44-961c-5d9fc1c18629 req-04bc7aa0-711b-4440-bfbe-edc68d93d0a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-plugged-52038e6d-42cd-444a-959a-ce24f4c9bb50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.669 226239 DEBUG oslo_concurrency.lockutils [req-f262b79e-850a-4d44-961c-5d9fc1c18629 req-04bc7aa0-711b-4440-bfbe-edc68d93d0a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.669 226239 DEBUG oslo_concurrency.lockutils [req-f262b79e-850a-4d44-961c-5d9fc1c18629 req-04bc7aa0-711b-4440-bfbe-edc68d93d0a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.669 226239 DEBUG oslo_concurrency.lockutils [req-f262b79e-850a-4d44-961c-5d9fc1c18629 req-04bc7aa0-711b-4440-bfbe-edc68d93d0a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.669 226239 DEBUG nova.compute.manager [req-f262b79e-850a-4d44-961c-5d9fc1c18629 req-04bc7aa0-711b-4440-bfbe-edc68d93d0a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-plugged-52038e6d-42cd-444a-959a-ce24f4c9bb50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.669 226239 WARNING nova.compute.manager [req-f262b79e-850a-4d44-961c-5d9fc1c18629 req-04bc7aa0-711b-4440-bfbe-edc68d93d0a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received unexpected event network-vif-plugged-52038e6d-42cd-444a-959a-ce24f4c9bb50 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.772 226239 DEBUG nova.policy [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60f2b878669c4c529b35e04860cc6d64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c73212dc7c84914b6c934d45b6826f7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.836 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:06:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:37.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:06:37 np0005603623 nova_compute[226235]: 2026-01-31 08:06:37.990 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:39 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:39.527 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:06:39 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:39.527 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:06:39 np0005603623 nova_compute[226235]: 2026-01-31 08:06:39.528 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:39.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:39 np0005603623 nova_compute[226235]: 2026-01-31 08:06:39.625 226239 DEBUG nova.network.neutron [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Successfully created port: 39f1b902-3d83-4831-b91c-5d4e2349cb30 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:06:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:39.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:40 np0005603623 nova_compute[226235]: 2026-01-31 08:06:40.650 226239 DEBUG nova.network.neutron [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Successfully updated port: 39f1b902-3d83-4831-b91c-5d4e2349cb30 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:06:40 np0005603623 nova_compute[226235]: 2026-01-31 08:06:40.678 226239 DEBUG oslo_concurrency.lockutils [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:06:40 np0005603623 nova_compute[226235]: 2026-01-31 08:06:40.678 226239 DEBUG oslo_concurrency.lockutils [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:06:40 np0005603623 nova_compute[226235]: 2026-01-31 08:06:40.679 226239 DEBUG nova.network.neutron [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:06:40 np0005603623 nova_compute[226235]: 2026-01-31 08:06:40.962 226239 WARNING nova.network.neutron [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] 455fab34-b015-4d97-a96d-f7ebd7f7555f already exists in list: networks containing: ['455fab34-b015-4d97-a96d-f7ebd7f7555f']. ignoring it#033[00m
Jan 31 03:06:40 np0005603623 nova_compute[226235]: 2026-01-31 08:06:40.962 226239 WARNING nova.network.neutron [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] 455fab34-b015-4d97-a96d-f7ebd7f7555f already exists in list: networks containing: ['455fab34-b015-4d97-a96d-f7ebd7f7555f']. ignoring it#033[00m
Jan 31 03:06:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:41.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:41.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:42 np0005603623 nova_compute[226235]: 2026-01-31 08:06:42.472 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:42 np0005603623 nova_compute[226235]: 2026-01-31 08:06:42.992 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:43 np0005603623 nova_compute[226235]: 2026-01-31 08:06:43.100 226239 DEBUG nova.compute.manager [req-2fdfd529-03bc-49e5-add2-5561f1d2f02e req-4fd3e9e5-38ce-4333-a6fb-25bcd440a1f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-changed-39f1b902-3d83-4831-b91c-5d4e2349cb30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:43 np0005603623 nova_compute[226235]: 2026-01-31 08:06:43.100 226239 DEBUG nova.compute.manager [req-2fdfd529-03bc-49e5-add2-5561f1d2f02e req-4fd3e9e5-38ce-4333-a6fb-25bcd440a1f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Refreshing instance network info cache due to event network-changed-39f1b902-3d83-4831-b91c-5d4e2349cb30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:06:43 np0005603623 nova_compute[226235]: 2026-01-31 08:06:43.101 226239 DEBUG oslo_concurrency.lockutils [req-2fdfd529-03bc-49e5-add2-5561f1d2f02e req-4fd3e9e5-38ce-4333-a6fb-25bcd440a1f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:06:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:43.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:06:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:43.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:06:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:45.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:45.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:46 np0005603623 nova_compute[226235]: 2026-01-31 08:06:46.771 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:46 np0005603623 podman[255444]: 2026-01-31 08:06:46.996203819 +0000 UTC m=+0.085207827 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Jan 31 03:06:47 np0005603623 podman[255445]: 2026-01-31 08:06:47.002819417 +0000 UTC m=+0.077212465 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:06:47 np0005603623 nova_compute[226235]: 2026-01-31 08:06:47.476 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:47.529 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:47.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:47.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:47 np0005603623 nova_compute[226235]: 2026-01-31 08:06:47.994 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.311 226239 DEBUG nova.network.neutron [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.332 226239 DEBUG oslo_concurrency.lockutils [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.332 226239 DEBUG oslo_concurrency.lockutils [req-2fdfd529-03bc-49e5-add2-5561f1d2f02e req-4fd3e9e5-38ce-4333-a6fb-25bcd440a1f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.333 226239 DEBUG nova.network.neutron [req-2fdfd529-03bc-49e5-add2-5561f1d2f02e req-4fd3e9e5-38ce-4333-a6fb-25bcd440a1f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Refreshing network info cache for port 39f1b902-3d83-4831-b91c-5d4e2349cb30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.339 226239 DEBUG nova.virt.libvirt.vif [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.340 226239 DEBUG nova.network.os_vif_util [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.341 226239 DEBUG nova.network.os_vif_util [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:ea:70,bridge_name='br-int',has_traffic_filtering=True,id=39f1b902-3d83-4831-b91c-5d4e2349cb30,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39f1b902-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.341 226239 DEBUG os_vif [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:ea:70,bridge_name='br-int',has_traffic_filtering=True,id=39f1b902-3d83-4831-b91c-5d4e2349cb30,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39f1b902-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.342 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.343 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.343 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.352 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.353 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap39f1b902-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.354 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap39f1b902-3d, col_values=(('external_ids', {'iface-id': '39f1b902-3d83-4831-b91c-5d4e2349cb30', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:ea:70', 'vm-uuid': '4582dbf2-09cd-4a26-84dd-28adcb24011e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.356 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:49 np0005603623 NetworkManager[48970]: <info>  [1769846809.3572] manager: (tap39f1b902-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.358 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.364 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.364 226239 INFO os_vif [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:ea:70,bridge_name='br-int',has_traffic_filtering=True,id=39f1b902-3d83-4831-b91c-5d4e2349cb30,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39f1b902-3d')#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.365 226239 DEBUG nova.virt.libvirt.vif [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.365 226239 DEBUG nova.network.os_vif_util [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.366 226239 DEBUG nova.network.os_vif_util [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:ea:70,bridge_name='br-int',has_traffic_filtering=True,id=39f1b902-3d83-4831-b91c-5d4e2349cb30,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39f1b902-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.369 226239 DEBUG nova.virt.libvirt.guest [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] attach device xml: <interface type="ethernet">
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:cd:ea:70"/>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  <target dev="tap39f1b902-3d"/>
Jan 31 03:06:49 np0005603623 nova_compute[226235]: </interface>
Jan 31 03:06:49 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:06:49 np0005603623 kernel: tap39f1b902-3d: entered promiscuous mode
Jan 31 03:06:49 np0005603623 NetworkManager[48970]: <info>  [1769846809.3842] manager: (tap39f1b902-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/113)
Jan 31 03:06:49 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:49Z|00229|binding|INFO|Claiming lport 39f1b902-3d83-4831-b91c-5d4e2349cb30 for this chassis.
Jan 31 03:06:49 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:49Z|00230|binding|INFO|39f1b902-3d83-4831-b91c-5d4e2349cb30: Claiming fa:16:3e:cd:ea:70 10.100.0.9
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.386 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.393 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:ea:70 10.100.0.9'], port_security=['fa:16:3e:cd:ea:70 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4582dbf2-09cd-4a26-84dd-28adcb24011e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fdcf3a61-8bd1-47a3-8e6c-d6fed17d2331', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=39f1b902-3d83-4831-b91c-5d4e2349cb30) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:06:49 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:49Z|00231|binding|INFO|Setting lport 39f1b902-3d83-4831-b91c-5d4e2349cb30 ovn-installed in OVS
Jan 31 03:06:49 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:49Z|00232|binding|INFO|Setting lport 39f1b902-3d83-4831-b91c-5d4e2349cb30 up in Southbound
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.395 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.396 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 39f1b902-3d83-4831-b91c-5d4e2349cb30 in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f bound to our chassis#033[00m
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.399 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:06:49 np0005603623 systemd-udevd[255496]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.420 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7ef7edd7-3238-443e-89cc-69b37aae579a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:49 np0005603623 NetworkManager[48970]: <info>  [1769846809.4324] device (tap39f1b902-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:06:49 np0005603623 NetworkManager[48970]: <info>  [1769846809.4334] device (tap39f1b902-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.451 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5c9f07-4f02-45c1-9e49-5b8147c659f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.455 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ac587d38-f7ca-4d8a-95f9-095d33e4ec28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.486 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[44fe2afa-fd88-4f90-b5cf-8dc90ba0d847]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.487 226239 DEBUG nova.virt.libvirt.driver [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.487 226239 DEBUG nova.virt.libvirt.driver [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.488 226239 DEBUG nova.virt.libvirt.driver [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:72:76:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.488 226239 DEBUG nova.virt.libvirt.driver [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:5c:c8:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.489 226239 DEBUG nova.virt.libvirt.driver [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:cd:ea:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.505 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6e32b684-7016-4f04-b010-36f623015079]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590154, 'reachable_time': 34334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255503, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.515 226239 DEBUG nova.virt.libvirt.guest [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  <nova:name>tempest-AttachInterfacesTestJSON-server-401952099</nova:name>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:06:49</nova:creationTime>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    <nova:port uuid="d3f10293-a2fb-49cc-a81c-f5fee53bb74f">
Jan 31 03:06:49 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    <nova:port uuid="52038e6d-42cd-444a-959a-ce24f4c9bb50">
Jan 31 03:06:49 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    <nova:port uuid="39f1b902-3d83-4831-b91c-5d4e2349cb30">
Jan 31 03:06:49 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:06:49 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:06:49 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:06:49 np0005603623 nova_compute[226235]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.530 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[78a68994-2cb2-4bd1-b7f9-2813a8b0e48f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590160, 'tstamp': 590160}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255504, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590162, 'tstamp': 590162}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255504, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.532 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.534 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.537 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.537 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.537 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:49.538 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:49 np0005603623 nova_compute[226235]: 2026-01-31 08:06:49.539 226239 DEBUG oslo_concurrency.lockutils [None req-a870adf2-02e3-46b5-b0d7-cd7ab3ddc118 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-4582dbf2-09cd-4a26-84dd-28adcb24011e-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 12.374s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:49.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:06:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:49.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:06:50 np0005603623 nova_compute[226235]: 2026-01-31 08:06:50.345 226239 DEBUG nova.compute.manager [req-66b58992-52d3-4130-b842-b11fc2278a48 req-30fb7599-1848-4688-86ad-69e82c263359 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-plugged-39f1b902-3d83-4831-b91c-5d4e2349cb30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:50 np0005603623 nova_compute[226235]: 2026-01-31 08:06:50.346 226239 DEBUG oslo_concurrency.lockutils [req-66b58992-52d3-4130-b842-b11fc2278a48 req-30fb7599-1848-4688-86ad-69e82c263359 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:50 np0005603623 nova_compute[226235]: 2026-01-31 08:06:50.347 226239 DEBUG oslo_concurrency.lockutils [req-66b58992-52d3-4130-b842-b11fc2278a48 req-30fb7599-1848-4688-86ad-69e82c263359 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:50 np0005603623 nova_compute[226235]: 2026-01-31 08:06:50.347 226239 DEBUG oslo_concurrency.lockutils [req-66b58992-52d3-4130-b842-b11fc2278a48 req-30fb7599-1848-4688-86ad-69e82c263359 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:50 np0005603623 nova_compute[226235]: 2026-01-31 08:06:50.348 226239 DEBUG nova.compute.manager [req-66b58992-52d3-4130-b842-b11fc2278a48 req-30fb7599-1848-4688-86ad-69e82c263359 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-plugged-39f1b902-3d83-4831-b91c-5d4e2349cb30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:06:50 np0005603623 nova_compute[226235]: 2026-01-31 08:06:50.348 226239 WARNING nova.compute.manager [req-66b58992-52d3-4130-b842-b11fc2278a48 req-30fb7599-1848-4688-86ad-69e82c263359 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received unexpected event network-vif-plugged-39f1b902-3d83-4831-b91c-5d4e2349cb30 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:06:51 np0005603623 nova_compute[226235]: 2026-01-31 08:06:51.241 226239 DEBUG nova.network.neutron [req-2fdfd529-03bc-49e5-add2-5561f1d2f02e req-4fd3e9e5-38ce-4333-a6fb-25bcd440a1f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updated VIF entry in instance network info cache for port 39f1b902-3d83-4831-b91c-5d4e2349cb30. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:06:51 np0005603623 nova_compute[226235]: 2026-01-31 08:06:51.242 226239 DEBUG nova.network.neutron [req-2fdfd529-03bc-49e5-add2-5561f1d2f02e req-4fd3e9e5-38ce-4333-a6fb-25bcd440a1f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:51 np0005603623 nova_compute[226235]: 2026-01-31 08:06:51.257 226239 DEBUG oslo_concurrency.lockutils [req-2fdfd529-03bc-49e5-add2-5561f1d2f02e req-4fd3e9e5-38ce-4333-a6fb-25bcd440a1f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:06:51 np0005603623 nova_compute[226235]: 2026-01-31 08:06:51.586 226239 DEBUG oslo_concurrency.lockutils [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "interface-4582dbf2-09cd-4a26-84dd-28adcb24011e-0b091e6e-9544-424b-b33c-45258253789e" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:51 np0005603623 nova_compute[226235]: 2026-01-31 08:06:51.587 226239 DEBUG oslo_concurrency.lockutils [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-4582dbf2-09cd-4a26-84dd-28adcb24011e-0b091e6e-9544-424b-b33c-45258253789e" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:51 np0005603623 nova_compute[226235]: 2026-01-31 08:06:51.587 226239 DEBUG nova.objects.instance [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'flavor' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:51.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:51.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:52Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cd:ea:70 10.100.0.9
Jan 31 03:06:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:52Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cd:ea:70 10.100.0.9
Jan 31 03:06:52 np0005603623 nova_compute[226235]: 2026-01-31 08:06:52.261 226239 DEBUG nova.objects.instance [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:52 np0005603623 nova_compute[226235]: 2026-01-31 08:06:52.275 226239 DEBUG nova.network.neutron [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:06:52 np0005603623 nova_compute[226235]: 2026-01-31 08:06:52.463 226239 DEBUG nova.compute.manager [req-4005f759-ed5e-40c9-a976-0ae4e94c035f req-a6f0f6d4-765d-41f4-b6c9-a5755747573a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-plugged-39f1b902-3d83-4831-b91c-5d4e2349cb30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:52 np0005603623 nova_compute[226235]: 2026-01-31 08:06:52.463 226239 DEBUG oslo_concurrency.lockutils [req-4005f759-ed5e-40c9-a976-0ae4e94c035f req-a6f0f6d4-765d-41f4-b6c9-a5755747573a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:52 np0005603623 nova_compute[226235]: 2026-01-31 08:06:52.464 226239 DEBUG oslo_concurrency.lockutils [req-4005f759-ed5e-40c9-a976-0ae4e94c035f req-a6f0f6d4-765d-41f4-b6c9-a5755747573a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:52 np0005603623 nova_compute[226235]: 2026-01-31 08:06:52.464 226239 DEBUG oslo_concurrency.lockutils [req-4005f759-ed5e-40c9-a976-0ae4e94c035f req-a6f0f6d4-765d-41f4-b6c9-a5755747573a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:52 np0005603623 nova_compute[226235]: 2026-01-31 08:06:52.464 226239 DEBUG nova.compute.manager [req-4005f759-ed5e-40c9-a976-0ae4e94c035f req-a6f0f6d4-765d-41f4-b6c9-a5755747573a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-plugged-39f1b902-3d83-4831-b91c-5d4e2349cb30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:06:52 np0005603623 nova_compute[226235]: 2026-01-31 08:06:52.464 226239 WARNING nova.compute.manager [req-4005f759-ed5e-40c9-a976-0ae4e94c035f req-a6f0f6d4-765d-41f4-b6c9-a5755747573a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received unexpected event network-vif-plugged-39f1b902-3d83-4831-b91c-5d4e2349cb30 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:06:52 np0005603623 nova_compute[226235]: 2026-01-31 08:06:52.478 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:52 np0005603623 nova_compute[226235]: 2026-01-31 08:06:52.640 226239 DEBUG nova.policy [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60f2b878669c4c529b35e04860cc6d64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c73212dc7c84914b6c934d45b6826f7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:06:53 np0005603623 nova_compute[226235]: 2026-01-31 08:06:53.358 226239 DEBUG nova.network.neutron [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Successfully updated port: 0b091e6e-9544-424b-b33c-45258253789e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:06:53 np0005603623 nova_compute[226235]: 2026-01-31 08:06:53.380 226239 DEBUG oslo_concurrency.lockutils [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:06:53 np0005603623 nova_compute[226235]: 2026-01-31 08:06:53.381 226239 DEBUG oslo_concurrency.lockutils [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:06:53 np0005603623 nova_compute[226235]: 2026-01-31 08:06:53.381 226239 DEBUG nova.network.neutron [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:06:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:53.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:53.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:53 np0005603623 nova_compute[226235]: 2026-01-31 08:06:53.934 226239 WARNING nova.network.neutron [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] 455fab34-b015-4d97-a96d-f7ebd7f7555f already exists in list: networks containing: ['455fab34-b015-4d97-a96d-f7ebd7f7555f']. ignoring it#033[00m
Jan 31 03:06:53 np0005603623 nova_compute[226235]: 2026-01-31 08:06:53.934 226239 WARNING nova.network.neutron [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] 455fab34-b015-4d97-a96d-f7ebd7f7555f already exists in list: networks containing: ['455fab34-b015-4d97-a96d-f7ebd7f7555f']. ignoring it#033[00m
Jan 31 03:06:53 np0005603623 nova_compute[226235]: 2026-01-31 08:06:53.934 226239 WARNING nova.network.neutron [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] 455fab34-b015-4d97-a96d-f7ebd7f7555f already exists in list: networks containing: ['455fab34-b015-4d97-a96d-f7ebd7f7555f']. ignoring it#033[00m
Jan 31 03:06:54 np0005603623 nova_compute[226235]: 2026-01-31 08:06:54.358 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:54 np0005603623 nova_compute[226235]: 2026-01-31 08:06:54.583 226239 DEBUG nova.compute.manager [req-5e854849-ee2c-47e1-b006-29d6e7addaf1 req-a2a34163-9f3b-403c-9e62-d9873890714d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-changed-0b091e6e-9544-424b-b33c-45258253789e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:54 np0005603623 nova_compute[226235]: 2026-01-31 08:06:54.584 226239 DEBUG nova.compute.manager [req-5e854849-ee2c-47e1-b006-29d6e7addaf1 req-a2a34163-9f3b-403c-9e62-d9873890714d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Refreshing instance network info cache due to event network-changed-0b091e6e-9544-424b-b33c-45258253789e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:06:54 np0005603623 nova_compute[226235]: 2026-01-31 08:06:54.584 226239 DEBUG oslo_concurrency.lockutils [req-5e854849-ee2c-47e1-b006-29d6e7addaf1 req-a2a34163-9f3b-403c-9e62-d9873890714d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:06:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:55.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:55.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:57 np0005603623 nova_compute[226235]: 2026-01-31 08:06:57.480 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:57.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:57.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.317 226239 DEBUG nova.network.neutron [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.390 226239 DEBUG oslo_concurrency.lockutils [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.392 226239 DEBUG oslo_concurrency.lockutils [req-5e854849-ee2c-47e1-b006-29d6e7addaf1 req-a2a34163-9f3b-403c-9e62-d9873890714d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.393 226239 DEBUG nova.network.neutron [req-5e854849-ee2c-47e1-b006-29d6e7addaf1 req-a2a34163-9f3b-403c-9e62-d9873890714d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Refreshing network info cache for port 0b091e6e-9544-424b-b33c-45258253789e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.398 226239 DEBUG nova.virt.libvirt.vif [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.399 226239 DEBUG nova.network.os_vif_util [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.401 226239 DEBUG nova.network.os_vif_util [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:00:70,bridge_name='br-int',has_traffic_filtering=True,id=0b091e6e-9544-424b-b33c-45258253789e,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b091e6e-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.402 226239 DEBUG os_vif [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:00:70,bridge_name='br-int',has_traffic_filtering=True,id=0b091e6e-9544-424b-b33c-45258253789e,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b091e6e-95') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.403 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.404 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.405 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.410 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.410 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b091e6e-95, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.411 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b091e6e-95, col_values=(('external_ids', {'iface-id': '0b091e6e-9544-424b-b33c-45258253789e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:00:70', 'vm-uuid': '4582dbf2-09cd-4a26-84dd-28adcb24011e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:58 np0005603623 NetworkManager[48970]: <info>  [1769846818.4154] manager: (tap0b091e6e-95): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.417 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.422 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.423 226239 INFO os_vif [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:00:70,bridge_name='br-int',has_traffic_filtering=True,id=0b091e6e-9544-424b-b33c-45258253789e,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b091e6e-95')#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.425 226239 DEBUG nova.virt.libvirt.vif [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.426 226239 DEBUG nova.network.os_vif_util [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.427 226239 DEBUG nova.network.os_vif_util [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:00:70,bridge_name='br-int',has_traffic_filtering=True,id=0b091e6e-9544-424b-b33c-45258253789e,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b091e6e-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.429 226239 DEBUG nova.virt.libvirt.guest [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] attach device xml: <interface type="ethernet">
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:86:00:70"/>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  <target dev="tap0b091e6e-95"/>
Jan 31 03:06:58 np0005603623 nova_compute[226235]: </interface>
Jan 31 03:06:58 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:06:58 np0005603623 NetworkManager[48970]: <info>  [1769846818.4458] manager: (tap0b091e6e-95): new Tun device (/org/freedesktop/NetworkManager/Devices/115)
Jan 31 03:06:58 np0005603623 kernel: tap0b091e6e-95: entered promiscuous mode
Jan 31 03:06:58 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:58Z|00233|binding|INFO|Claiming lport 0b091e6e-9544-424b-b33c-45258253789e for this chassis.
Jan 31 03:06:58 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:58Z|00234|binding|INFO|0b091e6e-9544-424b-b33c-45258253789e: Claiming fa:16:3e:86:00:70 10.100.0.13
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.451 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:58 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:58Z|00235|binding|INFO|Setting lport 0b091e6e-9544-424b-b33c-45258253789e ovn-installed in OVS
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.457 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.463 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:58 np0005603623 systemd-udevd[255647]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:06:58 np0005603623 NetworkManager[48970]: <info>  [1769846818.4857] device (tap0b091e6e-95): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:06:58 np0005603623 NetworkManager[48970]: <info>  [1769846818.4866] device (tap0b091e6e-95): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:06:58 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:58Z|00236|binding|INFO|Setting lport 0b091e6e-9544-424b-b33c-45258253789e up in Southbound
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.566 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:00:70 10.100.0.13'], port_security=['fa:16:3e:86:00:70 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1283313675', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4582dbf2-09cd-4a26-84dd-28adcb24011e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1283313675', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fdcf3a61-8bd1-47a3-8e6c-d6fed17d2331', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=0b091e6e-9544-424b-b33c-45258253789e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.568 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 0b091e6e-9544-424b-b33c-45258253789e in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f bound to our chassis#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.572 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.586 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[27c79503-0637-468f-b91a-828eb0069f67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.615 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c4e5db63-09ca-421e-99ee-cbe8642f959f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.621 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[53e4e6c7-c57e-46bb-985b-d391b2b6cf6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.636 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[84583f2d-c9a0-4dae-a569-732850e411d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.649 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c66fed3d-029e-4cb7-9612-e3b648f45105]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 9, 'rx_bytes': 742, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 9, 'rx_bytes': 742, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590154, 'reachable_time': 34334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255655, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.666 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4e6e66aa-120f-4096-98a7-9fef736be6f6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590160, 'tstamp': 590160}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255656, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590162, 'tstamp': 590162}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255656, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.668 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.670 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.673 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.673 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.674 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:06:58.675 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.824 226239 DEBUG nova.virt.libvirt.driver [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.825 226239 DEBUG nova.virt.libvirt.driver [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.825 226239 DEBUG nova.virt.libvirt.driver [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:72:76:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.826 226239 DEBUG nova.virt.libvirt.driver [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:5c:c8:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:06:58 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:58Z|00237|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.826 226239 DEBUG nova.virt.libvirt.driver [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:cd:ea:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.826 226239 DEBUG nova.virt.libvirt.driver [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:86:00:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.846 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.932 226239 DEBUG nova.virt.libvirt.guest [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  <nova:name>tempest-AttachInterfacesTestJSON-server-401952099</nova:name>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:06:58</nova:creationTime>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    <nova:port uuid="d3f10293-a2fb-49cc-a81c-f5fee53bb74f">
Jan 31 03:06:58 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    <nova:port uuid="52038e6d-42cd-444a-959a-ce24f4c9bb50">
Jan 31 03:06:58 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    <nova:port uuid="39f1b902-3d83-4831-b91c-5d4e2349cb30">
Jan 31 03:06:58 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    <nova:port uuid="0b091e6e-9544-424b-b33c-45258253789e">
Jan 31 03:06:58 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:06:58 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:06:58 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:06:58 np0005603623 nova_compute[226235]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:06:58 np0005603623 nova_compute[226235]: 2026-01-31 08:06:58.990 226239 DEBUG oslo_concurrency.lockutils [None req-46840057-403f-4a50-8256-fe3aaa788199 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-4582dbf2-09cd-4a26-84dd-28adcb24011e-0b091e6e-9544-424b-b33c-45258253789e" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:59 np0005603623 nova_compute[226235]: 2026-01-31 08:06:59.189 226239 DEBUG nova.compute.manager [req-3b9b05ce-ca21-4911-af2c-ba678aed7b09 req-d6b0ffe6-f77d-43ff-a0ea-7353092b2f9e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-plugged-0b091e6e-9544-424b-b33c-45258253789e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:59 np0005603623 nova_compute[226235]: 2026-01-31 08:06:59.189 226239 DEBUG oslo_concurrency.lockutils [req-3b9b05ce-ca21-4911-af2c-ba678aed7b09 req-d6b0ffe6-f77d-43ff-a0ea-7353092b2f9e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:59 np0005603623 nova_compute[226235]: 2026-01-31 08:06:59.189 226239 DEBUG oslo_concurrency.lockutils [req-3b9b05ce-ca21-4911-af2c-ba678aed7b09 req-d6b0ffe6-f77d-43ff-a0ea-7353092b2f9e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:59 np0005603623 nova_compute[226235]: 2026-01-31 08:06:59.190 226239 DEBUG oslo_concurrency.lockutils [req-3b9b05ce-ca21-4911-af2c-ba678aed7b09 req-d6b0ffe6-f77d-43ff-a0ea-7353092b2f9e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:59 np0005603623 nova_compute[226235]: 2026-01-31 08:06:59.190 226239 DEBUG nova.compute.manager [req-3b9b05ce-ca21-4911-af2c-ba678aed7b09 req-d6b0ffe6-f77d-43ff-a0ea-7353092b2f9e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-plugged-0b091e6e-9544-424b-b33c-45258253789e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:06:59 np0005603623 nova_compute[226235]: 2026-01-31 08:06:59.190 226239 WARNING nova.compute.manager [req-3b9b05ce-ca21-4911-af2c-ba678aed7b09 req-d6b0ffe6-f77d-43ff-a0ea-7353092b2f9e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received unexpected event network-vif-plugged-0b091e6e-9544-424b-b33c-45258253789e for instance with vm_state active and task_state None.#033[00m
Jan 31 03:06:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:59.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:59Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:86:00:70 10.100.0.13
Jan 31 03:06:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:06:59Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:86:00:70 10.100.0.13
Jan 31 03:06:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:06:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:06:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:59.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:06:59 np0005603623 nova_compute[226235]: 2026-01-31 08:06:59.963 226239 DEBUG nova.network.neutron [req-5e854849-ee2c-47e1-b006-29d6e7addaf1 req-a2a34163-9f3b-403c-9e62-d9873890714d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updated VIF entry in instance network info cache for port 0b091e6e-9544-424b-b33c-45258253789e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:06:59 np0005603623 nova_compute[226235]: 2026-01-31 08:06:59.964 226239 DEBUG nova.network.neutron [req-5e854849-ee2c-47e1-b006-29d6e7addaf1 req-a2a34163-9f3b-403c-9e62-d9873890714d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:00 np0005603623 nova_compute[226235]: 2026-01-31 08:07:00.034 226239 DEBUG oslo_concurrency.lockutils [req-5e854849-ee2c-47e1-b006-29d6e7addaf1 req-a2a34163-9f3b-403c-9e62-d9873890714d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:07:00 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:07:00 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.164 226239 DEBUG oslo_concurrency.lockutils [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "interface-4582dbf2-09cd-4a26-84dd-28adcb24011e-52038e6d-42cd-444a-959a-ce24f4c9bb50" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.165 226239 DEBUG oslo_concurrency.lockutils [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-4582dbf2-09cd-4a26-84dd-28adcb24011e-52038e6d-42cd-444a-959a-ce24f4c9bb50" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.222 226239 DEBUG nova.objects.instance [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'flavor' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.517 226239 DEBUG nova.virt.libvirt.vif [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.518 226239 DEBUG nova.network.os_vif_util [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.519 226239 DEBUG nova.network.os_vif_util [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5c:c8:27,bridge_name='br-int',has_traffic_filtering=True,id=52038e6d-42cd-444a-959a-ce24f4c9bb50,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52038e6d-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.523 226239 DEBUG nova.virt.libvirt.guest [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5c:c8:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap52038e6d-42"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.527 226239 DEBUG nova.virt.libvirt.guest [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5c:c8:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap52038e6d-42"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.529 226239 DEBUG nova.virt.libvirt.driver [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Attempting to detach device tap52038e6d-42 from instance 4582dbf2-09cd-4a26-84dd-28adcb24011e from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.529 226239 DEBUG nova.virt.libvirt.guest [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] detach device xml: <interface type="ethernet">
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:5c:c8:27"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <target dev="tap52038e6d-42"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]: </interface>
Jan 31 03:07:01 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.582 226239 DEBUG nova.virt.libvirt.guest [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5c:c8:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap52038e6d-42"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.589 226239 DEBUG nova.virt.libvirt.guest [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:5c:c8:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap52038e6d-42"/></interface>not found in domain: <domain type='kvm' id='30'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <name>instance-00000040</name>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <uuid>4582dbf2-09cd-4a26-84dd-28adcb24011e</uuid>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <nova:name>tempest-AttachInterfacesTestJSON-server-401952099</nova:name>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:06:58</nova:creationTime>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <nova:port uuid="d3f10293-a2fb-49cc-a81c-f5fee53bb74f">
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <nova:port uuid="52038e6d-42cd-444a-959a-ce24f4c9bb50">
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <nova:port uuid="39f1b902-3d83-4831-b91c-5d4e2349cb30">
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <nova:port uuid="0b091e6e-9544-424b-b33c-45258253789e">
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:07:01 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <memory unit='KiB'>131072</memory>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <resource>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <partition>/machine</partition>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </resource>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <sysinfo type='smbios'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <entry name='serial'>4582dbf2-09cd-4a26-84dd-28adcb24011e</entry>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <entry name='uuid'>4582dbf2-09cd-4a26-84dd-28adcb24011e</entry>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <boot dev='hd'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <smbios mode='sysinfo'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <vmcoreinfo state='on'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <model fallback='forbid'>Nehalem</model>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <feature policy='require' name='x2apic'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <feature policy='require' name='hypervisor'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <feature policy='require' name='vme'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <clock offset='utc'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <timer name='hpet' present='no'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <on_reboot>restart</on_reboot>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <on_crash>destroy</on_crash>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <disk type='network' device='disk'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/4582dbf2-09cd-4a26-84dd-28adcb24011e_disk' index='2'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target dev='vda' bus='virtio'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='virtio-disk0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <disk type='network' device='cdrom'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/4582dbf2-09cd-4a26-84dd-28adcb24011e_disk.config' index='1'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target dev='sda' bus='sata'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <readonly/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='sata0-0-0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pcie.0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='1' port='0x10'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.1'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='2' port='0x11'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.2'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='3' port='0x12'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.3'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='4' port='0x13'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.4'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='5' port='0x14'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.5'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='6' port='0x15'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.6'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='7' port='0x16'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.7'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='8' port='0x17'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.8'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='9' port='0x18'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.9'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='10' port='0x19'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.10'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='11' port='0x1a'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.11'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='12' port='0x1b'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.12'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='13' port='0x1c'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.13'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='14' port='0x1d'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.14'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='15' port='0x1e'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.15'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='16' port='0x1f'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.16'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='17' port='0x20'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.17'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='18' port='0x21'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.18'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='19' port='0x22'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.19'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='20' port='0x23'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.20'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='21' port='0x24'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.21'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='22' port='0x25'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.22'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='23' port='0x26'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.23'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='24' port='0x27'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.24'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target chassis='25' port='0x28'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.25'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model name='pcie-pci-bridge'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='pci.26'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='usb'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <controller type='sata' index='0'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='ide'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:72:76:91'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target dev='tapd3f10293-a2'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='net0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:5c:c8:27'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target dev='tap52038e6d-42'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='net1'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:cd:ea:70'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target dev='tap39f1b902-3d'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='net2'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:86:00:70'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target dev='tap0b091e6e-95'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='net3'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <serial type='pty'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <source path='/dev/pts/0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/console.log' append='off'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target type='isa-serial' port='0'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:        <model name='isa-serial'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      </target>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='serial0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <source path='/dev/pts/0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/console.log' append='off'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <target type='serial' port='0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='serial0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </console>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <input type='tablet' bus='usb'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='input0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <input type='mouse' bus='ps2'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='input1'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <input type='keyboard' bus='ps2'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='input2'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <listen type='address' address='::0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </graphics>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <audio id='1' type='none'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='video0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <watchdog model='itco' action='reset'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='watchdog0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </watchdog>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <memballoon model='virtio'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <stats period='10'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='balloon0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <rng model='virtio'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <alias name='rng0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <label>system_u:system_r:svirt_t:s0:c705,c995</label>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c705,c995</imagelabel>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </seclabel>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <label>+107:+107</label>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:    <imagelabel>+107:+107</imagelabel>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  </seclabel>
Jan 31 03:07:01 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:07:01 np0005603623 nova_compute[226235]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.590 226239 INFO nova.virt.libvirt.driver [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully detached device tap52038e6d-42 from instance 4582dbf2-09cd-4a26-84dd-28adcb24011e from the persistent domain config.
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.591 226239 DEBUG nova.virt.libvirt.driver [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] (1/8): Attempting to detach device tap52038e6d-42 with device alias net1 from instance 4582dbf2-09cd-4a26-84dd-28adcb24011e from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 31 03:07:01 np0005603623 nova_compute[226235]: 2026-01-31 08:07:01.591 226239 DEBUG nova.virt.libvirt.guest [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] detach device xml: <interface type="ethernet">
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:5c:c8:27"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]:  <target dev="tap52038e6d-42"/>
Jan 31 03:07:01 np0005603623 nova_compute[226235]: </interface>
Jan 31 03:07:01 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 31 03:07:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:01.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:03 np0005603623 kernel: tap52038e6d-42 (unregistering): left promiscuous mode
Jan 31 03:07:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:03.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.138 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:07:03 np0005603623 NetworkManager[48970]: <info>  [1769846823.1404] device (tap52038e6d-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:07:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:03Z|00238|binding|INFO|Releasing lport 52038e6d-42cd-444a-959a-ce24f4c9bb50 from this chassis (sb_readonly=0)
Jan 31 03:07:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:03Z|00239|binding|INFO|Setting lport 52038e6d-42cd-444a-959a-ce24f4c9bb50 down in Southbound
Jan 31 03:07:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:03Z|00240|binding|INFO|Removing iface tap52038e6d-42 ovn-installed in OVS
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.147 226239 DEBUG nova.virt.libvirt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Received event <DeviceRemovedEvent: 1769846823.1469457, 4582dbf2-09cd-4a26-84dd-28adcb24011e => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.148 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.149 226239 DEBUG nova.virt.libvirt.driver [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Start waiting for the detach event from libvirt for device tap52038e6d-42 with device alias net1 for instance 4582dbf2-09cd-4a26-84dd-28adcb24011e _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.149 226239 DEBUG nova.virt.libvirt.guest [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5c:c8:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap52038e6d-42"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.150 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.153 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.154 226239 DEBUG nova.virt.libvirt.guest [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:5c:c8:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap52038e6d-42"/></interface>not found in domain: <domain type='kvm' id='30'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <name>instance-00000040</name>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <uuid>4582dbf2-09cd-4a26-84dd-28adcb24011e</uuid>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:name>tempest-AttachInterfacesTestJSON-server-401952099</nova:name>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:06:58</nova:creationTime>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:port uuid="d3f10293-a2fb-49cc-a81c-f5fee53bb74f">
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:port uuid="52038e6d-42cd-444a-959a-ce24f4c9bb50">
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:port uuid="39f1b902-3d83-4831-b91c-5d4e2349cb30">
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:port uuid="0b091e6e-9544-424b-b33c-45258253789e">
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:07:03 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <memory unit='KiB'>131072</memory>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <resource>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <partition>/machine</partition>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </resource>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <sysinfo type='smbios'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <entry name='serial'>4582dbf2-09cd-4a26-84dd-28adcb24011e</entry>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <entry name='uuid'>4582dbf2-09cd-4a26-84dd-28adcb24011e</entry>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <boot dev='hd'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <smbios mode='sysinfo'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <vmcoreinfo state='on'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <model fallback='forbid'>Nehalem</model>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <feature policy='require' name='x2apic'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <feature policy='require' name='hypervisor'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <feature policy='require' name='vme'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <clock offset='utc'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <timer name='hpet' present='no'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <on_reboot>restart</on_reboot>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <on_crash>destroy</on_crash>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <disk type='network' device='disk'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/4582dbf2-09cd-4a26-84dd-28adcb24011e_disk' index='2'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target dev='vda' bus='virtio'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='virtio-disk0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <disk type='network' device='cdrom'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/4582dbf2-09cd-4a26-84dd-28adcb24011e_disk.config' index='1'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target dev='sda' bus='sata'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <readonly/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='sata0-0-0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pcie.0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='1' port='0x10'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.1'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='2' port='0x11'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.2'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='3' port='0x12'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.3'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='4' port='0x13'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.4'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='5' port='0x14'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.5'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='6' port='0x15'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.6'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='7' port='0x16'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.7'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='8' port='0x17'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.8'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='9' port='0x18'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.9'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='10' port='0x19'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.10'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='11' port='0x1a'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.11'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='12' port='0x1b'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.12'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='13' port='0x1c'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.13'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='14' port='0x1d'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.14'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='15' port='0x1e'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.15'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='16' port='0x1f'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.16'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='17' port='0x20'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.17'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='18' port='0x21'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.18'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='19' port='0x22'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.19'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='20' port='0x23'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.20'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='21' port='0x24'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.21'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='22' port='0x25'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.22'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='23' port='0x26'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.23'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='24' port='0x27'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.24'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target chassis='25' port='0x28'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.25'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model name='pcie-pci-bridge'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='pci.26'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='usb'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <controller type='sata' index='0'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='ide'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:72:76:91'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target dev='tapd3f10293-a2'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='net0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:cd:ea:70'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target dev='tap39f1b902-3d'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='net2'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:86:00:70'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target dev='tap0b091e6e-95'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='net3'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <serial type='pty'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <source path='/dev/pts/0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/console.log' append='off'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target type='isa-serial' port='0'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:        <model name='isa-serial'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      </target>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='serial0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <source path='/dev/pts/0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/console.log' append='off'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <target type='serial' port='0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='serial0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </console>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <input type='tablet' bus='usb'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='input0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <input type='mouse' bus='ps2'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='input1'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <input type='keyboard' bus='ps2'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='input2'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <listen type='address' address='::0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </graphics>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <audio id='1' type='none'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='video0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <watchdog model='itco' action='reset'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='watchdog0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </watchdog>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <memballoon model='virtio'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <stats period='10'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='balloon0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <rng model='virtio'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <alias name='rng0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <label>system_u:system_r:svirt_t:s0:c705,c995</label>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c705,c995</imagelabel>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </seclabel>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <label>+107:+107</label>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <imagelabel>+107:+107</imagelabel>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </seclabel>
Jan 31 03:07:03 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:07:03 np0005603623 nova_compute[226235]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.154 226239 INFO nova.virt.libvirt.driver [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully detached device tap52038e6d-42 from instance 4582dbf2-09cd-4a26-84dd-28adcb24011e from the live domain config.#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.155 226239 DEBUG nova.virt.libvirt.vif [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.155 226239 DEBUG nova.network.os_vif_util [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.156 226239 DEBUG nova.network.os_vif_util [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5c:c8:27,bridge_name='br-int',has_traffic_filtering=True,id=52038e6d-42cd-444a-959a-ce24f4c9bb50,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52038e6d-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.156 226239 DEBUG os_vif [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5c:c8:27,bridge_name='br-int',has_traffic_filtering=True,id=52038e6d-42cd-444a-959a-ce24f4c9bb50,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52038e6d-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.157 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.158 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52038e6d-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.159 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.161 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.162 226239 INFO os_vif [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5c:c8:27,bridge_name='br-int',has_traffic_filtering=True,id=52038e6d-42cd-444a-959a-ce24f4c9bb50,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52038e6d-42')#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.163 226239 DEBUG nova.virt.libvirt.guest [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:name>tempest-AttachInterfacesTestJSON-server-401952099</nova:name>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:07:03</nova:creationTime>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:port uuid="d3f10293-a2fb-49cc-a81c-f5fee53bb74f">
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:port uuid="39f1b902-3d83-4831-b91c-5d4e2349cb30">
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    <nova:port uuid="0b091e6e-9544-424b-b33c-45258253789e">
Jan 31 03:07:03 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:03 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:07:03 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:07:03 np0005603623 nova_compute[226235]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.166 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5c:c8:27 10.100.0.12'], port_security=['fa:16:3e:5c:c8:27 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4582dbf2-09cd-4a26-84dd-28adcb24011e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fdcf3a61-8bd1-47a3-8e6c-d6fed17d2331', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=52038e6d-42cd-444a-959a-ce24f4c9bb50) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.168 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 52038e6d-42cd-444a-959a-ce24f4c9bb50 in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f unbound from our chassis#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.170 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.178 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bde082b5-4647-4abc-ab78-47257b68debc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.200 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f912f042-ac23-427a-80e4-3e45a6f523b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.205 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[eb0cb27d-0ef1-40d0-8b66-e6221a146dd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.226 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad5e8a5-83da-444f-9dad-0a1f431366a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.238 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a426459a-3f02-44b3-8176-ff36dc62eab0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590154, 'reachable_time': 34334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255720, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.249 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[09d075c2-57b9-4769-b873-6816533ede26]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590160, 'tstamp': 590160}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255721, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590162, 'tstamp': 590162}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255721, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.250 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.252 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.253 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.254 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.255 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.255 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:03.255 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:07:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:07:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:07:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:03.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.769 226239 DEBUG nova.compute.manager [req-247b1b4a-cc4f-4be1-8714-bc36b5ac7659 req-0e94e46b-3c4d-443c-be67-3a8e8620bafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-unplugged-52038e6d-42cd-444a-959a-ce24f4c9bb50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.770 226239 DEBUG oslo_concurrency.lockutils [req-247b1b4a-cc4f-4be1-8714-bc36b5ac7659 req-0e94e46b-3c4d-443c-be67-3a8e8620bafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.770 226239 DEBUG oslo_concurrency.lockutils [req-247b1b4a-cc4f-4be1-8714-bc36b5ac7659 req-0e94e46b-3c4d-443c-be67-3a8e8620bafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.771 226239 DEBUG oslo_concurrency.lockutils [req-247b1b4a-cc4f-4be1-8714-bc36b5ac7659 req-0e94e46b-3c4d-443c-be67-3a8e8620bafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.771 226239 DEBUG nova.compute.manager [req-247b1b4a-cc4f-4be1-8714-bc36b5ac7659 req-0e94e46b-3c4d-443c-be67-3a8e8620bafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-unplugged-52038e6d-42cd-444a-959a-ce24f4c9bb50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.772 226239 WARNING nova.compute.manager [req-247b1b4a-cc4f-4be1-8714-bc36b5ac7659 req-0e94e46b-3c4d-443c-be67-3a8e8620bafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received unexpected event network-vif-unplugged-52038e6d-42cd-444a-959a-ce24f4c9bb50 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.955 226239 DEBUG nova.compute.manager [req-e31ddcea-9f83-474f-baba-8e7835a52ad2 req-c8a45078-161b-43c2-bb11-cc824df93fac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-plugged-0b091e6e-9544-424b-b33c-45258253789e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.955 226239 DEBUG oslo_concurrency.lockutils [req-e31ddcea-9f83-474f-baba-8e7835a52ad2 req-c8a45078-161b-43c2-bb11-cc824df93fac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.956 226239 DEBUG oslo_concurrency.lockutils [req-e31ddcea-9f83-474f-baba-8e7835a52ad2 req-c8a45078-161b-43c2-bb11-cc824df93fac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.956 226239 DEBUG oslo_concurrency.lockutils [req-e31ddcea-9f83-474f-baba-8e7835a52ad2 req-c8a45078-161b-43c2-bb11-cc824df93fac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.956 226239 DEBUG nova.compute.manager [req-e31ddcea-9f83-474f-baba-8e7835a52ad2 req-c8a45078-161b-43c2-bb11-cc824df93fac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-plugged-0b091e6e-9544-424b-b33c-45258253789e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:07:03 np0005603623 nova_compute[226235]: 2026-01-31 08:07:03.956 226239 WARNING nova.compute.manager [req-e31ddcea-9f83-474f-baba-8e7835a52ad2 req-c8a45078-161b-43c2-bb11-cc824df93fac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received unexpected event network-vif-plugged-0b091e6e-9544-424b-b33c-45258253789e for instance with vm_state active and task_state None.#033[00m
Jan 31 03:07:04 np0005603623 nova_compute[226235]: 2026-01-31 08:07:04.590 226239 DEBUG nova.compute.manager [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-deleted-52038e6d-42cd-444a-959a-ce24f4c9bb50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:04 np0005603623 nova_compute[226235]: 2026-01-31 08:07:04.590 226239 INFO nova.compute.manager [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Neutron deleted interface 52038e6d-42cd-444a-959a-ce24f4c9bb50; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:07:04 np0005603623 nova_compute[226235]: 2026-01-31 08:07:04.591 226239 DEBUG nova.network.neutron [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:04 np0005603623 nova_compute[226235]: 2026-01-31 08:07:04.602 226239 DEBUG oslo_concurrency.lockutils [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:07:04 np0005603623 nova_compute[226235]: 2026-01-31 08:07:04.602 226239 DEBUG oslo_concurrency.lockutils [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:07:04 np0005603623 nova_compute[226235]: 2026-01-31 08:07:04.602 226239 DEBUG nova.network.neutron [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:07:04 np0005603623 nova_compute[226235]: 2026-01-31 08:07:04.814 226239 DEBUG nova.objects.instance [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lazy-loading 'system_metadata' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:04 np0005603623 nova_compute[226235]: 2026-01-31 08:07:04.928 226239 DEBUG nova.objects.instance [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lazy-loading 'flavor' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.014 226239 DEBUG nova.virt.libvirt.vif [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.015 226239 DEBUG nova.network.os_vif_util [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Converting VIF {"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.015 226239 DEBUG nova.network.os_vif_util [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5c:c8:27,bridge_name='br-int',has_traffic_filtering=True,id=52038e6d-42cd-444a-959a-ce24f4c9bb50,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52038e6d-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.020 226239 DEBUG nova.virt.libvirt.guest [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5c:c8:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap52038e6d-42"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.025 226239 DEBUG nova.virt.libvirt.guest [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:5c:c8:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap52038e6d-42"/></interface>not found in domain: <domain type='kvm' id='30'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <name>instance-00000040</name>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <uuid>4582dbf2-09cd-4a26-84dd-28adcb24011e</uuid>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:name>tempest-AttachInterfacesTestJSON-server-401952099</nova:name>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:07:03</nova:creationTime>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:port uuid="d3f10293-a2fb-49cc-a81c-f5fee53bb74f">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:port uuid="39f1b902-3d83-4831-b91c-5d4e2349cb30">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:port uuid="0b091e6e-9544-424b-b33c-45258253789e">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:07:05 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <memory unit='KiB'>131072</memory>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <resource>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <partition>/machine</partition>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </resource>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <sysinfo type='smbios'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <entry name='serial'>4582dbf2-09cd-4a26-84dd-28adcb24011e</entry>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <entry name='uuid'>4582dbf2-09cd-4a26-84dd-28adcb24011e</entry>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <boot dev='hd'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <smbios mode='sysinfo'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <vmcoreinfo state='on'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <model fallback='forbid'>Nehalem</model>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <feature policy='require' name='x2apic'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <feature policy='require' name='hypervisor'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <feature policy='require' name='vme'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <clock offset='utc'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <timer name='hpet' present='no'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <on_reboot>restart</on_reboot>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <on_crash>destroy</on_crash>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <disk type='network' device='disk'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/4582dbf2-09cd-4a26-84dd-28adcb24011e_disk' index='2'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target dev='vda' bus='virtio'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='virtio-disk0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <disk type='network' device='cdrom'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/4582dbf2-09cd-4a26-84dd-28adcb24011e_disk.config' index='1'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target dev='sda' bus='sata'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <readonly/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='sata0-0-0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pcie.0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='1' port='0x10'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='2' port='0x11'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='3' port='0x12'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.3'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='4' port='0x13'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.4'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='5' port='0x14'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.5'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='6' port='0x15'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.6'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='7' port='0x16'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.7'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='8' port='0x17'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.8'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='9' port='0x18'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.9'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='10' port='0x19'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.10'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='11' port='0x1a'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.11'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='12' port='0x1b'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.12'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='13' port='0x1c'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.13'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='14' port='0x1d'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.14'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='15' port='0x1e'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.15'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='16' port='0x1f'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.16'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='17' port='0x20'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.17'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='18' port='0x21'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.18'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='19' port='0x22'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.19'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='20' port='0x23'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.20'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='21' port='0x24'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.21'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='22' port='0x25'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.22'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='23' port='0x26'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.23'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='24' port='0x27'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.24'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='25' port='0x28'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.25'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-pci-bridge'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.26'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='usb'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='sata' index='0'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='ide'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:72:76:91'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target dev='tapd3f10293-a2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='net0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:cd:ea:70'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target dev='tap39f1b902-3d'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='net2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:86:00:70'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target dev='tap0b091e6e-95'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='net3'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <serial type='pty'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <source path='/dev/pts/0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/console.log' append='off'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target type='isa-serial' port='0'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <model name='isa-serial'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      </target>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='serial0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <source path='/dev/pts/0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/console.log' append='off'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target type='serial' port='0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='serial0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </console>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <input type='tablet' bus='usb'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='input0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <input type='mouse' bus='ps2'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='input1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <input type='keyboard' bus='ps2'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='input2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <listen type='address' address='::0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </graphics>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <audio id='1' type='none'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='video0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <watchdog model='itco' action='reset'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='watchdog0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </watchdog>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <memballoon model='virtio'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <stats period='10'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='balloon0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <rng model='virtio'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='rng0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <label>system_u:system_r:svirt_t:s0:c705,c995</label>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c705,c995</imagelabel>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </seclabel>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <label>+107:+107</label>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <imagelabel>+107:+107</imagelabel>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </seclabel>
Jan 31 03:07:05 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:07:05 np0005603623 nova_compute[226235]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.026 226239 DEBUG nova.virt.libvirt.guest [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5c:c8:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap52038e6d-42"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.029 226239 DEBUG nova.virt.libvirt.guest [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:5c:c8:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap52038e6d-42"/></interface>not found in domain: <domain type='kvm' id='30'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <name>instance-00000040</name>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <uuid>4582dbf2-09cd-4a26-84dd-28adcb24011e</uuid>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:name>tempest-AttachInterfacesTestJSON-server-401952099</nova:name>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:07:03</nova:creationTime>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:port uuid="d3f10293-a2fb-49cc-a81c-f5fee53bb74f">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:port uuid="39f1b902-3d83-4831-b91c-5d4e2349cb30">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:port uuid="0b091e6e-9544-424b-b33c-45258253789e">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:07:05 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <memory unit='KiB'>131072</memory>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <resource>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <partition>/machine</partition>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </resource>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <sysinfo type='smbios'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <entry name='serial'>4582dbf2-09cd-4a26-84dd-28adcb24011e</entry>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <entry name='uuid'>4582dbf2-09cd-4a26-84dd-28adcb24011e</entry>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <boot dev='hd'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <smbios mode='sysinfo'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <vmcoreinfo state='on'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <model fallback='forbid'>Nehalem</model>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <feature policy='require' name='x2apic'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <feature policy='require' name='hypervisor'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <feature policy='require' name='vme'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <clock offset='utc'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <timer name='hpet' present='no'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <on_reboot>restart</on_reboot>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <on_crash>destroy</on_crash>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <disk type='network' device='disk'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/4582dbf2-09cd-4a26-84dd-28adcb24011e_disk' index='2'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target dev='vda' bus='virtio'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='virtio-disk0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <disk type='network' device='cdrom'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/4582dbf2-09cd-4a26-84dd-28adcb24011e_disk.config' index='1'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target dev='sda' bus='sata'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <readonly/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='sata0-0-0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pcie.0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='1' port='0x10'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='2' port='0x11'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='3' port='0x12'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.3'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='4' port='0x13'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.4'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='5' port='0x14'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.5'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='6' port='0x15'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.6'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='7' port='0x16'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.7'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='8' port='0x17'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.8'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='9' port='0x18'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.9'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='10' port='0x19'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.10'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='11' port='0x1a'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.11'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='12' port='0x1b'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.12'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='13' port='0x1c'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.13'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='14' port='0x1d'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.14'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='15' port='0x1e'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.15'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='16' port='0x1f'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.16'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='17' port='0x20'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.17'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='18' port='0x21'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.18'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='19' port='0x22'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.19'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='20' port='0x23'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.20'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='21' port='0x24'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.21'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='22' port='0x25'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.22'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='23' port='0x26'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.23'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='24' port='0x27'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.24'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target chassis='25' port='0x28'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.25'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model name='pcie-pci-bridge'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='pci.26'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='usb'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <controller type='sata' index='0'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='ide'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:72:76:91'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target dev='tapd3f10293-a2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='net0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:cd:ea:70'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target dev='tap39f1b902-3d'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='net2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:86:00:70'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target dev='tap0b091e6e-95'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='net3'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <serial type='pty'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <source path='/dev/pts/0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/console.log' append='off'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target type='isa-serial' port='0'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:        <model name='isa-serial'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      </target>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='serial0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <source path='/dev/pts/0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/console.log' append='off'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <target type='serial' port='0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='serial0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </console>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <input type='tablet' bus='usb'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='input0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <input type='mouse' bus='ps2'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='input1'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <input type='keyboard' bus='ps2'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='input2'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <listen type='address' address='::0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </graphics>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <audio id='1' type='none'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='video0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <watchdog model='itco' action='reset'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='watchdog0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </watchdog>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <memballoon model='virtio'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <stats period='10'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='balloon0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <rng model='virtio'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <alias name='rng0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <label>system_u:system_r:svirt_t:s0:c705,c995</label>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c705,c995</imagelabel>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </seclabel>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <label>+107:+107</label>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <imagelabel>+107:+107</imagelabel>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </seclabel>
Jan 31 03:07:05 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:07:05 np0005603623 nova_compute[226235]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.029 226239 WARNING nova.virt.libvirt.driver [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Detaching interface fa:16:3e:5c:c8:27 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap52038e6d-42' not found.#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.030 226239 DEBUG nova.virt.libvirt.vif [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.030 226239 DEBUG nova.network.os_vif_util [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Converting VIF {"id": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "address": "fa:16:3e:5c:c8:27", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52038e6d-42", "ovs_interfaceid": "52038e6d-42cd-444a-959a-ce24f4c9bb50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.030 226239 DEBUG nova.network.os_vif_util [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5c:c8:27,bridge_name='br-int',has_traffic_filtering=True,id=52038e6d-42cd-444a-959a-ce24f4c9bb50,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52038e6d-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.031 226239 DEBUG os_vif [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5c:c8:27,bridge_name='br-int',has_traffic_filtering=True,id=52038e6d-42cd-444a-959a-ce24f4c9bb50,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52038e6d-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.032 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.033 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52038e6d-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.033 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.035 226239 INFO os_vif [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5c:c8:27,bridge_name='br-int',has_traffic_filtering=True,id=52038e6d-42cd-444a-959a-ce24f4c9bb50,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52038e6d-42')#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.036 226239 DEBUG nova.virt.libvirt.guest [req-12a11bb8-04c3-4eb9-868f-3d79fe524ad0 req-e708f24a-f384-40b6-89c6-1f755937d18d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:name>tempest-AttachInterfacesTestJSON-server-401952099</nova:name>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:07:05</nova:creationTime>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:port uuid="d3f10293-a2fb-49cc-a81c-f5fee53bb74f">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:port uuid="39f1b902-3d83-4831-b91c-5d4e2349cb30">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    <nova:port uuid="0b091e6e-9544-424b-b33c-45258253789e">
Jan 31 03:07:05 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:05 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:07:05 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:07:05 np0005603623 nova_compute[226235]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:07:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:05.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:05.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.865 143258 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 5ab58ddf-9841-4b98-bf7b-7388b969d2d2 with type ""#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.866 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:00:70 10.100.0.13'], port_security=['fa:16:3e:86:00:70 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1283313675', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4582dbf2-09cd-4a26-84dd-28adcb24011e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1283313675', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fdcf3a61-8bd1-47a3-8e6c-d6fed17d2331', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=0b091e6e-9544-424b-b33c-45258253789e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.867 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 0b091e6e-9544-424b-b33c-45258253789e in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f unbound from our chassis#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.869 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:07:05 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:05Z|00241|binding|INFO|Removing iface tap0b091e6e-95 ovn-installed in OVS
Jan 31 03:07:05 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:05Z|00242|binding|INFO|Removing lport 0b091e6e-9544-424b-b33c-45258253789e ovn-installed in OVS
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.878 226239 DEBUG nova.compute.manager [req-13a592a9-3585-4653-ad1e-bfd87bd89801 req-ba57b2e5-7b98-4fdf-aeeb-c7ee19643a33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-plugged-52038e6d-42cd-444a-959a-ce24f4c9bb50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.879 226239 DEBUG oslo_concurrency.lockutils [req-13a592a9-3585-4653-ad1e-bfd87bd89801 req-ba57b2e5-7b98-4fdf-aeeb-c7ee19643a33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.879 226239 DEBUG oslo_concurrency.lockutils [req-13a592a9-3585-4653-ad1e-bfd87bd89801 req-ba57b2e5-7b98-4fdf-aeeb-c7ee19643a33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.879 226239 DEBUG oslo_concurrency.lockutils [req-13a592a9-3585-4653-ad1e-bfd87bd89801 req-ba57b2e5-7b98-4fdf-aeeb-c7ee19643a33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.879 226239 DEBUG nova.compute.manager [req-13a592a9-3585-4653-ad1e-bfd87bd89801 req-ba57b2e5-7b98-4fdf-aeeb-c7ee19643a33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-plugged-52038e6d-42cd-444a-959a-ce24f4c9bb50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.880 226239 WARNING nova.compute.manager [req-13a592a9-3585-4653-ad1e-bfd87bd89801 req-ba57b2e5-7b98-4fdf-aeeb-c7ee19643a33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received unexpected event network-vif-plugged-52038e6d-42cd-444a-959a-ce24f4c9bb50 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.880 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.883 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b25311c0-a602-42e2-8563-ae773fb78d17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.913 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6dba7f39-ba6e-493a-8a87-c47f92165751]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.917 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6066516e-2ecc-44c6-b8e9-becdd4118034]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.946 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6d47dc00-d23b-46af-944c-6699ea43a008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.965 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[abe66a37-8306-45fa-8daf-1d8f7b29db78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 13, 'rx_bytes': 826, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 13, 'rx_bytes': 826, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590154, 'reachable_time': 34334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255728, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.982 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d28d491f-158e-487d-891a-b3638bff93f3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590160, 'tstamp': 590160}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255729, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590162, 'tstamp': 590162}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255729, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.984 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:05 np0005603623 nova_compute[226235]: 2026-01-31 08:07:05.985 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.986 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.986 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.987 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:05.987 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.298 226239 INFO nova.network.neutron [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Port 52038e6d-42cd-444a-959a-ce24f4c9bb50 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.566 226239 DEBUG oslo_concurrency.lockutils [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.566 226239 DEBUG oslo_concurrency.lockutils [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.567 226239 DEBUG oslo_concurrency.lockutils [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.567 226239 DEBUG oslo_concurrency.lockutils [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.567 226239 DEBUG oslo_concurrency.lockutils [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.568 226239 INFO nova.compute.manager [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Terminating instance#033[00m
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.569 226239 DEBUG nova.compute.manager [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:07:06 np0005603623 kernel: tapd3f10293-a2 (unregistering): left promiscuous mode
Jan 31 03:07:06 np0005603623 NetworkManager[48970]: <info>  [1769846826.9439] device (tapd3f10293-a2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:07:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:06Z|00243|binding|INFO|Releasing lport d3f10293-a2fb-49cc-a81c-f5fee53bb74f from this chassis (sb_readonly=0)
Jan 31 03:07:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:06Z|00244|binding|INFO|Setting lport d3f10293-a2fb-49cc-a81c-f5fee53bb74f down in Southbound
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.948 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:06Z|00245|binding|INFO|Removing iface tapd3f10293-a2 ovn-installed in OVS
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.955 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:06 np0005603623 kernel: tap39f1b902-3d (unregistering): left promiscuous mode
Jan 31 03:07:06 np0005603623 NetworkManager[48970]: <info>  [1769846826.9698] device (tap39f1b902-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:07:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:06Z|00246|binding|INFO|Releasing lport 39f1b902-3d83-4831-b91c-5d4e2349cb30 from this chassis (sb_readonly=1)
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.975 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:06Z|00247|binding|INFO|Removing iface tap39f1b902-3d ovn-installed in OVS
Jan 31 03:07:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:06Z|00248|if_status|INFO|Dropped 3 log messages in last 991 seconds (most recently, 991 seconds ago) due to excessive rate
Jan 31 03:07:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:06Z|00249|if_status|INFO|Not setting lport 39f1b902-3d83-4831-b91c-5d4e2349cb30 down as sb is readonly
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.976 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:06 np0005603623 nova_compute[226235]: 2026-01-31 08:07:06.980 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:06 np0005603623 kernel: tap0b091e6e-95 (unregistering): left promiscuous mode
Jan 31 03:07:06 np0005603623 NetworkManager[48970]: <info>  [1769846826.9976] device (tap0b091e6e-95): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.000 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.011 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000040.scope: Deactivated successfully.
Jan 31 03:07:07 np0005603623 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000040.scope: Consumed 14.465s CPU time.
Jan 31 03:07:07 np0005603623 systemd-machined[194379]: Machine qemu-30-instance-00000040 terminated.
Jan 31 03:07:07 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:07Z|00250|binding|INFO|Setting lport 39f1b902-3d83-4831-b91c-5d4e2349cb30 down in Southbound
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.061 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:76:91 10.100.0.7'], port_security=['fa:16:3e:72:76:91 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4582dbf2-09cd-4a26-84dd-28adcb24011e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0eb19bce-cce0-4cee-80b2-e44224af388b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=d3f10293-a2fb-49cc-a81c-f5fee53bb74f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.062 143258 INFO neutron.agent.ovn.metadata.agent [-] Port d3f10293-a2fb-49cc-a81c-f5fee53bb74f in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f unbound from our chassis#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.063 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.075 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3ecbb7ad-a7be-42dd-abad-b468e4ba52a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.081 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:ea:70 10.100.0.9'], port_security=['fa:16:3e:cd:ea:70 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4582dbf2-09cd-4a26-84dd-28adcb24011e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fdcf3a61-8bd1-47a3-8e6c-d6fed17d2331', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=39f1b902-3d83-4831-b91c-5d4e2349cb30) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.099 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc7299d-dfa6-43c7-8ddf-8c5790196f4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.102 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[91ec7e66-242b-4ac4-a825-7244b492a2ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.131 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[18e93349-bd08-4dc3-b2de-f3244583db63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:07.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.144 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f063ebfa-6594-4268-92d3-813b76d66e2e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 15, 'rx_bytes': 826, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 15, 'rx_bytes': 826, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590154, 'reachable_time': 34334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255754, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.161 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[087f5ff5-8d61-4bc5-a7c1-af7acd5b5339]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590160, 'tstamp': 590160}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255755, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590162, 'tstamp': 590162}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255755, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.162 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.164 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.170 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.171 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.171 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.172 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.172 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.173 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 39f1b902-3d83-4831-b91c-5d4e2349cb30 in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f unbound from our chassis#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.174 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 455fab34-b015-4d97-a96d-f7ebd7f7555f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.175 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3d564c47-15f4-4ba2-a87a-79b3a43fd43e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:07.175 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f namespace which is not needed anymore#033[00m
Jan 31 03:07:07 np0005603623 NetworkManager[48970]: <info>  [1769846827.1837] manager: (tapd3f10293-a2): new Tun device (/org/freedesktop/NetworkManager/Devices/116)
Jan 31 03:07:07 np0005603623 NetworkManager[48970]: <info>  [1769846827.1943] manager: (tap39f1b902-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.231 226239 INFO nova.virt.libvirt.driver [-] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Instance destroyed successfully.#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.231 226239 DEBUG nova.objects.instance [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'resources' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.324 226239 DEBUG nova.virt.libvirt.vif [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.324 226239 DEBUG nova.network.os_vif_util [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.325 226239 DEBUG nova.network.os_vif_util [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:72:76:91,bridge_name='br-int',has_traffic_filtering=True,id=d3f10293-a2fb-49cc-a81c-f5fee53bb74f,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f10293-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.325 226239 DEBUG os_vif [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:72:76:91,bridge_name='br-int',has_traffic_filtering=True,id=d3f10293-a2fb-49cc-a81c-f5fee53bb74f,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f10293-a2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.326 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.326 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3f10293-a2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.327 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.329 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.333 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.335 226239 INFO os_vif [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:72:76:91,bridge_name='br-int',has_traffic_filtering=True,id=d3f10293-a2fb-49cc-a81c-f5fee53bb74f,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f10293-a2')#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.336 226239 DEBUG nova.virt.libvirt.vif [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.336 226239 DEBUG nova.network.os_vif_util [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.337 226239 DEBUG nova.network.os_vif_util [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cd:ea:70,bridge_name='br-int',has_traffic_filtering=True,id=39f1b902-3d83-4831-b91c-5d4e2349cb30,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39f1b902-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.337 226239 DEBUG os_vif [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:ea:70,bridge_name='br-int',has_traffic_filtering=True,id=39f1b902-3d83-4831-b91c-5d4e2349cb30,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39f1b902-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.338 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.338 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39f1b902-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.339 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.341 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.342 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.344 226239 INFO os_vif [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:ea:70,bridge_name='br-int',has_traffic_filtering=True,id=39f1b902-3d83-4831-b91c-5d4e2349cb30,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39f1b902-3d')#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.345 226239 DEBUG nova.virt.libvirt.vif [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.345 226239 DEBUG nova.network.os_vif_util [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.346 226239 DEBUG nova.network.os_vif_util [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:00:70,bridge_name='br-int',has_traffic_filtering=True,id=0b091e6e-9544-424b-b33c-45258253789e,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b091e6e-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.346 226239 DEBUG os_vif [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:00:70,bridge_name='br-int',has_traffic_filtering=True,id=0b091e6e-9544-424b-b33c-45258253789e,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b091e6e-95') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.347 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.347 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b091e6e-95, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.348 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.349 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.351 226239 INFO os_vif [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:00:70,bridge_name='br-int',has_traffic_filtering=True,id=0b091e6e-9544-424b-b33c-45258253789e,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b091e6e-95')#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.376 226239 DEBUG nova.compute.manager [req-861d0fbf-79e6-4dda-be89-d3d9124fd832 req-a04776ac-3ce0-4c05-8676-f05c1b414b54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-unplugged-d3f10293-a2fb-49cc-a81c-f5fee53bb74f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.376 226239 DEBUG oslo_concurrency.lockutils [req-861d0fbf-79e6-4dda-be89-d3d9124fd832 req-a04776ac-3ce0-4c05-8676-f05c1b414b54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.376 226239 DEBUG oslo_concurrency.lockutils [req-861d0fbf-79e6-4dda-be89-d3d9124fd832 req-a04776ac-3ce0-4c05-8676-f05c1b414b54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.377 226239 DEBUG oslo_concurrency.lockutils [req-861d0fbf-79e6-4dda-be89-d3d9124fd832 req-a04776ac-3ce0-4c05-8676-f05c1b414b54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.377 226239 DEBUG nova.compute.manager [req-861d0fbf-79e6-4dda-be89-d3d9124fd832 req-a04776ac-3ce0-4c05-8676-f05c1b414b54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-unplugged-d3f10293-a2fb-49cc-a81c-f5fee53bb74f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.377 226239 DEBUG nova.compute.manager [req-861d0fbf-79e6-4dda-be89-d3d9124fd832 req-a04776ac-3ce0-4c05-8676-f05c1b414b54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-unplugged-d3f10293-a2fb-49cc-a81c-f5fee53bb74f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:07:07 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[255198]: [NOTICE]   (255203) : haproxy version is 2.8.14-c23fe91
Jan 31 03:07:07 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[255198]: [NOTICE]   (255203) : path to executable is /usr/sbin/haproxy
Jan 31 03:07:07 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[255198]: [WARNING]  (255203) : Exiting Master process...
Jan 31 03:07:07 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[255198]: [ALERT]    (255203) : Current worker (255206) exited with code 143 (Terminated)
Jan 31 03:07:07 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[255198]: [WARNING]  (255203) : All workers exited. Exiting... (0)
Jan 31 03:07:07 np0005603623 systemd[1]: libpod-341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7.scope: Deactivated successfully.
Jan 31 03:07:07 np0005603623 podman[255813]: 2026-01-31 08:07:07.496700978 +0000 UTC m=+0.238359655 container died 341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:07:07 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:07Z|00251|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.541 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:07Z|00252|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.597 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:07.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.986 226239 DEBUG nova.compute.manager [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-deleted-0b091e6e-9544-424b-b33c-45258253789e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.987 226239 INFO nova.compute.manager [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Neutron deleted interface 0b091e6e-9544-424b-b33c-45258253789e; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:07:07 np0005603623 nova_compute[226235]: 2026-01-31 08:07:07.988 226239 DEBUG nova.network.neutron [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.026 226239 DEBUG nova.objects.instance [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lazy-loading 'system_metadata' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.063 226239 DEBUG nova.objects.instance [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lazy-loading 'flavor' on Instance uuid 4582dbf2-09cd-4a26-84dd-28adcb24011e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:08 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7-userdata-shm.mount: Deactivated successfully.
Jan 31 03:07:08 np0005603623 systemd[1]: var-lib-containers-storage-overlay-b8fecf1af0abb036f027c5a440e729c77f98ca9b88e942efe84885e698d4b361-merged.mount: Deactivated successfully.
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.101 226239 DEBUG nova.virt.libvirt.vif [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:07:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.102 226239 DEBUG nova.network.os_vif_util [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Converting VIF {"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.103 226239 DEBUG nova.network.os_vif_util [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:00:70,bridge_name='br-int',has_traffic_filtering=True,id=0b091e6e-9544-424b-b33c-45258253789e,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b091e6e-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.107 226239 DEBUG nova.virt.libvirt.guest [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:86:00:70"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap0b091e6e-95"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.110 226239 DEBUG nova.virt.libvirt.driver [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Attempting to detach device tap0b091e6e-95 from instance 4582dbf2-09cd-4a26-84dd-28adcb24011e from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.110 226239 DEBUG nova.virt.libvirt.guest [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] detach device xml: <interface type="ethernet">
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:86:00:70"/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <target dev="tap0b091e6e-95"/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]: </interface>
Jan 31 03:07:08 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.149 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.249 226239 DEBUG nova.virt.libvirt.guest [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:86:00:70"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap0b091e6e-95"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.254 226239 DEBUG nova.virt.libvirt.guest [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:86:00:70"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap0b091e6e-95"/></interface>not found in domain: <domain type='kvm'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <name>instance-00000040</name>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <uuid>4582dbf2-09cd-4a26-84dd-28adcb24011e</uuid>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <nova:name>tempest-AttachInterfacesTestJSON-server-401952099</nova:name>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:05:58</nova:creationTime>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <nova:port uuid="d3f10293-a2fb-49cc-a81c-f5fee53bb74f">
Jan 31 03:07:08 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <memory unit='KiB'>131072</memory>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <sysinfo type='smbios'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <entry name='serial'>4582dbf2-09cd-4a26-84dd-28adcb24011e</entry>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <entry name='uuid'>4582dbf2-09cd-4a26-84dd-28adcb24011e</entry>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <boot dev='hd'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <smbios mode='sysinfo'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <vmcoreinfo state='on'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <cpu mode='custom' match='exact' check='partial'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <model fallback='allow'>Nehalem</model>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <clock offset='utc'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <timer name='hpet' present='no'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <on_reboot>restart</on_reboot>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <on_crash>destroy</on_crash>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <disk type='network' device='disk'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/4582dbf2-09cd-4a26-84dd-28adcb24011e_disk'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target dev='vda' bus='virtio'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <disk type='network' device='cdrom'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/4582dbf2-09cd-4a26-84dd-28adcb24011e_disk.config'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target dev='sda' bus='sata'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <readonly/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='0' model='pcie-root'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='1' port='0x10'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='2' port='0x11'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='3' port='0x12'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='4' port='0x13'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='5' port='0x14'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='6' port='0x15'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='7' port='0x16'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='8' port='0x17'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='9' port='0x18'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='10' port='0x19'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='11' port='0x1a'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='12' port='0x1b'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='13' port='0x1c'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='14' port='0x1d'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='15' port='0x1e'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='16' port='0x1f'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='17' port='0x20'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='18' port='0x21'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='19' port='0x22'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='20' port='0x23'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='21' port='0x24'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='22' port='0x25'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='23' port='0x26'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='24' port='0x27'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target chassis='25' port='0x28'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model name='pcie-pci-bridge'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <controller type='sata' index='0'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:72:76:91'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target dev='tapd3f10293-a2'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:cd:ea:70'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target dev='tap39f1b902-3d'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <serial type='pty'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/console.log' append='off'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target type='isa-serial' port='0'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:        <model name='isa-serial'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      </target>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <console type='pty'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e/console.log' append='off'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <target type='serial' port='0'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </console>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <input type='tablet' bus='usb'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <input type='mouse' bus='ps2'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <input type='keyboard' bus='ps2'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <graphics type='vnc' port='-1' autoport='yes' listen='::0'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <listen type='address' address='::0'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </graphics>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <audio id='1' type='none'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <watchdog model='itco' action='reset'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <memballoon model='virtio'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <stats period='10'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <rng model='virtio'>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:07:08 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:07:08 np0005603623 nova_compute[226235]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.255 226239 INFO nova.virt.libvirt.driver [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Successfully detached device tap0b091e6e-95 from instance 4582dbf2-09cd-4a26-84dd-28adcb24011e from the persistent domain config.#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.256 226239 DEBUG nova.virt.libvirt.vif [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-401952099',display_name='tempest-AttachInterfacesTestJSON-server-401952099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-401952099',id=64,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLE+D8OgZ1/dCdtrU14/PqCqGOYd+UkR9TQlI7xIz745wJamwCS0XZjGFLIMnjQi5YZpo14J+M8CAOtiHugBc6D8B5fKxatiHPrhEn6jkmtDN0v0fZ+hZxGDop3fETxTA==',key_name='tempest-keypair-524885169',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0cwd24au',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:07:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=4582dbf2-09cd-4a26-84dd-28adcb24011e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.256 226239 DEBUG nova.network.os_vif_util [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Converting VIF {"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.256 226239 DEBUG nova.network.os_vif_util [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:00:70,bridge_name='br-int',has_traffic_filtering=True,id=0b091e6e-9544-424b-b33c-45258253789e,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b091e6e-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.257 226239 DEBUG os_vif [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:00:70,bridge_name='br-int',has_traffic_filtering=True,id=0b091e6e-9544-424b-b33c-45258253789e,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b091e6e-95') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.258 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.258 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b091e6e-95, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.259 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.260 226239 INFO os_vif [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:00:70,bridge_name='br-int',has_traffic_filtering=True,id=0b091e6e-9544-424b-b33c-45258253789e,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b091e6e-95')#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.261 226239 DEBUG nova.virt.libvirt.guest [req-cda2b499-d0e9-40f6-b74d-c31662d2772b req-2b24f88b-0796-46ee-9a4b-91a97c4ec3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <nova:name>tempest-AttachInterfacesTestJSON-server-401952099</nova:name>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:07:08</nova:creationTime>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <nova:port uuid="d3f10293-a2fb-49cc-a81c-f5fee53bb74f">
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    <nova:port uuid="39f1b902-3d83-4831-b91c-5d4e2349cb30">
Jan 31 03:07:08 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:07:08 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:07:08 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:07:08 np0005603623 nova_compute[226235]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.304 226239 DEBUG nova.network.neutron [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "address": "fa:16:3e:72:76:91", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f10293-a2", "ovs_interfaceid": "d3f10293-a2fb-49cc-a81c-f5fee53bb74f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.335 226239 DEBUG oslo_concurrency.lockutils [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-4582dbf2-09cd-4a26-84dd-28adcb24011e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.360 226239 DEBUG oslo_concurrency.lockutils [None req-030c0a37-c777-400d-a5cc-6dd5a1b24198 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-4582dbf2-09cd-4a26-84dd-28adcb24011e-52038e6d-42cd-444a-959a-ce24f4c9bb50" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 7.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:08 np0005603623 podman[255813]: 2026-01-31 08:07:08.479712356 +0000 UTC m=+1.221371023 container cleanup 341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:07:08 np0005603623 systemd[1]: libpod-conmon-341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7.scope: Deactivated successfully.
Jan 31 03:07:08 np0005603623 podman[255869]: 2026-01-31 08:07:08.957970753 +0000 UTC m=+0.462475261 container remove 341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 03:07:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:08.963 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[afd9b32d-7c1d-4eb9-89b9-a5bf13bb8c53]: (4, ('Sat Jan 31 08:07:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f (341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7)\n341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7\nSat Jan 31 08:07:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f (341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7)\n341ad2037159d7b5d527945f2a4969e372b6e6ad61156104abda7b0652aabff7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:08.966 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d7d62c57-afd8-4e0f-aec4-2b10a823fbff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:08.966 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.969 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:08 np0005603623 kernel: tap455fab34-b0: left promiscuous mode
Jan 31 03:07:08 np0005603623 nova_compute[226235]: 2026-01-31 08:07:08.980 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:08.984 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[988b1153-a9fc-4434-a675-b1aac22c920b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:09.005 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[73d949d3-6a62-48d3-a590-4a5d6eea9562]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:09.007 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c61687f5-a510-40a5-9374-a87719cef3cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:09.022 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[55747b10-7011-44a8-bd6d-2800c1413a8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590149, 'reachable_time': 27049, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255884, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:09 np0005603623 systemd[1]: run-netns-ovnmeta\x2d455fab34\x2db015\x2d4d97\x2da96d\x2df7ebd7f7555f.mount: Deactivated successfully.
Jan 31 03:07:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:09.029 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:07:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:09.029 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[962f0001-1414-4fa7-8d5a-22808a3b11ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:09.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.538 226239 DEBUG nova.compute.manager [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-plugged-d3f10293-a2fb-49cc-a81c-f5fee53bb74f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.539 226239 DEBUG oslo_concurrency.lockutils [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.540 226239 DEBUG oslo_concurrency.lockutils [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.540 226239 DEBUG oslo_concurrency.lockutils [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.540 226239 DEBUG nova.compute.manager [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-plugged-d3f10293-a2fb-49cc-a81c-f5fee53bb74f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.540 226239 WARNING nova.compute.manager [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received unexpected event network-vif-plugged-d3f10293-a2fb-49cc-a81c-f5fee53bb74f for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.541 226239 DEBUG nova.compute.manager [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-unplugged-39f1b902-3d83-4831-b91c-5d4e2349cb30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.541 226239 DEBUG oslo_concurrency.lockutils [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.541 226239 DEBUG oslo_concurrency.lockutils [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.542 226239 DEBUG oslo_concurrency.lockutils [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.542 226239 DEBUG nova.compute.manager [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-unplugged-39f1b902-3d83-4831-b91c-5d4e2349cb30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.542 226239 DEBUG nova.compute.manager [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-unplugged-39f1b902-3d83-4831-b91c-5d4e2349cb30 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.542 226239 DEBUG nova.compute.manager [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-plugged-39f1b902-3d83-4831-b91c-5d4e2349cb30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.543 226239 DEBUG oslo_concurrency.lockutils [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.543 226239 DEBUG oslo_concurrency.lockutils [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.543 226239 DEBUG oslo_concurrency.lockutils [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.543 226239 DEBUG nova.compute.manager [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] No waiting events found dispatching network-vif-plugged-39f1b902-3d83-4831-b91c-5d4e2349cb30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:07:09 np0005603623 nova_compute[226235]: 2026-01-31 08:07:09.544 226239 WARNING nova.compute.manager [req-5d57041e-cda2-4457-b256-f968d4b04c66 req-dd732f04-d768-448a-b329-b5d59d207744 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received unexpected event network-vif-plugged-39f1b902-3d83-4831-b91c-5d4e2349cb30 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:07:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:09.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:11.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:11.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:12 np0005603623 nova_compute[226235]: 2026-01-31 08:07:12.103 226239 INFO nova.virt.libvirt.driver [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Deleting instance files /var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e_del#033[00m
Jan 31 03:07:12 np0005603623 nova_compute[226235]: 2026-01-31 08:07:12.104 226239 INFO nova.virt.libvirt.driver [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Deletion of /var/lib/nova/instances/4582dbf2-09cd-4a26-84dd-28adcb24011e_del complete#033[00m
Jan 31 03:07:12 np0005603623 nova_compute[226235]: 2026-01-31 08:07:12.179 226239 INFO nova.compute.manager [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Took 5.61 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:07:12 np0005603623 nova_compute[226235]: 2026-01-31 08:07:12.180 226239 DEBUG oslo.service.loopingcall [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:07:12 np0005603623 nova_compute[226235]: 2026-01-31 08:07:12.180 226239 DEBUG nova.compute.manager [-] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:07:12 np0005603623 nova_compute[226235]: 2026-01-31 08:07:12.180 226239 DEBUG nova.network.neutron [-] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:07:12 np0005603623 nova_compute[226235]: 2026-01-31 08:07:12.349 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:12 np0005603623 nova_compute[226235]: 2026-01-31 08:07:12.575 226239 DEBUG neutronclient.v2_0.client [-] Error message: {"NeutronError": {"type": "PortNotFound", "message": "Port 0b091e6e-9544-424b-b33c-45258253789e could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:07:12 np0005603623 nova_compute[226235]: 2026-01-31 08:07:12.576 226239 DEBUG nova.network.neutron [-] Unable to show port 0b091e6e-9544-424b-b33c-45258253789e as it no longer exists. _unbind_ports /usr/lib/python3.9/site-packages/nova/network/neutron.py:666#033[00m
Jan 31 03:07:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:13.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:13 np0005603623 nova_compute[226235]: 2026-01-31 08:07:13.151 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:13 np0005603623 nova_compute[226235]: 2026-01-31 08:07:13.549 226239 DEBUG nova.compute.manager [req-b4b3f34e-c4c7-476c-b64e-99cd170a368b req-7b6eb31f-5b02-4235-a000-fc2edb832f25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-deleted-d3f10293-a2fb-49cc-a81c-f5fee53bb74f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:13 np0005603623 nova_compute[226235]: 2026-01-31 08:07:13.550 226239 INFO nova.compute.manager [req-b4b3f34e-c4c7-476c-b64e-99cd170a368b req-7b6eb31f-5b02-4235-a000-fc2edb832f25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Neutron deleted interface d3f10293-a2fb-49cc-a81c-f5fee53bb74f; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:07:13 np0005603623 nova_compute[226235]: 2026-01-31 08:07:13.550 226239 DEBUG nova.network.neutron [req-b4b3f34e-c4c7-476c-b64e-99cd170a368b req-7b6eb31f-5b02-4235-a000-fc2edb832f25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "address": "fa:16:3e:cd:ea:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39f1b902-3d", "ovs_interfaceid": "39f1b902-3d83-4831-b91c-5d4e2349cb30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:13.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:13 np0005603623 nova_compute[226235]: 2026-01-31 08:07:13.684 226239 DEBUG nova.compute.manager [req-b4b3f34e-c4c7-476c-b64e-99cd170a368b req-7b6eb31f-5b02-4235-a000-fc2edb832f25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Detach interface failed, port_id=d3f10293-a2fb-49cc-a81c-f5fee53bb74f, reason: Instance 4582dbf2-09cd-4a26-84dd-28adcb24011e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:07:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:07:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2548576344' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:07:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:07:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2548576344' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:07:14 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:07:14 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:07:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:15.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:15.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:15 np0005603623 nova_compute[226235]: 2026-01-31 08:07:15.773 226239 DEBUG nova.network.neutron [-] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:15 np0005603623 nova_compute[226235]: 2026-01-31 08:07:15.777 226239 DEBUG nova.compute.manager [req-1a7ee42c-a9e2-4c1c-9b46-3124b7382277 req-604598c7-37d1-4434-afe7-27b66ee76753 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Received event network-vif-deleted-39f1b902-3d83-4831-b91c-5d4e2349cb30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:15 np0005603623 nova_compute[226235]: 2026-01-31 08:07:15.777 226239 INFO nova.compute.manager [req-1a7ee42c-a9e2-4c1c-9b46-3124b7382277 req-604598c7-37d1-4434-afe7-27b66ee76753 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Neutron deleted interface 39f1b902-3d83-4831-b91c-5d4e2349cb30; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:07:15 np0005603623 nova_compute[226235]: 2026-01-31 08:07:15.777 226239 DEBUG nova.network.neutron [req-1a7ee42c-a9e2-4c1c-9b46-3124b7382277 req-604598c7-37d1-4434-afe7-27b66ee76753 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Updating instance_info_cache with network_info: [{"id": "0b091e6e-9544-424b-b33c-45258253789e", "address": "fa:16:3e:86:00:70", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b091e6e-95", "ovs_interfaceid": "0b091e6e-9544-424b-b33c-45258253789e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:15 np0005603623 nova_compute[226235]: 2026-01-31 08:07:15.924 226239 DEBUG nova.compute.manager [req-1a7ee42c-a9e2-4c1c-9b46-3124b7382277 req-604598c7-37d1-4434-afe7-27b66ee76753 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Detach interface failed, port_id=39f1b902-3d83-4831-b91c-5d4e2349cb30, reason: Instance 4582dbf2-09cd-4a26-84dd-28adcb24011e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:07:16 np0005603623 nova_compute[226235]: 2026-01-31 08:07:16.019 226239 INFO nova.compute.manager [-] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Took 3.84 seconds to deallocate network for instance.#033[00m
Jan 31 03:07:16 np0005603623 nova_compute[226235]: 2026-01-31 08:07:16.231 226239 DEBUG oslo_concurrency.lockutils [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:16 np0005603623 nova_compute[226235]: 2026-01-31 08:07:16.232 226239 DEBUG oslo_concurrency.lockutils [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:16 np0005603623 nova_compute[226235]: 2026-01-31 08:07:16.276 226239 DEBUG oslo_concurrency.processutils [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:07:16 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2969055353' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:07:16 np0005603623 nova_compute[226235]: 2026-01-31 08:07:16.708 226239 DEBUG oslo_concurrency.processutils [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:16 np0005603623 nova_compute[226235]: 2026-01-31 08:07:16.715 226239 DEBUG nova.compute.provider_tree [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:07:16 np0005603623 nova_compute[226235]: 2026-01-31 08:07:16.753 226239 DEBUG nova.scheduler.client.report [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:07:16 np0005603623 nova_compute[226235]: 2026-01-31 08:07:16.792 226239 DEBUG oslo_concurrency.lockutils [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:16 np0005603623 nova_compute[226235]: 2026-01-31 08:07:16.872 226239 INFO nova.scheduler.client.report [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Deleted allocations for instance 4582dbf2-09cd-4a26-84dd-28adcb24011e#033[00m
Jan 31 03:07:17 np0005603623 nova_compute[226235]: 2026-01-31 08:07:17.028 226239 DEBUG oslo_concurrency.lockutils [None req-868c17b5-0661-4f5e-9525-52a18ff6cd12 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "4582dbf2-09cd-4a26-84dd-28adcb24011e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:17.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:17 np0005603623 nova_compute[226235]: 2026-01-31 08:07:17.350 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:17.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:17 np0005603623 podman[255963]: 2026-01-31 08:07:17.955359309 +0000 UTC m=+0.051934748 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 31 03:07:17 np0005603623 podman[255964]: 2026-01-31 08:07:17.972346085 +0000 UTC m=+0.068944194 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 31 03:07:18 np0005603623 nova_compute[226235]: 2026-01-31 08:07:18.152 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:19.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:07:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:19.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:07:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:21.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:21.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:22 np0005603623 nova_compute[226235]: 2026-01-31 08:07:22.229 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846827.2273128, 4582dbf2-09cd-4a26-84dd-28adcb24011e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:07:22 np0005603623 nova_compute[226235]: 2026-01-31 08:07:22.229 226239 INFO nova.compute.manager [-] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:07:22 np0005603623 nova_compute[226235]: 2026-01-31 08:07:22.344 226239 DEBUG nova.compute.manager [None req-fafdd5f1-153c-4881-bd53-7acdb1b11903 - - - - - -] [instance: 4582dbf2-09cd-4a26-84dd-28adcb24011e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:07:22 np0005603623 nova_compute[226235]: 2026-01-31 08:07:22.352 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:23.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:23 np0005603623 nova_compute[226235]: 2026-01-31 08:07:23.197 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:23.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:25 np0005603623 nova_compute[226235]: 2026-01-31 08:07:25.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:25 np0005603623 nova_compute[226235]: 2026-01-31 08:07:25.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:07:25 np0005603623 nova_compute[226235]: 2026-01-31 08:07:25.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:07:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:25.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:25 np0005603623 nova_compute[226235]: 2026-01-31 08:07:25.170 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:07:25 np0005603623 nova_compute[226235]: 2026-01-31 08:07:25.170 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:25.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.186 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.187 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.187 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.187 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.187 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:07:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/395868628' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.635 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.808 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.810 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4692MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.810 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.810 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.886 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.886 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:07:26 np0005603623 nova_compute[226235]: 2026-01-31 08:07:26.911 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:07:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:27.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:07:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:07:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2686443185' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:07:27 np0005603623 nova_compute[226235]: 2026-01-31 08:07:27.344 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:27 np0005603623 nova_compute[226235]: 2026-01-31 08:07:27.349 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:07:27 np0005603623 nova_compute[226235]: 2026-01-31 08:07:27.354 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:27 np0005603623 nova_compute[226235]: 2026-01-31 08:07:27.369 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:07:27 np0005603623 nova_compute[226235]: 2026-01-31 08:07:27.410 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:07:27 np0005603623 nova_compute[226235]: 2026-01-31 08:07:27.410 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:27.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:28 np0005603623 nova_compute[226235]: 2026-01-31 08:07:28.199 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:29.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:29 np0005603623 nova_compute[226235]: 2026-01-31 08:07:29.411 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:29 np0005603623 nova_compute[226235]: 2026-01-31 08:07:29.412 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:29 np0005603623 nova_compute[226235]: 2026-01-31 08:07:29.412 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:29.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:30.101 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:30.101 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:30.101 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:31 np0005603623 nova_compute[226235]: 2026-01-31 08:07:31.154 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "d4484d63-c590-4676-b3ae-b8e33bd348f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:31 np0005603623 nova_compute[226235]: 2026-01-31 08:07:31.155 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:31.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:31 np0005603623 nova_compute[226235]: 2026-01-31 08:07:31.248 226239 DEBUG nova.compute.manager [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:07:31 np0005603623 nova_compute[226235]: 2026-01-31 08:07:31.390 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:31 np0005603623 nova_compute[226235]: 2026-01-31 08:07:31.390 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:31 np0005603623 nova_compute[226235]: 2026-01-31 08:07:31.397 226239 DEBUG nova.virt.hardware [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:07:31 np0005603623 nova_compute[226235]: 2026-01-31 08:07:31.398 226239 INFO nova.compute.claims [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:07:31 np0005603623 nova_compute[226235]: 2026-01-31 08:07:31.573 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:31.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:07:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4246934514' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:07:31 np0005603623 nova_compute[226235]: 2026-01-31 08:07:31.991 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:31 np0005603623 nova_compute[226235]: 2026-01-31 08:07:31.996 226239 DEBUG nova.compute.provider_tree [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.065 226239 DEBUG nova.scheduler.client.report [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.143 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.144 226239 DEBUG nova.compute.manager [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.204 226239 DEBUG nova.compute.manager [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.205 226239 DEBUG nova.network.neutron [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.237 226239 INFO nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.355 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.381 226239 DEBUG nova.compute.manager [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.415 226239 DEBUG nova.policy [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60f2b878669c4c529b35e04860cc6d64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c73212dc7c84914b6c934d45b6826f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.518 226239 DEBUG nova.compute.manager [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.519 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.520 226239 INFO nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Creating image(s)#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.545 226239 DEBUG nova.storage.rbd_utils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image d4484d63-c590-4676-b3ae-b8e33bd348f1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.578 226239 DEBUG nova.storage.rbd_utils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image d4484d63-c590-4676-b3ae-b8e33bd348f1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.605 226239 DEBUG nova.storage.rbd_utils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image d4484d63-c590-4676-b3ae-b8e33bd348f1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.609 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.660 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.661 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.661 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.662 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.686 226239 DEBUG nova.storage.rbd_utils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image d4484d63-c590-4676-b3ae-b8e33bd348f1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:32 np0005603623 nova_compute[226235]: 2026-01-31 08:07:32.690 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 d4484d63-c590-4676-b3ae-b8e33bd348f1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:33 np0005603623 nova_compute[226235]: 2026-01-31 08:07:33.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:33 np0005603623 nova_compute[226235]: 2026-01-31 08:07:33.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:33.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:33 np0005603623 nova_compute[226235]: 2026-01-31 08:07:33.201 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:33 np0005603623 nova_compute[226235]: 2026-01-31 08:07:33.229 226239 DEBUG nova.network.neutron [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Successfully created port: 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:07:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:33.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.081 226239 DEBUG nova.network.neutron [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Successfully updated port: 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.101 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 d4484d63-c590-4676-b3ae-b8e33bd348f1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.132 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.132 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.132 226239 DEBUG nova.network.neutron [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.178 226239 DEBUG nova.storage.rbd_utils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] resizing rbd image d4484d63-c590-4676-b3ae-b8e33bd348f1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.219 226239 DEBUG nova.compute.manager [req-19605795-968a-4b4c-b20a-e72ab86d103d req-32dc7d79-bf98-45db-b76b-a2bd8fb9f620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-changed-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.220 226239 DEBUG nova.compute.manager [req-19605795-968a-4b4c-b20a-e72ab86d103d req-32dc7d79-bf98-45db-b76b-a2bd8fb9f620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Refreshing instance network info cache due to event network-changed-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.221 226239 DEBUG oslo_concurrency.lockutils [req-19605795-968a-4b4c-b20a-e72ab86d103d req-32dc7d79-bf98-45db-b76b-a2bd8fb9f620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.296 226239 DEBUG nova.objects.instance [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'migration_context' on Instance uuid d4484d63-c590-4676-b3ae-b8e33bd348f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.309 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.310 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Ensure instance console log exists: /var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.310 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.311 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.311 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:34 np0005603623 nova_compute[226235]: 2026-01-31 08:07:34.318 226239 DEBUG nova.network.neutron [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:07:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:35.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:35.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.762 226239 DEBUG nova.network.neutron [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updating instance_info_cache with network_info: [{"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.810 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.810 226239 DEBUG nova.compute.manager [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Instance network_info: |[{"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.811 226239 DEBUG oslo_concurrency.lockutils [req-19605795-968a-4b4c-b20a-e72ab86d103d req-32dc7d79-bf98-45db-b76b-a2bd8fb9f620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.811 226239 DEBUG nova.network.neutron [req-19605795-968a-4b4c-b20a-e72ab86d103d req-32dc7d79-bf98-45db-b76b-a2bd8fb9f620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Refreshing network info cache for port 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.814 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Start _get_guest_xml network_info=[{"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.819 226239 WARNING nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.829 226239 DEBUG nova.virt.libvirt.host [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.829 226239 DEBUG nova.virt.libvirt.host [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.835 226239 DEBUG nova.virt.libvirt.host [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.835 226239 DEBUG nova.virt.libvirt.host [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.836 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.837 226239 DEBUG nova.virt.hardware [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.837 226239 DEBUG nova.virt.hardware [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.837 226239 DEBUG nova.virt.hardware [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.837 226239 DEBUG nova.virt.hardware [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.838 226239 DEBUG nova.virt.hardware [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.838 226239 DEBUG nova.virt.hardware [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.838 226239 DEBUG nova.virt.hardware [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.838 226239 DEBUG nova.virt.hardware [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.838 226239 DEBUG nova.virt.hardware [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.839 226239 DEBUG nova.virt.hardware [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.839 226239 DEBUG nova.virt.hardware [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:07:35 np0005603623 nova_compute[226235]: 2026-01-31 08:07:35.842 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:07:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/104138598' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.247 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.271 226239 DEBUG nova.storage.rbd_utils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image d4484d63-c590-4676-b3ae-b8e33bd348f1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.274 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:07:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/809088595' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.662 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.388s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.664 226239 DEBUG nova.virt.libvirt.vif [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:07:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1091698191',display_name='tempest-tempest.common.compute-instance-1091698191',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1091698191',id=70,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-uhssxba1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:07:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=d4484d63-c590-4676-b3ae-b8e33bd348f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.665 226239 DEBUG nova.network.os_vif_util [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.665 226239 DEBUG nova.network.os_vif_util [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:1a:05,bridge_name='br-int',has_traffic_filtering=True,id=00e779f8-ccb2-4b71-bce9-7a3c9df3b85c,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e779f8-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.666 226239 DEBUG nova.objects.instance [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid d4484d63-c590-4676-b3ae-b8e33bd348f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.693 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  <uuid>d4484d63-c590-4676-b3ae-b8e33bd348f1</uuid>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  <name>instance-00000046</name>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <nova:name>tempest-tempest.common.compute-instance-1091698191</nova:name>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:07:35</nova:creationTime>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <nova:port uuid="00e779f8-ccb2-4b71-bce9-7a3c9df3b85c">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <entry name="serial">d4484d63-c590-4676-b3ae-b8e33bd348f1</entry>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <entry name="uuid">d4484d63-c590-4676-b3ae-b8e33bd348f1</entry>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/d4484d63-c590-4676-b3ae-b8e33bd348f1_disk">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/d4484d63-c590-4676-b3ae-b8e33bd348f1_disk.config">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:63:1a:05"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <target dev="tap00e779f8-cc"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1/console.log" append="off"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:07:36 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:07:36 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:07:36 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:07:36 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.694 226239 DEBUG nova.compute.manager [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Preparing to wait for external event network-vif-plugged-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.695 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.695 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.695 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.696 226239 DEBUG nova.virt.libvirt.vif [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:07:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1091698191',display_name='tempest-tempest.common.compute-instance-1091698191',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1091698191',id=70,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-uhssxba1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:07:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=d4484d63-c590-4676-b3ae-b8e33bd348f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.696 226239 DEBUG nova.network.os_vif_util [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.697 226239 DEBUG nova.network.os_vif_util [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:1a:05,bridge_name='br-int',has_traffic_filtering=True,id=00e779f8-ccb2-4b71-bce9-7a3c9df3b85c,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e779f8-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.697 226239 DEBUG os_vif [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:1a:05,bridge_name='br-int',has_traffic_filtering=True,id=00e779f8-ccb2-4b71-bce9-7a3c9df3b85c,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e779f8-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.698 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.698 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.698 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.700 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.701 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00e779f8-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.701 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap00e779f8-cc, col_values=(('external_ids', {'iface-id': '00e779f8-ccb2-4b71-bce9-7a3c9df3b85c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:63:1a:05', 'vm-uuid': 'd4484d63-c590-4676-b3ae-b8e33bd348f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.702 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:36 np0005603623 NetworkManager[48970]: <info>  [1769846856.7035] manager: (tap00e779f8-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.705 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.707 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.709 226239 INFO os_vif [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:1a:05,bridge_name='br-int',has_traffic_filtering=True,id=00e779f8-ccb2-4b71-bce9-7a3c9df3b85c,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e779f8-cc')#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.777 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.777 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.777 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:63:1a:05, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.778 226239 INFO nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Using config drive#033[00m
Jan 31 03:07:36 np0005603623 nova_compute[226235]: 2026-01-31 08:07:36.810 226239 DEBUG nova.storage.rbd_utils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image d4484d63-c590-4676-b3ae-b8e33bd348f1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:37.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:37.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:38 np0005603623 nova_compute[226235]: 2026-01-31 08:07:38.202 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:38 np0005603623 nova_compute[226235]: 2026-01-31 08:07:38.938 226239 INFO nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Creating config drive at /var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1/disk.config#033[00m
Jan 31 03:07:38 np0005603623 nova_compute[226235]: 2026-01-31 08:07:38.943 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpk8v3hmrs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:39 np0005603623 nova_compute[226235]: 2026-01-31 08:07:39.061 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpk8v3hmrs" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:39 np0005603623 nova_compute[226235]: 2026-01-31 08:07:39.085 226239 DEBUG nova.storage.rbd_utils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image d4484d63-c590-4676-b3ae-b8e33bd348f1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:39 np0005603623 nova_compute[226235]: 2026-01-31 08:07:39.088 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1/disk.config d4484d63-c590-4676-b3ae-b8e33bd348f1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:39.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:39.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:39 np0005603623 nova_compute[226235]: 2026-01-31 08:07:39.910 226239 DEBUG nova.network.neutron [req-19605795-968a-4b4c-b20a-e72ab86d103d req-32dc7d79-bf98-45db-b76b-a2bd8fb9f620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updated VIF entry in instance network info cache for port 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:07:39 np0005603623 nova_compute[226235]: 2026-01-31 08:07:39.911 226239 DEBUG nova.network.neutron [req-19605795-968a-4b4c-b20a-e72ab86d103d req-32dc7d79-bf98-45db-b76b-a2bd8fb9f620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updating instance_info_cache with network_info: [{"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:39 np0005603623 nova_compute[226235]: 2026-01-31 08:07:39.941 226239 DEBUG oslo_concurrency.lockutils [req-19605795-968a-4b4c-b20a-e72ab86d103d req-32dc7d79-bf98-45db-b76b-a2bd8fb9f620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.199 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.199 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.200 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.298 226239 DEBUG oslo_concurrency.processutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1/disk.config d4484d63-c590-4676-b3ae-b8e33bd348f1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.298 226239 INFO nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Deleting local config drive /var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1/disk.config because it was imported into RBD.#033[00m
Jan 31 03:07:40 np0005603623 kernel: tap00e779f8-cc: entered promiscuous mode
Jan 31 03:07:40 np0005603623 NetworkManager[48970]: <info>  [1769846860.3366] manager: (tap00e779f8-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/119)
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.337 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.340 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:40Z|00253|if_status|INFO|Not updating pb chassis for 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c now as sb is readonly
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.343 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.346 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:40Z|00254|binding|INFO|Claiming lport 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c for this chassis.
Jan 31 03:07:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:40Z|00255|binding|INFO|00e779f8-ccb2-4b71-bce9-7a3c9df3b85c: Claiming fa:16:3e:63:1a:05 10.100.0.3
Jan 31 03:07:40 np0005603623 systemd-udevd[256483]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:07:40 np0005603623 systemd-machined[194379]: New machine qemu-31-instance-00000046.
Jan 31 03:07:40 np0005603623 NetworkManager[48970]: <info>  [1769846860.3711] device (tap00e779f8-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:07:40 np0005603623 NetworkManager[48970]: <info>  [1769846860.3719] device (tap00e779f8-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:07:40 np0005603623 systemd[1]: Started Virtual Machine qemu-31-instance-00000046.
Jan 31 03:07:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:40Z|00256|binding|INFO|Setting lport 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c ovn-installed in OVS
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.378 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.419 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:1a:05 10.100.0.3'], port_security=['fa:16:3e:63:1a:05 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd4484d63-c590-4676-b3ae-b8e33bd348f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c826f71-7560-44f4-8034-5ac735f4e81f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=00e779f8-ccb2-4b71-bce9-7a3c9df3b85c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:07:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:40Z|00257|binding|INFO|Setting lport 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c up in Southbound
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.420 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f bound to our chassis#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.421 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.430 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[eb4b4398-5bde-41bb-b1ec-37babbce294e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.430 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap455fab34-b1 in ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.432 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap455fab34-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.432 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ac36f158-5207-44c1-88c0-9cfa79e4b59a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.432 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a30fbc1c-86cb-41a6-b96b-7fa7046b6974]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.439 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[453a517e-77b1-44ec-bc6e-6209f37ee3ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.448 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[40e6abdf-eb18-48a5-8b37-b34151964717]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.469 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3311eda6-dee2-4a57-be8b-bd52176d4390]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.473 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b1bca7fa-56fe-410b-a221-9847470110fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 NetworkManager[48970]: <info>  [1769846860.4745] manager: (tap455fab34-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/120)
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.499 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6683399c-2689-4fcd-9e6f-15d6191bb7d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.502 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[89a1ad0c-256f-407c-a4be-dab54cee5017]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 NetworkManager[48970]: <info>  [1769846860.5169] device (tap455fab34-b0): carrier: link connected
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.521 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[21d70072-35e1-4c82-a271-4cc9b653c3ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.533 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[52d497ed-16ff-4072-b6e7-0c379f4423f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600094, 'reachable_time': 39949, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256516, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.542 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e37eba1c-73d7-4f69-9d94-bd43057c5242]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:8f98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600094, 'tstamp': 600094}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256517, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.553 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0a133455-6f64-471a-abc5-b2dffcdf9e6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600094, 'reachable_time': 39949, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 256518, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.573 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3c04c96b-314f-46dc-a849-813a613c053d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.606 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[849a7839-d6de-4286-bc2a-b295ba510d31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.607 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.608 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.608 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:40 np0005603623 NetworkManager[48970]: <info>  [1769846860.6103] manager: (tap455fab34-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Jan 31 03:07:40 np0005603623 kernel: tap455fab34-b0: entered promiscuous mode
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.609 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.611 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.612 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.612 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:40Z|00258|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.614 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.614 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/455fab34-b015-4d97-a96d-f7ebd7f7555f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/455fab34-b015-4d97-a96d-f7ebd7f7555f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.615 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[df8f7d4a-d35d-4e6d-abb0-7eaf3768de0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.616 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-455fab34-b015-4d97-a96d-f7ebd7f7555f
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/455fab34-b015-4d97-a96d-f7ebd7f7555f.pid.haproxy
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 455fab34-b015-4d97-a96d-f7ebd7f7555f
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:07:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:40.616 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'env', 'PROCESS_TAG=haproxy-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/455fab34-b015-4d97-a96d-f7ebd7f7555f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:07:40 np0005603623 nova_compute[226235]: 2026-01-31 08:07:40.618 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:41 np0005603623 podman[256585]: 2026-01-31 08:07:40.907017152 +0000 UTC m=+0.020246089 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.040 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846861.0398176, d4484d63-c590-4676-b3ae-b8e33bd348f1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.041 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] VM Started (Lifecycle Event)#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.091 226239 DEBUG nova.compute.manager [req-9599e9fa-da5a-449b-a02c-3ef8f907020f req-b080f7fb-c1c5-4269-a8e9-e11307bf1097 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-vif-plugged-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.092 226239 DEBUG oslo_concurrency.lockutils [req-9599e9fa-da5a-449b-a02c-3ef8f907020f req-b080f7fb-c1c5-4269-a8e9-e11307bf1097 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.092 226239 DEBUG oslo_concurrency.lockutils [req-9599e9fa-da5a-449b-a02c-3ef8f907020f req-b080f7fb-c1c5-4269-a8e9-e11307bf1097 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.092 226239 DEBUG oslo_concurrency.lockutils [req-9599e9fa-da5a-449b-a02c-3ef8f907020f req-b080f7fb-c1c5-4269-a8e9-e11307bf1097 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.092 226239 DEBUG nova.compute.manager [req-9599e9fa-da5a-449b-a02c-3ef8f907020f req-b080f7fb-c1c5-4269-a8e9-e11307bf1097 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Processing event network-vif-plugged-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.093 226239 DEBUG nova.compute.manager [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.098 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.103 226239 INFO nova.virt.libvirt.driver [-] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Instance spawned successfully.#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.104 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.162 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.168 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.171 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.172 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.172 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.173 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.173 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.173 226239 DEBUG nova.virt.libvirt.driver [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:07:41 np0005603623 podman[256585]: 2026-01-31 08:07:41.178282184 +0000 UTC m=+0.291511101 container create 8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:07:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:41.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.308 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.308 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846861.0410342, d4484d63-c590-4676-b3ae-b8e33bd348f1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.308 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:07:41 np0005603623 systemd[1]: Started libpod-conmon-8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9.scope.
Jan 31 03:07:41 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:07:41 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24398064eb9fa09598985b3a629b9b9d0c1400909774c71328a7adf728be2849/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.393 226239 INFO nova.compute.manager [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Took 8.87 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.394 226239 DEBUG nova.compute.manager [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.395 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.405 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846861.0971382, d4484d63-c590-4676-b3ae-b8e33bd348f1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.406 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.440 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.443 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:07:41 np0005603623 podman[256585]: 2026-01-31 08:07:41.461699949 +0000 UTC m=+0.574928886 container init 8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 31 03:07:41 np0005603623 podman[256585]: 2026-01-31 08:07:41.466894552 +0000 UTC m=+0.580123469 container start 8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.466 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.478 226239 INFO nova.compute.manager [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Took 10.10 seconds to build instance.#033[00m
Jan 31 03:07:41 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[256606]: [NOTICE]   (256610) : New worker (256612) forked
Jan 31 03:07:41 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[256606]: [NOTICE]   (256610) : Loading success.
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.496 226239 DEBUG oslo_concurrency.lockutils [None req-1e53b0c3-dd54-4e75-948d-34a753759c57 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:41.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:41 np0005603623 nova_compute[226235]: 2026-01-31 08:07:41.704 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:43.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:43 np0005603623 nova_compute[226235]: 2026-01-31 08:07:43.205 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:43 np0005603623 nova_compute[226235]: 2026-01-31 08:07:43.362 226239 DEBUG nova.compute.manager [req-54f78a04-8a44-42a3-ab44-0b045fed58fb req-5c0ae47c-883f-483f-aac3-6633d4c286b7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-vif-plugged-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:43 np0005603623 nova_compute[226235]: 2026-01-31 08:07:43.363 226239 DEBUG oslo_concurrency.lockutils [req-54f78a04-8a44-42a3-ab44-0b045fed58fb req-5c0ae47c-883f-483f-aac3-6633d4c286b7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:43 np0005603623 nova_compute[226235]: 2026-01-31 08:07:43.363 226239 DEBUG oslo_concurrency.lockutils [req-54f78a04-8a44-42a3-ab44-0b045fed58fb req-5c0ae47c-883f-483f-aac3-6633d4c286b7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:43 np0005603623 nova_compute[226235]: 2026-01-31 08:07:43.363 226239 DEBUG oslo_concurrency.lockutils [req-54f78a04-8a44-42a3-ab44-0b045fed58fb req-5c0ae47c-883f-483f-aac3-6633d4c286b7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:43 np0005603623 nova_compute[226235]: 2026-01-31 08:07:43.364 226239 DEBUG nova.compute.manager [req-54f78a04-8a44-42a3-ab44-0b045fed58fb req-5c0ae47c-883f-483f-aac3-6633d4c286b7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] No waiting events found dispatching network-vif-plugged-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:07:43 np0005603623 nova_compute[226235]: 2026-01-31 08:07:43.364 226239 WARNING nova.compute.manager [req-54f78a04-8a44-42a3-ab44-0b045fed58fb req-5c0ae47c-883f-483f-aac3-6633d4c286b7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received unexpected event network-vif-plugged-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c for instance with vm_state active and task_state None.#033[00m
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:43.471714) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846863472040, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 2149, "num_deletes": 265, "total_data_size": 4649906, "memory_usage": 4715568, "flush_reason": "Manual Compaction"}
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846863549101, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 3051600, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36780, "largest_seqno": 38924, "table_properties": {"data_size": 3042963, "index_size": 5259, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 18871, "raw_average_key_size": 20, "raw_value_size": 3025228, "raw_average_value_size": 3302, "num_data_blocks": 228, "num_entries": 916, "num_filter_entries": 916, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846700, "oldest_key_time": 1769846700, "file_creation_time": 1769846863, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 77457 microseconds, and 6475 cpu microseconds.
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:43.549180) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 3051600 bytes OK
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:43.549201) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:43.590612) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:43.590692) EVENT_LOG_v1 {"time_micros": 1769846863590677, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:43.590731) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 4640298, prev total WAL file size 4640298, number of live WAL files 2.
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:43.592127) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303034' seq:72057594037927935, type:22 .. '6C6F676D0031323536' seq:0, type:0; will stop at (end)
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(2980KB)], [69(8654KB)]
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846863592209, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 11913426, "oldest_snapshot_seqno": -1}
Jan 31 03:07:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:43.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6437 keys, 11756445 bytes, temperature: kUnknown
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846863955782, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 11756445, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11710628, "index_size": 28619, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16133, "raw_key_size": 164790, "raw_average_key_size": 25, "raw_value_size": 11592590, "raw_average_value_size": 1800, "num_data_blocks": 1152, "num_entries": 6437, "num_filter_entries": 6437, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769846863, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:07:43 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:43.956299) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 11756445 bytes
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:44.023341) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 32.7 rd, 32.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 8.5 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(7.8) write-amplify(3.9) OK, records in: 6981, records dropped: 544 output_compression: NoCompression
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:44.023398) EVENT_LOG_v1 {"time_micros": 1769846864023377, "job": 42, "event": "compaction_finished", "compaction_time_micros": 363853, "compaction_time_cpu_micros": 21323, "output_level": 6, "num_output_files": 1, "total_output_size": 11756445, "num_input_records": 6981, "num_output_records": 6437, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846864024866, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846864026567, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:43.591966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:44.026831) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:44.026839) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:44.026841) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:44.026843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:07:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:07:44.026845) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:07:44 np0005603623 nova_compute[226235]: 2026-01-31 08:07:44.704 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:44 np0005603623 NetworkManager[48970]: <info>  [1769846864.7053] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Jan 31 03:07:44 np0005603623 NetworkManager[48970]: <info>  [1769846864.7062] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Jan 31 03:07:44 np0005603623 nova_compute[226235]: 2026-01-31 08:07:44.744 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:44 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:44Z|00259|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:07:44 np0005603623 nova_compute[226235]: 2026-01-31 08:07:44.761 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:45 np0005603623 nova_compute[226235]: 2026-01-31 08:07:45.032 226239 DEBUG nova.compute.manager [req-37fbfdc0-9d5f-4c6c-a8f6-bc7bd0b6349e req-c40ca504-557f-40ab-96ee-2f1c7d94576b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-changed-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:45 np0005603623 nova_compute[226235]: 2026-01-31 08:07:45.033 226239 DEBUG nova.compute.manager [req-37fbfdc0-9d5f-4c6c-a8f6-bc7bd0b6349e req-c40ca504-557f-40ab-96ee-2f1c7d94576b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Refreshing instance network info cache due to event network-changed-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:07:45 np0005603623 nova_compute[226235]: 2026-01-31 08:07:45.034 226239 DEBUG oslo_concurrency.lockutils [req-37fbfdc0-9d5f-4c6c-a8f6-bc7bd0b6349e req-c40ca504-557f-40ab-96ee-2f1c7d94576b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:07:45 np0005603623 nova_compute[226235]: 2026-01-31 08:07:45.035 226239 DEBUG oslo_concurrency.lockutils [req-37fbfdc0-9d5f-4c6c-a8f6-bc7bd0b6349e req-c40ca504-557f-40ab-96ee-2f1c7d94576b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:07:45 np0005603623 nova_compute[226235]: 2026-01-31 08:07:45.035 226239 DEBUG nova.network.neutron [req-37fbfdc0-9d5f-4c6c-a8f6-bc7bd0b6349e req-c40ca504-557f-40ab-96ee-2f1c7d94576b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Refreshing network info cache for port 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:07:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:45.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:07:45.203 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:45.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:46 np0005603623 nova_compute[226235]: 2026-01-31 08:07:46.746 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:47.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:47.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:48 np0005603623 nova_compute[226235]: 2026-01-31 08:07:48.206 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:48 np0005603623 nova_compute[226235]: 2026-01-31 08:07:48.480 226239 DEBUG nova.network.neutron [req-37fbfdc0-9d5f-4c6c-a8f6-bc7bd0b6349e req-c40ca504-557f-40ab-96ee-2f1c7d94576b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updated VIF entry in instance network info cache for port 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:07:48 np0005603623 nova_compute[226235]: 2026-01-31 08:07:48.481 226239 DEBUG nova.network.neutron [req-37fbfdc0-9d5f-4c6c-a8f6-bc7bd0b6349e req-c40ca504-557f-40ab-96ee-2f1c7d94576b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updating instance_info_cache with network_info: [{"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:48 np0005603623 nova_compute[226235]: 2026-01-31 08:07:48.563 226239 DEBUG oslo_concurrency.lockutils [req-37fbfdc0-9d5f-4c6c-a8f6-bc7bd0b6349e req-c40ca504-557f-40ab-96ee-2f1c7d94576b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:07:48 np0005603623 podman[256626]: 2026-01-31 08:07:48.984758406 +0000 UTC m=+0.076259754 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:07:48 np0005603623 podman[256627]: 2026-01-31 08:07:48.991749217 +0000 UTC m=+0.083182283 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 31 03:07:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:49.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:49.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:51.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:51.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:51 np0005603623 nova_compute[226235]: 2026-01-31 08:07:51.752 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:53.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:53 np0005603623 nova_compute[226235]: 2026-01-31 08:07:53.209 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:53.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:53Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:63:1a:05 10.100.0.3
Jan 31 03:07:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:07:53Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:63:1a:05 10.100.0.3
Jan 31 03:07:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e248 e248: 3 total, 3 up, 3 in
Jan 31 03:07:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:55.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:55.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e249 e249: 3 total, 3 up, 3 in
Jan 31 03:07:56 np0005603623 nova_compute[226235]: 2026-01-31 08:07:56.755 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:07:57 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1723541852' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:07:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:07:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:57.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:07:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e250 e250: 3 total, 3 up, 3 in
Jan 31 03:07:57 np0005603623 ceph-mgr[77391]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 03:07:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:07:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:57.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:07:58 np0005603623 nova_compute[226235]: 2026-01-31 08:07:58.211 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:59.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:07:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:59.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:00 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:00Z|00260|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:08:00 np0005603623 nova_compute[226235]: 2026-01-31 08:08:00.929 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:01.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:01.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:01 np0005603623 nova_compute[226235]: 2026-01-31 08:08:01.756 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:03 np0005603623 nova_compute[226235]: 2026-01-31 08:08:03.214 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:03.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e251 e251: 3 total, 3 up, 3 in
Jan 31 03:08:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:03.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:05.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:05.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:06 np0005603623 nova_compute[226235]: 2026-01-31 08:08:06.760 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:07.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:07.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:08 np0005603623 nova_compute[226235]: 2026-01-31 08:08:08.215 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:09 np0005603623 nova_compute[226235]: 2026-01-31 08:08:09.015 226239 DEBUG nova.compute.manager [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-changed-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:09 np0005603623 nova_compute[226235]: 2026-01-31 08:08:09.015 226239 DEBUG nova.compute.manager [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Refreshing instance network info cache due to event network-changed-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:09 np0005603623 nova_compute[226235]: 2026-01-31 08:08:09.015 226239 DEBUG oslo_concurrency.lockutils [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:09 np0005603623 nova_compute[226235]: 2026-01-31 08:08:09.015 226239 DEBUG oslo_concurrency.lockutils [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:09 np0005603623 nova_compute[226235]: 2026-01-31 08:08:09.016 226239 DEBUG nova.network.neutron [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Refreshing network info cache for port 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:09.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:09.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:10 np0005603623 nova_compute[226235]: 2026-01-31 08:08:10.676 226239 DEBUG nova.network.neutron [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updated VIF entry in instance network info cache for port 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:10 np0005603623 nova_compute[226235]: 2026-01-31 08:08:10.677 226239 DEBUG nova.network.neutron [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updating instance_info_cache with network_info: [{"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:10 np0005603623 nova_compute[226235]: 2026-01-31 08:08:10.726 226239 DEBUG oslo_concurrency.lockutils [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:11.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:11 np0005603623 nova_compute[226235]: 2026-01-31 08:08:11.488 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:11.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:11 np0005603623 nova_compute[226235]: 2026-01-31 08:08:11.761 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:12 np0005603623 nova_compute[226235]: 2026-01-31 08:08:12.234 226239 DEBUG oslo_concurrency.lockutils [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "interface-d4484d63-c590-4676-b3ae-b8e33bd348f1-131dda89-6e7d-4a88-9572-dcd63205fc02" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:12 np0005603623 nova_compute[226235]: 2026-01-31 08:08:12.234 226239 DEBUG oslo_concurrency.lockutils [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-d4484d63-c590-4676-b3ae-b8e33bd348f1-131dda89-6e7d-4a88-9572-dcd63205fc02" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:12 np0005603623 nova_compute[226235]: 2026-01-31 08:08:12.235 226239 DEBUG nova.objects.instance [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'flavor' on Instance uuid d4484d63-c590-4676-b3ae-b8e33bd348f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e252 e252: 3 total, 3 up, 3 in
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:12.650404) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846892650903, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 588, "num_deletes": 252, "total_data_size": 884651, "memory_usage": 896712, "flush_reason": "Manual Compaction"}
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846892728892, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 582597, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38929, "largest_seqno": 39512, "table_properties": {"data_size": 579587, "index_size": 982, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7321, "raw_average_key_size": 19, "raw_value_size": 573451, "raw_average_value_size": 1525, "num_data_blocks": 43, "num_entries": 376, "num_filter_entries": 376, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846864, "oldest_key_time": 1769846864, "file_creation_time": 1769846892, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 78556 microseconds, and 2513 cpu microseconds.
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:08:12 np0005603623 nova_compute[226235]: 2026-01-31 08:08:12.738 226239 DEBUG nova.compute.manager [req-08a96754-279f-4afc-97d3-2831b84ce417 req-4313f7ba-7b7f-486c-afb5-0c78c2928f75 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-changed-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:12 np0005603623 nova_compute[226235]: 2026-01-31 08:08:12.739 226239 DEBUG nova.compute.manager [req-08a96754-279f-4afc-97d3-2831b84ce417 req-4313f7ba-7b7f-486c-afb5-0c78c2928f75 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Refreshing instance network info cache due to event network-changed-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:12 np0005603623 nova_compute[226235]: 2026-01-31 08:08:12.739 226239 DEBUG oslo_concurrency.lockutils [req-08a96754-279f-4afc-97d3-2831b84ce417 req-4313f7ba-7b7f-486c-afb5-0c78c2928f75 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:12 np0005603623 nova_compute[226235]: 2026-01-31 08:08:12.739 226239 DEBUG oslo_concurrency.lockutils [req-08a96754-279f-4afc-97d3-2831b84ce417 req-4313f7ba-7b7f-486c-afb5-0c78c2928f75 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:12 np0005603623 nova_compute[226235]: 2026-01-31 08:08:12.739 226239 DEBUG nova.network.neutron [req-08a96754-279f-4afc-97d3-2831b84ce417 req-4313f7ba-7b7f-486c-afb5-0c78c2928f75 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Refreshing network info cache for port 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:12.728968) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 582597 bytes OK
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:12.729005) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:12.837723) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:12.837793) EVENT_LOG_v1 {"time_micros": 1769846892837782, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:12.837817) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 881303, prev total WAL file size 881303, number of live WAL files 2.
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:12.838343) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(568KB)], [72(11MB)]
Jan 31 03:08:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846892838422, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 12339042, "oldest_snapshot_seqno": -1}
Jan 31 03:08:13 np0005603623 nova_compute[226235]: 2026-01-31 08:08:13.041 226239 DEBUG nova.objects.instance [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'pci_requests' on Instance uuid d4484d63-c590-4676-b3ae-b8e33bd348f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:13 np0005603623 nova_compute[226235]: 2026-01-31 08:08:13.056 226239 DEBUG nova.network.neutron [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6297 keys, 10405537 bytes, temperature: kUnknown
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846893099560, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 10405537, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10361920, "index_size": 26803, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15749, "raw_key_size": 162655, "raw_average_key_size": 25, "raw_value_size": 10247542, "raw_average_value_size": 1627, "num_data_blocks": 1068, "num_entries": 6297, "num_filter_entries": 6297, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769846892, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:13.099943) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 10405537 bytes
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:13.133912) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 47.2 rd, 39.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 11.2 +0.0 blob) out(9.9 +0.0 blob), read-write-amplify(39.0) write-amplify(17.9) OK, records in: 6813, records dropped: 516 output_compression: NoCompression
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:13.133952) EVENT_LOG_v1 {"time_micros": 1769846893133934, "job": 44, "event": "compaction_finished", "compaction_time_micros": 261253, "compaction_time_cpu_micros": 20392, "output_level": 6, "num_output_files": 1, "total_output_size": 10405537, "num_input_records": 6813, "num_output_records": 6297, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846893134362, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846893136507, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:12.838233) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:13.136669) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:13.136677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:13.136679) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:13.136684) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:08:13.136686) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:08:13 np0005603623 nova_compute[226235]: 2026-01-31 08:08:13.218 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:13.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:13.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:14 np0005603623 nova_compute[226235]: 2026-01-31 08:08:14.005 226239 DEBUG nova.policy [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60f2b878669c4c529b35e04860cc6d64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c73212dc7c84914b6c934d45b6826f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:08:14 np0005603623 nova_compute[226235]: 2026-01-31 08:08:14.248 226239 DEBUG nova.network.neutron [req-08a96754-279f-4afc-97d3-2831b84ce417 req-4313f7ba-7b7f-486c-afb5-0c78c2928f75 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updated VIF entry in instance network info cache for port 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:14 np0005603623 nova_compute[226235]: 2026-01-31 08:08:14.248 226239 DEBUG nova.network.neutron [req-08a96754-279f-4afc-97d3-2831b84ce417 req-4313f7ba-7b7f-486c-afb5-0c78c2928f75 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updating instance_info_cache with network_info: [{"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:14 np0005603623 nova_compute[226235]: 2026-01-31 08:08:14.273 226239 DEBUG oslo_concurrency.lockutils [req-08a96754-279f-4afc-97d3-2831b84ce417 req-4313f7ba-7b7f-486c-afb5-0c78c2928f75 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:08:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2105483700' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:08:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:08:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2105483700' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:08:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:15.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:15 np0005603623 nova_compute[226235]: 2026-01-31 08:08:15.341 226239 DEBUG nova.network.neutron [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Successfully updated port: 131dda89-6e7d-4a88-9572-dcd63205fc02 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:08:15 np0005603623 nova_compute[226235]: 2026-01-31 08:08:15.403 226239 DEBUG oslo_concurrency.lockutils [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:15 np0005603623 nova_compute[226235]: 2026-01-31 08:08:15.404 226239 DEBUG oslo_concurrency.lockutils [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:15 np0005603623 nova_compute[226235]: 2026-01-31 08:08:15.404 226239 DEBUG nova.network.neutron [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:08:15 np0005603623 nova_compute[226235]: 2026-01-31 08:08:15.408 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:15 np0005603623 nova_compute[226235]: 2026-01-31 08:08:15.489 226239 DEBUG nova.compute.manager [req-97b1b420-e715-4dfa-a716-638a08e976c7 req-221b4496-8719-47e7-a745-fcb0d775bea6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-changed-131dda89-6e7d-4a88-9572-dcd63205fc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:15 np0005603623 nova_compute[226235]: 2026-01-31 08:08:15.489 226239 DEBUG nova.compute.manager [req-97b1b420-e715-4dfa-a716-638a08e976c7 req-221b4496-8719-47e7-a745-fcb0d775bea6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Refreshing instance network info cache due to event network-changed-131dda89-6e7d-4a88-9572-dcd63205fc02. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:15 np0005603623 nova_compute[226235]: 2026-01-31 08:08:15.490 226239 DEBUG oslo_concurrency.lockutils [req-97b1b420-e715-4dfa-a716-638a08e976c7 req-221b4496-8719-47e7-a745-fcb0d775bea6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:15 np0005603623 nova_compute[226235]: 2026-01-31 08:08:15.649 226239 WARNING nova.network.neutron [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] 455fab34-b015-4d97-a96d-f7ebd7f7555f already exists in list: networks containing: ['455fab34-b015-4d97-a96d-f7ebd7f7555f']. ignoring it#033[00m
Jan 31 03:08:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:15.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:16 np0005603623 nova_compute[226235]: 2026-01-31 08:08:16.764 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:17.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.464 226239 DEBUG nova.network.neutron [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updating instance_info_cache with network_info: [{"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.530 226239 DEBUG oslo_concurrency.lockutils [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.533 226239 DEBUG oslo_concurrency.lockutils [req-97b1b420-e715-4dfa-a716-638a08e976c7 req-221b4496-8719-47e7-a745-fcb0d775bea6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.533 226239 DEBUG nova.network.neutron [req-97b1b420-e715-4dfa-a716-638a08e976c7 req-221b4496-8719-47e7-a745-fcb0d775bea6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Refreshing network info cache for port 131dda89-6e7d-4a88-9572-dcd63205fc02 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.538 226239 DEBUG nova.virt.libvirt.vif [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1091698191',display_name='tempest-tempest.common.compute-instance-1091698191',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1091698191',id=70,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:07:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-uhssxba1',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:07:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=d4484d63-c590-4676-b3ae-b8e33bd348f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.538 226239 DEBUG nova.network.os_vif_util [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.539 226239 DEBUG nova.network.os_vif_util [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.540 226239 DEBUG os_vif [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.541 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.541 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.542 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.547 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.548 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap131dda89-6e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.549 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap131dda89-6e, col_values=(('external_ids', {'iface-id': '131dda89-6e7d-4a88-9572-dcd63205fc02', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:47:2e', 'vm-uuid': 'd4484d63-c590-4676-b3ae-b8e33bd348f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.550 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:17 np0005603623 NetworkManager[48970]: <info>  [1769846897.5519] manager: (tap131dda89-6e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.555 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.557 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.558 226239 INFO os_vif [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e')#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.559 226239 DEBUG nova.virt.libvirt.vif [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1091698191',display_name='tempest-tempest.common.compute-instance-1091698191',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1091698191',id=70,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:07:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-uhssxba1',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:07:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=d4484d63-c590-4676-b3ae-b8e33bd348f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.560 226239 DEBUG nova.network.os_vif_util [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.560 226239 DEBUG nova.network.os_vif_util [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.564 226239 DEBUG nova.virt.libvirt.guest [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] attach device xml: <interface type="ethernet">
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:b3:47:2e"/>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  <target dev="tap131dda89-6e"/>
Jan 31 03:08:17 np0005603623 nova_compute[226235]: </interface>
Jan 31 03:08:17 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:08:17 np0005603623 NetworkManager[48970]: <info>  [1769846897.5756] manager: (tap131dda89-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Jan 31 03:08:17 np0005603623 kernel: tap131dda89-6e: entered promiscuous mode
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.577 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:17Z|00261|binding|INFO|Claiming lport 131dda89-6e7d-4a88-9572-dcd63205fc02 for this chassis.
Jan 31 03:08:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:17Z|00262|binding|INFO|131dda89-6e7d-4a88-9572-dcd63205fc02: Claiming fa:16:3e:b3:47:2e 10.100.0.10
Jan 31 03:08:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:17Z|00263|binding|INFO|Setting lport 131dda89-6e7d-4a88-9572-dcd63205fc02 ovn-installed in OVS
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.583 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:17Z|00264|binding|INFO|Setting lport 131dda89-6e7d-4a88-9572-dcd63205fc02 up in Southbound
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.588 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.587 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:47:2e 10.100.0.10'], port_security=['fa:16:3e:b3:47:2e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1951371451', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd4484d63-c590-4676-b3ae-b8e33bd348f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1951371451', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fdcf3a61-8bd1-47a3-8e6c-d6fed17d2331', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=131dda89-6e7d-4a88-9572-dcd63205fc02) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.589 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 131dda89-6e7d-4a88-9572-dcd63205fc02 in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f bound to our chassis#033[00m
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.590 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.593 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:17 np0005603623 systemd-udevd[256876]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.607 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[371fd9bf-14d8-41c7-983c-509780be9e9b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:17 np0005603623 NetworkManager[48970]: <info>  [1769846897.6144] device (tap131dda89-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:08:17 np0005603623 NetworkManager[48970]: <info>  [1769846897.6153] device (tap131dda89-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.627 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[554c7c6a-dc34-4ed6-b9ff-d7e63a7904c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.629 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[81c14e72-f4f8-45a9-b1b6-eecedd8f0376]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.650 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[79e2a264-4367-4a0d-8afa-803e8779731b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.663 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[99580b62-2f42-4e83-9ad4-8f7729807e12]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600094, 'reachable_time': 39949, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256883, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.675 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8ebb730e-add2-4bfa-9068-d7473b7c90ae]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600101, 'tstamp': 600101}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256884, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600103, 'tstamp': 600103}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256884, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.676 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.678 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.681 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.681 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.681 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.682 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:17.682 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.693 226239 DEBUG nova.virt.libvirt.driver [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.693 226239 DEBUG nova.virt.libvirt.driver [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.693 226239 DEBUG nova.virt.libvirt.driver [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:63:1a:05, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.693 226239 DEBUG nova.virt.libvirt.driver [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:b3:47:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:08:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:17.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.758 226239 DEBUG nova.virt.libvirt.guest [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  <nova:name>tempest-tempest.common.compute-instance-1091698191</nova:name>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:08:17</nova:creationTime>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:08:17 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:    <nova:port uuid="00e779f8-ccb2-4b71-bce9-7a3c9df3b85c">
Jan 31 03:08:17 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:    <nova:port uuid="131dda89-6e7d-4a88-9572-dcd63205fc02">
Jan 31 03:08:17 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:08:17 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:08:17 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:08:17 np0005603623 nova_compute[226235]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:08:17 np0005603623 nova_compute[226235]: 2026-01-31 08:08:17.803 226239 DEBUG oslo_concurrency.lockutils [None req-c5791178-6703-4873-95b3-7f64523d931b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-d4484d63-c590-4676-b3ae-b8e33bd348f1-131dda89-6e7d-4a88-9572-dcd63205fc02" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:18 np0005603623 nova_compute[226235]: 2026-01-31 08:08:18.220 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 e253: 3 total, 3 up, 3 in
Jan 31 03:08:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:18 np0005603623 nova_compute[226235]: 2026-01-31 08:08:18.569 226239 DEBUG nova.compute.manager [req-f542db8c-883f-40be-bba6-98f0f242b72b req-5b0ee15d-ddbd-4182-bde2-0d5a209bf167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:18 np0005603623 nova_compute[226235]: 2026-01-31 08:08:18.570 226239 DEBUG oslo_concurrency.lockutils [req-f542db8c-883f-40be-bba6-98f0f242b72b req-5b0ee15d-ddbd-4182-bde2-0d5a209bf167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:18 np0005603623 nova_compute[226235]: 2026-01-31 08:08:18.571 226239 DEBUG oslo_concurrency.lockutils [req-f542db8c-883f-40be-bba6-98f0f242b72b req-5b0ee15d-ddbd-4182-bde2-0d5a209bf167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:18 np0005603623 nova_compute[226235]: 2026-01-31 08:08:18.571 226239 DEBUG oslo_concurrency.lockutils [req-f542db8c-883f-40be-bba6-98f0f242b72b req-5b0ee15d-ddbd-4182-bde2-0d5a209bf167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:18 np0005603623 nova_compute[226235]: 2026-01-31 08:08:18.572 226239 DEBUG nova.compute.manager [req-f542db8c-883f-40be-bba6-98f0f242b72b req-5b0ee15d-ddbd-4182-bde2-0d5a209bf167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] No waiting events found dispatching network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:18 np0005603623 nova_compute[226235]: 2026-01-31 08:08:18.572 226239 WARNING nova.compute.manager [req-f542db8c-883f-40be-bba6-98f0f242b72b req-5b0ee15d-ddbd-4182-bde2-0d5a209bf167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received unexpected event network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:08:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:08:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:08:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:08:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:08:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:08:18 np0005603623 nova_compute[226235]: 2026-01-31 08:08:18.857 226239 DEBUG nova.network.neutron [req-97b1b420-e715-4dfa-a716-638a08e976c7 req-221b4496-8719-47e7-a745-fcb0d775bea6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updated VIF entry in instance network info cache for port 131dda89-6e7d-4a88-9572-dcd63205fc02. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:18 np0005603623 nova_compute[226235]: 2026-01-31 08:08:18.857 226239 DEBUG nova.network.neutron [req-97b1b420-e715-4dfa-a716-638a08e976c7 req-221b4496-8719-47e7-a745-fcb0d775bea6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updating instance_info_cache with network_info: [{"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:18 np0005603623 nova_compute[226235]: 2026-01-31 08:08:18.957 226239 DEBUG oslo_concurrency.lockutils [req-97b1b420-e715-4dfa-a716-638a08e976c7 req-221b4496-8719-47e7-a745-fcb0d775bea6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.011 226239 DEBUG oslo_concurrency.lockutils [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "interface-d4484d63-c590-4676-b3ae-b8e33bd348f1-131dda89-6e7d-4a88-9572-dcd63205fc02" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.013 226239 DEBUG oslo_concurrency.lockutils [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-d4484d63-c590-4676-b3ae-b8e33bd348f1-131dda89-6e7d-4a88-9572-dcd63205fc02" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.038 226239 DEBUG nova.objects.instance [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'flavor' on Instance uuid d4484d63-c590-4676-b3ae-b8e33bd348f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.081 226239 DEBUG nova.virt.libvirt.vif [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1091698191',display_name='tempest-tempest.common.compute-instance-1091698191',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1091698191',id=70,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:07:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-uhssxba1',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:07:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=d4484d63-c590-4676-b3ae-b8e33bd348f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.082 226239 DEBUG nova.network.os_vif_util [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.083 226239 DEBUG nova.network.os_vif_util [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.085 226239 DEBUG nova.virt.libvirt.guest [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b3:47:2e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap131dda89-6e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.087 226239 DEBUG nova.virt.libvirt.guest [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b3:47:2e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap131dda89-6e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.088 226239 DEBUG nova.virt.libvirt.driver [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Attempting to detach device tap131dda89-6e from instance d4484d63-c590-4676-b3ae-b8e33bd348f1 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.089 226239 DEBUG nova.virt.libvirt.guest [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] detach device xml: <interface type="ethernet">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:b3:47:2e"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <target dev="tap131dda89-6e"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]: </interface>
Jan 31 03:08:19 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.106 226239 DEBUG nova.virt.libvirt.guest [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b3:47:2e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap131dda89-6e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.108 226239 DEBUG nova.virt.libvirt.guest [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:b3:47:2e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap131dda89-6e"/></interface>not found in domain: <domain type='kvm' id='31'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <name>instance-00000046</name>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <uuid>d4484d63-c590-4676-b3ae-b8e33bd348f1</uuid>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:name>tempest-tempest.common.compute-instance-1091698191</nova:name>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:08:17</nova:creationTime>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:port uuid="00e779f8-ccb2-4b71-bce9-7a3c9df3b85c">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:port uuid="131dda89-6e7d-4a88-9572-dcd63205fc02">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:08:19 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <memory unit='KiB'>131072</memory>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <resource>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <partition>/machine</partition>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </resource>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <sysinfo type='smbios'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <entry name='serial'>d4484d63-c590-4676-b3ae-b8e33bd348f1</entry>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <entry name='uuid'>d4484d63-c590-4676-b3ae-b8e33bd348f1</entry>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <boot dev='hd'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <smbios mode='sysinfo'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <vmcoreinfo state='on'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <model fallback='forbid'>Nehalem</model>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <feature policy='require' name='x2apic'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <feature policy='require' name='hypervisor'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <feature policy='require' name='vme'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <clock offset='utc'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <timer name='hpet' present='no'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <on_reboot>restart</on_reboot>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <on_crash>destroy</on_crash>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <disk type='network' device='disk'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/d4484d63-c590-4676-b3ae-b8e33bd348f1_disk' index='2'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target dev='vda' bus='virtio'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='virtio-disk0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <disk type='network' device='cdrom'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/d4484d63-c590-4676-b3ae-b8e33bd348f1_disk.config' index='1'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target dev='sda' bus='sata'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <readonly/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='sata0-0-0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pcie.0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='1' port='0x10'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='2' port='0x11'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='3' port='0x12'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.3'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='4' port='0x13'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.4'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='5' port='0x14'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.5'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='6' port='0x15'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.6'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='7' port='0x16'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.7'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='8' port='0x17'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.8'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='9' port='0x18'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.9'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='10' port='0x19'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.10'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='11' port='0x1a'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.11'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='12' port='0x1b'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.12'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='13' port='0x1c'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.13'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='14' port='0x1d'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.14'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='15' port='0x1e'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.15'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='16' port='0x1f'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.16'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='17' port='0x20'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.17'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='18' port='0x21'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.18'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='19' port='0x22'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.19'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='20' port='0x23'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.20'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='21' port='0x24'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.21'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='22' port='0x25'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.22'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='23' port='0x26'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.23'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='24' port='0x27'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.24'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='25' port='0x28'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.25'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-pci-bridge'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.26'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='usb'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='sata' index='0'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='ide'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:63:1a:05'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target dev='tap00e779f8-cc'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='net0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:b3:47:2e'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target dev='tap131dda89-6e'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='net1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <serial type='pty'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <source path='/dev/pts/0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1/console.log' append='off'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target type='isa-serial' port='0'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <model name='isa-serial'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      </target>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='serial0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <source path='/dev/pts/0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1/console.log' append='off'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target type='serial' port='0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='serial0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </console>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <input type='tablet' bus='usb'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='input0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <input type='mouse' bus='ps2'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='input1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <input type='keyboard' bus='ps2'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='input2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <listen type='address' address='::0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </graphics>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <audio id='1' type='none'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='video0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <watchdog model='itco' action='reset'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='watchdog0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </watchdog>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <memballoon model='virtio'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <stats period='10'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='balloon0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <rng model='virtio'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='rng0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <label>system_u:system_r:svirt_t:s0:c182,c955</label>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c182,c955</imagelabel>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </seclabel>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <label>+107:+107</label>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <imagelabel>+107:+107</imagelabel>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </seclabel>
Jan 31 03:08:19 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:08:19 np0005603623 nova_compute[226235]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.108 226239 INFO nova.virt.libvirt.driver [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully detached device tap131dda89-6e from instance d4484d63-c590-4676-b3ae-b8e33bd348f1 from the persistent domain config.
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.108 226239 DEBUG nova.virt.libvirt.driver [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] (1/8): Attempting to detach device tap131dda89-6e with device alias net1 from instance d4484d63-c590-4676-b3ae-b8e33bd348f1 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.109 226239 DEBUG nova.virt.libvirt.guest [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] detach device xml: <interface type="ethernet">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:b3:47:2e"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <target dev="tap131dda89-6e"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]: </interface>
Jan 31 03:08:19 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 31 03:08:19 np0005603623 kernel: tap131dda89-6e (unregistering): left promiscuous mode
Jan 31 03:08:19 np0005603623 NetworkManager[48970]: <info>  [1769846899.2151] device (tap131dda89-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:08:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:19.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:19Z|00265|binding|INFO|Releasing lport 131dda89-6e7d-4a88-9572-dcd63205fc02 from this chassis (sb_readonly=0)
Jan 31 03:08:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:19Z|00266|binding|INFO|Setting lport 131dda89-6e7d-4a88-9572-dcd63205fc02 down in Southbound
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.258 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:08:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:19Z|00267|binding|INFO|Removing iface tap131dda89-6e ovn-installed in OVS
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.260 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.261 226239 DEBUG nova.virt.libvirt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Received event <DeviceRemovedEvent: 1769846899.2613792, d4484d63-c590-4676-b3ae-b8e33bd348f1 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.263 226239 DEBUG nova.virt.libvirt.driver [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Start waiting for the detach event from libvirt for device tap131dda89-6e with device alias net1 for instance d4484d63-c590-4676-b3ae-b8e33bd348f1 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.263 226239 DEBUG nova.virt.libvirt.guest [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b3:47:2e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap131dda89-6e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.263 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.267 226239 DEBUG nova.virt.libvirt.guest [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:b3:47:2e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap131dda89-6e"/></interface>not found in domain: <domain type='kvm' id='31'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <name>instance-00000046</name>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <uuid>d4484d63-c590-4676-b3ae-b8e33bd348f1</uuid>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:name>tempest-tempest.common.compute-instance-1091698191</nova:name>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:08:17</nova:creationTime>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:port uuid="00e779f8-ccb2-4b71-bce9-7a3c9df3b85c">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:port uuid="131dda89-6e7d-4a88-9572-dcd63205fc02">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:08:19 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <memory unit='KiB'>131072</memory>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <resource>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <partition>/machine</partition>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </resource>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <sysinfo type='smbios'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <entry name='serial'>d4484d63-c590-4676-b3ae-b8e33bd348f1</entry>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <entry name='uuid'>d4484d63-c590-4676-b3ae-b8e33bd348f1</entry>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <boot dev='hd'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <smbios mode='sysinfo'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <vmcoreinfo state='on'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <model fallback='forbid'>Nehalem</model>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <feature policy='require' name='x2apic'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <feature policy='require' name='hypervisor'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <feature policy='require' name='vme'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <clock offset='utc'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <timer name='hpet' present='no'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <on_reboot>restart</on_reboot>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <on_crash>destroy</on_crash>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <disk type='network' device='disk'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/d4484d63-c590-4676-b3ae-b8e33bd348f1_disk' index='2'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target dev='vda' bus='virtio'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='virtio-disk0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <disk type='network' device='cdrom'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <auth username='openstack'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <source protocol='rbd' name='vms/d4484d63-c590-4676-b3ae-b8e33bd348f1_disk.config' index='1'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target dev='sda' bus='sata'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <readonly/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='sata0-0-0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pcie.0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='1' port='0x10'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='2' port='0x11'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='3' port='0x12'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.3'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='4' port='0x13'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.4'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='5' port='0x14'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.5'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='6' port='0x15'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.6'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='7' port='0x16'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.7'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='8' port='0x17'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.8'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='9' port='0x18'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.9'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='10' port='0x19'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.10'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='11' port='0x1a'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.11'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='12' port='0x1b'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.12'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='13' port='0x1c'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.13'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='14' port='0x1d'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.14'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='15' port='0x1e'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.15'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='16' port='0x1f'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.16'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='17' port='0x20'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.17'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='18' port='0x21'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.18'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='19' port='0x22'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.19'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='20' port='0x23'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.20'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='21' port='0x24'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.21'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='22' port='0x25'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.22'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='23' port='0x26'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.23'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='24' port='0x27'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.24'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-root-port'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target chassis='25' port='0x28'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.25'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model name='pcie-pci-bridge'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='pci.26'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='usb'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <controller type='sata' index='0'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='ide'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </controller>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <interface type='ethernet'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <mac address='fa:16:3e:63:1a:05'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target dev='tap00e779f8-cc'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model type='virtio'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <mtu size='1442'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='net0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <serial type='pty'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <source path='/dev/pts/0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1/console.log' append='off'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target type='isa-serial' port='0'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:        <model name='isa-serial'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      </target>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='serial0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <source path='/dev/pts/0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <log file='/var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1/console.log' append='off'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <target type='serial' port='0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='serial0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </console>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <input type='tablet' bus='usb'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='input0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <input type='mouse' bus='ps2'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='input1'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <input type='keyboard' bus='ps2'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='input2'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </input>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <listen type='address' address='::0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </graphics>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <audio id='1' type='none'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='video0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <watchdog model='itco' action='reset'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='watchdog0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </watchdog>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <memballoon model='virtio'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <stats period='10'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='balloon0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <rng model='virtio'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <alias name='rng0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <label>system_u:system_r:svirt_t:s0:c182,c955</label>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c182,c955</imagelabel>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </seclabel>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <label>+107:+107</label>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <imagelabel>+107:+107</imagelabel>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </seclabel>
Jan 31 03:08:19 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:08:19 np0005603623 nova_compute[226235]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.267 226239 INFO nova.virt.libvirt.driver [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully detached device tap131dda89-6e from instance d4484d63-c590-4676-b3ae-b8e33bd348f1 from the live domain config.#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.268 226239 DEBUG nova.virt.libvirt.vif [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1091698191',display_name='tempest-tempest.common.compute-instance-1091698191',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1091698191',id=70,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:07:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-uhssxba1',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:07:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=d4484d63-c590-4676-b3ae-b8e33bd348f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.268 226239 DEBUG nova.network.os_vif_util [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.269 226239 DEBUG nova.network.os_vif_util [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.269 226239 DEBUG os_vif [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.272 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.272 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap131dda89-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.272 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:47:2e 10.100.0.10'], port_security=['fa:16:3e:b3:47:2e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1951371451', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd4484d63-c590-4676-b3ae-b8e33bd348f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1951371451', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fdcf3a61-8bd1-47a3-8e6c-d6fed17d2331', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=131dda89-6e7d-4a88-9572-dcd63205fc02) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.273 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 131dda89-6e7d-4a88-9572-dcd63205fc02 in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f unbound from our chassis#033[00m
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.275 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.275 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.277 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.278 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.280 226239 INFO os_vif [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e')#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.281 226239 DEBUG nova.virt.libvirt.guest [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:name>tempest-tempest.common.compute-instance-1091698191</nova:name>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:08:19</nova:creationTime>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    <nova:port uuid="00e779f8-ccb2-4b71-bce9-7a3c9df3b85c">
Jan 31 03:08:19 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:08:19 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:08:19 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:08:19 np0005603623 nova_compute[226235]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.287 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[487fac13-620b-4ee4-b896-cd9775f093f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.304 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9768d8-d43e-4c11-8e77-cf5d912af539]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.308 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4f88fb77-1ddf-43e6-8f27-3590671509d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:19 np0005603623 podman[256891]: 2026-01-31 08:08:19.331034687 +0000 UTC m=+0.047065394 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.330 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4f7def96-563e-49c0-a9e1-53d6315b0d10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.346 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc1c172-3a1e-4cd7-a867-ca021b821dcc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600094, 'reachable_time': 39949, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256934, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:19 np0005603623 podman[256892]: 2026-01-31 08:08:19.359322289 +0000 UTC m=+0.075351546 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, 
org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.358 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[56632fdf-d4e1-435a-9cfb-22b39998a6e5]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600101, 'tstamp': 600101}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256940, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600103, 'tstamp': 600103}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256940, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.360 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.362 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:19 np0005603623 nova_compute[226235]: 2026-01-31 08:08:19.362 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.363 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.363 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.364 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:19.364 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:19.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.685 226239 DEBUG nova.compute.manager [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.686 226239 DEBUG oslo_concurrency.lockutils [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.686 226239 DEBUG oslo_concurrency.lockutils [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.686 226239 DEBUG oslo_concurrency.lockutils [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.687 226239 DEBUG nova.compute.manager [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] No waiting events found dispatching network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.687 226239 WARNING nova.compute.manager [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received unexpected event network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.688 226239 DEBUG nova.compute.manager [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-vif-unplugged-131dda89-6e7d-4a88-9572-dcd63205fc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.688 226239 DEBUG oslo_concurrency.lockutils [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.688 226239 DEBUG oslo_concurrency.lockutils [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.689 226239 DEBUG oslo_concurrency.lockutils [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.689 226239 DEBUG nova.compute.manager [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] No waiting events found dispatching network-vif-unplugged-131dda89-6e7d-4a88-9572-dcd63205fc02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.690 226239 WARNING nova.compute.manager [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received unexpected event network-vif-unplugged-131dda89-6e7d-4a88-9572-dcd63205fc02 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.691 226239 DEBUG nova.compute.manager [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.691 226239 DEBUG oslo_concurrency.lockutils [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.691 226239 DEBUG oslo_concurrency.lockutils [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.691 226239 DEBUG oslo_concurrency.lockutils [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.691 226239 DEBUG nova.compute.manager [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] No waiting events found dispatching network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:20 np0005603623 nova_compute[226235]: 2026-01-31 08:08:20.692 226239 WARNING nova.compute.manager [req-c20fa5ac-b19f-43a2-a6e3-a4acd7aa4590 req-558a58d8-fc53-4a7a-90cb-88cc236d62c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received unexpected event network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:08:21 np0005603623 nova_compute[226235]: 2026-01-31 08:08:21.209 226239 DEBUG oslo_concurrency.lockutils [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:21 np0005603623 nova_compute[226235]: 2026-01-31 08:08:21.210 226239 DEBUG oslo_concurrency.lockutils [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:21 np0005603623 nova_compute[226235]: 2026-01-31 08:08:21.210 226239 DEBUG nova.network.neutron [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:08:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:21.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:21.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:22 np0005603623 nova_compute[226235]: 2026-01-31 08:08:22.977 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "7f77789f-b530-453e-8213-1c345fa78fac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:22 np0005603623 nova_compute[226235]: 2026-01-31 08:08:22.978 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:22 np0005603623 nova_compute[226235]: 2026-01-31 08:08:22.982 226239 INFO nova.network.neutron [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Port 131dda89-6e7d-4a88-9572-dcd63205fc02 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 31 03:08:22 np0005603623 nova_compute[226235]: 2026-01-31 08:08:22.982 226239 DEBUG nova.network.neutron [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updating instance_info_cache with network_info: [{"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.044 226239 DEBUG nova.compute.manager [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.047 226239 DEBUG oslo_concurrency.lockutils [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.111 226239 DEBUG oslo_concurrency.lockutils [None req-fbae7ae5-3be6-451c-b680-2e74a7f5e11c 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-d4484d63-c590-4676-b3ae-b8e33bd348f1-131dda89-6e7d-4a88-9572-dcd63205fc02" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 4.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.174 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.175 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.181 226239 DEBUG nova.virt.hardware [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.182 226239 INFO nova.compute.claims [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.222 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:23.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.360 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:23.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:08:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1050766205' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.808 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.813 226239 DEBUG nova.compute.provider_tree [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.864 226239 DEBUG nova.scheduler.client.report [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.907 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:23 np0005603623 nova_compute[226235]: 2026-01-31 08:08:23.908 226239 DEBUG nova.compute.manager [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.016 226239 DEBUG nova.compute.manager [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.016 226239 DEBUG nova.network.neutron [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.072 226239 INFO nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.113 226239 DEBUG nova.compute.manager [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.274 226239 DEBUG nova.compute.manager [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.275 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.275 226239 INFO nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Creating image(s)#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.302 226239 DEBUG nova.storage.rbd_utils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 7f77789f-b530-453e-8213-1c345fa78fac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.327 226239 DEBUG nova.storage.rbd_utils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 7f77789f-b530-453e-8213-1c345fa78fac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.354 226239 DEBUG nova.storage.rbd_utils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 7f77789f-b530-453e-8213-1c345fa78fac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.358 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.379 226239 DEBUG nova.policy [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16d731f5875748ca9b8036b2ba061042', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3469c253459e40e39dcf5bcb6a32008f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.382 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.432 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.433 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.434 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.434 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.466 226239 DEBUG nova.storage.rbd_utils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 7f77789f-b530-453e-8213-1c345fa78fac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.471 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 7f77789f-b530-453e-8213-1c345fa78fac_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.880 226239 DEBUG nova.compute.manager [req-441ad6d2-6a22-4215-8bd5-c3fcf4779dac req-422b4eca-5771-48aa-b80b-264baec10d01 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-changed-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.882 226239 DEBUG nova.compute.manager [req-441ad6d2-6a22-4215-8bd5-c3fcf4779dac req-422b4eca-5771-48aa-b80b-264baec10d01 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Refreshing instance network info cache due to event network-changed-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.882 226239 DEBUG oslo_concurrency.lockutils [req-441ad6d2-6a22-4215-8bd5-c3fcf4779dac req-422b4eca-5771-48aa-b80b-264baec10d01 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.883 226239 DEBUG oslo_concurrency.lockutils [req-441ad6d2-6a22-4215-8bd5-c3fcf4779dac req-422b4eca-5771-48aa-b80b-264baec10d01 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:24 np0005603623 nova_compute[226235]: 2026-01-31 08:08:24.884 226239 DEBUG nova.network.neutron [req-441ad6d2-6a22-4215-8bd5-c3fcf4779dac req-422b4eca-5771-48aa-b80b-264baec10d01 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Refreshing network info cache for port 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:25 np0005603623 nova_compute[226235]: 2026-01-31 08:08:25.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:25 np0005603623 nova_compute[226235]: 2026-01-31 08:08:25.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:08:25 np0005603623 nova_compute[226235]: 2026-01-31 08:08:25.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:08:25 np0005603623 nova_compute[226235]: 2026-01-31 08:08:25.230 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:08:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:25.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:25.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:25 np0005603623 nova_compute[226235]: 2026-01-31 08:08:25.985 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:26 np0005603623 nova_compute[226235]: 2026-01-31 08:08:26.064 226239 DEBUG nova.network.neutron [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Successfully created port: b8f1eb95-1365-46ea-9b13-e055b5689380 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:08:26 np0005603623 nova_compute[226235]: 2026-01-31 08:08:26.291 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 7f77789f-b530-453e-8213-1c345fa78fac_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.821s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:26 np0005603623 nova_compute[226235]: 2026-01-31 08:08:26.383 226239 DEBUG nova.storage.rbd_utils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] resizing rbd image 7f77789f-b530-453e-8213-1c345fa78fac_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:08:26 np0005603623 nova_compute[226235]: 2026-01-31 08:08:26.557 226239 DEBUG nova.objects.instance [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'migration_context' on Instance uuid 7f77789f-b530-453e-8213-1c345fa78fac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:26 np0005603623 nova_compute[226235]: 2026-01-31 08:08:26.579 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:08:26 np0005603623 nova_compute[226235]: 2026-01-31 08:08:26.580 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Ensure instance console log exists: /var/lib/nova/instances/7f77789f-b530-453e-8213-1c345fa78fac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:08:26 np0005603623 nova_compute[226235]: 2026-01-31 08:08:26.580 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:26 np0005603623 nova_compute[226235]: 2026-01-31 08:08:26.580 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:26 np0005603623 nova_compute[226235]: 2026-01-31 08:08:26.581 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:27 np0005603623 nova_compute[226235]: 2026-01-31 08:08:27.067 226239 DEBUG nova.network.neutron [req-441ad6d2-6a22-4215-8bd5-c3fcf4779dac req-422b4eca-5771-48aa-b80b-264baec10d01 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updated VIF entry in instance network info cache for port 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:27 np0005603623 nova_compute[226235]: 2026-01-31 08:08:27.067 226239 DEBUG nova.network.neutron [req-441ad6d2-6a22-4215-8bd5-c3fcf4779dac req-422b4eca-5771-48aa-b80b-264baec10d01 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updating instance_info_cache with network_info: [{"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:27 np0005603623 nova_compute[226235]: 2026-01-31 08:08:27.162 226239 DEBUG oslo_concurrency.lockutils [req-441ad6d2-6a22-4215-8bd5-c3fcf4779dac req-422b4eca-5771-48aa-b80b-264baec10d01 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:27 np0005603623 nova_compute[226235]: 2026-01-31 08:08:27.163 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:27 np0005603623 nova_compute[226235]: 2026-01-31 08:08:27.163 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:08:27 np0005603623 nova_compute[226235]: 2026-01-31 08:08:27.163 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d4484d63-c590-4676-b3ae-b8e33bd348f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:27.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:27.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:28 np0005603623 nova_compute[226235]: 2026-01-31 08:08:28.271 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:08:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.019 226239 DEBUG nova.network.neutron [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Successfully updated port: b8f1eb95-1365-46ea-9b13-e055b5689380 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.043 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "refresh_cache-7f77789f-b530-453e-8213-1c345fa78fac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.044 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquired lock "refresh_cache-7f77789f-b530-453e-8213-1c345fa78fac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.044 226239 DEBUG nova.network.neutron [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:08:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:29.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.384 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.393 226239 DEBUG nova.network.neutron [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:08:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:29.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.934 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updating instance_info_cache with network_info: [{"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.960 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-d4484d63-c590-4676-b3ae-b8e33bd348f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.960 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.961 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.961 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.961 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.962 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.962 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.991 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.991 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.992 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.992 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:08:29 np0005603623 nova_compute[226235]: 2026-01-31 08:08:29.992 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:30.102 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:30.102 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:30.102 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:08:30 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/872390666' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.391 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.399s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.502 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000046 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.502 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000046 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.603 226239 DEBUG nova.network.neutron [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Updating instance_info_cache with network_info: [{"id": "b8f1eb95-1365-46ea-9b13-e055b5689380", "address": "fa:16:3e:36:f9:b9", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8f1eb95-13", "ovs_interfaceid": "b8f1eb95-1365-46ea-9b13-e055b5689380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.628 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.630 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4422MB free_disk=20.809898376464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.630 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.630 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.666 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Releasing lock "refresh_cache-7f77789f-b530-453e-8213-1c345fa78fac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.666 226239 DEBUG nova.compute.manager [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Instance network_info: |[{"id": "b8f1eb95-1365-46ea-9b13-e055b5689380", "address": "fa:16:3e:36:f9:b9", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8f1eb95-13", "ovs_interfaceid": "b8f1eb95-1365-46ea-9b13-e055b5689380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.668 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Start _get_guest_xml network_info=[{"id": "b8f1eb95-1365-46ea-9b13-e055b5689380", "address": "fa:16:3e:36:f9:b9", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8f1eb95-13", "ovs_interfaceid": "b8f1eb95-1365-46ea-9b13-e055b5689380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.672 226239 WARNING nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.676 226239 DEBUG nova.virt.libvirt.host [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.677 226239 DEBUG nova.virt.libvirt.host [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.680 226239 DEBUG nova.virt.libvirt.host [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.680 226239 DEBUG nova.virt.libvirt.host [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.681 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.681 226239 DEBUG nova.virt.hardware [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.682 226239 DEBUG nova.virt.hardware [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.682 226239 DEBUG nova.virt.hardware [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.682 226239 DEBUG nova.virt.hardware [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.682 226239 DEBUG nova.virt.hardware [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.682 226239 DEBUG nova.virt.hardware [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.683 226239 DEBUG nova.virt.hardware [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.683 226239 DEBUG nova.virt.hardware [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.683 226239 DEBUG nova.virt.hardware [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.683 226239 DEBUG nova.virt.hardware [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.683 226239 DEBUG nova.virt.hardware [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.685 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.775 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance d4484d63-c590-4676-b3ae-b8e33bd348f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.775 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 7f77789f-b530-453e-8213-1c345fa78fac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.777 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.777 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:08:30 np0005603623 nova_compute[226235]: 2026-01-31 08:08:30.842 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:08:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3048453611' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.101 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.119 226239 DEBUG nova.storage.rbd_utils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 7f77789f-b530-453e-8213-1c345fa78fac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.122 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.199 226239 DEBUG nova.compute.manager [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Received event network-changed-b8f1eb95-1365-46ea-9b13-e055b5689380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.199 226239 DEBUG nova.compute.manager [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Refreshing instance network info cache due to event network-changed-b8f1eb95-1365-46ea-9b13-e055b5689380. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.199 226239 DEBUG oslo_concurrency.lockutils [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-7f77789f-b530-453e-8213-1c345fa78fac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.200 226239 DEBUG oslo_concurrency.lockutils [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-7f77789f-b530-453e-8213-1c345fa78fac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.200 226239 DEBUG nova.network.neutron [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Refreshing network info cache for port b8f1eb95-1365-46ea-9b13-e055b5689380 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:08:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3915440364' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.264 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.269 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:08:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:31.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.323 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.414 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.414 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:08:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1180628848' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.545 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.547 226239 DEBUG nova.virt.libvirt.vif [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:08:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-328804424',display_name='tempest-DeleteServersTestJSON-server-328804424',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-328804424',id=75,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-b3o36avc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-8
08715310-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:08:24Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=7f77789f-b530-453e-8213-1c345fa78fac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b8f1eb95-1365-46ea-9b13-e055b5689380", "address": "fa:16:3e:36:f9:b9", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8f1eb95-13", "ovs_interfaceid": "b8f1eb95-1365-46ea-9b13-e055b5689380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.547 226239 DEBUG nova.network.os_vif_util [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "b8f1eb95-1365-46ea-9b13-e055b5689380", "address": "fa:16:3e:36:f9:b9", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8f1eb95-13", "ovs_interfaceid": "b8f1eb95-1365-46ea-9b13-e055b5689380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.548 226239 DEBUG nova.network.os_vif_util [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:f9:b9,bridge_name='br-int',has_traffic_filtering=True,id=b8f1eb95-1365-46ea-9b13-e055b5689380,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8f1eb95-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.549 226239 DEBUG nova.objects.instance [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f77789f-b530-453e-8213-1c345fa78fac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.591 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  <uuid>7f77789f-b530-453e-8213-1c345fa78fac</uuid>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  <name>instance-0000004b</name>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <nova:name>tempest-DeleteServersTestJSON-server-328804424</nova:name>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:08:30</nova:creationTime>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <nova:user uuid="16d731f5875748ca9b8036b2ba061042">tempest-DeleteServersTestJSON-808715310-project-member</nova:user>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <nova:project uuid="3469c253459e40e39dcf5bcb6a32008f">tempest-DeleteServersTestJSON-808715310</nova:project>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <nova:port uuid="b8f1eb95-1365-46ea-9b13-e055b5689380">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <entry name="serial">7f77789f-b530-453e-8213-1c345fa78fac</entry>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <entry name="uuid">7f77789f-b530-453e-8213-1c345fa78fac</entry>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/7f77789f-b530-453e-8213-1c345fa78fac_disk">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/7f77789f-b530-453e-8213-1c345fa78fac_disk.config">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:36:f9:b9"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <target dev="tapb8f1eb95-13"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/7f77789f-b530-453e-8213-1c345fa78fac/console.log" append="off"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:08:31 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:08:31 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:08:31 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:08:31 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.592 226239 DEBUG nova.compute.manager [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Preparing to wait for external event network-vif-plugged-b8f1eb95-1365-46ea-9b13-e055b5689380 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.592 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "7f77789f-b530-453e-8213-1c345fa78fac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.592 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.592 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.593 226239 DEBUG nova.virt.libvirt.vif [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:08:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-328804424',display_name='tempest-DeleteServersTestJSON-server-328804424',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-328804424',id=75,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-b3o36avc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:08:24Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=7f77789f-b530-453e-8213-1c345fa78fac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b8f1eb95-1365-46ea-9b13-e055b5689380", "address": "fa:16:3e:36:f9:b9", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8f1eb95-13", "ovs_interfaceid": "b8f1eb95-1365-46ea-9b13-e055b5689380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.593 226239 DEBUG nova.network.os_vif_util [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "b8f1eb95-1365-46ea-9b13-e055b5689380", "address": "fa:16:3e:36:f9:b9", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8f1eb95-13", "ovs_interfaceid": "b8f1eb95-1365-46ea-9b13-e055b5689380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.594 226239 DEBUG nova.network.os_vif_util [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:f9:b9,bridge_name='br-int',has_traffic_filtering=True,id=b8f1eb95-1365-46ea-9b13-e055b5689380,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8f1eb95-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.594 226239 DEBUG os_vif [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:f9:b9,bridge_name='br-int',has_traffic_filtering=True,id=b8f1eb95-1365-46ea-9b13-e055b5689380,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8f1eb95-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.595 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.595 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.595 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.597 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.598 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8f1eb95-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.598 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb8f1eb95-13, col_values=(('external_ids', {'iface-id': 'b8f1eb95-1365-46ea-9b13-e055b5689380', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:f9:b9', 'vm-uuid': '7f77789f-b530-453e-8213-1c345fa78fac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.599 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:31 np0005603623 NetworkManager[48970]: <info>  [1769846911.6005] manager: (tapb8f1eb95-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.603 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.604 226239 INFO os_vif [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:f9:b9,bridge_name='br-int',has_traffic_filtering=True,id=b8f1eb95-1365-46ea-9b13-e055b5689380,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8f1eb95-13')#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.726 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.727 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.727 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No VIF found with MAC fa:16:3e:36:f9:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.728 226239 INFO nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Using config drive#033[00m
Jan 31 03:08:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000064s ======
Jan 31 03:08:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:31.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 31 03:08:31 np0005603623 nova_compute[226235]: 2026-01-31 08:08:31.752 226239 DEBUG nova.storage.rbd_utils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 7f77789f-b530-453e-8213-1c345fa78fac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:08:32 np0005603623 nova_compute[226235]: 2026-01-31 08:08:32.608 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:32 np0005603623 nova_compute[226235]: 2026-01-31 08:08:32.699 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:32 np0005603623 nova_compute[226235]: 2026-01-31 08:08:32.700 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.081 226239 INFO nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Creating config drive at /var/lib/nova/instances/7f77789f-b530-453e-8213-1c345fa78fac/disk.config#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.085 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7f77789f-b530-453e-8213-1c345fa78fac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp8_uclvjz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.204 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7f77789f-b530-453e-8213-1c345fa78fac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp8_uclvjz" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.228 226239 DEBUG nova.storage.rbd_utils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 7f77789f-b530-453e-8213-1c345fa78fac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.231 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7f77789f-b530-453e-8213-1c345fa78fac/disk.config 7f77789f-b530-453e-8213-1c345fa78fac_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.273 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:33.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.440 226239 DEBUG oslo_concurrency.processutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7f77789f-b530-453e-8213-1c345fa78fac/disk.config 7f77789f-b530-453e-8213-1c345fa78fac_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.209s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.441 226239 INFO nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Deleting local config drive /var/lib/nova/instances/7f77789f-b530-453e-8213-1c345fa78fac/disk.config because it was imported into RBD.#033[00m
Jan 31 03:08:33 np0005603623 kernel: tapb8f1eb95-13: entered promiscuous mode
Jan 31 03:08:33 np0005603623 NetworkManager[48970]: <info>  [1769846913.4816] manager: (tapb8f1eb95-13): new Tun device (/org/freedesktop/NetworkManager/Devices/127)
Jan 31 03:08:33 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:33Z|00268|binding|INFO|Claiming lport b8f1eb95-1365-46ea-9b13-e055b5689380 for this chassis.
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.481 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:33 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:33Z|00269|binding|INFO|b8f1eb95-1365-46ea-9b13-e055b5689380: Claiming fa:16:3e:36:f9:b9 10.100.0.4
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.488 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:33 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:33Z|00270|binding|INFO|Setting lport b8f1eb95-1365-46ea-9b13-e055b5689380 ovn-installed in OVS
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.492 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.498 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:f9:b9 10.100.0.4'], port_security=['fa:16:3e:36:f9:b9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '7f77789f-b530-453e-8213-1c345fa78fac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3469c253459e40e39dcf5bcb6a32008f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e42c06e8-2644-4a21-adfb-06ef74de77bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298bbe2a-1faa-4c77-b3c3-4633e58f5921, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=b8f1eb95-1365-46ea-9b13-e055b5689380) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:08:33 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:33Z|00271|binding|INFO|Setting lport b8f1eb95-1365-46ea-9b13-e055b5689380 up in Southbound
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.500 143258 INFO neutron.agent.ovn.metadata.agent [-] Port b8f1eb95-1365-46ea-9b13-e055b5689380 in datapath c1c6810e-ec8f-43f3-a3c6-22606d9416b6 bound to our chassis#033[00m
Jan 31 03:08:33 np0005603623 systemd-udevd[257417]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.501 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1c6810e-ec8f-43f3-a3c6-22606d9416b6#033[00m
Jan 31 03:08:33 np0005603623 systemd-machined[194379]: New machine qemu-32-instance-0000004b.
Jan 31 03:08:33 np0005603623 NetworkManager[48970]: <info>  [1769846913.5100] device (tapb8f1eb95-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:08:33 np0005603623 NetworkManager[48970]: <info>  [1769846913.5107] device (tapb8f1eb95-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.511 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8603b721-fd71-4277-88de-7d5fa8fdb80b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.512 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc1c6810e-e1 in ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.513 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc1c6810e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.513 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d2153707-64be-4b28-a09e-f5176ab01f98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.514 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[21763a09-0fc6-4dd9-97fb-361b86a038cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 systemd[1]: Started Virtual Machine qemu-32-instance-0000004b.
Jan 31 03:08:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:08:33 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4246643325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.523 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[f60a1f7a-8150-4fa6-aa32-036eabf2e560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.530 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef2a5e2-f393-4e27-a84b-9bd48eb6a07b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.546 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3856661f-b8d0-4279-809f-79aeea7faffd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.551 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe94816-5616-4736-ac19-ac5e2d861445]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 NetworkManager[48970]: <info>  [1769846913.5526] manager: (tapc1c6810e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/128)
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.578 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[cf2536c1-20f7-421f-a03f-c8076e13aff1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.581 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[83ddd4d4-b9f1-4c1c-ab82-eab875e93c67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 NetworkManager[48970]: <info>  [1769846913.5959] device (tapc1c6810e-e0): carrier: link connected
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.599 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[59693e55-114b-4ef1-922f-0d720db8fc08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.608 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9bcc5532-fbee-4550-b377-95ab52c7c561]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1c6810e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:97:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 605402, 'reachable_time': 32900, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257450, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.619 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[280eafc8-1269-49a5-86e6-9072aa8c9a91]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:9781'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 605402, 'tstamp': 605402}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257451, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.631 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c4f6d4bc-cb32-469c-8a1a-57f69194d9a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1c6810e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:97:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 605402, 'reachable_time': 32900, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 257452, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.649 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2d2ffc87-f6be-4a3d-a3e2-a6fe3ff27a8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.692 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[de62091d-aee2-495c-befb-7f4631e84945]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.693 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1c6810e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.693 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.694 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1c6810e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.695 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:33 np0005603623 NetworkManager[48970]: <info>  [1769846913.6968] manager: (tapc1c6810e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Jan 31 03:08:33 np0005603623 kernel: tapc1c6810e-e0: entered promiscuous mode
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.698 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.699 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1c6810e-e0, col_values=(('external_ids', {'iface-id': '937542c1-ab1e-4312-ab3a-ee4483fcdf7b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.700 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:33 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:33Z|00272|binding|INFO|Releasing lport 937542c1-ab1e-4312-ab3a-ee4483fcdf7b from this chassis (sb_readonly=0)
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.700 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.701 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.702 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6892a19f-93eb-47ed-8119-bbb54978406e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.703 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-c1c6810e-ec8f-43f3-a3c6-22606d9416b6
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID c1c6810e-ec8f-43f3-a3c6-22606d9416b6
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:08:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:33.703 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'env', 'PROCESS_TAG=haproxy-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.705 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.711 226239 DEBUG nova.network.neutron [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Updated VIF entry in instance network info cache for port b8f1eb95-1365-46ea-9b13-e055b5689380. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.711 226239 DEBUG nova.network.neutron [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Updating instance_info_cache with network_info: [{"id": "b8f1eb95-1365-46ea-9b13-e055b5689380", "address": "fa:16:3e:36:f9:b9", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8f1eb95-13", "ovs_interfaceid": "b8f1eb95-1365-46ea-9b13-e055b5689380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.747 226239 DEBUG oslo_concurrency.lockutils [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-7f77789f-b530-453e-8213-1c345fa78fac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:33.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.888 226239 DEBUG nova.compute.manager [req-b7a68bec-3e7a-42ff-9b8f-3b3a5729f2bf req-7505567a-ecdd-4cd7-b103-56d105017f97 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Received event network-vif-plugged-b8f1eb95-1365-46ea-9b13-e055b5689380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.888 226239 DEBUG oslo_concurrency.lockutils [req-b7a68bec-3e7a-42ff-9b8f-3b3a5729f2bf req-7505567a-ecdd-4cd7-b103-56d105017f97 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7f77789f-b530-453e-8213-1c345fa78fac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.889 226239 DEBUG oslo_concurrency.lockutils [req-b7a68bec-3e7a-42ff-9b8f-3b3a5729f2bf req-7505567a-ecdd-4cd7-b103-56d105017f97 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.889 226239 DEBUG oslo_concurrency.lockutils [req-b7a68bec-3e7a-42ff-9b8f-3b3a5729f2bf req-7505567a-ecdd-4cd7-b103-56d105017f97 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.889 226239 DEBUG nova.compute.manager [req-b7a68bec-3e7a-42ff-9b8f-3b3a5729f2bf req-7505567a-ecdd-4cd7-b103-56d105017f97 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Processing event network-vif-plugged-b8f1eb95-1365-46ea-9b13-e055b5689380 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.891 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846913.8913894, 7f77789f-b530-453e-8213-1c345fa78fac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.892 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] VM Started (Lifecycle Event)#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.895 226239 DEBUG nova.compute.manager [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.899 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.902 226239 INFO nova.virt.libvirt.driver [-] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Instance spawned successfully.#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.902 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.936 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.942 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.945 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.945 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.946 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.946 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.946 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.947 226239 DEBUG nova.virt.libvirt.driver [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.986 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.987 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846913.891534, 7f77789f-b530-453e-8213-1c345fa78fac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:08:33 np0005603623 nova_compute[226235]: 2026-01-31 08:08:33.987 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:08:34 np0005603623 podman[257526]: 2026-01-31 08:08:34.040622763 +0000 UTC m=+0.080472769 container create ec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:08:34 np0005603623 nova_compute[226235]: 2026-01-31 08:08:34.043 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:34 np0005603623 nova_compute[226235]: 2026-01-31 08:08:34.056 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846913.8984046, 7f77789f-b530-453e-8213-1c345fa78fac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:08:34 np0005603623 nova_compute[226235]: 2026-01-31 08:08:34.056 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:08:34 np0005603623 nova_compute[226235]: 2026-01-31 08:08:34.061 226239 INFO nova.compute.manager [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Took 9.79 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:08:34 np0005603623 nova_compute[226235]: 2026-01-31 08:08:34.061 226239 DEBUG nova.compute.manager [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:34 np0005603623 podman[257526]: 2026-01-31 08:08:33.980935891 +0000 UTC m=+0.020785917 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:08:34 np0005603623 nova_compute[226235]: 2026-01-31 08:08:34.085 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:34 np0005603623 nova_compute[226235]: 2026-01-31 08:08:34.088 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:08:34 np0005603623 systemd[1]: Started libpod-conmon-ec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764.scope.
Jan 31 03:08:34 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:08:34 np0005603623 nova_compute[226235]: 2026-01-31 08:08:34.111 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:08:34 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68faac808dfe09649033c257518f4e95ec5663db953c8c874133adb9202f5f81/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:08:34 np0005603623 nova_compute[226235]: 2026-01-31 08:08:34.128 226239 INFO nova.compute.manager [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Took 10.98 seconds to build instance.#033[00m
Jan 31 03:08:34 np0005603623 nova_compute[226235]: 2026-01-31 08:08:34.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:34 np0005603623 nova_compute[226235]: 2026-01-31 08:08:34.155 226239 DEBUG oslo_concurrency.lockutils [None req-3c653231-500c-47f7-93db-280ca3ca1411 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:34 np0005603623 podman[257526]: 2026-01-31 08:08:34.168852185 +0000 UTC m=+0.208702191 container init ec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 31 03:08:34 np0005603623 podman[257526]: 2026-01-31 08:08:34.17788099 +0000 UTC m=+0.217731016 container start ec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:08:34 np0005603623 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[257541]: [NOTICE]   (257545) : New worker (257547) forked
Jan 31 03:08:34 np0005603623 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[257541]: [NOTICE]   (257545) : Loading success.
Jan 31 03:08:35 np0005603623 nova_compute[226235]: 2026-01-31 08:08:35.101 226239 DEBUG oslo_concurrency.lockutils [None req-07ca1c35-e269-4613-8660-e7b7f1ce07dd 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "7f77789f-b530-453e-8213-1c345fa78fac" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:35 np0005603623 nova_compute[226235]: 2026-01-31 08:08:35.102 226239 DEBUG oslo_concurrency.lockutils [None req-07ca1c35-e269-4613-8660-e7b7f1ce07dd 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:35 np0005603623 nova_compute[226235]: 2026-01-31 08:08:35.102 226239 DEBUG nova.compute.manager [None req-07ca1c35-e269-4613-8660-e7b7f1ce07dd 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:35 np0005603623 nova_compute[226235]: 2026-01-31 08:08:35.106 226239 DEBUG nova.compute.manager [None req-07ca1c35-e269-4613-8660-e7b7f1ce07dd 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 31 03:08:35 np0005603623 nova_compute[226235]: 2026-01-31 08:08:35.107 226239 DEBUG nova.objects.instance [None req-07ca1c35-e269-4613-8660-e7b7f1ce07dd 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'flavor' on Instance uuid 7f77789f-b530-453e-8213-1c345fa78fac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:35 np0005603623 nova_compute[226235]: 2026-01-31 08:08:35.140 226239 DEBUG nova.virt.libvirt.driver [None req-07ca1c35-e269-4613-8660-e7b7f1ce07dd 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:08:35 np0005603623 nova_compute[226235]: 2026-01-31 08:08:35.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:35.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:35.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:36 np0005603623 nova_compute[226235]: 2026-01-31 08:08:36.172 226239 DEBUG nova.compute.manager [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Received event network-vif-plugged-b8f1eb95-1365-46ea-9b13-e055b5689380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:36 np0005603623 nova_compute[226235]: 2026-01-31 08:08:36.173 226239 DEBUG oslo_concurrency.lockutils [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7f77789f-b530-453e-8213-1c345fa78fac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:36 np0005603623 nova_compute[226235]: 2026-01-31 08:08:36.173 226239 DEBUG oslo_concurrency.lockutils [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:36 np0005603623 nova_compute[226235]: 2026-01-31 08:08:36.173 226239 DEBUG oslo_concurrency.lockutils [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:36 np0005603623 nova_compute[226235]: 2026-01-31 08:08:36.173 226239 DEBUG nova.compute.manager [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] No waiting events found dispatching network-vif-plugged-b8f1eb95-1365-46ea-9b13-e055b5689380 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:36 np0005603623 nova_compute[226235]: 2026-01-31 08:08:36.173 226239 WARNING nova.compute.manager [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Received unexpected event network-vif-plugged-b8f1eb95-1365-46ea-9b13-e055b5689380 for instance with vm_state active and task_state powering-off.#033[00m
Jan 31 03:08:36 np0005603623 nova_compute[226235]: 2026-01-31 08:08:36.601 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:37.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:37.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:38 np0005603623 nova_compute[226235]: 2026-01-31 08:08:38.323 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:39.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:39.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:40Z|00273|binding|INFO|Releasing lport 937542c1-ab1e-4312-ab3a-ee4483fcdf7b from this chassis (sb_readonly=0)
Jan 31 03:08:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:40Z|00274|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:08:40 np0005603623 nova_compute[226235]: 2026-01-31 08:08:40.698 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:41.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:41 np0005603623 nova_compute[226235]: 2026-01-31 08:08:41.604 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:41.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:41.770 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:08:41 np0005603623 nova_compute[226235]: 2026-01-31 08:08:41.770 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:41.781 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:08:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:41.783 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:43.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:43 np0005603623 nova_compute[226235]: 2026-01-31 08:08:43.325 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:43.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:45 np0005603623 nova_compute[226235]: 2026-01-31 08:08:45.184 226239 DEBUG nova.virt.libvirt.driver [None req-07ca1c35-e269-4613-8660-e7b7f1ce07dd 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 03:08:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:45.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:45.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:46 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:46Z|00275|binding|INFO|Releasing lport 937542c1-ab1e-4312-ab3a-ee4483fcdf7b from this chassis (sb_readonly=0)
Jan 31 03:08:46 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:46Z|00276|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:08:46 np0005603623 nova_compute[226235]: 2026-01-31 08:08:46.456 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:46 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:46Z|00277|binding|INFO|Releasing lport 937542c1-ab1e-4312-ab3a-ee4483fcdf7b from this chassis (sb_readonly=0)
Jan 31 03:08:46 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:46Z|00278|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:08:46 np0005603623 nova_compute[226235]: 2026-01-31 08:08:46.488 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:46 np0005603623 nova_compute[226235]: 2026-01-31 08:08:46.606 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:47.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:47.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.328 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.486 226239 DEBUG oslo_concurrency.lockutils [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "d4484d63-c590-4676-b3ae-b8e33bd348f1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.487 226239 DEBUG oslo_concurrency.lockutils [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.487 226239 DEBUG oslo_concurrency.lockutils [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.487 226239 DEBUG oslo_concurrency.lockutils [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.488 226239 DEBUG oslo_concurrency.lockutils [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.490 226239 INFO nova.compute.manager [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Terminating instance#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.492 226239 DEBUG nova.compute.manager [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:08:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:48 np0005603623 kernel: tap00e779f8-cc (unregistering): left promiscuous mode
Jan 31 03:08:48 np0005603623 NetworkManager[48970]: <info>  [1769846928.5652] device (tap00e779f8-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.572 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:48Z|00279|binding|INFO|Releasing lport 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c from this chassis (sb_readonly=0)
Jan 31 03:08:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:48Z|00280|binding|INFO|Setting lport 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c down in Southbound
Jan 31 03:08:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:48Z|00281|binding|INFO|Removing iface tap00e779f8-cc ovn-installed in OVS
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.574 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.578 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000046.scope: Deactivated successfully.
Jan 31 03:08:48 np0005603623 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000046.scope: Consumed 15.020s CPU time.
Jan 31 03:08:48 np0005603623 systemd-machined[194379]: Machine qemu-31-instance-00000046 terminated.
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.672 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:1a:05 10.100.0.3'], port_security=['fa:16:3e:63:1a:05 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd4484d63-c590-4676-b3ae-b8e33bd348f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c826f71-7560-44f4-8034-5ac735f4e81f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=00e779f8-ccb2-4b71-bce9-7a3c9df3b85c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.675 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 00e779f8-ccb2-4b71-bce9-7a3c9df3b85c in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f unbound from our chassis#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.677 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 455fab34-b015-4d97-a96d-f7ebd7f7555f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.678 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2fa8b6f7-1414-4aaf-a946-72e3ca1d895c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.679 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f namespace which is not needed anymore#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.710 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.714 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 kernel: tapb8f1eb95-13 (unregistering): left promiscuous mode
Jan 31 03:08:48 np0005603623 NetworkManager[48970]: <info>  [1769846928.7180] device (tapb8f1eb95-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:08:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:48Z|00282|binding|INFO|Releasing lport b8f1eb95-1365-46ea-9b13-e055b5689380 from this chassis (sb_readonly=0)
Jan 31 03:08:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:48Z|00283|binding|INFO|Setting lport b8f1eb95-1365-46ea-9b13-e055b5689380 down in Southbound
Jan 31 03:08:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:08:48Z|00284|binding|INFO|Removing iface tapb8f1eb95-13 ovn-installed in OVS
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.725 226239 INFO nova.virt.libvirt.driver [-] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Instance destroyed successfully.#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.725 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.726 226239 DEBUG nova.objects.instance [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'resources' on Instance uuid d4484d63-c590-4676-b3ae-b8e33bd348f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.727 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.736 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Jan 31 03:08:48 np0005603623 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004b.scope: Consumed 12.139s CPU time.
Jan 31 03:08:48 np0005603623 systemd-machined[194379]: Machine qemu-32-instance-0000004b terminated.
Jan 31 03:08:48 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[256606]: [NOTICE]   (256610) : haproxy version is 2.8.14-c23fe91
Jan 31 03:08:48 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[256606]: [NOTICE]   (256610) : path to executable is /usr/sbin/haproxy
Jan 31 03:08:48 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[256606]: [WARNING]  (256610) : Exiting Master process...
Jan 31 03:08:48 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[256606]: [WARNING]  (256610) : Exiting Master process...
Jan 31 03:08:48 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[256606]: [ALERT]    (256610) : Current worker (256612) exited with code 143 (Terminated)
Jan 31 03:08:48 np0005603623 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[256606]: [WARNING]  (256610) : All workers exited. Exiting... (0)
Jan 31 03:08:48 np0005603623 systemd[1]: libpod-8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9.scope: Deactivated successfully.
Jan 31 03:08:48 np0005603623 podman[257655]: 2026-01-31 08:08:48.805263967 +0000 UTC m=+0.039346452 container died 8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.806 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:f9:b9 10.100.0.4'], port_security=['fa:16:3e:36:f9:b9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '7f77789f-b530-453e-8213-1c345fa78fac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3469c253459e40e39dcf5bcb6a32008f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e42c06e8-2644-4a21-adfb-06ef74de77bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298bbe2a-1faa-4c77-b3c3-4633e58f5921, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=b8f1eb95-1365-46ea-9b13-e055b5689380) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.816 226239 DEBUG nova.virt.libvirt.vif [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1091698191',display_name='tempest-tempest.common.compute-instance-1091698191',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1091698191',id=70,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:07:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-uhssxba1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:07:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=d4484d63-c590-4676-b3ae-b8e33bd348f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.817 226239 DEBUG nova.network.os_vif_util [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "address": "fa:16:3e:63:1a:05", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e779f8-cc", "ovs_interfaceid": "00e779f8-ccb2-4b71-bce9-7a3c9df3b85c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.818 226239 DEBUG nova.network.os_vif_util [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:63:1a:05,bridge_name='br-int',has_traffic_filtering=True,id=00e779f8-ccb2-4b71-bce9-7a3c9df3b85c,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e779f8-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.818 226239 DEBUG os_vif [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:63:1a:05,bridge_name='br-int',has_traffic_filtering=True,id=00e779f8-ccb2-4b71-bce9-7a3c9df3b85c,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e779f8-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.820 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.820 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00e779f8-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.822 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.824 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.825 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.828 226239 INFO os_vif [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:63:1a:05,bridge_name='br-int',has_traffic_filtering=True,id=00e779f8-ccb2-4b71-bce9-7a3c9df3b85c,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e779f8-cc')#033[00m
Jan 31 03:08:48 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9-userdata-shm.mount: Deactivated successfully.
Jan 31 03:08:48 np0005603623 systemd[1]: var-lib-containers-storage-overlay-24398064eb9fa09598985b3a629b9b9d0c1400909774c71328a7adf728be2849-merged.mount: Deactivated successfully.
Jan 31 03:08:48 np0005603623 podman[257655]: 2026-01-31 08:08:48.853133166 +0000 UTC m=+0.087215631 container cleanup 8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:08:48 np0005603623 systemd[1]: libpod-conmon-8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9.scope: Deactivated successfully.
Jan 31 03:08:48 np0005603623 podman[257701]: 2026-01-31 08:08:48.910673669 +0000 UTC m=+0.044833004 container remove 8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.914 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[efd4acfa-892f-4dbc-a31b-da6eb3f7108a]: (4, ('Sat Jan 31 08:08:48 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f (8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9)\n8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9\nSat Jan 31 08:08:48 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f (8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9)\n8babfc046a7f03ed6f3a78c5a092338cdfe24cb4970b9742aaaf52b6350204e9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.916 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b695ab9d-1db8-443b-b36d-addfa6e7f69f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.917 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:48 np0005603623 kernel: tap455fab34-b0: left promiscuous mode
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.919 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 nova_compute[226235]: 2026-01-31 08:08:48.926 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.929 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4428e7b4-d3b6-49cd-8ba1-43b09951a800]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.943 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[55050516-a376-4ddb-b619-b208f416fc37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.946 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5651158f-d673-4ed1-bad8-c23fa1e4dd83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.958 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b64ae1e9-b213-4a57-adce-f38780d1c361]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600089, 'reachable_time': 29364, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257731, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:48 np0005603623 systemd[1]: run-netns-ovnmeta\x2d455fab34\x2db015\x2d4d97\x2da96d\x2df7ebd7f7555f.mount: Deactivated successfully.
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.964 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.965 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[fcec4e43-1ce5-41a9-8411-fd921148ad84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.966 143258 INFO neutron.agent.ovn.metadata.agent [-] Port b8f1eb95-1365-46ea-9b13-e055b5689380 in datapath c1c6810e-ec8f-43f3-a3c6-22606d9416b6 unbound from our chassis#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.967 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c1c6810e-ec8f-43f3-a3c6-22606d9416b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.968 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc21747-5a1b-4390-bbd4-3d6de8e9f0d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:48.969 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 namespace which is not needed anymore#033[00m
Jan 31 03:08:49 np0005603623 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[257541]: [NOTICE]   (257545) : haproxy version is 2.8.14-c23fe91
Jan 31 03:08:49 np0005603623 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[257541]: [NOTICE]   (257545) : path to executable is /usr/sbin/haproxy
Jan 31 03:08:49 np0005603623 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[257541]: [WARNING]  (257545) : Exiting Master process...
Jan 31 03:08:49 np0005603623 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[257541]: [ALERT]    (257545) : Current worker (257547) exited with code 143 (Terminated)
Jan 31 03:08:49 np0005603623 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[257541]: [WARNING]  (257545) : All workers exited. Exiting... (0)
Jan 31 03:08:49 np0005603623 systemd[1]: libpod-ec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764.scope: Deactivated successfully.
Jan 31 03:08:49 np0005603623 podman[257751]: 2026-01-31 08:08:49.0680229 +0000 UTC m=+0.038284438 container died ec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:08:49 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764-userdata-shm.mount: Deactivated successfully.
Jan 31 03:08:49 np0005603623 systemd[1]: var-lib-containers-storage-overlay-68faac808dfe09649033c257518f4e95ec5663db953c8c874133adb9202f5f81-merged.mount: Deactivated successfully.
Jan 31 03:08:49 np0005603623 podman[257751]: 2026-01-31 08:08:49.10040141 +0000 UTC m=+0.070662948 container cleanup ec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:08:49 np0005603623 systemd[1]: libpod-conmon-ec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764.scope: Deactivated successfully.
Jan 31 03:08:49 np0005603623 podman[257781]: 2026-01-31 08:08:49.148377743 +0000 UTC m=+0.034351884 container remove ec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:08:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:49.151 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e73fb8-a725-453f-9bf8-0e67bdcefb3f]: (4, ('Sat Jan 31 08:08:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 (ec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764)\nec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764\nSat Jan 31 08:08:49 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 (ec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764)\nec96e6cdcc671083b7f5ec50ca2f608cdee60ce47374cb4a6e1433f8fabb8764\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:49.153 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb6cd6a-eb9b-48de-8453-1587170092d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:49.154 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1c6810e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.155 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:49 np0005603623 kernel: tapc1c6810e-e0: left promiscuous mode
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.163 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:49.167 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a8252021-5457-4670-a415-30391e5846cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:49.181 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f18c2cb1-413b-4265-ab83-865f7a067f87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:49.182 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[10390712-1391-40d2-bc69-048abf76f977]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:49.194 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3493122c-5506-425a-a1fe-9f5742c6519c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 605397, 'reachable_time': 32169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257800, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:49.196 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:08:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:08:49.196 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b147e5-19fb-4dbb-8558-56ceb713cc9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.203 226239 INFO nova.virt.libvirt.driver [None req-07ca1c35-e269-4613-8660-e7b7f1ce07dd 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Instance shutdown successfully after 14 seconds.#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.209 226239 INFO nova.virt.libvirt.driver [-] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Instance destroyed successfully.#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.210 226239 DEBUG nova.objects.instance [None req-07ca1c35-e269-4613-8660-e7b7f1ce07dd 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'numa_topology' on Instance uuid 7f77789f-b530-453e-8213-1c345fa78fac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.234 226239 INFO nova.virt.libvirt.driver [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Deleting instance files /var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1_del#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.235 226239 INFO nova.virt.libvirt.driver [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Deletion of /var/lib/nova/instances/d4484d63-c590-4676-b3ae-b8e33bd348f1_del complete#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.291 226239 DEBUG nova.compute.manager [None req-07ca1c35-e269-4613-8660-e7b7f1ce07dd 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:49.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.558 226239 INFO nova.compute.manager [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Took 1.07 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.558 226239 DEBUG oslo.service.loopingcall [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.559 226239 DEBUG nova.compute.manager [-] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.559 226239 DEBUG nova.network.neutron [-] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.571 226239 DEBUG oslo_concurrency.lockutils [None req-07ca1c35-e269-4613-8660-e7b7f1ce07dd 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 14.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.583 226239 DEBUG nova.compute.manager [req-c4d6daef-badd-4ee1-be4e-da38eded7f33 req-5f8356e6-715f-47ff-b5e4-2b7ba1ef04bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-vif-unplugged-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.583 226239 DEBUG oslo_concurrency.lockutils [req-c4d6daef-badd-4ee1-be4e-da38eded7f33 req-5f8356e6-715f-47ff-b5e4-2b7ba1ef04bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.584 226239 DEBUG oslo_concurrency.lockutils [req-c4d6daef-badd-4ee1-be4e-da38eded7f33 req-5f8356e6-715f-47ff-b5e4-2b7ba1ef04bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.584 226239 DEBUG oslo_concurrency.lockutils [req-c4d6daef-badd-4ee1-be4e-da38eded7f33 req-5f8356e6-715f-47ff-b5e4-2b7ba1ef04bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.584 226239 DEBUG nova.compute.manager [req-c4d6daef-badd-4ee1-be4e-da38eded7f33 req-5f8356e6-715f-47ff-b5e4-2b7ba1ef04bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] No waiting events found dispatching network-vif-unplugged-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.585 226239 DEBUG nova.compute.manager [req-c4d6daef-badd-4ee1-be4e-da38eded7f33 req-5f8356e6-715f-47ff-b5e4-2b7ba1ef04bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-vif-unplugged-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.585 226239 DEBUG nova.compute.manager [req-474cf04d-fdc8-48e6-aa50-5e6f684b2b2e req-db61af3d-ddf7-4b31-bcd6-8819e78cf149 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Received event network-vif-unplugged-b8f1eb95-1365-46ea-9b13-e055b5689380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.586 226239 DEBUG oslo_concurrency.lockutils [req-474cf04d-fdc8-48e6-aa50-5e6f684b2b2e req-db61af3d-ddf7-4b31-bcd6-8819e78cf149 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7f77789f-b530-453e-8213-1c345fa78fac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.586 226239 DEBUG oslo_concurrency.lockutils [req-474cf04d-fdc8-48e6-aa50-5e6f684b2b2e req-db61af3d-ddf7-4b31-bcd6-8819e78cf149 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.586 226239 DEBUG oslo_concurrency.lockutils [req-474cf04d-fdc8-48e6-aa50-5e6f684b2b2e req-db61af3d-ddf7-4b31-bcd6-8819e78cf149 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.587 226239 DEBUG nova.compute.manager [req-474cf04d-fdc8-48e6-aa50-5e6f684b2b2e req-db61af3d-ddf7-4b31-bcd6-8819e78cf149 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] No waiting events found dispatching network-vif-unplugged-b8f1eb95-1365-46ea-9b13-e055b5689380 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:49 np0005603623 nova_compute[226235]: 2026-01-31 08:08:49.587 226239 WARNING nova.compute.manager [req-474cf04d-fdc8-48e6-aa50-5e6f684b2b2e req-db61af3d-ddf7-4b31-bcd6-8819e78cf149 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Received unexpected event network-vif-unplugged-b8f1eb95-1365-46ea-9b13-e055b5689380 for instance with vm_state active and task_state powering-off.#033[00m
Jan 31 03:08:49 np0005603623 podman[257801]: 2026-01-31 08:08:49.701207891 +0000 UTC m=+0.049593305 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 31 03:08:49 np0005603623 podman[257802]: 2026-01-31 08:08:49.724083232 +0000 UTC m=+0.070692970 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:08:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:49.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:49 np0005603623 systemd[1]: run-netns-ovnmeta\x2dc1c6810e\x2dec8f\x2d43f3\x2da3c6\x2d22606d9416b6.mount: Deactivated successfully.
Jan 31 03:08:50 np0005603623 nova_compute[226235]: 2026-01-31 08:08:50.443 226239 DEBUG nova.network.neutron [-] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:50 np0005603623 nova_compute[226235]: 2026-01-31 08:08:50.482 226239 INFO nova.compute.manager [-] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Took 0.92 seconds to deallocate network for instance.#033[00m
Jan 31 03:08:50 np0005603623 nova_compute[226235]: 2026-01-31 08:08:50.540 226239 DEBUG oslo_concurrency.lockutils [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:50 np0005603623 nova_compute[226235]: 2026-01-31 08:08:50.540 226239 DEBUG oslo_concurrency.lockutils [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:50 np0005603623 nova_compute[226235]: 2026-01-31 08:08:50.552 226239 DEBUG nova.compute.manager [req-1ce2c7bb-b6bc-4beb-9b2e-233f9b17812a req-1108a7e1-aa73-4f82-8a5e-4936f3eb16cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-vif-deleted-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:50 np0005603623 nova_compute[226235]: 2026-01-31 08:08:50.639 226239 DEBUG oslo_concurrency.processutils [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:08:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3831632223' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.049 226239 DEBUG oslo_concurrency.processutils [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.055 226239 DEBUG nova.compute.provider_tree [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.083 226239 DEBUG nova.scheduler.client.report [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.124 226239 DEBUG oslo_concurrency.lockutils [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.153 226239 INFO nova.scheduler.client.report [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Deleted allocations for instance d4484d63-c590-4676-b3ae-b8e33bd348f1#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.246 226239 DEBUG oslo_concurrency.lockutils [None req-dd418b48-00f3-42aa-9682-a0a355f20603 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:08:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:51.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:08:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:51.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.807 226239 DEBUG nova.compute.manager [req-e0bf4b94-3aa9-40e4-8fbc-69419d1050bb req-2652b0d2-521a-4ad0-85bc-7a87ae06cb1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received event network-vif-plugged-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.808 226239 DEBUG oslo_concurrency.lockutils [req-e0bf4b94-3aa9-40e4-8fbc-69419d1050bb req-2652b0d2-521a-4ad0-85bc-7a87ae06cb1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.808 226239 DEBUG oslo_concurrency.lockutils [req-e0bf4b94-3aa9-40e4-8fbc-69419d1050bb req-2652b0d2-521a-4ad0-85bc-7a87ae06cb1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.808 226239 DEBUG oslo_concurrency.lockutils [req-e0bf4b94-3aa9-40e4-8fbc-69419d1050bb req-2652b0d2-521a-4ad0-85bc-7a87ae06cb1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d4484d63-c590-4676-b3ae-b8e33bd348f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.808 226239 DEBUG nova.compute.manager [req-e0bf4b94-3aa9-40e4-8fbc-69419d1050bb req-2652b0d2-521a-4ad0-85bc-7a87ae06cb1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] No waiting events found dispatching network-vif-plugged-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.808 226239 WARNING nova.compute.manager [req-e0bf4b94-3aa9-40e4-8fbc-69419d1050bb req-2652b0d2-521a-4ad0-85bc-7a87ae06cb1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Received unexpected event network-vif-plugged-00e779f8-ccb2-4b71-bce9-7a3c9df3b85c for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.828 226239 DEBUG nova.compute.manager [req-b02bbd1a-adc3-48e3-82fa-5350adcf2429 req-0f0a7e00-5fb5-4529-8cb3-4520950f21de fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Received event network-vif-plugged-b8f1eb95-1365-46ea-9b13-e055b5689380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.829 226239 DEBUG oslo_concurrency.lockutils [req-b02bbd1a-adc3-48e3-82fa-5350adcf2429 req-0f0a7e00-5fb5-4529-8cb3-4520950f21de fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7f77789f-b530-453e-8213-1c345fa78fac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.829 226239 DEBUG oslo_concurrency.lockutils [req-b02bbd1a-adc3-48e3-82fa-5350adcf2429 req-0f0a7e00-5fb5-4529-8cb3-4520950f21de fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.829 226239 DEBUG oslo_concurrency.lockutils [req-b02bbd1a-adc3-48e3-82fa-5350adcf2429 req-0f0a7e00-5fb5-4529-8cb3-4520950f21de fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.829 226239 DEBUG nova.compute.manager [req-b02bbd1a-adc3-48e3-82fa-5350adcf2429 req-0f0a7e00-5fb5-4529-8cb3-4520950f21de fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] No waiting events found dispatching network-vif-plugged-b8f1eb95-1365-46ea-9b13-e055b5689380 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:51 np0005603623 nova_compute[226235]: 2026-01-31 08:08:51.830 226239 WARNING nova.compute.manager [req-b02bbd1a-adc3-48e3-82fa-5350adcf2429 req-0f0a7e00-5fb5-4529-8cb3-4520950f21de fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Received unexpected event network-vif-plugged-b8f1eb95-1365-46ea-9b13-e055b5689380 for instance with vm_state stopped and task_state None.#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.249 226239 DEBUG oslo_concurrency.lockutils [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "7f77789f-b530-453e-8213-1c345fa78fac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.250 226239 DEBUG oslo_concurrency.lockutils [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.250 226239 DEBUG oslo_concurrency.lockutils [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "7f77789f-b530-453e-8213-1c345fa78fac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.250 226239 DEBUG oslo_concurrency.lockutils [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.251 226239 DEBUG oslo_concurrency.lockutils [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.252 226239 INFO nova.compute.manager [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Terminating instance#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.252 226239 DEBUG nova.compute.manager [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.258 226239 INFO nova.virt.libvirt.driver [-] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Instance destroyed successfully.#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.258 226239 DEBUG nova.objects.instance [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'resources' on Instance uuid 7f77789f-b530-453e-8213-1c345fa78fac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.281 226239 DEBUG nova.virt.libvirt.vif [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:08:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-328804424',display_name='tempest-DeleteServersTestJSON-server-328804424',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-328804424',id=75,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:08:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-b3o36avc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:08:49Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=7f77789f-b530-453e-8213-1c345fa78fac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b8f1eb95-1365-46ea-9b13-e055b5689380", "address": "fa:16:3e:36:f9:b9", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8f1eb95-13", "ovs_interfaceid": "b8f1eb95-1365-46ea-9b13-e055b5689380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.282 226239 DEBUG nova.network.os_vif_util [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "b8f1eb95-1365-46ea-9b13-e055b5689380", "address": "fa:16:3e:36:f9:b9", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8f1eb95-13", "ovs_interfaceid": "b8f1eb95-1365-46ea-9b13-e055b5689380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.283 226239 DEBUG nova.network.os_vif_util [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:f9:b9,bridge_name='br-int',has_traffic_filtering=True,id=b8f1eb95-1365-46ea-9b13-e055b5689380,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8f1eb95-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.283 226239 DEBUG os_vif [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:f9:b9,bridge_name='br-int',has_traffic_filtering=True,id=b8f1eb95-1365-46ea-9b13-e055b5689380,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8f1eb95-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.284 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.285 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8f1eb95-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.286 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.287 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:52 np0005603623 nova_compute[226235]: 2026-01-31 08:08:52.289 226239 INFO os_vif [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:f9:b9,bridge_name='br-int',has_traffic_filtering=True,id=b8f1eb95-1365-46ea-9b13-e055b5689380,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8f1eb95-13')#033[00m
Jan 31 03:08:53 np0005603623 nova_compute[226235]: 2026-01-31 08:08:53.035 226239 INFO nova.virt.libvirt.driver [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Deleting instance files /var/lib/nova/instances/7f77789f-b530-453e-8213-1c345fa78fac_del#033[00m
Jan 31 03:08:53 np0005603623 nova_compute[226235]: 2026-01-31 08:08:53.035 226239 INFO nova.virt.libvirt.driver [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Deletion of /var/lib/nova/instances/7f77789f-b530-453e-8213-1c345fa78fac_del complete#033[00m
Jan 31 03:08:53 np0005603623 nova_compute[226235]: 2026-01-31 08:08:53.099 226239 INFO nova.compute.manager [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Took 0.85 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:08:53 np0005603623 nova_compute[226235]: 2026-01-31 08:08:53.100 226239 DEBUG oslo.service.loopingcall [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:08:53 np0005603623 nova_compute[226235]: 2026-01-31 08:08:53.100 226239 DEBUG nova.compute.manager [-] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:08:53 np0005603623 nova_compute[226235]: 2026-01-31 08:08:53.100 226239 DEBUG nova.network.neutron [-] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:08:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:53.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:53 np0005603623 nova_compute[226235]: 2026-01-31 08:08:53.329 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:53.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:54 np0005603623 nova_compute[226235]: 2026-01-31 08:08:54.168 226239 DEBUG nova.network.neutron [-] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:54 np0005603623 nova_compute[226235]: 2026-01-31 08:08:54.195 226239 INFO nova.compute.manager [-] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Took 1.09 seconds to deallocate network for instance.#033[00m
Jan 31 03:08:54 np0005603623 nova_compute[226235]: 2026-01-31 08:08:54.237 226239 DEBUG nova.compute.manager [req-e5a02da9-998e-465f-82e7-b124623baf28 req-69270921-8632-49a7-818b-3b4e0263ec96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Received event network-vif-deleted-b8f1eb95-1365-46ea-9b13-e055b5689380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:54 np0005603623 nova_compute[226235]: 2026-01-31 08:08:54.407 226239 DEBUG oslo_concurrency.lockutils [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:54 np0005603623 nova_compute[226235]: 2026-01-31 08:08:54.407 226239 DEBUG oslo_concurrency.lockutils [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:54 np0005603623 nova_compute[226235]: 2026-01-31 08:08:54.467 226239 DEBUG oslo_concurrency.processutils [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:08:54 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4026873066' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:08:54 np0005603623 nova_compute[226235]: 2026-01-31 08:08:54.874 226239 DEBUG oslo_concurrency.processutils [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:54 np0005603623 nova_compute[226235]: 2026-01-31 08:08:54.879 226239 DEBUG nova.compute.provider_tree [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:08:54 np0005603623 nova_compute[226235]: 2026-01-31 08:08:54.897 226239 DEBUG nova.scheduler.client.report [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:08:54 np0005603623 nova_compute[226235]: 2026-01-31 08:08:54.939 226239 DEBUG oslo_concurrency.lockutils [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.532s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:54 np0005603623 nova_compute[226235]: 2026-01-31 08:08:54.968 226239 INFO nova.scheduler.client.report [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Deleted allocations for instance 7f77789f-b530-453e-8213-1c345fa78fac#033[00m
Jan 31 03:08:55 np0005603623 nova_compute[226235]: 2026-01-31 08:08:55.045 226239 DEBUG oslo_concurrency.lockutils [None req-ef5e72fa-53bf-49b5-8124-c2cb2e4feaaf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "7f77789f-b530-453e-8213-1c345fa78fac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:55.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:55.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:57 np0005603623 nova_compute[226235]: 2026-01-31 08:08:57.072 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:57 np0005603623 nova_compute[226235]: 2026-01-31 08:08:57.287 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:08:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:57.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:08:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:08:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:57.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:08:58 np0005603623 nova_compute[226235]: 2026-01-31 08:08:58.331 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:59.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:08:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:59.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:01.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:01.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:02 np0005603623 nova_compute[226235]: 2026-01-31 08:09:02.833 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:03.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:03 np0005603623 nova_compute[226235]: 2026-01-31 08:09:03.333 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:03 np0005603623 nova_compute[226235]: 2026-01-31 08:09:03.724 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846928.7228763, d4484d63-c590-4676-b3ae-b8e33bd348f1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:09:03 np0005603623 nova_compute[226235]: 2026-01-31 08:09:03.724 226239 INFO nova.compute.manager [-] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:09:03 np0005603623 nova_compute[226235]: 2026-01-31 08:09:03.750 226239 DEBUG nova.compute.manager [None req-dfcc5913-5a02-44b5-94c0-cc7a09fdb0f9 - - - - - -] [instance: d4484d63-c590-4676-b3ae-b8e33bd348f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:09:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:03.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:03 np0005603623 nova_compute[226235]: 2026-01-31 08:09:03.957 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846928.9560788, 7f77789f-b530-453e-8213-1c345fa78fac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:09:03 np0005603623 nova_compute[226235]: 2026-01-31 08:09:03.957 226239 INFO nova.compute.manager [-] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:09:03 np0005603623 nova_compute[226235]: 2026-01-31 08:09:03.983 226239 DEBUG nova.compute.manager [None req-0775f2ca-a183-4785-8bc0-d7b27fec999b - - - - - -] [instance: 7f77789f-b530-453e-8213-1c345fa78fac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:09:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:05.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:05 np0005603623 nova_compute[226235]: 2026-01-31 08:09:05.603 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquiring lock "1c071b37-4498-458c-96b0-3e1a15ca470c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:05 np0005603623 nova_compute[226235]: 2026-01-31 08:09:05.603 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:05 np0005603623 nova_compute[226235]: 2026-01-31 08:09:05.625 226239 DEBUG nova.compute.manager [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:09:05 np0005603623 nova_compute[226235]: 2026-01-31 08:09:05.715 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:05 np0005603623 nova_compute[226235]: 2026-01-31 08:09:05.715 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:05 np0005603623 nova_compute[226235]: 2026-01-31 08:09:05.721 226239 DEBUG nova.virt.hardware [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:09:05 np0005603623 nova_compute[226235]: 2026-01-31 08:09:05.722 226239 INFO nova.compute.claims [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:09:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:05.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:05 np0005603623 nova_compute[226235]: 2026-01-31 08:09:05.974 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:09:06 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/501119239' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.620 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.646s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.629 226239 DEBUG nova.compute.provider_tree [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.655 226239 DEBUG nova.scheduler.client.report [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.712 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.997s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.714 226239 DEBUG nova.compute.manager [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.768 226239 DEBUG nova.compute.manager [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.769 226239 DEBUG nova.network.neutron [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.804 226239 INFO nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.867 226239 DEBUG nova.compute.manager [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.961 226239 DEBUG nova.compute.manager [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.962 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.963 226239 INFO nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Creating image(s)
Jan 31 03:09:06 np0005603623 nova_compute[226235]: 2026-01-31 08:09:06.984 226239 DEBUG nova.storage.rbd_utils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] rbd image 1c071b37-4498-458c-96b0-3e1a15ca470c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:09:07 np0005603623 nova_compute[226235]: 2026-01-31 08:09:07.005 226239 DEBUG nova.storage.rbd_utils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] rbd image 1c071b37-4498-458c-96b0-3e1a15ca470c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:09:07 np0005603623 nova_compute[226235]: 2026-01-31 08:09:07.029 226239 DEBUG nova.storage.rbd_utils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] rbd image 1c071b37-4498-458c-96b0-3e1a15ca470c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:09:07 np0005603623 nova_compute[226235]: 2026-01-31 08:09:07.033 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:09:07 np0005603623 nova_compute[226235]: 2026-01-31 08:09:07.051 226239 DEBUG nova.policy [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9b363a9a11084c998b823d2941f27c97', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dfc138d4fe084f1e8bfe6a94be18cc23', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 03:09:07 np0005603623 nova_compute[226235]: 2026-01-31 08:09:07.082 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:09:07 np0005603623 nova_compute[226235]: 2026-01-31 08:09:07.083 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:09:07 np0005603623 nova_compute[226235]: 2026-01-31 08:09:07.083 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:09:07 np0005603623 nova_compute[226235]: 2026-01-31 08:09:07.084 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:09:07 np0005603623 nova_compute[226235]: 2026-01-31 08:09:07.107 226239 DEBUG nova.storage.rbd_utils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] rbd image 1c071b37-4498-458c-96b0-3e1a15ca470c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:09:07 np0005603623 nova_compute[226235]: 2026-01-31 08:09:07.110 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 1c071b37-4498-458c-96b0-3e1a15ca470c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:09:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:07.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:07.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:07 np0005603623 nova_compute[226235]: 2026-01-31 08:09:07.811 226239 DEBUG nova.network.neutron [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Successfully created port: 0c27f6b7-b93a-418a-8953-51a84b461843 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:09:07 np0005603623 nova_compute[226235]: 2026-01-31 08:09:07.835 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:08 np0005603623 nova_compute[226235]: 2026-01-31 08:09:08.334 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:09.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:09.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:09 np0005603623 nova_compute[226235]: 2026-01-31 08:09:09.958 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 1c071b37-4498-458c-96b0-3e1a15ca470c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.848s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:09:09 np0005603623 nova_compute[226235]: 2026-01-31 08:09:09.995 226239 DEBUG nova.network.neutron [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Successfully updated port: 0c27f6b7-b93a-418a-8953-51a84b461843 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:09:10 np0005603623 nova_compute[226235]: 2026-01-31 08:09:10.033 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquiring lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:09:10 np0005603623 nova_compute[226235]: 2026-01-31 08:09:10.033 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquired lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:09:10 np0005603623 nova_compute[226235]: 2026-01-31 08:09:10.033 226239 DEBUG nova.network.neutron [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:09:10 np0005603623 nova_compute[226235]: 2026-01-31 08:09:10.039 226239 DEBUG nova.storage.rbd_utils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] resizing rbd image 1c071b37-4498-458c-96b0-3e1a15ca470c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:09:10 np0005603623 nova_compute[226235]: 2026-01-31 08:09:10.263 226239 DEBUG nova.compute.manager [req-9ea054d2-9efa-4c5e-9efb-b16f5d3cd7fa req-3fdabb77-a286-41de-b746-1ff961ff6ec2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-changed-0c27f6b7-b93a-418a-8953-51a84b461843 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:09:10 np0005603623 nova_compute[226235]: 2026-01-31 08:09:10.263 226239 DEBUG nova.compute.manager [req-9ea054d2-9efa-4c5e-9efb-b16f5d3cd7fa req-3fdabb77-a286-41de-b746-1ff961ff6ec2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Refreshing instance network info cache due to event network-changed-0c27f6b7-b93a-418a-8953-51a84b461843. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:09:10 np0005603623 nova_compute[226235]: 2026-01-31 08:09:10.264 226239 DEBUG oslo_concurrency.lockutils [req-9ea054d2-9efa-4c5e-9efb-b16f5d3cd7fa req-3fdabb77-a286-41de-b746-1ff961ff6ec2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:09:10 np0005603623 nova_compute[226235]: 2026-01-31 08:09:10.467 226239 DEBUG nova.network.neutron [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.315 226239 DEBUG nova.objects.instance [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lazy-loading 'migration_context' on Instance uuid 1c071b37-4498-458c-96b0-3e1a15ca470c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:09:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:11.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.440 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.440 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Ensure instance console log exists: /var/lib/nova/instances/1c071b37-4498-458c-96b0-3e1a15ca470c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.440 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.441 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.441 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.449 226239 DEBUG nova.network.neutron [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Updating instance_info_cache with network_info: [{"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.694 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Releasing lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.695 226239 DEBUG nova.compute.manager [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Instance network_info: |[{"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.696 226239 DEBUG oslo_concurrency.lockutils [req-9ea054d2-9efa-4c5e-9efb-b16f5d3cd7fa req-3fdabb77-a286-41de-b746-1ff961ff6ec2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.697 226239 DEBUG nova.network.neutron [req-9ea054d2-9efa-4c5e-9efb-b16f5d3cd7fa req-3fdabb77-a286-41de-b746-1ff961ff6ec2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Refreshing network info cache for port 0c27f6b7-b93a-418a-8953-51a84b461843 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.704 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Start _get_guest_xml network_info=[{"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.710 226239 WARNING nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.717 226239 DEBUG nova.virt.libvirt.host [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.718 226239 DEBUG nova.virt.libvirt.host [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.722 226239 DEBUG nova.virt.libvirt.host [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.723 226239 DEBUG nova.virt.libvirt.host [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.725 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.725 226239 DEBUG nova.virt.hardware [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.726 226239 DEBUG nova.virt.hardware [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.726 226239 DEBUG nova.virt.hardware [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.727 226239 DEBUG nova.virt.hardware [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.727 226239 DEBUG nova.virt.hardware [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.727 226239 DEBUG nova.virt.hardware [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.727 226239 DEBUG nova.virt.hardware [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.728 226239 DEBUG nova.virt.hardware [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.728 226239 DEBUG nova.virt.hardware [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.728 226239 DEBUG nova.virt.hardware [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.728 226239 DEBUG nova.virt.hardware [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 03:09:11 np0005603623 nova_compute[226235]: 2026-01-31 08:09:11.732 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:09:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:11.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:09:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2786131697' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.234 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.263 226239 DEBUG nova.storage.rbd_utils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] rbd image 1c071b37-4498-458c-96b0-3e1a15ca470c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.268 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:09:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/562293037' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.703 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.706 226239 DEBUG nova.virt.libvirt.vif [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:09:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1486312359',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1486312359',id=78,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dfc138d4fe084f1e8bfe6a94be18cc23',ramdisk_id='',reservation_id='r-jowc2xfu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1402581101',owner_user_name='tempest-AttachInterfacesV270Test-1402581101-project-member'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:09:06Z,user_data=None,user_id='9b363a9a11084c998b823d2941f27c97',uuid=1c071b37-4498-458c-96b0-3e1a15ca470c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.707 226239 DEBUG nova.network.os_vif_util [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Converting VIF {"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.708 226239 DEBUG nova.network.os_vif_util [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:b2:15,bridge_name='br-int',has_traffic_filtering=True,id=0c27f6b7-b93a-418a-8953-51a84b461843,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c27f6b7-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.709 226239 DEBUG nova.objects.instance [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1c071b37-4498-458c-96b0-3e1a15ca470c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.837 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.851 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  <uuid>1c071b37-4498-458c-96b0-3e1a15ca470c</uuid>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  <name>instance-0000004e</name>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <nova:name>tempest-AttachInterfacesV270Test-server-1486312359</nova:name>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:09:11</nova:creationTime>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <nova:user uuid="9b363a9a11084c998b823d2941f27c97">tempest-AttachInterfacesV270Test-1402581101-project-member</nova:user>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <nova:project uuid="dfc138d4fe084f1e8bfe6a94be18cc23">tempest-AttachInterfacesV270Test-1402581101</nova:project>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <nova:port uuid="0c27f6b7-b93a-418a-8953-51a84b461843">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <entry name="serial">1c071b37-4498-458c-96b0-3e1a15ca470c</entry>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <entry name="uuid">1c071b37-4498-458c-96b0-3e1a15ca470c</entry>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/1c071b37-4498-458c-96b0-3e1a15ca470c_disk">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/1c071b37-4498-458c-96b0-3e1a15ca470c_disk.config">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:27:b2:15"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <target dev="tap0c27f6b7-b9"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/1c071b37-4498-458c-96b0-3e1a15ca470c/console.log" append="off"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:09:12 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:09:12 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:09:12 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:09:12 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.852 226239 DEBUG nova.compute.manager [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Preparing to wait for external event network-vif-plugged-0c27f6b7-b93a-418a-8953-51a84b461843 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.852 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquiring lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.853 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.853 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.853 226239 DEBUG nova.virt.libvirt.vif [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:09:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1486312359',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1486312359',id=78,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dfc138d4fe084f1e8bfe6a94be18cc23',ramdisk_id='',reservation_id='r-jowc2xfu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1402581101',owner_user_name='tempest-AttachInterfacesV270Test-1402581101-project-m
ember'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:09:06Z,user_data=None,user_id='9b363a9a11084c998b823d2941f27c97',uuid=1c071b37-4498-458c-96b0-3e1a15ca470c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.854 226239 DEBUG nova.network.os_vif_util [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Converting VIF {"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.854 226239 DEBUG nova.network.os_vif_util [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:b2:15,bridge_name='br-int',has_traffic_filtering=True,id=0c27f6b7-b93a-418a-8953-51a84b461843,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c27f6b7-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.855 226239 DEBUG os_vif [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:b2:15,bridge_name='br-int',has_traffic_filtering=True,id=0c27f6b7-b93a-418a-8953-51a84b461843,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c27f6b7-b9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.855 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.855 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.856 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.859 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.860 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c27f6b7-b9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.860 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0c27f6b7-b9, col_values=(('external_ids', {'iface-id': '0c27f6b7-b93a-418a-8953-51a84b461843', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:b2:15', 'vm-uuid': '1c071b37-4498-458c-96b0-3e1a15ca470c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.861 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:12 np0005603623 NetworkManager[48970]: <info>  [1769846952.8626] manager: (tap0c27f6b7-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.864 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.866 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:12 np0005603623 nova_compute[226235]: 2026-01-31 08:09:12.866 226239 INFO os_vif [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:b2:15,bridge_name='br-int',has_traffic_filtering=True,id=0c27f6b7-b93a-418a-8953-51a84b461843,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c27f6b7-b9')#033[00m
Jan 31 03:09:13 np0005603623 nova_compute[226235]: 2026-01-31 08:09:13.051 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:09:13 np0005603623 nova_compute[226235]: 2026-01-31 08:09:13.052 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:09:13 np0005603623 nova_compute[226235]: 2026-01-31 08:09:13.052 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] No VIF found with MAC fa:16:3e:27:b2:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:09:13 np0005603623 nova_compute[226235]: 2026-01-31 08:09:13.053 226239 INFO nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Using config drive#033[00m
Jan 31 03:09:13 np0005603623 nova_compute[226235]: 2026-01-31 08:09:13.076 226239 DEBUG nova.storage.rbd_utils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] rbd image 1c071b37-4498-458c-96b0-3e1a15ca470c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:13 np0005603623 nova_compute[226235]: 2026-01-31 08:09:13.335 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:13.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:13.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:09:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2719356456' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:09:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:09:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2719356456' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.016 226239 INFO nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Creating config drive at /var/lib/nova/instances/1c071b37-4498-458c-96b0-3e1a15ca470c/disk.config#033[00m
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.020 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1c071b37-4498-458c-96b0-3e1a15ca470c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqyaauiku execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.058 226239 DEBUG nova.network.neutron [req-9ea054d2-9efa-4c5e-9efb-b16f5d3cd7fa req-3fdabb77-a286-41de-b746-1ff961ff6ec2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Updated VIF entry in instance network info cache for port 0c27f6b7-b93a-418a-8953-51a84b461843. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.059 226239 DEBUG nova.network.neutron [req-9ea054d2-9efa-4c5e-9efb-b16f5d3cd7fa req-3fdabb77-a286-41de-b746-1ff961ff6ec2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Updating instance_info_cache with network_info: [{"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.140 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1c071b37-4498-458c-96b0-3e1a15ca470c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqyaauiku" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.168 226239 DEBUG nova.storage.rbd_utils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] rbd image 1c071b37-4498-458c-96b0-3e1a15ca470c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.172 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1c071b37-4498-458c-96b0-3e1a15ca470c/disk.config 1c071b37-4498-458c-96b0-3e1a15ca470c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.224 226239 DEBUG oslo_concurrency.lockutils [req-9ea054d2-9efa-4c5e-9efb-b16f5d3cd7fa req-3fdabb77-a286-41de-b746-1ff961ff6ec2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:09:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:15.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.561 226239 DEBUG oslo_concurrency.processutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1c071b37-4498-458c-96b0-3e1a15ca470c/disk.config 1c071b37-4498-458c-96b0-3e1a15ca470c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.388s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.561 226239 INFO nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Deleting local config drive /var/lib/nova/instances/1c071b37-4498-458c-96b0-3e1a15ca470c/disk.config because it was imported into RBD.#033[00m
Jan 31 03:09:15 np0005603623 kernel: tap0c27f6b7-b9: entered promiscuous mode
Jan 31 03:09:15 np0005603623 NetworkManager[48970]: <info>  [1769846955.5969] manager: (tap0c27f6b7-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/131)
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.597 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:15 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:15Z|00285|binding|INFO|Claiming lport 0c27f6b7-b93a-418a-8953-51a84b461843 for this chassis.
Jan 31 03:09:15 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:15Z|00286|binding|INFO|0c27f6b7-b93a-418a-8953-51a84b461843: Claiming fa:16:3e:27:b2:15 10.100.0.6
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.599 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:15 np0005603623 systemd-udevd[258291]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.625 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:15 np0005603623 systemd-machined[194379]: New machine qemu-33-instance-0000004e.
Jan 31 03:09:15 np0005603623 NetworkManager[48970]: <info>  [1769846955.6298] device (tap0c27f6b7-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:09:15 np0005603623 NetworkManager[48970]: <info>  [1769846955.6302] device (tap0c27f6b7-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:09:15 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:15Z|00287|binding|INFO|Setting lport 0c27f6b7-b93a-418a-8953-51a84b461843 ovn-installed in OVS
Jan 31 03:09:15 np0005603623 nova_compute[226235]: 2026-01-31 08:09:15.632 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:15 np0005603623 systemd[1]: Started Virtual Machine qemu-33-instance-0000004e.
Jan 31 03:09:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:15.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:15 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:15Z|00288|binding|INFO|Setting lport 0c27f6b7-b93a-418a-8953-51a84b461843 up in Southbound
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.837 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:b2:15 10.100.0.6'], port_security=['fa:16:3e:27:b2:15 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1c071b37-4498-458c-96b0-3e1a15ca470c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5655565c-f4a9-4971-a164-5eabea306a80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dfc138d4fe084f1e8bfe6a94be18cc23', 'neutron:revision_number': '2', 'neutron:security_group_ids': '21ebf42d-f1d4-416f-8212-6b9f8a117dd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2cf5339-996b-4e91-b696-376e376e07d0, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=0c27f6b7-b93a-418a-8953-51a84b461843) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.838 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 0c27f6b7-b93a-418a-8953-51a84b461843 in datapath 5655565c-f4a9-4971-a164-5eabea306a80 bound to our chassis#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.840 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5655565c-f4a9-4971-a164-5eabea306a80#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.847 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4e2790c5-d6ee-41c6-8115-3db26c0a200d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.847 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5655565c-f1 in ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.849 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5655565c-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.849 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cd2f41fc-90c2-4822-bdf3-d702a5925d87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.849 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[98512114-9090-4768-90ae-1b08c512cd72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.857 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4e6839-c20f-4cae-b3d9-b4f27038b589]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.866 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[614f2f14-ef47-4991-8573-fff2941ce136]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.894 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[18ef5a38-2299-4247-89e2-ecf94d7cd7b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.899 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a66c674c-5c56-4190-b002-c658d43c724c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:15 np0005603623 systemd-udevd[258295]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:09:15 np0005603623 NetworkManager[48970]: <info>  [1769846955.9006] manager: (tap5655565c-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/132)
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.936 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b24aa64a-591f-43b1-a4e6-a8cd12238a1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.939 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[04337853-bd7d-44f9-81f4-5e9427a1b82f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:15 np0005603623 NetworkManager[48970]: <info>  [1769846955.9596] device (tap5655565c-f0): carrier: link connected
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.963 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[33c78277-7a16-4d0d-b5ea-e92633611f2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.975 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ae82ded1-5880-4f8a-a373-4204d89ba09f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5655565c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:54:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609638, 'reachable_time': 19914, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258341, 'error': None, 'target': 'ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:15.988 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[301515bc-5bc6-4765-8e53-c226ebee3919]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:54fa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 609638, 'tstamp': 609638}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 258345, 'error': None, 'target': 'ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:16.005 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee7d962-22f3-40d2-9b9d-bc04d00ac184]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5655565c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:54:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609638, 'reachable_time': 19914, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 258353, 'error': None, 'target': 'ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:16.034 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[77ddf072-b800-4c1d-91af-014eeb3910cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:16.074 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f895aa31-b568-4e00-ac33-d7cd50f500c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:16.076 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5655565c-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:16.077 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:16.077 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5655565c-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:16 np0005603623 NetworkManager[48970]: <info>  [1769846956.0801] manager: (tap5655565c-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Jan 31 03:09:16 np0005603623 kernel: tap5655565c-f0: entered promiscuous mode
Jan 31 03:09:16 np0005603623 nova_compute[226235]: 2026-01-31 08:09:16.079 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:16.081 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5655565c-f0, col_values=(('external_ids', {'iface-id': 'b4b7eabb-6031-4f1e-b4ac-54963402671d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:16 np0005603623 nova_compute[226235]: 2026-01-31 08:09:16.082 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:16 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:16Z|00289|binding|INFO|Releasing lport b4b7eabb-6031-4f1e-b4ac-54963402671d from this chassis (sb_readonly=0)
Jan 31 03:09:16 np0005603623 nova_compute[226235]: 2026-01-31 08:09:16.083 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:16.083 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5655565c-f4a9-4971-a164-5eabea306a80.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5655565c-f4a9-4971-a164-5eabea306a80.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:16.084 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1358bf1d-0bc0-4e39-9cc1-b59c942ce1b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:16.084 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-5655565c-f4a9-4971-a164-5eabea306a80
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/5655565c-f4a9-4971-a164-5eabea306a80.pid.haproxy
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 5655565c-f4a9-4971-a164-5eabea306a80
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:09:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:16.085 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80', 'env', 'PROCESS_TAG=haproxy-5655565c-f4a9-4971-a164-5eabea306a80', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5655565c-f4a9-4971-a164-5eabea306a80.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:09:16 np0005603623 nova_compute[226235]: 2026-01-31 08:09:16.087 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:16 np0005603623 nova_compute[226235]: 2026-01-31 08:09:16.133 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846956.1327136, 1c071b37-4498-458c-96b0-3e1a15ca470c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:09:16 np0005603623 nova_compute[226235]: 2026-01-31 08:09:16.133 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] VM Started (Lifecycle Event)#033[00m
Jan 31 03:09:16 np0005603623 nova_compute[226235]: 2026-01-31 08:09:16.206 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:09:16 np0005603623 nova_compute[226235]: 2026-01-31 08:09:16.211 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846956.1333091, 1c071b37-4498-458c-96b0-3e1a15ca470c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:09:16 np0005603623 nova_compute[226235]: 2026-01-31 08:09:16.211 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:09:16 np0005603623 nova_compute[226235]: 2026-01-31 08:09:16.267 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:09:16 np0005603623 nova_compute[226235]: 2026-01-31 08:09:16.270 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:09:16 np0005603623 nova_compute[226235]: 2026-01-31 08:09:16.350 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:09:16 np0005603623 podman[258402]: 2026-01-31 08:09:16.40389293 +0000 UTC m=+0.050714040 container create 768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:09:16 np0005603623 systemd[1]: Started libpod-conmon-768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc.scope.
Jan 31 03:09:16 np0005603623 podman[258402]: 2026-01-31 08:09:16.374126531 +0000 UTC m=+0.020947671 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:09:16 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:09:16 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e207f95984f9be550007de936f3f2325b359adda34dd9e474a783d7ded3c6eb3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:09:16 np0005603623 podman[258402]: 2026-01-31 08:09:16.503755748 +0000 UTC m=+0.150576878 container init 768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:09:16 np0005603623 podman[258402]: 2026-01-31 08:09:16.507761583 +0000 UTC m=+0.154582683 container start 768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:09:16 np0005603623 neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80[258415]: [NOTICE]   (258422) : New worker (258425) forked
Jan 31 03:09:16 np0005603623 neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80[258415]: [NOTICE]   (258422) : Loading success.
Jan 31 03:09:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:17.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:17.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:17 np0005603623 nova_compute[226235]: 2026-01-31 08:09:17.862 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:18 np0005603623 nova_compute[226235]: 2026-01-31 08:09:18.338 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.163 226239 DEBUG nova.compute.manager [req-09613eaf-15b5-4b7c-bee3-b7b4cda49ff7 req-3ec238f1-fe8b-40cd-8d09-86eabbe3552b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-vif-plugged-0c27f6b7-b93a-418a-8953-51a84b461843 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.164 226239 DEBUG oslo_concurrency.lockutils [req-09613eaf-15b5-4b7c-bee3-b7b4cda49ff7 req-3ec238f1-fe8b-40cd-8d09-86eabbe3552b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.164 226239 DEBUG oslo_concurrency.lockutils [req-09613eaf-15b5-4b7c-bee3-b7b4cda49ff7 req-3ec238f1-fe8b-40cd-8d09-86eabbe3552b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.165 226239 DEBUG oslo_concurrency.lockutils [req-09613eaf-15b5-4b7c-bee3-b7b4cda49ff7 req-3ec238f1-fe8b-40cd-8d09-86eabbe3552b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.165 226239 DEBUG nova.compute.manager [req-09613eaf-15b5-4b7c-bee3-b7b4cda49ff7 req-3ec238f1-fe8b-40cd-8d09-86eabbe3552b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Processing event network-vif-plugged-0c27f6b7-b93a-418a-8953-51a84b461843 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.166 226239 DEBUG nova.compute.manager [req-09613eaf-15b5-4b7c-bee3-b7b4cda49ff7 req-3ec238f1-fe8b-40cd-8d09-86eabbe3552b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-vif-plugged-0c27f6b7-b93a-418a-8953-51a84b461843 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.166 226239 DEBUG oslo_concurrency.lockutils [req-09613eaf-15b5-4b7c-bee3-b7b4cda49ff7 req-3ec238f1-fe8b-40cd-8d09-86eabbe3552b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.167 226239 DEBUG oslo_concurrency.lockutils [req-09613eaf-15b5-4b7c-bee3-b7b4cda49ff7 req-3ec238f1-fe8b-40cd-8d09-86eabbe3552b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.167 226239 DEBUG oslo_concurrency.lockutils [req-09613eaf-15b5-4b7c-bee3-b7b4cda49ff7 req-3ec238f1-fe8b-40cd-8d09-86eabbe3552b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.168 226239 DEBUG nova.compute.manager [req-09613eaf-15b5-4b7c-bee3-b7b4cda49ff7 req-3ec238f1-fe8b-40cd-8d09-86eabbe3552b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] No waiting events found dispatching network-vif-plugged-0c27f6b7-b93a-418a-8953-51a84b461843 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.168 226239 WARNING nova.compute.manager [req-09613eaf-15b5-4b7c-bee3-b7b4cda49ff7 req-3ec238f1-fe8b-40cd-8d09-86eabbe3552b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received unexpected event network-vif-plugged-0c27f6b7-b93a-418a-8953-51a84b461843 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.170 226239 DEBUG nova.compute.manager [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.175 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846959.1751466, 1c071b37-4498-458c-96b0-3e1a15ca470c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.175 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.177 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.181 226239 INFO nova.virt.libvirt.driver [-] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Instance spawned successfully.#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.181 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.198 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.203 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.214 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.214 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.215 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.215 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.216 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.216 226239 DEBUG nova.virt.libvirt.driver [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:09:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:19.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.449 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.558 226239 INFO nova.compute.manager [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Took 12.60 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.559 226239 DEBUG nova.compute.manager [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.648 226239 INFO nova.compute.manager [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Took 13.96 seconds to build instance.#033[00m
Jan 31 03:09:19 np0005603623 nova_compute[226235]: 2026-01-31 08:09:19.672 226239 DEBUG oslo_concurrency.lockutils [None req-c229924b-1e86-4dba-be18-95c7f226d1bd 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:19.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:19 np0005603623 podman[258435]: 2026-01-31 08:09:19.951381671 +0000 UTC m=+0.047303252 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:09:19 np0005603623 podman[258436]: 2026-01-31 08:09:19.982773941 +0000 UTC m=+0.077152963 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 03:09:20 np0005603623 nova_compute[226235]: 2026-01-31 08:09:20.963 226239 DEBUG oslo_concurrency.lockutils [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquiring lock "interface-1c071b37-4498-458c-96b0-3e1a15ca470c-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:20 np0005603623 nova_compute[226235]: 2026-01-31 08:09:20.964 226239 DEBUG oslo_concurrency.lockutils [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "interface-1c071b37-4498-458c-96b0-3e1a15ca470c-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:20 np0005603623 nova_compute[226235]: 2026-01-31 08:09:20.964 226239 DEBUG nova.objects.instance [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lazy-loading 'flavor' on Instance uuid 1c071b37-4498-458c-96b0-3e1a15ca470c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:09:20 np0005603623 nova_compute[226235]: 2026-01-31 08:09:20.988 226239 DEBUG nova.objects.instance [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lazy-loading 'pci_requests' on Instance uuid 1c071b37-4498-458c-96b0-3e1a15ca470c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:09:21 np0005603623 nova_compute[226235]: 2026-01-31 08:09:21.018 226239 DEBUG nova.network.neutron [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:09:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:21.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:21.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:21 np0005603623 nova_compute[226235]: 2026-01-31 08:09:21.985 226239 DEBUG nova.policy [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9b363a9a11084c998b823d2941f27c97', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dfc138d4fe084f1e8bfe6a94be18cc23', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:09:22 np0005603623 nova_compute[226235]: 2026-01-31 08:09:22.607 226239 DEBUG nova.network.neutron [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Successfully created port: c2a7b6cc-8371-465a-b50f-822e25ab8682 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:09:22 np0005603623 nova_compute[226235]: 2026-01-31 08:09:22.865 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:23 np0005603623 nova_compute[226235]: 2026-01-31 08:09:23.339 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:23.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:23.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:24 np0005603623 nova_compute[226235]: 2026-01-31 08:09:24.084 226239 DEBUG nova.network.neutron [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Successfully updated port: c2a7b6cc-8371-465a-b50f-822e25ab8682 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:09:24 np0005603623 nova_compute[226235]: 2026-01-31 08:09:24.114 226239 DEBUG oslo_concurrency.lockutils [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquiring lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:09:24 np0005603623 nova_compute[226235]: 2026-01-31 08:09:24.115 226239 DEBUG oslo_concurrency.lockutils [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquired lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:09:24 np0005603623 nova_compute[226235]: 2026-01-31 08:09:24.115 226239 DEBUG nova.network.neutron [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:09:24 np0005603623 nova_compute[226235]: 2026-01-31 08:09:24.236 226239 DEBUG nova.compute.manager [req-78f987ad-c5d1-4771-9fe5-aec439d8a97c req-c58c6da3-1a11-447a-b85f-a9f93f918ebd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-changed-c2a7b6cc-8371-465a-b50f-822e25ab8682 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:24 np0005603623 nova_compute[226235]: 2026-01-31 08:09:24.236 226239 DEBUG nova.compute.manager [req-78f987ad-c5d1-4771-9fe5-aec439d8a97c req-c58c6da3-1a11-447a-b85f-a9f93f918ebd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Refreshing instance network info cache due to event network-changed-c2a7b6cc-8371-465a-b50f-822e25ab8682. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:09:24 np0005603623 nova_compute[226235]: 2026-01-31 08:09:24.237 226239 DEBUG oslo_concurrency.lockutils [req-78f987ad-c5d1-4771-9fe5-aec439d8a97c req-c58c6da3-1a11-447a-b85f-a9f93f918ebd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:09:24 np0005603623 nova_compute[226235]: 2026-01-31 08:09:24.323 226239 WARNING nova.network.neutron [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] 5655565c-f4a9-4971-a164-5eabea306a80 already exists in list: networks containing: ['5655565c-f4a9-4971-a164-5eabea306a80']. ignoring it#033[00m
Jan 31 03:09:25 np0005603623 nova_compute[226235]: 2026-01-31 08:09:25.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:25 np0005603623 nova_compute[226235]: 2026-01-31 08:09:25.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:09:25 np0005603623 nova_compute[226235]: 2026-01-31 08:09:25.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:09:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:25.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:25.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:25 np0005603623 nova_compute[226235]: 2026-01-31 08:09:25.884 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:09:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:27.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:27.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:27 np0005603623 nova_compute[226235]: 2026-01-31 08:09:27.868 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:28 np0005603623 nova_compute[226235]: 2026-01-31 08:09:28.342 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.292 226239 DEBUG nova.network.neutron [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Updating instance_info_cache with network_info: [{"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "address": "fa:16:3e:18:77:6a", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2a7b6cc-83", "ovs_interfaceid": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.317 226239 DEBUG oslo_concurrency.lockutils [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Releasing lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.321 226239 DEBUG oslo_concurrency.lockutils [req-78f987ad-c5d1-4771-9fe5-aec439d8a97c req-c58c6da3-1a11-447a-b85f-a9f93f918ebd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.321 226239 DEBUG nova.network.neutron [req-78f987ad-c5d1-4771-9fe5-aec439d8a97c req-c58c6da3-1a11-447a-b85f-a9f93f918ebd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Refreshing network info cache for port c2a7b6cc-8371-465a-b50f-822e25ab8682 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.325 226239 DEBUG nova.virt.libvirt.vif [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:09:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1486312359',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1486312359',id=78,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:09:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='dfc138d4fe084f1e8bfe6a94be18cc23',ramdisk_id='',reservation_id='r-jowc2xfu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-1402581101',owner_user_name='tempest-AttachInterfacesV270Test-1402581101-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:09:19Z,user_data=None,user_id='9b363a9a11084c998b823d2941f27c97',uuid=1c071b37-4498-458c-96b0-3e1a15ca470c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "address": "fa:16:3e:18:77:6a", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2a7b6cc-83", "ovs_interfaceid": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.325 226239 DEBUG nova.network.os_vif_util [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Converting VIF {"id": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "address": "fa:16:3e:18:77:6a", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2a7b6cc-83", "ovs_interfaceid": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.326 226239 DEBUG nova.network.os_vif_util [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:77:6a,bridge_name='br-int',has_traffic_filtering=True,id=c2a7b6cc-8371-465a-b50f-822e25ab8682,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2a7b6cc-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.327 226239 DEBUG os_vif [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:77:6a,bridge_name='br-int',has_traffic_filtering=True,id=c2a7b6cc-8371-465a-b50f-822e25ab8682,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2a7b6cc-83') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.328 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.328 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.329 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.333 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.333 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2a7b6cc-83, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.334 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc2a7b6cc-83, col_values=(('external_ids', {'iface-id': 'c2a7b6cc-8371-465a-b50f-822e25ab8682', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:18:77:6a', 'vm-uuid': '1c071b37-4498-458c-96b0-3e1a15ca470c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.336 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:29 np0005603623 NetworkManager[48970]: <info>  [1769846969.3369] manager: (tapc2a7b6cc-83): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.339 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.343 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.344 226239 INFO os_vif [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:77:6a,bridge_name='br-int',has_traffic_filtering=True,id=c2a7b6cc-8371-465a-b50f-822e25ab8682,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2a7b6cc-83')#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.346 226239 DEBUG nova.virt.libvirt.vif [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:09:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1486312359',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1486312359',id=78,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:09:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='dfc138d4fe084f1e8bfe6a94be18cc23',ramdisk_id='',reservation_id='r-jowc2xfu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-1402581101',owner_user_name='tempest-AttachInterfacesV270Test-1402581101-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:09:19Z,user_data=None,user_id='9b363a9a11084c998b823d2941f27c97',uuid=1c071b37-4498-458c-96b0-3e1a15ca470c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "address": "fa:16:3e:18:77:6a", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2a7b6cc-83", "ovs_interfaceid": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.346 226239 DEBUG nova.network.os_vif_util [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Converting VIF {"id": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "address": "fa:16:3e:18:77:6a", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2a7b6cc-83", "ovs_interfaceid": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.347 226239 DEBUG nova.network.os_vif_util [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:77:6a,bridge_name='br-int',has_traffic_filtering=True,id=c2a7b6cc-8371-465a-b50f-822e25ab8682,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2a7b6cc-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.353 226239 DEBUG nova.virt.libvirt.guest [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] attach device xml: <interface type="ethernet">
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  <mac address="fa:16:3e:18:77:6a"/>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  <model type="virtio"/>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  <mtu size="1442"/>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  <target dev="tapc2a7b6cc-83"/>
Jan 31 03:09:29 np0005603623 nova_compute[226235]: </interface>
Jan 31 03:09:29 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:09:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:29.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:29 np0005603623 kernel: tapc2a7b6cc-83: entered promiscuous mode
Jan 31 03:09:29 np0005603623 NetworkManager[48970]: <info>  [1769846969.3695] manager: (tapc2a7b6cc-83): new Tun device (/org/freedesktop/NetworkManager/Devices/135)
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.369 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:29 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:29Z|00290|binding|INFO|Claiming lport c2a7b6cc-8371-465a-b50f-822e25ab8682 for this chassis.
Jan 31 03:09:29 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:29Z|00291|binding|INFO|c2a7b6cc-8371-465a-b50f-822e25ab8682: Claiming fa:16:3e:18:77:6a 10.100.0.7
Jan 31 03:09:29 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:29Z|00292|binding|INFO|Setting lport c2a7b6cc-8371-465a-b50f-822e25ab8682 ovn-installed in OVS
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.377 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.381 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:29 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:29Z|00293|binding|INFO|Setting lport c2a7b6cc-8371-465a-b50f-822e25ab8682 up in Southbound
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.384 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:77:6a 10.100.0.7'], port_security=['fa:16:3e:18:77:6a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1c071b37-4498-458c-96b0-3e1a15ca470c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5655565c-f4a9-4971-a164-5eabea306a80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dfc138d4fe084f1e8bfe6a94be18cc23', 'neutron:revision_number': '2', 'neutron:security_group_ids': '21ebf42d-f1d4-416f-8212-6b9f8a117dd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2cf5339-996b-4e91-b696-376e376e07d0, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=c2a7b6cc-8371-465a-b50f-822e25ab8682) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.386 143258 INFO neutron.agent.ovn.metadata.agent [-] Port c2a7b6cc-8371-465a-b50f-822e25ab8682 in datapath 5655565c-f4a9-4971-a164-5eabea306a80 bound to our chassis#033[00m
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.387 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5655565c-f4a9-4971-a164-5eabea306a80#033[00m
Jan 31 03:09:29 np0005603623 systemd-udevd[258672]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.401 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[57830671-4f12-4065-b2b0-9a2ffe832706]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:29 np0005603623 NetworkManager[48970]: <info>  [1769846969.4147] device (tapc2a7b6cc-83): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:09:29 np0005603623 NetworkManager[48970]: <info>  [1769846969.4153] device (tapc2a7b6cc-83): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.423 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0a306a-8027-4eeb-8d93-ec21d12d0771]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.425 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6326d32d-c8e9-41a7-898c-8e41753ab355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.443 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ab7f39b0-8f97-4b07-a138-4f4ee85aa547]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.458 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[12619022-f6d7-4dd4-850e-344999e52d56]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5655565c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:54:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 6, 'rx_bytes': 532, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 6, 'rx_bytes': 532, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609638, 'reachable_time': 19914, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258679, 'error': None, 'target': 'ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.471 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f501431a-5da3-4307-bed5-87987d72f5bc]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5655565c-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 609648, 'tstamp': 609648}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 258680, 'error': None, 'target': 'ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5655565c-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 609650, 'tstamp': 609650}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 258680, 'error': None, 'target': 'ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.473 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5655565c-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.475 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.476 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.476 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5655565c-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.477 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.477 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5655565c-f0, col_values=(('external_ids', {'iface-id': 'b4b7eabb-6031-4f1e-b4ac-54963402671d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:29 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:29.478 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.742 226239 DEBUG nova.compute.manager [req-b7b43753-e1a3-4dd1-8825-e0ba9a7887af req-2af9d17f-7d30-4b5b-a94f-507df846239e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-vif-plugged-c2a7b6cc-8371-465a-b50f-822e25ab8682 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.742 226239 DEBUG oslo_concurrency.lockutils [req-b7b43753-e1a3-4dd1-8825-e0ba9a7887af req-2af9d17f-7d30-4b5b-a94f-507df846239e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.743 226239 DEBUG oslo_concurrency.lockutils [req-b7b43753-e1a3-4dd1-8825-e0ba9a7887af req-2af9d17f-7d30-4b5b-a94f-507df846239e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.743 226239 DEBUG oslo_concurrency.lockutils [req-b7b43753-e1a3-4dd1-8825-e0ba9a7887af req-2af9d17f-7d30-4b5b-a94f-507df846239e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.743 226239 DEBUG nova.compute.manager [req-b7b43753-e1a3-4dd1-8825-e0ba9a7887af req-2af9d17f-7d30-4b5b-a94f-507df846239e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] No waiting events found dispatching network-vif-plugged-c2a7b6cc-8371-465a-b50f-822e25ab8682 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.743 226239 WARNING nova.compute.manager [req-b7b43753-e1a3-4dd1-8825-e0ba9a7887af req-2af9d17f-7d30-4b5b-a94f-507df846239e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received unexpected event network-vif-plugged-c2a7b6cc-8371-465a-b50f-822e25ab8682 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.781 226239 DEBUG nova.virt.libvirt.driver [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.781 226239 DEBUG nova.virt.libvirt.driver [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.782 226239 DEBUG nova.virt.libvirt.driver [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] No VIF found with MAC fa:16:3e:27:b2:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.782 226239 DEBUG nova.virt.libvirt.driver [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] No VIF found with MAC fa:16:3e:18:77:6a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:09:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:29.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.823 226239 DEBUG nova.virt.libvirt.guest [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  <nova:name>tempest-AttachInterfacesV270Test-server-1486312359</nova:name>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  <nova:creationTime>2026-01-31 08:09:29</nova:creationTime>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  <nova:flavor name="m1.nano">
Jan 31 03:09:29 np0005603623 nova_compute[226235]:    <nova:memory>128</nova:memory>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:    <nova:disk>1</nova:disk>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:    <nova:swap>0</nova:swap>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  </nova:flavor>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  <nova:owner>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:    <nova:user uuid="9b363a9a11084c998b823d2941f27c97">tempest-AttachInterfacesV270Test-1402581101-project-member</nova:user>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:    <nova:project uuid="dfc138d4fe084f1e8bfe6a94be18cc23">tempest-AttachInterfacesV270Test-1402581101</nova:project>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  </nova:owner>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  <nova:ports>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:    <nova:port uuid="0c27f6b7-b93a-418a-8953-51a84b461843">
Jan 31 03:09:29 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:    <nova:port uuid="c2a7b6cc-8371-465a-b50f-822e25ab8682">
Jan 31 03:09:29 np0005603623 nova_compute[226235]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:    </nova:port>
Jan 31 03:09:29 np0005603623 nova_compute[226235]:  </nova:ports>
Jan 31 03:09:29 np0005603623 nova_compute[226235]: </nova:instance>
Jan 31 03:09:29 np0005603623 nova_compute[226235]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:09:29 np0005603623 nova_compute[226235]: 2026-01-31 08:09:29.862 226239 DEBUG oslo_concurrency.lockutils [None req-c6b36467-38bc-4464-a88f-def20394550e 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "interface-1c071b37-4498-458c-96b0-3e1a15ca470c-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.898s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:09:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:30.102 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:30.103 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:30.104 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:31.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:31 np0005603623 nova_compute[226235]: 2026-01-31 08:09:31.449 226239 DEBUG nova.network.neutron [req-78f987ad-c5d1-4771-9fe5-aec439d8a97c req-c58c6da3-1a11-447a-b85f-a9f93f918ebd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Updated VIF entry in instance network info cache for port c2a7b6cc-8371-465a-b50f-822e25ab8682. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:09:31 np0005603623 nova_compute[226235]: 2026-01-31 08:09:31.449 226239 DEBUG nova.network.neutron [req-78f987ad-c5d1-4771-9fe5-aec439d8a97c req-c58c6da3-1a11-447a-b85f-a9f93f918ebd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Updating instance_info_cache with network_info: [{"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "address": "fa:16:3e:18:77:6a", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2a7b6cc-83", "ovs_interfaceid": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:09:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:09:31 np0005603623 nova_compute[226235]: 2026-01-31 08:09:31.472 226239 DEBUG oslo_concurrency.lockutils [req-78f987ad-c5d1-4771-9fe5-aec439d8a97c req-c58c6da3-1a11-447a-b85f-a9f93f918ebd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:09:31 np0005603623 nova_compute[226235]: 2026-01-31 08:09:31.473 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:09:31 np0005603623 nova_compute[226235]: 2026-01-31 08:09:31.473 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:09:31 np0005603623 nova_compute[226235]: 2026-01-31 08:09:31.473 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1c071b37-4498-458c-96b0-3e1a15ca470c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:09:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:31.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:31 np0005603623 nova_compute[226235]: 2026-01-31 08:09:31.870 226239 DEBUG nova.compute.manager [req-e9763de6-b5c0-42f8-8563-e9e46bc41f8e req-68575a9f-70cb-4172-9fea-cbd13b4753fd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-vif-plugged-c2a7b6cc-8371-465a-b50f-822e25ab8682 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:31 np0005603623 nova_compute[226235]: 2026-01-31 08:09:31.871 226239 DEBUG oslo_concurrency.lockutils [req-e9763de6-b5c0-42f8-8563-e9e46bc41f8e req-68575a9f-70cb-4172-9fea-cbd13b4753fd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:31 np0005603623 nova_compute[226235]: 2026-01-31 08:09:31.871 226239 DEBUG oslo_concurrency.lockutils [req-e9763de6-b5c0-42f8-8563-e9e46bc41f8e req-68575a9f-70cb-4172-9fea-cbd13b4753fd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:31 np0005603623 nova_compute[226235]: 2026-01-31 08:09:31.872 226239 DEBUG oslo_concurrency.lockutils [req-e9763de6-b5c0-42f8-8563-e9e46bc41f8e req-68575a9f-70cb-4172-9fea-cbd13b4753fd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:31 np0005603623 nova_compute[226235]: 2026-01-31 08:09:31.872 226239 DEBUG nova.compute.manager [req-e9763de6-b5c0-42f8-8563-e9e46bc41f8e req-68575a9f-70cb-4172-9fea-cbd13b4753fd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] No waiting events found dispatching network-vif-plugged-c2a7b6cc-8371-465a-b50f-822e25ab8682 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:09:31 np0005603623 nova_compute[226235]: 2026-01-31 08:09:31.872 226239 WARNING nova.compute.manager [req-e9763de6-b5c0-42f8-8563-e9e46bc41f8e req-68575a9f-70cb-4172-9fea-cbd13b4753fd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received unexpected event network-vif-plugged-c2a7b6cc-8371-465a-b50f-822e25ab8682 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.290 226239 DEBUG oslo_concurrency.lockutils [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquiring lock "1c071b37-4498-458c-96b0-3e1a15ca470c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.291 226239 DEBUG oslo_concurrency.lockutils [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.291 226239 DEBUG oslo_concurrency.lockutils [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquiring lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.291 226239 DEBUG oslo_concurrency.lockutils [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.291 226239 DEBUG oslo_concurrency.lockutils [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.292 226239 INFO nova.compute.manager [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Terminating instance#033[00m
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.293 226239 DEBUG nova.compute.manager [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:09:32 np0005603623 kernel: tap0c27f6b7-b9 (unregistering): left promiscuous mode
Jan 31 03:09:32 np0005603623 NetworkManager[48970]: <info>  [1769846972.9368] device (tap0c27f6b7-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.944 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:32Z|00294|binding|INFO|Releasing lport 0c27f6b7-b93a-418a-8953-51a84b461843 from this chassis (sb_readonly=0)
Jan 31 03:09:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:32Z|00295|binding|INFO|Setting lport 0c27f6b7-b93a-418a-8953-51a84b461843 down in Southbound
Jan 31 03:09:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:32Z|00296|binding|INFO|Removing iface tap0c27f6b7-b9 ovn-installed in OVS
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.946 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.950 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:32.954 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:b2:15 10.100.0.6'], port_security=['fa:16:3e:27:b2:15 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1c071b37-4498-458c-96b0-3e1a15ca470c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5655565c-f4a9-4971-a164-5eabea306a80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dfc138d4fe084f1e8bfe6a94be18cc23', 'neutron:revision_number': '4', 'neutron:security_group_ids': '21ebf42d-f1d4-416f-8212-6b9f8a117dd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2cf5339-996b-4e91-b696-376e376e07d0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=0c27f6b7-b93a-418a-8953-51a84b461843) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:09:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:32.955 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 0c27f6b7-b93a-418a-8953-51a84b461843 in datapath 5655565c-f4a9-4971-a164-5eabea306a80 unbound from our chassis#033[00m
Jan 31 03:09:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:32.957 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5655565c-f4a9-4971-a164-5eabea306a80#033[00m
Jan 31 03:09:32 np0005603623 kernel: tapc2a7b6cc-83 (unregistering): left promiscuous mode
Jan 31 03:09:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:32.969 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb992f1-1d8d-45d4-af83-84e689a91764]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:32 np0005603623 NetworkManager[48970]: <info>  [1769846972.9726] device (tapc2a7b6cc-83): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.972 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.979 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:32Z|00297|binding|INFO|Releasing lport c2a7b6cc-8371-465a-b50f-822e25ab8682 from this chassis (sb_readonly=0)
Jan 31 03:09:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:32Z|00298|binding|INFO|Setting lport c2a7b6cc-8371-465a-b50f-822e25ab8682 down in Southbound
Jan 31 03:09:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:32Z|00299|binding|INFO|Removing iface tapc2a7b6cc-83 ovn-installed in OVS
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.981 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:32.985 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:77:6a 10.100.0.7'], port_security=['fa:16:3e:18:77:6a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1c071b37-4498-458c-96b0-3e1a15ca470c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5655565c-f4a9-4971-a164-5eabea306a80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dfc138d4fe084f1e8bfe6a94be18cc23', 'neutron:revision_number': '4', 'neutron:security_group_ids': '21ebf42d-f1d4-416f-8212-6b9f8a117dd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2cf5339-996b-4e91-b696-376e376e07d0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=c2a7b6cc-8371-465a-b50f-822e25ab8682) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:09:32 np0005603623 nova_compute[226235]: 2026-01-31 08:09:32.987 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.003 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[2444142e-f444-451b-b423-24b4409e62c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.006 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[cc10dd20-75c3-4f51-84c2-3daa973c7e4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.033 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[faa8b026-75f2-4e63-adeb-bb380b3417e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:33 np0005603623 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Jan 31 03:09:33 np0005603623 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000004e.scope: Consumed 11.899s CPU time.
Jan 31 03:09:33 np0005603623 systemd-machined[194379]: Machine qemu-33-instance-0000004e terminated.
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.047 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e4cc0f22-e6fc-4435-ad95-f1ee0abbe905]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5655565c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:54:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 8, 'rx_bytes': 532, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 8, 'rx_bytes': 532, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609638, 'reachable_time': 19914, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258699, 'error': None, 'target': 'ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.060 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a7eb826b-6b0d-4db2-b416-99cfd482db1b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5655565c-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 609648, 'tstamp': 609648}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 258700, 'error': None, 'target': 'ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5655565c-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 609650, 'tstamp': 609650}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 258700, 'error': None, 'target': 'ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.062 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5655565c-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.063 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.069 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.070 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5655565c-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.070 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.071 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5655565c-f0, col_values=(('external_ids', {'iface-id': 'b4b7eabb-6031-4f1e-b4ac-54963402671d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.071 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.072 143258 INFO neutron.agent.ovn.metadata.agent [-] Port c2a7b6cc-8371-465a-b50f-822e25ab8682 in datapath 5655565c-f4a9-4971-a164-5eabea306a80 unbound from our chassis#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.073 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5655565c-f4a9-4971-a164-5eabea306a80, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.074 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[215ae9cd-ebb0-42c9-839e-6393c5b9a4cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:33.074 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80 namespace which is not needed anymore#033[00m
Jan 31 03:09:33 np0005603623 NetworkManager[48970]: <info>  [1769846973.1235] manager: (tapc2a7b6cc-83): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.150 226239 INFO nova.virt.libvirt.driver [-] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Instance destroyed successfully.#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.151 226239 DEBUG nova.objects.instance [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lazy-loading 'resources' on Instance uuid 1c071b37-4498-458c-96b0-3e1a15ca470c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.337 226239 DEBUG nova.virt.libvirt.vif [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:09:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1486312359',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1486312359',id=78,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:09:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dfc138d4fe084f1e8bfe6a94be18cc23',ramdisk_id='',reservation_id='r-jowc2xfu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_pr
oject_name='tempest-AttachInterfacesV270Test-1402581101',owner_user_name='tempest-AttachInterfacesV270Test-1402581101-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:09:19Z,user_data=None,user_id='9b363a9a11084c998b823d2941f27c97',uuid=1c071b37-4498-458c-96b0-3e1a15ca470c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.338 226239 DEBUG nova.network.os_vif_util [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Converting VIF {"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.339 226239 DEBUG nova.network.os_vif_util [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:b2:15,bridge_name='br-int',has_traffic_filtering=True,id=0c27f6b7-b93a-418a-8953-51a84b461843,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c27f6b7-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.339 226239 DEBUG os_vif [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:b2:15,bridge_name='br-int',has_traffic_filtering=True,id=0c27f6b7-b93a-418a-8953-51a84b461843,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c27f6b7-b9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.340 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.341 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c27f6b7-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.342 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.344 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.345 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.346 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.347 226239 INFO os_vif [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:b2:15,bridge_name='br-int',has_traffic_filtering=True,id=0c27f6b7-b93a-418a-8953-51a84b461843,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c27f6b7-b9')#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.348 226239 DEBUG nova.virt.libvirt.vif [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:09:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1486312359',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1486312359',id=78,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:09:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dfc138d4fe084f1e8bfe6a94be18cc23',ramdisk_id='',reservation_id='r-jowc2xfu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_pr
oject_name='tempest-AttachInterfacesV270Test-1402581101',owner_user_name='tempest-AttachInterfacesV270Test-1402581101-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:09:19Z,user_data=None,user_id='9b363a9a11084c998b823d2941f27c97',uuid=1c071b37-4498-458c-96b0-3e1a15ca470c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "address": "fa:16:3e:18:77:6a", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2a7b6cc-83", "ovs_interfaceid": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.348 226239 DEBUG nova.network.os_vif_util [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Converting VIF {"id": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "address": "fa:16:3e:18:77:6a", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2a7b6cc-83", "ovs_interfaceid": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.348 226239 DEBUG nova.network.os_vif_util [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:77:6a,bridge_name='br-int',has_traffic_filtering=True,id=c2a7b6cc-8371-465a-b50f-822e25ab8682,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2a7b6cc-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.349 226239 DEBUG os_vif [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:77:6a,bridge_name='br-int',has_traffic_filtering=True,id=c2a7b6cc-8371-465a-b50f-822e25ab8682,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2a7b6cc-83') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.350 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.350 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2a7b6cc-83, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.351 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.352 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:33 np0005603623 nova_compute[226235]: 2026-01-31 08:09:33.354 226239 INFO os_vif [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:77:6a,bridge_name='br-int',has_traffic_filtering=True,id=c2a7b6cc-8371-465a-b50f-822e25ab8682,network=Network(5655565c-f4a9-4971-a164-5eabea306a80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2a7b6cc-83')#033[00m
Jan 31 03:09:33 np0005603623 neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80[258415]: [NOTICE]   (258422) : haproxy version is 2.8.14-c23fe91
Jan 31 03:09:33 np0005603623 neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80[258415]: [NOTICE]   (258422) : path to executable is /usr/sbin/haproxy
Jan 31 03:09:33 np0005603623 neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80[258415]: [WARNING]  (258422) : Exiting Master process...
Jan 31 03:09:33 np0005603623 neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80[258415]: [ALERT]    (258422) : Current worker (258425) exited with code 143 (Terminated)
Jan 31 03:09:33 np0005603623 neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80[258415]: [WARNING]  (258422) : All workers exited. Exiting... (0)
Jan 31 03:09:33 np0005603623 systemd[1]: libpod-768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc.scope: Deactivated successfully.
Jan 31 03:09:33 np0005603623 podman[258741]: 2026-01-31 08:09:33.368929378 +0000 UTC m=+0.200952706 container died 768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:09:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:33.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:33.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:34 np0005603623 nova_compute[226235]: 2026-01-31 08:09:34.133 226239 DEBUG nova.compute.manager [req-05e25473-4f2f-45ee-9191-b28bdfadd65f req-6b31d00a-9de4-4853-ab1b-6e56634a9698 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-vif-unplugged-0c27f6b7-b93a-418a-8953-51a84b461843 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:34 np0005603623 nova_compute[226235]: 2026-01-31 08:09:34.133 226239 DEBUG oslo_concurrency.lockutils [req-05e25473-4f2f-45ee-9191-b28bdfadd65f req-6b31d00a-9de4-4853-ab1b-6e56634a9698 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:34 np0005603623 nova_compute[226235]: 2026-01-31 08:09:34.134 226239 DEBUG oslo_concurrency.lockutils [req-05e25473-4f2f-45ee-9191-b28bdfadd65f req-6b31d00a-9de4-4853-ab1b-6e56634a9698 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:34 np0005603623 nova_compute[226235]: 2026-01-31 08:09:34.134 226239 DEBUG oslo_concurrency.lockutils [req-05e25473-4f2f-45ee-9191-b28bdfadd65f req-6b31d00a-9de4-4853-ab1b-6e56634a9698 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:34 np0005603623 nova_compute[226235]: 2026-01-31 08:09:34.135 226239 DEBUG nova.compute.manager [req-05e25473-4f2f-45ee-9191-b28bdfadd65f req-6b31d00a-9de4-4853-ab1b-6e56634a9698 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] No waiting events found dispatching network-vif-unplugged-0c27f6b7-b93a-418a-8953-51a84b461843 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:09:34 np0005603623 nova_compute[226235]: 2026-01-31 08:09:34.135 226239 DEBUG nova.compute.manager [req-05e25473-4f2f-45ee-9191-b28bdfadd65f req-6b31d00a-9de4-4853-ab1b-6e56634a9698 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-vif-unplugged-0c27f6b7-b93a-418a-8953-51a84b461843 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:09:34 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc-userdata-shm.mount: Deactivated successfully.
Jan 31 03:09:34 np0005603623 systemd[1]: var-lib-containers-storage-overlay-e207f95984f9be550007de936f3f2325b359adda34dd9e474a783d7ded3c6eb3-merged.mount: Deactivated successfully.
Jan 31 03:09:34 np0005603623 podman[258741]: 2026-01-31 08:09:34.620384218 +0000 UTC m=+1.452407566 container cleanup 768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 03:09:34 np0005603623 systemd[1]: libpod-conmon-768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc.scope: Deactivated successfully.
Jan 31 03:09:35 np0005603623 podman[258793]: 2026-01-31 08:09:35.116165618 +0000 UTC m=+0.480017944 container remove 768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:09:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:35.120 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[df298140-7043-44be-b86a-f8a31d29a4d2]: (4, ('Sat Jan 31 08:09:33 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80 (768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc)\n768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc\nSat Jan 31 08:09:34 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80 (768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc)\n768937ded3183340ae91a06c1be550e939ae04e552ef125082b38d800748dfcc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:35.122 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d1edcdfa-b683-46fc-87d6-012c4cf338bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:35.122 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5655565c-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.124 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:35 np0005603623 kernel: tap5655565c-f0: left promiscuous mode
Jan 31 03:09:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:35.128 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bba4219e-204e-4955-a73d-ec620f0536ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.130 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:35.149 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d0324304-7b2a-4563-a9df-6d503f64acb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:35.150 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e480208e-e503-4974-86c8-5a4c1529fdd5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:35.164 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6463126f-4515-4f5a-9a60-1ae10e1b4ae9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609631, 'reachable_time': 31361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258809, 'error': None, 'target': 'ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:35.167 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5655565c-f4a9-4971-a164-5eabea306a80 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:09:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:35.168 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[d637c600-5f66-409c-a354-c5821810f416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:35 np0005603623 systemd[1]: run-netns-ovnmeta\x2d5655565c\x2df4a9\x2d4971\x2da164\x2d5eabea306a80.mount: Deactivated successfully.
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.205 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Updating instance_info_cache with network_info: [{"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "address": "fa:16:3e:18:77:6a", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2a7b6cc-83", "ovs_interfaceid": "c2a7b6cc-8371-465a-b50f-822e25ab8682", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:35.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.554 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-1c071b37-4498-458c-96b0-3e1a15ca470c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.555 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.557 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.557 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.557 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.558 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.558 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.558 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.558 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.592 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.593 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.593 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.594 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:09:35 np0005603623 nova_compute[226235]: 2026-01-31 08:09:35.594 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:35.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:09:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3536021923' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.048 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.123 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000004e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.124 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000004e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.247 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.248 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4678MB free_disk=20.94390869140625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.248 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.249 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.281 226239 DEBUG nova.compute.manager [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-vif-plugged-0c27f6b7-b93a-418a-8953-51a84b461843 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.281 226239 DEBUG oslo_concurrency.lockutils [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.281 226239 DEBUG oslo_concurrency.lockutils [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.282 226239 DEBUG oslo_concurrency.lockutils [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.282 226239 DEBUG nova.compute.manager [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] No waiting events found dispatching network-vif-plugged-0c27f6b7-b93a-418a-8953-51a84b461843 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.282 226239 WARNING nova.compute.manager [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received unexpected event network-vif-plugged-0c27f6b7-b93a-418a-8953-51a84b461843 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.282 226239 DEBUG nova.compute.manager [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-vif-unplugged-c2a7b6cc-8371-465a-b50f-822e25ab8682 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.282 226239 DEBUG oslo_concurrency.lockutils [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.283 226239 DEBUG oslo_concurrency.lockutils [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.283 226239 DEBUG oslo_concurrency.lockutils [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.283 226239 DEBUG nova.compute.manager [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] No waiting events found dispatching network-vif-unplugged-c2a7b6cc-8371-465a-b50f-822e25ab8682 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.283 226239 DEBUG nova.compute.manager [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-vif-unplugged-c2a7b6cc-8371-465a-b50f-822e25ab8682 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.284 226239 DEBUG nova.compute.manager [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-vif-plugged-c2a7b6cc-8371-465a-b50f-822e25ab8682 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.284 226239 DEBUG oslo_concurrency.lockutils [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.284 226239 DEBUG oslo_concurrency.lockutils [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.284 226239 DEBUG oslo_concurrency.lockutils [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.284 226239 DEBUG nova.compute.manager [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] No waiting events found dispatching network-vif-plugged-c2a7b6cc-8371-465a-b50f-822e25ab8682 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.285 226239 WARNING nova.compute.manager [req-ecb85983-7b89-4b46-82c4-3d582b2d1d23 req-2472a9ce-0da7-4945-915f-067888e457c5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received unexpected event network-vif-plugged-c2a7b6cc-8371-465a-b50f-822e25ab8682 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.348 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 1c071b37-4498-458c-96b0-3e1a15ca470c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.349 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.349 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.396 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:09:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/832907583' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.839 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.843 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.859 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.887 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:09:36 np0005603623 nova_compute[226235]: 2026-01-31 08:09:36.887 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:37.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:37 np0005603623 nova_compute[226235]: 2026-01-31 08:09:37.484 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:37 np0005603623 nova_compute[226235]: 2026-01-31 08:09:37.485 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:37.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:38 np0005603623 nova_compute[226235]: 2026-01-31 08:09:38.346 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:38 np0005603623 nova_compute[226235]: 2026-01-31 08:09:38.351 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:39.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:39.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:40 np0005603623 nova_compute[226235]: 2026-01-31 08:09:40.408 226239 INFO nova.virt.libvirt.driver [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Deleting instance files /var/lib/nova/instances/1c071b37-4498-458c-96b0-3e1a15ca470c_del#033[00m
Jan 31 03:09:40 np0005603623 nova_compute[226235]: 2026-01-31 08:09:40.409 226239 INFO nova.virt.libvirt.driver [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Deletion of /var/lib/nova/instances/1c071b37-4498-458c-96b0-3e1a15ca470c_del complete#033[00m
Jan 31 03:09:40 np0005603623 nova_compute[226235]: 2026-01-31 08:09:40.475 226239 INFO nova.compute.manager [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Took 8.18 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:09:40 np0005603623 nova_compute[226235]: 2026-01-31 08:09:40.475 226239 DEBUG oslo.service.loopingcall [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:09:40 np0005603623 nova_compute[226235]: 2026-01-31 08:09:40.475 226239 DEBUG nova.compute.manager [-] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:09:40 np0005603623 nova_compute[226235]: 2026-01-31 08:09:40.476 226239 DEBUG nova.network.neutron [-] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:09:41 np0005603623 nova_compute[226235]: 2026-01-31 08:09:41.367 226239 DEBUG nova.compute.manager [req-2d3316ef-ddda-4f42-a1c7-591ae90a4dd3 req-c8a1566c-6d71-4b3b-8f0f-366841c467e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-vif-deleted-c2a7b6cc-8371-465a-b50f-822e25ab8682 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:41 np0005603623 nova_compute[226235]: 2026-01-31 08:09:41.367 226239 INFO nova.compute.manager [req-2d3316ef-ddda-4f42-a1c7-591ae90a4dd3 req-c8a1566c-6d71-4b3b-8f0f-366841c467e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Neutron deleted interface c2a7b6cc-8371-465a-b50f-822e25ab8682; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:09:41 np0005603623 nova_compute[226235]: 2026-01-31 08:09:41.367 226239 DEBUG nova.network.neutron [req-2d3316ef-ddda-4f42-a1c7-591ae90a4dd3 req-c8a1566c-6d71-4b3b-8f0f-366841c467e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Updating instance_info_cache with network_info: [{"id": "0c27f6b7-b93a-418a-8953-51a84b461843", "address": "fa:16:3e:27:b2:15", "network": {"id": "5655565c-f4a9-4971-a164-5eabea306a80", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1771236210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfc138d4fe084f1e8bfe6a94be18cc23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c27f6b7-b9", "ovs_interfaceid": "0c27f6b7-b93a-418a-8953-51a84b461843", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:41.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:41 np0005603623 nova_compute[226235]: 2026-01-31 08:09:41.388 226239 DEBUG nova.compute.manager [req-2d3316ef-ddda-4f42-a1c7-591ae90a4dd3 req-c8a1566c-6d71-4b3b-8f0f-366841c467e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Detach interface failed, port_id=c2a7b6cc-8371-465a-b50f-822e25ab8682, reason: Instance 1c071b37-4498-458c-96b0-3e1a15ca470c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:09:41 np0005603623 nova_compute[226235]: 2026-01-31 08:09:41.685 226239 DEBUG nova.network.neutron [-] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:41 np0005603623 nova_compute[226235]: 2026-01-31 08:09:41.703 226239 INFO nova.compute.manager [-] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Took 1.23 seconds to deallocate network for instance.#033[00m
Jan 31 03:09:41 np0005603623 nova_compute[226235]: 2026-01-31 08:09:41.805 226239 DEBUG oslo_concurrency.lockutils [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:41 np0005603623 nova_compute[226235]: 2026-01-31 08:09:41.805 226239 DEBUG oslo_concurrency.lockutils [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:41.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:41 np0005603623 nova_compute[226235]: 2026-01-31 08:09:41.884 226239 DEBUG oslo_concurrency.processutils [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:09:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4284206667' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:09:42 np0005603623 nova_compute[226235]: 2026-01-31 08:09:42.278 226239 DEBUG oslo_concurrency.processutils [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.394s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:42 np0005603623 nova_compute[226235]: 2026-01-31 08:09:42.284 226239 DEBUG nova.compute.provider_tree [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:09:42 np0005603623 nova_compute[226235]: 2026-01-31 08:09:42.317 226239 DEBUG nova.scheduler.client.report [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:09:42 np0005603623 nova_compute[226235]: 2026-01-31 08:09:42.388 226239 DEBUG oslo_concurrency.lockutils [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:42 np0005603623 nova_compute[226235]: 2026-01-31 08:09:42.436 226239 INFO nova.scheduler.client.report [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Deleted allocations for instance 1c071b37-4498-458c-96b0-3e1a15ca470c#033[00m
Jan 31 03:09:42 np0005603623 nova_compute[226235]: 2026-01-31 08:09:42.535 226239 DEBUG oslo_concurrency.lockutils [None req-7430e75e-db7f-4883-ac11-bc207f33a207 9b363a9a11084c998b823d2941f27c97 dfc138d4fe084f1e8bfe6a94be18cc23 - - default default] Lock "1c071b37-4498-458c-96b0-3e1a15ca470c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:43 np0005603623 nova_compute[226235]: 2026-01-31 08:09:43.349 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:43 np0005603623 nova_compute[226235]: 2026-01-31 08:09:43.352 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:43.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:43 np0005603623 nova_compute[226235]: 2026-01-31 08:09:43.483 226239 DEBUG nova.compute.manager [req-3f7b7ef2-d656-4409-bb44-c33c501d650c req-0f71a8e2-e0ce-4f3e-a052-22bd7272ba69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Received event network-vif-deleted-0c27f6b7-b93a-418a-8953-51a84b461843 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:43.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:45.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:45.773 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:09:45 np0005603623 nova_compute[226235]: 2026-01-31 08:09:45.773 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:45.774 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:09:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:45.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:47 np0005603623 nova_compute[226235]: 2026-01-31 08:09:47.190 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:47.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:47.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.150 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846973.1483538, 1c071b37-4498-458c-96b0-3e1a15ca470c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.150 226239 INFO nova.compute.manager [-] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.173 226239 DEBUG nova.compute.manager [None req-a2a48f97-2a42-43ef-a85e-bd063560e0b9 - - - - - -] [instance: 1c071b37-4498-458c-96b0-3e1a15ca470c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:09:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:09:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.351 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.353 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.402 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Acquiring lock "c79a42ab-abd7-41ad-8fb4-784beb525937" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.403 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.422 226239 DEBUG nova.compute.manager [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.533 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.534 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.541 226239 DEBUG nova.virt.hardware [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.541 226239 INFO nova.compute.claims [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:09:48 np0005603623 nova_compute[226235]: 2026-01-31 08:09:48.655 226239 DEBUG oslo_concurrency.processutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:09:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2507330025' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.089 226239 DEBUG oslo_concurrency.processutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.094 226239 DEBUG nova.compute.provider_tree [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.111 226239 DEBUG nova.scheduler.client.report [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.135 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.136 226239 DEBUG nova.compute.manager [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.202 226239 DEBUG nova.compute.manager [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.202 226239 DEBUG nova.network.neutron [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.223 226239 INFO nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.241 226239 DEBUG nova.compute.manager [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.292 226239 INFO nova.virt.block_device [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Booting with volume 77364634-2150-4955-8185-1bf60ebd89d8 at /dev/vda#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.375 226239 DEBUG nova.policy [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2c72ac0892c84ca0bf3e2ef74eed4f64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2901e55f200f4622ae841166074ac8f8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:09:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:49.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.565 226239 DEBUG os_brick.utils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.567 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.581 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.582 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[6128d310-2aa2-425b-8030-09c409597bac]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.584 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.593 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.594 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[99909d73-8a80-4a75-a0cc-5e0740c74aa7]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.596 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.604 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.604 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[2dd688b8-f1a5-45e0-8928-ef1b7cca58e0]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.606 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[aac29169-d3ed-4d81-a2e0-4dcbef0b3a70]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.607 226239 DEBUG oslo_concurrency.processutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.630 226239 DEBUG oslo_concurrency.processutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] CMD "nvme version" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.633 226239 DEBUG os_brick.initiator.connectors.lightos [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.633 226239 DEBUG os_brick.initiator.connectors.lightos [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.633 226239 DEBUG os_brick.initiator.connectors.lightos [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.634 226239 DEBUG os_brick.utils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] <== get_connector_properties: return (67ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:09:49 np0005603623 nova_compute[226235]: 2026-01-31 08:09:49.634 226239 DEBUG nova.virt.block_device [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Updating existing volume attachment record: fbca7a76-0bf4-4bdc-8716-d63f4ec0ab61 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:09:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:49.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:50 np0005603623 nova_compute[226235]: 2026-01-31 08:09:50.588 226239 DEBUG nova.network.neutron [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Successfully created port: 455d9ab3-8df2-4cb1-8df9-087e0c5db7af _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:09:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:50.776 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:50 np0005603623 podman[259016]: 2026-01-31 08:09:50.966695166 +0000 UTC m=+0.049074379 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Jan 31 03:09:50 np0005603623 podman[259017]: 2026-01-31 08:09:50.99030415 +0000 UTC m=+0.072943071 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:09:51 np0005603623 nova_compute[226235]: 2026-01-31 08:09:51.045 226239 DEBUG nova.compute.manager [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:09:51 np0005603623 nova_compute[226235]: 2026-01-31 08:09:51.046 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:09:51 np0005603623 nova_compute[226235]: 2026-01-31 08:09:51.047 226239 INFO nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Creating image(s)#033[00m
Jan 31 03:09:51 np0005603623 nova_compute[226235]: 2026-01-31 08:09:51.047 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:09:51 np0005603623 nova_compute[226235]: 2026-01-31 08:09:51.047 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Ensure instance console log exists: /var/lib/nova/instances/c79a42ab-abd7-41ad-8fb4-784beb525937/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:09:51 np0005603623 nova_compute[226235]: 2026-01-31 08:09:51.047 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:51 np0005603623 nova_compute[226235]: 2026-01-31 08:09:51.048 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:51 np0005603623 nova_compute[226235]: 2026-01-31 08:09:51.048 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:51.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:51.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:52 np0005603623 nova_compute[226235]: 2026-01-31 08:09:52.487 226239 DEBUG nova.network.neutron [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Successfully updated port: 455d9ab3-8df2-4cb1-8df9-087e0c5db7af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:09:52 np0005603623 nova_compute[226235]: 2026-01-31 08:09:52.505 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Acquiring lock "refresh_cache-c79a42ab-abd7-41ad-8fb4-784beb525937" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:09:52 np0005603623 nova_compute[226235]: 2026-01-31 08:09:52.506 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Acquired lock "refresh_cache-c79a42ab-abd7-41ad-8fb4-784beb525937" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:09:52 np0005603623 nova_compute[226235]: 2026-01-31 08:09:52.506 226239 DEBUG nova.network.neutron [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:09:52 np0005603623 nova_compute[226235]: 2026-01-31 08:09:52.572 226239 DEBUG nova.compute.manager [req-960dbc4e-8999-4d57-b814-bc9d6bd0389c req-7e47efc6-55cb-4fb8-86fd-b1fa83bfd41d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Received event network-changed-455d9ab3-8df2-4cb1-8df9-087e0c5db7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:52 np0005603623 nova_compute[226235]: 2026-01-31 08:09:52.572 226239 DEBUG nova.compute.manager [req-960dbc4e-8999-4d57-b814-bc9d6bd0389c req-7e47efc6-55cb-4fb8-86fd-b1fa83bfd41d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Refreshing instance network info cache due to event network-changed-455d9ab3-8df2-4cb1-8df9-087e0c5db7af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:09:52 np0005603623 nova_compute[226235]: 2026-01-31 08:09:52.572 226239 DEBUG oslo_concurrency.lockutils [req-960dbc4e-8999-4d57-b814-bc9d6bd0389c req-7e47efc6-55cb-4fb8-86fd-b1fa83bfd41d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c79a42ab-abd7-41ad-8fb4-784beb525937" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:09:52 np0005603623 nova_compute[226235]: 2026-01-31 08:09:52.664 226239 DEBUG nova.network.neutron [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:09:53 np0005603623 nova_compute[226235]: 2026-01-31 08:09:53.352 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:53 np0005603623 nova_compute[226235]: 2026-01-31 08:09:53.354 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:53.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:09:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:53.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.111 226239 DEBUG nova.network.neutron [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Updating instance_info_cache with network_info: [{"id": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "address": "fa:16:3e:fb:50:bc", "network": {"id": "427c4522-d251-4a37-9ae4-2bbb61e0b5cc", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1456402372-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2901e55f200f4622ae841166074ac8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap455d9ab3-8d", "ovs_interfaceid": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.156 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Releasing lock "refresh_cache-c79a42ab-abd7-41ad-8fb4-784beb525937" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.157 226239 DEBUG nova.compute.manager [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Instance network_info: |[{"id": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "address": "fa:16:3e:fb:50:bc", "network": {"id": "427c4522-d251-4a37-9ae4-2bbb61e0b5cc", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1456402372-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2901e55f200f4622ae841166074ac8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap455d9ab3-8d", "ovs_interfaceid": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.158 226239 DEBUG oslo_concurrency.lockutils [req-960dbc4e-8999-4d57-b814-bc9d6bd0389c req-7e47efc6-55cb-4fb8-86fd-b1fa83bfd41d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c79a42ab-abd7-41ad-8fb4-784beb525937" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.158 226239 DEBUG nova.network.neutron [req-960dbc4e-8999-4d57-b814-bc9d6bd0389c req-7e47efc6-55cb-4fb8-86fd-b1fa83bfd41d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Refreshing network info cache for port 455d9ab3-8df2-4cb1-8df9-087e0c5db7af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.164 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Start _get_guest_xml network_info=[{"id": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "address": "fa:16:3e:fb:50:bc", "network": {"id": "427c4522-d251-4a37-9ae4-2bbb61e0b5cc", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1456402372-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2901e55f200f4622ae841166074ac8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap455d9ab3-8d", "ovs_interfaceid": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': 'fbca7a76-0bf4-4bdc-8716-d63f4ec0ab61', 'delete_on_termination': True, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-77364634-2150-4955-8185-1bf60ebd89d8', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '77364634-2150-4955-8185-1bf60ebd89d8', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c79a42ab-abd7-41ad-8fb4-784beb525937', 'attached_at': '', 'detached_at': '', 'volume_id': '77364634-2150-4955-8185-1bf60ebd89d8', 'serial': '77364634-2150-4955-8185-1bf60ebd89d8'}, 'volume_type': None}], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.171 226239 WARNING nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.178 226239 DEBUG nova.virt.libvirt.host [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.179 226239 DEBUG nova.virt.libvirt.host [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.184 226239 DEBUG nova.virt.libvirt.host [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.184 226239 DEBUG nova.virt.libvirt.host [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.185 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.186 226239 DEBUG nova.virt.hardware [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.186 226239 DEBUG nova.virt.hardware [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.186 226239 DEBUG nova.virt.hardware [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.186 226239 DEBUG nova.virt.hardware [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.187 226239 DEBUG nova.virt.hardware [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.187 226239 DEBUG nova.virt.hardware [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.187 226239 DEBUG nova.virt.hardware [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.187 226239 DEBUG nova.virt.hardware [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.187 226239 DEBUG nova.virt.hardware [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.187 226239 DEBUG nova.virt.hardware [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.188 226239 DEBUG nova.virt.hardware [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.537 226239 DEBUG nova.storage.rbd_utils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] rbd image c79a42ab-abd7-41ad-8fb4-784beb525937_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:54 np0005603623 nova_compute[226235]: 2026-01-31 08:09:54.544 226239 DEBUG oslo_concurrency.processutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:55.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:09:55 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2674131664' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.457 226239 DEBUG oslo_concurrency.processutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.913s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.496 226239 DEBUG nova.virt.libvirt.vif [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:09:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestBootFromVolume-server-261240903',display_name='tempest-ServersTestBootFromVolume-server-261240903',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestbootfromvolume-server-261240903',id=80,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJkxZV+9MwTlFQb3ey2DGp7lLmYaCdqDUn7U6Rv+fzK/R7sgl5eeJJlwCadg8UCTvxfioYu2gZCvzx1wjklevqFDiRN1Gcijvn6dKc7E5TYHCxulIveFCYDDxeVKj6zt4g==',key_name='tempest-keypair-2048208306',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2901e55f200f4622ae841166074ac8f8',ramdisk_id='',reservation_id='r-eblqe8m0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-ServersTestBootFromVolume-908041274',owner_user_name='tempest-ServersTestBootFromVolume-908041274-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:09:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2c72ac0892c84ca0bf3e2ef74eed4f64',uuid=c79a42ab-abd7-41ad-8fb4-784beb525937,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "address": "fa:16:3e:fb:50:bc", "network": {"id": "427c4522-d251-4a37-9ae4-2bbb61e0b5cc", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1456402372-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2901e55f200f4622ae841166074ac8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap455d9ab3-8d", "ovs_interfaceid": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.496 226239 DEBUG nova.network.os_vif_util [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Converting VIF {"id": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "address": "fa:16:3e:fb:50:bc", "network": {"id": "427c4522-d251-4a37-9ae4-2bbb61e0b5cc", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1456402372-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2901e55f200f4622ae841166074ac8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap455d9ab3-8d", "ovs_interfaceid": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.497 226239 DEBUG nova.network.os_vif_util [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:50:bc,bridge_name='br-int',has_traffic_filtering=True,id=455d9ab3-8df2-4cb1-8df9-087e0c5db7af,network=Network(427c4522-d251-4a37-9ae4-2bbb61e0b5cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap455d9ab3-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.498 226239 DEBUG nova.objects.instance [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lazy-loading 'pci_devices' on Instance uuid c79a42ab-abd7-41ad-8fb4-784beb525937 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.519 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  <uuid>c79a42ab-abd7-41ad-8fb4-784beb525937</uuid>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  <name>instance-00000050</name>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersTestBootFromVolume-server-261240903</nova:name>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:09:54</nova:creationTime>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <nova:user uuid="2c72ac0892c84ca0bf3e2ef74eed4f64">tempest-ServersTestBootFromVolume-908041274-project-member</nova:user>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <nova:project uuid="2901e55f200f4622ae841166074ac8f8">tempest-ServersTestBootFromVolume-908041274</nova:project>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <nova:port uuid="455d9ab3-8df2-4cb1-8df9-087e0c5db7af">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <entry name="serial">c79a42ab-abd7-41ad-8fb4-784beb525937</entry>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <entry name="uuid">c79a42ab-abd7-41ad-8fb4-784beb525937</entry>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/c79a42ab-abd7-41ad-8fb4-784beb525937_disk.config">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-77364634-2150-4955-8185-1bf60ebd89d8">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <serial>77364634-2150-4955-8185-1bf60ebd89d8</serial>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:fb:50:bc"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <target dev="tap455d9ab3-8d"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/c79a42ab-abd7-41ad-8fb4-784beb525937/console.log" append="off"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:09:55 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:09:55 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:09:55 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:09:55 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.521 226239 DEBUG nova.compute.manager [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Preparing to wait for external event network-vif-plugged-455d9ab3-8df2-4cb1-8df9-087e0c5db7af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.521 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Acquiring lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.521 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.522 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.522 226239 DEBUG nova.virt.libvirt.vif [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:09:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestBootFromVolume-server-261240903',display_name='tempest-ServersTestBootFromVolume-server-261240903',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestbootfromvolume-server-261240903',id=80,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJkxZV+9MwTlFQb3ey2DGp7lLmYaCdqDUn7U6Rv+fzK/R7sgl5eeJJlwCadg8UCTvxfioYu2gZCvzx1wjklevqFDiRN1Gcijvn6dKc7E5TYHCxulIveFCYDDxeVKj6zt4g==',key_name='tempest-keypair-2048208306',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2901e55f200f4622ae841166074ac8f8',ramdisk_id='',reservation_id='r-eblqe8m0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-ServersTestBootFromVolume-908041274',owner_user_name='tempest-ServersTestBootFromVolume-908041274-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:09:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2c72ac0892c84ca0bf3e2ef74eed4f64',uuid=c79a42ab-abd7-41ad-8fb4-784beb525937,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "address": "fa:16:3e:fb:50:bc", "network": {"id": "427c4522-d251-4a37-9ae4-2bbb61e0b5cc", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1456402372-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2901e55f200f4622ae841166074ac8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap455d9ab3-8d", "ovs_interfaceid": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.523 226239 DEBUG nova.network.os_vif_util [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Converting VIF {"id": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "address": "fa:16:3e:fb:50:bc", "network": {"id": "427c4522-d251-4a37-9ae4-2bbb61e0b5cc", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1456402372-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2901e55f200f4622ae841166074ac8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap455d9ab3-8d", "ovs_interfaceid": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.524 226239 DEBUG nova.network.os_vif_util [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:50:bc,bridge_name='br-int',has_traffic_filtering=True,id=455d9ab3-8df2-4cb1-8df9-087e0c5db7af,network=Network(427c4522-d251-4a37-9ae4-2bbb61e0b5cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap455d9ab3-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.524 226239 DEBUG os_vif [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:50:bc,bridge_name='br-int',has_traffic_filtering=True,id=455d9ab3-8df2-4cb1-8df9-087e0c5db7af,network=Network(427c4522-d251-4a37-9ae4-2bbb61e0b5cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap455d9ab3-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.525 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.526 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.526 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.530 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.530 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455d9ab3-8d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.531 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap455d9ab3-8d, col_values=(('external_ids', {'iface-id': '455d9ab3-8df2-4cb1-8df9-087e0c5db7af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fb:50:bc', 'vm-uuid': 'c79a42ab-abd7-41ad-8fb4-784beb525937'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.532 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:55 np0005603623 NetworkManager[48970]: <info>  [1769846995.5331] manager: (tap455d9ab3-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.534 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.537 226239 INFO os_vif [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:50:bc,bridge_name='br-int',has_traffic_filtering=True,id=455d9ab3-8df2-4cb1-8df9-087e0c5db7af,network=Network(427c4522-d251-4a37-9ae4-2bbb61e0b5cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap455d9ab3-8d')#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.811 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.811 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.811 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] No VIF found with MAC fa:16:3e:fb:50:bc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.811 226239 INFO nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Using config drive#033[00m
Jan 31 03:09:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:55.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:55 np0005603623 nova_compute[226235]: 2026-01-31 08:09:55.886 226239 DEBUG nova.storage.rbd_utils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] rbd image c79a42ab-abd7-41ad-8fb4-784beb525937_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:56 np0005603623 nova_compute[226235]: 2026-01-31 08:09:56.238 226239 INFO nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Creating config drive at /var/lib/nova/instances/c79a42ab-abd7-41ad-8fb4-784beb525937/disk.config#033[00m
Jan 31 03:09:56 np0005603623 nova_compute[226235]: 2026-01-31 08:09:56.243 226239 DEBUG oslo_concurrency.processutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c79a42ab-abd7-41ad-8fb4-784beb525937/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6hb2sz6_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:56 np0005603623 nova_compute[226235]: 2026-01-31 08:09:56.367 226239 DEBUG oslo_concurrency.processutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c79a42ab-abd7-41ad-8fb4-784beb525937/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6hb2sz6_" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:56 np0005603623 nova_compute[226235]: 2026-01-31 08:09:56.398 226239 DEBUG nova.storage.rbd_utils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] rbd image c79a42ab-abd7-41ad-8fb4-784beb525937_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:56 np0005603623 nova_compute[226235]: 2026-01-31 08:09:56.404 226239 DEBUG oslo_concurrency.processutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c79a42ab-abd7-41ad-8fb4-784beb525937/disk.config c79a42ab-abd7-41ad-8fb4-784beb525937_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:57 np0005603623 nova_compute[226235]: 2026-01-31 08:09:57.111 226239 DEBUG nova.network.neutron [req-960dbc4e-8999-4d57-b814-bc9d6bd0389c req-7e47efc6-55cb-4fb8-86fd-b1fa83bfd41d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Updated VIF entry in instance network info cache for port 455d9ab3-8df2-4cb1-8df9-087e0c5db7af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:09:57 np0005603623 nova_compute[226235]: 2026-01-31 08:09:57.113 226239 DEBUG nova.network.neutron [req-960dbc4e-8999-4d57-b814-bc9d6bd0389c req-7e47efc6-55cb-4fb8-86fd-b1fa83bfd41d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Updating instance_info_cache with network_info: [{"id": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "address": "fa:16:3e:fb:50:bc", "network": {"id": "427c4522-d251-4a37-9ae4-2bbb61e0b5cc", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1456402372-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2901e55f200f4622ae841166074ac8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap455d9ab3-8d", "ovs_interfaceid": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:57 np0005603623 nova_compute[226235]: 2026-01-31 08:09:57.134 226239 DEBUG oslo_concurrency.lockutils [req-960dbc4e-8999-4d57-b814-bc9d6bd0389c req-7e47efc6-55cb-4fb8-86fd-b1fa83bfd41d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c79a42ab-abd7-41ad-8fb4-784beb525937" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:09:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:57.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:57 np0005603623 nova_compute[226235]: 2026-01-31 08:09:57.436 226239 DEBUG oslo_concurrency.processutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c79a42ab-abd7-41ad-8fb4-784beb525937/disk.config c79a42ab-abd7-41ad-8fb4-784beb525937_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:57 np0005603623 nova_compute[226235]: 2026-01-31 08:09:57.437 226239 INFO nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Deleting local config drive /var/lib/nova/instances/c79a42ab-abd7-41ad-8fb4-784beb525937/disk.config because it was imported into RBD.#033[00m
Jan 31 03:09:57 np0005603623 kernel: tap455d9ab3-8d: entered promiscuous mode
Jan 31 03:09:57 np0005603623 NetworkManager[48970]: <info>  [1769846997.4906] manager: (tap455d9ab3-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/138)
Jan 31 03:09:57 np0005603623 nova_compute[226235]: 2026-01-31 08:09:57.490 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:57Z|00300|binding|INFO|Claiming lport 455d9ab3-8df2-4cb1-8df9-087e0c5db7af for this chassis.
Jan 31 03:09:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:57Z|00301|binding|INFO|455d9ab3-8df2-4cb1-8df9-087e0c5db7af: Claiming fa:16:3e:fb:50:bc 10.100.0.9
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.509 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:50:bc 10.100.0.9'], port_security=['fa:16:3e:fb:50:bc 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c79a42ab-abd7-41ad-8fb4-784beb525937', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-427c4522-d251-4a37-9ae4-2bbb61e0b5cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2901e55f200f4622ae841166074ac8f8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '35b28c31-ee34-4322-b9a1-b21a549632b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=431cf621-73da-4bf9-9684-c22cf73e873d, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=455d9ab3-8df2-4cb1-8df9-087e0c5db7af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.510 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 455d9ab3-8df2-4cb1-8df9-087e0c5db7af in datapath 427c4522-d251-4a37-9ae4-2bbb61e0b5cc bound to our chassis#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.511 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 427c4522-d251-4a37-9ae4-2bbb61e0b5cc#033[00m
Jan 31 03:09:57 np0005603623 systemd-udevd[259174]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:09:57 np0005603623 nova_compute[226235]: 2026-01-31 08:09:57.522 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.524 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[601a7d28-2d9a-4eeb-b871-6344a92008fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.525 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap427c4522-d1 in ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:09:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:57Z|00302|binding|INFO|Setting lport 455d9ab3-8df2-4cb1-8df9-087e0c5db7af ovn-installed in OVS
Jan 31 03:09:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:57Z|00303|binding|INFO|Setting lport 455d9ab3-8df2-4cb1-8df9-087e0c5db7af up in Southbound
Jan 31 03:09:57 np0005603623 nova_compute[226235]: 2026-01-31 08:09:57.527 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.527 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap427c4522-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.527 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[696c717a-2dc6-4abd-9876-e0b61d85da5f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.528 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5cda76dc-1b99-45d7-a9f6-31573b418230]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 NetworkManager[48970]: <info>  [1769846997.5354] device (tap455d9ab3-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:09:57 np0005603623 NetworkManager[48970]: <info>  [1769846997.5363] device (tap455d9ab3-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:09:57 np0005603623 systemd-machined[194379]: New machine qemu-34-instance-00000050.
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.539 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[92bf60a9-6d76-4f43-9b59-d99abd02238f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.549 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fb7a72e9-0a9e-4401-aba7-1de648624f3f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 systemd[1]: Started Virtual Machine qemu-34-instance-00000050.
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.578 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9ee41e-8b1e-4091-905e-3f542d80ad4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.583 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb4fb05-fafb-4b0d-895f-558a09cb5ee3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 NetworkManager[48970]: <info>  [1769846997.5846] manager: (tap427c4522-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/139)
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.616 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[9694500a-2b0f-43b0-9a8a-84b63af3a932]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.619 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[326f79e0-dc92-431a-8282-5a473ab0c4e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 NetworkManager[48970]: <info>  [1769846997.6391] device (tap427c4522-d0): carrier: link connected
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.643 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[85f8e95d-5bac-4028-8753-f3e7d95ff512]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.660 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[229fa90a-882e-4ae2-9e13-121b97673f3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap427c4522-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:0d:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613806, 'reachable_time': 24287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259210, 'error': None, 'target': 'ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.674 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ba76bd52-62f1-49a4-b134-74b792fe3ac6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe89:da4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 613806, 'tstamp': 613806}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259211, 'error': None, 'target': 'ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.689 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6c93658e-8087-4b71-b51c-31acc7202000]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap427c4522-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:0d:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613806, 'reachable_time': 24287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259212, 'error': None, 'target': 'ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.709 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0baea39b-70e1-45ff-af3b-9cb594b18651]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.755 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd0134c-4b65-41a5-b90d-fc75518f6d61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.758 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap427c4522-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.759 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.759 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap427c4522-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:57 np0005603623 kernel: tap427c4522-d0: entered promiscuous mode
Jan 31 03:09:57 np0005603623 NetworkManager[48970]: <info>  [1769846997.7639] manager: (tap427c4522-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Jan 31 03:09:57 np0005603623 nova_compute[226235]: 2026-01-31 08:09:57.763 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:57 np0005603623 nova_compute[226235]: 2026-01-31 08:09:57.767 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.774 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap427c4522-d0, col_values=(('external_ids', {'iface-id': '8a27028a-bce4-4bee-8999-cd9b9f0e24ce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:57 np0005603623 nova_compute[226235]: 2026-01-31 08:09:57.775 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:09:57Z|00304|binding|INFO|Releasing lport 8a27028a-bce4-4bee-8999-cd9b9f0e24ce from this chassis (sb_readonly=0)
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.777 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/427c4522-d251-4a37-9ae4-2bbb61e0b5cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/427c4522-d251-4a37-9ae4-2bbb61e0b5cc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.778 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3342d371-6d5f-4c1a-a42e-2b7807fc135e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.778 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-427c4522-d251-4a37-9ae4-2bbb61e0b5cc
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/427c4522-d251-4a37-9ae4-2bbb61e0b5cc.pid.haproxy
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 427c4522-d251-4a37-9ae4-2bbb61e0b5cc
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:09:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:09:57.780 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc', 'env', 'PROCESS_TAG=haproxy-427c4522-d251-4a37-9ae4-2bbb61e0b5cc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/427c4522-d251-4a37-9ae4-2bbb61e0b5cc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:09:57 np0005603623 nova_compute[226235]: 2026-01-31 08:09:57.782 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:09:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:57.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:09:57 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #49. Immutable memtables: 6.
Jan 31 03:09:58 np0005603623 podman[259280]: 2026-01-31 08:09:58.083722865 +0000 UTC m=+0.021383576 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.192 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846998.1918228, c79a42ab-abd7-41ad-8fb4-784beb525937 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.193 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] VM Started (Lifecycle Event)#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.220 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.223 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846998.1929493, c79a42ab-abd7-41ad-8fb4-784beb525937 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.224 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.242 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.245 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.254 226239 DEBUG nova.compute.manager [req-3e76841d-50d9-4473-9efb-3d06c40637f7 req-693c8209-31b8-4f6d-bfc4-c5ae26872e39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Received event network-vif-plugged-455d9ab3-8df2-4cb1-8df9-087e0c5db7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.254 226239 DEBUG oslo_concurrency.lockutils [req-3e76841d-50d9-4473-9efb-3d06c40637f7 req-693c8209-31b8-4f6d-bfc4-c5ae26872e39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.254 226239 DEBUG oslo_concurrency.lockutils [req-3e76841d-50d9-4473-9efb-3d06c40637f7 req-693c8209-31b8-4f6d-bfc4-c5ae26872e39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.255 226239 DEBUG oslo_concurrency.lockutils [req-3e76841d-50d9-4473-9efb-3d06c40637f7 req-693c8209-31b8-4f6d-bfc4-c5ae26872e39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.255 226239 DEBUG nova.compute.manager [req-3e76841d-50d9-4473-9efb-3d06c40637f7 req-693c8209-31b8-4f6d-bfc4-c5ae26872e39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Processing event network-vif-plugged-455d9ab3-8df2-4cb1-8df9-087e0c5db7af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.255 226239 DEBUG nova.compute.manager [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.259 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.262 226239 INFO nova.virt.libvirt.driver [-] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Instance spawned successfully.#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.263 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.270 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.271 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769846998.2586064, c79a42ab-abd7-41ad-8fb4-784beb525937 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.271 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.287 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.288 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.288 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.289 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.289 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.290 226239 DEBUG nova.virt.libvirt.driver [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.295 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.298 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.334 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.353 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.367 226239 INFO nova.compute.manager [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Took 7.32 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.368 226239 DEBUG nova.compute.manager [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.431 226239 INFO nova.compute.manager [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Took 9.93 seconds to build instance.#033[00m
Jan 31 03:09:58 np0005603623 nova_compute[226235]: 2026-01-31 08:09:58.447 226239 DEBUG oslo_concurrency.lockutils [None req-f8776c92-5292-45e7-b13b-8ee10ecbe6f0 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:58 np0005603623 podman[259280]: 2026-01-31 08:09:58.817157956 +0000 UTC m=+0.754818647 container create fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:09:59 np0005603623 systemd[1]: Started libpod-conmon-fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad.scope.
Jan 31 03:09:59 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:09:59 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b9c5e2c33954232f229f3a5eff6dd3074c6a0ea69aff34251ac6117a40b7afa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:09:59 np0005603623 podman[259280]: 2026-01-31 08:09:59.294504483 +0000 UTC m=+1.232165214 container init fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:09:59 np0005603623 podman[259280]: 2026-01-31 08:09:59.298967155 +0000 UTC m=+1.236627856 container start fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 31 03:09:59 np0005603623 neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc[259303]: [NOTICE]   (259307) : New worker (259309) forked
Jan 31 03:09:59 np0005603623 neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc[259303]: [NOTICE]   (259307) : Loading success.
Jan 31 03:09:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:59.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:09:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:09:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:59.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:00 np0005603623 nova_compute[226235]: 2026-01-31 08:10:00.344 226239 DEBUG nova.compute.manager [req-78d53eeb-9f41-4b7c-84af-264b09ba9586 req-eb2bf40e-7e0d-473c-8188-3611512711e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Received event network-vif-plugged-455d9ab3-8df2-4cb1-8df9-087e0c5db7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:00 np0005603623 nova_compute[226235]: 2026-01-31 08:10:00.344 226239 DEBUG oslo_concurrency.lockutils [req-78d53eeb-9f41-4b7c-84af-264b09ba9586 req-eb2bf40e-7e0d-473c-8188-3611512711e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:00 np0005603623 nova_compute[226235]: 2026-01-31 08:10:00.344 226239 DEBUG oslo_concurrency.lockutils [req-78d53eeb-9f41-4b7c-84af-264b09ba9586 req-eb2bf40e-7e0d-473c-8188-3611512711e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:00 np0005603623 nova_compute[226235]: 2026-01-31 08:10:00.344 226239 DEBUG oslo_concurrency.lockutils [req-78d53eeb-9f41-4b7c-84af-264b09ba9586 req-eb2bf40e-7e0d-473c-8188-3611512711e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:00 np0005603623 nova_compute[226235]: 2026-01-31 08:10:00.344 226239 DEBUG nova.compute.manager [req-78d53eeb-9f41-4b7c-84af-264b09ba9586 req-eb2bf40e-7e0d-473c-8188-3611512711e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] No waiting events found dispatching network-vif-plugged-455d9ab3-8df2-4cb1-8df9-087e0c5db7af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:00 np0005603623 nova_compute[226235]: 2026-01-31 08:10:00.344 226239 WARNING nova.compute.manager [req-78d53eeb-9f41-4b7c-84af-264b09ba9586 req-eb2bf40e-7e0d-473c-8188-3611512711e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Received unexpected event network-vif-plugged-455d9ab3-8df2-4cb1-8df9-087e0c5db7af for instance with vm_state active and task_state None.#033[00m
Jan 31 03:10:00 np0005603623 nova_compute[226235]: 2026-01-31 08:10:00.532 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:01 np0005603623 NetworkManager[48970]: <info>  [1769847001.0617] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Jan 31 03:10:01 np0005603623 NetworkManager[48970]: <info>  [1769847001.0623] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Jan 31 03:10:01 np0005603623 nova_compute[226235]: 2026-01-31 08:10:01.061 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:01 np0005603623 nova_compute[226235]: 2026-01-31 08:10:01.102 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:01 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:01Z|00305|binding|INFO|Releasing lport 8a27028a-bce4-4bee-8999-cd9b9f0e24ce from this chassis (sb_readonly=0)
Jan 31 03:10:01 np0005603623 nova_compute[226235]: 2026-01-31 08:10:01.117 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:01.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:01 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 03:10:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:01.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:02 np0005603623 nova_compute[226235]: 2026-01-31 08:10:02.182 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:02 np0005603623 nova_compute[226235]: 2026-01-31 08:10:02.436 226239 DEBUG nova.compute.manager [req-85701d79-58e5-428c-b373-e78f905ae86d req-0b092ed2-30ad-47e6-a6b3-9d2a1892d6e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Received event network-changed-455d9ab3-8df2-4cb1-8df9-087e0c5db7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:02 np0005603623 nova_compute[226235]: 2026-01-31 08:10:02.437 226239 DEBUG nova.compute.manager [req-85701d79-58e5-428c-b373-e78f905ae86d req-0b092ed2-30ad-47e6-a6b3-9d2a1892d6e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Refreshing instance network info cache due to event network-changed-455d9ab3-8df2-4cb1-8df9-087e0c5db7af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:10:02 np0005603623 nova_compute[226235]: 2026-01-31 08:10:02.438 226239 DEBUG oslo_concurrency.lockutils [req-85701d79-58e5-428c-b373-e78f905ae86d req-0b092ed2-30ad-47e6-a6b3-9d2a1892d6e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c79a42ab-abd7-41ad-8fb4-784beb525937" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:10:02 np0005603623 nova_compute[226235]: 2026-01-31 08:10:02.438 226239 DEBUG oslo_concurrency.lockutils [req-85701d79-58e5-428c-b373-e78f905ae86d req-0b092ed2-30ad-47e6-a6b3-9d2a1892d6e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c79a42ab-abd7-41ad-8fb4-784beb525937" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:10:02 np0005603623 nova_compute[226235]: 2026-01-31 08:10:02.439 226239 DEBUG nova.network.neutron [req-85701d79-58e5-428c-b373-e78f905ae86d req-0b092ed2-30ad-47e6-a6b3-9d2a1892d6e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Refreshing network info cache for port 455d9ab3-8df2-4cb1-8df9-087e0c5db7af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:10:03 np0005603623 nova_compute[226235]: 2026-01-31 08:10:03.356 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:03.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:03 np0005603623 nova_compute[226235]: 2026-01-31 08:10:03.617 226239 DEBUG nova.network.neutron [req-85701d79-58e5-428c-b373-e78f905ae86d req-0b092ed2-30ad-47e6-a6b3-9d2a1892d6e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Updated VIF entry in instance network info cache for port 455d9ab3-8df2-4cb1-8df9-087e0c5db7af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:10:03 np0005603623 nova_compute[226235]: 2026-01-31 08:10:03.619 226239 DEBUG nova.network.neutron [req-85701d79-58e5-428c-b373-e78f905ae86d req-0b092ed2-30ad-47e6-a6b3-9d2a1892d6e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Updating instance_info_cache with network_info: [{"id": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "address": "fa:16:3e:fb:50:bc", "network": {"id": "427c4522-d251-4a37-9ae4-2bbb61e0b5cc", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1456402372-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2901e55f200f4622ae841166074ac8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap455d9ab3-8d", "ovs_interfaceid": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:10:03 np0005603623 nova_compute[226235]: 2026-01-31 08:10:03.639 226239 DEBUG oslo_concurrency.lockutils [req-85701d79-58e5-428c-b373-e78f905ae86d req-0b092ed2-30ad-47e6-a6b3-9d2a1892d6e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c79a42ab-abd7-41ad-8fb4-784beb525937" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:10:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:03.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:05.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:05 np0005603623 nova_compute[226235]: 2026-01-31 08:10:05.533 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:05.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:07.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:07.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:08 np0005603623 nova_compute[226235]: 2026-01-31 08:10:08.358 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:09.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:09.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:10 np0005603623 nova_compute[226235]: 2026-01-31 08:10:10.567 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e254 e254: 3 total, 3 up, 3 in
Jan 31 03:10:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:11.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:11Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fb:50:bc 10.100.0.9
Jan 31 03:10:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:11Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fb:50:bc 10.100.0.9
Jan 31 03:10:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:11.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:13 np0005603623 nova_compute[226235]: 2026-01-31 08:10:13.360 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:13.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:13.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:15.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:15 np0005603623 nova_compute[226235]: 2026-01-31 08:10:15.569 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:15.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:17.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:17.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:18 np0005603623 nova_compute[226235]: 2026-01-31 08:10:18.362 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:10:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:19.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.602 226239 DEBUG oslo_concurrency.lockutils [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Acquiring lock "c79a42ab-abd7-41ad-8fb4-784beb525937" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.602 226239 DEBUG oslo_concurrency.lockutils [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.602 226239 DEBUG oslo_concurrency.lockutils [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Acquiring lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.603 226239 DEBUG oslo_concurrency.lockutils [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.603 226239 DEBUG oslo_concurrency.lockutils [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.604 226239 INFO nova.compute.manager [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Terminating instance#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.605 226239 DEBUG nova.compute.manager [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:10:19 np0005603623 kernel: tap455d9ab3-8d (unregistering): left promiscuous mode
Jan 31 03:10:19 np0005603623 NetworkManager[48970]: <info>  [1769847019.6932] device (tap455d9ab3-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.693 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:19Z|00306|binding|INFO|Releasing lport 455d9ab3-8df2-4cb1-8df9-087e0c5db7af from this chassis (sb_readonly=0)
Jan 31 03:10:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:19Z|00307|binding|INFO|Setting lport 455d9ab3-8df2-4cb1-8df9-087e0c5db7af down in Southbound
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.702 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:19Z|00308|binding|INFO|Removing iface tap455d9ab3-8d ovn-installed in OVS
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.706 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.710 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.711 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:50:bc 10.100.0.9'], port_security=['fa:16:3e:fb:50:bc 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c79a42ab-abd7-41ad-8fb4-784beb525937', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-427c4522-d251-4a37-9ae4-2bbb61e0b5cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2901e55f200f4622ae841166074ac8f8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '35b28c31-ee34-4322-b9a1-b21a549632b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.174'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=431cf621-73da-4bf9-9684-c22cf73e873d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=455d9ab3-8df2-4cb1-8df9-087e0c5db7af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.712 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 455d9ab3-8df2-4cb1-8df9-087e0c5db7af in datapath 427c4522-d251-4a37-9ae4-2bbb61e0b5cc unbound from our chassis#033[00m
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.714 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 427c4522-d251-4a37-9ae4-2bbb61e0b5cc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.715 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e01f270d-5bac-402d-84f6-5f8d4e933b3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.716 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc namespace which is not needed anymore#033[00m
Jan 31 03:10:19 np0005603623 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000050.scope: Deactivated successfully.
Jan 31 03:10:19 np0005603623 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000050.scope: Consumed 12.843s CPU time.
Jan 31 03:10:19 np0005603623 systemd-machined[194379]: Machine qemu-34-instance-00000050 terminated.
Jan 31 03:10:19 np0005603623 neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc[259303]: [NOTICE]   (259307) : haproxy version is 2.8.14-c23fe91
Jan 31 03:10:19 np0005603623 neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc[259303]: [NOTICE]   (259307) : path to executable is /usr/sbin/haproxy
Jan 31 03:10:19 np0005603623 neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc[259303]: [WARNING]  (259307) : Exiting Master process...
Jan 31 03:10:19 np0005603623 neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc[259303]: [ALERT]    (259307) : Current worker (259309) exited with code 143 (Terminated)
Jan 31 03:10:19 np0005603623 neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc[259303]: [WARNING]  (259307) : All workers exited. Exiting... (0)
Jan 31 03:10:19 np0005603623 systemd[1]: libpod-fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad.scope: Deactivated successfully.
Jan 31 03:10:19 np0005603623 conmon[259303]: conmon fa6108bb43711aba287d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad.scope/container/memory.events
Jan 31 03:10:19 np0005603623 kernel: tap455d9ab3-8d: entered promiscuous mode
Jan 31 03:10:19 np0005603623 kernel: tap455d9ab3-8d (unregistering): left promiscuous mode
Jan 31 03:10:19 np0005603623 podman[259400]: 2026-01-31 08:10:19.821906023 +0000 UTC m=+0.040899741 container died fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 03:10:19 np0005603623 NetworkManager[48970]: <info>  [1769847019.8231] manager: (tap455d9ab3-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/143)
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.828 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.844 226239 INFO nova.virt.libvirt.driver [-] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Instance destroyed successfully.#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.844 226239 DEBUG nova.objects.instance [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lazy-loading 'resources' on Instance uuid c79a42ab-abd7-41ad-8fb4-784beb525937 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:19 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad-userdata-shm.mount: Deactivated successfully.
Jan 31 03:10:19 np0005603623 systemd[1]: var-lib-containers-storage-overlay-4b9c5e2c33954232f229f3a5eff6dd3074c6a0ea69aff34251ac6117a40b7afa-merged.mount: Deactivated successfully.
Jan 31 03:10:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:19.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:19 np0005603623 podman[259400]: 2026-01-31 08:10:19.865613431 +0000 UTC m=+0.084607149 container cleanup fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:10:19 np0005603623 systemd[1]: libpod-conmon-fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad.scope: Deactivated successfully.
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.896 226239 DEBUG nova.virt.libvirt.vif [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:09:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestBootFromVolume-server-261240903',display_name='tempest-ServersTestBootFromVolume-server-261240903',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestbootfromvolume-server-261240903',id=80,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJkxZV+9MwTlFQb3ey2DGp7lLmYaCdqDUn7U6Rv+fzK/R7sgl5eeJJlwCadg8UCTvxfioYu2gZCvzx1wjklevqFDiRN1Gcijvn6dKc7E5TYHCxulIveFCYDDxeVKj6zt4g==',key_name='tempest-keypair-2048208306',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:09:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2901e55f200f4622ae841166074ac8f8',ramdisk_id='',reservation_id='r-eblqe8m0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-ServersTestBootFromVolume-908041274',owner_user_name='tempest-ServersTestBootFromVolume-908041274-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:09:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2c72ac0892c84ca0bf3e2ef74eed4f64',uuid=c79a42ab-abd7-41ad-8fb4-784beb525937,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "address": "fa:16:3e:fb:50:bc", "network": {"id": "427c4522-d251-4a37-9ae4-2bbb61e0b5cc", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1456402372-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2901e55f200f4622ae841166074ac8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap455d9ab3-8d", "ovs_interfaceid": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.897 226239 DEBUG nova.network.os_vif_util [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Converting VIF {"id": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "address": "fa:16:3e:fb:50:bc", "network": {"id": "427c4522-d251-4a37-9ae4-2bbb61e0b5cc", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1456402372-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2901e55f200f4622ae841166074ac8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap455d9ab3-8d", "ovs_interfaceid": "455d9ab3-8df2-4cb1-8df9-087e0c5db7af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.898 226239 DEBUG nova.network.os_vif_util [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fb:50:bc,bridge_name='br-int',has_traffic_filtering=True,id=455d9ab3-8df2-4cb1-8df9-087e0c5db7af,network=Network(427c4522-d251-4a37-9ae4-2bbb61e0b5cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap455d9ab3-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.899 226239 DEBUG os_vif [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fb:50:bc,bridge_name='br-int',has_traffic_filtering=True,id=455d9ab3-8df2-4cb1-8df9-087e0c5db7af,network=Network(427c4522-d251-4a37-9ae4-2bbb61e0b5cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap455d9ab3-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.901 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.902 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455d9ab3-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.904 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.908 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.911 226239 INFO os_vif [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fb:50:bc,bridge_name='br-int',has_traffic_filtering=True,id=455d9ab3-8df2-4cb1-8df9-087e0c5db7af,network=Network(427c4522-d251-4a37-9ae4-2bbb61e0b5cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap455d9ab3-8d')#033[00m
Jan 31 03:10:19 np0005603623 podman[259438]: 2026-01-31 08:10:19.919652784 +0000 UTC m=+0.038465513 container remove fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.923 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[86933364-465d-4195-ab8f-1d96efe16eeb]: (4, ('Sat Jan 31 08:10:19 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc (fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad)\nfa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad\nSat Jan 31 08:10:19 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc (fa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad)\nfa6108bb43711aba287d42acc6ad3a713a16d1d584f66bb7501ea6b58f275fad\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.925 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[99fa05f6-5a0e-4ba9-823a-36d81d8499f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.926 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap427c4522-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:19 np0005603623 kernel: tap427c4522-d0: left promiscuous mode
Jan 31 03:10:19 np0005603623 nova_compute[226235]: 2026-01-31 08:10:19.937 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.940 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a4922a-f6be-403a-b246-407476cf35f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.955 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2019ab1d-a14d-46c2-8be3-651c9bfb8dbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.956 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9173b807-f88f-48fe-a0c4-339d54560416]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.967 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5c056ea8-b905-4494-a4f3-6b4988dedce3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613800, 'reachable_time': 25899, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259470, 'error': None, 'target': 'ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.969 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-427c4522-d251-4a37-9ae4-2bbb61e0b5cc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:10:19 np0005603623 systemd[1]: run-netns-ovnmeta\x2d427c4522\x2dd251\x2d4a37\x2d9ae4\x2d2bbb61e0b5cc.mount: Deactivated successfully.
Jan 31 03:10:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:19.969 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[0adc9678-a7f0-4652-8b4e-560a035a2451]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:20 np0005603623 nova_compute[226235]: 2026-01-31 08:10:20.090 226239 DEBUG nova.compute.manager [req-71a90f7a-0a51-4514-aff9-dd32712f212c req-cad4e09a-e949-4cb5-a03e-93e5926b85ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Received event network-vif-unplugged-455d9ab3-8df2-4cb1-8df9-087e0c5db7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:20 np0005603623 nova_compute[226235]: 2026-01-31 08:10:20.091 226239 DEBUG oslo_concurrency.lockutils [req-71a90f7a-0a51-4514-aff9-dd32712f212c req-cad4e09a-e949-4cb5-a03e-93e5926b85ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:20 np0005603623 nova_compute[226235]: 2026-01-31 08:10:20.091 226239 DEBUG oslo_concurrency.lockutils [req-71a90f7a-0a51-4514-aff9-dd32712f212c req-cad4e09a-e949-4cb5-a03e-93e5926b85ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:20 np0005603623 nova_compute[226235]: 2026-01-31 08:10:20.091 226239 DEBUG oslo_concurrency.lockutils [req-71a90f7a-0a51-4514-aff9-dd32712f212c req-cad4e09a-e949-4cb5-a03e-93e5926b85ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:20 np0005603623 nova_compute[226235]: 2026-01-31 08:10:20.091 226239 DEBUG nova.compute.manager [req-71a90f7a-0a51-4514-aff9-dd32712f212c req-cad4e09a-e949-4cb5-a03e-93e5926b85ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] No waiting events found dispatching network-vif-unplugged-455d9ab3-8df2-4cb1-8df9-087e0c5db7af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:20 np0005603623 nova_compute[226235]: 2026-01-31 08:10:20.092 226239 DEBUG nova.compute.manager [req-71a90f7a-0a51-4514-aff9-dd32712f212c req-cad4e09a-e949-4cb5-a03e-93e5926b85ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Received event network-vif-unplugged-455d9ab3-8df2-4cb1-8df9-087e0c5db7af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:10:20 np0005603623 nova_compute[226235]: 2026-01-31 08:10:20.141 226239 INFO nova.virt.libvirt.driver [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Deleting instance files /var/lib/nova/instances/c79a42ab-abd7-41ad-8fb4-784beb525937_del#033[00m
Jan 31 03:10:20 np0005603623 nova_compute[226235]: 2026-01-31 08:10:20.142 226239 INFO nova.virt.libvirt.driver [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Deletion of /var/lib/nova/instances/c79a42ab-abd7-41ad-8fb4-784beb525937_del complete#033[00m
Jan 31 03:10:20 np0005603623 nova_compute[226235]: 2026-01-31 08:10:20.217 226239 INFO nova.compute.manager [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Took 0.61 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:10:20 np0005603623 nova_compute[226235]: 2026-01-31 08:10:20.219 226239 DEBUG oslo.service.loopingcall [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:10:20 np0005603623 nova_compute[226235]: 2026-01-31 08:10:20.219 226239 DEBUG nova.compute.manager [-] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:10:20 np0005603623 nova_compute[226235]: 2026-01-31 08:10:20.219 226239 DEBUG nova.network.neutron [-] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:10:21 np0005603623 podman[259497]: 2026-01-31 08:10:21.281398742 +0000 UTC m=+0.052967591 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:10:21 np0005603623 podman[259498]: 2026-01-31 08:10:21.301696782 +0000 UTC m=+0.070393211 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Jan 31 03:10:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:10:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:21.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:10:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:21.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:21 np0005603623 nova_compute[226235]: 2026-01-31 08:10:21.966 226239 DEBUG nova.network.neutron [-] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.154 226239 INFO nova.compute.manager [-] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Took 1.93 seconds to deallocate network for instance.#033[00m
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.177 226239 DEBUG nova.compute.manager [req-b0925d7d-7380-459c-88e3-a45bbc8f1e84 req-ef2c4ac2-6f4f-4a58-a632-3d8c68e10afd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Received event network-vif-plugged-455d9ab3-8df2-4cb1-8df9-087e0c5db7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.178 226239 DEBUG oslo_concurrency.lockutils [req-b0925d7d-7380-459c-88e3-a45bbc8f1e84 req-ef2c4ac2-6f4f-4a58-a632-3d8c68e10afd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.178 226239 DEBUG oslo_concurrency.lockutils [req-b0925d7d-7380-459c-88e3-a45bbc8f1e84 req-ef2c4ac2-6f4f-4a58-a632-3d8c68e10afd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.178 226239 DEBUG oslo_concurrency.lockutils [req-b0925d7d-7380-459c-88e3-a45bbc8f1e84 req-ef2c4ac2-6f4f-4a58-a632-3d8c68e10afd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.178 226239 DEBUG nova.compute.manager [req-b0925d7d-7380-459c-88e3-a45bbc8f1e84 req-ef2c4ac2-6f4f-4a58-a632-3d8c68e10afd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] No waiting events found dispatching network-vif-plugged-455d9ab3-8df2-4cb1-8df9-087e0c5db7af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.178 226239 WARNING nova.compute.manager [req-b0925d7d-7380-459c-88e3-a45bbc8f1e84 req-ef2c4ac2-6f4f-4a58-a632-3d8c68e10afd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Received unexpected event network-vif-plugged-455d9ab3-8df2-4cb1-8df9-087e0c5db7af for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.419 226239 INFO nova.compute.manager [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Took 0.26 seconds to detach 1 volumes for instance.#033[00m
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.421 226239 DEBUG nova.compute.manager [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Deleting volume: 77364634-2150-4955-8185-1bf60ebd89d8 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.449 226239 DEBUG nova.compute.manager [req-d42241ba-dde1-45ef-825f-fd27cdc3b16c req-d53e6a30-4e7b-4bb0-a6d4-a4cee4dfdb26 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Received event network-vif-deleted-455d9ab3-8df2-4cb1-8df9-087e0c5db7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.736 226239 DEBUG oslo_concurrency.lockutils [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.737 226239 DEBUG oslo_concurrency.lockutils [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e255 e255: 3 total, 3 up, 3 in
Jan 31 03:10:22 np0005603623 nova_compute[226235]: 2026-01-31 08:10:22.805 226239 DEBUG oslo_concurrency.processutils [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:10:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4210870188' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:10:23 np0005603623 nova_compute[226235]: 2026-01-31 08:10:23.206 226239 DEBUG oslo_concurrency.processutils [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:23 np0005603623 nova_compute[226235]: 2026-01-31 08:10:23.211 226239 DEBUG nova.compute.provider_tree [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:10:23 np0005603623 nova_compute[226235]: 2026-01-31 08:10:23.238 226239 DEBUG nova.scheduler.client.report [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:10:23 np0005603623 nova_compute[226235]: 2026-01-31 08:10:23.265 226239 DEBUG oslo_concurrency.lockutils [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.528s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:23 np0005603623 nova_compute[226235]: 2026-01-31 08:10:23.294 226239 INFO nova.scheduler.client.report [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Deleted allocations for instance c79a42ab-abd7-41ad-8fb4-784beb525937#033[00m
Jan 31 03:10:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:10:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1830728136' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:10:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:10:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1830728136' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:10:23 np0005603623 nova_compute[226235]: 2026-01-31 08:10:23.365 226239 DEBUG oslo_concurrency.lockutils [None req-f4198478-0d93-40df-aae9-9655d1c8cb7f 2c72ac0892c84ca0bf3e2ef74eed4f64 2901e55f200f4622ae841166074ac8f8 - - default default] Lock "c79a42ab-abd7-41ad-8fb4-784beb525937" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.763s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:23 np0005603623 nova_compute[226235]: 2026-01-31 08:10:23.366 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:23.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:23.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:24 np0005603623 nova_compute[226235]: 2026-01-31 08:10:24.905 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:25.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:25.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:25 np0005603623 nova_compute[226235]: 2026-01-31 08:10:25.971 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:27 np0005603623 nova_compute[226235]: 2026-01-31 08:10:27.156 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:27 np0005603623 nova_compute[226235]: 2026-01-31 08:10:27.157 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:10:27 np0005603623 nova_compute[226235]: 2026-01-31 08:10:27.157 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:10:27 np0005603623 nova_compute[226235]: 2026-01-31 08:10:27.174 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:10:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:27.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:27.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:28 np0005603623 nova_compute[226235]: 2026-01-31 08:10:28.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:28 np0005603623 nova_compute[226235]: 2026-01-31 08:10:28.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:28 np0005603623 nova_compute[226235]: 2026-01-31 08:10:28.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:10:28 np0005603623 nova_compute[226235]: 2026-01-31 08:10:28.366 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:28 np0005603623 nova_compute[226235]: 2026-01-31 08:10:28.547 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:28 np0005603623 nova_compute[226235]: 2026-01-31 08:10:28.547 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:28 np0005603623 nova_compute[226235]: 2026-01-31 08:10:28.571 226239 DEBUG nova.compute.manager [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:10:28 np0005603623 nova_compute[226235]: 2026-01-31 08:10:28.637 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:28 np0005603623 nova_compute[226235]: 2026-01-31 08:10:28.637 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:28 np0005603623 nova_compute[226235]: 2026-01-31 08:10:28.643 226239 DEBUG nova.virt.hardware [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:10:28 np0005603623 nova_compute[226235]: 2026-01-31 08:10:28.644 226239 INFO nova.compute.claims [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:10:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:28 np0005603623 nova_compute[226235]: 2026-01-31 08:10:28.930 226239 DEBUG nova.scheduler.client.report [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.022 226239 DEBUG nova.scheduler.client.report [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.023 226239 DEBUG nova.compute.provider_tree [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.039 226239 DEBUG nova.scheduler.client.report [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.064 226239 DEBUG nova.scheduler.client.report [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.101 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.294 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.358 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:29.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:10:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3849060992' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.543 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.548 226239 DEBUG nova.compute.provider_tree [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.575 226239 DEBUG nova.scheduler.client.report [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.619 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.981s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.619 226239 DEBUG nova.compute.manager [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.748 226239 DEBUG nova.compute.manager [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.749 226239 DEBUG nova.network.neutron [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.851 226239 INFO nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:10:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:29.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.883 226239 DEBUG nova.compute.manager [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:10:29 np0005603623 nova_compute[226235]: 2026-01-31 08:10:29.929 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.063 226239 DEBUG nova.compute.manager [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.064 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.064 226239 INFO nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Creating image(s)#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.086 226239 DEBUG nova.storage.rbd_utils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:10:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:30.104 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:30.104 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:30.104 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.116 226239 DEBUG nova.storage.rbd_utils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.141 226239 DEBUG nova.storage.rbd_utils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.145 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.164 226239 DEBUG nova.policy [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '12a823bd7c6e4cf492ebf6c1d002a91f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c03fec1b3664105996aa979e226d8f8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.167 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.198 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.199 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.199 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.200 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.223 226239 DEBUG nova.storage.rbd_utils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.226 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.249 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.250 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.250 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.250 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.251 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.547 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.321s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.600 226239 DEBUG nova.storage.rbd_utils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] resizing rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.684 226239 DEBUG nova.objects.instance [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'migration_context' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.704 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.770 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.770 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Ensure instance console log exists: /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.771 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.771 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.771 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.838 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.839 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4597MB free_disk=20.921764373779297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.840 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:30 np0005603623 nova_compute[226235]: 2026-01-31 08:10:30.840 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:31 np0005603623 nova_compute[226235]: 2026-01-31 08:10:31.211 226239 DEBUG nova.network.neutron [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Successfully created port: 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:10:31 np0005603623 nova_compute[226235]: 2026-01-31 08:10:31.214 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:10:31 np0005603623 nova_compute[226235]: 2026-01-31 08:10:31.214 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:10:31 np0005603623 nova_compute[226235]: 2026-01-31 08:10:31.214 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:10:31 np0005603623 nova_compute[226235]: 2026-01-31 08:10:31.266 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:31.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:10:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1725521277' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:10:31 np0005603623 nova_compute[226235]: 2026-01-31 08:10:31.735 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:31 np0005603623 nova_compute[226235]: 2026-01-31 08:10:31.740 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:10:31 np0005603623 nova_compute[226235]: 2026-01-31 08:10:31.781 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:10:31 np0005603623 nova_compute[226235]: 2026-01-31 08:10:31.835 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:10:31 np0005603623 nova_compute[226235]: 2026-01-31 08:10:31.836 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.996s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:31.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:32 np0005603623 nova_compute[226235]: 2026-01-31 08:10:32.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:32 np0005603623 nova_compute[226235]: 2026-01-31 08:10:32.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:32 np0005603623 nova_compute[226235]: 2026-01-31 08:10:32.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:32 np0005603623 nova_compute[226235]: 2026-01-31 08:10:32.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:10:32 np0005603623 nova_compute[226235]: 2026-01-31 08:10:32.256 226239 DEBUG nova.network.neutron [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Successfully updated port: 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:10:32 np0005603623 nova_compute[226235]: 2026-01-31 08:10:32.297 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "refresh_cache-fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:10:32 np0005603623 nova_compute[226235]: 2026-01-31 08:10:32.298 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquired lock "refresh_cache-fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:10:32 np0005603623 nova_compute[226235]: 2026-01-31 08:10:32.298 226239 DEBUG nova.network.neutron [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:10:32 np0005603623 nova_compute[226235]: 2026-01-31 08:10:32.397 226239 DEBUG nova.compute.manager [req-8dfb33d9-7183-4586-883b-23644add440f req-a6ca5612-25c8-4574-b42c-924884b5d049 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received event network-changed-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:32 np0005603623 nova_compute[226235]: 2026-01-31 08:10:32.398 226239 DEBUG nova.compute.manager [req-8dfb33d9-7183-4586-883b-23644add440f req-a6ca5612-25c8-4574-b42c-924884b5d049 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Refreshing instance network info cache due to event network-changed-2f8a1103-332f-40ce-8e2d-20bcf884c0d1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:10:32 np0005603623 nova_compute[226235]: 2026-01-31 08:10:32.398 226239 DEBUG oslo_concurrency.lockutils [req-8dfb33d9-7183-4586-883b-23644add440f req-a6ca5612-25c8-4574-b42c-924884b5d049 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:10:32 np0005603623 nova_compute[226235]: 2026-01-31 08:10:32.583 226239 DEBUG nova.network.neutron [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.369 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:33.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.636 226239 DEBUG nova.network.neutron [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Updating instance_info_cache with network_info: [{"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:10:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 e256: 3 total, 3 up, 3 in
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.724 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Releasing lock "refresh_cache-fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.725 226239 DEBUG nova.compute.manager [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance network_info: |[{"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.725 226239 DEBUG oslo_concurrency.lockutils [req-8dfb33d9-7183-4586-883b-23644add440f req-a6ca5612-25c8-4574-b42c-924884b5d049 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.726 226239 DEBUG nova.network.neutron [req-8dfb33d9-7183-4586-883b-23644add440f req-a6ca5612-25c8-4574-b42c-924884b5d049 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Refreshing network info cache for port 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.729 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Start _get_guest_xml network_info=[{"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.734 226239 WARNING nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.737 226239 DEBUG nova.virt.libvirt.host [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.738 226239 DEBUG nova.virt.libvirt.host [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.740 226239 DEBUG nova.virt.libvirt.host [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.741 226239 DEBUG nova.virt.libvirt.host [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.742 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.742 226239 DEBUG nova.virt.hardware [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.742 226239 DEBUG nova.virt.hardware [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.743 226239 DEBUG nova.virt.hardware [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.743 226239 DEBUG nova.virt.hardware [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.743 226239 DEBUG nova.virt.hardware [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.743 226239 DEBUG nova.virt.hardware [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.744 226239 DEBUG nova.virt.hardware [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.744 226239 DEBUG nova.virt.hardware [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.744 226239 DEBUG nova.virt.hardware [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.744 226239 DEBUG nova.virt.hardware [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.744 226239 DEBUG nova.virt.hardware [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:10:33 np0005603623 nova_compute[226235]: 2026-01-31 08:10:33.750 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:33.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:10:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2362351789' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.153 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.177 226239 DEBUG nova.storage.rbd_utils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.182 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:10:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3050226469' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.599 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.601 226239 DEBUG nova.virt.libvirt.vif [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-577277100',display_name='tempest-tempest.common.compute-instance-577277100',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-577277100',id=83,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-2ke6bok9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActio
nsTestOtherA-1768827668-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:10:29Z,user_data=None,user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.601 226239 DEBUG nova.network.os_vif_util [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.602 226239 DEBUG nova.network.os_vif_util [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.603 226239 DEBUG nova.objects.instance [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'pci_devices' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.635 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  <uuid>fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091</uuid>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  <name>instance-00000053</name>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <nova:name>tempest-tempest.common.compute-instance-577277100</nova:name>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:10:33</nova:creationTime>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <nova:user uuid="12a823bd7c6e4cf492ebf6c1d002a91f">tempest-ServerActionsTestOtherA-1768827668-project-member</nova:user>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <nova:project uuid="9c03fec1b3664105996aa979e226d8f8">tempest-ServerActionsTestOtherA-1768827668</nova:project>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <nova:port uuid="2f8a1103-332f-40ce-8e2d-20bcf884c0d1">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <entry name="serial">fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091</entry>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <entry name="uuid">fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091</entry>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk.config">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:b4:d8:e3"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <target dev="tap2f8a1103-33"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/console.log" append="off"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:10:34 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:10:34 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:10:34 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:10:34 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.636 226239 DEBUG nova.compute.manager [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Preparing to wait for external event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.636 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.636 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.636 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.637 226239 DEBUG nova.virt.libvirt.vif [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-577277100',display_name='tempest-tempest.common.compute-instance-577277100',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-577277100',id=83,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-2ke6bok9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-S
erverActionsTestOtherA-1768827668-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:10:29Z,user_data=None,user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.637 226239 DEBUG nova.network.os_vif_util [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.638 226239 DEBUG nova.network.os_vif_util [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.638 226239 DEBUG os_vif [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.639 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.639 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.640 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.642 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.642 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2f8a1103-33, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.643 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2f8a1103-33, col_values=(('external_ids', {'iface-id': '2f8a1103-332f-40ce-8e2d-20bcf884c0d1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:d8:e3', 'vm-uuid': 'fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.644 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:34 np0005603623 NetworkManager[48970]: <info>  [1769847034.6455] manager: (tap2f8a1103-33): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.646 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.649 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.651 226239 INFO os_vif [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33')#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.811 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.812 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.812 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No VIF found with MAC fa:16:3e:b4:d8:e3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.813 226239 INFO nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Using config drive#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.839 226239 DEBUG nova.storage.rbd_utils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.845 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847019.8376245, c79a42ab-abd7-41ad-8fb4-784beb525937 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.846 226239 INFO nova.compute.manager [-] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:10:34 np0005603623 nova_compute[226235]: 2026-01-31 08:10:34.913 226239 DEBUG nova.compute.manager [None req-9ca6dc31-0730-4e04-8795-154ccc2856a8 - - - - - -] [instance: c79a42ab-abd7-41ad-8fb4-784beb525937] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.185 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.247 226239 DEBUG nova.network.neutron [req-8dfb33d9-7183-4586-883b-23644add440f req-a6ca5612-25c8-4574-b42c-924884b5d049 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Updated VIF entry in instance network info cache for port 2f8a1103-332f-40ce-8e2d-20bcf884c0d1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.248 226239 DEBUG nova.network.neutron [req-8dfb33d9-7183-4586-883b-23644add440f req-a6ca5612-25c8-4574-b42c-924884b5d049 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Updating instance_info_cache with network_info: [{"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.359 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.378 226239 DEBUG oslo_concurrency.lockutils [req-8dfb33d9-7183-4586-883b-23644add440f req-a6ca5612-25c8-4574-b42c-924884b5d049 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:10:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:35.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.543 226239 INFO nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Creating config drive at /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/disk.config#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.548 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpsgz8dck1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.673 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpsgz8dck1" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.702 226239 DEBUG nova.storage.rbd_utils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.706 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/disk.config fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:35.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.928 226239 DEBUG oslo_concurrency.processutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/disk.config fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.221s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.929 226239 INFO nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Deleting local config drive /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/disk.config because it was imported into RBD.#033[00m
Jan 31 03:10:35 np0005603623 kernel: tap2f8a1103-33: entered promiscuous mode
Jan 31 03:10:35 np0005603623 NetworkManager[48970]: <info>  [1769847035.9657] manager: (tap2f8a1103-33): new Tun device (/org/freedesktop/NetworkManager/Devices/145)
Jan 31 03:10:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:35Z|00309|binding|INFO|Claiming lport 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 for this chassis.
Jan 31 03:10:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:35Z|00310|binding|INFO|2f8a1103-332f-40ce-8e2d-20bcf884c0d1: Claiming fa:16:3e:b4:d8:e3 10.100.0.11
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.966 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.975 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.978 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:35 np0005603623 nova_compute[226235]: 2026-01-31 08:10:35.983 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:35 np0005603623 NetworkManager[48970]: <info>  [1769847035.9843] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Jan 31 03:10:35 np0005603623 NetworkManager[48970]: <info>  [1769847035.9848] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Jan 31 03:10:35 np0005603623 systemd-udevd[259963]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:10:35 np0005603623 systemd-machined[194379]: New machine qemu-35-instance-00000053.
Jan 31 03:10:36 np0005603623 NetworkManager[48970]: <info>  [1769847036.0032] device (tap2f8a1103-33): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:10:36 np0005603623 NetworkManager[48970]: <info>  [1769847036.0038] device (tap2f8a1103-33): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:10:36 np0005603623 systemd[1]: Started Virtual Machine qemu-35-instance-00000053.
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.020 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.027 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:d8:e3 10.100.0.11'], port_security=['fa:16:3e:b4:d8:e3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c20bb243-1a39-4929-870f-6661da0e39e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=2f8a1103-332f-40ce-8e2d-20bcf884c0d1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.028 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 bound to our chassis#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.029 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f564452-5f08-4a1c-921e-f2daee9ec936#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.032 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:36 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:36Z|00311|binding|INFO|Setting lport 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 ovn-installed in OVS
Jan 31 03:10:36 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:36Z|00312|binding|INFO|Setting lport 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 up in Southbound
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.036 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.038 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3827c43d-4a12-448e-b0e3-6ea10e68e5c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.038 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1f564452-51 in ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.040 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1f564452-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.040 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5fbb1a40-cab2-47a3-879c-02a0d7a695ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.040 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[09732645-23f8-42b2-bb92-e1610d0aa2c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.048 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[74747acb-b7e4-49d0-b8b5-542aa7d05557]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.058 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6b5e0ef5-8ecb-432f-b9af-8920e8fe59a4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.080 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[498532d7-7ffb-44ee-895a-534e28b3746b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 systemd-udevd[259966]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:10:36 np0005603623 NetworkManager[48970]: <info>  [1769847036.0871] manager: (tap1f564452-50): new Veth device (/org/freedesktop/NetworkManager/Devices/148)
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.086 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[63d28ed6-0098-4aa3-9518-1910b0df5cc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.116 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[81dc99e6-d2a6-447b-a388-8d0cd44c7c5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.120 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c460026b-7941-470b-9d22-6ffa2476a05e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 NetworkManager[48970]: <info>  [1769847036.1472] device (tap1f564452-50): carrier: link connected
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.151 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[2c7c319f-d05c-4c5d-ad24-5679932abdbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.170 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[17ba0432-50f9-433d-9894-897992a320ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 617657, 'reachable_time': 26673, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259997, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.183 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[420b7781-8b2e-4938-98d4-b3044db2ada1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:23e8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 617657, 'tstamp': 617657}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259998, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.196 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[138996d8-654e-49bc-b862-8ad83faae74a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 617657, 'reachable_time': 26673, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259999, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.223 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b312b6-a77d-44f0-992b-ffdcab72dca7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.277 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[64b9a4ed-0615-464b-85e0-3f617911cd77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.279 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.279 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.280 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f564452-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:36 np0005603623 kernel: tap1f564452-50: entered promiscuous mode
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.282 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:36 np0005603623 NetworkManager[48970]: <info>  [1769847036.2850] manager: (tap1f564452-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.284 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f564452-50, col_values=(('external_ids', {'iface-id': '5bb8c1b5-edce-4f6a-8164-58b7d89a3330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:36 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:36Z|00313|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.293 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.294 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.295 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[858fc95e-cae9-4449-8904-50546e0bd8f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.297 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-1f564452-5f08-4a1c-921e-f2daee9ec936
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 1f564452-5f08-4a1c-921e-f2daee9ec936
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:10:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:36.298 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'env', 'PROCESS_TAG=haproxy-1f564452-5f08-4a1c-921e-f2daee9ec936', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1f564452-5f08-4a1c-921e-f2daee9ec936.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.500 226239 DEBUG nova.compute.manager [req-0bc6781d-ad51-4211-9971-8bec07562a14 req-d6793784-a893-4a49-997b-b5d0b1dcd0bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.501 226239 DEBUG oslo_concurrency.lockutils [req-0bc6781d-ad51-4211-9971-8bec07562a14 req-d6793784-a893-4a49-997b-b5d0b1dcd0bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.501 226239 DEBUG oslo_concurrency.lockutils [req-0bc6781d-ad51-4211-9971-8bec07562a14 req-d6793784-a893-4a49-997b-b5d0b1dcd0bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.501 226239 DEBUG oslo_concurrency.lockutils [req-0bc6781d-ad51-4211-9971-8bec07562a14 req-d6793784-a893-4a49-997b-b5d0b1dcd0bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.502 226239 DEBUG nova.compute.manager [req-0bc6781d-ad51-4211-9971-8bec07562a14 req-d6793784-a893-4a49-997b-b5d0b1dcd0bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Processing event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:10:36 np0005603623 podman[260032]: 2026-01-31 08:10:36.678793412 +0000 UTC m=+0.028981135 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:10:36 np0005603623 podman[260032]: 2026-01-31 08:10:36.863389471 +0000 UTC m=+0.213577164 container create 1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:10:36 np0005603623 systemd[1]: Started libpod-conmon-1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa.scope.
Jan 31 03:10:36 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:10:36 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e14d29bcb174c7bbce220bcfd474a419e68a76a185a9d0fe9c5e9f609debb3fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:10:36 np0005603623 podman[260032]: 2026-01-31 08:10:36.940048137 +0000 UTC m=+0.290235850 container init 1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:10:36 np0005603623 podman[260032]: 2026-01-31 08:10:36.945153628 +0000 UTC m=+0.295341321 container start 1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:10:36 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[260089]: [NOTICE]   (260093) : New worker (260096) forked
Jan 31 03:10:36 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[260089]: [NOTICE]   (260093) : Loading success.
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.979 226239 DEBUG nova.compute.manager [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.981 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847036.9789824, fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.981 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] VM Started (Lifecycle Event)#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.984 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.988 226239 INFO nova.virt.libvirt.driver [-] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance spawned successfully.#033[00m
Jan 31 03:10:36 np0005603623 nova_compute[226235]: 2026-01-31 08:10:36.989 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.069 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.076 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.081 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.082 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.082 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.082 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.083 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.083 226239 DEBUG nova.virt.libvirt.driver [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.148 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.149 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847036.9804678, fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.149 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.222 226239 INFO nova.compute.manager [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Took 7.16 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.222 226239 DEBUG nova.compute.manager [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.233 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.237 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847036.9838653, fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.237 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.279 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.283 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.310 226239 INFO nova.compute.manager [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Took 8.70 seconds to build instance.#033[00m
Jan 31 03:10:37 np0005603623 nova_compute[226235]: 2026-01-31 08:10:37.392 226239 DEBUG oslo_concurrency.lockutils [None req-3aa3e2ea-726b-4d9c-9ca7-40f2d9616ed6 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:10:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:37.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:10:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:10:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:37.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:10:38 np0005603623 nova_compute[226235]: 2026-01-31 08:10:38.371 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:38 np0005603623 nova_compute[226235]: 2026-01-31 08:10:38.913 226239 DEBUG nova.compute.manager [req-9831946a-074b-407e-a7c8-3623336d96d1 req-38e1900a-c8bd-4f1c-94ad-f620f1af1a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:38 np0005603623 nova_compute[226235]: 2026-01-31 08:10:38.914 226239 DEBUG oslo_concurrency.lockutils [req-9831946a-074b-407e-a7c8-3623336d96d1 req-38e1900a-c8bd-4f1c-94ad-f620f1af1a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:38 np0005603623 nova_compute[226235]: 2026-01-31 08:10:38.914 226239 DEBUG oslo_concurrency.lockutils [req-9831946a-074b-407e-a7c8-3623336d96d1 req-38e1900a-c8bd-4f1c-94ad-f620f1af1a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:38 np0005603623 nova_compute[226235]: 2026-01-31 08:10:38.914 226239 DEBUG oslo_concurrency.lockutils [req-9831946a-074b-407e-a7c8-3623336d96d1 req-38e1900a-c8bd-4f1c-94ad-f620f1af1a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:38 np0005603623 nova_compute[226235]: 2026-01-31 08:10:38.914 226239 DEBUG nova.compute.manager [req-9831946a-074b-407e-a7c8-3623336d96d1 req-38e1900a-c8bd-4f1c-94ad-f620f1af1a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] No waiting events found dispatching network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:38 np0005603623 nova_compute[226235]: 2026-01-31 08:10:38.915 226239 WARNING nova.compute.manager [req-9831946a-074b-407e-a7c8-3623336d96d1 req-38e1900a-c8bd-4f1c-94ad-f620f1af1a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received unexpected event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:10:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:39.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:39 np0005603623 nova_compute[226235]: 2026-01-31 08:10:39.646 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:39.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:40 np0005603623 nova_compute[226235]: 2026-01-31 08:10:40.025 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:40 np0005603623 nova_compute[226235]: 2026-01-31 08:10:40.182 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:40 np0005603623 nova_compute[226235]: 2026-01-31 08:10:40.182 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:10:40 np0005603623 nova_compute[226235]: 2026-01-31 08:10:40.202 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:10:40 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:10:40 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:10:40 np0005603623 nova_compute[226235]: 2026-01-31 08:10:40.571 226239 DEBUG oslo_concurrency.lockutils [None req-d6ae3944-3c88-41fc-97e0-121b93ed1d8c 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:40 np0005603623 nova_compute[226235]: 2026-01-31 08:10:40.571 226239 DEBUG oslo_concurrency.lockutils [None req-d6ae3944-3c88-41fc-97e0-121b93ed1d8c 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:40 np0005603623 nova_compute[226235]: 2026-01-31 08:10:40.571 226239 DEBUG nova.compute.manager [None req-d6ae3944-3c88-41fc-97e0-121b93ed1d8c 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:40 np0005603623 nova_compute[226235]: 2026-01-31 08:10:40.575 226239 DEBUG nova.compute.manager [None req-d6ae3944-3c88-41fc-97e0-121b93ed1d8c 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 31 03:10:40 np0005603623 nova_compute[226235]: 2026-01-31 08:10:40.575 226239 DEBUG nova.objects.instance [None req-d6ae3944-3c88-41fc-97e0-121b93ed1d8c 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'flavor' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:40 np0005603623 nova_compute[226235]: 2026-01-31 08:10:40.624 226239 DEBUG nova.virt.libvirt.driver [None req-d6ae3944-3c88-41fc-97e0-121b93ed1d8c 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:10:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:41.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:41.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:43 np0005603623 nova_compute[226235]: 2026-01-31 08:10:43.373 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:43.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:43.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:44 np0005603623 nova_compute[226235]: 2026-01-31 08:10:44.650 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:45.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:45.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:10:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2151565659' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:10:46 np0005603623 nova_compute[226235]: 2026-01-31 08:10:46.272 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:47.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:47.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:48 np0005603623 podman[260334]: 2026-01-31 08:10:48.087109901 +0000 UTC m=+0.064547986 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Jan 31 03:10:48 np0005603623 podman[260334]: 2026-01-31 08:10:48.197871352 +0000 UTC m=+0.175309427 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 03:10:48 np0005603623 nova_compute[226235]: 2026-01-31 08:10:48.280 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:48 np0005603623 nova_compute[226235]: 2026-01-31 08:10:48.374 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:48 np0005603623 podman[260486]: 2026-01-31 08:10:48.67838723 +0000 UTC m=+0.047583981 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:10:48 np0005603623 podman[260486]: 2026-01-31 08:10:48.688715866 +0000 UTC m=+0.057912597 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:10:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:48 np0005603623 podman[260552]: 2026-01-31 08:10:48.894292976 +0000 UTC m=+0.053123005 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, distribution-scope=public, name=keepalived, build-date=2023-02-22T09:23:20)
Jan 31 03:10:48 np0005603623 podman[260552]: 2026-01-31 08:10:48.913889864 +0000 UTC m=+0.072719873 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, io.openshift.expose-services=, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, architecture=x86_64, distribution-scope=public, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, io.buildah.version=1.28.2)
Jan 31 03:10:49 np0005603623 nova_compute[226235]: 2026-01-31 08:10:49.260 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:49.260 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:10:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:49.262 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:10:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:49.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:49 np0005603623 nova_compute[226235]: 2026-01-31 08:10:49.652 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:49.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:10:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:10:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:10:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:10:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:10:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:50Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b4:d8:e3 10.100.0.11
Jan 31 03:10:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:50Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b4:d8:e3 10.100.0.11
Jan 31 03:10:50 np0005603623 nova_compute[226235]: 2026-01-31 08:10:50.676 226239 DEBUG nova.virt.libvirt.driver [None req-d6ae3944-3c88-41fc-97e0-121b93ed1d8c 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 03:10:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:51.266 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:51.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:51.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:51 np0005603623 podman[260717]: 2026-01-31 08:10:51.983101688 +0000 UTC m=+0.067845240 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 31 03:10:52 np0005603623 podman[260718]: 2026-01-31 08:10:52.035340475 +0000 UTC m=+0.118349291 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 31 03:10:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:52Z|00314|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:10:52 np0005603623 nova_compute[226235]: 2026-01-31 08:10:52.973 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:53 np0005603623 kernel: tap2f8a1103-33 (unregistering): left promiscuous mode
Jan 31 03:10:53 np0005603623 NetworkManager[48970]: <info>  [1769847053.0880] device (tap2f8a1103-33): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:10:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:53Z|00315|binding|INFO|Releasing lport 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 from this chassis (sb_readonly=0)
Jan 31 03:10:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:53Z|00316|binding|INFO|Setting lport 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 down in Southbound
Jan 31 03:10:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:10:53Z|00317|binding|INFO|Removing iface tap2f8a1103-33 ovn-installed in OVS
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.102 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.119 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.124 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:d8:e3 10.100.0.11'], port_security=['fa:16:3e:b4:d8:e3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c20bb243-1a39-4929-870f-6661da0e39e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=2f8a1103-332f-40ce-8e2d-20bcf884c0d1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.125 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 unbound from our chassis#033[00m
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.127 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1f564452-5f08-4a1c-921e-f2daee9ec936, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.128 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9f7c99dc-4ee0-4a48-8a9a-306f72ae20b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.128 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 namespace which is not needed anymore#033[00m
Jan 31 03:10:53 np0005603623 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000053.scope: Deactivated successfully.
Jan 31 03:10:53 np0005603623 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000053.scope: Consumed 14.156s CPU time.
Jan 31 03:10:53 np0005603623 systemd-machined[194379]: Machine qemu-35-instance-00000053 terminated.
Jan 31 03:10:53 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[260089]: [NOTICE]   (260093) : haproxy version is 2.8.14-c23fe91
Jan 31 03:10:53 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[260089]: [NOTICE]   (260093) : path to executable is /usr/sbin/haproxy
Jan 31 03:10:53 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[260089]: [WARNING]  (260093) : Exiting Master process...
Jan 31 03:10:53 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[260089]: [ALERT]    (260093) : Current worker (260096) exited with code 143 (Terminated)
Jan 31 03:10:53 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[260089]: [WARNING]  (260093) : All workers exited. Exiting... (0)
Jan 31 03:10:53 np0005603623 systemd[1]: libpod-1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa.scope: Deactivated successfully.
Jan 31 03:10:53 np0005603623 podman[260784]: 2026-01-31 08:10:53.259078333 +0000 UTC m=+0.048409318 container died 1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 03:10:53 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa-userdata-shm.mount: Deactivated successfully.
Jan 31 03:10:53 np0005603623 systemd[1]: var-lib-containers-storage-overlay-e14d29bcb174c7bbce220bcfd474a419e68a76a185a9d0fe9c5e9f609debb3fa-merged.mount: Deactivated successfully.
Jan 31 03:10:53 np0005603623 podman[260784]: 2026-01-31 08:10:53.28722117 +0000 UTC m=+0.076552155 container cleanup 1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:10:53 np0005603623 systemd[1]: libpod-conmon-1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa.scope: Deactivated successfully.
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.336 226239 DEBUG nova.compute.manager [req-3533c2ad-6e25-4304-b775-5c43b93a86f0 req-5e82acaf-85fe-4642-9424-8f1cf8edcac7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received event network-vif-unplugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.338 226239 DEBUG oslo_concurrency.lockutils [req-3533c2ad-6e25-4304-b775-5c43b93a86f0 req-5e82acaf-85fe-4642-9424-8f1cf8edcac7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.338 226239 DEBUG oslo_concurrency.lockutils [req-3533c2ad-6e25-4304-b775-5c43b93a86f0 req-5e82acaf-85fe-4642-9424-8f1cf8edcac7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.339 226239 DEBUG oslo_concurrency.lockutils [req-3533c2ad-6e25-4304-b775-5c43b93a86f0 req-5e82acaf-85fe-4642-9424-8f1cf8edcac7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.339 226239 DEBUG nova.compute.manager [req-3533c2ad-6e25-4304-b775-5c43b93a86f0 req-5e82acaf-85fe-4642-9424-8f1cf8edcac7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] No waiting events found dispatching network-vif-unplugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.339 226239 WARNING nova.compute.manager [req-3533c2ad-6e25-4304-b775-5c43b93a86f0 req-5e82acaf-85fe-4642-9424-8f1cf8edcac7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received unexpected event network-vif-unplugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 for instance with vm_state active and task_state powering-off.#033[00m
Jan 31 03:10:53 np0005603623 podman[260813]: 2026-01-31 08:10:53.349936447 +0000 UTC m=+0.046489707 container remove 1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.354 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[544f6db4-9332-4ea9-9e5c-8d4abcb77322]: (4, ('Sat Jan 31 08:10:53 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 (1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa)\n1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa\nSat Jan 31 08:10:53 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 (1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa)\n1070c2a42582b61f09a3d14e93c58cb821cf32cb3362d7f31f531d3246ca73fa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.356 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e6ec8c4d-effa-44a6-aeec-7e2700e5b0cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.357 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.359 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:53 np0005603623 kernel: tap1f564452-50: left promiscuous mode
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.367 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.371 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f4e24ba4-d39d-4a82-b201-fac28de5d96d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.376 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.386 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[da98cd92-e849-4c73-bc1e-83a33f480913]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.388 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1eff4a8f-d727-41ec-82f6-fbdc6e7fee43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.399 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b8f0799b-5ae6-494d-bde2-80156d72c66e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 617650, 'reachable_time': 35493, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260847, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:53 np0005603623 systemd[1]: run-netns-ovnmeta\x2d1f564452\x2d5f08\x2d4a1c\x2d921e\x2df2daee9ec936.mount: Deactivated successfully.
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.403 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:10:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:10:53.404 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[f472edfe-b101-4bc9-8bbd-8324cda0b7e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:53.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.692 226239 INFO nova.virt.libvirt.driver [None req-d6ae3944-3c88-41fc-97e0-121b93ed1d8c 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance shutdown successfully after 13 seconds.#033[00m
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.701 226239 INFO nova.virt.libvirt.driver [-] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance destroyed successfully.#033[00m
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.702 226239 DEBUG nova.objects.instance [None req-d6ae3944-3c88-41fc-97e0-121b93ed1d8c 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'numa_topology' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.717 226239 DEBUG nova.compute.manager [None req-d6ae3944-3c88-41fc-97e0-121b93ed1d8c 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:53 np0005603623 nova_compute[226235]: 2026-01-31 08:10:53.812 226239 DEBUG oslo_concurrency.lockutils [None req-d6ae3944-3c88-41fc-97e0-121b93ed1d8c 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:53.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:54 np0005603623 nova_compute[226235]: 2026-01-31 08:10:54.090 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:54 np0005603623 nova_compute[226235]: 2026-01-31 08:10:54.655 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:54 np0005603623 nova_compute[226235]: 2026-01-31 08:10:54.833 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:55 np0005603623 nova_compute[226235]: 2026-01-31 08:10:55.455 226239 DEBUG nova.compute.manager [req-403f8313-5dcd-4349-a545-ee9e6ecdab01 req-8c5016ab-04a4-408c-bdbf-ba1caf23abf0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:55 np0005603623 nova_compute[226235]: 2026-01-31 08:10:55.455 226239 DEBUG oslo_concurrency.lockutils [req-403f8313-5dcd-4349-a545-ee9e6ecdab01 req-8c5016ab-04a4-408c-bdbf-ba1caf23abf0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:55 np0005603623 nova_compute[226235]: 2026-01-31 08:10:55.456 226239 DEBUG oslo_concurrency.lockutils [req-403f8313-5dcd-4349-a545-ee9e6ecdab01 req-8c5016ab-04a4-408c-bdbf-ba1caf23abf0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:55 np0005603623 nova_compute[226235]: 2026-01-31 08:10:55.456 226239 DEBUG oslo_concurrency.lockutils [req-403f8313-5dcd-4349-a545-ee9e6ecdab01 req-8c5016ab-04a4-408c-bdbf-ba1caf23abf0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:55 np0005603623 nova_compute[226235]: 2026-01-31 08:10:55.456 226239 DEBUG nova.compute.manager [req-403f8313-5dcd-4349-a545-ee9e6ecdab01 req-8c5016ab-04a4-408c-bdbf-ba1caf23abf0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] No waiting events found dispatching network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:55 np0005603623 nova_compute[226235]: 2026-01-31 08:10:55.456 226239 WARNING nova.compute.manager [req-403f8313-5dcd-4349-a545-ee9e6ecdab01 req-8c5016ab-04a4-408c-bdbf-ba1caf23abf0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received unexpected event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 for instance with vm_state stopped and task_state None.#033[00m
Jan 31 03:10:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:55.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:55.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:56 np0005603623 nova_compute[226235]: 2026-01-31 08:10:56.594 226239 INFO nova.compute.manager [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Rebuilding instance#033[00m
Jan 31 03:10:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:10:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.178 226239 DEBUG nova.objects.instance [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.216 226239 DEBUG nova.compute.manager [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.273 226239 DEBUG nova.objects.instance [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'pci_requests' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.453 226239 DEBUG nova.objects.instance [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'pci_devices' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.473 226239 DEBUG nova.objects.instance [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'resources' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:57.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.492 226239 DEBUG nova.objects.instance [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'migration_context' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.514 226239 DEBUG nova.objects.instance [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.517 226239 INFO nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance already shutdown.#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.524 226239 INFO nova.virt.libvirt.driver [-] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance destroyed successfully.#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.528 226239 INFO nova.virt.libvirt.driver [-] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance destroyed successfully.#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.529 226239 DEBUG nova.virt.libvirt.vif [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-577277100',display_name='tempest-tempest.common.compute-instance-577277100',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-577277100',id=83,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:10:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-2ke6bok9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-mem
ber'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:10:56Z,user_data=None,user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.529 226239 DEBUG nova.network.os_vif_util [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.530 226239 DEBUG nova.network.os_vif_util [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.531 226239 DEBUG os_vif [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.533 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.534 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f8a1103-33, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.537 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.541 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:10:57 np0005603623 nova_compute[226235]: 2026-01-31 08:10:57.545 226239 INFO os_vif [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33')#033[00m
Jan 31 03:10:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:57.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.008 226239 INFO nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Deleting instance files /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_del#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.009 226239 INFO nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Deletion of /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_del complete#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.198 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.199 226239 INFO nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Creating image(s)#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.226 226239 DEBUG nova.storage.rbd_utils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.252 226239 DEBUG nova.storage.rbd_utils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.274 226239 DEBUG nova.storage.rbd_utils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.278 226239 DEBUG oslo_concurrency.processutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.328 226239 DEBUG oslo_concurrency.processutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.329 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "365f9823d2619ef09948bdeed685488da63755b5" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.329 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.330 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.352 226239 DEBUG nova.storage.rbd_utils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.357 226239 DEBUG oslo_concurrency.processutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.381 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:58 np0005603623 nova_compute[226235]: 2026-01-31 08:10:58.692 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:10:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:59.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:10:59 np0005603623 nova_compute[226235]: 2026-01-31 08:10:59.741 226239 DEBUG oslo_concurrency.processutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.385s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:59 np0005603623 nova_compute[226235]: 2026-01-31 08:10:59.858 226239 DEBUG nova.storage.rbd_utils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] resizing rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:10:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:10:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:59.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.093 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.094 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Ensure instance console log exists: /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.094 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.095 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.098 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.101 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Start _get_guest_xml network_info=[{"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.106 226239 WARNING nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.113 226239 DEBUG nova.virt.libvirt.host [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.114 226239 DEBUG nova.virt.libvirt.host [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.119 226239 DEBUG nova.virt.libvirt.host [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.120 226239 DEBUG nova.virt.libvirt.host [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.121 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.122 226239 DEBUG nova.virt.hardware [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.122 226239 DEBUG nova.virt.hardware [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.122 226239 DEBUG nova.virt.hardware [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.123 226239 DEBUG nova.virt.hardware [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.123 226239 DEBUG nova.virt.hardware [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.123 226239 DEBUG nova.virt.hardware [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.123 226239 DEBUG nova.virt.hardware [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.124 226239 DEBUG nova.virt.hardware [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.124 226239 DEBUG nova.virt.hardware [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.124 226239 DEBUG nova.virt.hardware [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.124 226239 DEBUG nova.virt.hardware [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.124 226239 DEBUG nova.objects.instance [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.153 226239 DEBUG oslo_concurrency.processutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:11:00 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1877571730' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.584 226239 DEBUG oslo_concurrency.processutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.615 226239 DEBUG nova.storage.rbd_utils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:00 np0005603623 nova_compute[226235]: 2026-01-31 08:11:00.620 226239 DEBUG oslo_concurrency.processutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:11:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3350982961' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:11:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:01.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:01.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.158 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.243 226239 DEBUG oslo_concurrency.processutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.623s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.245 226239 DEBUG nova.virt.libvirt.vif [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-577277100',display_name='tempest-tempest.common.compute-instance-577277100',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-577277100',id=83,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:10:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-2ke6bok9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-
ServerActionsTestOtherA-1768827668-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:10:58Z,user_data=None,user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.246 226239 DEBUG nova.network.os_vif_util [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.247 226239 DEBUG nova.network.os_vif_util [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.251 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  <uuid>fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091</uuid>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  <name>instance-00000053</name>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <nova:name>tempest-tempest.common.compute-instance-577277100</nova:name>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:11:00</nova:creationTime>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <nova:user uuid="12a823bd7c6e4cf492ebf6c1d002a91f">tempest-ServerActionsTestOtherA-1768827668-project-member</nova:user>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <nova:project uuid="9c03fec1b3664105996aa979e226d8f8">tempest-ServerActionsTestOtherA-1768827668</nova:project>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="0864ca59-9877-4e6d-adfc-f0a3204ed8f8"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <nova:port uuid="2f8a1103-332f-40ce-8e2d-20bcf884c0d1">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <entry name="serial">fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091</entry>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <entry name="uuid">fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091</entry>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk.config">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:b4:d8:e3"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <target dev="tap2f8a1103-33"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/console.log" append="off"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:11:02 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:11:02 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:11:02 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:11:02 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.251 226239 DEBUG nova.compute.manager [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Preparing to wait for external event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.252 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.252 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.252 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.253 226239 DEBUG nova.virt.libvirt.vif [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-577277100',display_name='tempest-tempest.common.compute-instance-577277100',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-577277100',id=83,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:10:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-2ke6bok9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-
ServerActionsTestOtherA-1768827668-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:10:58Z,user_data=None,user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.253 226239 DEBUG nova.network.os_vif_util [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.254 226239 DEBUG nova.network.os_vif_util [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.254 226239 DEBUG os_vif [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.255 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.256 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.256 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.259 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.259 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2f8a1103-33, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.260 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2f8a1103-33, col_values=(('external_ids', {'iface-id': '2f8a1103-332f-40ce-8e2d-20bcf884c0d1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:d8:e3', 'vm-uuid': 'fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.294 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:02 np0005603623 NetworkManager[48970]: <info>  [1769847062.2958] manager: (tap2f8a1103-33): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.297 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.300 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.301 226239 INFO os_vif [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33')#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.397 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.398 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.398 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No VIF found with MAC fa:16:3e:b4:d8:e3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.398 226239 INFO nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Using config drive#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.418 226239 DEBUG nova.storage.rbd_utils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.488 226239 DEBUG nova.objects.instance [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'ec2_ids' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:02 np0005603623 nova_compute[226235]: 2026-01-31 08:11:02.550 226239 DEBUG nova.objects.instance [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'keypairs' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:03 np0005603623 nova_compute[226235]: 2026-01-31 08:11:03.400 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:11:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:03.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:11:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:03.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:04 np0005603623 nova_compute[226235]: 2026-01-31 08:11:04.297 226239 INFO nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Creating config drive at /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/disk.config#033[00m
Jan 31 03:11:04 np0005603623 nova_compute[226235]: 2026-01-31 08:11:04.301 226239 DEBUG oslo_concurrency.processutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpte_fegkz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:04 np0005603623 nova_compute[226235]: 2026-01-31 08:11:04.426 226239 DEBUG oslo_concurrency.processutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpte_fegkz" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:04 np0005603623 nova_compute[226235]: 2026-01-31 08:11:04.448 226239 DEBUG nova.storage.rbd_utils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:04 np0005603623 nova_compute[226235]: 2026-01-31 08:11:04.451 226239 DEBUG oslo_concurrency.processutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/disk.config fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:04 np0005603623 nova_compute[226235]: 2026-01-31 08:11:04.675 226239 DEBUG oslo_concurrency.processutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/disk.config fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.224s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:04 np0005603623 nova_compute[226235]: 2026-01-31 08:11:04.676 226239 INFO nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Deleting local config drive /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091/disk.config because it was imported into RBD.#033[00m
Jan 31 03:11:04 np0005603623 kernel: tap2f8a1103-33: entered promiscuous mode
Jan 31 03:11:04 np0005603623 NetworkManager[48970]: <info>  [1769847064.7387] manager: (tap2f8a1103-33): new Tun device (/org/freedesktop/NetworkManager/Devices/151)
Jan 31 03:11:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:04Z|00318|binding|INFO|Claiming lport 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 for this chassis.
Jan 31 03:11:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:04Z|00319|binding|INFO|2f8a1103-332f-40ce-8e2d-20bcf884c0d1: Claiming fa:16:3e:b4:d8:e3 10.100.0.11
Jan 31 03:11:04 np0005603623 nova_compute[226235]: 2026-01-31 08:11:04.739 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:04Z|00320|binding|INFO|Setting lport 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 ovn-installed in OVS
Jan 31 03:11:04 np0005603623 nova_compute[226235]: 2026-01-31 08:11:04.745 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:04 np0005603623 nova_compute[226235]: 2026-01-31 08:11:04.747 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:04 np0005603623 systemd-machined[194379]: New machine qemu-36-instance-00000053.
Jan 31 03:11:04 np0005603623 systemd-udevd[261275]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:11:04 np0005603623 NetworkManager[48970]: <info>  [1769847064.7789] device (tap2f8a1103-33): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:11:04 np0005603623 NetworkManager[48970]: <info>  [1769847064.7796] device (tap2f8a1103-33): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:11:04 np0005603623 systemd[1]: Started Virtual Machine qemu-36-instance-00000053.
Jan 31 03:11:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:04Z|00321|binding|INFO|Setting lport 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 up in Southbound
Jan 31 03:11:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:04.902 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:d8:e3 10.100.0.11'], port_security=['fa:16:3e:b4:d8:e3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c20bb243-1a39-4929-870f-6661da0e39e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=2f8a1103-332f-40ce-8e2d-20bcf884c0d1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:04.904 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 bound to our chassis#033[00m
Jan 31 03:11:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:04.907 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f564452-5f08-4a1c-921e-f2daee9ec936#033[00m
Jan 31 03:11:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:04.916 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[959bec24-2bce-46c6-96e8-00e850d29724]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:04.919 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1f564452-51 in ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:11:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:04.922 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1f564452-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:11:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:04.922 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[60f89c86-36f4-43ad-aa79-bf59b1ec135b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:04.924 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c2b56f4d-864d-4d93-9068-99fb51e7b4ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:04.942 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[31283329-9021-4a16-bffb-12425d13833b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:04.967 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d5042388-ad43-4f71-8a6d-244bc8ed9032]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.006 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[047d292d-b04c-44b4-a2f2-20ae9ddd758f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:05 np0005603623 NetworkManager[48970]: <info>  [1769847065.0119] manager: (tap1f564452-50): new Veth device (/org/freedesktop/NetworkManager/Devices/152)
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.010 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f122b510-aa6f-4e6c-b695-8c2b536aa1ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:05 np0005603623 systemd-udevd[261277]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.039 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[53e43b9f-1231-4114-af8c-f34cb21a4b05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.043 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c4ea7c-3d0c-44f3-9729-d107e07cdaa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:05 np0005603623 NetworkManager[48970]: <info>  [1769847065.0650] device (tap1f564452-50): carrier: link connected
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.071 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[5f12708b-e90e-439e-850f-235bc0938089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.099 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Removed pending event for fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.098 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d4f6e7d1-41fd-47f0-9d3c-fefc963fa398]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 620549, 'reachable_time': 44928, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261350, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.100 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847065.0986018, fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.100 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] VM Started (Lifecycle Event)#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.113 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8ef97a00-64c2-4dd3-b01b-4f40ae39c15f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:23e8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 620549, 'tstamp': 620549}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261351, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.130 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ac27f7c1-27af-4261-a7db-6d5ad37e64b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 620549, 'reachable_time': 44928, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 261352, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.158 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[86fa4146-2e93-46b1-9eb4-2260e8b5d989]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.212 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[80758577-eb39-4a77-9da5-9acce185e325]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.214 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.215 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.216 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f564452-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.218 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:05 np0005603623 NetworkManager[48970]: <info>  [1769847065.2196] manager: (tap1f564452-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Jan 31 03:11:05 np0005603623 kernel: tap1f564452-50: entered promiscuous mode
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.223 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f564452-50, col_values=(('external_ids', {'iface-id': '5bb8c1b5-edce-4f6a-8164-58b7d89a3330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:05 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:05Z|00322|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.224 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.226 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.228 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[03970c0c-e867-482f-90bb-404ceecfaf24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.229 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-1f564452-5f08-4a1c-921e-f2daee9ec936
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 1f564452-5f08-4a1c-921e-f2daee9ec936
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:11:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:05.230 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'env', 'PROCESS_TAG=haproxy-1f564452-5f08-4a1c-921e-f2daee9ec936', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1f564452-5f08-4a1c-921e-f2daee9ec936.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.231 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.247 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.253 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847065.0987906, fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.253 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.395 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.405 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.476 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 31 03:11:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:05.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:05 np0005603623 podman[261382]: 2026-01-31 08:11:05.644536453 +0000 UTC m=+0.114400867 container create 374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:11:05 np0005603623 podman[261382]: 2026-01-31 08:11:05.548998852 +0000 UTC m=+0.018863296 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:11:05 np0005603623 systemd[1]: Started libpod-conmon-374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48.scope.
Jan 31 03:11:05 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:11:05 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60d95f3c243e19fd298c1080e8e256bf8d663e4603bd8c23ece4f4697caf64a7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:11:05 np0005603623 podman[261382]: 2026-01-31 08:11:05.744036131 +0000 UTC m=+0.213900575 container init 374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:11:05 np0005603623 podman[261382]: 2026-01-31 08:11:05.749158262 +0000 UTC m=+0.219022676 container start 374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:11:05 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[261397]: [NOTICE]   (261401) : New worker (261403) forked
Jan 31 03:11:05 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[261397]: [NOTICE]   (261401) : Loading success.
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.783 226239 DEBUG nova.compute.manager [req-e5a48f70-4fe5-41c2-8431-66befc40ef7f req-fc466ed4-3e9d-4be0-a6bf-7859d6231c7b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.783 226239 DEBUG oslo_concurrency.lockutils [req-e5a48f70-4fe5-41c2-8431-66befc40ef7f req-fc466ed4-3e9d-4be0-a6bf-7859d6231c7b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.784 226239 DEBUG oslo_concurrency.lockutils [req-e5a48f70-4fe5-41c2-8431-66befc40ef7f req-fc466ed4-3e9d-4be0-a6bf-7859d6231c7b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.784 226239 DEBUG oslo_concurrency.lockutils [req-e5a48f70-4fe5-41c2-8431-66befc40ef7f req-fc466ed4-3e9d-4be0-a6bf-7859d6231c7b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.784 226239 DEBUG nova.compute.manager [req-e5a48f70-4fe5-41c2-8431-66befc40ef7f req-fc466ed4-3e9d-4be0-a6bf-7859d6231c7b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Processing event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.785 226239 DEBUG nova.compute.manager [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.790 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847065.7897086, fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.790 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.791 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.796 226239 INFO nova.virt.libvirt.driver [-] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance spawned successfully.#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.796 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.844 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.848 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.864 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.865 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.866 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.866 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.866 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:05 np0005603623 nova_compute[226235]: 2026-01-31 08:11:05.867 226239 DEBUG nova.virt.libvirt.driver [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:05.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:06 np0005603623 nova_compute[226235]: 2026-01-31 08:11:06.597 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 31 03:11:06 np0005603623 nova_compute[226235]: 2026-01-31 08:11:06.682 226239 DEBUG nova.compute.manager [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:06 np0005603623 nova_compute[226235]: 2026-01-31 08:11:06.767 226239 INFO nova.compute.manager [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] bringing vm to original state: 'stopped'#033[00m
Jan 31 03:11:06 np0005603623 nova_compute[226235]: 2026-01-31 08:11:06.934 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:06 np0005603623 nova_compute[226235]: 2026-01-31 08:11:06.934 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:06 np0005603623 nova_compute[226235]: 2026-01-31 08:11:06.934 226239 DEBUG nova.compute.manager [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:06 np0005603623 nova_compute[226235]: 2026-01-31 08:11:06.938 226239 DEBUG nova.compute.manager [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 31 03:11:07 np0005603623 kernel: tap2f8a1103-33 (unregistering): left promiscuous mode
Jan 31 03:11:07 np0005603623 NetworkManager[48970]: <info>  [1769847067.0301] device (tap2f8a1103-33): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.029 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:07 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:07Z|00323|binding|INFO|Releasing lport 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 from this chassis (sb_readonly=0)
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.037 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:07 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:07Z|00324|binding|INFO|Setting lport 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 down in Southbound
Jan 31 03:11:07 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:07Z|00325|binding|INFO|Removing iface tap2f8a1103-33 ovn-installed in OVS
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.039 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.043 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.047 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:d8:e3 10.100.0.11'], port_security=['fa:16:3e:b4:d8:e3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c20bb243-1a39-4929-870f-6661da0e39e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=2f8a1103-332f-40ce-8e2d-20bcf884c0d1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.049 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 2f8a1103-332f-40ce-8e2d-20bcf884c0d1 in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 unbound from our chassis#033[00m
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.051 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1f564452-5f08-4a1c-921e-f2daee9ec936, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.052 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9e2c867b-4245-457a-8f27-e0e25460e6b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.052 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 namespace which is not needed anymore#033[00m
Jan 31 03:11:07 np0005603623 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000053.scope: Deactivated successfully.
Jan 31 03:11:07 np0005603623 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000053.scope: Consumed 1.575s CPU time.
Jan 31 03:11:07 np0005603623 systemd-machined[194379]: Machine qemu-36-instance-00000053 terminated.
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.172 226239 INFO nova.virt.libvirt.driver [-] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance destroyed successfully.#033[00m
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.174 226239 DEBUG nova.compute.manager [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:07 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[261397]: [NOTICE]   (261401) : haproxy version is 2.8.14-c23fe91
Jan 31 03:11:07 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[261397]: [NOTICE]   (261401) : path to executable is /usr/sbin/haproxy
Jan 31 03:11:07 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[261397]: [WARNING]  (261401) : Exiting Master process...
Jan 31 03:11:07 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[261397]: [ALERT]    (261401) : Current worker (261403) exited with code 143 (Terminated)
Jan 31 03:11:07 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[261397]: [WARNING]  (261401) : All workers exited. Exiting... (0)
Jan 31 03:11:07 np0005603623 systemd[1]: libpod-374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48.scope: Deactivated successfully.
Jan 31 03:11:07 np0005603623 podman[261433]: 2026-01-31 08:11:07.202593651 +0000 UTC m=+0.056695369 container died 374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:11:07 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48-userdata-shm.mount: Deactivated successfully.
Jan 31 03:11:07 np0005603623 systemd[1]: var-lib-containers-storage-overlay-60d95f3c243e19fd298c1080e8e256bf8d663e4603bd8c23ece4f4697caf64a7-merged.mount: Deactivated successfully.
Jan 31 03:11:07 np0005603623 podman[261433]: 2026-01-31 08:11:07.243045755 +0000 UTC m=+0.097147473 container cleanup 374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:11:07 np0005603623 systemd[1]: libpod-conmon-374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48.scope: Deactivated successfully.
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.256 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 0.322s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.283 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.284 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.284 226239 DEBUG nova.objects.instance [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.294 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:07 np0005603623 podman[261473]: 2026-01-31 08:11:07.30344505 +0000 UTC m=+0.041084437 container remove 374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.308 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7e2df663-7e3a-46bb-aaaf-1d2ae61ed1a0]: (4, ('Sat Jan 31 08:11:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 (374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48)\n374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48\nSat Jan 31 08:11:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 (374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48)\n374aa8ec13bf33a5e33ed87c90bd90c93186cdd8afdde4666a5919cfcbf1fc48\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.309 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c7ac55c6-966c-48c0-8a3b-9b237cc8f93e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.310 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.312 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:07 np0005603623 kernel: tap1f564452-50: left promiscuous mode
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.321 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.322 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c04360da-2846-49a1-8f33-59cb548dcac5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.340 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a73be71a-b1ad-43dc-b587-50bb0fb17461]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.341 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e63b1965-9a6e-46f0-8527-aa4f9f65f08c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.352 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[64180913-179a-4919-a078-2d2a57532c29]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 620542, 'reachable_time': 24577, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261490, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:07 np0005603623 systemd[1]: run-netns-ovnmeta\x2d1f564452\x2d5f08\x2d4a1c\x2d921e\x2df2daee9ec936.mount: Deactivated successfully.
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.354 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:11:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:07.355 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[04de6831-362e-4a51-a6a8-fdc53c13e94e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:07 np0005603623 nova_compute[226235]: 2026-01-31 08:11:07.357 226239 DEBUG oslo_concurrency.lockutils [None req-5e2c6ed3-9275-4093-aeb9-a42ad03f747e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:07.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:07.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.005 226239 DEBUG nova.compute.manager [req-3b94b958-e27e-4d26-9234-4b9034fe1d14 req-34ddf345-c2f0-4d93-8c43-f1df7d348d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.006 226239 DEBUG oslo_concurrency.lockutils [req-3b94b958-e27e-4d26-9234-4b9034fe1d14 req-34ddf345-c2f0-4d93-8c43-f1df7d348d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.006 226239 DEBUG oslo_concurrency.lockutils [req-3b94b958-e27e-4d26-9234-4b9034fe1d14 req-34ddf345-c2f0-4d93-8c43-f1df7d348d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.006 226239 DEBUG oslo_concurrency.lockutils [req-3b94b958-e27e-4d26-9234-4b9034fe1d14 req-34ddf345-c2f0-4d93-8c43-f1df7d348d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.007 226239 DEBUG nova.compute.manager [req-3b94b958-e27e-4d26-9234-4b9034fe1d14 req-34ddf345-c2f0-4d93-8c43-f1df7d348d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] No waiting events found dispatching network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.007 226239 WARNING nova.compute.manager [req-3b94b958-e27e-4d26-9234-4b9034fe1d14 req-34ddf345-c2f0-4d93-8c43-f1df7d348d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received unexpected event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 for instance with vm_state stopped and task_state None.#033[00m
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.007 226239 DEBUG nova.compute.manager [req-3b94b958-e27e-4d26-9234-4b9034fe1d14 req-34ddf345-c2f0-4d93-8c43-f1df7d348d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received event network-vif-unplugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.007 226239 DEBUG oslo_concurrency.lockutils [req-3b94b958-e27e-4d26-9234-4b9034fe1d14 req-34ddf345-c2f0-4d93-8c43-f1df7d348d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.008 226239 DEBUG oslo_concurrency.lockutils [req-3b94b958-e27e-4d26-9234-4b9034fe1d14 req-34ddf345-c2f0-4d93-8c43-f1df7d348d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.008 226239 DEBUG oslo_concurrency.lockutils [req-3b94b958-e27e-4d26-9234-4b9034fe1d14 req-34ddf345-c2f0-4d93-8c43-f1df7d348d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.009 226239 DEBUG nova.compute.manager [req-3b94b958-e27e-4d26-9234-4b9034fe1d14 req-34ddf345-c2f0-4d93-8c43-f1df7d348d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] No waiting events found dispatching network-vif-unplugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.009 226239 WARNING nova.compute.manager [req-3b94b958-e27e-4d26-9234-4b9034fe1d14 req-34ddf345-c2f0-4d93-8c43-f1df7d348d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received unexpected event network-vif-unplugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 for instance with vm_state stopped and task_state None.#033[00m
Jan 31 03:11:08 np0005603623 nova_compute[226235]: 2026-01-31 08:11:08.402 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:09.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:09 np0005603623 nova_compute[226235]: 2026-01-31 08:11:09.668 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:09.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:10 np0005603623 nova_compute[226235]: 2026-01-31 08:11:10.177 226239 DEBUG nova.compute.manager [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:10 np0005603623 nova_compute[226235]: 2026-01-31 08:11:10.177 226239 DEBUG oslo_concurrency.lockutils [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:10 np0005603623 nova_compute[226235]: 2026-01-31 08:11:10.178 226239 DEBUG oslo_concurrency.lockutils [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:10 np0005603623 nova_compute[226235]: 2026-01-31 08:11:10.178 226239 DEBUG oslo_concurrency.lockutils [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:10 np0005603623 nova_compute[226235]: 2026-01-31 08:11:10.178 226239 DEBUG nova.compute.manager [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] No waiting events found dispatching network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:10 np0005603623 nova_compute[226235]: 2026-01-31 08:11:10.179 226239 WARNING nova.compute.manager [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received unexpected event network-vif-plugged-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 for instance with vm_state stopped and task_state None.#033[00m
Jan 31 03:11:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:11.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.840 226239 DEBUG oslo_concurrency.lockutils [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.840 226239 DEBUG oslo_concurrency.lockutils [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.841 226239 DEBUG oslo_concurrency.lockutils [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.841 226239 DEBUG oslo_concurrency.lockutils [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.841 226239 DEBUG oslo_concurrency.lockutils [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.842 226239 INFO nova.compute.manager [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Terminating instance#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.844 226239 DEBUG nova.compute.manager [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.850 226239 INFO nova.virt.libvirt.driver [-] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Instance destroyed successfully.#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.851 226239 DEBUG nova.objects.instance [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'resources' on Instance uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.868 226239 DEBUG nova.virt.libvirt.vif [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-577277100',display_name='tempest-tempest.common.compute-instance-577277100',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-577277100',id=83,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:11:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-2ke6bok9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:11:07Z,user_data=None,user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.869 226239 DEBUG nova.network.os_vif_util [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "address": "fa:16:3e:b4:d8:e3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f8a1103-33", "ovs_interfaceid": "2f8a1103-332f-40ce-8e2d-20bcf884c0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.869 226239 DEBUG nova.network.os_vif_util [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.870 226239 DEBUG os_vif [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.871 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.871 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f8a1103-33, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.873 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.874 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:11 np0005603623 nova_compute[226235]: 2026-01-31 08:11:11.877 226239 INFO os_vif [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:d8:e3,bridge_name='br-int',has_traffic_filtering=True,id=2f8a1103-332f-40ce-8e2d-20bcf884c0d1,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f8a1103-33')#033[00m
Jan 31 03:11:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:11.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:13 np0005603623 nova_compute[226235]: 2026-01-31 08:11:13.404 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:13.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:13.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:14 np0005603623 nova_compute[226235]: 2026-01-31 08:11:14.651 226239 INFO nova.virt.libvirt.driver [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Deleting instance files /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_del#033[00m
Jan 31 03:11:14 np0005603623 nova_compute[226235]: 2026-01-31 08:11:14.652 226239 INFO nova.virt.libvirt.driver [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Deletion of /var/lib/nova/instances/fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091_del complete#033[00m
Jan 31 03:11:14 np0005603623 nova_compute[226235]: 2026-01-31 08:11:14.725 226239 INFO nova.compute.manager [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Took 2.88 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:11:14 np0005603623 nova_compute[226235]: 2026-01-31 08:11:14.726 226239 DEBUG oslo.service.loopingcall [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:11:14 np0005603623 nova_compute[226235]: 2026-01-31 08:11:14.727 226239 DEBUG nova.compute.manager [-] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:11:14 np0005603623 nova_compute[226235]: 2026-01-31 08:11:14.727 226239 DEBUG nova.network.neutron [-] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:11:14 np0005603623 nova_compute[226235]: 2026-01-31 08:11:14.959 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:14 np0005603623 nova_compute[226235]: 2026-01-31 08:11:14.985 226239 WARNING nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.#033[00m
Jan 31 03:11:14 np0005603623 nova_compute[226235]: 2026-01-31 08:11:14.986 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Triggering sync for uuid fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:11:14 np0005603623 nova_compute[226235]: 2026-01-31 08:11:14.986 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:15.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:15 np0005603623 nova_compute[226235]: 2026-01-31 08:11:15.833 226239 DEBUG nova.network.neutron [-] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:15 np0005603623 nova_compute[226235]: 2026-01-31 08:11:15.872 226239 INFO nova.compute.manager [-] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Took 1.14 seconds to deallocate network for instance.#033[00m
Jan 31 03:11:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:15.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:15 np0005603623 nova_compute[226235]: 2026-01-31 08:11:15.939 226239 DEBUG oslo_concurrency.lockutils [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:15 np0005603623 nova_compute[226235]: 2026-01-31 08:11:15.939 226239 DEBUG oslo_concurrency.lockutils [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:16 np0005603623 nova_compute[226235]: 2026-01-31 08:11:16.015 226239 DEBUG oslo_concurrency.processutils [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:16 np0005603623 nova_compute[226235]: 2026-01-31 08:11:16.035 226239 DEBUG nova.compute.manager [req-e2ea5f12-3a1f-461d-ada9-9bb613e982b4 req-39d44a6d-4a0a-471e-8da3-1c46f8980c68 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Received event network-vif-deleted-2f8a1103-332f-40ce-8e2d-20bcf884c0d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:11:16 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1221025377' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:11:16 np0005603623 nova_compute[226235]: 2026-01-31 08:11:16.466 226239 DEBUG oslo_concurrency.processutils [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:16 np0005603623 nova_compute[226235]: 2026-01-31 08:11:16.474 226239 DEBUG nova.compute.provider_tree [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:11:16 np0005603623 nova_compute[226235]: 2026-01-31 08:11:16.495 226239 DEBUG nova.scheduler.client.report [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:11:16 np0005603623 nova_compute[226235]: 2026-01-31 08:11:16.524 226239 DEBUG oslo_concurrency.lockutils [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:16 np0005603623 nova_compute[226235]: 2026-01-31 08:11:16.575 226239 INFO nova.scheduler.client.report [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Deleted allocations for instance fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091#033[00m
Jan 31 03:11:16 np0005603623 nova_compute[226235]: 2026-01-31 08:11:16.715 226239 DEBUG oslo_concurrency.lockutils [None req-4ec2e243-8260-4ae2-918c-3761e8b676c8 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:16 np0005603623 nova_compute[226235]: 2026-01-31 08:11:16.718 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:16 np0005603623 nova_compute[226235]: 2026-01-31 08:11:16.718 226239 INFO nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 03:11:16 np0005603623 nova_compute[226235]: 2026-01-31 08:11:16.718 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:16 np0005603623 nova_compute[226235]: 2026-01-31 08:11:16.876 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:17.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:17.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:18 np0005603623 nova_compute[226235]: 2026-01-31 08:11:18.405 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:19.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:19.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:21 np0005603623 nova_compute[226235]: 2026-01-31 08:11:21.279 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:21 np0005603623 nova_compute[226235]: 2026-01-31 08:11:21.280 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:21 np0005603623 nova_compute[226235]: 2026-01-31 08:11:21.293 226239 DEBUG nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:11:21 np0005603623 nova_compute[226235]: 2026-01-31 08:11:21.358 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:21 np0005603623 nova_compute[226235]: 2026-01-31 08:11:21.358 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:21 np0005603623 nova_compute[226235]: 2026-01-31 08:11:21.364 226239 DEBUG nova.virt.hardware [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:11:21 np0005603623 nova_compute[226235]: 2026-01-31 08:11:21.364 226239 INFO nova.compute.claims [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:11:21 np0005603623 nova_compute[226235]: 2026-01-31 08:11:21.513 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:21.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:21 np0005603623 nova_compute[226235]: 2026-01-31 08:11:21.879 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:11:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/119861870' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:11:21 np0005603623 nova_compute[226235]: 2026-01-31 08:11:21.918 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:21 np0005603623 nova_compute[226235]: 2026-01-31 08:11:21.926 226239 DEBUG nova.compute.provider_tree [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:11:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:21.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:21 np0005603623 nova_compute[226235]: 2026-01-31 08:11:21.983 226239 DEBUG nova.scheduler.client.report [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.109 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.109 226239 DEBUG nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.171 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847067.1694224, fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.171 226239 INFO nova.compute.manager [-] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.214 226239 DEBUG nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.214 226239 DEBUG nova.network.neutron [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.223 226239 DEBUG nova.compute.manager [None req-e8a2292e-5405-45d9-9217-47b117ccb4ac - - - - - -] [instance: fcf9cf7f-f1d4-4d28-ad0e-87e80d08d091] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.282 226239 INFO nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.298 226239 DEBUG nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.411 226239 DEBUG nova.policy [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '55f81600a60b49aaae5b4c28549afdaf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88b896f61c644b6fac0351ce6828b6e1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.417 226239 DEBUG nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.418 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.419 226239 INFO nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Creating image(s)#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.505 226239 DEBUG nova.storage.rbd_utils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.531 226239 DEBUG nova.storage.rbd_utils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.557 226239 DEBUG nova.storage.rbd_utils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.560 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.618 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.619 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.620 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.620 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.650 226239 DEBUG nova.storage.rbd_utils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:22 np0005603623 nova_compute[226235]: 2026-01-31 08:11:22.654 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:22 np0005603623 podman[261707]: 2026-01-31 08:11:22.953255388 +0000 UTC m=+0.042099847 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:11:22 np0005603623 podman[261708]: 2026-01-31 08:11:22.971566846 +0000 UTC m=+0.058934009 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:11:23 np0005603623 nova_compute[226235]: 2026-01-31 08:11:23.407 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:23 np0005603623 nova_compute[226235]: 2026-01-31 08:11:23.475 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.821s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:23.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:23 np0005603623 nova_compute[226235]: 2026-01-31 08:11:23.538 226239 DEBUG nova.storage.rbd_utils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] resizing rbd image f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:11:23 np0005603623 nova_compute[226235]: 2026-01-31 08:11:23.614 226239 DEBUG nova.objects.instance [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lazy-loading 'migration_context' on Instance uuid f6cd19dd-9676-4737-a5dc-6b0d0705d8ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:23 np0005603623 nova_compute[226235]: 2026-01-31 08:11:23.635 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:11:23 np0005603623 nova_compute[226235]: 2026-01-31 08:11:23.636 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Ensure instance console log exists: /var/lib/nova/instances/f6cd19dd-9676-4737-a5dc-6b0d0705d8ca/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:11:23 np0005603623 nova_compute[226235]: 2026-01-31 08:11:23.636 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:23 np0005603623 nova_compute[226235]: 2026-01-31 08:11:23.636 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:23 np0005603623 nova_compute[226235]: 2026-01-31 08:11:23.637 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:23.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:25.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:25.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:26 np0005603623 nova_compute[226235]: 2026-01-31 08:11:26.883 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:27 np0005603623 nova_compute[226235]: 2026-01-31 08:11:27.341 226239 DEBUG nova.network.neutron [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Successfully created port: ab0ce9b5-cd73-4758-8513-45a7f13eefe3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:11:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:27.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:27.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.020 226239 DEBUG nova.network.neutron [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Successfully created port: b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.042 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.043 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.067 226239 DEBUG nova.compute.manager [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.164 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.165 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.172 226239 DEBUG nova.virt.hardware [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.172 226239 INFO nova.compute.claims [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.176 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.176 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.176 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.177 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.332 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.434 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:11:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4049405460' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.762 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.767 226239 DEBUG nova.compute.provider_tree [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.788 226239 DEBUG nova.scheduler.client.report [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:11:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.861 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.862 226239 DEBUG nova.compute.manager [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.936 226239 DEBUG nova.compute.manager [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.937 226239 DEBUG nova.network.neutron [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:11:28 np0005603623 nova_compute[226235]: 2026-01-31 08:11:28.972 226239 INFO nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.014 226239 DEBUG nova.compute.manager [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.059 226239 DEBUG nova.network.neutron [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Successfully created port: 393a6935-9a5d-4dbb-8ace-1d72f748ecd4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.120 226239 DEBUG nova.compute.manager [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.123 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.124 226239 INFO nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Creating image(s)#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.148 226239 DEBUG nova.storage.rbd_utils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.173 226239 DEBUG nova.storage.rbd_utils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.196 226239 DEBUG nova.storage.rbd_utils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.200 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.219 226239 DEBUG nova.policy [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '12a823bd7c6e4cf492ebf6c1d002a91f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c03fec1b3664105996aa979e226d8f8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.250 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.250 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.251 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.251 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.274 226239 DEBUG nova.storage.rbd_utils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.277 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 702e2506-8d57-4ea2-b56e-1800da93f646_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:29.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.552 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 702e2506-8d57-4ea2-b56e-1800da93f646_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.274s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.613 226239 DEBUG nova.storage.rbd_utils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] resizing rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.723 226239 DEBUG nova.objects.instance [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'migration_context' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.777 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.777 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Ensure instance console log exists: /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.778 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.778 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:29 np0005603623 nova_compute[226235]: 2026-01-31 08:11:29.778 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:29.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:30.104 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:30.105 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:30.105 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:30 np0005603623 nova_compute[226235]: 2026-01-31 08:11:30.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:30 np0005603623 nova_compute[226235]: 2026-01-31 08:11:30.232 226239 DEBUG nova.network.neutron [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Successfully updated port: ab0ce9b5-cd73-4758-8513-45a7f13eefe3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:11:30 np0005603623 nova_compute[226235]: 2026-01-31 08:11:30.285 226239 DEBUG nova.network.neutron [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Successfully created port: 7429a420-eefe-4af6-b5a7-ad8aff346ea8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.155 226239 DEBUG nova.compute.manager [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-changed-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.155 226239 DEBUG nova.compute.manager [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Refreshing instance network info cache due to event network-changed-ab0ce9b5-cd73-4758-8513-45a7f13eefe3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.155 226239 DEBUG oslo_concurrency.lockutils [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.155 226239 DEBUG oslo_concurrency.lockutils [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.156 226239 DEBUG nova.network.neutron [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Refreshing network info cache for port ab0ce9b5-cd73-4758-8513-45a7f13eefe3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.190 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.190 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.190 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.190 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.191 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.243 226239 DEBUG nova.network.neutron [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Successfully updated port: 7429a420-eefe-4af6-b5a7-ad8aff346ea8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.257 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "refresh_cache-702e2506-8d57-4ea2-b56e-1800da93f646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.257 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquired lock "refresh_cache-702e2506-8d57-4ea2-b56e-1800da93f646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.257 226239 DEBUG nova.network.neutron [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.345 226239 DEBUG nova.network.neutron [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.450 226239 DEBUG nova.network.neutron [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.461 226239 DEBUG nova.network.neutron [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Successfully updated port: b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:11:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:31.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:11:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/81705275' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.593 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.737 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.739 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4554MB free_disk=20.858692169189453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.740 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.740 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.876 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance f6cd19dd-9676-4737-a5dc-6b0d0705d8ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.876 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 702e2506-8d57-4ea2-b56e-1800da93f646 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.877 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.877 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:31 np0005603623 nova_compute[226235]: 2026-01-31 08:11:31.934 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:31.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.046 226239 DEBUG nova.network.neutron [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.071 226239 DEBUG oslo_concurrency.lockutils [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:11:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:11:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4214370478' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.375 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.380 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.389 226239 DEBUG nova.network.neutron [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Successfully updated port: 393a6935-9a5d-4dbb-8ace-1d72f748ecd4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.415 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.422 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "refresh_cache-f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.422 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquired lock "refresh_cache-f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.422 226239 DEBUG nova.network.neutron [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.461 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.461 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.562 226239 DEBUG nova.network.neutron [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Updating instance_info_cache with network_info: [{"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.590 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Releasing lock "refresh_cache-702e2506-8d57-4ea2-b56e-1800da93f646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.591 226239 DEBUG nova.compute.manager [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Instance network_info: |[{"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.593 226239 DEBUG nova.network.neutron [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.598 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Start _get_guest_xml network_info=[{"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.604 226239 WARNING nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.610 226239 DEBUG nova.virt.libvirt.host [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.611 226239 DEBUG nova.virt.libvirt.host [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.615 226239 DEBUG nova.virt.libvirt.host [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.616 226239 DEBUG nova.virt.libvirt.host [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.618 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.618 226239 DEBUG nova.virt.hardware [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.619 226239 DEBUG nova.virt.hardware [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.619 226239 DEBUG nova.virt.hardware [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.619 226239 DEBUG nova.virt.hardware [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.620 226239 DEBUG nova.virt.hardware [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.620 226239 DEBUG nova.virt.hardware [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.620 226239 DEBUG nova.virt.hardware [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.621 226239 DEBUG nova.virt.hardware [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.621 226239 DEBUG nova.virt.hardware [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.621 226239 DEBUG nova.virt.hardware [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.622 226239 DEBUG nova.virt.hardware [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.626 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.692 226239 DEBUG nova.compute.manager [req-c38bbed2-31c5-434e-af94-4705bd083b43 req-89659ebb-4d77-4e30-aadb-95efb959692e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received event network-changed-7429a420-eefe-4af6-b5a7-ad8aff346ea8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.692 226239 DEBUG nova.compute.manager [req-c38bbed2-31c5-434e-af94-4705bd083b43 req-89659ebb-4d77-4e30-aadb-95efb959692e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Refreshing instance network info cache due to event network-changed-7429a420-eefe-4af6-b5a7-ad8aff346ea8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.692 226239 DEBUG oslo_concurrency.lockutils [req-c38bbed2-31c5-434e-af94-4705bd083b43 req-89659ebb-4d77-4e30-aadb-95efb959692e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-702e2506-8d57-4ea2-b56e-1800da93f646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.692 226239 DEBUG oslo_concurrency.lockutils [req-c38bbed2-31c5-434e-af94-4705bd083b43 req-89659ebb-4d77-4e30-aadb-95efb959692e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-702e2506-8d57-4ea2-b56e-1800da93f646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:11:32 np0005603623 nova_compute[226235]: 2026-01-31 08:11:32.693 226239 DEBUG nova.network.neutron [req-c38bbed2-31c5-434e-af94-4705bd083b43 req-89659ebb-4d77-4e30-aadb-95efb959692e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Refreshing network info cache for port 7429a420-eefe-4af6-b5a7-ad8aff346ea8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3453661724' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.051 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.085 226239 DEBUG nova.storage.rbd_utils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.091 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.228 226239 DEBUG nova.compute.manager [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-changed-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.228 226239 DEBUG nova.compute.manager [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Refreshing instance network info cache due to event network-changed-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.228 226239 DEBUG oslo_concurrency.lockutils [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.437 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2560619563' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:11:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:33.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.538 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.539 226239 DEBUG nova.virt.libvirt.vif [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:11:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-662570082',display_name='tempest-tempest.common.compute-instance-662570082',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-662570082',id=87,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIc9G9qrE9DkmH4MfDS/pJVE/TBsDIWPxmulohRcOfbn2Cn27rx2gYgt8roH3OFkAEcaX90eL1koUD1iHLea0bAao7hRDcWiOicUocX2Hu4advs3+4GguQABPQt3bJ2N+w==',key_name='tempest-keypair-569982077',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-ax5f1zxm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:11:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=702e2506-8d57-4ea2-b56e-1800da93f646,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.539 226239 DEBUG nova.network.os_vif_util [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.540 226239 DEBUG nova.network.os_vif_util [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.541 226239 DEBUG nova.objects.instance [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.565 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  <uuid>702e2506-8d57-4ea2-b56e-1800da93f646</uuid>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  <name>instance-00000057</name>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <nova:name>tempest-tempest.common.compute-instance-662570082</nova:name>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:11:32</nova:creationTime>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <nova:user uuid="12a823bd7c6e4cf492ebf6c1d002a91f">tempest-ServerActionsTestOtherA-1768827668-project-member</nova:user>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <nova:project uuid="9c03fec1b3664105996aa979e226d8f8">tempest-ServerActionsTestOtherA-1768827668</nova:project>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <nova:port uuid="7429a420-eefe-4af6-b5a7-ad8aff346ea8">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <entry name="serial">702e2506-8d57-4ea2-b56e-1800da93f646</entry>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <entry name="uuid">702e2506-8d57-4ea2-b56e-1800da93f646</entry>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/702e2506-8d57-4ea2-b56e-1800da93f646_disk">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/702e2506-8d57-4ea2-b56e-1800da93f646_disk.config">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:39:1e:10"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <target dev="tap7429a420-ee"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/console.log" append="off"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:11:33 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:11:33 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:11:33 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:11:33 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.567 226239 DEBUG nova.compute.manager [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Preparing to wait for external event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.568 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.568 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.569 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.570 226239 DEBUG nova.virt.libvirt.vif [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:11:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-662570082',display_name='tempest-tempest.common.compute-instance-662570082',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-662570082',id=87,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIc9G9qrE9DkmH4MfDS/pJVE/TBsDIWPxmulohRcOfbn2Cn27rx2gYgt8roH3OFkAEcaX90eL1koUD1iHLea0bAao7hRDcWiOicUocX2Hu4advs3+4GguQABPQt3bJ2N+w==',key_name='tempest-keypair-569982077',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-ax5f1zxm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:11:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=702e2506-8d57-4ea2-b56e-1800da93f646,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.570 226239 DEBUG nova.network.os_vif_util [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.572 226239 DEBUG nova.network.os_vif_util [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.572 226239 DEBUG os_vif [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.574 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.574 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.575 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.580 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.581 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7429a420-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.581 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7429a420-ee, col_values=(('external_ids', {'iface-id': '7429a420-eefe-4af6-b5a7-ad8aff346ea8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:1e:10', 'vm-uuid': '702e2506-8d57-4ea2-b56e-1800da93f646'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.583 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:33 np0005603623 NetworkManager[48970]: <info>  [1769847093.5854] manager: (tap7429a420-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.586 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.590 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.591 226239 INFO os_vif [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee')#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.640 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.640 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.640 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No VIF found with MAC fa:16:3e:39:1e:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.641 226239 INFO nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Using config drive#033[00m
Jan 31 03:11:33 np0005603623 nova_compute[226235]: 2026-01-31 08:11:33.670 226239 DEBUG nova.storage.rbd_utils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:33.880016) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847093880079, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 2419, "num_deletes": 253, "total_data_size": 5582740, "memory_usage": 5660640, "flush_reason": "Manual Compaction"}
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847093901817, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 3647581, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39517, "largest_seqno": 41931, "table_properties": {"data_size": 3637977, "index_size": 5970, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20941, "raw_average_key_size": 20, "raw_value_size": 3618368, "raw_average_value_size": 3578, "num_data_blocks": 259, "num_entries": 1011, "num_filter_entries": 1011, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846893, "oldest_key_time": 1769846893, "file_creation_time": 1769847093, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 21848 microseconds, and 5104 cpu microseconds.
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:33.901856) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 3647581 bytes OK
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:33.901879) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:33.903974) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:33.903986) EVENT_LOG_v1 {"time_micros": 1769847093903982, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:33.904000) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 5572106, prev total WAL file size 5572106, number of live WAL files 2.
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:33.904693) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(3562KB)], [75(10161KB)]
Jan 31 03:11:33 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847093904743, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 14053118, "oldest_snapshot_seqno": -1}
Jan 31 03:11:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:33.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6782 keys, 12003857 bytes, temperature: kUnknown
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847094020091, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 12003857, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11956179, "index_size": 29646, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16965, "raw_key_size": 173683, "raw_average_key_size": 25, "raw_value_size": 11832587, "raw_average_value_size": 1744, "num_data_blocks": 1182, "num_entries": 6782, "num_filter_entries": 6782, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769847093, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:34.020308) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 12003857 bytes
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:34.028422) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 121.8 rd, 104.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 9.9 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(7.1) write-amplify(3.3) OK, records in: 7308, records dropped: 526 output_compression: NoCompression
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:34.028664) EVENT_LOG_v1 {"time_micros": 1769847094028650, "job": 46, "event": "compaction_finished", "compaction_time_micros": 115417, "compaction_time_cpu_micros": 18128, "output_level": 6, "num_output_files": 1, "total_output_size": 12003857, "num_input_records": 7308, "num_output_records": 6782, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847094029238, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847094030225, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:33.904607) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:34.030313) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:34.030318) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:34.030320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:34.030322) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:11:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:11:34.030324) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:11:34 np0005603623 nova_compute[226235]: 2026-01-31 08:11:34.464 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:34 np0005603623 nova_compute[226235]: 2026-01-31 08:11:34.464 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:34 np0005603623 nova_compute[226235]: 2026-01-31 08:11:34.871 226239 INFO nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Creating config drive at /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/disk.config#033[00m
Jan 31 03:11:34 np0005603623 nova_compute[226235]: 2026-01-31 08:11:34.875 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpm800hix9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.004 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpm800hix9" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.040 226239 DEBUG nova.storage.rbd_utils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.044 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/disk.config 702e2506-8d57-4ea2-b56e-1800da93f646_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.224 226239 DEBUG oslo_concurrency.processutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/disk.config 702e2506-8d57-4ea2-b56e-1800da93f646_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.180s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.225 226239 INFO nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Deleting local config drive /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/disk.config because it was imported into RBD.#033[00m
Jan 31 03:11:35 np0005603623 kernel: tap7429a420-ee: entered promiscuous mode
Jan 31 03:11:35 np0005603623 NetworkManager[48970]: <info>  [1769847095.2725] manager: (tap7429a420-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/155)
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.273 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:35Z|00326|binding|INFO|Claiming lport 7429a420-eefe-4af6-b5a7-ad8aff346ea8 for this chassis.
Jan 31 03:11:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:35Z|00327|binding|INFO|7429a420-eefe-4af6-b5a7-ad8aff346ea8: Claiming fa:16:3e:39:1e:10 10.100.0.13
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.280 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:1e:10 10.100.0.13'], port_security=['fa:16:3e:39:1e:10 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '702e2506-8d57-4ea2-b56e-1800da93f646', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '56300515-2cca-484e-a39f-36468f7be69f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=7429a420-eefe-4af6-b5a7-ad8aff346ea8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.280 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.281 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 7429a420-eefe-4af6-b5a7-ad8aff346ea8 in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 bound to our chassis#033[00m
Jan 31 03:11:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:35Z|00328|binding|INFO|Setting lport 7429a420-eefe-4af6-b5a7-ad8aff346ea8 ovn-installed in OVS
Jan 31 03:11:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:35Z|00329|binding|INFO|Setting lport 7429a420-eefe-4af6-b5a7-ad8aff346ea8 up in Southbound
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.283 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.284 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f564452-5f08-4a1c-921e-f2daee9ec936#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.284 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.292 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[27b5d075-e576-4492-9517-3c157be338d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.293 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1f564452-51 in ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.297 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1f564452-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.298 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[83bdce82-3ae7-4c19-a394-d1eea7d596ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.299 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[988900ef-d0d7-48d2-b732-a439bd56b675]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 systemd-machined[194379]: New machine qemu-37-instance-00000057.
Jan 31 03:11:35 np0005603623 systemd-udevd[262201]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.313 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[318aa643-926c-40d5-99ee-c60fb18bc8ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 systemd[1]: Started Virtual Machine qemu-37-instance-00000057.
Jan 31 03:11:35 np0005603623 NetworkManager[48970]: <info>  [1769847095.3206] device (tap7429a420-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:11:35 np0005603623 NetworkManager[48970]: <info>  [1769847095.3220] device (tap7429a420-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.330 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d694c8-b5c9-461b-837d-6ab2702c771b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.349 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[42f4fef3-b6bf-4624-9f74-6cfc7033870c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 NetworkManager[48970]: <info>  [1769847095.3548] manager: (tap1f564452-50): new Veth device (/org/freedesktop/NetworkManager/Devices/156)
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.355 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[21b0e28d-331f-41c2-a047-a3b3cd0a7f1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.392 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[739a0697-8129-45f7-93fc-1ec8c640b310]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.396 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[030ba29f-6d48-484e-9e8b-762862538359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 NetworkManager[48970]: <info>  [1769847095.4206] device (tap1f564452-50): carrier: link connected
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.424 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ff718ed8-9fac-4a7d-9487-a949ad2dc05b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.441 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9f139244-bdbe-477a-a4b2-a83ae1c615f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623584, 'reachable_time': 18177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262232, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.455 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9305d088-f48c-471c-9898-c5efb2b54ee7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:23e8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623584, 'tstamp': 623584}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262233, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.471 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8c02c5bd-0f4e-4df4-a7f7-bc9897eeb761]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623584, 'reachable_time': 18177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 262234, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.503 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6287cfa3-474f-4376-837c-345d84f7c0a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:35.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.558 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2c66fe1c-9307-4794-bb52-737fcdd3412b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.559 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.559 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.560 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f564452-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:35 np0005603623 NetworkManager[48970]: <info>  [1769847095.5630] manager: (tap1f564452-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.562 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603623 kernel: tap1f564452-50: entered promiscuous mode
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.568 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.569 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f564452-50, col_values=(('external_ids', {'iface-id': '5bb8c1b5-edce-4f6a-8164-58b7d89a3330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:35Z|00330|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.572 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.572 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.573 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1eda1284-ab4c-4e3c-be13-27a2d73ef26a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.573 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-1f564452-5f08-4a1c-921e-f2daee9ec936
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 1f564452-5f08-4a1c-921e-f2daee9ec936
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:11:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:35.574 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'env', 'PROCESS_TAG=haproxy-1f564452-5f08-4a1c-921e-f2daee9ec936', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1f564452-5f08-4a1c-921e-f2daee9ec936.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.577 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.840 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847095.8400884, 702e2506-8d57-4ea2-b56e-1800da93f646 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.841 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] VM Started (Lifecycle Event)#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.867 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.873 226239 DEBUG nova.compute.manager [req-423852cc-5055-43cf-9329-2769011eb7fb req-4a265dd5-7863-4227-b385-85503d47bf94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.873 226239 DEBUG oslo_concurrency.lockutils [req-423852cc-5055-43cf-9329-2769011eb7fb req-4a265dd5-7863-4227-b385-85503d47bf94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.874 226239 DEBUG oslo_concurrency.lockutils [req-423852cc-5055-43cf-9329-2769011eb7fb req-4a265dd5-7863-4227-b385-85503d47bf94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.874 226239 DEBUG oslo_concurrency.lockutils [req-423852cc-5055-43cf-9329-2769011eb7fb req-4a265dd5-7863-4227-b385-85503d47bf94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.874 226239 DEBUG nova.compute.manager [req-423852cc-5055-43cf-9329-2769011eb7fb req-4a265dd5-7863-4227-b385-85503d47bf94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Processing event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.875 226239 DEBUG nova.compute.manager [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.876 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847095.8403466, 702e2506-8d57-4ea2-b56e-1800da93f646 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.876 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.879 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.883 226239 INFO nova.virt.libvirt.driver [-] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Instance spawned successfully.#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.883 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:11:35 np0005603623 podman[262308]: 2026-01-31 08:11:35.897652021 +0000 UTC m=+0.049616976 container create 96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.903 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.914 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.915 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.916 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.916 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.917 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.917 226239 DEBUG nova.virt.libvirt.driver [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.922 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847095.8801894, 702e2506-8d57-4ea2-b56e-1800da93f646 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.922 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:11:35 np0005603623 systemd[1]: Started libpod-conmon-96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b.scope.
Jan 31 03:11:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:35.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:35 np0005603623 podman[262308]: 2026-01-31 08:11:35.866391895 +0000 UTC m=+0.018356870 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.965 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:35 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:11:35 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6276b9a1a574931bea2627437dbd13908606446ef403812318958b4b795063bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:11:35 np0005603623 nova_compute[226235]: 2026-01-31 08:11:35.977 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:11:35 np0005603623 podman[262308]: 2026-01-31 08:11:35.985409877 +0000 UTC m=+0.137374842 container init 96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:11:35 np0005603623 podman[262308]: 2026-01-31 08:11:35.989393923 +0000 UTC m=+0.141358858 container start 96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:11:36 np0005603623 nova_compute[226235]: 2026-01-31 08:11:36.001 226239 INFO nova.compute.manager [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Took 6.88 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:11:36 np0005603623 nova_compute[226235]: 2026-01-31 08:11:36.002 226239 DEBUG nova.compute.manager [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:36 np0005603623 nova_compute[226235]: 2026-01-31 08:11:36.003 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:11:36 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[262324]: [NOTICE]   (262328) : New worker (262330) forked
Jan 31 03:11:36 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[262324]: [NOTICE]   (262328) : Loading success.
Jan 31 03:11:36 np0005603623 nova_compute[226235]: 2026-01-31 08:11:36.076 226239 INFO nova.compute.manager [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Took 7.94 seconds to build instance.#033[00m
Jan 31 03:11:36 np0005603623 nova_compute[226235]: 2026-01-31 08:11:36.089 226239 DEBUG oslo_concurrency.lockutils [None req-a9a31ff3-1046-42dd-99d3-ebe129e046be 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:36 np0005603623 nova_compute[226235]: 2026-01-31 08:11:36.644 226239 DEBUG nova.network.neutron [req-c38bbed2-31c5-434e-af94-4705bd083b43 req-89659ebb-4d77-4e30-aadb-95efb959692e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Updated VIF entry in instance network info cache for port 7429a420-eefe-4af6-b5a7-ad8aff346ea8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:11:36 np0005603623 nova_compute[226235]: 2026-01-31 08:11:36.645 226239 DEBUG nova.network.neutron [req-c38bbed2-31c5-434e-af94-4705bd083b43 req-89659ebb-4d77-4e30-aadb-95efb959692e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Updating instance_info_cache with network_info: [{"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:36 np0005603623 nova_compute[226235]: 2026-01-31 08:11:36.664 226239 DEBUG oslo_concurrency.lockutils [req-c38bbed2-31c5-434e-af94-4705bd083b43 req-89659ebb-4d77-4e30-aadb-95efb959692e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-702e2506-8d57-4ea2-b56e-1800da93f646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:11:37 np0005603623 nova_compute[226235]: 2026-01-31 08:11:37.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:37 np0005603623 nova_compute[226235]: 2026-01-31 08:11:37.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:37 np0005603623 nova_compute[226235]: 2026-01-31 08:11:37.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:37.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:37.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:38 np0005603623 nova_compute[226235]: 2026-01-31 08:11:38.440 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:38 np0005603623 nova_compute[226235]: 2026-01-31 08:11:38.584 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:38 np0005603623 nova_compute[226235]: 2026-01-31 08:11:38.966 226239 DEBUG nova.compute.manager [req-989b2742-353a-46d0-a0a9-d1dbfc8e43fc req-af4c9140-665d-46d8-8c33-3497b4d2bd6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:38 np0005603623 nova_compute[226235]: 2026-01-31 08:11:38.966 226239 DEBUG oslo_concurrency.lockutils [req-989b2742-353a-46d0-a0a9-d1dbfc8e43fc req-af4c9140-665d-46d8-8c33-3497b4d2bd6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:38 np0005603623 nova_compute[226235]: 2026-01-31 08:11:38.967 226239 DEBUG oslo_concurrency.lockutils [req-989b2742-353a-46d0-a0a9-d1dbfc8e43fc req-af4c9140-665d-46d8-8c33-3497b4d2bd6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:38 np0005603623 nova_compute[226235]: 2026-01-31 08:11:38.967 226239 DEBUG oslo_concurrency.lockutils [req-989b2742-353a-46d0-a0a9-d1dbfc8e43fc req-af4c9140-665d-46d8-8c33-3497b4d2bd6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:38 np0005603623 nova_compute[226235]: 2026-01-31 08:11:38.967 226239 DEBUG nova.compute.manager [req-989b2742-353a-46d0-a0a9-d1dbfc8e43fc req-af4c9140-665d-46d8-8c33-3497b4d2bd6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] No waiting events found dispatching network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:38 np0005603623 nova_compute[226235]: 2026-01-31 08:11:38.967 226239 WARNING nova.compute.manager [req-989b2742-353a-46d0-a0a9-d1dbfc8e43fc req-af4c9140-665d-46d8-8c33-3497b4d2bd6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received unexpected event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:11:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:39.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.613 226239 DEBUG nova.network.neutron [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Updating instance_info_cache with network_info: [{"id": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "address": "fa:16:3e:90:4d:5c", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.129", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0ce9b5-cd", "ovs_interfaceid": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "address": "fa:16:3e:d7:44:22", "network": {"id": "aabdf0ce-15d3-47ef-8035-167ef2db9ba8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1490540087", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.151", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f4f30e-6b", "ovs_interfaceid": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.637 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Releasing lock "refresh_cache-f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.638 226239 DEBUG nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Instance network_info: |[{"id": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "address": "fa:16:3e:90:4d:5c", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.129", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0ce9b5-cd", "ovs_interfaceid": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "address": "fa:16:3e:d7:44:22", "network": {"id": "aabdf0ce-15d3-47ef-8035-167ef2db9ba8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1490540087", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.151", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f4f30e-6b", "ovs_interfaceid": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.638 226239 DEBUG oslo_concurrency.lockutils [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.638 226239 DEBUG nova.network.neutron [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Refreshing network info cache for port b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.644 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Start _get_guest_xml network_info=[{"id": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "address": "fa:16:3e:90:4d:5c", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.129", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0ce9b5-cd", "ovs_interfaceid": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "address": "fa:16:3e:d7:44:22", "network": {"id": "aabdf0ce-15d3-47ef-8035-167ef2db9ba8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1490540087", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.151", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f4f30e-6b", "ovs_interfaceid": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.649 226239 WARNING nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.654 226239 DEBUG nova.virt.libvirt.host [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.654 226239 DEBUG nova.virt.libvirt.host [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.662 226239 DEBUG nova.virt.libvirt.host [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.662 226239 DEBUG nova.virt.libvirt.host [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.663 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.663 226239 DEBUG nova.virt.hardware [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.664 226239 DEBUG nova.virt.hardware [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.664 226239 DEBUG nova.virt.hardware [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.664 226239 DEBUG nova.virt.hardware [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.665 226239 DEBUG nova.virt.hardware [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.665 226239 DEBUG nova.virt.hardware [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.665 226239 DEBUG nova.virt.hardware [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.665 226239 DEBUG nova.virt.hardware [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.665 226239 DEBUG nova.virt.hardware [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.666 226239 DEBUG nova.virt.hardware [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.666 226239 DEBUG nova.virt.hardware [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:11:39 np0005603623 nova_compute[226235]: 2026-01-31 08:11:39.668 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:39.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:11:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/50443758' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.146 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.169 226239 DEBUG nova.storage.rbd_utils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.172 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:11:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/660832369' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.676 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.678 226239 DEBUG nova.virt.libvirt.vif [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-8673776',display_name='tempest-ServersTestMultiNic-server-8673776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-8673776',id=86,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-xpuh8pw8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-1053198737-proje
ct-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:11:22Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=f6cd19dd-9676-4737-a5dc-6b0d0705d8ca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "address": "fa:16:3e:90:4d:5c", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.129", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0ce9b5-cd", "ovs_interfaceid": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.679 226239 DEBUG nova.network.os_vif_util [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "address": "fa:16:3e:90:4d:5c", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.129", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0ce9b5-cd", "ovs_interfaceid": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.679 226239 DEBUG nova.network.os_vif_util [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:4d:5c,bridge_name='br-int',has_traffic_filtering=True,id=ab0ce9b5-cd73-4758-8513-45a7f13eefe3,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0ce9b5-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.680 226239 DEBUG nova.virt.libvirt.vif [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-8673776',display_name='tempest-ServersTestMultiNic-server-8673776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-8673776',id=86,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-xpuh8pw8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-1053198737-proje
ct-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:11:22Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=f6cd19dd-9676-4737-a5dc-6b0d0705d8ca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "address": "fa:16:3e:d7:44:22", "network": {"id": "aabdf0ce-15d3-47ef-8035-167ef2db9ba8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1490540087", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.151", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f4f30e-6b", "ovs_interfaceid": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.680 226239 DEBUG nova.network.os_vif_util [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "address": "fa:16:3e:d7:44:22", "network": {"id": "aabdf0ce-15d3-47ef-8035-167ef2db9ba8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1490540087", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.151", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f4f30e-6b", "ovs_interfaceid": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.681 226239 DEBUG nova.network.os_vif_util [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:44:22,bridge_name='br-int',has_traffic_filtering=True,id=b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9,network=Network(aabdf0ce-15d3-47ef-8035-167ef2db9ba8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f4f30e-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.682 226239 DEBUG nova.virt.libvirt.vif [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-8673776',display_name='tempest-ServersTestMultiNic-server-8673776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-8673776',id=86,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-xpuh8pw8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-1053198737-proje
ct-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:11:22Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=f6cd19dd-9676-4737-a5dc-6b0d0705d8ca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.682 226239 DEBUG nova.network.os_vif_util [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.682 226239 DEBUG nova.network.os_vif_util [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:73:e3,bridge_name='br-int',has_traffic_filtering=True,id=393a6935-9a5d-4dbb-8ace-1d72f748ecd4,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap393a6935-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.683 226239 DEBUG nova.objects.instance [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lazy-loading 'pci_devices' on Instance uuid f6cd19dd-9676-4737-a5dc-6b0d0705d8ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.700 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  <uuid>f6cd19dd-9676-4737-a5dc-6b0d0705d8ca</uuid>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  <name>instance-00000056</name>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersTestMultiNic-server-8673776</nova:name>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:11:39</nova:creationTime>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <nova:user uuid="55f81600a60b49aaae5b4c28549afdaf">tempest-ServersTestMultiNic-1053198737-project-member</nova:user>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <nova:project uuid="88b896f61c644b6fac0351ce6828b6e1">tempest-ServersTestMultiNic-1053198737</nova:project>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <nova:port uuid="ab0ce9b5-cd73-4758-8513-45a7f13eefe3">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.129" ipVersion="4"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <nova:port uuid="b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.1.151" ipVersion="4"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <nova:port uuid="393a6935-9a5d-4dbb-8ace-1d72f748ecd4">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.102" ipVersion="4"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <entry name="serial">f6cd19dd-9676-4737-a5dc-6b0d0705d8ca</entry>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <entry name="uuid">f6cd19dd-9676-4737-a5dc-6b0d0705d8ca</entry>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk.config">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:90:4d:5c"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <target dev="tapab0ce9b5-cd"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:d7:44:22"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <target dev="tapb3f4f30e-6b"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:08:73:e3"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <target dev="tap393a6935-9a"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/f6cd19dd-9676-4737-a5dc-6b0d0705d8ca/console.log" append="off"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:11:40 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:11:40 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:11:40 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:11:40 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.701 226239 DEBUG nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Preparing to wait for external event network-vif-plugged-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.702 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.702 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.702 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.703 226239 DEBUG nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Preparing to wait for external event network-vif-plugged-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.703 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.703 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.703 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.703 226239 DEBUG nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Preparing to wait for external event network-vif-plugged-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.704 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.704 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.704 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.705 226239 DEBUG nova.virt.libvirt.vif [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-8673776',display_name='tempest-ServersTestMultiNic-server-8673776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-8673776',id=86,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-xpuh8pw8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-105319
8737-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:11:22Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=f6cd19dd-9676-4737-a5dc-6b0d0705d8ca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "address": "fa:16:3e:90:4d:5c", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.129", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0ce9b5-cd", "ovs_interfaceid": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.705 226239 DEBUG nova.network.os_vif_util [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "address": "fa:16:3e:90:4d:5c", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.129", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0ce9b5-cd", "ovs_interfaceid": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.705 226239 DEBUG nova.network.os_vif_util [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:4d:5c,bridge_name='br-int',has_traffic_filtering=True,id=ab0ce9b5-cd73-4758-8513-45a7f13eefe3,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0ce9b5-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.706 226239 DEBUG os_vif [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:4d:5c,bridge_name='br-int',has_traffic_filtering=True,id=ab0ce9b5-cd73-4758-8513-45a7f13eefe3,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0ce9b5-cd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.706 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.707 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.707 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.709 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.710 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab0ce9b5-cd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.710 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapab0ce9b5-cd, col_values=(('external_ids', {'iface-id': 'ab0ce9b5-cd73-4758-8513-45a7f13eefe3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:90:4d:5c', 'vm-uuid': 'f6cd19dd-9676-4737-a5dc-6b0d0705d8ca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:40 np0005603623 NetworkManager[48970]: <info>  [1769847100.7129] manager: (tapab0ce9b5-cd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.712 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.716 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.720 226239 INFO os_vif [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:4d:5c,bridge_name='br-int',has_traffic_filtering=True,id=ab0ce9b5-cd73-4758-8513-45a7f13eefe3,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0ce9b5-cd')#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.720 226239 DEBUG nova.virt.libvirt.vif [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-8673776',display_name='tempest-ServersTestMultiNic-server-8673776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-8673776',id=86,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-xpuh8pw8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-105319
8737-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:11:22Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=f6cd19dd-9676-4737-a5dc-6b0d0705d8ca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "address": "fa:16:3e:d7:44:22", "network": {"id": "aabdf0ce-15d3-47ef-8035-167ef2db9ba8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1490540087", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.151", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f4f30e-6b", "ovs_interfaceid": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.721 226239 DEBUG nova.network.os_vif_util [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "address": "fa:16:3e:d7:44:22", "network": {"id": "aabdf0ce-15d3-47ef-8035-167ef2db9ba8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1490540087", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.151", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f4f30e-6b", "ovs_interfaceid": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.721 226239 DEBUG nova.network.os_vif_util [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:44:22,bridge_name='br-int',has_traffic_filtering=True,id=b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9,network=Network(aabdf0ce-15d3-47ef-8035-167ef2db9ba8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f4f30e-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.722 226239 DEBUG os_vif [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:44:22,bridge_name='br-int',has_traffic_filtering=True,id=b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9,network=Network(aabdf0ce-15d3-47ef-8035-167ef2db9ba8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f4f30e-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.722 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.722 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.722 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.724 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.724 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3f4f30e-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.724 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb3f4f30e-6b, col_values=(('external_ids', {'iface-id': 'b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:44:22', 'vm-uuid': 'f6cd19dd-9676-4737-a5dc-6b0d0705d8ca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.725 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:40 np0005603623 NetworkManager[48970]: <info>  [1769847100.7265] manager: (tapb3f4f30e-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/159)
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.732 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.733 226239 INFO os_vif [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:44:22,bridge_name='br-int',has_traffic_filtering=True,id=b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9,network=Network(aabdf0ce-15d3-47ef-8035-167ef2db9ba8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f4f30e-6b')#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.734 226239 DEBUG nova.virt.libvirt.vif [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-8673776',display_name='tempest-ServersTestMultiNic-server-8673776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-8673776',id=86,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-xpuh8pw8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-105319
8737-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:11:22Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=f6cd19dd-9676-4737-a5dc-6b0d0705d8ca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.734 226239 DEBUG nova.network.os_vif_util [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.734 226239 DEBUG nova.network.os_vif_util [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:73:e3,bridge_name='br-int',has_traffic_filtering=True,id=393a6935-9a5d-4dbb-8ace-1d72f748ecd4,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap393a6935-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.735 226239 DEBUG os_vif [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:73:e3,bridge_name='br-int',has_traffic_filtering=True,id=393a6935-9a5d-4dbb-8ace-1d72f748ecd4,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap393a6935-9a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.735 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.735 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.736 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.738 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.738 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap393a6935-9a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.738 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap393a6935-9a, col_values=(('external_ids', {'iface-id': '393a6935-9a5d-4dbb-8ace-1d72f748ecd4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:08:73:e3', 'vm-uuid': 'f6cd19dd-9676-4737-a5dc-6b0d0705d8ca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:40 np0005603623 NetworkManager[48970]: <info>  [1769847100.7407] manager: (tap393a6935-9a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/160)
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.739 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.743 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.748 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.749 226239 INFO os_vif [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:73:e3,bridge_name='br-int',has_traffic_filtering=True,id=393a6935-9a5d-4dbb-8ace-1d72f748ecd4,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap393a6935-9a')#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.815 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.815 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.815 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] No VIF found with MAC fa:16:3e:90:4d:5c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.816 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] No VIF found with MAC fa:16:3e:d7:44:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.816 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] No VIF found with MAC fa:16:3e:08:73:e3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.816 226239 INFO nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Using config drive#033[00m
Jan 31 03:11:40 np0005603623 nova_compute[226235]: 2026-01-31 08:11:40.847 226239 DEBUG nova.storage.rbd_utils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.147 226239 DEBUG nova.compute.manager [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received event network-changed-7429a420-eefe-4af6-b5a7-ad8aff346ea8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.147 226239 DEBUG nova.compute.manager [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Refreshing instance network info cache due to event network-changed-7429a420-eefe-4af6-b5a7-ad8aff346ea8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.148 226239 DEBUG oslo_concurrency.lockutils [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-702e2506-8d57-4ea2-b56e-1800da93f646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.148 226239 DEBUG oslo_concurrency.lockutils [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-702e2506-8d57-4ea2-b56e-1800da93f646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.148 226239 DEBUG nova.network.neutron [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Refreshing network info cache for port 7429a420-eefe-4af6-b5a7-ad8aff346ea8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:11:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:41.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.765 226239 DEBUG nova.network.neutron [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Updated VIF entry in instance network info cache for port b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.767 226239 DEBUG nova.network.neutron [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Updating instance_info_cache with network_info: [{"id": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "address": "fa:16:3e:90:4d:5c", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.129", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0ce9b5-cd", "ovs_interfaceid": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "address": "fa:16:3e:d7:44:22", "network": {"id": "aabdf0ce-15d3-47ef-8035-167ef2db9ba8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1490540087", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.151", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f4f30e-6b", "ovs_interfaceid": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.796 226239 DEBUG oslo_concurrency.lockutils [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.797 226239 DEBUG nova.compute.manager [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-changed-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.797 226239 DEBUG nova.compute.manager [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Refreshing instance network info cache due to event network-changed-393a6935-9a5d-4dbb-8ace-1d72f748ecd4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.797 226239 DEBUG oslo_concurrency.lockutils [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.798 226239 DEBUG oslo_concurrency.lockutils [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.798 226239 DEBUG nova.network.neutron [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Refreshing network info cache for port 393a6935-9a5d-4dbb-8ace-1d72f748ecd4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.819 226239 INFO nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Creating config drive at /var/lib/nova/instances/f6cd19dd-9676-4737-a5dc-6b0d0705d8ca/disk.config#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.824 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f6cd19dd-9676-4737-a5dc-6b0d0705d8ca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpyqqpi1lu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.946 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f6cd19dd-9676-4737-a5dc-6b0d0705d8ca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpyqqpi1lu" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:41.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.972 226239 DEBUG nova.storage.rbd_utils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:41 np0005603623 nova_compute[226235]: 2026-01-31 08:11:41.976 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f6cd19dd-9676-4737-a5dc-6b0d0705d8ca/disk.config f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.151 226239 DEBUG oslo_concurrency.processutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f6cd19dd-9676-4737-a5dc-6b0d0705d8ca/disk.config f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.175s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.152 226239 INFO nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Deleting local config drive /var/lib/nova/instances/f6cd19dd-9676-4737-a5dc-6b0d0705d8ca/disk.config because it was imported into RBD.#033[00m
Jan 31 03:11:42 np0005603623 NetworkManager[48970]: <info>  [1769847102.1856] manager: (tapab0ce9b5-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/161)
Jan 31 03:11:42 np0005603623 kernel: tapab0ce9b5-cd: entered promiscuous mode
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00331|binding|INFO|Claiming lport ab0ce9b5-cd73-4758-8513-45a7f13eefe3 for this chassis.
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00332|binding|INFO|ab0ce9b5-cd73-4758-8513-45a7f13eefe3: Claiming fa:16:3e:90:4d:5c 10.100.0.129
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.197 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603623 NetworkManager[48970]: <info>  [1769847102.1986] manager: (tapb3f4f30e-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/162)
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.205 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:4d:5c 10.100.0.129'], port_security=['fa:16:3e:90:4d:5c 10.100.0.129'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.129/24', 'neutron:device_id': 'f6cd19dd-9676-4737-a5dc-6b0d0705d8ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88b896f61c644b6fac0351ce6828b6e1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '52fcb5e9-44f4-488b-bcc4-743586d363ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b25f26c4-9818-49ac-b211-d1b5817e21f4, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ab0ce9b5-cd73-4758-8513-45a7f13eefe3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.206 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ab0ce9b5-cd73-4758-8513-45a7f13eefe3 in datapath 2a7e47cf-58cf-4e4a-af81-97bf0a6bc596 bound to our chassis#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.208 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2a7e47cf-58cf-4e4a-af81-97bf0a6bc596#033[00m
Jan 31 03:11:42 np0005603623 NetworkManager[48970]: <info>  [1769847102.2100] manager: (tap393a6935-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/163)
Jan 31 03:11:42 np0005603623 systemd-udevd[262539]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:11:42 np0005603623 systemd-udevd[262540]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:11:42 np0005603623 systemd-udevd[262541]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.216 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2a78a6ea-61a7-438f-906a-29e85af29ff5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.217 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2a7e47cf-51 in ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:11:42 np0005603623 kernel: tap393a6935-9a: entered promiscuous mode
Jan 31 03:11:42 np0005603623 kernel: tapb3f4f30e-6b: entered promiscuous mode
Jan 31 03:11:42 np0005603623 NetworkManager[48970]: <info>  [1769847102.2258] device (tapb3f4f30e-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:11:42 np0005603623 NetworkManager[48970]: <info>  [1769847102.2264] device (tapb3f4f30e-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00333|binding|INFO|Claiming lport 393a6935-9a5d-4dbb-8ace-1d72f748ecd4 for this chassis.
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00334|binding|INFO|393a6935-9a5d-4dbb-8ace-1d72f748ecd4: Claiming fa:16:3e:08:73:e3 10.100.0.102
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00335|binding|INFO|Claiming lport b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 for this chassis.
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00336|binding|INFO|b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9: Claiming fa:16:3e:d7:44:22 10.100.1.151
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00337|binding|INFO|Setting lport ab0ce9b5-cd73-4758-8513-45a7f13eefe3 ovn-installed in OVS
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.221 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.223 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.221 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2a7e47cf-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.221 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bc9bd2d1-b998-41d7-8054-f1125e0f178e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.227 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.227 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ba4f0841-8d3b-4ab2-8d1a-24451620670a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.230 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:73:e3 10.100.0.102'], port_security=['fa:16:3e:08:73:e3 10.100.0.102'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.102/24', 'neutron:device_id': 'f6cd19dd-9676-4737-a5dc-6b0d0705d8ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88b896f61c644b6fac0351ce6828b6e1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '52fcb5e9-44f4-488b-bcc4-743586d363ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b25f26c4-9818-49ac-b211-d1b5817e21f4, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=393a6935-9a5d-4dbb-8ace-1d72f748ecd4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.231 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:44:22 10.100.1.151'], port_security=['fa:16:3e:d7:44:22 10.100.1.151'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.151/24', 'neutron:device_id': 'f6cd19dd-9676-4737-a5dc-6b0d0705d8ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabdf0ce-15d3-47ef-8035-167ef2db9ba8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88b896f61c644b6fac0351ce6828b6e1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '52fcb5e9-44f4-488b-bcc4-743586d363ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=607aa9b2-1367-446a-ba20-d910db948ebd, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:42 np0005603623 NetworkManager[48970]: <info>  [1769847102.2328] device (tapab0ce9b5-cd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:11:42 np0005603623 NetworkManager[48970]: <info>  [1769847102.2333] device (tap393a6935-9a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00338|binding|INFO|Setting lport ab0ce9b5-cd73-4758-8513-45a7f13eefe3 up in Southbound
Jan 31 03:11:42 np0005603623 NetworkManager[48970]: <info>  [1769847102.2338] device (tapab0ce9b5-cd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:11:42 np0005603623 NetworkManager[48970]: <info>  [1769847102.2341] device (tap393a6935-9a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.237 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[d88732c9-41dc-451d-ad56-551e3adafa5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00339|binding|INFO|Setting lport 393a6935-9a5d-4dbb-8ace-1d72f748ecd4 ovn-installed in OVS
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00340|binding|INFO|Setting lport 393a6935-9a5d-4dbb-8ace-1d72f748ecd4 up in Southbound
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00341|binding|INFO|Setting lport b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 ovn-installed in OVS
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00342|binding|INFO|Setting lport b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 up in Southbound
Jan 31 03:11:42 np0005603623 systemd-machined[194379]: New machine qemu-38-instance-00000056.
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.245 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.246 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c7556ee0-e7b4-4c92-b49a-f5a721124346]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 systemd[1]: Started Virtual Machine qemu-38-instance-00000056.
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.272 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[59500909-46b5-42f6-af35-476f674deba4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 NetworkManager[48970]: <info>  [1769847102.2793] manager: (tap2a7e47cf-50): new Veth device (/org/freedesktop/NetworkManager/Devices/164)
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.280 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[13fe41f5-0d6f-47a1-8e89-46ad6308065e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.304 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d90691-b9ea-46a5-94df-08aaa1892a06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.306 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[a27df171-e94f-4470-ae53-a0ed2c3fdf89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 NetworkManager[48970]: <info>  [1769847102.3226] device (tap2a7e47cf-50): carrier: link connected
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.326 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[818fb66b-5564-4140-99cb-e38b6b96f767]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.339 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3ad391e8-f93f-47bf-8898-ab98db219096]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2a7e47cf-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:68:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624275, 'reachable_time': 18903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262578, 'error': None, 'target': 'ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.350 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[49d44ccd-5607-49e8-b86b-3dc3733e4671]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6f:6833'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624275, 'tstamp': 624275}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262579, 'error': None, 'target': 'ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.362 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[05bf0b9a-8d9a-45c6-95eb-1d973f66e27f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2a7e47cf-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:68:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624275, 'reachable_time': 18903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 262580, 'error': None, 'target': 'ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.387 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ae580674-6f24-4191-ac0f-5a7406e40a85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.423 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[be1d6787-4dfa-4650-aa86-75b2263852c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.424 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a7e47cf-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:42 np0005603623 kernel: tap2a7e47cf-50: entered promiscuous mode
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.425 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.425 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a7e47cf-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.426 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603623 NetworkManager[48970]: <info>  [1769847102.4283] manager: (tap2a7e47cf-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.429 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.431 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2a7e47cf-50, col_values=(('external_ids', {'iface-id': '2b722873-7a80-476d-8643-e2d1afbbf447'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.432 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:42Z|00343|binding|INFO|Releasing lport 2b722873-7a80-476d-8643-e2d1afbbf447 from this chassis (sb_readonly=0)
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.433 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.434 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2a7e47cf-58cf-4e4a-af81-97bf0a6bc596.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2a7e47cf-58cf-4e4a-af81-97bf0a6bc596.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.435 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[06756df0-080e-4c85-bc03-0a2d39c60e05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.436 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/2a7e47cf-58cf-4e4a-af81-97bf0a6bc596.pid.haproxy
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 2a7e47cf-58cf-4e4a-af81-97bf0a6bc596
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.438 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'env', 'PROCESS_TAG=haproxy-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2a7e47cf-58cf-4e4a-af81-97bf0a6bc596.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.440 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603623 podman[262613]: 2026-01-31 08:11:42.756093788 +0000 UTC m=+0.053702344 container create 859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:11:42 np0005603623 systemd[1]: Started libpod-conmon-859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd.scope.
Jan 31 03:11:42 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:11:42 np0005603623 podman[262613]: 2026-01-31 08:11:42.725864895 +0000 UTC m=+0.023473471 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:11:42 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f27e28db095faa3c9f0cf1f3894570005bbbd39f6a40dc8532096171c8faad59/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:11:42 np0005603623 podman[262613]: 2026-01-31 08:11:42.836521553 +0000 UTC m=+0.134130129 container init 859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Jan 31 03:11:42 np0005603623 podman[262613]: 2026-01-31 08:11:42.841794779 +0000 UTC m=+0.139403365 container start 859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:11:42 np0005603623 neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596[262644]: [NOTICE]   (262668) : New worker (262670) forked
Jan 31 03:11:42 np0005603623 neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596[262644]: [NOTICE]   (262668) : Loading success.
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.887 226239 DEBUG nova.compute.manager [req-06b86dab-5570-4de2-82e2-3921439b77b1 req-a14e2163-4184-4180-aa03-7b5f0f24b117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-plugged-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.888 226239 DEBUG oslo_concurrency.lockutils [req-06b86dab-5570-4de2-82e2-3921439b77b1 req-a14e2163-4184-4180-aa03-7b5f0f24b117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.889 226239 DEBUG oslo_concurrency.lockutils [req-06b86dab-5570-4de2-82e2-3921439b77b1 req-a14e2163-4184-4180-aa03-7b5f0f24b117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.889 226239 DEBUG oslo_concurrency.lockutils [req-06b86dab-5570-4de2-82e2-3921439b77b1 req-a14e2163-4184-4180-aa03-7b5f0f24b117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.889 226239 DEBUG nova.compute.manager [req-06b86dab-5570-4de2-82e2-3921439b77b1 req-a14e2163-4184-4180-aa03-7b5f0f24b117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Processing event network-vif-plugged-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.898 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 393a6935-9a5d-4dbb-8ace-1d72f748ecd4 in datapath 2a7e47cf-58cf-4e4a-af81-97bf0a6bc596 unbound from our chassis#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.901 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2a7e47cf-58cf-4e4a-af81-97bf0a6bc596#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.909 226239 DEBUG nova.compute.manager [req-1484abea-f92d-4736-a897-1e2faf45465b req-82610baf-be68-4581-9a0f-01589947e266 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-plugged-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.910 226239 DEBUG oslo_concurrency.lockutils [req-1484abea-f92d-4736-a897-1e2faf45465b req-82610baf-be68-4581-9a0f-01589947e266 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.910 226239 DEBUG oslo_concurrency.lockutils [req-1484abea-f92d-4736-a897-1e2faf45465b req-82610baf-be68-4581-9a0f-01589947e266 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.910 226239 DEBUG oslo_concurrency.lockutils [req-1484abea-f92d-4736-a897-1e2faf45465b req-82610baf-be68-4581-9a0f-01589947e266 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.911 226239 DEBUG nova.compute.manager [req-1484abea-f92d-4736-a897-1e2faf45465b req-82610baf-be68-4581-9a0f-01589947e266 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Processing event network-vif-plugged-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.920 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[70cced3f-0b28-4223-89eb-58a906e4014f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.948 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[0029cb2d-76a9-422a-a9cf-cc74858d8a87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.951 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[0a679151-bdad-487c-a0ef-b58b329d2518]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.977 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[83bbcf1a-38ea-4af9-a160-db94143000d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:42.994 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2661dc76-9613-417e-b078-274e444393e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2a7e47cf-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:68:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624275, 'reachable_time': 18903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262692, 'error': None, 'target': 'ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.997 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847102.9952242, f6cd19dd-9676-4737-a5dc-6b0d0705d8ca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:42 np0005603623 nova_compute[226235]: 2026-01-31 08:11:42.997 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] VM Started (Lifecycle Event)#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.016 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f92b31bb-72d8-4b84-842a-8837747753bd]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap2a7e47cf-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624283, 'tstamp': 624283}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262693, 'error': None, 'target': 'ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2a7e47cf-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624285, 'tstamp': 624285}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262693, 'error': None, 'target': 'ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.018 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a7e47cf-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.021 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.022 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a7e47cf-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.023 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.023 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2a7e47cf-50, col_values=(('external_ids', {'iface-id': '2b722873-7a80-476d-8643-e2d1afbbf447'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.024 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.025 143258 INFO neutron.agent.ovn.metadata.agent [-] Port b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 in datapath aabdf0ce-15d3-47ef-8035-167ef2db9ba8 unbound from our chassis#033[00m
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.027 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.027 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aabdf0ce-15d3-47ef-8035-167ef2db9ba8#033[00m
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.034 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847102.9954355, f6cd19dd-9676-4737-a5dc-6b0d0705d8ca => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.035 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.038 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c0ad7aec-de2b-418d-93c4-f577ae3071d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.040 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaabdf0ce-11 in ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.042 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaabdf0ce-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.042 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3634d3a6-37f0-4ad6-8cda-e3c5f40beba2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.043 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b60d10a6-221a-4485-82d9-2c8fdf595c9b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.058 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[77a2b14a-4ca7-4d10-a98e-ea64488dc6ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.065 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.072 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.078 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ce179234-f817-4d67-a046-2abba3e3e1f6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.098 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.114 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[781f60f1-2c45-42b0-883c-c4a3f2b9dbae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:43 np0005603623 NetworkManager[48970]: <info>  [1769847103.1233] manager: (tapaabdf0ce-10): new Veth device (/org/freedesktop/NetworkManager/Devices/166)
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.124 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7692bb6b-841c-40f3-98cd-50118c8aa86a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.159 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[5da9e95b-8be6-451d-b26e-eece1b50e907]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.163 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[698ffd09-aaae-49bc-abe9-a523d0577214]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:11:43 np0005603623 NetworkManager[48970]: <info>  [1769847103.1905] device (tapaabdf0ce-10): carrier: link connected
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.199 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[0158ea31-a48e-4aa7-89d9-6d71ceecc114]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.219 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4a0685e0-cdf6-4470-9253-f8f7df543eb9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabdf0ce-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:98:4b:00'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624361, 'reachable_time': 29076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262704, 'error': None, 'target': 'ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.238 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a389701c-e74e-471b-a8cd-475390f0fbce]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe98:4b00'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624361, 'tstamp': 624361}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262705, 'error': None, 'target': 'ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.261 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[523eb48f-c1e3-47b7-a368-c0da17285c86]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabdf0ce-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:98:4b:00'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624361, 'reachable_time': 29076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 262706, 'error': None, 'target': 'ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.295 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[adfa11be-f033-447f-8ebd-e9be62dea7e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.369 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0d005762-ed0c-4573-9202-a0bfc5ea546f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.371 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabdf0ce-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.371 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.371 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaabdf0ce-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.373 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:11:43 np0005603623 NetworkManager[48970]: <info>  [1769847103.3748] manager: (tapaabdf0ce-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/167)
Jan 31 03:11:43 np0005603623 kernel: tapaabdf0ce-10: entered promiscuous mode
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.377 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaabdf0ce-10, col_values=(('external_ids', {'iface-id': '0476569d-630e-4496-b4dc-c6614cf11bf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:11:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:43Z|00344|binding|INFO|Releasing lport 0476569d-630e-4496-b4dc-c6614cf11bf8 from this chassis (sb_readonly=0)
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.380 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.380 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aabdf0ce-15d3-47ef-8035-167ef2db9ba8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aabdf0ce-15d3-47ef-8035-167ef2db9ba8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.382 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ca29b1f3-0836-4a0a-88a0-e0e5e751ebb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.382 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-aabdf0ce-15d3-47ef-8035-167ef2db9ba8
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/aabdf0ce-15d3-47ef-8035-167ef2db9ba8.pid.haproxy
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID aabdf0ce-15d3-47ef-8035-167ef2db9ba8
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 31 03:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:43.384 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8', 'env', 'PROCESS_TAG=haproxy-aabdf0ce-15d3-47ef-8035-167ef2db9ba8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aabdf0ce-15d3-47ef-8035-167ef2db9ba8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.386 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.441 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:11:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:43.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.662 226239 DEBUG nova.network.neutron [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Updated VIF entry in instance network info cache for port 7429a420-eefe-4af6-b5a7-ad8aff346ea8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.663 226239 DEBUG nova.network.neutron [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Updating instance_info_cache with network_info: [{"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:43 np0005603623 nova_compute[226235]: 2026-01-31 08:11:43.686 226239 DEBUG oslo_concurrency.lockutils [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-702e2506-8d57-4ea2-b56e-1800da93f646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:11:43 np0005603623 podman[262739]: 2026-01-31 08:11:43.728985646 +0000 UTC m=+0.050856493 container create 137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:11:43 np0005603623 systemd[1]: Started libpod-conmon-137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395.scope.
Jan 31 03:11:43 np0005603623 podman[262739]: 2026-01-31 08:11:43.700165418 +0000 UTC m=+0.022036275 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:11:43 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:11:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:43 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f22872e5e79c99ee6d2710d38ab13884475e6a10a1855948bc375d51c09ccd0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:11:43 np0005603623 podman[262739]: 2026-01-31 08:11:43.820335996 +0000 UTC m=+0.142206853 container init 137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:11:43 np0005603623 podman[262739]: 2026-01-31 08:11:43.827208762 +0000 UTC m=+0.149079599 container start 137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:11:43 np0005603623 neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8[262755]: [NOTICE]   (262759) : New worker (262761) forked
Jan 31 03:11:43 np0005603623 neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8[262755]: [NOTICE]   (262759) : Loading success.
Jan 31 03:11:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:43.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:44 np0005603623 nova_compute[226235]: 2026-01-31 08:11:44.116 226239 DEBUG nova.network.neutron [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Updated VIF entry in instance network info cache for port 393a6935-9a5d-4dbb-8ace-1d72f748ecd4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 03:11:44 np0005603623 nova_compute[226235]: 2026-01-31 08:11:44.116 226239 DEBUG nova.network.neutron [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Updating instance_info_cache with network_info: [{"id": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "address": "fa:16:3e:90:4d:5c", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.129", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0ce9b5-cd", "ovs_interfaceid": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "address": "fa:16:3e:d7:44:22", "network": {"id": "aabdf0ce-15d3-47ef-8035-167ef2db9ba8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1490540087", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.151", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f4f30e-6b", "ovs_interfaceid": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:44 np0005603623 nova_compute[226235]: 2026-01-31 08:11:44.138 226239 DEBUG oslo_concurrency.lockutils [req-426a2ba4-d078-4b66-ba5d-9d461abc9f33 req-054d77a3-6e84-4e15-852f-d6b52f87120b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.038 226239 DEBUG nova.compute.manager [req-43e49c7b-83eb-49eb-be24-309e5dd7a1fe req-9ba2fdd8-12d1-44da-b47f-b3424470528f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-plugged-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.038 226239 DEBUG oslo_concurrency.lockutils [req-43e49c7b-83eb-49eb-be24-309e5dd7a1fe req-9ba2fdd8-12d1-44da-b47f-b3424470528f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.040 226239 DEBUG oslo_concurrency.lockutils [req-43e49c7b-83eb-49eb-be24-309e5dd7a1fe req-9ba2fdd8-12d1-44da-b47f-b3424470528f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.040 226239 DEBUG oslo_concurrency.lockutils [req-43e49c7b-83eb-49eb-be24-309e5dd7a1fe req-9ba2fdd8-12d1-44da-b47f-b3424470528f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.041 226239 DEBUG nova.compute.manager [req-43e49c7b-83eb-49eb-be24-309e5dd7a1fe req-9ba2fdd8-12d1-44da-b47f-b3424470528f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] No event matching network-vif-plugged-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 in dict_keys([('network-vif-plugged', '393a6935-9a5d-4dbb-8ace-1d72f748ecd4')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.042 226239 WARNING nova.compute.manager [req-43e49c7b-83eb-49eb-be24-309e5dd7a1fe req-9ba2fdd8-12d1-44da-b47f-b3424470528f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received unexpected event network-vif-plugged-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 for instance with vm_state building and task_state spawning.
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.150 226239 DEBUG nova.compute.manager [req-393f4db7-dbd8-4262-8b14-3cdc5895789d req-9c6f6b31-d875-4861-aa2c-81b0d30e8a1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-plugged-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.151 226239 DEBUG oslo_concurrency.lockutils [req-393f4db7-dbd8-4262-8b14-3cdc5895789d req-9c6f6b31-d875-4861-aa2c-81b0d30e8a1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.151 226239 DEBUG oslo_concurrency.lockutils [req-393f4db7-dbd8-4262-8b14-3cdc5895789d req-9c6f6b31-d875-4861-aa2c-81b0d30e8a1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.152 226239 DEBUG oslo_concurrency.lockutils [req-393f4db7-dbd8-4262-8b14-3cdc5895789d req-9c6f6b31-d875-4861-aa2c-81b0d30e8a1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.152 226239 DEBUG nova.compute.manager [req-393f4db7-dbd8-4262-8b14-3cdc5895789d req-9c6f6b31-d875-4861-aa2c-81b0d30e8a1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] No event matching network-vif-plugged-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 in dict_keys([('network-vif-plugged', '393a6935-9a5d-4dbb-8ace-1d72f748ecd4')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.152 226239 WARNING nova.compute.manager [req-393f4db7-dbd8-4262-8b14-3cdc5895789d req-9c6f6b31-d875-4861-aa2c-81b0d30e8a1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received unexpected event network-vif-plugged-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.367 226239 DEBUG nova.compute.manager [req-7c671e43-3726-4718-b80a-fb7aa0d7bf9d req-5ffe4daa-a803-4c2b-a1ca-834431245f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-plugged-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.368 226239 DEBUG oslo_concurrency.lockutils [req-7c671e43-3726-4718-b80a-fb7aa0d7bf9d req-5ffe4daa-a803-4c2b-a1ca-834431245f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.368 226239 DEBUG oslo_concurrency.lockutils [req-7c671e43-3726-4718-b80a-fb7aa0d7bf9d req-5ffe4daa-a803-4c2b-a1ca-834431245f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.368 226239 DEBUG oslo_concurrency.lockutils [req-7c671e43-3726-4718-b80a-fb7aa0d7bf9d req-5ffe4daa-a803-4c2b-a1ca-834431245f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.369 226239 DEBUG nova.compute.manager [req-7c671e43-3726-4718-b80a-fb7aa0d7bf9d req-5ffe4daa-a803-4c2b-a1ca-834431245f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Processing event network-vif-plugged-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.369 226239 DEBUG nova.compute.manager [req-7c671e43-3726-4718-b80a-fb7aa0d7bf9d req-5ffe4daa-a803-4c2b-a1ca-834431245f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-plugged-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.370 226239 DEBUG oslo_concurrency.lockutils [req-7c671e43-3726-4718-b80a-fb7aa0d7bf9d req-5ffe4daa-a803-4c2b-a1ca-834431245f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.370 226239 DEBUG oslo_concurrency.lockutils [req-7c671e43-3726-4718-b80a-fb7aa0d7bf9d req-5ffe4daa-a803-4c2b-a1ca-834431245f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.370 226239 DEBUG oslo_concurrency.lockutils [req-7c671e43-3726-4718-b80a-fb7aa0d7bf9d req-5ffe4daa-a803-4c2b-a1ca-834431245f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.371 226239 DEBUG nova.compute.manager [req-7c671e43-3726-4718-b80a-fb7aa0d7bf9d req-5ffe4daa-a803-4c2b-a1ca-834431245f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] No waiting events found dispatching network-vif-plugged-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.371 226239 WARNING nova.compute.manager [req-7c671e43-3726-4718-b80a-fb7aa0d7bf9d req-5ffe4daa-a803-4c2b-a1ca-834431245f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received unexpected event network-vif-plugged-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.372 226239 DEBUG nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.377 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847105.3766627, f6cd19dd-9676-4737-a5dc-6b0d0705d8ca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.378 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.381 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.388 226239 INFO nova.virt.libvirt.driver [-] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Instance spawned successfully.#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.390 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.410 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.415 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.428 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.429 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.430 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.430 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.431 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.432 226239 DEBUG nova.virt.libvirt.driver [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.444 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.510 226239 INFO nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Took 23.09 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.511 226239 DEBUG nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:11:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:45.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.569 226239 INFO nova.compute.manager [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Took 24.24 seconds to build instance.#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.594 226239 DEBUG oslo_concurrency.lockutils [None req-74e5b4e2-3b65-4ece-b1a8-6c21824bab60 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:45 np0005603623 nova_compute[226235]: 2026-01-31 08:11:45.740 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:45.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.238 226239 DEBUG oslo_concurrency.lockutils [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.239 226239 DEBUG oslo_concurrency.lockutils [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.240 226239 DEBUG oslo_concurrency.lockutils [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.240 226239 DEBUG oslo_concurrency.lockutils [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.241 226239 DEBUG oslo_concurrency.lockutils [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.242 226239 INFO nova.compute.manager [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Terminating instance#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.243 226239 DEBUG nova.compute.manager [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:11:47 np0005603623 kernel: tapab0ce9b5-cd (unregistering): left promiscuous mode
Jan 31 03:11:47 np0005603623 NetworkManager[48970]: <info>  [1769847107.2855] device (tapab0ce9b5-cd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:11:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:47Z|00345|binding|INFO|Releasing lport ab0ce9b5-cd73-4758-8513-45a7f13eefe3 from this chassis (sb_readonly=0)
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.295 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:47Z|00346|binding|INFO|Setting lport ab0ce9b5-cd73-4758-8513-45a7f13eefe3 down in Southbound
Jan 31 03:11:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:47Z|00347|binding|INFO|Removing iface tapab0ce9b5-cd ovn-installed in OVS
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.299 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 kernel: tapb3f4f30e-6b (unregistering): left promiscuous mode
Jan 31 03:11:47 np0005603623 NetworkManager[48970]: <info>  [1769847107.3040] device (tapb3f4f30e-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.307 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:4d:5c 10.100.0.129'], port_security=['fa:16:3e:90:4d:5c 10.100.0.129'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.129/24', 'neutron:device_id': 'f6cd19dd-9676-4737-a5dc-6b0d0705d8ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88b896f61c644b6fac0351ce6828b6e1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '52fcb5e9-44f4-488b-bcc4-743586d363ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b25f26c4-9818-49ac-b211-d1b5817e21f4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ab0ce9b5-cd73-4758-8513-45a7f13eefe3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.308 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ab0ce9b5-cd73-4758-8513-45a7f13eefe3 in datapath 2a7e47cf-58cf-4e4a-af81-97bf0a6bc596 unbound from our chassis#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.311 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2a7e47cf-58cf-4e4a-af81-97bf0a6bc596#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.310 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:47Z|00348|binding|INFO|Releasing lport b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 from this chassis (sb_readonly=0)
Jan 31 03:11:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:47Z|00349|binding|INFO|Setting lport b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 down in Southbound
Jan 31 03:11:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:47Z|00350|binding|INFO|Removing iface tapb3f4f30e-6b ovn-installed in OVS
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.314 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.320 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.322 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:44:22 10.100.1.151'], port_security=['fa:16:3e:d7:44:22 10.100.1.151'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.151/24', 'neutron:device_id': 'f6cd19dd-9676-4737-a5dc-6b0d0705d8ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabdf0ce-15d3-47ef-8035-167ef2db9ba8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88b896f61c644b6fac0351ce6828b6e1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '52fcb5e9-44f4-488b-bcc4-743586d363ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=607aa9b2-1367-446a-ba20-d910db948ebd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.326 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[80005517-506a-4620-94c6-fe9180a4bb87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 kernel: tap393a6935-9a (unregistering): left promiscuous mode
Jan 31 03:11:47 np0005603623 NetworkManager[48970]: <info>  [1769847107.3327] device (tap393a6935-9a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:11:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:47Z|00351|binding|INFO|Releasing lport 393a6935-9a5d-4dbb-8ace-1d72f748ecd4 from this chassis (sb_readonly=0)
Jan 31 03:11:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:47Z|00352|binding|INFO|Setting lport 393a6935-9a5d-4dbb-8ace-1d72f748ecd4 down in Southbound
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.346 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:47Z|00353|binding|INFO|Removing iface tap393a6935-9a ovn-installed in OVS
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.348 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.353 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.354 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[527eb62c-9541-47c3-8deb-295f2f98c41e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.356 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:73:e3 10.100.0.102'], port_security=['fa:16:3e:08:73:e3 10.100.0.102'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.102/24', 'neutron:device_id': 'f6cd19dd-9676-4737-a5dc-6b0d0705d8ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88b896f61c644b6fac0351ce6828b6e1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '52fcb5e9-44f4-488b-bcc4-743586d363ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b25f26c4-9818-49ac-b211-d1b5817e21f4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=393a6935-9a5d-4dbb-8ace-1d72f748ecd4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.359 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f2060b9e-9e0b-4302-a5ea-b2074dba7544]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000056.scope: Deactivated successfully.
Jan 31 03:11:47 np0005603623 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000056.scope: Consumed 2.665s CPU time.
Jan 31 03:11:47 np0005603623 systemd-machined[194379]: Machine qemu-38-instance-00000056 terminated.
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.386 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4be17299-a72b-49c4-9c38-49be07770ee1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.400 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[15852ec9-22fc-4144-a88f-f785358fb3bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2a7e47cf-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:68:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624275, 'reachable_time': 18903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262795, 'error': None, 'target': 'ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.414 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ed489412-977a-4a8c-99f1-1354a4ffc07e]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap2a7e47cf-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624283, 'tstamp': 624283}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262796, 'error': None, 'target': 'ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2a7e47cf-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624285, 'tstamp': 624285}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262796, 'error': None, 'target': 'ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.415 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a7e47cf-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.417 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.423 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.424 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a7e47cf-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.425 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.425 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2a7e47cf-50, col_values=(('external_ids', {'iface-id': '2b722873-7a80-476d-8643-e2d1afbbf447'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.425 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.427 143258 INFO neutron.agent.ovn.metadata.agent [-] Port b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 in datapath aabdf0ce-15d3-47ef-8035-167ef2db9ba8 unbound from our chassis#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.428 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aabdf0ce-15d3-47ef-8035-167ef2db9ba8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.429 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed6e03c-a6ac-4f61-8a73-c03677c8dd7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.430 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8 namespace which is not needed anymore#033[00m
Jan 31 03:11:47 np0005603623 NetworkManager[48970]: <info>  [1769847107.4701] manager: (tapb3f4f30e-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/168)
Jan 31 03:11:47 np0005603623 NetworkManager[48970]: <info>  [1769847107.4811] manager: (tap393a6935-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/169)
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.500 226239 INFO nova.virt.libvirt.driver [-] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Instance destroyed successfully.#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.500 226239 DEBUG nova.objects.instance [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lazy-loading 'resources' on Instance uuid f6cd19dd-9676-4737-a5dc-6b0d0705d8ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.517 226239 DEBUG nova.virt.libvirt.vif [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-8673776',display_name='tempest-ServersTestMultiNic-server-8673776',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-8673776',id=86,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:11:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-xpuh8pw8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',im
age_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-1053198737-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:11:45Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=f6cd19dd-9676-4737-a5dc-6b0d0705d8ca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "address": "fa:16:3e:90:4d:5c", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.129", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0ce9b5-cd", "ovs_interfaceid": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.517 226239 DEBUG nova.network.os_vif_util [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "address": "fa:16:3e:90:4d:5c", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.129", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0ce9b5-cd", "ovs_interfaceid": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.518 226239 DEBUG nova.network.os_vif_util [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:4d:5c,bridge_name='br-int',has_traffic_filtering=True,id=ab0ce9b5-cd73-4758-8513-45a7f13eefe3,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0ce9b5-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.518 226239 DEBUG os_vif [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:4d:5c,bridge_name='br-int',has_traffic_filtering=True,id=ab0ce9b5-cd73-4758-8513-45a7f13eefe3,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0ce9b5-cd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.519 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.520 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab0ce9b5-cd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.521 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.523 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.526 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.528 226239 INFO os_vif [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:4d:5c,bridge_name='br-int',has_traffic_filtering=True,id=ab0ce9b5-cd73-4758-8513-45a7f13eefe3,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0ce9b5-cd')#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.529 226239 DEBUG nova.virt.libvirt.vif [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-8673776',display_name='tempest-ServersTestMultiNic-server-8673776',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-8673776',id=86,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:11:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-xpuh8pw8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',im
age_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-1053198737-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:11:45Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=f6cd19dd-9676-4737-a5dc-6b0d0705d8ca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "address": "fa:16:3e:d7:44:22", "network": {"id": "aabdf0ce-15d3-47ef-8035-167ef2db9ba8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1490540087", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.151", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f4f30e-6b", "ovs_interfaceid": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.529 226239 DEBUG nova.network.os_vif_util [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "address": "fa:16:3e:d7:44:22", "network": {"id": "aabdf0ce-15d3-47ef-8035-167ef2db9ba8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1490540087", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.151", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f4f30e-6b", "ovs_interfaceid": "b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.530 226239 DEBUG nova.network.os_vif_util [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:44:22,bridge_name='br-int',has_traffic_filtering=True,id=b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9,network=Network(aabdf0ce-15d3-47ef-8035-167ef2db9ba8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f4f30e-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.530 226239 DEBUG os_vif [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:44:22,bridge_name='br-int',has_traffic_filtering=True,id=b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9,network=Network(aabdf0ce-15d3-47ef-8035-167ef2db9ba8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f4f30e-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.531 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.531 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3f4f30e-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.533 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.535 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.538 226239 INFO os_vif [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:44:22,bridge_name='br-int',has_traffic_filtering=True,id=b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9,network=Network(aabdf0ce-15d3-47ef-8035-167ef2db9ba8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f4f30e-6b')#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.539 226239 DEBUG nova.virt.libvirt.vif [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-8673776',display_name='tempest-ServersTestMultiNic-server-8673776',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-8673776',id=86,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:11:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-xpuh8pw8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',im
age_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-1053198737-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:11:45Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=f6cd19dd-9676-4737-a5dc-6b0d0705d8ca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.539 226239 DEBUG nova.network.os_vif_util [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.540 226239 DEBUG nova.network.os_vif_util [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:73:e3,bridge_name='br-int',has_traffic_filtering=True,id=393a6935-9a5d-4dbb-8ace-1d72f748ecd4,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap393a6935-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.541 226239 DEBUG os_vif [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:73:e3,bridge_name='br-int',has_traffic_filtering=True,id=393a6935-9a5d-4dbb-8ace-1d72f748ecd4,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap393a6935-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.541 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.542 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap393a6935-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.543 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.544 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.546 226239 INFO os_vif [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:73:e3,bridge_name='br-int',has_traffic_filtering=True,id=393a6935-9a5d-4dbb-8ace-1d72f748ecd4,network=Network(2a7e47cf-58cf-4e4a-af81-97bf0a6bc596),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap393a6935-9a')#033[00m
Jan 31 03:11:47 np0005603623 neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8[262755]: [NOTICE]   (262759) : haproxy version is 2.8.14-c23fe91
Jan 31 03:11:47 np0005603623 neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8[262755]: [NOTICE]   (262759) : path to executable is /usr/sbin/haproxy
Jan 31 03:11:47 np0005603623 neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8[262755]: [WARNING]  (262759) : Exiting Master process...
Jan 31 03:11:47 np0005603623 neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8[262755]: [ALERT]    (262759) : Current worker (262761) exited with code 143 (Terminated)
Jan 31 03:11:47 np0005603623 neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8[262755]: [WARNING]  (262759) : All workers exited. Exiting... (0)
Jan 31 03:11:47 np0005603623 systemd[1]: libpod-137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395.scope: Deactivated successfully.
Jan 31 03:11:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:47.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:47 np0005603623 podman[262852]: 2026-01-31 08:11:47.559834071 +0000 UTC m=+0.055257753 container died 137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:11:47 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395-userdata-shm.mount: Deactivated successfully.
Jan 31 03:11:47 np0005603623 systemd[1]: var-lib-containers-storage-overlay-f22872e5e79c99ee6d2710d38ab13884475e6a10a1855948bc375d51c09ccd0f-merged.mount: Deactivated successfully.
Jan 31 03:11:47 np0005603623 podman[262852]: 2026-01-31 08:11:47.599549083 +0000 UTC m=+0.094972755 container cleanup 137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 03:11:47 np0005603623 systemd[1]: libpod-conmon-137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395.scope: Deactivated successfully.
Jan 31 03:11:47 np0005603623 podman[262903]: 2026-01-31 08:11:47.651906273 +0000 UTC m=+0.037788212 container remove 137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.656 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[df998faf-5f00-473d-b6ad-4c889ceb2bc1]: (4, ('Sat Jan 31 08:11:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8 (137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395)\n137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395\nSat Jan 31 08:11:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8 (137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395)\n137bb16f75d08efe273e324bd0b02debeb446a5abb656af06041ab8c94b8f395\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.658 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cd559272-c7f0-4fbc-8c62-18182fef2dac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.659 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabdf0ce-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.661 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 kernel: tapaabdf0ce-10: left promiscuous mode
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.669 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.671 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d15c137a-a8e9-47d0-8c6e-b241b6242ead]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.681 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7a0560b6-eabc-42cc-8ac0-59334607a395]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.682 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[edb6fb59-96d3-462a-b9ec-6eaaddb2ad42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.692 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[368163a4-d3f3-4f4b-bc55-a13b244dcdff]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624353, 'reachable_time': 28649, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262918, 'error': None, 'target': 'ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.695 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aabdf0ce-15d3-47ef-8035-167ef2db9ba8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.695 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[818bc4a1-da07-4612-bd5d-d9d8cb6710b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 systemd[1]: run-netns-ovnmeta\x2daabdf0ce\x2d15d3\x2d47ef\x2d8035\x2d167ef2db9ba8.mount: Deactivated successfully.
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.696 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 393a6935-9a5d-4dbb-8ace-1d72f748ecd4 in datapath 2a7e47cf-58cf-4e4a-af81-97bf0a6bc596 unbound from our chassis#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.697 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2a7e47cf-58cf-4e4a-af81-97bf0a6bc596, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.698 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[15caa8b7-7099-413a-8172-d16ab4573c9a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.698 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596 namespace which is not needed anymore#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.783 226239 DEBUG nova.compute.manager [req-a861b114-cf56-4b22-be84-205e6d3cd310 req-f552841b-5bed-4620-8e6f-3f29a5859808 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-unplugged-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.784 226239 DEBUG oslo_concurrency.lockutils [req-a861b114-cf56-4b22-be84-205e6d3cd310 req-f552841b-5bed-4620-8e6f-3f29a5859808 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.784 226239 DEBUG oslo_concurrency.lockutils [req-a861b114-cf56-4b22-be84-205e6d3cd310 req-f552841b-5bed-4620-8e6f-3f29a5859808 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.784 226239 DEBUG oslo_concurrency.lockutils [req-a861b114-cf56-4b22-be84-205e6d3cd310 req-f552841b-5bed-4620-8e6f-3f29a5859808 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.784 226239 DEBUG nova.compute.manager [req-a861b114-cf56-4b22-be84-205e6d3cd310 req-f552841b-5bed-4620-8e6f-3f29a5859808 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] No waiting events found dispatching network-vif-unplugged-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.784 226239 DEBUG nova.compute.manager [req-a861b114-cf56-4b22-be84-205e6d3cd310 req-f552841b-5bed-4620-8e6f-3f29a5859808 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-unplugged-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:11:47 np0005603623 neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596[262644]: [NOTICE]   (262668) : haproxy version is 2.8.14-c23fe91
Jan 31 03:11:47 np0005603623 neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596[262644]: [NOTICE]   (262668) : path to executable is /usr/sbin/haproxy
Jan 31 03:11:47 np0005603623 neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596[262644]: [WARNING]  (262668) : Exiting Master process...
Jan 31 03:11:47 np0005603623 neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596[262644]: [ALERT]    (262668) : Current worker (262670) exited with code 143 (Terminated)
Jan 31 03:11:47 np0005603623 neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596[262644]: [WARNING]  (262668) : All workers exited. Exiting... (0)
Jan 31 03:11:47 np0005603623 systemd[1]: libpod-859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd.scope: Deactivated successfully.
Jan 31 03:11:47 np0005603623 podman[262936]: 2026-01-31 08:11:47.816792591 +0000 UTC m=+0.053836987 container died 859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.816 226239 DEBUG nova.compute.manager [req-3b84c4b7-af95-44ec-b90d-95b62c770b55 req-52fe215e-2d4e-45de-95c2-4035c78768c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-unplugged-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.817 226239 DEBUG oslo_concurrency.lockutils [req-3b84c4b7-af95-44ec-b90d-95b62c770b55 req-52fe215e-2d4e-45de-95c2-4035c78768c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.817 226239 DEBUG oslo_concurrency.lockutils [req-3b84c4b7-af95-44ec-b90d-95b62c770b55 req-52fe215e-2d4e-45de-95c2-4035c78768c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.817 226239 DEBUG oslo_concurrency.lockutils [req-3b84c4b7-af95-44ec-b90d-95b62c770b55 req-52fe215e-2d4e-45de-95c2-4035c78768c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.818 226239 DEBUG nova.compute.manager [req-3b84c4b7-af95-44ec-b90d-95b62c770b55 req-52fe215e-2d4e-45de-95c2-4035c78768c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] No waiting events found dispatching network-vif-unplugged-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.818 226239 DEBUG nova.compute.manager [req-3b84c4b7-af95-44ec-b90d-95b62c770b55 req-52fe215e-2d4e-45de-95c2-4035c78768c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-unplugged-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:11:47 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd-userdata-shm.mount: Deactivated successfully.
Jan 31 03:11:47 np0005603623 systemd[1]: var-lib-containers-storage-overlay-f27e28db095faa3c9f0cf1f3894570005bbbd39f6a40dc8532096171c8faad59-merged.mount: Deactivated successfully.
Jan 31 03:11:47 np0005603623 podman[262936]: 2026-01-31 08:11:47.845883938 +0000 UTC m=+0.082928334 container cleanup 859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:11:47 np0005603623 systemd[1]: libpod-conmon-859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd.scope: Deactivated successfully.
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.880 226239 DEBUG nova.compute.manager [req-5f2c608a-9a63-4951-84e6-d3fda440ffaa req-eff118cf-ceee-4b7c-a08b-25c4f056acd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-unplugged-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.880 226239 DEBUG oslo_concurrency.lockutils [req-5f2c608a-9a63-4951-84e6-d3fda440ffaa req-eff118cf-ceee-4b7c-a08b-25c4f056acd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.881 226239 DEBUG oslo_concurrency.lockutils [req-5f2c608a-9a63-4951-84e6-d3fda440ffaa req-eff118cf-ceee-4b7c-a08b-25c4f056acd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.881 226239 DEBUG oslo_concurrency.lockutils [req-5f2c608a-9a63-4951-84e6-d3fda440ffaa req-eff118cf-ceee-4b7c-a08b-25c4f056acd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.881 226239 DEBUG nova.compute.manager [req-5f2c608a-9a63-4951-84e6-d3fda440ffaa req-eff118cf-ceee-4b7c-a08b-25c4f056acd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] No waiting events found dispatching network-vif-unplugged-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.881 226239 DEBUG nova.compute.manager [req-5f2c608a-9a63-4951-84e6-d3fda440ffaa req-eff118cf-ceee-4b7c-a08b-25c4f056acd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-unplugged-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:11:47 np0005603623 podman[262966]: 2026-01-31 08:11:47.916224736 +0000 UTC m=+0.053379524 container remove 859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.920 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c765f3c1-e092-4009-868c-bd21388907d2]: (4, ('Sat Jan 31 08:11:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596 (859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd)\n859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd\nSat Jan 31 08:11:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596 (859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd)\n859a370a579187df4c053ddb5b0ef53cbea4fefec7f0469a53eddd452babaefd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.922 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[44a3bc28-0a00-4d2c-9f97-4ef9357d78e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.924 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a7e47cf-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.925 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 kernel: tap2a7e47cf-50: left promiscuous mode
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.928 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.931 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[efae923d-81e6-49e6-9fab-14e6b6ae8757]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 nova_compute[226235]: 2026-01-31 08:11:47.932 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.945 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[26888f4c-95aa-429a-9691-950e27ef5be8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.948 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[af827423-0331-4578-8ca0-b8a349153928]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:47.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.967 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5e8a0b7c-64c9-45cb-9e7c-4404166e1c66]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624269, 'reachable_time': 16780, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262983, 'error': None, 'target': 'ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.970 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2a7e47cf-58cf-4e4a-af81-97bf0a6bc596 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:11:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:47.970 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[0c1f8dc1-cd2e-4c38-a10e-ff06d9a0dd2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:48 np0005603623 nova_compute[226235]: 2026-01-31 08:11:48.036 226239 INFO nova.virt.libvirt.driver [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Deleting instance files /var/lib/nova/instances/f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_del#033[00m
Jan 31 03:11:48 np0005603623 nova_compute[226235]: 2026-01-31 08:11:48.036 226239 INFO nova.virt.libvirt.driver [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Deletion of /var/lib/nova/instances/f6cd19dd-9676-4737-a5dc-6b0d0705d8ca_del complete#033[00m
Jan 31 03:11:48 np0005603623 nova_compute[226235]: 2026-01-31 08:11:48.112 226239 INFO nova.compute.manager [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Took 0.87 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:11:48 np0005603623 nova_compute[226235]: 2026-01-31 08:11:48.113 226239 DEBUG oslo.service.loopingcall [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:11:48 np0005603623 nova_compute[226235]: 2026-01-31 08:11:48.113 226239 DEBUG nova.compute.manager [-] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:11:48 np0005603623 nova_compute[226235]: 2026-01-31 08:11:48.113 226239 DEBUG nova.network.neutron [-] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:11:48 np0005603623 nova_compute[226235]: 2026-01-31 08:11:48.442 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:48 np0005603623 systemd[1]: run-netns-ovnmeta\x2d2a7e47cf\x2d58cf\x2d4e4a\x2daf81\x2d97bf0a6bc596.mount: Deactivated successfully.
Jan 31 03:11:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:49 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:49Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:39:1e:10 10.100.0.13
Jan 31 03:11:49 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:49Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:1e:10 10.100.0.13
Jan 31 03:11:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:49.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:49.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.008 226239 DEBUG nova.compute.manager [req-36529197-6bc8-41e5-9e00-6bd368006ffa req-5dd21b73-6526-42d5-b11d-3e33064d47e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-plugged-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.008 226239 DEBUG oslo_concurrency.lockutils [req-36529197-6bc8-41e5-9e00-6bd368006ffa req-5dd21b73-6526-42d5-b11d-3e33064d47e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.009 226239 DEBUG oslo_concurrency.lockutils [req-36529197-6bc8-41e5-9e00-6bd368006ffa req-5dd21b73-6526-42d5-b11d-3e33064d47e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.009 226239 DEBUG oslo_concurrency.lockutils [req-36529197-6bc8-41e5-9e00-6bd368006ffa req-5dd21b73-6526-42d5-b11d-3e33064d47e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.009 226239 DEBUG nova.compute.manager [req-36529197-6bc8-41e5-9e00-6bd368006ffa req-5dd21b73-6526-42d5-b11d-3e33064d47e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] No waiting events found dispatching network-vif-plugged-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.009 226239 WARNING nova.compute.manager [req-36529197-6bc8-41e5-9e00-6bd368006ffa req-5dd21b73-6526-42d5-b11d-3e33064d47e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received unexpected event network-vif-plugged-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.016 226239 DEBUG nova.compute.manager [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-plugged-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.016 226239 DEBUG oslo_concurrency.lockutils [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.017 226239 DEBUG oslo_concurrency.lockutils [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.017 226239 DEBUG oslo_concurrency.lockutils [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.017 226239 DEBUG nova.compute.manager [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] No waiting events found dispatching network-vif-plugged-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.017 226239 WARNING nova.compute.manager [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received unexpected event network-vif-plugged-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.017 226239 DEBUG nova.compute.manager [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-deleted-b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.018 226239 INFO nova.compute.manager [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Neutron deleted interface b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.018 226239 DEBUG nova.network.neutron [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Updating instance_info_cache with network_info: [{"id": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "address": "fa:16:3e:90:4d:5c", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.129", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0ce9b5-cd", "ovs_interfaceid": "ab0ce9b5-cd73-4758-8513-45a7f13eefe3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.041 226239 DEBUG nova.compute.manager [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Detach interface failed, port_id=b3f4f30e-6b1a-4ee1-8d01-bc705ce727c9, reason: Instance f6cd19dd-9676-4737-a5dc-6b0d0705d8ca could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.041 226239 DEBUG nova.compute.manager [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-deleted-ab0ce9b5-cd73-4758-8513-45a7f13eefe3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.041 226239 INFO nova.compute.manager [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Neutron deleted interface ab0ce9b5-cd73-4758-8513-45a7f13eefe3; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.041 226239 DEBUG nova.network.neutron [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Updating instance_info_cache with network_info: [{"id": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "address": "fa:16:3e:08:73:e3", "network": {"id": "2a7e47cf-58cf-4e4a-af81-97bf0a6bc596", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-169202200", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap393a6935-9a", "ovs_interfaceid": "393a6935-9a5d-4dbb-8ace-1d72f748ecd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.063 226239 DEBUG nova.compute.manager [req-15620d51-61df-4066-8200-53ed3ba10002 req-c5602889-3424-4928-84dd-65f67f43079a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Detach interface failed, port_id=ab0ce9b5-cd73-4758-8513-45a7f13eefe3, reason: Instance f6cd19dd-9676-4737-a5dc-6b0d0705d8ca could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.110 226239 DEBUG nova.compute.manager [req-438f72e4-9aa9-4abf-85de-38076f37f081 req-8daefa0a-14e9-4315-9e0b-4a93baaf4769 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-plugged-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.111 226239 DEBUG oslo_concurrency.lockutils [req-438f72e4-9aa9-4abf-85de-38076f37f081 req-8daefa0a-14e9-4315-9e0b-4a93baaf4769 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.111 226239 DEBUG oslo_concurrency.lockutils [req-438f72e4-9aa9-4abf-85de-38076f37f081 req-8daefa0a-14e9-4315-9e0b-4a93baaf4769 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.111 226239 DEBUG oslo_concurrency.lockutils [req-438f72e4-9aa9-4abf-85de-38076f37f081 req-8daefa0a-14e9-4315-9e0b-4a93baaf4769 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.112 226239 DEBUG nova.compute.manager [req-438f72e4-9aa9-4abf-85de-38076f37f081 req-8daefa0a-14e9-4315-9e0b-4a93baaf4769 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] No waiting events found dispatching network-vif-plugged-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.112 226239 WARNING nova.compute.manager [req-438f72e4-9aa9-4abf-85de-38076f37f081 req-8daefa0a-14e9-4315-9e0b-4a93baaf4769 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received unexpected event network-vif-plugged-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.179 226239 DEBUG nova.network.neutron [-] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.206 226239 INFO nova.compute.manager [-] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Took 2.09 seconds to deallocate network for instance.#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.260 226239 DEBUG oslo_concurrency.lockutils [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.261 226239 DEBUG oslo_concurrency.lockutils [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.327 226239 DEBUG oslo_concurrency.processutils [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:11:50 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2666228640' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.760 226239 DEBUG oslo_concurrency.processutils [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.764 226239 DEBUG nova.compute.provider_tree [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.797 226239 DEBUG nova.scheduler.client.report [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.825 226239 DEBUG oslo_concurrency.lockutils [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.847 226239 INFO nova.scheduler.client.report [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Deleted allocations for instance f6cd19dd-9676-4737-a5dc-6b0d0705d8ca#033[00m
Jan 31 03:11:50 np0005603623 nova_compute[226235]: 2026-01-31 08:11:50.910 226239 DEBUG oslo_concurrency.lockutils [None req-e0d5babd-53ba-493d-83c4-328db399a07f 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "f6cd19dd-9676-4737-a5dc-6b0d0705d8ca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:51.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:51.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:52 np0005603623 nova_compute[226235]: 2026-01-31 08:11:52.129 226239 DEBUG nova.compute.manager [req-5c56b9aa-71f4-4b6e-bd51-88936557195c req-bf3ff644-c9c9-4b5c-9e33-f3ac325e5f69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Received event network-vif-deleted-393a6935-9a5d-4dbb-8ace-1d72f748ecd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:52 np0005603623 nova_compute[226235]: 2026-01-31 08:11:52.546 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:52.737 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:52.737 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:11:52 np0005603623 nova_compute[226235]: 2026-01-31 08:11:52.738 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:11:52.738 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:11:53Z|00354|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:11:53 np0005603623 nova_compute[226235]: 2026-01-31 08:11:53.064 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:53 np0005603623 nova_compute[226235]: 2026-01-31 08:11:53.444 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:53.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:53.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:53 np0005603623 podman[263009]: 2026-01-31 08:11:53.991198615 +0000 UTC m=+0.071871307 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:11:53 np0005603623 podman[263010]: 2026-01-31 08:11:53.997273057 +0000 UTC m=+0.086679443 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:11:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:55.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:55.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:11:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:11:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.441 226239 DEBUG oslo_concurrency.lockutils [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.443 226239 DEBUG oslo_concurrency.lockutils [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.471 226239 DEBUG nova.objects.instance [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'flavor' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.541 226239 DEBUG oslo_concurrency.lockutils [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.550 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:57.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.763 226239 DEBUG oslo_concurrency.lockutils [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.764 226239 DEBUG oslo_concurrency.lockutils [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.764 226239 INFO nova.compute.manager [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Attaching volume 0a2f77d6-6ebd-4e32-8f3a-3b8197764510 to /dev/vdb#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.900 226239 DEBUG os_brick.utils [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.901 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.912 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.912 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[4f6b1649-caef-491f-89c6-a55752995897]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.913 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.939 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.939 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea23f1d-df9c-423d-945e-9508e12d9d22]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.941 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.948 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.948 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[b0db4c99-10d7-41e8-9404-8517ac0c59a8]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.950 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[c790e957-cdf0-4c77-aa38-45dfe00da376]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.950 226239 DEBUG oslo_concurrency.processutils [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:57.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.969 226239 DEBUG oslo_concurrency.processutils [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "nvme version" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.972 226239 DEBUG os_brick.initiator.connectors.lightos [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.972 226239 DEBUG os_brick.initiator.connectors.lightos [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.972 226239 DEBUG os_brick.initiator.connectors.lightos [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.973 226239 DEBUG os_brick.utils [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] <== get_connector_properties: return (72ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:11:57 np0005603623 nova_compute[226235]: 2026-01-31 08:11:57.973 226239 DEBUG nova.virt.block_device [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Updating existing volume attachment record: cefb6509-31de-4a4e-8fa0-03a3823c06e1 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:11:58 np0005603623 nova_compute[226235]: 2026-01-31 08:11:58.447 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:59 np0005603623 nova_compute[226235]: 2026-01-31 08:11:59.078 226239 DEBUG nova.objects.instance [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'flavor' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:59 np0005603623 nova_compute[226235]: 2026-01-31 08:11:59.101 226239 DEBUG nova.virt.libvirt.driver [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Attempting to attach volume 0a2f77d6-6ebd-4e32-8f3a-3b8197764510 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:11:59 np0005603623 nova_compute[226235]: 2026-01-31 08:11:59.105 226239 DEBUG nova.virt.libvirt.guest [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:11:59 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:11:59 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-0a2f77d6-6ebd-4e32-8f3a-3b8197764510">
Jan 31 03:11:59 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:11:59 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:11:59 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:11:59 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:11:59 np0005603623 nova_compute[226235]:  <auth username="openstack">
Jan 31 03:11:59 np0005603623 nova_compute[226235]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:11:59 np0005603623 nova_compute[226235]:  </auth>
Jan 31 03:11:59 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:11:59 np0005603623 nova_compute[226235]:  <serial>0a2f77d6-6ebd-4e32-8f3a-3b8197764510</serial>
Jan 31 03:11:59 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:11:59 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:11:59 np0005603623 nova_compute[226235]: 2026-01-31 08:11:59.397 226239 DEBUG nova.virt.libvirt.driver [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:59 np0005603623 nova_compute[226235]: 2026-01-31 08:11:59.398 226239 DEBUG nova.virt.libvirt.driver [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:59 np0005603623 nova_compute[226235]: 2026-01-31 08:11:59.399 226239 DEBUG nova.virt.libvirt.driver [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:59 np0005603623 nova_compute[226235]: 2026-01-31 08:11:59.399 226239 DEBUG nova.virt.libvirt.driver [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No VIF found with MAC fa:16:3e:39:1e:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:11:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:11:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:59.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:11:59 np0005603623 nova_compute[226235]: 2026-01-31 08:11:59.791 226239 DEBUG oslo_concurrency.lockutils [None req-419515d0-50dd-4489-81b1-0c1f3f8b8071 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:11:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:59.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:01.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:01 np0005603623 nova_compute[226235]: 2026-01-31 08:12:01.632 226239 INFO nova.compute.manager [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Rebuilding instance#033[00m
Jan 31 03:12:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:01.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:01 np0005603623 nova_compute[226235]: 2026-01-31 08:12:01.996 226239 DEBUG nova.objects.instance [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:02 np0005603623 nova_compute[226235]: 2026-01-31 08:12:02.013 226239 DEBUG nova.compute.manager [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:02 np0005603623 nova_compute[226235]: 2026-01-31 08:12:02.064 226239 DEBUG nova.objects.instance [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'pci_requests' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:02 np0005603623 nova_compute[226235]: 2026-01-31 08:12:02.076 226239 DEBUG nova.objects.instance [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:02 np0005603623 nova_compute[226235]: 2026-01-31 08:12:02.086 226239 DEBUG nova.objects.instance [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'resources' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:02 np0005603623 nova_compute[226235]: 2026-01-31 08:12:02.100 226239 DEBUG nova.objects.instance [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'migration_context' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:02 np0005603623 nova_compute[226235]: 2026-01-31 08:12:02.114 226239 DEBUG nova.objects.instance [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 03:12:02 np0005603623 nova_compute[226235]: 2026-01-31 08:12:02.118 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:12:02 np0005603623 nova_compute[226235]: 2026-01-31 08:12:02.498 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847107.4975867, f6cd19dd-9676-4737-a5dc-6b0d0705d8ca => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:02 np0005603623 nova_compute[226235]: 2026-01-31 08:12:02.499 226239 INFO nova.compute.manager [-] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:12:02 np0005603623 nova_compute[226235]: 2026-01-31 08:12:02.518 226239 DEBUG nova.compute.manager [None req-cfe6951e-726a-4991-a3fc-3afe7e424f6d - - - - - -] [instance: f6cd19dd-9676-4737-a5dc-6b0d0705d8ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:02 np0005603623 nova_compute[226235]: 2026-01-31 08:12:02.554 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:03 np0005603623 nova_compute[226235]: 2026-01-31 08:12:03.452 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:03.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:12:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:12:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:12:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:03.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:12:04 np0005603623 kernel: tap7429a420-ee (unregistering): left promiscuous mode
Jan 31 03:12:04 np0005603623 NetworkManager[48970]: <info>  [1769847124.4534] device (tap7429a420-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:12:04 np0005603623 nova_compute[226235]: 2026-01-31 08:12:04.461 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:04Z|00355|binding|INFO|Releasing lport 7429a420-eefe-4af6-b5a7-ad8aff346ea8 from this chassis (sb_readonly=0)
Jan 31 03:12:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:04Z|00356|binding|INFO|Setting lport 7429a420-eefe-4af6-b5a7-ad8aff346ea8 down in Southbound
Jan 31 03:12:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:04Z|00357|binding|INFO|Removing iface tap7429a420-ee ovn-installed in OVS
Jan 31 03:12:04 np0005603623 nova_compute[226235]: 2026-01-31 08:12:04.465 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.470 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:1e:10 10.100.0.13'], port_security=['fa:16:3e:39:1e:10 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '702e2506-8d57-4ea2-b56e-1800da93f646', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '56300515-2cca-484e-a39f-36468f7be69f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.194'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=7429a420-eefe-4af6-b5a7-ad8aff346ea8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.472 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 7429a420-eefe-4af6-b5a7-ad8aff346ea8 in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 unbound from our chassis#033[00m
Jan 31 03:12:04 np0005603623 nova_compute[226235]: 2026-01-31 08:12:04.472 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.474 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1f564452-5f08-4a1c-921e-f2daee9ec936, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.476 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7a580cf0-41cf-49b2-9657-a1d99e36fd5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.477 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 namespace which is not needed anymore#033[00m
Jan 31 03:12:04 np0005603623 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000057.scope: Deactivated successfully.
Jan 31 03:12:04 np0005603623 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000057.scope: Consumed 13.181s CPU time.
Jan 31 03:12:04 np0005603623 systemd-machined[194379]: Machine qemu-37-instance-00000057 terminated.
Jan 31 03:12:04 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[262324]: [NOTICE]   (262328) : haproxy version is 2.8.14-c23fe91
Jan 31 03:12:04 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[262324]: [NOTICE]   (262328) : path to executable is /usr/sbin/haproxy
Jan 31 03:12:04 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[262324]: [WARNING]  (262328) : Exiting Master process...
Jan 31 03:12:04 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[262324]: [ALERT]    (262328) : Current worker (262330) exited with code 143 (Terminated)
Jan 31 03:12:04 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[262324]: [WARNING]  (262328) : All workers exited. Exiting... (0)
Jan 31 03:12:04 np0005603623 systemd[1]: libpod-96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b.scope: Deactivated successfully.
Jan 31 03:12:04 np0005603623 podman[263342]: 2026-01-31 08:12:04.611656256 +0000 UTC m=+0.049911234 container died 96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:12:04 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b-userdata-shm.mount: Deactivated successfully.
Jan 31 03:12:04 np0005603623 systemd[1]: var-lib-containers-storage-overlay-6276b9a1a574931bea2627437dbd13908606446ef403812318958b4b795063bb-merged.mount: Deactivated successfully.
Jan 31 03:12:04 np0005603623 podman[263342]: 2026-01-31 08:12:04.650710067 +0000 UTC m=+0.088965045 container cleanup 96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:12:04 np0005603623 systemd[1]: libpod-conmon-96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b.scope: Deactivated successfully.
Jan 31 03:12:04 np0005603623 podman[263374]: 2026-01-31 08:12:04.718220006 +0000 UTC m=+0.049474471 container remove 96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.724 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d5ab1345-7d88-4623-af1e-dc58f634195c]: (4, ('Sat Jan 31 08:12:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 (96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b)\n96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b\nSat Jan 31 08:12:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 (96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b)\n96ddce76daa5396cb284771f011f6b92278ed42b2ae208f3e52fd47a423c584b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.728 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[48093a73-c954-45bd-9db0-9afca057d439]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:04 np0005603623 nova_compute[226235]: 2026-01-31 08:12:04.728 226239 DEBUG nova.compute.manager [req-52a445f3-c184-4451-acbd-8542d06160b0 req-dc2b0edf-a317-478e-af10-2d78e77bee9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received event network-vif-unplugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:04 np0005603623 nova_compute[226235]: 2026-01-31 08:12:04.728 226239 DEBUG oslo_concurrency.lockutils [req-52a445f3-c184-4451-acbd-8542d06160b0 req-dc2b0edf-a317-478e-af10-2d78e77bee9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:04 np0005603623 nova_compute[226235]: 2026-01-31 08:12:04.728 226239 DEBUG oslo_concurrency.lockutils [req-52a445f3-c184-4451-acbd-8542d06160b0 req-dc2b0edf-a317-478e-af10-2d78e77bee9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:04 np0005603623 nova_compute[226235]: 2026-01-31 08:12:04.729 226239 DEBUG oslo_concurrency.lockutils [req-52a445f3-c184-4451-acbd-8542d06160b0 req-dc2b0edf-a317-478e-af10-2d78e77bee9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:04 np0005603623 nova_compute[226235]: 2026-01-31 08:12:04.729 226239 DEBUG nova.compute.manager [req-52a445f3-c184-4451-acbd-8542d06160b0 req-dc2b0edf-a317-478e-af10-2d78e77bee9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] No waiting events found dispatching network-vif-unplugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.729 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:04 np0005603623 nova_compute[226235]: 2026-01-31 08:12:04.729 226239 WARNING nova.compute.manager [req-52a445f3-c184-4451-acbd-8542d06160b0 req-dc2b0edf-a317-478e-af10-2d78e77bee9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received unexpected event network-vif-unplugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 31 03:12:04 np0005603623 nova_compute[226235]: 2026-01-31 08:12:04.731 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:04 np0005603623 kernel: tap1f564452-50: left promiscuous mode
Jan 31 03:12:04 np0005603623 nova_compute[226235]: 2026-01-31 08:12:04.739 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.744 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[28b4f363-8b38-4344-abb2-c470d001c5fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.757 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c193749c-9413-49d8-b939-4ee2ea1b71f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.759 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9a37aaeb-aeff-4c79-8895-92b410222d54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.774 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3d142bf4-3efb-44c5-86ac-8ea2ac13c34e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623577, 'reachable_time': 36310, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263404, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:04 np0005603623 systemd[1]: run-netns-ovnmeta\x2d1f564452\x2d5f08\x2d4a1c\x2d921e\x2df2daee9ec936.mount: Deactivated successfully.
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.777 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:12:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:04.777 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[69f0cc5c-4c41-4bb3-9fd2-ed70b0f91929]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.135 226239 INFO nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.141 226239 INFO nova.virt.libvirt.driver [-] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Instance destroyed successfully.#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.356 226239 INFO nova.compute.manager [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Detaching volume 0a2f77d6-6ebd-4e32-8f3a-3b8197764510#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.531 226239 INFO nova.virt.block_device [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Attempting to driver detach volume 0a2f77d6-6ebd-4e32-8f3a-3b8197764510 from mountpoint /dev/vdb#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.539 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Attempting to detach device vdb from instance 702e2506-8d57-4ea2-b56e-1800da93f646 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.540 226239 DEBUG nova.virt.libvirt.guest [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:12:05 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:12:05 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-0a2f77d6-6ebd-4e32-8f3a-3b8197764510">
Jan 31 03:12:05 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:12:05 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:12:05 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:12:05 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:12:05 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:12:05 np0005603623 nova_compute[226235]:  <serial>0a2f77d6-6ebd-4e32-8f3a-3b8197764510</serial>
Jan 31 03:12:05 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:12:05 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:12:05 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.553 226239 INFO nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully detached device vdb from instance 702e2506-8d57-4ea2-b56e-1800da93f646 from the persistent domain config.#033[00m
Jan 31 03:12:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:05.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.775 226239 INFO nova.virt.libvirt.driver [-] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Instance destroyed successfully.#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.777 226239 DEBUG nova.virt.libvirt.vif [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:11:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-662570082',display_name='tempest-ServerActionsTestOtherA-server-64949834',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-662570082',id=87,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIc9G9qrE9DkmH4MfDS/pJVE/TBsDIWPxmulohRcOfbn2Cn27rx2gYgt8roH3OFkAEcaX90eL1koUD1iHLea0bAao7hRDcWiOicUocX2Hu4advs3+4GguQABPQt3bJ2N+w==',key_name='tempest-keypair-569982077',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:11:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-ax5f1zxm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:12:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=702e2506-8d57-4ea2-b56e-1800da93f646,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.777 226239 DEBUG nova.network.os_vif_util [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.778 226239 DEBUG nova.network.os_vif_util [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.779 226239 DEBUG os_vif [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.781 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.782 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7429a420-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.784 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.786 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:05 np0005603623 nova_compute[226235]: 2026-01-31 08:12:05.788 226239 INFO os_vif [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee')#033[00m
Jan 31 03:12:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:05.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.340 226239 INFO nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Deleting instance files /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646_del#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.341 226239 INFO nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Deletion of /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646_del complete#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.520 226239 INFO nova.virt.block_device [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Booting with volume 0a2f77d6-6ebd-4e32-8f3a-3b8197764510 at /dev/vdb#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.691 226239 DEBUG os_brick.utils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.693 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.706 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.707 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[cc1967ef-95b1-4f69-a622-5ab5dd129b30]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.708 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.715 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.715 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[6eda5fd0-9a14-4662-a477-2b1603739085]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.717 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.723 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.724 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[ebad87f4-0d16-41d4-b563-7d6d92787f84]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.725 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[b609668e-ff03-4799-941c-cb7c98b95268]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.726 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.745 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "nvme version" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.748 226239 DEBUG os_brick.initiator.connectors.lightos [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.748 226239 DEBUG os_brick.initiator.connectors.lightos [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.749 226239 DEBUG os_brick.initiator.connectors.lightos [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.750 226239 DEBUG os_brick.utils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] <== get_connector_properties: return (57ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.750 226239 DEBUG nova.virt.block_device [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Updating existing volume attachment record: 09ad1680-f3f2-4edb-93a4-01a6ecd4a735 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.914 226239 DEBUG nova.compute.manager [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.915 226239 DEBUG oslo_concurrency.lockutils [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.916 226239 DEBUG oslo_concurrency.lockutils [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.916 226239 DEBUG oslo_concurrency.lockutils [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.917 226239 DEBUG nova.compute.manager [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] No waiting events found dispatching network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:06 np0005603623 nova_compute[226235]: 2026-01-31 08:12:06.917 226239 WARNING nova.compute.manager [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received unexpected event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 for instance with vm_state active and task_state rebuild_block_device_mapping.#033[00m
Jan 31 03:12:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:07.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:07.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.311 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.311 226239 INFO nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Creating image(s)#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.329 226239 DEBUG nova.storage.rbd_utils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.352 226239 DEBUG nova.storage.rbd_utils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.375 226239 DEBUG nova.storage.rbd_utils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.379 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.433 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.434 226239 DEBUG oslo_concurrency.lockutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "365f9823d2619ef09948bdeed685488da63755b5" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.434 226239 DEBUG oslo_concurrency.lockutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.435 226239 DEBUG oslo_concurrency.lockutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.454 226239 DEBUG nova.storage.rbd_utils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.457 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 702e2506-8d57-4ea2-b56e-1800da93f646_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.470 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.686 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 702e2506-8d57-4ea2-b56e-1800da93f646_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.228s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.757 226239 DEBUG nova.storage.rbd_utils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] resizing rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:12:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.876 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.877 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Ensure instance console log exists: /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.878 226239 DEBUG oslo_concurrency.lockutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.878 226239 DEBUG oslo_concurrency.lockutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.879 226239 DEBUG oslo_concurrency.lockutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.883 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Start _get_guest_xml network_info=[{"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': '09ad1680-f3f2-4edb-93a4-01a6ecd4a735', 'delete_on_termination': False, 'guest_format': None, 'device_type': 'disk', 'boot_index': None, 'disk_bus': 'virtio', 'mount_device': '/dev/vdb', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-0a2f77d6-6ebd-4e32-8f3a-3b8197764510', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '0a2f77d6-6ebd-4e32-8f3a-3b8197764510', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '702e2506-8d57-4ea2-b56e-1800da93f646', 'attached_at': '', 'detached_at': '', 'volume_id': '0a2f77d6-6ebd-4e32-8f3a-3b8197764510', 'serial': '0a2f77d6-6ebd-4e32-8f3a-3b8197764510'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.887 226239 WARNING nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.895 226239 DEBUG nova.virt.libvirt.host [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.896 226239 DEBUG nova.virt.libvirt.host [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.899 226239 DEBUG nova.virt.libvirt.host [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.899 226239 DEBUG nova.virt.libvirt.host [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.900 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.900 226239 DEBUG nova.virt.hardware [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.901 226239 DEBUG nova.virt.hardware [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.901 226239 DEBUG nova.virt.hardware [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.901 226239 DEBUG nova.virt.hardware [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.901 226239 DEBUG nova.virt.hardware [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.902 226239 DEBUG nova.virt.hardware [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.902 226239 DEBUG nova.virt.hardware [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.902 226239 DEBUG nova.virt.hardware [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.902 226239 DEBUG nova.virt.hardware [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.903 226239 DEBUG nova.virt.hardware [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.903 226239 DEBUG nova.virt.hardware [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.903 226239 DEBUG nova.objects.instance [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:08 np0005603623 nova_compute[226235]: 2026-01-31 08:12:08.978 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:12:09 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1718214715' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:12:09 np0005603623 nova_compute[226235]: 2026-01-31 08:12:09.498 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:09 np0005603623 nova_compute[226235]: 2026-01-31 08:12:09.533 226239 DEBUG nova.storage.rbd_utils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:09 np0005603623 nova_compute[226235]: 2026-01-31 08:12:09.538 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:09.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:09.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:12:10 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/363183896' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.022 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.069 226239 DEBUG nova.virt.libvirt.vif [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:11:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-662570082',display_name='tempest-ServerActionsTestOtherA-server-64949834',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-662570082',id=87,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIc9G9qrE9DkmH4MfDS/pJVE/TBsDIWPxmulohRcOfbn2Cn27rx2gYgt8roH3OFkAEcaX90eL1koUD1iHLea0bAao7hRDcWiOicUocX2Hu4advs3+4GguQABPQt3bJ2N+w==',key_name='tempest-keypair-569982077',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:11:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-ax5f1zxm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:12:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=702e2506-8d57-4ea2-b56e-1800da93f646,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.070 226239 DEBUG nova.network.os_vif_util [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.071 226239 DEBUG nova.network.os_vif_util [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.073 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  <uuid>702e2506-8d57-4ea2-b56e-1800da93f646</uuid>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  <name>instance-00000057</name>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsTestOtherA-server-64949834</nova:name>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:12:08</nova:creationTime>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <nova:user uuid="12a823bd7c6e4cf492ebf6c1d002a91f">tempest-ServerActionsTestOtherA-1768827668-project-member</nova:user>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <nova:project uuid="9c03fec1b3664105996aa979e226d8f8">tempest-ServerActionsTestOtherA-1768827668</nova:project>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="0864ca59-9877-4e6d-adfc-f0a3204ed8f8"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <nova:port uuid="7429a420-eefe-4af6-b5a7-ad8aff346ea8">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <entry name="serial">702e2506-8d57-4ea2-b56e-1800da93f646</entry>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <entry name="uuid">702e2506-8d57-4ea2-b56e-1800da93f646</entry>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/702e2506-8d57-4ea2-b56e-1800da93f646_disk">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/702e2506-8d57-4ea2-b56e-1800da93f646_disk.config">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-0a2f77d6-6ebd-4e32-8f3a-3b8197764510">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <target dev="vdb" bus="virtio"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <serial>0a2f77d6-6ebd-4e32-8f3a-3b8197764510</serial>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:39:1e:10"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <target dev="tap7429a420-ee"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/console.log" append="off"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:12:10 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:12:10 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:12:10 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:12:10 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.074 226239 DEBUG nova.virt.libvirt.vif [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:11:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-662570082',display_name='tempest-ServerActionsTestOtherA-server-64949834',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-662570082',id=87,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIc9G9qrE9DkmH4MfDS/pJVE/TBsDIWPxmulohRcOfbn2Cn27rx2gYgt8roH3OFkAEcaX90eL1koUD1iHLea0bAao7hRDcWiOicUocX2Hu4advs3+4GguQABPQt3bJ2N+w==',key_name='tempest-keypair-569982077',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:11:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-ax5f1zxm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:12:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=702e2506-8d57-4ea2-b56e-1800da93f646,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.074 226239 DEBUG nova.network.os_vif_util [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.075 226239 DEBUG nova.network.os_vif_util [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.075 226239 DEBUG os_vif [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.076 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.076 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.076 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.079 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.079 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7429a420-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.080 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7429a420-ee, col_values=(('external_ids', {'iface-id': '7429a420-eefe-4af6-b5a7-ad8aff346ea8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:1e:10', 'vm-uuid': '702e2506-8d57-4ea2-b56e-1800da93f646'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.081 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:10 np0005603623 NetworkManager[48970]: <info>  [1769847130.0822] manager: (tap7429a420-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.083 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.086 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.087 226239 INFO os_vif [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee')#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.133 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.133 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.133 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.134 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No VIF found with MAC fa:16:3e:39:1e:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.134 226239 INFO nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Using config drive#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.160 226239 DEBUG nova.storage.rbd_utils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.198 226239 DEBUG nova.objects.instance [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.226 226239 DEBUG nova.objects.instance [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'keypairs' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.806 226239 INFO nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Creating config drive at /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/disk.config#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.810 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjf000119 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.930 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjf000119" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.957 226239 DEBUG nova.storage.rbd_utils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 702e2506-8d57-4ea2-b56e-1800da93f646_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:10 np0005603623 nova_compute[226235]: 2026-01-31 08:12:10.960 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/disk.config 702e2506-8d57-4ea2-b56e-1800da93f646_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.106 226239 DEBUG oslo_concurrency.processutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/disk.config 702e2506-8d57-4ea2-b56e-1800da93f646_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.107 226239 INFO nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Deleting local config drive /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646/disk.config because it was imported into RBD.#033[00m
Jan 31 03:12:11 np0005603623 kernel: tap7429a420-ee: entered promiscuous mode
Jan 31 03:12:11 np0005603623 NetworkManager[48970]: <info>  [1769847131.1504] manager: (tap7429a420-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/171)
Jan 31 03:12:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:11Z|00358|binding|INFO|Claiming lport 7429a420-eefe-4af6-b5a7-ad8aff346ea8 for this chassis.
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.151 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:11Z|00359|binding|INFO|7429a420-eefe-4af6-b5a7-ad8aff346ea8: Claiming fa:16:3e:39:1e:10 10.100.0.13
Jan 31 03:12:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:11Z|00360|binding|INFO|Setting lport 7429a420-eefe-4af6-b5a7-ad8aff346ea8 ovn-installed in OVS
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.162 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:11Z|00361|binding|INFO|Setting lport 7429a420-eefe-4af6-b5a7-ad8aff346ea8 up in Southbound
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.166 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:1e:10 10.100.0.13'], port_security=['fa:16:3e:39:1e:10 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '702e2506-8d57-4ea2-b56e-1800da93f646', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '56300515-2cca-484e-a39f-36468f7be69f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.194'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=7429a420-eefe-4af6-b5a7-ad8aff346ea8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.168 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 7429a420-eefe-4af6-b5a7-ad8aff346ea8 in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 bound to our chassis#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.169 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f564452-5f08-4a1c-921e-f2daee9ec936#033[00m
Jan 31 03:12:11 np0005603623 systemd-udevd[263734]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.176 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[08f93836-8c9c-47dc-94c5-aec587f17f40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.178 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1f564452-51 in ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:12:11 np0005603623 systemd-machined[194379]: New machine qemu-39-instance-00000057.
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.179 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1f564452-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.179 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c63285dc-d494-4064-b330-8175c36094ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.180 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[124b3ad7-5558-49f9-9766-9c88d7258bb1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 NetworkManager[48970]: <info>  [1769847131.1839] device (tap7429a420-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:12:11 np0005603623 NetworkManager[48970]: <info>  [1769847131.1843] device (tap7429a420-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.189 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[4daf15a0-e043-4ae9-9615-3522a7616545]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 systemd[1]: Started Virtual Machine qemu-39-instance-00000057.
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.196 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[16dd6fa8-4265-457f-a7d5-47b27939b9ed]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.227 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c283c4c8-96c7-4317-b396-d7953a0a531d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.232 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2a6acaae-3113-4199-8b3e-99e3b90dc607]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 NetworkManager[48970]: <info>  [1769847131.2334] manager: (tap1f564452-50): new Veth device (/org/freedesktop/NetworkManager/Devices/172)
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.257 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[988089b5-c9b5-440a-baa0-d9c53b61e78f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.261 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[69e2020a-d2f4-4fef-b5b7-e48d53a1bdad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 NetworkManager[48970]: <info>  [1769847131.2812] device (tap1f564452-50): carrier: link connected
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.287 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e6491ed0-9c14-4862-9055-afc5551462fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.304 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[872aca78-5173-49c9-8c46-00b02354cff9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627171, 'reachable_time': 19698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263768, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.319 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a9e1af-c3d6-41df-84c1-f3e31c331d2b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:23e8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627171, 'tstamp': 627171}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 263769, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.332 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[09337412-7dfc-46b3-b552-385b99ce58c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627171, 'reachable_time': 19698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 263770, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.357 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c9b43736-5750-4e07-876d-351f20896b55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.412 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dc04960e-b2bb-4419-8173-8155bbffbc3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.413 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.414 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.414 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f564452-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.416 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:11 np0005603623 NetworkManager[48970]: <info>  [1769847131.4175] manager: (tap1f564452-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Jan 31 03:12:11 np0005603623 kernel: tap1f564452-50: entered promiscuous mode
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.422 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.422 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f564452-50, col_values=(('external_ids', {'iface-id': '5bb8c1b5-edce-4f6a-8164-58b7d89a3330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:11Z|00362|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.438 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.439 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.440 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b975a1c5-587e-4677-a458-be32d2c8e8bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.441 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-1f564452-5f08-4a1c-921e-f2daee9ec936
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 1f564452-5f08-4a1c-921e-f2daee9ec936
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:12:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:11.441 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'env', 'PROCESS_TAG=haproxy-1f564452-5f08-4a1c-921e-f2daee9ec936', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1f564452-5f08-4a1c-921e-f2daee9ec936.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:12:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:11.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:12:11 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1362854782' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:12:11 np0005603623 podman[263802]: 2026-01-31 08:12:11.745999352 +0000 UTC m=+0.018740512 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:12:11 np0005603623 podman[263802]: 2026-01-31 08:12:11.875994159 +0000 UTC m=+0.148735299 container create 5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.876 226239 DEBUG nova.compute.manager [req-36411cde-91d2-432c-9b8a-e9a7c520c731 req-bece546a-4365-4f2f-94fb-19f3df104d6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.877 226239 DEBUG oslo_concurrency.lockutils [req-36411cde-91d2-432c-9b8a-e9a7c520c731 req-bece546a-4365-4f2f-94fb-19f3df104d6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.878 226239 DEBUG oslo_concurrency.lockutils [req-36411cde-91d2-432c-9b8a-e9a7c520c731 req-bece546a-4365-4f2f-94fb-19f3df104d6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.878 226239 DEBUG oslo_concurrency.lockutils [req-36411cde-91d2-432c-9b8a-e9a7c520c731 req-bece546a-4365-4f2f-94fb-19f3df104d6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.879 226239 DEBUG nova.compute.manager [req-36411cde-91d2-432c-9b8a-e9a7c520c731 req-bece546a-4365-4f2f-94fb-19f3df104d6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] No waiting events found dispatching network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:11 np0005603623 nova_compute[226235]: 2026-01-31 08:12:11.879 226239 WARNING nova.compute.manager [req-36411cde-91d2-432c-9b8a-e9a7c520c731 req-bece546a-4365-4f2f-94fb-19f3df104d6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received unexpected event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 31 03:12:11 np0005603623 systemd[1]: Started libpod-conmon-5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e.scope.
Jan 31 03:12:11 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:12:11 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/360b01f8f97c3ea667d6154ad79aad2cd0ab0fe88be2022b108ec71948f90d61/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:12:11 np0005603623 podman[263802]: 2026-01-31 08:12:11.953198853 +0000 UTC m=+0.225940013 container init 5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 03:12:11 np0005603623 podman[263802]: 2026-01-31 08:12:11.956987582 +0000 UTC m=+0.229728712 container start 5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 03:12:11 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[263818]: [NOTICE]   (263822) : New worker (263824) forked
Jan 31 03:12:11 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[263818]: [NOTICE]   (263822) : Loading success.
Jan 31 03:12:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:11.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.443 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Removed pending event for 702e2506-8d57-4ea2-b56e-1800da93f646 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.444 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847132.4426289, 702e2506-8d57-4ea2-b56e-1800da93f646 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.444 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.446 226239 DEBUG nova.compute.manager [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.447 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.450 226239 INFO nova.virt.libvirt.driver [-] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Instance spawned successfully.#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.450 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.490 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.494 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.495 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.495 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.496 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.496 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.496 226239 DEBUG nova.virt.libvirt.driver [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.500 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.538 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.539 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847132.4435139, 702e2506-8d57-4ea2-b56e-1800da93f646 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.539 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] VM Started (Lifecycle Event)#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.562 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.566 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.569 226239 DEBUG nova.compute.manager [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.586 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.633 226239 DEBUG oslo_concurrency.lockutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.634 226239 DEBUG oslo_concurrency.lockutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:12 np0005603623 nova_compute[226235]: 2026-01-31 08:12:12.634 226239 DEBUG nova.objects.instance [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 03:12:13 np0005603623 nova_compute[226235]: 2026-01-31 08:12:13.222 226239 DEBUG oslo_concurrency.lockutils [None req-6a7a8822-3a59-4ff0-bcdf-17e04ab5b97d 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:13 np0005603623 nova_compute[226235]: 2026-01-31 08:12:13.455 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:13.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:12:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:13.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:12:15 np0005603623 nova_compute[226235]: 2026-01-31 08:12:15.046 226239 DEBUG nova.compute.manager [req-9bf6901b-1393-43de-9bdd-0661b4b63404 req-397e2b92-50e6-4936-a777-850e3052238c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:15 np0005603623 nova_compute[226235]: 2026-01-31 08:12:15.047 226239 DEBUG oslo_concurrency.lockutils [req-9bf6901b-1393-43de-9bdd-0661b4b63404 req-397e2b92-50e6-4936-a777-850e3052238c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:15 np0005603623 nova_compute[226235]: 2026-01-31 08:12:15.048 226239 DEBUG oslo_concurrency.lockutils [req-9bf6901b-1393-43de-9bdd-0661b4b63404 req-397e2b92-50e6-4936-a777-850e3052238c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:15 np0005603623 nova_compute[226235]: 2026-01-31 08:12:15.048 226239 DEBUG oslo_concurrency.lockutils [req-9bf6901b-1393-43de-9bdd-0661b4b63404 req-397e2b92-50e6-4936-a777-850e3052238c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:15 np0005603623 nova_compute[226235]: 2026-01-31 08:12:15.049 226239 DEBUG nova.compute.manager [req-9bf6901b-1393-43de-9bdd-0661b4b63404 req-397e2b92-50e6-4936-a777-850e3052238c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] No waiting events found dispatching network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:15 np0005603623 nova_compute[226235]: 2026-01-31 08:12:15.049 226239 WARNING nova.compute.manager [req-9bf6901b-1393-43de-9bdd-0661b4b63404 req-397e2b92-50e6-4936-a777-850e3052238c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received unexpected event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:12:15 np0005603623 nova_compute[226235]: 2026-01-31 08:12:15.082 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:12:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:15.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:12:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:15.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:16 np0005603623 nova_compute[226235]: 2026-01-31 08:12:16.748 226239 DEBUG oslo_concurrency.lockutils [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:16 np0005603623 nova_compute[226235]: 2026-01-31 08:12:16.751 226239 DEBUG oslo_concurrency.lockutils [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:16 np0005603623 nova_compute[226235]: 2026-01-31 08:12:16.782 226239 INFO nova.compute.manager [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Detaching volume 0a2f77d6-6ebd-4e32-8f3a-3b8197764510#033[00m
Jan 31 03:12:16 np0005603623 nova_compute[226235]: 2026-01-31 08:12:16.964 226239 INFO nova.virt.block_device [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Attempting to driver detach volume 0a2f77d6-6ebd-4e32-8f3a-3b8197764510 from mountpoint /dev/vdb#033[00m
Jan 31 03:12:16 np0005603623 nova_compute[226235]: 2026-01-31 08:12:16.973 226239 DEBUG nova.virt.libvirt.driver [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Attempting to detach device vdb from instance 702e2506-8d57-4ea2-b56e-1800da93f646 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:12:16 np0005603623 nova_compute[226235]: 2026-01-31 08:12:16.974 226239 DEBUG nova.virt.libvirt.guest [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:12:16 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-0a2f77d6-6ebd-4e32-8f3a-3b8197764510">
Jan 31 03:12:16 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:  <serial>0a2f77d6-6ebd-4e32-8f3a-3b8197764510</serial>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 31 03:12:16 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:12:16 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:12:16 np0005603623 nova_compute[226235]: 2026-01-31 08:12:16.983 226239 INFO nova.virt.libvirt.driver [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully detached device vdb from instance 702e2506-8d57-4ea2-b56e-1800da93f646 from the persistent domain config.#033[00m
Jan 31 03:12:16 np0005603623 nova_compute[226235]: 2026-01-31 08:12:16.984 226239 DEBUG nova.virt.libvirt.driver [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 702e2506-8d57-4ea2-b56e-1800da93f646 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:12:16 np0005603623 nova_compute[226235]: 2026-01-31 08:12:16.985 226239 DEBUG nova.virt.libvirt.guest [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:12:16 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-0a2f77d6-6ebd-4e32-8f3a-3b8197764510">
Jan 31 03:12:16 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:  <serial>0a2f77d6-6ebd-4e32-8f3a-3b8197764510</serial>
Jan 31 03:12:16 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 31 03:12:16 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:12:16 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:12:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:12:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:17.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:12:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:17.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:18 np0005603623 nova_compute[226235]: 2026-01-31 08:12:18.458 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:19 np0005603623 nova_compute[226235]: 2026-01-31 08:12:19.441 226239 DEBUG nova.virt.libvirt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Received event <DeviceRemovedEvent: 1769847139.4413345, 702e2506-8d57-4ea2-b56e-1800da93f646 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:12:19 np0005603623 nova_compute[226235]: 2026-01-31 08:12:19.442 226239 DEBUG nova.virt.libvirt.driver [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 702e2506-8d57-4ea2-b56e-1800da93f646 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:12:19 np0005603623 nova_compute[226235]: 2026-01-31 08:12:19.444 226239 INFO nova.virt.libvirt.driver [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully detached device vdb from instance 702e2506-8d57-4ea2-b56e-1800da93f646 from the live domain config.#033[00m
Jan 31 03:12:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:19.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:19 np0005603623 nova_compute[226235]: 2026-01-31 08:12:19.677 226239 DEBUG nova.objects.instance [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'flavor' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:19 np0005603623 nova_compute[226235]: 2026-01-31 08:12:19.795 226239 DEBUG oslo_concurrency.lockutils [None req-5cd286f0-17dc-4a52-a134-39ec75123da7 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 3.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:12:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:19.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.085 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.522 226239 DEBUG oslo_concurrency.lockutils [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.523 226239 DEBUG oslo_concurrency.lockutils [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.523 226239 DEBUG oslo_concurrency.lockutils [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.523 226239 DEBUG oslo_concurrency.lockutils [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.523 226239 DEBUG oslo_concurrency.lockutils [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.524 226239 INFO nova.compute.manager [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Terminating instance#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.525 226239 DEBUG nova.compute.manager [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:12:20 np0005603623 kernel: tap7429a420-ee (unregistering): left promiscuous mode
Jan 31 03:12:20 np0005603623 NetworkManager[48970]: <info>  [1769847140.5616] device (tap7429a420-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.561 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:20Z|00363|binding|INFO|Releasing lport 7429a420-eefe-4af6-b5a7-ad8aff346ea8 from this chassis (sb_readonly=0)
Jan 31 03:12:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:20Z|00364|binding|INFO|Setting lport 7429a420-eefe-4af6-b5a7-ad8aff346ea8 down in Southbound
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.567 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:20Z|00365|binding|INFO|Removing iface tap7429a420-ee ovn-installed in OVS
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.569 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.574 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.575 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:1e:10 10.100.0.13'], port_security=['fa:16:3e:39:1e:10 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '702e2506-8d57-4ea2-b56e-1800da93f646', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '56300515-2cca-484e-a39f-36468f7be69f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.194', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=7429a420-eefe-4af6-b5a7-ad8aff346ea8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.577 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 7429a420-eefe-4af6-b5a7-ad8aff346ea8 in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 unbound from our chassis#033[00m
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.578 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1f564452-5f08-4a1c-921e-f2daee9ec936, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.579 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cbebf494-03e9-4fb0-b09e-562f682bf905]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.580 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 namespace which is not needed anymore#033[00m
Jan 31 03:12:20 np0005603623 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000057.scope: Deactivated successfully.
Jan 31 03:12:20 np0005603623 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000057.scope: Consumed 9.324s CPU time.
Jan 31 03:12:20 np0005603623 systemd-machined[194379]: Machine qemu-39-instance-00000057 terminated.
Jan 31 03:12:20 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[263818]: [NOTICE]   (263822) : haproxy version is 2.8.14-c23fe91
Jan 31 03:12:20 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[263818]: [NOTICE]   (263822) : path to executable is /usr/sbin/haproxy
Jan 31 03:12:20 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[263818]: [WARNING]  (263822) : Exiting Master process...
Jan 31 03:12:20 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[263818]: [ALERT]    (263822) : Current worker (263824) exited with code 143 (Terminated)
Jan 31 03:12:20 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[263818]: [WARNING]  (263822) : All workers exited. Exiting... (0)
Jan 31 03:12:20 np0005603623 systemd[1]: libpod-5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e.scope: Deactivated successfully.
Jan 31 03:12:20 np0005603623 podman[263922]: 2026-01-31 08:12:20.685090419 +0000 UTC m=+0.038672189 container died 5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:12:20 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e-userdata-shm.mount: Deactivated successfully.
Jan 31 03:12:20 np0005603623 systemd[1]: var-lib-containers-storage-overlay-360b01f8f97c3ea667d6154ad79aad2cd0ab0fe88be2022b108ec71948f90d61-merged.mount: Deactivated successfully.
Jan 31 03:12:20 np0005603623 podman[263922]: 2026-01-31 08:12:20.730605755 +0000 UTC m=+0.084187525 container cleanup 5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:12:20 np0005603623 systemd[1]: libpod-conmon-5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e.scope: Deactivated successfully.
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.742 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.744 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.754 226239 INFO nova.virt.libvirt.driver [-] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Instance destroyed successfully.#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.754 226239 DEBUG nova.objects.instance [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'resources' on Instance uuid 702e2506-8d57-4ea2-b56e-1800da93f646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.771 226239 DEBUG nova.virt.libvirt.vif [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:11:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-662570082',display_name='tempest-ServerActionsTestOtherA-server-64949834',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-662570082',id=87,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIc9G9qrE9DkmH4MfDS/pJVE/TBsDIWPxmulohRcOfbn2Cn27rx2gYgt8roH3OFkAEcaX90eL1koUD1iHLea0bAao7hRDcWiOicUocX2Hu4advs3+4GguQABPQt3bJ2N+w==',key_name='tempest-keypair-569982077',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:12:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-ax5f1zxm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:12:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=702e2506-8d57-4ea2-b56e-1800da93f646,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.772 226239 DEBUG nova.network.os_vif_util [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "address": "fa:16:3e:39:1e:10", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7429a420-ee", "ovs_interfaceid": "7429a420-eefe-4af6-b5a7-ad8aff346ea8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.772 226239 DEBUG nova.network.os_vif_util [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.773 226239 DEBUG os_vif [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.774 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.774 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7429a420-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.776 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.778 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.781 226239 INFO os_vif [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=7429a420-eefe-4af6-b5a7-ad8aff346ea8,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7429a420-ee')#033[00m
Jan 31 03:12:20 np0005603623 podman[263953]: 2026-01-31 08:12:20.794261751 +0000 UTC m=+0.046302141 container remove 5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.798 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[98dccb43-40a9-4d14-bea1-6f555c7efe9e]: (4, ('Sat Jan 31 08:12:20 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 (5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e)\n5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e\nSat Jan 31 08:12:20 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 (5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e)\n5a6059534341722a0354e0dcc0e38827bbe267afad076df283d9a639c213aa5e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.799 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6f74f0b7-de0a-4614-a3cc-037d443bf32e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.800 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.801 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:20 np0005603623 kernel: tap1f564452-50: left promiscuous mode
Jan 31 03:12:20 np0005603623 nova_compute[226235]: 2026-01-31 08:12:20.807 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.811 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[600f32c2-2707-45cd-a500-a819405d1e9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.830 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0db81409-1610-47da-b3e7-9db67ff22d4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.832 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6a7bdf-ae89-437d-ad42-37043b5f8b38]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.843 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a7dd3f-6459-41f0-911c-10c5dc4f6f60]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627165, 'reachable_time': 20156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263994, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.846 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:12:20 np0005603623 systemd[1]: run-netns-ovnmeta\x2d1f564452\x2d5f08\x2d4a1c\x2d921e\x2df2daee9ec936.mount: Deactivated successfully.
Jan 31 03:12:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:20.847 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d10cc2-3029-4216-aa35-021bd5ee496f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:21 np0005603623 nova_compute[226235]: 2026-01-31 08:12:21.198 226239 INFO nova.virt.libvirt.driver [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Deleting instance files /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646_del#033[00m
Jan 31 03:12:21 np0005603623 nova_compute[226235]: 2026-01-31 08:12:21.198 226239 INFO nova.virt.libvirt.driver [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Deletion of /var/lib/nova/instances/702e2506-8d57-4ea2-b56e-1800da93f646_del complete#033[00m
Jan 31 03:12:21 np0005603623 nova_compute[226235]: 2026-01-31 08:12:21.255 226239 INFO nova.compute.manager [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Took 0.73 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:12:21 np0005603623 nova_compute[226235]: 2026-01-31 08:12:21.256 226239 DEBUG oslo.service.loopingcall [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:12:21 np0005603623 nova_compute[226235]: 2026-01-31 08:12:21.256 226239 DEBUG nova.compute.manager [-] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:12:21 np0005603623 nova_compute[226235]: 2026-01-31 08:12:21.257 226239 DEBUG nova.network.neutron [-] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:12:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:21.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:21.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.434 226239 DEBUG nova.network.neutron [-] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.451 226239 INFO nova.compute.manager [-] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Took 2.19 seconds to deallocate network for instance.
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.503 226239 DEBUG oslo_concurrency.lockutils [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.503 226239 DEBUG oslo_concurrency.lockutils [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.510 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.599 226239 DEBUG oslo_concurrency.processutils [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:12:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:23.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.694 226239 DEBUG nova.compute.manager [req-128d72c3-13a2-4be8-83d8-0b2a705c7a5a req-11b2b071-07f5-4250-ba8e-8331eca54b0c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received event network-vif-unplugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.695 226239 DEBUG oslo_concurrency.lockutils [req-128d72c3-13a2-4be8-83d8-0b2a705c7a5a req-11b2b071-07f5-4250-ba8e-8331eca54b0c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.695 226239 DEBUG oslo_concurrency.lockutils [req-128d72c3-13a2-4be8-83d8-0b2a705c7a5a req-11b2b071-07f5-4250-ba8e-8331eca54b0c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.695 226239 DEBUG oslo_concurrency.lockutils [req-128d72c3-13a2-4be8-83d8-0b2a705c7a5a req-11b2b071-07f5-4250-ba8e-8331eca54b0c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.696 226239 DEBUG nova.compute.manager [req-128d72c3-13a2-4be8-83d8-0b2a705c7a5a req-11b2b071-07f5-4250-ba8e-8331eca54b0c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] No waiting events found dispatching network-vif-unplugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.696 226239 WARNING nova.compute.manager [req-128d72c3-13a2-4be8-83d8-0b2a705c7a5a req-11b2b071-07f5-4250-ba8e-8331eca54b0c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received unexpected event network-vif-unplugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 for instance with vm_state deleted and task_state None.
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.696 226239 DEBUG nova.compute.manager [req-128d72c3-13a2-4be8-83d8-0b2a705c7a5a req-11b2b071-07f5-4250-ba8e-8331eca54b0c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.696 226239 DEBUG oslo_concurrency.lockutils [req-128d72c3-13a2-4be8-83d8-0b2a705c7a5a req-11b2b071-07f5-4250-ba8e-8331eca54b0c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.697 226239 DEBUG oslo_concurrency.lockutils [req-128d72c3-13a2-4be8-83d8-0b2a705c7a5a req-11b2b071-07f5-4250-ba8e-8331eca54b0c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.697 226239 DEBUG oslo_concurrency.lockutils [req-128d72c3-13a2-4be8-83d8-0b2a705c7a5a req-11b2b071-07f5-4250-ba8e-8331eca54b0c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.697 226239 DEBUG nova.compute.manager [req-128d72c3-13a2-4be8-83d8-0b2a705c7a5a req-11b2b071-07f5-4250-ba8e-8331eca54b0c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] No waiting events found dispatching network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:12:23 np0005603623 nova_compute[226235]: 2026-01-31 08:12:23.697 226239 WARNING nova.compute.manager [req-128d72c3-13a2-4be8-83d8-0b2a705c7a5a req-11b2b071-07f5-4250-ba8e-8331eca54b0c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received unexpected event network-vif-plugged-7429a420-eefe-4af6-b5a7-ad8aff346ea8 for instance with vm_state deleted and task_state None.
Jan 31 03:12:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:23.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:12:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/90614978' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:12:24 np0005603623 nova_compute[226235]: 2026-01-31 08:12:24.062 226239 DEBUG oslo_concurrency.processutils [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:12:24 np0005603623 nova_compute[226235]: 2026-01-31 08:12:24.071 226239 DEBUG nova.compute.provider_tree [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:12:24 np0005603623 nova_compute[226235]: 2026-01-31 08:12:24.093 226239 DEBUG nova.scheduler.client.report [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:12:24 np0005603623 nova_compute[226235]: 2026-01-31 08:12:24.125 226239 DEBUG oslo_concurrency.lockutils [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:12:24 np0005603623 nova_compute[226235]: 2026-01-31 08:12:24.151 226239 INFO nova.scheduler.client.report [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Deleted allocations for instance 702e2506-8d57-4ea2-b56e-1800da93f646
Jan 31 03:12:24 np0005603623 nova_compute[226235]: 2026-01-31 08:12:24.229 226239 DEBUG oslo_concurrency.lockutils [None req-944a657c-3949-4755-8ab6-8d7e8744b361 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "702e2506-8d57-4ea2-b56e-1800da93f646" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:12:24 np0005603623 podman[264070]: 2026-01-31 08:12:24.98279385 +0000 UTC m=+0.064409031 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 31 03:12:25 np0005603623 podman[264071]: 2026-01-31 08:12:25.004867076 +0000 UTC m=+0.087478519 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Jan 31 03:12:25 np0005603623 nova_compute[226235]: 2026-01-31 08:12:25.319 226239 DEBUG nova.compute.manager [req-70f29fec-0a87-415d-a12d-b096443c2f5d req-e38f8142-1b3b-4a45-b4b2-6b26aef08043 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Received event network-vif-deleted-7429a420-eefe-4af6-b5a7-ad8aff346ea8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:12:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:12:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 3000.0 total, 600.0 interval
Cumulative writes: 8414 writes, 42K keys, 8414 commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.03 MB/s
Cumulative WAL: 8414 writes, 8414 syncs, 1.00 writes per sync, written: 0.08 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1769 writes, 8387 keys, 1769 commit groups, 1.0 writes per commit group, ingest: 17.00 MB, 0.03 MB/s
Interval WAL: 1769 writes, 1769 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     38.8      1.33              0.13        23    0.058       0      0       0.0       0.0
  L6      1/0   11.45 MB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   3.9     62.1     51.8      3.88              0.44        22    0.176    121K    12K       0.0       0.0
 Sum      1/0   11.45 MB   0.0      0.2     0.1      0.2       0.2      0.1       0.0   4.9     46.3     48.5      5.21              0.57        45    0.116    121K    12K       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   6.1     48.7     49.7      1.21              0.12        10    0.121     34K   2597       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   0.0     62.1     51.8      3.88              0.44        22    0.176    121K    12K       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     38.8      1.33              0.13        22    0.060       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 3000.0 total, 600.0 interval
Flush(GB): cumulative 0.050, interval 0.010
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.25 GB write, 0.08 MB/s write, 0.24 GB read, 0.08 MB/s read, 5.2 seconds
Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.10 MB/s read, 1.2 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 28.13 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.000405 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(1622,27.19 MB,8.94547%) FilterBlock(45,342.61 KB,0.110059%) IndexBlock(45,620.14 KB,0.199213%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Jan 31 03:12:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:25.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:25 np0005603623 nova_compute[226235]: 2026-01-31 08:12:25.777 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:12:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:25.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:27.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:12:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:27.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:12:28 np0005603623 nova_compute[226235]: 2026-01-31 08:12:28.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:12:28 np0005603623 nova_compute[226235]: 2026-01-31 08:12:28.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 03:12:28 np0005603623 nova_compute[226235]: 2026-01-31 08:12:28.512 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.733269) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847148733303, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 865, "num_deletes": 250, "total_data_size": 1464368, "memory_usage": 1480536, "flush_reason": "Manual Compaction"}
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847148739936, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 643124, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41936, "largest_seqno": 42796, "table_properties": {"data_size": 639804, "index_size": 1100, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9368, "raw_average_key_size": 20, "raw_value_size": 632583, "raw_average_value_size": 1405, "num_data_blocks": 48, "num_entries": 450, "num_filter_entries": 450, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847094, "oldest_key_time": 1769847094, "file_creation_time": 1769847148, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 6729 microseconds, and 2185 cpu microseconds.
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.739991) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 643124 bytes OK
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.740014) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.743948) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.743972) EVENT_LOG_v1 {"time_micros": 1769847148743965, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.743993) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 1459947, prev total WAL file size 1459947, number of live WAL files 2.
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.744626) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323532' seq:72057594037927935, type:22 .. '6D6772737461740031353033' seq:0, type:0; will stop at (end)
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(628KB)], [78(11MB)]
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847148744705, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 12646981, "oldest_snapshot_seqno": -1}
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6740 keys, 9122964 bytes, temperature: kUnknown
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847148837115, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 9122964, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9079742, "index_size": 25254, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16901, "raw_key_size": 173043, "raw_average_key_size": 25, "raw_value_size": 8961140, "raw_average_value_size": 1329, "num_data_blocks": 999, "num_entries": 6740, "num_filter_entries": 6740, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769847148, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.837356) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 9122964 bytes
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.839231) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 136.8 rd, 98.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 11.4 +0.0 blob) out(8.7 +0.0 blob), read-write-amplify(33.9) write-amplify(14.2) OK, records in: 7232, records dropped: 492 output_compression: NoCompression
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.839252) EVENT_LOG_v1 {"time_micros": 1769847148839242, "job": 48, "event": "compaction_finished", "compaction_time_micros": 92478, "compaction_time_cpu_micros": 24992, "output_level": 6, "num_output_files": 1, "total_output_size": 9122964, "num_input_records": 7232, "num_output_records": 6740, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847148839531, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847148840985, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.744469) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.841128) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.841134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.841137) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.841140) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:12:28 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:12:28.841143) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:12:29 np0005603623 nova_compute[226235]: 2026-01-31 08:12:29.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:29 np0005603623 nova_compute[226235]: 2026-01-31 08:12:29.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:12:29 np0005603623 nova_compute[226235]: 2026-01-31 08:12:29.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:12:29 np0005603623 nova_compute[226235]: 2026-01-31 08:12:29.168 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:12:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:29.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:29.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:30.105 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:30.105 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:30.105 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:30 np0005603623 nova_compute[226235]: 2026-01-31 08:12:30.779 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:31 np0005603623 nova_compute[226235]: 2026-01-31 08:12:31.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:12:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:31.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:12:31 np0005603623 nova_compute[226235]: 2026-01-31 08:12:31.808 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:31 np0005603623 nova_compute[226235]: 2026-01-31 08:12:31.895 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:31.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.013 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "0c37b9a9-3924-451d-bf70-c38147e26756" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.013 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.047 226239 DEBUG nova.compute.manager [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.140 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.142 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.150 226239 DEBUG nova.virt.hardware [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.151 226239 INFO nova.compute.claims [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.185 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.311 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.514 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:33.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:12:33 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/441824671' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.774 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.782 226239 DEBUG nova.compute.provider_tree [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:12:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.818 226239 DEBUG nova.scheduler.client.report [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.867 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.725s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.868 226239 DEBUG nova.compute.manager [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.872 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.873 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.873 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:12:33 np0005603623 nova_compute[226235]: 2026-01-31 08:12:33.873 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:34.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.030 226239 DEBUG nova.compute.manager [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.031 226239 DEBUG nova.network.neutron [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.274 226239 INFO nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:12:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:12:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/904497313' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.303 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.314 226239 DEBUG nova.compute.manager [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.446 226239 DEBUG nova.compute.manager [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.448 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.448 226239 INFO nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Creating image(s)#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.472 226239 DEBUG nova.storage.rbd_utils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 0c37b9a9-3924-451d-bf70-c38147e26756_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.502 226239 DEBUG nova.storage.rbd_utils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 0c37b9a9-3924-451d-bf70-c38147e26756_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.532 226239 DEBUG nova.storage.rbd_utils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 0c37b9a9-3924-451d-bf70-c38147e26756_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.536 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.597 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.599 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4575MB free_disk=20.921905517578125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.599 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.600 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.618 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.619 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.620 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.620 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.644 226239 DEBUG nova.storage.rbd_utils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 0c37b9a9-3924-451d-bf70-c38147e26756_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.648 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 0c37b9a9-3924-451d-bf70-c38147e26756_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.767 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0c37b9a9-3924-451d-bf70-c38147e26756 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.768 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.768 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.860 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:34 np0005603623 nova_compute[226235]: 2026-01-31 08:12:34.988 226239 DEBUG nova.policy [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '12a823bd7c6e4cf492ebf6c1d002a91f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c03fec1b3664105996aa979e226d8f8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:12:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:12:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2405206136' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:12:35 np0005603623 nova_compute[226235]: 2026-01-31 08:12:35.303 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:35 np0005603623 nova_compute[226235]: 2026-01-31 08:12:35.306 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:12:35 np0005603623 nova_compute[226235]: 2026-01-31 08:12:35.325 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:12:35 np0005603623 nova_compute[226235]: 2026-01-31 08:12:35.378 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:12:35 np0005603623 nova_compute[226235]: 2026-01-31 08:12:35.379 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.779s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:35.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:35 np0005603623 nova_compute[226235]: 2026-01-31 08:12:35.753 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847140.7516832, 702e2506-8d57-4ea2-b56e-1800da93f646 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:35 np0005603623 nova_compute[226235]: 2026-01-31 08:12:35.754 226239 INFO nova.compute.manager [-] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:12:35 np0005603623 nova_compute[226235]: 2026-01-31 08:12:35.780 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:35 np0005603623 nova_compute[226235]: 2026-01-31 08:12:35.910 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 0c37b9a9-3924-451d-bf70-c38147e26756_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.262s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:35 np0005603623 nova_compute[226235]: 2026-01-31 08:12:35.975 226239 DEBUG nova.storage.rbd_utils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] resizing rbd image 0c37b9a9-3924-451d-bf70-c38147e26756_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:12:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:36.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:36 np0005603623 nova_compute[226235]: 2026-01-31 08:12:36.380 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:36 np0005603623 nova_compute[226235]: 2026-01-31 08:12:36.386 226239 DEBUG nova.compute.manager [None req-41b47082-68af-4406-98c5-f1bd79d9c1a8 - - - - - -] [instance: 702e2506-8d57-4ea2-b56e-1800da93f646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:36 np0005603623 nova_compute[226235]: 2026-01-31 08:12:36.403 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:36 np0005603623 nova_compute[226235]: 2026-01-31 08:12:36.403 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:36 np0005603623 nova_compute[226235]: 2026-01-31 08:12:36.702 226239 DEBUG nova.objects.instance [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'migration_context' on Instance uuid 0c37b9a9-3924-451d-bf70-c38147e26756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:36 np0005603623 nova_compute[226235]: 2026-01-31 08:12:36.733 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:12:36 np0005603623 nova_compute[226235]: 2026-01-31 08:12:36.734 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Ensure instance console log exists: /var/lib/nova/instances/0c37b9a9-3924-451d-bf70-c38147e26756/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:12:36 np0005603623 nova_compute[226235]: 2026-01-31 08:12:36.734 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:36 np0005603623 nova_compute[226235]: 2026-01-31 08:12:36.735 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:36 np0005603623 nova_compute[226235]: 2026-01-31 08:12:36.735 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:37 np0005603623 nova_compute[226235]: 2026-01-31 08:12:37.103 226239 DEBUG nova.network.neutron [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Successfully created port: abdf877d-771f-4148-a98f-c7e8319f044c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:12:37 np0005603623 nova_compute[226235]: 2026-01-31 08:12:37.172 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:37.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:38.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:38 np0005603623 nova_compute[226235]: 2026-01-31 08:12:38.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:38 np0005603623 nova_compute[226235]: 2026-01-31 08:12:38.516 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:38 np0005603623 nova_compute[226235]: 2026-01-31 08:12:38.661 226239 DEBUG nova.network.neutron [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Successfully updated port: abdf877d-771f-4148-a98f-c7e8319f044c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:12:38 np0005603623 nova_compute[226235]: 2026-01-31 08:12:38.679 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:12:38 np0005603623 nova_compute[226235]: 2026-01-31 08:12:38.679 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquired lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:12:38 np0005603623 nova_compute[226235]: 2026-01-31 08:12:38.679 226239 DEBUG nova.network.neutron [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:12:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:39 np0005603623 nova_compute[226235]: 2026-01-31 08:12:39.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:39 np0005603623 nova_compute[226235]: 2026-01-31 08:12:39.301 226239 DEBUG nova.compute.manager [req-b887a3bc-8541-4666-a15c-b2e98648cf79 req-eb1e8517-a1aa-463a-a4d8-035664abdea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Received event network-changed-abdf877d-771f-4148-a98f-c7e8319f044c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:39 np0005603623 nova_compute[226235]: 2026-01-31 08:12:39.302 226239 DEBUG nova.compute.manager [req-b887a3bc-8541-4666-a15c-b2e98648cf79 req-eb1e8517-a1aa-463a-a4d8-035664abdea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Refreshing instance network info cache due to event network-changed-abdf877d-771f-4148-a98f-c7e8319f044c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:12:39 np0005603623 nova_compute[226235]: 2026-01-31 08:12:39.302 226239 DEBUG oslo_concurrency.lockutils [req-b887a3bc-8541-4666-a15c-b2e98648cf79 req-eb1e8517-a1aa-463a-a4d8-035664abdea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:12:39 np0005603623 nova_compute[226235]: 2026-01-31 08:12:39.451 226239 DEBUG nova.network.neutron [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:12:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:39.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 03:12:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:12:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:40.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:12:40 np0005603623 nova_compute[226235]: 2026-01-31 08:12:40.783 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:41.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:41 np0005603623 nova_compute[226235]: 2026-01-31 08:12:41.919 226239 DEBUG nova.network.neutron [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Updating instance_info_cache with network_info: [{"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:12:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:42.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.223 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Releasing lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.224 226239 DEBUG nova.compute.manager [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Instance network_info: |[{"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.224 226239 DEBUG oslo_concurrency.lockutils [req-b887a3bc-8541-4666-a15c-b2e98648cf79 req-eb1e8517-a1aa-463a-a4d8-035664abdea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.224 226239 DEBUG nova.network.neutron [req-b887a3bc-8541-4666-a15c-b2e98648cf79 req-eb1e8517-a1aa-463a-a4d8-035664abdea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Refreshing network info cache for port abdf877d-771f-4148-a98f-c7e8319f044c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.227 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Start _get_guest_xml network_info=[{"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.230 226239 WARNING nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.234 226239 DEBUG nova.virt.libvirt.host [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.235 226239 DEBUG nova.virt.libvirt.host [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.237 226239 DEBUG nova.virt.libvirt.host [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.238 226239 DEBUG nova.virt.libvirt.host [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.239 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.239 226239 DEBUG nova.virt.hardware [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.239 226239 DEBUG nova.virt.hardware [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.240 226239 DEBUG nova.virt.hardware [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.240 226239 DEBUG nova.virt.hardware [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.240 226239 DEBUG nova.virt.hardware [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.240 226239 DEBUG nova.virt.hardware [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.240 226239 DEBUG nova.virt.hardware [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.241 226239 DEBUG nova.virt.hardware [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.241 226239 DEBUG nova.virt.hardware [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.241 226239 DEBUG nova.virt.hardware [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.241 226239 DEBUG nova.virt.hardware [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.243 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.518 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:43.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:12:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1236478935' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.640 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.669 226239 DEBUG nova.storage.rbd_utils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 0c37b9a9-3924-451d-bf70-c38147e26756_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:43 np0005603623 nova_compute[226235]: 2026-01-31 08:12:43.674 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:44.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:12:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3479187477' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.069 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.071 226239 DEBUG nova.virt.libvirt.vif [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:12:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-419177014',display_name='tempest-ServerActionsTestOtherA-server-419177014',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-419177014',id=91,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-0y3yxgdv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:12:34Z,user_data=None,user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=0c37b9a9-3924-451d-bf70-c38147e26756,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.072 226239 DEBUG nova.network.os_vif_util [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.073 226239 DEBUG nova.network.os_vif_util [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:4e:c3,bridge_name='br-int',has_traffic_filtering=True,id=abdf877d-771f-4148-a98f-c7e8319f044c,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabdf877d-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.074 226239 DEBUG nova.objects.instance [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0c37b9a9-3924-451d-bf70-c38147e26756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.093 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  <uuid>0c37b9a9-3924-451d-bf70-c38147e26756</uuid>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  <name>instance-0000005b</name>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsTestOtherA-server-419177014</nova:name>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:12:43</nova:creationTime>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <nova:user uuid="12a823bd7c6e4cf492ebf6c1d002a91f">tempest-ServerActionsTestOtherA-1768827668-project-member</nova:user>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <nova:project uuid="9c03fec1b3664105996aa979e226d8f8">tempest-ServerActionsTestOtherA-1768827668</nova:project>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <nova:port uuid="abdf877d-771f-4148-a98f-c7e8319f044c">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <entry name="serial">0c37b9a9-3924-451d-bf70-c38147e26756</entry>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <entry name="uuid">0c37b9a9-3924-451d-bf70-c38147e26756</entry>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/0c37b9a9-3924-451d-bf70-c38147e26756_disk">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/0c37b9a9-3924-451d-bf70-c38147e26756_disk.config">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:86:4e:c3"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <target dev="tapabdf877d-77"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/0c37b9a9-3924-451d-bf70-c38147e26756/console.log" append="off"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:12:44 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:12:44 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:12:44 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:12:44 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.094 226239 DEBUG nova.compute.manager [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Preparing to wait for external event network-vif-plugged-abdf877d-771f-4148-a98f-c7e8319f044c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.094 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.094 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.095 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.095 226239 DEBUG nova.virt.libvirt.vif [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:12:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-419177014',display_name='tempest-ServerActionsTestOtherA-server-419177014',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-419177014',id=91,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-0y3yxgdv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:12:34Z,user_data=None,user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=0c37b9a9-3924-451d-bf70-c38147e26756,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.095 226239 DEBUG nova.network.os_vif_util [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.096 226239 DEBUG nova.network.os_vif_util [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:4e:c3,bridge_name='br-int',has_traffic_filtering=True,id=abdf877d-771f-4148-a98f-c7e8319f044c,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabdf877d-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.096 226239 DEBUG os_vif [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:4e:c3,bridge_name='br-int',has_traffic_filtering=True,id=abdf877d-771f-4148-a98f-c7e8319f044c,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabdf877d-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.097 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.097 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.097 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.100 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.100 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapabdf877d-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.100 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapabdf877d-77, col_values=(('external_ids', {'iface-id': 'abdf877d-771f-4148-a98f-c7e8319f044c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:4e:c3', 'vm-uuid': '0c37b9a9-3924-451d-bf70-c38147e26756'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.102 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:44 np0005603623 NetworkManager[48970]: <info>  [1769847164.1025] manager: (tapabdf877d-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.104 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.107 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.108 226239 INFO os_vif [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:4e:c3,bridge_name='br-int',has_traffic_filtering=True,id=abdf877d-771f-4148-a98f-c7e8319f044c,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabdf877d-77')#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.382 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.383 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.383 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No VIF found with MAC fa:16:3e:86:4e:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.383 226239 INFO nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Using config drive#033[00m
Jan 31 03:12:44 np0005603623 nova_compute[226235]: 2026-01-31 08:12:44.406 226239 DEBUG nova.storage.rbd_utils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 0c37b9a9-3924-451d-bf70-c38147e26756_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.383 226239 INFO nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Creating config drive at /var/lib/nova/instances/0c37b9a9-3924-451d-bf70-c38147e26756/disk.config#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.387 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0c37b9a9-3924-451d-bf70-c38147e26756/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmps8myw3xy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.511 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0c37b9a9-3924-451d-bf70-c38147e26756/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmps8myw3xy" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.538 226239 DEBUG nova.storage.rbd_utils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 0c37b9a9-3924-451d-bf70-c38147e26756_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.542 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0c37b9a9-3924-451d-bf70-c38147e26756/disk.config 0c37b9a9-3924-451d-bf70-c38147e26756_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:12:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:45.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.698 226239 DEBUG oslo_concurrency.processutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0c37b9a9-3924-451d-bf70-c38147e26756/disk.config 0c37b9a9-3924-451d-bf70-c38147e26756_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.157s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.699 226239 INFO nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Deleting local config drive /var/lib/nova/instances/0c37b9a9-3924-451d-bf70-c38147e26756/disk.config because it was imported into RBD.#033[00m
Jan 31 03:12:45 np0005603623 kernel: tapabdf877d-77: entered promiscuous mode
Jan 31 03:12:45 np0005603623 NetworkManager[48970]: <info>  [1769847165.7362] manager: (tapabdf877d-77): new Tun device (/org/freedesktop/NetworkManager/Devices/175)
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.736 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:45Z|00366|binding|INFO|Claiming lport abdf877d-771f-4148-a98f-c7e8319f044c for this chassis.
Jan 31 03:12:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:45Z|00367|binding|INFO|abdf877d-771f-4148-a98f-c7e8319f044c: Claiming fa:16:3e:86:4e:c3 10.100.0.14
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.740 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.745 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.752 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:45 np0005603623 NetworkManager[48970]: <info>  [1769847165.7529] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Jan 31 03:12:45 np0005603623 NetworkManager[48970]: <info>  [1769847165.7540] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Jan 31 03:12:45 np0005603623 systemd-udevd[264544]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:12:45 np0005603623 systemd-machined[194379]: New machine qemu-40-instance-0000005b.
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.766 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:4e:c3 10.100.0.14'], port_security=['fa:16:3e:86:4e:c3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '0c37b9a9-3924-451d-bf70-c38147e26756', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c20bb243-1a39-4929-870f-6661da0e39e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=abdf877d-771f-4148-a98f-c7e8319f044c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.767 143258 INFO neutron.agent.ovn.metadata.agent [-] Port abdf877d-771f-4148-a98f-c7e8319f044c in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 bound to our chassis#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.768 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f564452-5f08-4a1c-921e-f2daee9ec936#033[00m
Jan 31 03:12:45 np0005603623 NetworkManager[48970]: <info>  [1769847165.7726] device (tapabdf877d-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:12:45 np0005603623 NetworkManager[48970]: <info>  [1769847165.7733] device (tapabdf877d-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:12:45 np0005603623 systemd[1]: Started Virtual Machine qemu-40-instance-0000005b.
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.777 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[994d7016-4e67-42a1-a5e3-b48ed5960123]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.777 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1f564452-51 in ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.779 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1f564452-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.779 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3a5b66-b77e-4cdf-b80e-c1da33a37177]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.780 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[46d8df60-0d3c-4ad8-8651-75d89e1d6ae2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.792 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[ed088fa8-f5b8-4ab5-a770-ada9050345fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.802 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.814 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e5de448f-99f1-4fcd-b207-d924510626fc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.815 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:45Z|00368|binding|INFO|Setting lport abdf877d-771f-4148-a98f-c7e8319f044c ovn-installed in OVS
Jan 31 03:12:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:45Z|00369|binding|INFO|Setting lport abdf877d-771f-4148-a98f-c7e8319f044c up in Southbound
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.818 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.834 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[13d8a096-fe7f-4c00-98dc-3f111edbddc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 NetworkManager[48970]: <info>  [1769847165.8396] manager: (tap1f564452-50): new Veth device (/org/freedesktop/NetworkManager/Devices/178)
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.839 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d5aee8-f395-43f4-aca2-70ab3ecce382]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.860 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[aa18bc88-d3d9-4393-a9bb-787a55c3395d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.862 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[5f705daf-aec3-4ced-9c23-79081e1b3194]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 NetworkManager[48970]: <info>  [1769847165.8771] device (tap1f564452-50): carrier: link connected
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.881 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c16f89c2-7697-4101-93e8-1812233e7b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.891 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0a455a79-bcb9-4cda-b6b1-bf8a2edd49e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630630, 'reachable_time': 43533, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264577, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.901 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fa130d48-d29d-4012-8c6f-9eaa0a6918d9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:23e8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 630630, 'tstamp': 630630}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264578, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.912 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1f9e71e8-8a2b-4172-a2ef-92148393fce4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630630, 'reachable_time': 43533, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 264579, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.930 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[774067a8-6541-47fd-af71-63cc2311c200]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.966 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e64142ed-2244-4c48-8f24-3b26dec7a547]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.968 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.968 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.969 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f564452-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.971 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:45 np0005603623 NetworkManager[48970]: <info>  [1769847165.9719] manager: (tap1f564452-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Jan 31 03:12:45 np0005603623 kernel: tap1f564452-50: entered promiscuous mode
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.975 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.977 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f564452-50, col_values=(('external_ids', {'iface-id': '5bb8c1b5-edce-4f6a-8164-58b7d89a3330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.978 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:45Z|00370|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.979 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.980 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.981 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f5f52a65-9c9f-4b89-b8d0-ac2c00c3e286]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.981 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-1f564452-5f08-4a1c-921e-f2daee9ec936
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 1f564452-5f08-4a1c-921e-f2daee9ec936
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:12:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:45.982 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'env', 'PROCESS_TAG=haproxy-1f564452-5f08-4a1c-921e-f2daee9ec936', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1f564452-5f08-4a1c-921e-f2daee9ec936.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:12:45 np0005603623 nova_compute[226235]: 2026-01-31 08:12:45.984 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:46.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.233 226239 DEBUG nova.network.neutron [req-b887a3bc-8541-4666-a15c-b2e98648cf79 req-eb1e8517-a1aa-463a-a4d8-035664abdea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Updated VIF entry in instance network info cache for port abdf877d-771f-4148-a98f-c7e8319f044c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.233 226239 DEBUG nova.network.neutron [req-b887a3bc-8541-4666-a15c-b2e98648cf79 req-eb1e8517-a1aa-463a-a4d8-035664abdea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Updating instance_info_cache with network_info: [{"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.258 226239 DEBUG oslo_concurrency.lockutils [req-b887a3bc-8541-4666-a15c-b2e98648cf79 req-eb1e8517-a1aa-463a-a4d8-035664abdea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.260 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847166.2601802, 0c37b9a9-3924-451d-bf70-c38147e26756 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.260 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] VM Started (Lifecycle Event)#033[00m
Jan 31 03:12:46 np0005603623 podman[264653]: 2026-01-31 08:12:46.33340308 +0000 UTC m=+0.089823553 container create 97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.342 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.346 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847166.2607434, 0c37b9a9-3924-451d-bf70-c38147e26756 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.346 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:12:46 np0005603623 podman[264653]: 2026-01-31 08:12:46.269628639 +0000 UTC m=+0.026049132 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:12:46 np0005603623 systemd[1]: Started libpod-conmon-97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5.scope.
Jan 31 03:12:46 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:12:46 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4976bef030d97eb0af9a8bb9c52a63c8e4a0537d0d721956bba49fdf9a93b67/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:12:46 np0005603623 podman[264653]: 2026-01-31 08:12:46.393106402 +0000 UTC m=+0.149526895 container init 97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.395 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.397 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:12:46 np0005603623 podman[264653]: 2026-01-31 08:12:46.398926676 +0000 UTC m=+0.155347139 container start 97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:12:46 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[264669]: [NOTICE]   (264673) : New worker (264675) forked
Jan 31 03:12:46 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[264669]: [NOTICE]   (264673) : Loading success.
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.429 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.534 226239 DEBUG nova.compute.manager [req-880136fb-f7d7-486d-8bed-5f03d2ee763e req-65f72a73-f53b-4eb3-bca1-5e74bc28e534 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Received event network-vif-plugged-abdf877d-771f-4148-a98f-c7e8319f044c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.534 226239 DEBUG oslo_concurrency.lockutils [req-880136fb-f7d7-486d-8bed-5f03d2ee763e req-65f72a73-f53b-4eb3-bca1-5e74bc28e534 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.535 226239 DEBUG oslo_concurrency.lockutils [req-880136fb-f7d7-486d-8bed-5f03d2ee763e req-65f72a73-f53b-4eb3-bca1-5e74bc28e534 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.535 226239 DEBUG oslo_concurrency.lockutils [req-880136fb-f7d7-486d-8bed-5f03d2ee763e req-65f72a73-f53b-4eb3-bca1-5e74bc28e534 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.535 226239 DEBUG nova.compute.manager [req-880136fb-f7d7-486d-8bed-5f03d2ee763e req-65f72a73-f53b-4eb3-bca1-5e74bc28e534 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Processing event network-vif-plugged-abdf877d-771f-4148-a98f-c7e8319f044c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.536 226239 DEBUG nova.compute.manager [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.538 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847166.5385091, 0c37b9a9-3924-451d-bf70-c38147e26756 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.539 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.541 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.544 226239 INFO nova.virt.libvirt.driver [-] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Instance spawned successfully.#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.544 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.573 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.576 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.589 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.589 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.590 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.590 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.591 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.591 226239 DEBUG nova.virt.libvirt.driver [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.611 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.917 226239 INFO nova.compute.manager [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Took 12.47 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:12:46 np0005603623 nova_compute[226235]: 2026-01-31 08:12:46.917 226239 DEBUG nova.compute.manager [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:47 np0005603623 nova_compute[226235]: 2026-01-31 08:12:47.054 226239 INFO nova.compute.manager [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Took 13.95 seconds to build instance.#033[00m
Jan 31 03:12:47 np0005603623 nova_compute[226235]: 2026-01-31 08:12:47.136 226239 DEBUG oslo_concurrency.lockutils [None req-feef7eb8-d6b3-47df-a14b-ee212229688e 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:12:47Z|00371|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:12:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:47.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:47 np0005603623 nova_compute[226235]: 2026-01-31 08:12:47.637 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:48.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:48 np0005603623 nova_compute[226235]: 2026-01-31 08:12:48.519 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:48 np0005603623 nova_compute[226235]: 2026-01-31 08:12:48.732 226239 DEBUG nova.compute.manager [req-19aa611f-d837-4f8f-b70e-c10835c5f2bd req-162968a9-f450-4914-bc2a-666de5d5b922 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Received event network-vif-plugged-abdf877d-771f-4148-a98f-c7e8319f044c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:48 np0005603623 nova_compute[226235]: 2026-01-31 08:12:48.733 226239 DEBUG oslo_concurrency.lockutils [req-19aa611f-d837-4f8f-b70e-c10835c5f2bd req-162968a9-f450-4914-bc2a-666de5d5b922 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:48 np0005603623 nova_compute[226235]: 2026-01-31 08:12:48.734 226239 DEBUG oslo_concurrency.lockutils [req-19aa611f-d837-4f8f-b70e-c10835c5f2bd req-162968a9-f450-4914-bc2a-666de5d5b922 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:48 np0005603623 nova_compute[226235]: 2026-01-31 08:12:48.734 226239 DEBUG oslo_concurrency.lockutils [req-19aa611f-d837-4f8f-b70e-c10835c5f2bd req-162968a9-f450-4914-bc2a-666de5d5b922 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:48 np0005603623 nova_compute[226235]: 2026-01-31 08:12:48.734 226239 DEBUG nova.compute.manager [req-19aa611f-d837-4f8f-b70e-c10835c5f2bd req-162968a9-f450-4914-bc2a-666de5d5b922 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] No waiting events found dispatching network-vif-plugged-abdf877d-771f-4148-a98f-c7e8319f044c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:48 np0005603623 nova_compute[226235]: 2026-01-31 08:12:48.734 226239 WARNING nova.compute.manager [req-19aa611f-d837-4f8f-b70e-c10835c5f2bd req-162968a9-f450-4914-bc2a-666de5d5b922 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Received unexpected event network-vif-plugged-abdf877d-771f-4148-a98f-c7e8319f044c for instance with vm_state active and task_state None.#033[00m
Jan 31 03:12:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:49 np0005603623 nova_compute[226235]: 2026-01-31 08:12:49.103 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:49.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:49 np0005603623 nova_compute[226235]: 2026-01-31 08:12:49.968 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:50.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:12:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111]
Jan 31 03:12:51 np0005603623 ceph-osd[79732]: ** DB Stats **
Jan 31 03:12:51 np0005603623 ceph-osd[79732]: Uptime(secs): 3000.1 total, 600.0 interval
Jan 31 03:12:51 np0005603623 ceph-osd[79732]: Cumulative writes: 34K writes, 149K keys, 34K commit groups, 1.0 writes per commit group, ingest: 0.15 GB, 0.05 MB/s
Jan 31 03:12:51 np0005603623 ceph-osd[79732]: Cumulative WAL: 34K writes, 10K syncs, 3.20 writes per sync, written: 0.15 GB, 0.05 MB/s
Jan 31 03:12:51 np0005603623 ceph-osd[79732]: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:12:51 np0005603623 ceph-osd[79732]: Interval writes: 11K writes, 49K keys, 11K commit groups, 1.0 writes per commit group, ingest: 54.75 MB, 0.09 MB/s
Jan 31 03:12:51 np0005603623 ceph-osd[79732]: Interval WAL: 11K writes, 4039 syncs, 2.85 writes per sync, written: 0.05 GB, 0.09 MB/s
Jan 31 03:12:51 np0005603623 ceph-osd[79732]: Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:12:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:12:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:51.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:12:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:52.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:53 np0005603623 nova_compute[226235]: 2026-01-31 08:12:53.521 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:53.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:54.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:54 np0005603623 nova_compute[226235]: 2026-01-31 08:12:54.105 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:54.699 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:54.701 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:12:54 np0005603623 nova_compute[226235]: 2026-01-31 08:12:54.701 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:54 np0005603623 nova_compute[226235]: 2026-01-31 08:12:54.997 226239 DEBUG nova.compute.manager [req-2f02b173-0522-42c0-9f3c-f365b17a6bf9 req-04e58acf-0ce1-4546-8715-a3c7a9508f68 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Received event network-changed-abdf877d-771f-4148-a98f-c7e8319f044c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:54 np0005603623 nova_compute[226235]: 2026-01-31 08:12:54.997 226239 DEBUG nova.compute.manager [req-2f02b173-0522-42c0-9f3c-f365b17a6bf9 req-04e58acf-0ce1-4546-8715-a3c7a9508f68 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Refreshing instance network info cache due to event network-changed-abdf877d-771f-4148-a98f-c7e8319f044c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:12:54 np0005603623 nova_compute[226235]: 2026-01-31 08:12:54.998 226239 DEBUG oslo_concurrency.lockutils [req-2f02b173-0522-42c0-9f3c-f365b17a6bf9 req-04e58acf-0ce1-4546-8715-a3c7a9508f68 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:12:54 np0005603623 nova_compute[226235]: 2026-01-31 08:12:54.998 226239 DEBUG oslo_concurrency.lockutils [req-2f02b173-0522-42c0-9f3c-f365b17a6bf9 req-04e58acf-0ce1-4546-8715-a3c7a9508f68 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:12:54 np0005603623 nova_compute[226235]: 2026-01-31 08:12:54.998 226239 DEBUG nova.network.neutron [req-2f02b173-0522-42c0-9f3c-f365b17a6bf9 req-04e58acf-0ce1-4546-8715-a3c7a9508f68 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Refreshing network info cache for port abdf877d-771f-4148-a98f-c7e8319f044c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:12:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:12:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:55.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:12:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:12:55.703 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:55 np0005603623 podman[264688]: 2026-01-31 08:12:55.96917387 +0000 UTC m=+0.058930309 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:12:55 np0005603623 podman[264689]: 2026-01-31 08:12:55.986244398 +0000 UTC m=+0.073559569 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:12:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:56.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:57.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:12:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:58.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:12:58 np0005603623 nova_compute[226235]: 2026-01-31 08:12:58.523 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:58 np0005603623 nova_compute[226235]: 2026-01-31 08:12:58.831 226239 DEBUG nova.network.neutron [req-2f02b173-0522-42c0-9f3c-f365b17a6bf9 req-04e58acf-0ce1-4546-8715-a3c7a9508f68 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Updated VIF entry in instance network info cache for port abdf877d-771f-4148-a98f-c7e8319f044c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:12:58 np0005603623 nova_compute[226235]: 2026-01-31 08:12:58.831 226239 DEBUG nova.network.neutron [req-2f02b173-0522-42c0-9f3c-f365b17a6bf9 req-04e58acf-0ce1-4546-8715-a3c7a9508f68 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Updating instance_info_cache with network_info: [{"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:12:59 np0005603623 nova_compute[226235]: 2026-01-31 08:12:59.107 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:59 np0005603623 nova_compute[226235]: 2026-01-31 08:12:59.140 226239 DEBUG oslo_concurrency.lockutils [req-2f02b173-0522-42c0-9f3c-f365b17a6bf9 req-04e58acf-0ce1-4546-8715-a3c7a9508f68 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:12:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:12:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:12:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:59.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:13:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:13:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:00.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:13:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:01.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:02.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:02 np0005603623 nova_compute[226235]: 2026-01-31 08:13:02.641 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:03 np0005603623 nova_compute[226235]: 2026-01-31 08:13:03.526 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:03.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:04.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:04 np0005603623 nova_compute[226235]: 2026-01-31 08:13:04.109 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:13:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:13:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:13:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:13:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:05.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:13:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:13:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:06.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:13:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:07.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:08.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:08 np0005603623 nova_compute[226235]: 2026-01-31 08:13:08.528 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:09 np0005603623 nova_compute[226235]: 2026-01-31 08:13:09.111 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:09.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:10.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:13:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:13:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:11.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e257 e257: 3 total, 3 up, 3 in
Jan 31 03:13:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:12.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:12 np0005603623 nova_compute[226235]: 2026-01-31 08:13:12.421 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:13 np0005603623 nova_compute[226235]: 2026-01-31 08:13:13.534 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:13.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:13:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:14.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:13:14 np0005603623 nova_compute[226235]: 2026-01-31 08:13:14.113 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:13:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4272873672' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:13:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:13:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4272873672' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:13:15 np0005603623 nova_compute[226235]: 2026-01-31 08:13:15.030 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:15.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:16.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:17.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:18.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:18 np0005603623 nova_compute[226235]: 2026-01-31 08:13:18.538 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:19 np0005603623 nova_compute[226235]: 2026-01-31 08:13:19.115 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:19.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:20.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:13:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:21.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:13:21 np0005603623 nova_compute[226235]: 2026-01-31 08:13:21.819 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:22.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e258 e258: 3 total, 3 up, 3 in
Jan 31 03:13:23 np0005603623 nova_compute[226235]: 2026-01-31 08:13:23.409 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:23 np0005603623 nova_compute[226235]: 2026-01-31 08:13:23.540 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:23.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:24.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:24 np0005603623 nova_compute[226235]: 2026-01-31 08:13:24.117 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:13:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:25.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:13:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:26.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:26 np0005603623 podman[265033]: 2026-01-31 08:13:26.975255304 +0000 UTC m=+0.069608345 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:13:26 np0005603623 podman[265032]: 2026-01-31 08:13:26.975233294 +0000 UTC m=+0.071232586 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:13:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:13:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:27.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:13:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:28.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:28 np0005603623 nova_compute[226235]: 2026-01-31 08:13:28.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:28 np0005603623 nova_compute[226235]: 2026-01-31 08:13:28.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:13:28 np0005603623 nova_compute[226235]: 2026-01-31 08:13:28.542 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:29 np0005603623 nova_compute[226235]: 2026-01-31 08:13:29.118 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:29 np0005603623 nova_compute[226235]: 2026-01-31 08:13:29.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:29 np0005603623 nova_compute[226235]: 2026-01-31 08:13:29.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:13:29 np0005603623 nova_compute[226235]: 2026-01-31 08:13:29.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:13:29 np0005603623 nova_compute[226235]: 2026-01-31 08:13:29.634 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:13:29 np0005603623 nova_compute[226235]: 2026-01-31 08:13:29.634 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:13:29 np0005603623 nova_compute[226235]: 2026-01-31 08:13:29.634 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:13:29 np0005603623 nova_compute[226235]: 2026-01-31 08:13:29.635 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0c37b9a9-3924-451d-bf70-c38147e26756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:13:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:13:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:29.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:13:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:30.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:13:30.106 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:13:30.106 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:13:30.106 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:31.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:32.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:33 np0005603623 nova_compute[226235]: 2026-01-31 08:13:33.235 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:33 np0005603623 nova_compute[226235]: 2026-01-31 08:13:33.291 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Updating instance_info_cache with network_info: [{"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:13:33 np0005603623 nova_compute[226235]: 2026-01-31 08:13:33.345 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:13:33 np0005603623 nova_compute[226235]: 2026-01-31 08:13:33.345 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:13:33 np0005603623 nova_compute[226235]: 2026-01-31 08:13:33.346 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:33 np0005603623 nova_compute[226235]: 2026-01-31 08:13:33.543 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:13:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:33.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:13:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 e259: 3 total, 3 up, 3 in
Jan 31 03:13:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:34.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:34 np0005603623 nova_compute[226235]: 2026-01-31 08:13:34.120 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:34 np0005603623 nova_compute[226235]: 2026-01-31 08:13:34.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:34 np0005603623 nova_compute[226235]: 2026-01-31 08:13:34.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:34 np0005603623 nova_compute[226235]: 2026-01-31 08:13:34.483 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:34 np0005603623 nova_compute[226235]: 2026-01-31 08:13:34.484 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:34 np0005603623 nova_compute[226235]: 2026-01-31 08:13:34.484 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:34 np0005603623 nova_compute[226235]: 2026-01-31 08:13:34.484 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:13:34 np0005603623 nova_compute[226235]: 2026-01-31 08:13:34.484 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:13:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2615519414' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:13:34 np0005603623 nova_compute[226235]: 2026-01-31 08:13:34.890 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:35.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:36.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:37.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:38.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:38 np0005603623 nova_compute[226235]: 2026-01-31 08:13:38.223 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000005b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:13:38 np0005603623 nova_compute[226235]: 2026-01-31 08:13:38.224 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000005b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:13:38 np0005603623 nova_compute[226235]: 2026-01-31 08:13:38.352 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:13:38 np0005603623 nova_compute[226235]: 2026-01-31 08:13:38.353 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4428MB free_disk=20.851593017578125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:13:38 np0005603623 nova_compute[226235]: 2026-01-31 08:13:38.353 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:38 np0005603623 nova_compute[226235]: 2026-01-31 08:13:38.353 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:38 np0005603623 nova_compute[226235]: 2026-01-31 08:13:38.545 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:13:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/344474505' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:13:38 np0005603623 nova_compute[226235]: 2026-01-31 08:13:38.570 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0c37b9a9-3924-451d-bf70-c38147e26756 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:13:38 np0005603623 nova_compute[226235]: 2026-01-31 08:13:38.570 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:13:38 np0005603623 nova_compute[226235]: 2026-01-31 08:13:38.571 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:13:38 np0005603623 nova_compute[226235]: 2026-01-31 08:13:38.767 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:39 np0005603623 nova_compute[226235]: 2026-01-31 08:13:39.122 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:13:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2936159149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:13:39 np0005603623 nova_compute[226235]: 2026-01-31 08:13:39.169 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:39 np0005603623 nova_compute[226235]: 2026-01-31 08:13:39.174 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:13:39 np0005603623 nova_compute[226235]: 2026-01-31 08:13:39.245 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:13:39 np0005603623 nova_compute[226235]: 2026-01-31 08:13:39.350 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:13:39 np0005603623 nova_compute[226235]: 2026-01-31 08:13:39.350 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.997s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:39.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:40.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:40 np0005603623 nova_compute[226235]: 2026-01-31 08:13:40.351 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:40 np0005603623 nova_compute[226235]: 2026-01-31 08:13:40.351 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:40 np0005603623 nova_compute[226235]: 2026-01-31 08:13:40.351 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:40 np0005603623 nova_compute[226235]: 2026-01-31 08:13:40.351 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:13:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:41.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:13:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:42.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:42 np0005603623 nova_compute[226235]: 2026-01-31 08:13:42.686 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "e54ff9a1-d1c9-4792-a837-076e8289ee23" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:42 np0005603623 nova_compute[226235]: 2026-01-31 08:13:42.686 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:42 np0005603623 nova_compute[226235]: 2026-01-31 08:13:42.831 226239 DEBUG nova.compute.manager [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:13:43 np0005603623 nova_compute[226235]: 2026-01-31 08:13:43.021 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:43 np0005603623 nova_compute[226235]: 2026-01-31 08:13:43.022 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:43 np0005603623 nova_compute[226235]: 2026-01-31 08:13:43.030 226239 DEBUG nova.virt.hardware [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:13:43 np0005603623 nova_compute[226235]: 2026-01-31 08:13:43.031 226239 INFO nova.compute.claims [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:13:43 np0005603623 nova_compute[226235]: 2026-01-31 08:13:43.380 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:43 np0005603623 nova_compute[226235]: 2026-01-31 08:13:43.547 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:43.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:13:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2555995833' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:13:43 np0005603623 nova_compute[226235]: 2026-01-31 08:13:43.851 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:43 np0005603623 nova_compute[226235]: 2026-01-31 08:13:43.856 226239 DEBUG nova.compute.provider_tree [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:13:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:43 np0005603623 nova_compute[226235]: 2026-01-31 08:13:43.941 226239 DEBUG nova.scheduler.client.report [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:13:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:44.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:44 np0005603623 nova_compute[226235]: 2026-01-31 08:13:44.125 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:44 np0005603623 nova_compute[226235]: 2026-01-31 08:13:44.204 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:44 np0005603623 nova_compute[226235]: 2026-01-31 08:13:44.206 226239 DEBUG nova.compute.manager [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:13:45 np0005603623 nova_compute[226235]: 2026-01-31 08:13:45.224 226239 DEBUG nova.compute.manager [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:13:45 np0005603623 nova_compute[226235]: 2026-01-31 08:13:45.225 226239 DEBUG nova.network.neutron [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:13:45 np0005603623 nova_compute[226235]: 2026-01-31 08:13:45.494 226239 INFO nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:13:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:13:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:45.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:13:45 np0005603623 nova_compute[226235]: 2026-01-31 08:13:45.816 226239 DEBUG nova.compute.manager [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:13:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:46.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.230 226239 DEBUG nova.compute.manager [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.231 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.232 226239 INFO nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Creating image(s)#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.254 226239 DEBUG nova.storage.rbd_utils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image e54ff9a1-d1c9-4792-a837-076e8289ee23_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.280 226239 DEBUG nova.storage.rbd_utils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image e54ff9a1-d1c9-4792-a837-076e8289ee23_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.315 226239 DEBUG nova.storage.rbd_utils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image e54ff9a1-d1c9-4792-a837-076e8289ee23_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.321 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.380 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.381 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.382 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.382 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.412 226239 DEBUG nova.storage.rbd_utils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image e54ff9a1-d1c9-4792-a837-076e8289ee23_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.417 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e54ff9a1-d1c9-4792-a837-076e8289ee23_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.676 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e54ff9a1-d1c9-4792-a837-076e8289ee23_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.259s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.756 226239 DEBUG nova.storage.rbd_utils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] resizing rbd image e54ff9a1-d1c9-4792-a837-076e8289ee23_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.794 226239 DEBUG nova.policy [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '111fdaf79c084a91902fe37a7a502020', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58e900992be7400fb940ca20f13e12d1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.881 226239 DEBUG nova.objects.instance [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'migration_context' on Instance uuid e54ff9a1-d1c9-4792-a837-076e8289ee23 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.964 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.964 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Ensure instance console log exists: /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.965 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.965 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:46 np0005603623 nova_compute[226235]: 2026-01-31 08:13:46.965 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:47.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:48.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:48 np0005603623 nova_compute[226235]: 2026-01-31 08:13:48.549 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:48 np0005603623 nova_compute[226235]: 2026-01-31 08:13:48.923 226239 DEBUG nova.network.neutron [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Successfully created port: db231dc0-94bd-47c5-bc4c-f139648e2cfa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:13:49 np0005603623 nova_compute[226235]: 2026-01-31 08:13:49.128 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:49.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:49 np0005603623 nova_compute[226235]: 2026-01-31 08:13:49.800 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:50.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:50 np0005603623 nova_compute[226235]: 2026-01-31 08:13:50.493 226239 DEBUG nova.network.neutron [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Successfully updated port: db231dc0-94bd-47c5-bc4c-f139648e2cfa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:13:50 np0005603623 nova_compute[226235]: 2026-01-31 08:13:50.728 226239 DEBUG nova.compute.manager [req-497f2fcd-1c4a-4ee5-a70e-f8236cda0c92 req-243ab3ef-be7c-471f-b786-ff2f18b57dd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received event network-changed-db231dc0-94bd-47c5-bc4c-f139648e2cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:50 np0005603623 nova_compute[226235]: 2026-01-31 08:13:50.729 226239 DEBUG nova.compute.manager [req-497f2fcd-1c4a-4ee5-a70e-f8236cda0c92 req-243ab3ef-be7c-471f-b786-ff2f18b57dd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Refreshing instance network info cache due to event network-changed-db231dc0-94bd-47c5-bc4c-f139648e2cfa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:13:50 np0005603623 nova_compute[226235]: 2026-01-31 08:13:50.729 226239 DEBUG oslo_concurrency.lockutils [req-497f2fcd-1c4a-4ee5-a70e-f8236cda0c92 req-243ab3ef-be7c-471f-b786-ff2f18b57dd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:13:50 np0005603623 nova_compute[226235]: 2026-01-31 08:13:50.729 226239 DEBUG oslo_concurrency.lockutils [req-497f2fcd-1c4a-4ee5-a70e-f8236cda0c92 req-243ab3ef-be7c-471f-b786-ff2f18b57dd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:13:50 np0005603623 nova_compute[226235]: 2026-01-31 08:13:50.729 226239 DEBUG nova.network.neutron [req-497f2fcd-1c4a-4ee5-a70e-f8236cda0c92 req-243ab3ef-be7c-471f-b786-ff2f18b57dd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Refreshing network info cache for port db231dc0-94bd-47c5-bc4c-f139648e2cfa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:13:50 np0005603623 nova_compute[226235]: 2026-01-31 08:13:50.736 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:13:51 np0005603623 nova_compute[226235]: 2026-01-31 08:13:51.037 226239 DEBUG nova.network.neutron [req-497f2fcd-1c4a-4ee5-a70e-f8236cda0c92 req-243ab3ef-be7c-471f-b786-ff2f18b57dd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:13:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:51.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:52 np0005603623 nova_compute[226235]: 2026-01-31 08:13:52.063 226239 DEBUG nova.network.neutron [req-497f2fcd-1c4a-4ee5-a70e-f8236cda0c92 req-243ab3ef-be7c-471f-b786-ff2f18b57dd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:13:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:52.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:52 np0005603623 nova_compute[226235]: 2026-01-31 08:13:52.150 226239 DEBUG oslo_concurrency.lockutils [req-497f2fcd-1c4a-4ee5-a70e-f8236cda0c92 req-243ab3ef-be7c-471f-b786-ff2f18b57dd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:13:52 np0005603623 nova_compute[226235]: 2026-01-31 08:13:52.151 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquired lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:13:52 np0005603623 nova_compute[226235]: 2026-01-31 08:13:52.151 226239 DEBUG nova.network.neutron [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:13:52 np0005603623 nova_compute[226235]: 2026-01-31 08:13:52.508 226239 DEBUG nova.network.neutron [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:13:53 np0005603623 nova_compute[226235]: 2026-01-31 08:13:53.550 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:53.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:54.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.130 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.133 226239 DEBUG nova.network.neutron [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Updating instance_info_cache with network_info: [{"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.318 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Releasing lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.319 226239 DEBUG nova.compute.manager [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Instance network_info: |[{"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.321 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Start _get_guest_xml network_info=[{"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.324 226239 WARNING nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.330 226239 DEBUG nova.virt.libvirt.host [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.330 226239 DEBUG nova.virt.libvirt.host [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.333 226239 DEBUG nova.virt.libvirt.host [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.333 226239 DEBUG nova.virt.libvirt.host [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.334 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.335 226239 DEBUG nova.virt.hardware [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.335 226239 DEBUG nova.virt.hardware [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.336 226239 DEBUG nova.virt.hardware [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.336 226239 DEBUG nova.virt.hardware [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.336 226239 DEBUG nova.virt.hardware [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.336 226239 DEBUG nova.virt.hardware [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.336 226239 DEBUG nova.virt.hardware [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.337 226239 DEBUG nova.virt.hardware [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.337 226239 DEBUG nova.virt.hardware [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.337 226239 DEBUG nova.virt.hardware [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.337 226239 DEBUG nova.virt.hardware [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.340 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:13:54 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3886946685' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.779 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.806 226239 DEBUG nova.storage.rbd_utils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image e54ff9a1-d1c9-4792-a837-076e8289ee23_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:13:54 np0005603623 nova_compute[226235]: 2026-01-31 08:13:54.810 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:13:55 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3079912129' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:13:55 np0005603623 nova_compute[226235]: 2026-01-31 08:13:55.218 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:55 np0005603623 nova_compute[226235]: 2026-01-31 08:13:55.221 226239 DEBUG nova.virt.libvirt.vif [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:13:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-436373101',display_name='tempest-ServerDiskConfigTestJSON-server-436373101',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-436373101',id=93,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-0t50bqgi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskC
onfigTestJSON-855158150-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:13:45Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=e54ff9a1-d1c9-4792-a837-076e8289ee23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:13:55 np0005603623 nova_compute[226235]: 2026-01-31 08:13:55.222 226239 DEBUG nova.network.os_vif_util [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:13:55 np0005603623 nova_compute[226235]: 2026-01-31 08:13:55.223 226239 DEBUG nova.network.os_vif_util [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=db231dc0-94bd-47c5-bc4c-f139648e2cfa,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb231dc0-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:13:55 np0005603623 nova_compute[226235]: 2026-01-31 08:13:55.225 226239 DEBUG nova.objects.instance [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'pci_devices' on Instance uuid e54ff9a1-d1c9-4792-a837-076e8289ee23 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:13:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:55.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:56.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:57.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:57 np0005603623 podman[265438]: 2026-01-31 08:13:57.936498272 +0000 UTC m=+0.035375307 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:13:57 np0005603623 podman[265439]: 2026-01-31 08:13:57.975370477 +0000 UTC m=+0.070425931 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:13:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:58.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.512 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  <uuid>e54ff9a1-d1c9-4792-a837-076e8289ee23</uuid>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  <name>instance-0000005d</name>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-436373101</nova:name>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:13:54</nova:creationTime>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <nova:user uuid="111fdaf79c084a91902fe37a7a502020">tempest-ServerDiskConfigTestJSON-855158150-project-member</nova:user>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <nova:project uuid="58e900992be7400fb940ca20f13e12d1">tempest-ServerDiskConfigTestJSON-855158150</nova:project>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <nova:port uuid="db231dc0-94bd-47c5-bc4c-f139648e2cfa">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <entry name="serial">e54ff9a1-d1c9-4792-a837-076e8289ee23</entry>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <entry name="uuid">e54ff9a1-d1c9-4792-a837-076e8289ee23</entry>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/e54ff9a1-d1c9-4792-a837-076e8289ee23_disk">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/e54ff9a1-d1c9-4792-a837-076e8289ee23_disk.config">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:0d:e9:1e"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <target dev="tapdb231dc0-94"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23/console.log" append="off"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:13:58 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:13:58 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:13:58 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:13:58 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.512 226239 DEBUG nova.compute.manager [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Preparing to wait for external event network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.513 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.513 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.513 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.514 226239 DEBUG nova.virt.libvirt.vif [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:13:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-436373101',display_name='tempest-ServerDiskConfigTestJSON-server-436373101',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-436373101',id=93,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-0t50bqgi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:13:45Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=e54ff9a1-d1c9-4792-a837-076e8289ee23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.514 226239 DEBUG nova.network.os_vif_util [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.515 226239 DEBUG nova.network.os_vif_util [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=db231dc0-94bd-47c5-bc4c-f139648e2cfa,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb231dc0-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.515 226239 DEBUG os_vif [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=db231dc0-94bd-47c5-bc4c-f139648e2cfa,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb231dc0-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.516 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.516 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.516 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.519 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.519 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb231dc0-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.520 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdb231dc0-94, col_values=(('external_ids', {'iface-id': 'db231dc0-94bd-47c5-bc4c-f139648e2cfa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:e9:1e', 'vm-uuid': 'e54ff9a1-d1c9-4792-a837-076e8289ee23'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.521 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:58 np0005603623 NetworkManager[48970]: <info>  [1769847238.5219] manager: (tapdb231dc0-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/180)
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.523 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.526 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.527 226239 INFO os_vif [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=db231dc0-94bd-47c5-bc4c-f139648e2cfa,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb231dc0-94')#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.552 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.720 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.720 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.720 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No VIF found with MAC fa:16:3e:0d:e9:1e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.721 226239 INFO nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Using config drive#033[00m
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.737772) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847238737807, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 1189, "num_deletes": 257, "total_data_size": 2412117, "memory_usage": 2451552, "flush_reason": "Manual Compaction"}
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847238745058, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 1590236, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42801, "largest_seqno": 43985, "table_properties": {"data_size": 1585069, "index_size": 2627, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11503, "raw_average_key_size": 19, "raw_value_size": 1574440, "raw_average_value_size": 2695, "num_data_blocks": 116, "num_entries": 584, "num_filter_entries": 584, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847149, "oldest_key_time": 1769847149, "file_creation_time": 1769847238, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 7354 microseconds, and 3358 cpu microseconds.
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.745120) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 1590236 bytes OK
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.745144) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.747302) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.747332) EVENT_LOG_v1 {"time_micros": 1769847238747323, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.747358) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 2406405, prev total WAL file size 2406405, number of live WAL files 2.
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.748260) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031323535' seq:72057594037927935, type:22 .. '6C6F676D0031353037' seq:0, type:0; will stop at (end)
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(1552KB)], [81(8909KB)]
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847238748294, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 10713200, "oldest_snapshot_seqno": -1}
Jan 31 03:13:58 np0005603623 nova_compute[226235]: 2026-01-31 08:13:58.755 226239 DEBUG nova.storage.rbd_utils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image e54ff9a1-d1c9-4792-a837-076e8289ee23_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6793 keys, 10575450 bytes, temperature: kUnknown
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847238831866, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 10575450, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10530026, "index_size": 27328, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17029, "raw_key_size": 175165, "raw_average_key_size": 25, "raw_value_size": 10408639, "raw_average_value_size": 1532, "num_data_blocks": 1085, "num_entries": 6793, "num_filter_entries": 6793, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769847238, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.832586) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 10575450 bytes
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.855778) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 127.7 rd, 126.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 8.7 +0.0 blob) out(10.1 +0.0 blob), read-write-amplify(13.4) write-amplify(6.7) OK, records in: 7324, records dropped: 531 output_compression: NoCompression
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.855856) EVENT_LOG_v1 {"time_micros": 1769847238855833, "job": 50, "event": "compaction_finished", "compaction_time_micros": 83900, "compaction_time_cpu_micros": 21416, "output_level": 6, "num_output_files": 1, "total_output_size": 10575450, "num_input_records": 7324, "num_output_records": 6793, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847238856559, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847238858313, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.748073) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.858360) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.858367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.858371) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.858374) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:13:58.858377) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:13:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:13:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:13:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:59.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:14:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:00.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:01.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:01 np0005603623 nova_compute[226235]: 2026-01-31 08:14:01.877 226239 INFO nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Creating config drive at /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23/disk.config#033[00m
Jan 31 03:14:01 np0005603623 nova_compute[226235]: 2026-01-31 08:14:01.881 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpowtwqjff execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:14:02 np0005603623 nova_compute[226235]: 2026-01-31 08:14:02.005 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpowtwqjff" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:14:02 np0005603623 nova_compute[226235]: 2026-01-31 08:14:02.037 226239 DEBUG nova.storage.rbd_utils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image e54ff9a1-d1c9-4792-a837-076e8289ee23_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:14:02 np0005603623 nova_compute[226235]: 2026-01-31 08:14:02.040 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23/disk.config e54ff9a1-d1c9-4792-a837-076e8289ee23_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:14:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:02.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:02 np0005603623 nova_compute[226235]: 2026-01-31 08:14:02.312 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:02.313 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:14:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:02.316 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:14:02 np0005603623 nova_compute[226235]: 2026-01-31 08:14:02.562 226239 DEBUG oslo_concurrency.processutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23/disk.config e54ff9a1-d1c9-4792-a837-076e8289ee23_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:14:02 np0005603623 nova_compute[226235]: 2026-01-31 08:14:02.562 226239 INFO nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Deleting local config drive /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23/disk.config because it was imported into RBD.#033[00m
Jan 31 03:14:03 np0005603623 kernel: tapdb231dc0-94: entered promiscuous mode
Jan 31 03:14:03 np0005603623 NetworkManager[48970]: <info>  [1769847242.6025] manager: (tapdb231dc0-94): new Tun device (/org/freedesktop/NetworkManager/Devices/181)
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:02.604 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:14:02Z|00372|if_status|INFO|Not updating pb chassis for db231dc0-94bd-47c5-bc4c-f139648e2cfa now as sb is readonly
Jan 31 03:14:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:14:02Z|00373|binding|INFO|Claiming lport db231dc0-94bd-47c5-bc4c-f139648e2cfa for this chassis.
Jan 31 03:14:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:14:02Z|00374|binding|INFO|db231dc0-94bd-47c5-bc4c-f139648e2cfa: Claiming fa:16:3e:0d:e9:1e 10.100.0.9
Jan 31 03:14:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:14:03Z|00375|binding|INFO|Setting lport db231dc0-94bd-47c5-bc4c-f139648e2cfa ovn-installed in OVS
Jan 31 03:14:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:14:03Z|00376|binding|INFO|Setting lport db231dc0-94bd-47c5-bc4c-f139648e2cfa up in Southbound
Jan 31 03:14:03 np0005603623 systemd-udevd[265557]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.141 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.142 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:e9:1e 10.100.0.9'], port_security=['fa:16:3e:0d:e9:1e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'e54ff9a1-d1c9-4792-a837-076e8289ee23', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=db231dc0-94bd-47c5-bc4c-f139648e2cfa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.143 143258 INFO neutron.agent.ovn.metadata.agent [-] Port db231dc0-94bd-47c5-bc4c-f139648e2cfa in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 bound to our chassis#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.146 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f218695f-c744-4bd8-b2d8-122a920c7ca0#033[00m
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.147 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:03 np0005603623 NetworkManager[48970]: <info>  [1769847243.1573] device (tapdb231dc0-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:14:03 np0005603623 NetworkManager[48970]: <info>  [1769847243.1578] device (tapdb231dc0-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.162 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[16bc108d-bff7-4760-98f9-682101a71319]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.163 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf218695f-c1 in ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.166 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf218695f-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.167 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[96966125-3fc5-45c6-a24a-46bcdc4b678b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.168 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5c43eca8-0977-422c-942a-c02ddcb1c430]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 systemd-machined[194379]: New machine qemu-41-instance-0000005d.
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.182 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec6daf0-b66f-46dc-8d34-a5e758a01ce6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 systemd[1]: Started Virtual Machine qemu-41-instance-0000005d.
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.195 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dc4ce209-e3c3-41b6-9cf1-b5864e00af94]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.224 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[daefa078-efae-49ab-9f18-d0203540e863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 NetworkManager[48970]: <info>  [1769847243.2303] manager: (tapf218695f-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/182)
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.231 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[015c46e7-bbd3-4b8d-99fe-a60aa670ff3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.259 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ca79c500-a06a-463c-a04a-cc730564b679]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.262 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[8008b526-cca3-4d23-a831-5705439077f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 NetworkManager[48970]: <info>  [1769847243.2846] device (tapf218695f-c0): carrier: link connected
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.289 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[030fc86f-f6ea-4bcf-bc1a-4edbf538ad8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.307 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7cd80577-6cce-4250-8d77-8b3340ac36f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf218695f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638371, 'reachable_time': 44718, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265643, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.321 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a443c0e8-abfa-4073-bf8b-8fddefe98c7f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:830'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 638371, 'tstamp': 638371}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 265644, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.340 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b01e13a8-49e8-4d71-acc9-cf7d6368b077]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf218695f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638371, 'reachable_time': 44718, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 265645, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.369 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[29a5ddb4-1915-4fd1-b70a-3c36dbcdbb18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.421 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2db0611b-d498-45af-a49b-adfe578cc33c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.424 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf218695f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.425 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.426 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf218695f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:14:03 np0005603623 NetworkManager[48970]: <info>  [1769847243.4292] manager: (tapf218695f-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.429 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:03 np0005603623 kernel: tapf218695f-c0: entered promiscuous mode
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.432 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.435 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf218695f-c0, col_values=(('external_ids', {'iface-id': 'd3a551a2-38e3-48d3-bdee-f2493a79eca0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:14:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:14:03Z|00377|binding|INFO|Releasing lport d3a551a2-38e3-48d3-bdee-f2493a79eca0 from this chassis (sb_readonly=0)
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.437 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.438 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.439 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f141ff59-ef50-43c4-9a66-40b19916c31b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.440 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-f218695f-c744-4bd8-b2d8-122a920c7ca0
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID f218695f-c744-4bd8-b2d8-122a920c7ca0
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:14:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:03.440 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'env', 'PROCESS_TAG=haproxy-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f218695f-c744-4bd8-b2d8-122a920c7ca0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.447 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.520 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.555 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.563 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847243.5633543, e54ff9a1-d1c9-4792-a837-076e8289ee23 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.563 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] VM Started (Lifecycle Event)#033[00m
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.608 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.611 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847243.563901, e54ff9a1-d1c9-4792-a837-076e8289ee23 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.612 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:14:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:03.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.736 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.740 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:14:03 np0005603623 podman[265720]: 2026-01-31 08:14:03.804443184 +0000 UTC m=+0.040050454 container create f3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:14:03 np0005603623 systemd[1]: Started libpod-conmon-f3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed.scope.
Jan 31 03:14:03 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:14:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:03 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc6c42ffc8c50dde4995f53a909492fca31b5adb5d35489aad9f180523ab7383/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:14:03 np0005603623 podman[265720]: 2026-01-31 08:14:03.783971378 +0000 UTC m=+0.019578648 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:14:03 np0005603623 podman[265720]: 2026-01-31 08:14:03.881361948 +0000 UTC m=+0.116969238 container init f3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 03:14:03 np0005603623 podman[265720]: 2026-01-31 08:14:03.888949278 +0000 UTC m=+0.124556568 container start f3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 03:14:03 np0005603623 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[265735]: [NOTICE]   (265739) : New worker (265741) forked
Jan 31 03:14:03 np0005603623 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[265735]: [NOTICE]   (265739) : Loading success.
Jan 31 03:14:03 np0005603623 nova_compute[226235]: 2026-01-31 08:14:03.914 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:14:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:04.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.417 226239 DEBUG nova.compute.manager [req-e8552186-7880-4305-a02d-b0cf538a9f60 req-88d476cd-1b1a-4b0c-be5d-824156a25007 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received event network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.417 226239 DEBUG oslo_concurrency.lockutils [req-e8552186-7880-4305-a02d-b0cf538a9f60 req-88d476cd-1b1a-4b0c-be5d-824156a25007 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.417 226239 DEBUG oslo_concurrency.lockutils [req-e8552186-7880-4305-a02d-b0cf538a9f60 req-88d476cd-1b1a-4b0c-be5d-824156a25007 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.418 226239 DEBUG oslo_concurrency.lockutils [req-e8552186-7880-4305-a02d-b0cf538a9f60 req-88d476cd-1b1a-4b0c-be5d-824156a25007 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.418 226239 DEBUG nova.compute.manager [req-e8552186-7880-4305-a02d-b0cf538a9f60 req-88d476cd-1b1a-4b0c-be5d-824156a25007 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Processing event network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.418 226239 DEBUG nova.compute.manager [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.421 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847244.4214056, e54ff9a1-d1c9-4792-a837-076e8289ee23 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.421 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] VM Resumed (Lifecycle Event)
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.423 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.426 226239 INFO nova.virt.libvirt.driver [-] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Instance spawned successfully.
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.426 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.538 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.538 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.539 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.539 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.540 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.540 226239 DEBUG nova.virt.libvirt.driver [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.552 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.555 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.626 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.750 226239 INFO nova.compute.manager [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Took 18.52 seconds to spawn the instance on the hypervisor.
Jan 31 03:14:04 np0005603623 nova_compute[226235]: 2026-01-31 08:14:04.750 226239 DEBUG nova.compute.manager [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:14:05 np0005603623 nova_compute[226235]: 2026-01-31 08:14:05.021 226239 INFO nova.compute.manager [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Took 22.03 seconds to build instance.
Jan 31 03:14:05 np0005603623 nova_compute[226235]: 2026-01-31 08:14:05.181 226239 DEBUG oslo_concurrency.lockutils [None req-85ab9afa-35c1-480b-b804-2280b2eebc07 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:14:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:05.318 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:14:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:05.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:06.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:06 np0005603623 nova_compute[226235]: 2026-01-31 08:14:06.619 226239 DEBUG nova.compute.manager [req-baeabbed-069a-40b1-a81e-078b0b482f0a req-7261ef93-6a53-41ff-972c-1c0ceddcb542 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received event network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:14:06 np0005603623 nova_compute[226235]: 2026-01-31 08:14:06.619 226239 DEBUG oslo_concurrency.lockutils [req-baeabbed-069a-40b1-a81e-078b0b482f0a req-7261ef93-6a53-41ff-972c-1c0ceddcb542 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:14:06 np0005603623 nova_compute[226235]: 2026-01-31 08:14:06.619 226239 DEBUG oslo_concurrency.lockutils [req-baeabbed-069a-40b1-a81e-078b0b482f0a req-7261ef93-6a53-41ff-972c-1c0ceddcb542 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:14:06 np0005603623 nova_compute[226235]: 2026-01-31 08:14:06.620 226239 DEBUG oslo_concurrency.lockutils [req-baeabbed-069a-40b1-a81e-078b0b482f0a req-7261ef93-6a53-41ff-972c-1c0ceddcb542 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:14:06 np0005603623 nova_compute[226235]: 2026-01-31 08:14:06.620 226239 DEBUG nova.compute.manager [req-baeabbed-069a-40b1-a81e-078b0b482f0a req-7261ef93-6a53-41ff-972c-1c0ceddcb542 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] No waiting events found dispatching network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:14:06 np0005603623 nova_compute[226235]: 2026-01-31 08:14:06.620 226239 WARNING nova.compute.manager [req-baeabbed-069a-40b1-a81e-078b0b482f0a req-7261ef93-6a53-41ff-972c-1c0ceddcb542 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received unexpected event network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa for instance with vm_state active and task_state None.
Jan 31 03:14:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:14:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:07.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:14:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:08.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:08 np0005603623 nova_compute[226235]: 2026-01-31 08:14:08.522 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:14:08 np0005603623 nova_compute[226235]: 2026-01-31 08:14:08.556 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:14:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:09.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:10.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:10 np0005603623 systemd[1]: Starting dnf makecache...
Jan 31 03:14:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:10 np0005603623 dnf[265899]: Metadata cache refreshed recently.
Jan 31 03:14:10 np0005603623 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 31 03:14:10 np0005603623 systemd[1]: Finished dnf makecache.
Jan 31 03:14:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:11.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 03:14:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 03:14:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 03:14:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:14:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:14:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:12.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:13 np0005603623 nova_compute[226235]: 2026-01-31 08:14:13.523 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:14:13 np0005603623 nova_compute[226235]: 2026-01-31 08:14:13.557 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:14:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:14:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:13.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:14:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:14.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:14 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #50. Immutable memtables: 7.
Jan 31 03:14:15 np0005603623 nova_compute[226235]: 2026-01-31 08:14:15.482 226239 DEBUG oslo_concurrency.lockutils [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:14:15 np0005603623 nova_compute[226235]: 2026-01-31 08:14:15.483 226239 DEBUG oslo_concurrency.lockutils [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquired lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:14:15 np0005603623 nova_compute[226235]: 2026-01-31 08:14:15.483 226239 DEBUG nova.network.neutron [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:14:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:15.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:16.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:16 np0005603623 ovn_controller[133449]: 2026-01-31T08:14:16Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0d:e9:1e 10.100.0.9
Jan 31 03:14:16 np0005603623 ovn_controller[133449]: 2026-01-31T08:14:16Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0d:e9:1e 10.100.0.9
Jan 31 03:14:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:14:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:17.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:14:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:14:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:18.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:14:18 np0005603623 nova_compute[226235]: 2026-01-31 08:14:18.525 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:14:18 np0005603623 nova_compute[226235]: 2026-01-31 08:14:18.559 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:14:18 np0005603623 nova_compute[226235]: 2026-01-31 08:14:18.589 226239 DEBUG nova.network.neutron [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Updating instance_info_cache with network_info: [{"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:14:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:19 np0005603623 nova_compute[226235]: 2026-01-31 08:14:19.060 226239 DEBUG oslo_concurrency.lockutils [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Releasing lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:14:19 np0005603623 nova_compute[226235]: 2026-01-31 08:14:19.584 226239 DEBUG nova.virt.libvirt.driver [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 31 03:14:19 np0005603623 nova_compute[226235]: 2026-01-31 08:14:19.585 226239 DEBUG nova.virt.libvirt.volume.remotefs [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Creating file /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23/ee9eaeb397034c27a18d0d93219580e0.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 31 03:14:19 np0005603623 nova_compute[226235]: 2026-01-31 08:14:19.585 226239 DEBUG oslo_concurrency.processutils [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23/ee9eaeb397034c27a18d0d93219580e0.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:14:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:19.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:19 np0005603623 nova_compute[226235]: 2026-01-31 08:14:19.778 226239 DEBUG oslo_concurrency.processutils [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23/ee9eaeb397034c27a18d0d93219580e0.tmp" returned: 1 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:14:19 np0005603623 nova_compute[226235]: 2026-01-31 08:14:19.780 226239 DEBUG oslo_concurrency.processutils [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23/ee9eaeb397034c27a18d0d93219580e0.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 03:14:19 np0005603623 nova_compute[226235]: 2026-01-31 08:14:19.781 226239 DEBUG nova.virt.libvirt.volume.remotefs [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Creating directory /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 31 03:14:19 np0005603623 nova_compute[226235]: 2026-01-31 08:14:19.781 226239 DEBUG oslo_concurrency.processutils [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:14:19 np0005603623 nova_compute[226235]: 2026-01-31 08:14:19.966 226239 DEBUG oslo_concurrency.processutils [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/e54ff9a1-d1c9-4792-a837-076e8289ee23" returned: 0 in 0.184s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:14:19 np0005603623 nova_compute[226235]: 2026-01-31 08:14:19.970 226239 DEBUG nova.virt.libvirt.driver [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:14:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:20.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:21.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:22.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:22 np0005603623 kernel: tapdb231dc0-94 (unregistering): left promiscuous mode
Jan 31 03:14:22 np0005603623 NetworkManager[48970]: <info>  [1769847262.4468] device (tapdb231dc0-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:14:22 np0005603623 ovn_controller[133449]: 2026-01-31T08:14:22Z|00378|binding|INFO|Releasing lport db231dc0-94bd-47c5-bc4c-f139648e2cfa from this chassis (sb_readonly=0)
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.453 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:22 np0005603623 ovn_controller[133449]: 2026-01-31T08:14:22Z|00379|binding|INFO|Setting lport db231dc0-94bd-47c5-bc4c-f139648e2cfa down in Southbound
Jan 31 03:14:22 np0005603623 ovn_controller[133449]: 2026-01-31T08:14:22Z|00380|binding|INFO|Removing iface tapdb231dc0-94 ovn-installed in OVS
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.457 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.460 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:22 np0005603623 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000005d.scope: Deactivated successfully.
Jan 31 03:14:22 np0005603623 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000005d.scope: Consumed 11.845s CPU time.
Jan 31 03:14:22 np0005603623 systemd-machined[194379]: Machine qemu-41-instance-0000005d terminated.
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.679 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:e9:1e 10.100.0.9'], port_security=['fa:16:3e:0d:e9:1e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'e54ff9a1-d1c9-4792-a837-076e8289ee23', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=db231dc0-94bd-47c5-bc4c-f139648e2cfa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.680 143258 INFO neutron.agent.ovn.metadata.agent [-] Port db231dc0-94bd-47c5-bc4c-f139648e2cfa in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 unbound from our chassis#033[00m
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.682 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f218695f-c744-4bd8-b2d8-122a920c7ca0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.683 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d45a3582-fbbd-4a80-bf02-c8dc2766bddb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.683 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 namespace which is not needed anymore#033[00m
Jan 31 03:14:22 np0005603623 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[265735]: [NOTICE]   (265739) : haproxy version is 2.8.14-c23fe91
Jan 31 03:14:22 np0005603623 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[265735]: [NOTICE]   (265739) : path to executable is /usr/sbin/haproxy
Jan 31 03:14:22 np0005603623 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[265735]: [WARNING]  (265739) : Exiting Master process...
Jan 31 03:14:22 np0005603623 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[265735]: [ALERT]    (265739) : Current worker (265741) exited with code 143 (Terminated)
Jan 31 03:14:22 np0005603623 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[265735]: [WARNING]  (265739) : All workers exited. Exiting... (0)
Jan 31 03:14:22 np0005603623 systemd[1]: libpod-f3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed.scope: Deactivated successfully.
Jan 31 03:14:22 np0005603623 podman[266097]: 2026-01-31 08:14:22.819006072 +0000 UTC m=+0.046514818 container died f3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:14:22 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed-userdata-shm.mount: Deactivated successfully.
Jan 31 03:14:22 np0005603623 systemd[1]: var-lib-containers-storage-overlay-bc6c42ffc8c50dde4995f53a909492fca31b5adb5d35489aad9f180523ab7383-merged.mount: Deactivated successfully.
Jan 31 03:14:22 np0005603623 podman[266097]: 2026-01-31 08:14:22.863815965 +0000 UTC m=+0.091324711 container cleanup f3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 31 03:14:22 np0005603623 systemd[1]: libpod-conmon-f3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed.scope: Deactivated successfully.
Jan 31 03:14:22 np0005603623 podman[266127]: 2026-01-31 08:14:22.914238944 +0000 UTC m=+0.035748577 container remove f3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.918 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[da3967cf-000e-4aa2-bab2-732d8886d756]: (4, ('Sat Jan 31 08:14:22 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 (f3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed)\nf3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed\nSat Jan 31 08:14:22 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 (f3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed)\nf3d1e5f9e459422586f7aea3cc7618b8578f1e790e5cb20b9139ffe0461c20ed\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.919 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[49cad1e6-b86a-4269-b297-d31bf08702df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.920 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf218695f-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.922 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:22 np0005603623 kernel: tapf218695f-c0: left promiscuous mode
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.930 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.932 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e22780db-8465-4205-81ed-aeb95490439a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.944 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cf98268e-81f7-4107-b528-d90d8dccdb81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.945 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[97d13f0e-faff-448b-90eb-74ebe42d3d6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.954 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a8c27a-3f3d-407e-a123-a418a05a7b41]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638364, 'reachable_time': 29380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266146, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.956 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:14:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:22.956 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[953a4a1c-811a-4c7b-bc4b-9124f93a3330]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:22 np0005603623 systemd[1]: run-netns-ovnmeta\x2df218695f\x2dc744\x2d4bd8\x2db2d8\x2d122a920c7ca0.mount: Deactivated successfully.
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.985 226239 INFO nova.virt.libvirt.driver [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.988 226239 INFO nova.virt.libvirt.driver [-] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Instance destroyed successfully.#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.989 226239 DEBUG nova.virt.libvirt.vif [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:13:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-436373101',display_name='tempest-ServerDiskConfigTestJSON-server-436373101',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-436373101',id=93,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:14:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-0t50bqgi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:14:12Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=e54ff9a1-d1c9-4792-a837-076e8289ee23,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "vif_mac": "fa:16:3e:0d:e9:1e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.990 226239 DEBUG nova.network.os_vif_util [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "vif_mac": "fa:16:3e:0d:e9:1e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.990 226239 DEBUG nova.network.os_vif_util [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=db231dc0-94bd-47c5-bc4c-f139648e2cfa,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb231dc0-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.991 226239 DEBUG os_vif [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=db231dc0-94bd-47c5-bc4c-f139648e2cfa,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb231dc0-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.992 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.992 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb231dc0-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.994 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.996 226239 INFO os_vif [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=db231dc0-94bd-47c5-bc4c-f139648e2cfa,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb231dc0-94')#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.999 226239 DEBUG nova.virt.libvirt.driver [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] skipping disk for instance-0000005d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:14:22 np0005603623 nova_compute[226235]: 2026-01-31 08:14:22.999 226239 DEBUG nova.virt.libvirt.driver [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] skipping disk for instance-0000005d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:14:23 np0005603623 nova_compute[226235]: 2026-01-31 08:14:23.560 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:23.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:23 np0005603623 nova_compute[226235]: 2026-01-31 08:14:23.939 226239 DEBUG neutronclient.v2_0.client [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port db231dc0-94bd-47c5-bc4c-f139648e2cfa for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:14:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:24.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:24 np0005603623 nova_compute[226235]: 2026-01-31 08:14:24.480 226239 DEBUG nova.compute.manager [req-f91ef9c1-0a22-48f6-b3a9-634f28e52e23 req-9ca50edc-c8b8-425d-9c7d-4df7737f92df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received event network-vif-unplugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:14:24 np0005603623 nova_compute[226235]: 2026-01-31 08:14:24.481 226239 DEBUG oslo_concurrency.lockutils [req-f91ef9c1-0a22-48f6-b3a9-634f28e52e23 req-9ca50edc-c8b8-425d-9c7d-4df7737f92df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:24 np0005603623 nova_compute[226235]: 2026-01-31 08:14:24.481 226239 DEBUG oslo_concurrency.lockutils [req-f91ef9c1-0a22-48f6-b3a9-634f28e52e23 req-9ca50edc-c8b8-425d-9c7d-4df7737f92df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:24 np0005603623 nova_compute[226235]: 2026-01-31 08:14:24.482 226239 DEBUG oslo_concurrency.lockutils [req-f91ef9c1-0a22-48f6-b3a9-634f28e52e23 req-9ca50edc-c8b8-425d-9c7d-4df7737f92df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:24 np0005603623 nova_compute[226235]: 2026-01-31 08:14:24.482 226239 DEBUG nova.compute.manager [req-f91ef9c1-0a22-48f6-b3a9-634f28e52e23 req-9ca50edc-c8b8-425d-9c7d-4df7737f92df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] No waiting events found dispatching network-vif-unplugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:14:24 np0005603623 nova_compute[226235]: 2026-01-31 08:14:24.482 226239 WARNING nova.compute.manager [req-f91ef9c1-0a22-48f6-b3a9-634f28e52e23 req-9ca50edc-c8b8-425d-9c7d-4df7737f92df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received unexpected event network-vif-unplugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 31 03:14:24 np0005603623 nova_compute[226235]: 2026-01-31 08:14:24.705 226239 DEBUG oslo_concurrency.lockutils [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:24 np0005603623 nova_compute[226235]: 2026-01-31 08:14:24.705 226239 DEBUG oslo_concurrency.lockutils [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:24 np0005603623 nova_compute[226235]: 2026-01-31 08:14:24.706 226239 DEBUG oslo_concurrency.lockutils [None req-b580fca4-b0aa-4549-bf87-a7b0125bdea7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:25.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:26.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:27 np0005603623 nova_compute[226235]: 2026-01-31 08:14:27.021 226239 DEBUG nova.compute.manager [req-23960eff-7331-4ffc-aadc-54fecf0f461f req-47bab3ef-1b00-4e2f-9a16-34dc49f03ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received event network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:14:27 np0005603623 nova_compute[226235]: 2026-01-31 08:14:27.021 226239 DEBUG oslo_concurrency.lockutils [req-23960eff-7331-4ffc-aadc-54fecf0f461f req-47bab3ef-1b00-4e2f-9a16-34dc49f03ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:27 np0005603623 nova_compute[226235]: 2026-01-31 08:14:27.021 226239 DEBUG oslo_concurrency.lockutils [req-23960eff-7331-4ffc-aadc-54fecf0f461f req-47bab3ef-1b00-4e2f-9a16-34dc49f03ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:27 np0005603623 nova_compute[226235]: 2026-01-31 08:14:27.021 226239 DEBUG oslo_concurrency.lockutils [req-23960eff-7331-4ffc-aadc-54fecf0f461f req-47bab3ef-1b00-4e2f-9a16-34dc49f03ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:27 np0005603623 nova_compute[226235]: 2026-01-31 08:14:27.021 226239 DEBUG nova.compute.manager [req-23960eff-7331-4ffc-aadc-54fecf0f461f req-47bab3ef-1b00-4e2f-9a16-34dc49f03ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] No waiting events found dispatching network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:14:27 np0005603623 nova_compute[226235]: 2026-01-31 08:14:27.022 226239 WARNING nova.compute.manager [req-23960eff-7331-4ffc-aadc-54fecf0f461f req-47bab3ef-1b00-4e2f-9a16-34dc49f03ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received unexpected event network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:14:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:14:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:27.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:14:27 np0005603623 nova_compute[226235]: 2026-01-31 08:14:27.993 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:28.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:28 np0005603623 nova_compute[226235]: 2026-01-31 08:14:28.561 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:28 np0005603623 nova_compute[226235]: 2026-01-31 08:14:28.826 226239 DEBUG nova.compute.manager [req-4df344fc-5dce-4636-8716-d2408f03970b req-d688831b-5b79-42ea-9c05-132e1703f5d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received event network-changed-db231dc0-94bd-47c5-bc4c-f139648e2cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:14:28 np0005603623 nova_compute[226235]: 2026-01-31 08:14:28.826 226239 DEBUG nova.compute.manager [req-4df344fc-5dce-4636-8716-d2408f03970b req-d688831b-5b79-42ea-9c05-132e1703f5d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Refreshing instance network info cache due to event network-changed-db231dc0-94bd-47c5-bc4c-f139648e2cfa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:14:28 np0005603623 nova_compute[226235]: 2026-01-31 08:14:28.827 226239 DEBUG oslo_concurrency.lockutils [req-4df344fc-5dce-4636-8716-d2408f03970b req-d688831b-5b79-42ea-9c05-132e1703f5d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:14:28 np0005603623 nova_compute[226235]: 2026-01-31 08:14:28.827 226239 DEBUG oslo_concurrency.lockutils [req-4df344fc-5dce-4636-8716-d2408f03970b req-d688831b-5b79-42ea-9c05-132e1703f5d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:14:28 np0005603623 nova_compute[226235]: 2026-01-31 08:14:28.827 226239 DEBUG nova.network.neutron [req-4df344fc-5dce-4636-8716-d2408f03970b req-d688831b-5b79-42ea-9c05-132e1703f5d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Refreshing network info cache for port db231dc0-94bd-47c5-bc4c-f139648e2cfa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:14:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:28 np0005603623 podman[266200]: 2026-01-31 08:14:28.96534123 +0000 UTC m=+0.045727122 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent)
Jan 31 03:14:29 np0005603623 podman[266201]: 2026-01-31 08:14:29.062176264 +0000 UTC m=+0.142684159 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:14:29 np0005603623 nova_compute[226235]: 2026-01-31 08:14:29.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:29 np0005603623 nova_compute[226235]: 2026-01-31 08:14:29.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:14:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:29.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:30.106 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:30.107 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:14:30.107 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:30.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:30 np0005603623 nova_compute[226235]: 2026-01-31 08:14:30.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:30 np0005603623 nova_compute[226235]: 2026-01-31 08:14:30.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:14:30 np0005603623 nova_compute[226235]: 2026-01-31 08:14:30.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:14:30 np0005603623 nova_compute[226235]: 2026-01-31 08:14:30.455 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:14:30 np0005603623 nova_compute[226235]: 2026-01-31 08:14:30.456 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:14:30 np0005603623 nova_compute[226235]: 2026-01-31 08:14:30.456 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:14:30 np0005603623 nova_compute[226235]: 2026-01-31 08:14:30.456 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0c37b9a9-3924-451d-bf70-c38147e26756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:14:31 np0005603623 nova_compute[226235]: 2026-01-31 08:14:31.344 226239 DEBUG nova.network.neutron [req-4df344fc-5dce-4636-8716-d2408f03970b req-d688831b-5b79-42ea-9c05-132e1703f5d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Updated VIF entry in instance network info cache for port db231dc0-94bd-47c5-bc4c-f139648e2cfa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:14:31 np0005603623 nova_compute[226235]: 2026-01-31 08:14:31.345 226239 DEBUG nova.network.neutron [req-4df344fc-5dce-4636-8716-d2408f03970b req-d688831b-5b79-42ea-9c05-132e1703f5d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Updating instance_info_cache with network_info: [{"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:14:31 np0005603623 nova_compute[226235]: 2026-01-31 08:14:31.714 226239 DEBUG oslo_concurrency.lockutils [req-4df344fc-5dce-4636-8716-d2408f03970b req-d688831b-5b79-42ea-9c05-132e1703f5d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:14:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:31.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:32.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e260 e260: 3 total, 3 up, 3 in
Jan 31 03:14:32 np0005603623 nova_compute[226235]: 2026-01-31 08:14:32.995 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:33 np0005603623 nova_compute[226235]: 2026-01-31 08:14:33.562 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:14:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:33.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:14:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:34.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:34 np0005603623 nova_compute[226235]: 2026-01-31 08:14:34.283 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Updating instance_info_cache with network_info: [{"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:14:34 np0005603623 nova_compute[226235]: 2026-01-31 08:14:34.342 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-0c37b9a9-3924-451d-bf70-c38147e26756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:14:34 np0005603623 nova_compute[226235]: 2026-01-31 08:14:34.342 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:14:34 np0005603623 nova_compute[226235]: 2026-01-31 08:14:34.343 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:35 np0005603623 nova_compute[226235]: 2026-01-31 08:14:35.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:35 np0005603623 nova_compute[226235]: 2026-01-31 08:14:35.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:35 np0005603623 nova_compute[226235]: 2026-01-31 08:14:35.281 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:35 np0005603623 nova_compute[226235]: 2026-01-31 08:14:35.281 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:35 np0005603623 nova_compute[226235]: 2026-01-31 08:14:35.281 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:35 np0005603623 nova_compute[226235]: 2026-01-31 08:14:35.281 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:14:35 np0005603623 nova_compute[226235]: 2026-01-31 08:14:35.282 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:14:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:14:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3452854122' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:14:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:35.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:35 np0005603623 nova_compute[226235]: 2026-01-31 08:14:35.747 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:14:35 np0005603623 nova_compute[226235]: 2026-01-31 08:14:35.985 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000005d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:14:35 np0005603623 nova_compute[226235]: 2026-01-31 08:14:35.986 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000005d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:14:35 np0005603623 nova_compute[226235]: 2026-01-31 08:14:35.989 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000005b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:14:35 np0005603623 nova_compute[226235]: 2026-01-31 08:14:35.989 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000005b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:14:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:36.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.155 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.157 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4423MB free_disk=20.851505279541016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.158 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.158 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.331 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration for instance e54ff9a1-d1c9-4792-a837-076e8289ee23 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.394 226239 INFO nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Updating resource usage from migration 573cb6b9-9e94-474a-9cc7-e9a6e0bcfd43#033[00m
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.394 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Starting to track outgoing migration 573cb6b9-9e94-474a-9cc7-e9a6e0bcfd43 with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.426 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0c37b9a9-3924-451d-bf70-c38147e26756 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.426 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration 573cb6b9-9e94-474a-9cc7-e9a6e0bcfd43 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.426 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.426 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.521 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:14:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:14:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1236475402' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.967 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:14:36 np0005603623 nova_compute[226235]: 2026-01-31 08:14:36.973 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.027 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.133 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.134 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.976s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.496 226239 DEBUG nova.compute.manager [req-7099b6cd-85da-4623-aab8-e8935ba8f3fc req-a8532269-cf69-45b2-bd8c-58a192583cc2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received event network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.496 226239 DEBUG oslo_concurrency.lockutils [req-7099b6cd-85da-4623-aab8-e8935ba8f3fc req-a8532269-cf69-45b2-bd8c-58a192583cc2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.497 226239 DEBUG oslo_concurrency.lockutils [req-7099b6cd-85da-4623-aab8-e8935ba8f3fc req-a8532269-cf69-45b2-bd8c-58a192583cc2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.497 226239 DEBUG oslo_concurrency.lockutils [req-7099b6cd-85da-4623-aab8-e8935ba8f3fc req-a8532269-cf69-45b2-bd8c-58a192583cc2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.497 226239 DEBUG nova.compute.manager [req-7099b6cd-85da-4623-aab8-e8935ba8f3fc req-a8532269-cf69-45b2-bd8c-58a192583cc2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] No waiting events found dispatching network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.497 226239 WARNING nova.compute.manager [req-7099b6cd-85da-4623-aab8-e8935ba8f3fc req-a8532269-cf69-45b2-bd8c-58a192583cc2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received unexpected event network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.691 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847262.6901817, e54ff9a1-d1c9-4792-a837-076e8289ee23 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.692 226239 INFO nova.compute.manager [-] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:14:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:37.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.760 226239 DEBUG nova.compute.manager [None req-7c2303f1-0ca3-45a7-971f-9a25e01ea580 - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.763 226239 DEBUG nova.compute.manager [None req-7c2303f1-0ca3-45a7-971f-9a25e01ea580 - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.980 226239 INFO nova.compute.manager [None req-7c2303f1-0ca3-45a7-971f-9a25e01ea580 - - - - - -] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 31 03:14:37 np0005603623 nova_compute[226235]: 2026-01-31 08:14:37.996 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:38.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:38 np0005603623 nova_compute[226235]: 2026-01-31 08:14:38.135 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:38 np0005603623 nova_compute[226235]: 2026-01-31 08:14:38.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:38 np0005603623 nova_compute[226235]: 2026-01-31 08:14:38.306 226239 DEBUG oslo_concurrency.lockutils [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "e54ff9a1-d1c9-4792-a837-076e8289ee23" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:38 np0005603623 nova_compute[226235]: 2026-01-31 08:14:38.307 226239 DEBUG oslo_concurrency.lockutils [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:38 np0005603623 nova_compute[226235]: 2026-01-31 08:14:38.307 226239 DEBUG nova.compute.manager [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Going to confirm migration 14 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 31 03:14:38 np0005603623 nova_compute[226235]: 2026-01-31 08:14:38.564 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:38 np0005603623 nova_compute[226235]: 2026-01-31 08:14:38.921 226239 DEBUG neutronclient.v2_0.client [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port db231dc0-94bd-47c5-bc4c-f139648e2cfa for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:14:38 np0005603623 nova_compute[226235]: 2026-01-31 08:14:38.922 226239 DEBUG oslo_concurrency.lockutils [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:14:38 np0005603623 nova_compute[226235]: 2026-01-31 08:14:38.922 226239 DEBUG oslo_concurrency.lockutils [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquired lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:14:38 np0005603623 nova_compute[226235]: 2026-01-31 08:14:38.922 226239 DEBUG nova.network.neutron [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:14:38 np0005603623 nova_compute[226235]: 2026-01-31 08:14:38.923 226239 DEBUG nova.objects.instance [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'info_cache' on Instance uuid e54ff9a1-d1c9-4792-a837-076e8289ee23 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:14:39 np0005603623 nova_compute[226235]: 2026-01-31 08:14:39.628 226239 DEBUG nova.compute.manager [req-9e65d6c2-40b1-4565-89a5-7014cbb3a06d req-818e3a51-2d9f-4444-b0a5-cc9daeacf748 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received event network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:14:39 np0005603623 nova_compute[226235]: 2026-01-31 08:14:39.629 226239 DEBUG oslo_concurrency.lockutils [req-9e65d6c2-40b1-4565-89a5-7014cbb3a06d req-818e3a51-2d9f-4444-b0a5-cc9daeacf748 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:39 np0005603623 nova_compute[226235]: 2026-01-31 08:14:39.629 226239 DEBUG oslo_concurrency.lockutils [req-9e65d6c2-40b1-4565-89a5-7014cbb3a06d req-818e3a51-2d9f-4444-b0a5-cc9daeacf748 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:39 np0005603623 nova_compute[226235]: 2026-01-31 08:14:39.630 226239 DEBUG oslo_concurrency.lockutils [req-9e65d6c2-40b1-4565-89a5-7014cbb3a06d req-818e3a51-2d9f-4444-b0a5-cc9daeacf748 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:39 np0005603623 nova_compute[226235]: 2026-01-31 08:14:39.630 226239 DEBUG nova.compute.manager [req-9e65d6c2-40b1-4565-89a5-7014cbb3a06d req-818e3a51-2d9f-4444-b0a5-cc9daeacf748 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] No waiting events found dispatching network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:14:39 np0005603623 nova_compute[226235]: 2026-01-31 08:14:39.630 226239 WARNING nova.compute.manager [req-9e65d6c2-40b1-4565-89a5-7014cbb3a06d req-818e3a51-2d9f-4444-b0a5-cc9daeacf748 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Received unexpected event network-vif-plugged-db231dc0-94bd-47c5-bc4c-f139648e2cfa for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:14:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:14:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:39.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:14:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:40.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:41 np0005603623 nova_compute[226235]: 2026-01-31 08:14:41.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:41 np0005603623 nova_compute[226235]: 2026-01-31 08:14:41.191 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:14:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:41.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:14:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:42.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:42 np0005603623 nova_compute[226235]: 2026-01-31 08:14:42.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:42 np0005603623 nova_compute[226235]: 2026-01-31 08:14:42.355 226239 DEBUG nova.network.neutron [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: e54ff9a1-d1c9-4792-a837-076e8289ee23] Updating instance_info_cache with network_info: [{"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:14:42 np0005603623 nova_compute[226235]: 2026-01-31 08:14:42.560 226239 DEBUG oslo_concurrency.lockutils [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Releasing lock "refresh_cache-e54ff9a1-d1c9-4792-a837-076e8289ee23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:14:42 np0005603623 nova_compute[226235]: 2026-01-31 08:14:42.560 226239 DEBUG nova.objects.instance [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'migration_context' on Instance uuid e54ff9a1-d1c9-4792-a837-076e8289ee23 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:14:42 np0005603623 nova_compute[226235]: 2026-01-31 08:14:42.997 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:43 np0005603623 nova_compute[226235]: 2026-01-31 08:14:43.206 226239 DEBUG nova.storage.rbd_utils [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] removing snapshot(nova-resize) on rbd image(e54ff9a1-d1c9-4792-a837-076e8289ee23_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:14:43 np0005603623 nova_compute[226235]: 2026-01-31 08:14:43.566 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:43.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:44.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e261 e261: 3 total, 3 up, 3 in
Jan 31 03:14:45 np0005603623 nova_compute[226235]: 2026-01-31 08:14:45.381 226239 DEBUG nova.virt.libvirt.vif [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:13:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-436373101',display_name='tempest-ServerDiskConfigTestJSON-server-436373101',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-436373101',id=93,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:14:35Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-0t50bqgi',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:14:36Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=e54ff9a1-d1c9-4792-a837-076e8289ee23,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:14:45 np0005603623 nova_compute[226235]: 2026-01-31 08:14:45.381 226239 DEBUG nova.network.os_vif_util [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "address": "fa:16:3e:0d:e9:1e", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb231dc0-94", "ovs_interfaceid": "db231dc0-94bd-47c5-bc4c-f139648e2cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:14:45 np0005603623 nova_compute[226235]: 2026-01-31 08:14:45.382 226239 DEBUG nova.network.os_vif_util [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=db231dc0-94bd-47c5-bc4c-f139648e2cfa,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb231dc0-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:14:45 np0005603623 nova_compute[226235]: 2026-01-31 08:14:45.382 226239 DEBUG os_vif [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=db231dc0-94bd-47c5-bc4c-f139648e2cfa,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb231dc0-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:14:45 np0005603623 nova_compute[226235]: 2026-01-31 08:14:45.384 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:45 np0005603623 nova_compute[226235]: 2026-01-31 08:14:45.384 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb231dc0-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:14:45 np0005603623 nova_compute[226235]: 2026-01-31 08:14:45.384 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:14:45 np0005603623 nova_compute[226235]: 2026-01-31 08:14:45.386 226239 INFO os_vif [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=db231dc0-94bd-47c5-bc4c-f139648e2cfa,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb231dc0-94')#033[00m
Jan 31 03:14:45 np0005603623 nova_compute[226235]: 2026-01-31 08:14:45.386 226239 DEBUG oslo_concurrency.lockutils [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:45 np0005603623 nova_compute[226235]: 2026-01-31 08:14:45.386 226239 DEBUG oslo_concurrency.lockutils [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:14:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:45.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:14:45 np0005603623 nova_compute[226235]: 2026-01-31 08:14:45.752 226239 DEBUG oslo_concurrency.processutils [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:14:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:46.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:14:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/837268704' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:14:46 np0005603623 nova_compute[226235]: 2026-01-31 08:14:46.179 226239 DEBUG oslo_concurrency.processutils [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:14:46 np0005603623 nova_compute[226235]: 2026-01-31 08:14:46.184 226239 DEBUG nova.compute.provider_tree [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:14:46 np0005603623 nova_compute[226235]: 2026-01-31 08:14:46.389 226239 DEBUG nova.scheduler.client.report [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #85. Immutable memtables: 0.
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.119017) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 51] Flushing memtable with next log file: 85
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847287119128, "job": 51, "event": "flush_started", "num_memtables": 1, "num_entries": 852, "num_deletes": 251, "total_data_size": 1557958, "memory_usage": 1575248, "flush_reason": "Manual Compaction"}
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 51] Level-0 flush table #86: started
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847287213293, "cf_name": "default", "job": 51, "event": "table_file_creation", "file_number": 86, "file_size": 1016668, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 43990, "largest_seqno": 44837, "table_properties": {"data_size": 1012588, "index_size": 1796, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9627, "raw_average_key_size": 20, "raw_value_size": 1004193, "raw_average_value_size": 2100, "num_data_blocks": 77, "num_entries": 478, "num_filter_entries": 478, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847238, "oldest_key_time": 1769847238, "file_creation_time": 1769847287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 86, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 51] Flush lasted 94520 microseconds, and 5464 cpu microseconds.
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.213464) [db/flush_job.cc:967] [default] [JOB 51] Level-0 flush table #86: 1016668 bytes OK
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.213728) [db/memtable_list.cc:519] [default] Level-0 commit table #86 started
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.222258) [db/memtable_list.cc:722] [default] Level-0 commit table #86: memtable #1 done
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.222288) EVENT_LOG_v1 {"time_micros": 1769847287222279, "job": 51, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.222312) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 51] Try to delete WAL files size 1553531, prev total WAL file size 1553531, number of live WAL files 2.
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000082.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.223309) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033353134' seq:72057594037927935, type:22 .. '7061786F730033373636' seq:0, type:0; will stop at (end)
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 52] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 51 Base level 0, inputs: [86(992KB)], [84(10MB)]
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847287223375, "job": 52, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [86], "files_L6": [84], "score": -1, "input_data_size": 11592118, "oldest_snapshot_seqno": -1}
Jan 31 03:14:47 np0005603623 nova_compute[226235]: 2026-01-31 08:14:47.332 226239 DEBUG oslo_concurrency.lockutils [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 1.946s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 52] Generated table #87: 6748 keys, 9709042 bytes, temperature: kUnknown
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847287437702, "cf_name": "default", "job": 52, "event": "table_file_creation", "file_number": 87, "file_size": 9709042, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9664728, "index_size": 26346, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16901, "raw_key_size": 175006, "raw_average_key_size": 25, "raw_value_size": 9544861, "raw_average_value_size": 1414, "num_data_blocks": 1039, "num_entries": 6748, "num_filter_entries": 6748, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769847287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 87, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.438705) [db/compaction/compaction_job.cc:1663] [default] [JOB 52] Compacted 1@0 + 1@6 files to L6 => 9709042 bytes
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.444340) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 53.9 rd, 45.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 10.1 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(21.0) write-amplify(9.5) OK, records in: 7271, records dropped: 523 output_compression: NoCompression
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.444396) EVENT_LOG_v1 {"time_micros": 1769847287444374, "job": 52, "event": "compaction_finished", "compaction_time_micros": 215016, "compaction_time_cpu_micros": 18005, "output_level": 6, "num_output_files": 1, "total_output_size": 9709042, "num_input_records": 7271, "num_output_records": 6748, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000086.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847287445403, "job": 52, "event": "table_file_deletion", "file_number": 86}
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000084.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847287448178, "job": 52, "event": "table_file_deletion", "file_number": 84}
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.223244) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.448249) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.448254) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.448256) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.448258) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:14:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:14:47.448260) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:14:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:47.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:47 np0005603623 nova_compute[226235]: 2026-01-31 08:14:47.757 226239 INFO nova.scheduler.client.report [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Deleted allocation for migration 573cb6b9-9e94-474a-9cc7-e9a6e0bcfd43#033[00m
Jan 31 03:14:47 np0005603623 nova_compute[226235]: 2026-01-31 08:14:47.999 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:14:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:48.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:14:48 np0005603623 nova_compute[226235]: 2026-01-31 08:14:48.278 226239 DEBUG oslo_concurrency.lockutils [None req-07ade60f-6053-485a-893e-d0114374c2c5 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "e54ff9a1-d1c9-4792-a837-076e8289ee23" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 9.971s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:48 np0005603623 nova_compute[226235]: 2026-01-31 08:14:48.568 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:49.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:50.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:51.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:52.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:53 np0005603623 nova_compute[226235]: 2026-01-31 08:14:53.000 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:53 np0005603623 nova_compute[226235]: 2026-01-31 08:14:53.570 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:53.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 e262: 3 total, 3 up, 3 in
Jan 31 03:14:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:14:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:54.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:14:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:55.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:56.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:14:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:57.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:14:58 np0005603623 nova_compute[226235]: 2026-01-31 08:14:58.001 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:58.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:58 np0005603623 nova_compute[226235]: 2026-01-31 08:14:58.572 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:14:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:59.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:14:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1241434426' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:14:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:14:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1241434426' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:14:59 np0005603623 podman[266413]: 2026-01-31 08:14:59.960488967 +0000 UTC m=+0.048324504 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 03:14:59 np0005603623 podman[266414]: 2026-01-31 08:14:59.982253913 +0000 UTC m=+0.070089840 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 31 03:15:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:00.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:01.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:02.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:03 np0005603623 nova_compute[226235]: 2026-01-31 08:15:03.003 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:03 np0005603623 nova_compute[226235]: 2026-01-31 08:15:03.575 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:03.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:04.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:05.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:06.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:07.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:08 np0005603623 nova_compute[226235]: 2026-01-31 08:15:08.005 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:08.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:08 np0005603623 nova_compute[226235]: 2026-01-31 08:15:08.578 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:09.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:10.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:10 np0005603623 nova_compute[226235]: 2026-01-31 08:15:10.824 226239 DEBUG oslo_concurrency.lockutils [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "0c37b9a9-3924-451d-bf70-c38147e26756" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:10 np0005603623 nova_compute[226235]: 2026-01-31 08:15:10.825 226239 DEBUG oslo_concurrency.lockutils [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:10 np0005603623 nova_compute[226235]: 2026-01-31 08:15:10.825 226239 DEBUG oslo_concurrency.lockutils [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:10 np0005603623 nova_compute[226235]: 2026-01-31 08:15:10.826 226239 DEBUG oslo_concurrency.lockutils [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:10 np0005603623 nova_compute[226235]: 2026-01-31 08:15:10.826 226239 DEBUG oslo_concurrency.lockutils [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:10 np0005603623 nova_compute[226235]: 2026-01-31 08:15:10.827 226239 INFO nova.compute.manager [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Terminating instance#033[00m
Jan 31 03:15:10 np0005603623 nova_compute[226235]: 2026-01-31 08:15:10.830 226239 DEBUG nova.compute.manager [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:15:11 np0005603623 kernel: tapabdf877d-77 (unregistering): left promiscuous mode
Jan 31 03:15:11 np0005603623 NetworkManager[48970]: <info>  [1769847311.1103] device (tapabdf877d-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:15:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:15:11Z|00381|binding|INFO|Releasing lport abdf877d-771f-4148-a98f-c7e8319f044c from this chassis (sb_readonly=0)
Jan 31 03:15:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:15:11Z|00382|binding|INFO|Setting lport abdf877d-771f-4148-a98f-c7e8319f044c down in Southbound
Jan 31 03:15:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:15:11Z|00383|binding|INFO|Removing iface tapabdf877d-77 ovn-installed in OVS
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.119 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.125 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:11 np0005603623 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000005b.scope: Deactivated successfully.
Jan 31 03:15:11 np0005603623 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000005b.scope: Consumed 17.550s CPU time.
Jan 31 03:15:11 np0005603623 systemd-machined[194379]: Machine qemu-40-instance-0000005b terminated.
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.250 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.254 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.263 226239 INFO nova.virt.libvirt.driver [-] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Instance destroyed successfully.#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.264 226239 DEBUG nova.objects.instance [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'resources' on Instance uuid 0c37b9a9-3924-451d-bf70-c38147e26756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:15:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:11.318 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:4e:c3 10.100.0.14'], port_security=['fa:16:3e:86:4e:c3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '0c37b9a9-3924-451d-bf70-c38147e26756', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=abdf877d-771f-4148-a98f-c7e8319f044c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:15:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:11.319 143258 INFO neutron.agent.ovn.metadata.agent [-] Port abdf877d-771f-4148-a98f-c7e8319f044c in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 unbound from our chassis#033[00m
Jan 31 03:15:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:11.321 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1f564452-5f08-4a1c-921e-f2daee9ec936, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:15:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:11.322 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1e1283-6245-47a3-b5ac-4d4a502fd9bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:11.323 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 namespace which is not needed anymore#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.434 226239 DEBUG nova.virt.libvirt.vif [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:12:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-419177014',display_name='tempest-ServerActionsTestOtherA-server-419177014',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-419177014',id=91,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:12:46Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-0y3yxgdv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:12:46Z,user_data=None,user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=0c37b9a9-3924-451d-bf70-c38147e26756,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.435 226239 DEBUG nova.network.os_vif_util [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "abdf877d-771f-4148-a98f-c7e8319f044c", "address": "fa:16:3e:86:4e:c3", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabdf877d-77", "ovs_interfaceid": "abdf877d-771f-4148-a98f-c7e8319f044c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.436 226239 DEBUG nova.network.os_vif_util [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:4e:c3,bridge_name='br-int',has_traffic_filtering=True,id=abdf877d-771f-4148-a98f-c7e8319f044c,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabdf877d-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.436 226239 DEBUG os_vif [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:4e:c3,bridge_name='br-int',has_traffic_filtering=True,id=abdf877d-771f-4148-a98f-c7e8319f044c,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabdf877d-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.437 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.437 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapabdf877d-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.439 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.440 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:11 np0005603623 nova_compute[226235]: 2026-01-31 08:15:11.442 226239 INFO os_vif [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:4e:c3,bridge_name='br-int',has_traffic_filtering=True,id=abdf877d-771f-4148-a98f-c7e8319f044c,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabdf877d-77')#033[00m
Jan 31 03:15:11 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[264669]: [NOTICE]   (264673) : haproxy version is 2.8.14-c23fe91
Jan 31 03:15:11 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[264669]: [NOTICE]   (264673) : path to executable is /usr/sbin/haproxy
Jan 31 03:15:11 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[264669]: [WARNING]  (264673) : Exiting Master process...
Jan 31 03:15:11 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[264669]: [WARNING]  (264673) : Exiting Master process...
Jan 31 03:15:11 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[264669]: [ALERT]    (264673) : Current worker (264675) exited with code 143 (Terminated)
Jan 31 03:15:11 np0005603623 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[264669]: [WARNING]  (264673) : All workers exited. Exiting... (0)
Jan 31 03:15:11 np0005603623 systemd[1]: libpod-97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5.scope: Deactivated successfully.
Jan 31 03:15:11 np0005603623 podman[266548]: 2026-01-31 08:15:11.55525974 +0000 UTC m=+0.166427376 container died 97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:15:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:11.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:11 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5-userdata-shm.mount: Deactivated successfully.
Jan 31 03:15:11 np0005603623 systemd[1]: var-lib-containers-storage-overlay-d4976bef030d97eb0af9a8bb9c52a63c8e4a0537d0d721956bba49fdf9a93b67-merged.mount: Deactivated successfully.
Jan 31 03:15:12 np0005603623 podman[266548]: 2026-01-31 08:15:12.029976366 +0000 UTC m=+0.641144002 container cleanup 97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:15:12 np0005603623 podman[266597]: 2026-01-31 08:15:12.092775135 +0000 UTC m=+0.044069300 container remove 97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 03:15:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:12.097 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[262e5ea3-439e-49cf-8d52-9dde8d31d624]: (4, ('Sat Jan 31 08:15:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 (97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5)\n97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5\nSat Jan 31 08:15:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 (97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5)\n97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:12.101 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e0013aae-92e9-49aa-b8f2-98319764a2d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:12.103 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:12 np0005603623 nova_compute[226235]: 2026-01-31 08:15:12.141 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:12 np0005603623 kernel: tap1f564452-50: left promiscuous mode
Jan 31 03:15:12 np0005603623 systemd[1]: libpod-conmon-97c881b4d987fa67d1935531b2fc9481e80ea4baf4df8ee284592597348d20a5.scope: Deactivated successfully.
Jan 31 03:15:12 np0005603623 nova_compute[226235]: 2026-01-31 08:15:12.148 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:12.152 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2a6719e8-54c8-4d44-ab65-9271a1772b7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:12.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:12.173 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[edc4c48b-db9a-4ab5-919d-1b71dd880416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:12.175 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[506c6f33-6259-4136-a17d-da2704e34695]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:12.192 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5c1e7618-092b-49b4-8ac1-e15dd435bca2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630626, 'reachable_time': 42915, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266612, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:12.197 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:15:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:12.197 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[12d3f9b9-dcef-40c1-84f0-1d786deb76bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:12 np0005603623 systemd[1]: run-netns-ovnmeta\x2d1f564452\x2d5f08\x2d4a1c\x2d921e\x2df2daee9ec936.mount: Deactivated successfully.
Jan 31 03:15:12 np0005603623 nova_compute[226235]: 2026-01-31 08:15:12.333 226239 INFO nova.virt.libvirt.driver [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Deleting instance files /var/lib/nova/instances/0c37b9a9-3924-451d-bf70-c38147e26756_del#033[00m
Jan 31 03:15:12 np0005603623 nova_compute[226235]: 2026-01-31 08:15:12.334 226239 INFO nova.virt.libvirt.driver [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Deletion of /var/lib/nova/instances/0c37b9a9-3924-451d-bf70-c38147e26756_del complete#033[00m
Jan 31 03:15:12 np0005603623 nova_compute[226235]: 2026-01-31 08:15:12.905 226239 INFO nova.compute.manager [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Took 2.07 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:15:12 np0005603623 nova_compute[226235]: 2026-01-31 08:15:12.906 226239 DEBUG oslo.service.loopingcall [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:15:12 np0005603623 nova_compute[226235]: 2026-01-31 08:15:12.906 226239 DEBUG nova.compute.manager [-] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:15:12 np0005603623 nova_compute[226235]: 2026-01-31 08:15:12.906 226239 DEBUG nova.network.neutron [-] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:15:13 np0005603623 nova_compute[226235]: 2026-01-31 08:15:13.271 226239 DEBUG nova.compute.manager [req-2b80f813-6830-491b-947a-33d3b67867a7 req-0d7bf247-5fab-4c03-aa5e-3f37f8b75122 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Received event network-vif-unplugged-abdf877d-771f-4148-a98f-c7e8319f044c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:13 np0005603623 nova_compute[226235]: 2026-01-31 08:15:13.271 226239 DEBUG oslo_concurrency.lockutils [req-2b80f813-6830-491b-947a-33d3b67867a7 req-0d7bf247-5fab-4c03-aa5e-3f37f8b75122 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:13 np0005603623 nova_compute[226235]: 2026-01-31 08:15:13.271 226239 DEBUG oslo_concurrency.lockutils [req-2b80f813-6830-491b-947a-33d3b67867a7 req-0d7bf247-5fab-4c03-aa5e-3f37f8b75122 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:13 np0005603623 nova_compute[226235]: 2026-01-31 08:15:13.272 226239 DEBUG oslo_concurrency.lockutils [req-2b80f813-6830-491b-947a-33d3b67867a7 req-0d7bf247-5fab-4c03-aa5e-3f37f8b75122 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:13 np0005603623 nova_compute[226235]: 2026-01-31 08:15:13.272 226239 DEBUG nova.compute.manager [req-2b80f813-6830-491b-947a-33d3b67867a7 req-0d7bf247-5fab-4c03-aa5e-3f37f8b75122 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] No waiting events found dispatching network-vif-unplugged-abdf877d-771f-4148-a98f-c7e8319f044c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:15:13 np0005603623 nova_compute[226235]: 2026-01-31 08:15:13.272 226239 DEBUG nova.compute.manager [req-2b80f813-6830-491b-947a-33d3b67867a7 req-0d7bf247-5fab-4c03-aa5e-3f37f8b75122 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Received event network-vif-unplugged-abdf877d-771f-4148-a98f-c7e8319f044c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:15:13 np0005603623 nova_compute[226235]: 2026-01-31 08:15:13.580 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:13.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:14.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:14.528 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:15:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:14.529 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:15:14 np0005603623 nova_compute[226235]: 2026-01-31 08:15:14.528 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:15.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:15 np0005603623 nova_compute[226235]: 2026-01-31 08:15:15.884 226239 DEBUG nova.compute.manager [req-80f75fc0-cd13-4b11-b2b2-e92381b64e52 req-07e608ac-9ba9-475e-bbdc-0ca4fab439d3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Received event network-vif-plugged-abdf877d-771f-4148-a98f-c7e8319f044c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:15 np0005603623 nova_compute[226235]: 2026-01-31 08:15:15.884 226239 DEBUG oslo_concurrency.lockutils [req-80f75fc0-cd13-4b11-b2b2-e92381b64e52 req-07e608ac-9ba9-475e-bbdc-0ca4fab439d3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:15 np0005603623 nova_compute[226235]: 2026-01-31 08:15:15.884 226239 DEBUG oslo_concurrency.lockutils [req-80f75fc0-cd13-4b11-b2b2-e92381b64e52 req-07e608ac-9ba9-475e-bbdc-0ca4fab439d3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:15 np0005603623 nova_compute[226235]: 2026-01-31 08:15:15.884 226239 DEBUG oslo_concurrency.lockutils [req-80f75fc0-cd13-4b11-b2b2-e92381b64e52 req-07e608ac-9ba9-475e-bbdc-0ca4fab439d3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:15 np0005603623 nova_compute[226235]: 2026-01-31 08:15:15.885 226239 DEBUG nova.compute.manager [req-80f75fc0-cd13-4b11-b2b2-e92381b64e52 req-07e608ac-9ba9-475e-bbdc-0ca4fab439d3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] No waiting events found dispatching network-vif-plugged-abdf877d-771f-4148-a98f-c7e8319f044c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:15:15 np0005603623 nova_compute[226235]: 2026-01-31 08:15:15.885 226239 WARNING nova.compute.manager [req-80f75fc0-cd13-4b11-b2b2-e92381b64e52 req-07e608ac-9ba9-475e-bbdc-0ca4fab439d3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Received unexpected event network-vif-plugged-abdf877d-771f-4148-a98f-c7e8319f044c for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:15:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:15:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2649828403' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:15:16 np0005603623 nova_compute[226235]: 2026-01-31 08:15:16.014 226239 DEBUG nova.network.neutron [-] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:15:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:16.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:15:16 np0005603623 nova_compute[226235]: 2026-01-31 08:15:16.438 226239 INFO nova.compute.manager [-] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Took 3.53 seconds to deallocate network for instance.#033[00m
Jan 31 03:15:16 np0005603623 nova_compute[226235]: 2026-01-31 08:15:16.441 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:16.531 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:16 np0005603623 nova_compute[226235]: 2026-01-31 08:15:16.757 226239 DEBUG oslo_concurrency.lockutils [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:16 np0005603623 nova_compute[226235]: 2026-01-31 08:15:16.758 226239 DEBUG oslo_concurrency.lockutils [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:16 np0005603623 nova_compute[226235]: 2026-01-31 08:15:16.874 226239 DEBUG oslo_concurrency.processutils [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:15:17 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1298308665' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:15:17 np0005603623 nova_compute[226235]: 2026-01-31 08:15:17.303 226239 DEBUG oslo_concurrency.processutils [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:17 np0005603623 nova_compute[226235]: 2026-01-31 08:15:17.309 226239 DEBUG nova.compute.provider_tree [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:15:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:17.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:15:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:15:17 np0005603623 nova_compute[226235]: 2026-01-31 08:15:17.936 226239 DEBUG nova.scheduler.client.report [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:15:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:15:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:18.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:15:18 np0005603623 nova_compute[226235]: 2026-01-31 08:15:18.174 226239 DEBUG oslo_concurrency.lockutils [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:18 np0005603623 nova_compute[226235]: 2026-01-31 08:15:18.216 226239 DEBUG nova.compute.manager [req-4f38eb67-c523-409e-a7fc-99f4c4e840cc req-7f8d6375-81cd-4a08-a424-4acb839fbcfc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Received event network-vif-deleted-abdf877d-771f-4148-a98f-c7e8319f044c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:18 np0005603623 nova_compute[226235]: 2026-01-31 08:15:18.360 226239 INFO nova.scheduler.client.report [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Deleted allocations for instance 0c37b9a9-3924-451d-bf70-c38147e26756#033[00m
Jan 31 03:15:18 np0005603623 nova_compute[226235]: 2026-01-31 08:15:18.583 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:18 np0005603623 nova_compute[226235]: 2026-01-31 08:15:18.845 226239 DEBUG oslo_concurrency.lockutils [None req-1f67e955-0dc7-4287-a150-4a3c12519c30 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "0c37b9a9-3924-451d-bf70-c38147e26756" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.020s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:15:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:19.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:20.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:21 np0005603623 nova_compute[226235]: 2026-01-31 08:15:21.443 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:21.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:22.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:23 np0005603623 nova_compute[226235]: 2026-01-31 08:15:23.584 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:23.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:24.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:15:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:15:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:25.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:26.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:26 np0005603623 nova_compute[226235]: 2026-01-31 08:15:26.263 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847311.2619765, 0c37b9a9-3924-451d-bf70-c38147e26756 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:15:26 np0005603623 nova_compute[226235]: 2026-01-31 08:15:26.263 226239 INFO nova.compute.manager [-] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:15:26 np0005603623 nova_compute[226235]: 2026-01-31 08:15:26.358 226239 DEBUG nova.compute.manager [None req-a0a35c55-f3e7-4e70-9517-96ae5b0c9112 - - - - - -] [instance: 0c37b9a9-3924-451d-bf70-c38147e26756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:15:26 np0005603623 nova_compute[226235]: 2026-01-31 08:15:26.445 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:27.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:28.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:28 np0005603623 nova_compute[226235]: 2026-01-31 08:15:28.644 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:29 np0005603623 nova_compute[226235]: 2026-01-31 08:15:29.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:29 np0005603623 nova_compute[226235]: 2026-01-31 08:15:29.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:15:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:29.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:30.107 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:30.108 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:15:30.108 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:30.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:30 np0005603623 podman[266876]: 2026-01-31 08:15:30.954121413 +0000 UTC m=+0.047519589 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:15:30 np0005603623 podman[266877]: 2026-01-31 08:15:30.975406214 +0000 UTC m=+0.066522708 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:15:31 np0005603623 nova_compute[226235]: 2026-01-31 08:15:31.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:31 np0005603623 nova_compute[226235]: 2026-01-31 08:15:31.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:15:31 np0005603623 nova_compute[226235]: 2026-01-31 08:15:31.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:15:31 np0005603623 nova_compute[226235]: 2026-01-31 08:15:31.447 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:31 np0005603623 nova_compute[226235]: 2026-01-31 08:15:31.680 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:15:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:31.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:32.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:33 np0005603623 nova_compute[226235]: 2026-01-31 08:15:33.645 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:33.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:34 np0005603623 nova_compute[226235]: 2026-01-31 08:15:34.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:34.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:35.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:36 np0005603623 nova_compute[226235]: 2026-01-31 08:15:36.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:36.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:36 np0005603623 nova_compute[226235]: 2026-01-31 08:15:36.486 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:37 np0005603623 nova_compute[226235]: 2026-01-31 08:15:37.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:37 np0005603623 nova_compute[226235]: 2026-01-31 08:15:37.213 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:37 np0005603623 nova_compute[226235]: 2026-01-31 08:15:37.214 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:37 np0005603623 nova_compute[226235]: 2026-01-31 08:15:37.214 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:37 np0005603623 nova_compute[226235]: 2026-01-31 08:15:37.214 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:15:37 np0005603623 nova_compute[226235]: 2026-01-31 08:15:37.214 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:15:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1203290210' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:15:37 np0005603623 nova_compute[226235]: 2026-01-31 08:15:37.614 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.399s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:37 np0005603623 nova_compute[226235]: 2026-01-31 08:15:37.751 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:15:37 np0005603623 nova_compute[226235]: 2026-01-31 08:15:37.752 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4620MB free_disk=20.967376708984375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:15:37 np0005603623 nova_compute[226235]: 2026-01-31 08:15:37.752 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:37 np0005603623 nova_compute[226235]: 2026-01-31 08:15:37.752 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:37.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.050 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.050 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.076 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.147 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.147 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.164 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:15:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:38.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.207 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.235 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:15:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2927116313' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.628 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.393s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.635 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.647 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.661 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.718 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:15:38 np0005603623 nova_compute[226235]: 2026-01-31 08:15:38.719 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.966s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:39 np0005603623 nova_compute[226235]: 2026-01-31 08:15:39.587 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:39 np0005603623 nova_compute[226235]: 2026-01-31 08:15:39.691 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:39 np0005603623 nova_compute[226235]: 2026-01-31 08:15:39.714 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:39 np0005603623 nova_compute[226235]: 2026-01-31 08:15:39.714 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:39.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:40.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:41 np0005603623 nova_compute[226235]: 2026-01-31 08:15:41.488 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:41.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:42 np0005603623 nova_compute[226235]: 2026-01-31 08:15:42.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:42 np0005603623 nova_compute[226235]: 2026-01-31 08:15:42.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:42 np0005603623 nova_compute[226235]: 2026-01-31 08:15:42.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:15:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:42.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:43 np0005603623 nova_compute[226235]: 2026-01-31 08:15:43.649 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:43.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:44 np0005603623 nova_compute[226235]: 2026-01-31 08:15:44.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:44 np0005603623 nova_compute[226235]: 2026-01-31 08:15:44.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:44.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:15:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:45.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:15:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:46.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:46 np0005603623 nova_compute[226235]: 2026-01-31 08:15:46.491 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:47.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:48.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:48 np0005603623 nova_compute[226235]: 2026-01-31 08:15:48.651 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:49.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:50.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:51 np0005603623 nova_compute[226235]: 2026-01-31 08:15:51.492 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:51.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:52.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:53 np0005603623 nova_compute[226235]: 2026-01-31 08:15:53.653 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:15:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:53.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:15:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:54.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:54 np0005603623 nova_compute[226235]: 2026-01-31 08:15:54.378 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:54 np0005603623 nova_compute[226235]: 2026-01-31 08:15:54.379 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:15:54 np0005603623 nova_compute[226235]: 2026-01-31 08:15:54.545 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:15:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:55.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:56.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:56 np0005603623 nova_compute[226235]: 2026-01-31 08:15:56.495 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:57.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:58.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:58 np0005603623 nova_compute[226235]: 2026-01-31 08:15:58.654 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:15:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:59.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:00.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:01 np0005603623 nova_compute[226235]: 2026-01-31 08:16:01.497 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:01.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:01 np0005603623 podman[267031]: 2026-01-31 08:16:01.951911242 +0000 UTC m=+0.046439045 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:16:01 np0005603623 podman[267032]: 2026-01-31 08:16:01.97211902 +0000 UTC m=+0.064764963 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 03:16:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:16:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:02.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:16:03 np0005603623 nova_compute[226235]: 2026-01-31 08:16:03.655 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:03.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:04.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:16:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:05.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:16:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:06.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:06 np0005603623 nova_compute[226235]: 2026-01-31 08:16:06.499 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:07.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:08.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:08 np0005603623 nova_compute[226235]: 2026-01-31 08:16:08.702 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:09.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:10.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:11 np0005603623 nova_compute[226235]: 2026-01-31 08:16:11.502 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:11.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:12.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:13 np0005603623 nova_compute[226235]: 2026-01-31 08:16:13.703 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:13.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:14.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:16:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3750496990' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:16:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:16:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3750496990' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:16:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:15.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:16 np0005603623 nova_compute[226235]: 2026-01-31 08:16:16.075 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Acquiring lock "b5409d3b-1f0a-476f-b537-91e30f6543fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:16 np0005603623 nova_compute[226235]: 2026-01-31 08:16:16.075 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:16.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:16 np0005603623 nova_compute[226235]: 2026-01-31 08:16:16.504 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:16 np0005603623 nova_compute[226235]: 2026-01-31 08:16:16.517 226239 DEBUG nova.compute.manager [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:16:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:16:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:17.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:16:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:18.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:18 np0005603623 nova_compute[226235]: 2026-01-31 08:16:18.694 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:18 np0005603623 nova_compute[226235]: 2026-01-31 08:16:18.694 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:18 np0005603623 nova_compute[226235]: 2026-01-31 08:16:18.703 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:18 np0005603623 nova_compute[226235]: 2026-01-31 08:16:18.707 226239 DEBUG nova.virt.hardware [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:16:18 np0005603623 nova_compute[226235]: 2026-01-31 08:16:18.707 226239 INFO nova.compute.claims [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:16:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:19.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:20.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:20 np0005603623 nova_compute[226235]: 2026-01-31 08:16:20.743 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:16:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1130040401' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:16:21 np0005603623 nova_compute[226235]: 2026-01-31 08:16:21.151 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:21 np0005603623 nova_compute[226235]: 2026-01-31 08:16:21.157 226239 DEBUG nova.compute.provider_tree [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:16:21 np0005603623 nova_compute[226235]: 2026-01-31 08:16:21.289 226239 DEBUG nova.scheduler.client.report [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:16:21 np0005603623 nova_compute[226235]: 2026-01-31 08:16:21.507 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:21.660 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:16:21 np0005603623 nova_compute[226235]: 2026-01-31 08:16:21.660 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:21.661 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:16:21 np0005603623 nova_compute[226235]: 2026-01-31 08:16:21.712 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:21 np0005603623 nova_compute[226235]: 2026-01-31 08:16:21.712 226239 DEBUG nova.compute.manager [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:16:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:16:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:21.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:16:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:22.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:23 np0005603623 nova_compute[226235]: 2026-01-31 08:16:23.705 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:16:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:23.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:16:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:24.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:25.664 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:25.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:26.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:26 np0005603623 nova_compute[226235]: 2026-01-31 08:16:26.509 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:16:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:27.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:16:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:28.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:28 np0005603623 nova_compute[226235]: 2026-01-31 08:16:28.706 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:29 np0005603623 nova_compute[226235]: 2026-01-31 08:16:29.320 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:29 np0005603623 nova_compute[226235]: 2026-01-31 08:16:29.321 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:16:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:16:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:16:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:16:29 np0005603623 ovn_controller[133449]: 2026-01-31T08:16:29Z|00384|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Jan 31 03:16:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:29.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:30.109 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:30.109 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:30.109 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:30.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:31 np0005603623 nova_compute[226235]: 2026-01-31 08:16:31.093 226239 DEBUG nova.compute.manager [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:16:31 np0005603623 nova_compute[226235]: 2026-01-31 08:16:31.093 226239 DEBUG nova.network.neutron [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:16:31 np0005603623 nova_compute[226235]: 2026-01-31 08:16:31.511 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:31 np0005603623 nova_compute[226235]: 2026-01-31 08:16:31.631 226239 DEBUG nova.policy [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '684cece6993b4999be175313a51a3eac', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c1adda10526412b8a61b51f1d5d947a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:16:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:16:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:31.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:16:31 np0005603623 nova_compute[226235]: 2026-01-31 08:16:31.889 226239 INFO nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:16:32 np0005603623 nova_compute[226235]: 2026-01-31 08:16:32.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:32 np0005603623 nova_compute[226235]: 2026-01-31 08:16:32.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:16:32 np0005603623 nova_compute[226235]: 2026-01-31 08:16:32.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:16:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:32.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:32 np0005603623 podman[267346]: 2026-01-31 08:16:32.945067607 +0000 UTC m=+0.044362879 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:16:32 np0005603623 podman[267347]: 2026-01-31 08:16:32.960875946 +0000 UTC m=+0.059148795 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 31 03:16:33 np0005603623 nova_compute[226235]: 2026-01-31 08:16:33.293 226239 DEBUG nova.compute.manager [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:16:33 np0005603623 nova_compute[226235]: 2026-01-31 08:16:33.600 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:16:33 np0005603623 nova_compute[226235]: 2026-01-31 08:16:33.600 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:16:33 np0005603623 nova_compute[226235]: 2026-01-31 08:16:33.707 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:33.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.194 226239 DEBUG nova.compute.manager [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.196 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.196 226239 INFO nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Creating image(s)#033[00m
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.216 226239 DEBUG nova.storage.rbd_utils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] rbd image b5409d3b-1f0a-476f-b537-91e30f6543fd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.238 226239 DEBUG nova.storage.rbd_utils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] rbd image b5409d3b-1f0a-476f-b537-91e30f6543fd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:16:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:34.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.266 226239 DEBUG nova.storage.rbd_utils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] rbd image b5409d3b-1f0a-476f-b537-91e30f6543fd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.271 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.322 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.323 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.324 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.324 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.482 226239 DEBUG nova.storage.rbd_utils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] rbd image b5409d3b-1f0a-476f-b537-91e30f6543fd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:16:34 np0005603623 nova_compute[226235]: 2026-01-31 08:16:34.485 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 b5409d3b-1f0a-476f-b537-91e30f6543fd_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:34 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:16:34 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:16:35 np0005603623 nova_compute[226235]: 2026-01-31 08:16:35.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:16:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:35.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:16:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:36.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:36 np0005603623 nova_compute[226235]: 2026-01-31 08:16:36.513 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:37 np0005603623 nova_compute[226235]: 2026-01-31 08:16:37.137 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 b5409d3b-1f0a-476f-b537-91e30f6543fd_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.652s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:37 np0005603623 nova_compute[226235]: 2026-01-31 08:16:37.170 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:37 np0005603623 nova_compute[226235]: 2026-01-31 08:16:37.206 226239 DEBUG nova.storage.rbd_utils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] resizing rbd image b5409d3b-1f0a-476f-b537-91e30f6543fd_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:16:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:16:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:37.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:16:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:38.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:38 np0005603623 nova_compute[226235]: 2026-01-31 08:16:38.436 226239 DEBUG nova.objects.instance [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lazy-loading 'migration_context' on Instance uuid b5409d3b-1f0a-476f-b537-91e30f6543fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:16:38 np0005603623 nova_compute[226235]: 2026-01-31 08:16:38.709 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:38 np0005603623 nova_compute[226235]: 2026-01-31 08:16:38.974 226239 DEBUG nova.network.neutron [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Successfully created port: 57231250-645c-4a66-ac22-f67044a76f7e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:16:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:16:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:39.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.221 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.221 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.222 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.222 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.222 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:40.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.296 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.297 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Ensure instance console log exists: /var/lib/nova/instances/b5409d3b-1f0a-476f-b537-91e30f6543fd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.297 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.298 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.298 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:16:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2966267338' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.646 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.760 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.761 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4590MB free_disk=20.96734619140625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.762 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:40 np0005603623 nova_compute[226235]: 2026-01-31 08:16:40.762 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:41 np0005603623 nova_compute[226235]: 2026-01-31 08:16:41.515 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:41 np0005603623 nova_compute[226235]: 2026-01-31 08:16:41.745 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance b5409d3b-1f0a-476f-b537-91e30f6543fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:16:41 np0005603623 nova_compute[226235]: 2026-01-31 08:16:41.746 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:16:41 np0005603623 nova_compute[226235]: 2026-01-31 08:16:41.746 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:16:41 np0005603623 nova_compute[226235]: 2026-01-31 08:16:41.780 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:16:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:41.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:16:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:16:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3642190950' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:16:42 np0005603623 nova_compute[226235]: 2026-01-31 08:16:42.216 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:42 np0005603623 nova_compute[226235]: 2026-01-31 08:16:42.220 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:16:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:42.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:42 np0005603623 nova_compute[226235]: 2026-01-31 08:16:42.662 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:16:42 np0005603623 nova_compute[226235]: 2026-01-31 08:16:42.875 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:16:42 np0005603623 nova_compute[226235]: 2026-01-31 08:16:42.875 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:43 np0005603623 nova_compute[226235]: 2026-01-31 08:16:43.710 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:43 np0005603623 nova_compute[226235]: 2026-01-31 08:16:43.859 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:43 np0005603623 nova_compute[226235]: 2026-01-31 08:16:43.859 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:43.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:43 np0005603623 nova_compute[226235]: 2026-01-31 08:16:43.976 226239 DEBUG nova.network.neutron [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Successfully updated port: 57231250-645c-4a66-ac22-f67044a76f7e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:16:44 np0005603623 nova_compute[226235]: 2026-01-31 08:16:44.016 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:44 np0005603623 nova_compute[226235]: 2026-01-31 08:16:44.016 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:44 np0005603623 nova_compute[226235]: 2026-01-31 08:16:44.016 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:44.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:44 np0005603623 nova_compute[226235]: 2026-01-31 08:16:44.282 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Acquiring lock "refresh_cache-b5409d3b-1f0a-476f-b537-91e30f6543fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:16:44 np0005603623 nova_compute[226235]: 2026-01-31 08:16:44.283 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Acquired lock "refresh_cache-b5409d3b-1f0a-476f-b537-91e30f6543fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:16:44 np0005603623 nova_compute[226235]: 2026-01-31 08:16:44.283 226239 DEBUG nova.network.neutron [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:16:44 np0005603623 nova_compute[226235]: 2026-01-31 08:16:44.361 226239 DEBUG nova.compute.manager [req-ede956f8-62ec-40eb-8afa-14e6d7f97d55 req-c3c5a779-3d1b-4dad-aa28-a56a832c637b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Received event network-changed-57231250-645c-4a66-ac22-f67044a76f7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:44 np0005603623 nova_compute[226235]: 2026-01-31 08:16:44.361 226239 DEBUG nova.compute.manager [req-ede956f8-62ec-40eb-8afa-14e6d7f97d55 req-c3c5a779-3d1b-4dad-aa28-a56a832c637b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Refreshing instance network info cache due to event network-changed-57231250-645c-4a66-ac22-f67044a76f7e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:16:44 np0005603623 nova_compute[226235]: 2026-01-31 08:16:44.361 226239 DEBUG oslo_concurrency.lockutils [req-ede956f8-62ec-40eb-8afa-14e6d7f97d55 req-c3c5a779-3d1b-4dad-aa28-a56a832c637b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b5409d3b-1f0a-476f-b537-91e30f6543fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:16:45 np0005603623 nova_compute[226235]: 2026-01-31 08:16:45.024 226239 DEBUG nova.network.neutron [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:16:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:16:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:45.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:16:46 np0005603623 nova_compute[226235]: 2026-01-31 08:16:46.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:46.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:46 np0005603623 nova_compute[226235]: 2026-01-31 08:16:46.517 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:47 np0005603623 nova_compute[226235]: 2026-01-31 08:16:47.017 226239 DEBUG nova.network.neutron [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Updating instance_info_cache with network_info: [{"id": "57231250-645c-4a66-ac22-f67044a76f7e", "address": "fa:16:3e:b7:9a:cd", "network": {"id": "2a00ea48-6904-4f2c-9f79-74fdf123a925", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1925012459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c1adda10526412b8a61b51f1d5d947a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57231250-64", "ovs_interfaceid": "57231250-645c-4a66-ac22-f67044a76f7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:16:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:47.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.085 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Releasing lock "refresh_cache-b5409d3b-1f0a-476f-b537-91e30f6543fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.086 226239 DEBUG nova.compute.manager [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Instance network_info: |[{"id": "57231250-645c-4a66-ac22-f67044a76f7e", "address": "fa:16:3e:b7:9a:cd", "network": {"id": "2a00ea48-6904-4f2c-9f79-74fdf123a925", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1925012459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c1adda10526412b8a61b51f1d5d947a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57231250-64", "ovs_interfaceid": "57231250-645c-4a66-ac22-f67044a76f7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.087 226239 DEBUG oslo_concurrency.lockutils [req-ede956f8-62ec-40eb-8afa-14e6d7f97d55 req-c3c5a779-3d1b-4dad-aa28-a56a832c637b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b5409d3b-1f0a-476f-b537-91e30f6543fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.088 226239 DEBUG nova.network.neutron [req-ede956f8-62ec-40eb-8afa-14e6d7f97d55 req-c3c5a779-3d1b-4dad-aa28-a56a832c637b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Refreshing network info cache for port 57231250-645c-4a66-ac22-f67044a76f7e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.091 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Start _get_guest_xml network_info=[{"id": "57231250-645c-4a66-ac22-f67044a76f7e", "address": "fa:16:3e:b7:9a:cd", "network": {"id": "2a00ea48-6904-4f2c-9f79-74fdf123a925", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1925012459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c1adda10526412b8a61b51f1d5d947a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57231250-64", "ovs_interfaceid": "57231250-645c-4a66-ac22-f67044a76f7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.095 226239 WARNING nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.100 226239 DEBUG nova.virt.libvirt.host [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.101 226239 DEBUG nova.virt.libvirt.host [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.103 226239 DEBUG nova.virt.libvirt.host [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.104 226239 DEBUG nova.virt.libvirt.host [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.105 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.105 226239 DEBUG nova.virt.hardware [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.105 226239 DEBUG nova.virt.hardware [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.106 226239 DEBUG nova.virt.hardware [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.106 226239 DEBUG nova.virt.hardware [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.106 226239 DEBUG nova.virt.hardware [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.106 226239 DEBUG nova.virt.hardware [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.107 226239 DEBUG nova.virt.hardware [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.107 226239 DEBUG nova.virt.hardware [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.107 226239 DEBUG nova.virt.hardware [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.108 226239 DEBUG nova.virt.hardware [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.108 226239 DEBUG nova.virt.hardware [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.111 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:48.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:16:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4225316492' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.499 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.388s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.532 226239 DEBUG nova.storage.rbd_utils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] rbd image b5409d3b-1f0a-476f-b537-91e30f6543fd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.539 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.740 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:16:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3533031737' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.935 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.936 226239 DEBUG nova.virt.libvirt.vif [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:16:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-1766985933',display_name='tempest-ServerMetadataNegativeTestJSON-server-1766985933',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-1766985933',id=97,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c1adda10526412b8a61b51f1d5d947a',ramdisk_id='',reservation_id='r-4v9sbyt3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-852438232',owner_user_name='tempest-ServerMetadataNegativeTestJSON-852438232-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:16:33Z,user_data=None,user_id='684cece6993b4999be175313a51a3eac',uuid=b5409d3b-1f0a-476f-b537-91e30f6543fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57231250-645c-4a66-ac22-f67044a76f7e", "address": "fa:16:3e:b7:9a:cd", "network": {"id": "2a00ea48-6904-4f2c-9f79-74fdf123a925", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1925012459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c1adda10526412b8a61b51f1d5d947a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57231250-64", "ovs_interfaceid": "57231250-645c-4a66-ac22-f67044a76f7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.937 226239 DEBUG nova.network.os_vif_util [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Converting VIF {"id": "57231250-645c-4a66-ac22-f67044a76f7e", "address": "fa:16:3e:b7:9a:cd", "network": {"id": "2a00ea48-6904-4f2c-9f79-74fdf123a925", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1925012459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c1adda10526412b8a61b51f1d5d947a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57231250-64", "ovs_interfaceid": "57231250-645c-4a66-ac22-f67044a76f7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.938 226239 DEBUG nova.network.os_vif_util [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:9a:cd,bridge_name='br-int',has_traffic_filtering=True,id=57231250-645c-4a66-ac22-f67044a76f7e,network=Network(2a00ea48-6904-4f2c-9f79-74fdf123a925),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57231250-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:48 np0005603623 nova_compute[226235]: 2026-01-31 08:16:48.939 226239 DEBUG nova.objects.instance [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lazy-loading 'pci_devices' on Instance uuid b5409d3b-1f0a-476f-b537-91e30f6543fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:16:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.114 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  <uuid>b5409d3b-1f0a-476f-b537-91e30f6543fd</uuid>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  <name>instance-00000061</name>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerMetadataNegativeTestJSON-server-1766985933</nova:name>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:16:48</nova:creationTime>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <nova:user uuid="684cece6993b4999be175313a51a3eac">tempest-ServerMetadataNegativeTestJSON-852438232-project-member</nova:user>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <nova:project uuid="5c1adda10526412b8a61b51f1d5d947a">tempest-ServerMetadataNegativeTestJSON-852438232</nova:project>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <nova:port uuid="57231250-645c-4a66-ac22-f67044a76f7e">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <entry name="serial">b5409d3b-1f0a-476f-b537-91e30f6543fd</entry>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <entry name="uuid">b5409d3b-1f0a-476f-b537-91e30f6543fd</entry>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/b5409d3b-1f0a-476f-b537-91e30f6543fd_disk">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/b5409d3b-1f0a-476f-b537-91e30f6543fd_disk.config">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:b7:9a:cd"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <target dev="tap57231250-64"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/b5409d3b-1f0a-476f-b537-91e30f6543fd/console.log" append="off"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:16:49 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:16:49 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:16:49 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:16:49 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.116 226239 DEBUG nova.compute.manager [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Preparing to wait for external event network-vif-plugged-57231250-645c-4a66-ac22-f67044a76f7e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.117 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Acquiring lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.117 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.117 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.118 226239 DEBUG nova.virt.libvirt.vif [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:16:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-1766985933',display_name='tempest-ServerMetadataNegativeTestJSON-server-1766985933',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-1766985933',id=97,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c1adda10526412b8a61b51f1d5d947a',ramdisk_id='',reservation_id='r-4v9sbyt3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-852438232',owner_user_name='tempest-ServerMetadataNegativeTestJSON-852438232-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:16:33Z,user_data=None,user_id='684cece6993b4999be175313a51a3eac',uuid=b5409d3b-1f0a-476f-b537-91e30f6543fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57231250-645c-4a66-ac22-f67044a76f7e", "address": "fa:16:3e:b7:9a:cd", "network": {"id": "2a00ea48-6904-4f2c-9f79-74fdf123a925", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1925012459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c1adda10526412b8a61b51f1d5d947a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57231250-64", "ovs_interfaceid": "57231250-645c-4a66-ac22-f67044a76f7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.118 226239 DEBUG nova.network.os_vif_util [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Converting VIF {"id": "57231250-645c-4a66-ac22-f67044a76f7e", "address": "fa:16:3e:b7:9a:cd", "network": {"id": "2a00ea48-6904-4f2c-9f79-74fdf123a925", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1925012459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c1adda10526412b8a61b51f1d5d947a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57231250-64", "ovs_interfaceid": "57231250-645c-4a66-ac22-f67044a76f7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.119 226239 DEBUG nova.network.os_vif_util [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:9a:cd,bridge_name='br-int',has_traffic_filtering=True,id=57231250-645c-4a66-ac22-f67044a76f7e,network=Network(2a00ea48-6904-4f2c-9f79-74fdf123a925),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57231250-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.119 226239 DEBUG os_vif [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:9a:cd,bridge_name='br-int',has_traffic_filtering=True,id=57231250-645c-4a66-ac22-f67044a76f7e,network=Network(2a00ea48-6904-4f2c-9f79-74fdf123a925),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57231250-64') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.120 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.120 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.121 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.124 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.124 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57231250-64, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.125 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap57231250-64, col_values=(('external_ids', {'iface-id': '57231250-645c-4a66-ac22-f67044a76f7e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:9a:cd', 'vm-uuid': 'b5409d3b-1f0a-476f-b537-91e30f6543fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.126 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:49 np0005603623 NetworkManager[48970]: <info>  [1769847409.1274] manager: (tap57231250-64): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.128 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.133 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.134 226239 INFO os_vif [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:9a:cd,bridge_name='br-int',has_traffic_filtering=True,id=57231250-645c-4a66-ac22-f67044a76f7e,network=Network(2a00ea48-6904-4f2c-9f79-74fdf123a925),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57231250-64')#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.366 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.367 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.367 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] No VIF found with MAC fa:16:3e:b7:9a:cd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.368 226239 INFO nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Using config drive#033[00m
Jan 31 03:16:49 np0005603623 nova_compute[226235]: 2026-01-31 08:16:49.394 226239 DEBUG nova.storage.rbd_utils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] rbd image b5409d3b-1f0a-476f-b537-91e30f6543fd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:16:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:49.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:50.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:50 np0005603623 nova_compute[226235]: 2026-01-31 08:16:50.555 226239 INFO nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Creating config drive at /var/lib/nova/instances/b5409d3b-1f0a-476f-b537-91e30f6543fd/disk.config#033[00m
Jan 31 03:16:50 np0005603623 nova_compute[226235]: 2026-01-31 08:16:50.558 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b5409d3b-1f0a-476f-b537-91e30f6543fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpju993e1n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:50 np0005603623 nova_compute[226235]: 2026-01-31 08:16:50.681 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b5409d3b-1f0a-476f-b537-91e30f6543fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpju993e1n" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:50 np0005603623 nova_compute[226235]: 2026-01-31 08:16:50.706 226239 DEBUG nova.storage.rbd_utils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] rbd image b5409d3b-1f0a-476f-b537-91e30f6543fd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:16:50 np0005603623 nova_compute[226235]: 2026-01-31 08:16:50.710 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b5409d3b-1f0a-476f-b537-91e30f6543fd/disk.config b5409d3b-1f0a-476f-b537-91e30f6543fd_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:50 np0005603623 nova_compute[226235]: 2026-01-31 08:16:50.947 226239 DEBUG oslo_concurrency.processutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b5409d3b-1f0a-476f-b537-91e30f6543fd/disk.config b5409d3b-1f0a-476f-b537-91e30f6543fd_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.238s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:50 np0005603623 nova_compute[226235]: 2026-01-31 08:16:50.948 226239 INFO nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Deleting local config drive /var/lib/nova/instances/b5409d3b-1f0a-476f-b537-91e30f6543fd/disk.config because it was imported into RBD.#033[00m
Jan 31 03:16:50 np0005603623 kernel: tap57231250-64: entered promiscuous mode
Jan 31 03:16:50 np0005603623 NetworkManager[48970]: <info>  [1769847410.9884] manager: (tap57231250-64): new Tun device (/org/freedesktop/NetworkManager/Devices/185)
Jan 31 03:16:51 np0005603623 systemd-udevd[267841]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:16:51 np0005603623 nova_compute[226235]: 2026-01-31 08:16:51.014 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:16:51Z|00385|binding|INFO|Claiming lport 57231250-645c-4a66-ac22-f67044a76f7e for this chassis.
Jan 31 03:16:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:16:51Z|00386|binding|INFO|57231250-645c-4a66-ac22-f67044a76f7e: Claiming fa:16:3e:b7:9a:cd 10.100.0.8
Jan 31 03:16:51 np0005603623 nova_compute[226235]: 2026-01-31 08:16:51.018 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:51 np0005603623 NetworkManager[48970]: <info>  [1769847411.0260] device (tap57231250-64): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:16:51 np0005603623 NetworkManager[48970]: <info>  [1769847411.0268] device (tap57231250-64): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:16:51 np0005603623 systemd-machined[194379]: New machine qemu-42-instance-00000061.
Jan 31 03:16:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:16:51Z|00387|binding|INFO|Setting lport 57231250-645c-4a66-ac22-f67044a76f7e ovn-installed in OVS
Jan 31 03:16:51 np0005603623 nova_compute[226235]: 2026-01-31 08:16:51.043 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:51 np0005603623 systemd[1]: Started Virtual Machine qemu-42-instance-00000061.
Jan 31 03:16:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:16:51Z|00388|binding|INFO|Setting lport 57231250-645c-4a66-ac22-f67044a76f7e up in Southbound
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.123 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:9a:cd 10.100.0.8'], port_security=['fa:16:3e:b7:9a:cd 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b5409d3b-1f0a-476f-b537-91e30f6543fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2a00ea48-6904-4f2c-9f79-74fdf123a925', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c1adda10526412b8a61b51f1d5d947a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '89f1a85e-21ba-4b47-aa5d-8a67b4c59002', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed3fe747-5785-43db-afec-d270fa1d381c, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=57231250-645c-4a66-ac22-f67044a76f7e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.124 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 57231250-645c-4a66-ac22-f67044a76f7e in datapath 2a00ea48-6904-4f2c-9f79-74fdf123a925 bound to our chassis#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.126 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2a00ea48-6904-4f2c-9f79-74fdf123a925#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.134 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[331cb20b-273e-4fc3-83b9-a0907ab8e37b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.135 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2a00ea48-61 in ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.137 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2a00ea48-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.137 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[007b6ed4-0a74-4714-a280-4969160042c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.138 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[09428a7c-aea1-4f42-95ff-0ff23a5efc76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.146 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[b145bca3-af25-4a67-b001-2de62b1cb60a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.156 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[769033f6-5b45-4dfa-95a3-7fe43b89aa5e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.176 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6771d4-8c58-4d27-97b0-06fbf0e2d7e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.180 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cb92a390-91d9-4057-a717-ac73eda6a1e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 NetworkManager[48970]: <info>  [1769847411.1812] manager: (tap2a00ea48-60): new Veth device (/org/freedesktop/NetworkManager/Devices/186)
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.203 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b11d14b1-6e93-49b9-9935-2ddc2ea0c3bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.206 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[52643fd5-e1ba-48ff-aaaa-391ff9b25d2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 NetworkManager[48970]: <info>  [1769847411.2247] device (tap2a00ea48-60): carrier: link connected
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.228 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb2d1a8-fd7e-4c17-864d-43a3807c2ab8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.238 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5f44d772-7294-4404-8abb-66e215b22cfa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2a00ea48-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:4a:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655165, 'reachable_time': 29266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267877, 'error': None, 'target': 'ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.249 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e1930a-bc2c-4af4-a6a8-fe262e1d51f4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed9:4adf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655165, 'tstamp': 655165}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267878, 'error': None, 'target': 'ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.262 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2927f6e6-3786-412c-a011-5e03d8224556]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2a00ea48-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:4a:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655165, 'reachable_time': 29266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 267879, 'error': None, 'target': 'ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.283 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[22392999-e078-41bc-998e-93ea2183d2fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.321 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4a3ee9-0714-41b6-85f2-e95ce87e67a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 nova_compute[226235]: 2026-01-31 08:16:51.322 226239 DEBUG nova.network.neutron [req-ede956f8-62ec-40eb-8afa-14e6d7f97d55 req-c3c5a779-3d1b-4dad-aa28-a56a832c637b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Updated VIF entry in instance network info cache for port 57231250-645c-4a66-ac22-f67044a76f7e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.323 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a00ea48-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.323 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:51 np0005603623 nova_compute[226235]: 2026-01-31 08:16:51.323 226239 DEBUG nova.network.neutron [req-ede956f8-62ec-40eb-8afa-14e6d7f97d55 req-c3c5a779-3d1b-4dad-aa28-a56a832c637b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Updating instance_info_cache with network_info: [{"id": "57231250-645c-4a66-ac22-f67044a76f7e", "address": "fa:16:3e:b7:9a:cd", "network": {"id": "2a00ea48-6904-4f2c-9f79-74fdf123a925", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1925012459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c1adda10526412b8a61b51f1d5d947a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57231250-64", "ovs_interfaceid": "57231250-645c-4a66-ac22-f67044a76f7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.323 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a00ea48-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:51 np0005603623 NetworkManager[48970]: <info>  [1769847411.3258] manager: (tap2a00ea48-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Jan 31 03:16:51 np0005603623 kernel: tap2a00ea48-60: entered promiscuous mode
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.327 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2a00ea48-60, col_values=(('external_ids', {'iface-id': '682f51b5-8801-40ec-8efc-2db599e1be26'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:16:51Z|00389|binding|INFO|Releasing lport 682f51b5-8801-40ec-8efc-2db599e1be26 from this chassis (sb_readonly=0)
Jan 31 03:16:51 np0005603623 nova_compute[226235]: 2026-01-31 08:16:51.328 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.330 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2a00ea48-6904-4f2c-9f79-74fdf123a925.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2a00ea48-6904-4f2c-9f79-74fdf123a925.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.335 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4bf896-1d6a-4dc3-9f6a-d6062a640e21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.336 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-2a00ea48-6904-4f2c-9f79-74fdf123a925
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/2a00ea48-6904-4f2c-9f79-74fdf123a925.pid.haproxy
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 2a00ea48-6904-4f2c-9f79-74fdf123a925
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:16:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:16:51.338 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925', 'env', 'PROCESS_TAG=haproxy-2a00ea48-6904-4f2c-9f79-74fdf123a925', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2a00ea48-6904-4f2c-9f79-74fdf123a925.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:16:51 np0005603623 nova_compute[226235]: 2026-01-31 08:16:51.550 226239 DEBUG oslo_concurrency.lockutils [req-ede956f8-62ec-40eb-8afa-14e6d7f97d55 req-c3c5a779-3d1b-4dad-aa28-a56a832c637b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b5409d3b-1f0a-476f-b537-91e30f6543fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:16:51 np0005603623 podman[267911]: 2026-01-31 08:16:51.609693704 +0000 UTC m=+0.019128564 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:16:51 np0005603623 podman[267911]: 2026-01-31 08:16:51.833396115 +0000 UTC m=+0.242830965 container create a9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:16:51 np0005603623 systemd[1]: Started libpod-conmon-a9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6.scope.
Jan 31 03:16:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:51.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:51 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:16:51 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc0b8bc88ed82bb3b5240dc094f8a71735111fb8085cc931c0f680f292dcb7b2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:16:51 np0005603623 nova_compute[226235]: 2026-01-31 08:16:51.901 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847411.9008687, b5409d3b-1f0a-476f-b537-91e30f6543fd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:16:51 np0005603623 nova_compute[226235]: 2026-01-31 08:16:51.903 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] VM Started (Lifecycle Event)#033[00m
Jan 31 03:16:52 np0005603623 podman[267911]: 2026-01-31 08:16:52.110061747 +0000 UTC m=+0.519496617 container init a9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 03:16:52 np0005603623 podman[267911]: 2026-01-31 08:16:52.11458963 +0000 UTC m=+0.524024470 container start a9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 03:16:52 np0005603623 neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925[267967]: [NOTICE]   (267972) : New worker (267974) forked
Jan 31 03:16:52 np0005603623 neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925[267967]: [NOTICE]   (267972) : Loading success.
Jan 31 03:16:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:52.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:52 np0005603623 nova_compute[226235]: 2026-01-31 08:16:52.265 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:16:52 np0005603623 nova_compute[226235]: 2026-01-31 08:16:52.268 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847411.9011457, b5409d3b-1f0a-476f-b537-91e30f6543fd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:16:52 np0005603623 nova_compute[226235]: 2026-01-31 08:16:52.268 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:16:52 np0005603623 nova_compute[226235]: 2026-01-31 08:16:52.655 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:16:52 np0005603623 nova_compute[226235]: 2026-01-31 08:16:52.659 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:16:52 np0005603623 nova_compute[226235]: 2026-01-31 08:16:52.950 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:16:53 np0005603623 nova_compute[226235]: 2026-01-31 08:16:53.743 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:53.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:54 np0005603623 nova_compute[226235]: 2026-01-31 08:16:54.128 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:54.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:55.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:16:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:56.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.085 226239 DEBUG nova.compute.manager [req-75c6991d-e5bc-4746-8b38-539014313c6b req-15666f59-ae0a-4100-8ae1-af0f37f7e6e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Received event network-vif-plugged-57231250-645c-4a66-ac22-f67044a76f7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.085 226239 DEBUG oslo_concurrency.lockutils [req-75c6991d-e5bc-4746-8b38-539014313c6b req-15666f59-ae0a-4100-8ae1-af0f37f7e6e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.086 226239 DEBUG oslo_concurrency.lockutils [req-75c6991d-e5bc-4746-8b38-539014313c6b req-15666f59-ae0a-4100-8ae1-af0f37f7e6e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.086 226239 DEBUG oslo_concurrency.lockutils [req-75c6991d-e5bc-4746-8b38-539014313c6b req-15666f59-ae0a-4100-8ae1-af0f37f7e6e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.087 226239 DEBUG nova.compute.manager [req-75c6991d-e5bc-4746-8b38-539014313c6b req-15666f59-ae0a-4100-8ae1-af0f37f7e6e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Processing event network-vif-plugged-57231250-645c-4a66-ac22-f67044a76f7e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.088 226239 DEBUG nova.compute.manager [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.092 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847417.0915596, b5409d3b-1f0a-476f-b537-91e30f6543fd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.093 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.094 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.097 226239 INFO nova.virt.libvirt.driver [-] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Instance spawned successfully.#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.098 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.219 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.225 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.226 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.227 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.227 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.228 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.228 226239 DEBUG nova.virt.libvirt.driver [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.233 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.323 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.529 226239 INFO nova.compute.manager [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Took 23.33 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.529 226239 DEBUG nova.compute.manager [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.695 226239 INFO nova.compute.manager [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Took 40.12 seconds to build instance.#033[00m
Jan 31 03:16:57 np0005603623 nova_compute[226235]: 2026-01-31 08:16:57.847 226239 DEBUG oslo_concurrency.lockutils [None req-393e6836-aaed-4aaf-960b-9a5053026ccd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 41.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:57.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:58.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:58 np0005603623 nova_compute[226235]: 2026-01-31 08:16:58.775 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:59 np0005603623 nova_compute[226235]: 2026-01-31 08:16:59.129 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:16:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:59.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:00.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:01 np0005603623 nova_compute[226235]: 2026-01-31 08:17:01.405 226239 DEBUG nova.compute.manager [req-3112ddb1-efc8-485d-83f6-d8d5fc141495 req-24981067-e79e-4738-a152-3ef9cde4b754 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Received event network-vif-plugged-57231250-645c-4a66-ac22-f67044a76f7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:01 np0005603623 nova_compute[226235]: 2026-01-31 08:17:01.406 226239 DEBUG oslo_concurrency.lockutils [req-3112ddb1-efc8-485d-83f6-d8d5fc141495 req-24981067-e79e-4738-a152-3ef9cde4b754 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:01 np0005603623 nova_compute[226235]: 2026-01-31 08:17:01.407 226239 DEBUG oslo_concurrency.lockutils [req-3112ddb1-efc8-485d-83f6-d8d5fc141495 req-24981067-e79e-4738-a152-3ef9cde4b754 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:01 np0005603623 nova_compute[226235]: 2026-01-31 08:17:01.407 226239 DEBUG oslo_concurrency.lockutils [req-3112ddb1-efc8-485d-83f6-d8d5fc141495 req-24981067-e79e-4738-a152-3ef9cde4b754 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:01 np0005603623 nova_compute[226235]: 2026-01-31 08:17:01.408 226239 DEBUG nova.compute.manager [req-3112ddb1-efc8-485d-83f6-d8d5fc141495 req-24981067-e79e-4738-a152-3ef9cde4b754 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] No waiting events found dispatching network-vif-plugged-57231250-645c-4a66-ac22-f67044a76f7e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:01 np0005603623 nova_compute[226235]: 2026-01-31 08:17:01.408 226239 WARNING nova.compute.manager [req-3112ddb1-efc8-485d-83f6-d8d5fc141495 req-24981067-e79e-4738-a152-3ef9cde4b754 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Received unexpected event network-vif-plugged-57231250-645c-4a66-ac22-f67044a76f7e for instance with vm_state active and task_state None.#033[00m
Jan 31 03:17:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:01.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:02.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:02 np0005603623 nova_compute[226235]: 2026-01-31 08:17:02.619 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:02 np0005603623 nova_compute[226235]: 2026-01-31 08:17:02.619 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:03 np0005603623 nova_compute[226235]: 2026-01-31 08:17:03.507 226239 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:17:03 np0005603623 nova_compute[226235]: 2026-01-31 08:17:03.777 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:03.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:04 np0005603623 podman[267989]: 2026-01-31 08:17:04.004913932 +0000 UTC m=+0.098636420 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:17:04 np0005603623 podman[267990]: 2026-01-31 08:17:04.029054084 +0000 UTC m=+0.123494275 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:17:04 np0005603623 nova_compute[226235]: 2026-01-31 08:17:04.036 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:04 np0005603623 nova_compute[226235]: 2026-01-31 08:17:04.036 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:04 np0005603623 nova_compute[226235]: 2026-01-31 08:17:04.049 226239 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:17:04 np0005603623 nova_compute[226235]: 2026-01-31 08:17:04.049 226239 INFO nova.compute.claims [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:17:04 np0005603623 nova_compute[226235]: 2026-01-31 08:17:04.131 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:04.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:04 np0005603623 nova_compute[226235]: 2026-01-31 08:17:04.733 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:17:05 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3557684963' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:17:05 np0005603623 nova_compute[226235]: 2026-01-31 08:17:05.199 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:05 np0005603623 nova_compute[226235]: 2026-01-31 08:17:05.206 226239 DEBUG nova.compute.provider_tree [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:17:05 np0005603623 nova_compute[226235]: 2026-01-31 08:17:05.259 226239 DEBUG nova.scheduler.client.report [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:17:05 np0005603623 nova_compute[226235]: 2026-01-31 08:17:05.410 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:05 np0005603623 nova_compute[226235]: 2026-01-31 08:17:05.410 226239 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:17:05 np0005603623 nova_compute[226235]: 2026-01-31 08:17:05.625 226239 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:17:05 np0005603623 nova_compute[226235]: 2026-01-31 08:17:05.626 226239 DEBUG nova.network.neutron [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:17:05 np0005603623 nova_compute[226235]: 2026-01-31 08:17:05.731 226239 INFO nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:17:05 np0005603623 nova_compute[226235]: 2026-01-31 08:17:05.840 226239 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:17:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:05.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:06.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.319 226239 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.320 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.320 226239 INFO nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Creating image(s)#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.340 226239 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.361 226239 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.386 226239 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.390 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.408 226239 DEBUG nova.policy [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8db5a8acb6d04c988f9dd4f74380c487', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eafe22d6cfcb41d4b31597087498a565', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.445 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.446 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.447 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.447 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.471 226239 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.476 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.742 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.267s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.813 226239 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] resizing rbd image d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:17:06 np0005603623 nova_compute[226235]: 2026-01-31 08:17:06.922 226239 DEBUG nova.objects.instance [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lazy-loading 'migration_context' on Instance uuid d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:17:07 np0005603623 nova_compute[226235]: 2026-01-31 08:17:07.506 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:17:07 np0005603623 nova_compute[226235]: 2026-01-31 08:17:07.506 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Ensure instance console log exists: /var/lib/nova/instances/d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:17:07 np0005603623 nova_compute[226235]: 2026-01-31 08:17:07.507 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:07 np0005603623 nova_compute[226235]: 2026-01-31 08:17:07.507 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:07 np0005603623 nova_compute[226235]: 2026-01-31 08:17:07.508 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:07.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.168 226239 DEBUG oslo_concurrency.lockutils [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Acquiring lock "b5409d3b-1f0a-476f-b537-91e30f6543fd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.169 226239 DEBUG oslo_concurrency.lockutils [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.170 226239 DEBUG oslo_concurrency.lockutils [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Acquiring lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.170 226239 DEBUG oslo_concurrency.lockutils [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.170 226239 DEBUG oslo_concurrency.lockutils [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.172 226239 INFO nova.compute.manager [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Terminating instance#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.173 226239 DEBUG nova.compute.manager [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:17:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:08.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:08 np0005603623 kernel: tap57231250-64 (unregistering): left promiscuous mode
Jan 31 03:17:08 np0005603623 NetworkManager[48970]: <info>  [1769847428.3205] device (tap57231250-64): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.320 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:08 np0005603623 ovn_controller[133449]: 2026-01-31T08:17:08Z|00390|binding|INFO|Releasing lport 57231250-645c-4a66-ac22-f67044a76f7e from this chassis (sb_readonly=0)
Jan 31 03:17:08 np0005603623 ovn_controller[133449]: 2026-01-31T08:17:08Z|00391|binding|INFO|Setting lport 57231250-645c-4a66-ac22-f67044a76f7e down in Southbound
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.327 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:08 np0005603623 ovn_controller[133449]: 2026-01-31T08:17:08Z|00392|binding|INFO|Removing iface tap57231250-64 ovn-installed in OVS
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.328 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.334 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:08 np0005603623 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000061.scope: Deactivated successfully.
Jan 31 03:17:08 np0005603623 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000061.scope: Consumed 11.669s CPU time.
Jan 31 03:17:08 np0005603623 systemd-machined[194379]: Machine qemu-42-instance-00000061 terminated.
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.407 226239 INFO nova.virt.libvirt.driver [-] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Instance destroyed successfully.#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.408 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:9a:cd 10.100.0.8'], port_security=['fa:16:3e:b7:9a:cd 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b5409d3b-1f0a-476f-b537-91e30f6543fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2a00ea48-6904-4f2c-9f79-74fdf123a925', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c1adda10526412b8a61b51f1d5d947a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '89f1a85e-21ba-4b47-aa5d-8a67b4c59002', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed3fe747-5785-43db-afec-d270fa1d381c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=57231250-645c-4a66-ac22-f67044a76f7e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.408 226239 DEBUG nova.objects.instance [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lazy-loading 'resources' on Instance uuid b5409d3b-1f0a-476f-b537-91e30f6543fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.409 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 57231250-645c-4a66-ac22-f67044a76f7e in datapath 2a00ea48-6904-4f2c-9f79-74fdf123a925 unbound from our chassis#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.411 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2a00ea48-6904-4f2c-9f79-74fdf123a925, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.412 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f87d92f2-d35f-40f5-8328-d2a79223d11b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.413 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925 namespace which is not needed anymore#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.459 226239 DEBUG nova.virt.libvirt.vif [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:16:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-1766985933',display_name='tempest-ServerMetadataNegativeTestJSON-server-1766985933',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-1766985933',id=97,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:16:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5c1adda10526412b8a61b51f1d5d947a',ramdisk_id='',reservation_id='r-4v9sbyt3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_h
w_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataNegativeTestJSON-852438232',owner_user_name='tempest-ServerMetadataNegativeTestJSON-852438232-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:16:57Z,user_data=None,user_id='684cece6993b4999be175313a51a3eac',uuid=b5409d3b-1f0a-476f-b537-91e30f6543fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "57231250-645c-4a66-ac22-f67044a76f7e", "address": "fa:16:3e:b7:9a:cd", "network": {"id": "2a00ea48-6904-4f2c-9f79-74fdf123a925", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1925012459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c1adda10526412b8a61b51f1d5d947a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57231250-64", "ovs_interfaceid": "57231250-645c-4a66-ac22-f67044a76f7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.459 226239 DEBUG nova.network.os_vif_util [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Converting VIF {"id": "57231250-645c-4a66-ac22-f67044a76f7e", "address": "fa:16:3e:b7:9a:cd", "network": {"id": "2a00ea48-6904-4f2c-9f79-74fdf123a925", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1925012459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c1adda10526412b8a61b51f1d5d947a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57231250-64", "ovs_interfaceid": "57231250-645c-4a66-ac22-f67044a76f7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.460 226239 DEBUG nova.network.os_vif_util [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:9a:cd,bridge_name='br-int',has_traffic_filtering=True,id=57231250-645c-4a66-ac22-f67044a76f7e,network=Network(2a00ea48-6904-4f2c-9f79-74fdf123a925),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57231250-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.460 226239 DEBUG os_vif [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:9a:cd,bridge_name='br-int',has_traffic_filtering=True,id=57231250-645c-4a66-ac22-f67044a76f7e,network=Network(2a00ea48-6904-4f2c-9f79-74fdf123a925),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57231250-64') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.462 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.462 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57231250-64, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.463 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.464 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.466 226239 INFO os_vif [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:9a:cd,bridge_name='br-int',has_traffic_filtering=True,id=57231250-645c-4a66-ac22-f67044a76f7e,network=Network(2a00ea48-6904-4f2c-9f79-74fdf123a925),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57231250-64')#033[00m
Jan 31 03:17:08 np0005603623 neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925[267967]: [NOTICE]   (267972) : haproxy version is 2.8.14-c23fe91
Jan 31 03:17:08 np0005603623 neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925[267967]: [NOTICE]   (267972) : path to executable is /usr/sbin/haproxy
Jan 31 03:17:08 np0005603623 neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925[267967]: [WARNING]  (267972) : Exiting Master process...
Jan 31 03:17:08 np0005603623 neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925[267967]: [ALERT]    (267972) : Current worker (267974) exited with code 143 (Terminated)
Jan 31 03:17:08 np0005603623 neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925[267967]: [WARNING]  (267972) : All workers exited. Exiting... (0)
Jan 31 03:17:08 np0005603623 systemd[1]: libpod-a9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6.scope: Deactivated successfully.
Jan 31 03:17:08 np0005603623 podman[268324]: 2026-01-31 08:17:08.530657203 +0000 UTC m=+0.041748297 container died a9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:17:08 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6-userdata-shm.mount: Deactivated successfully.
Jan 31 03:17:08 np0005603623 systemd[1]: var-lib-containers-storage-overlay-cc0b8bc88ed82bb3b5240dc094f8a71735111fb8085cc931c0f680f292dcb7b2-merged.mount: Deactivated successfully.
Jan 31 03:17:08 np0005603623 podman[268324]: 2026-01-31 08:17:08.56766316 +0000 UTC m=+0.078754254 container cleanup a9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:17:08 np0005603623 systemd[1]: libpod-conmon-a9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6.scope: Deactivated successfully.
Jan 31 03:17:08 np0005603623 podman[268358]: 2026-01-31 08:17:08.632172863 +0000 UTC m=+0.051069641 container remove a9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.635 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[321edc62-900b-47a2-afb4-743226aec9f5]: (4, ('Sat Jan 31 08:17:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925 (a9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6)\na9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6\nSat Jan 31 08:17:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925 (a9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6)\na9a11f11d45991fff73bddbc90b6c75128f48a50a36b788d12bfc7ae82f486f6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.637 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[36d9edb4-fd23-47d4-8b08-9cb3d1b292d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.638 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a00ea48-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:08 np0005603623 kernel: tap2a00ea48-60: left promiscuous mode
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.640 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.641 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.644 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bf423dec-e2d1-45a2-a72f-00533065530a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.649 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.661 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5ac67d4e-bca9-43cc-a5f2-faebbc75947e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.663 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[46384c40-39ca-452c-920d-a5c48efbf541]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.675 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d90205d4-aec9-437d-91db-fb38581af09c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655160, 'reachable_time': 32167, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268371, 'error': None, 'target': 'ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.678 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2a00ea48-6904-4f2c-9f79-74fdf123a925 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:17:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:08.679 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[29cb7681-ff6a-448c-bb86-069a2e39fa85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:08 np0005603623 systemd[1]: run-netns-ovnmeta\x2d2a00ea48\x2d6904\x2d4f2c\x2d9f79\x2d74fdf123a925.mount: Deactivated successfully.
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.779 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.854 226239 INFO nova.virt.libvirt.driver [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Deleting instance files /var/lib/nova/instances/b5409d3b-1f0a-476f-b537-91e30f6543fd_del#033[00m
Jan 31 03:17:08 np0005603623 nova_compute[226235]: 2026-01-31 08:17:08.855 226239 INFO nova.virt.libvirt.driver [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Deletion of /var/lib/nova/instances/b5409d3b-1f0a-476f-b537-91e30f6543fd_del complete#033[00m
Jan 31 03:17:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:09 np0005603623 nova_compute[226235]: 2026-01-31 08:17:09.519 226239 DEBUG nova.compute.manager [req-1fcf4c63-051b-4730-9b9d-e0351a86794c req-178780c5-9206-4694-8509-30867a777577 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Received event network-vif-unplugged-57231250-645c-4a66-ac22-f67044a76f7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:09 np0005603623 nova_compute[226235]: 2026-01-31 08:17:09.520 226239 DEBUG oslo_concurrency.lockutils [req-1fcf4c63-051b-4730-9b9d-e0351a86794c req-178780c5-9206-4694-8509-30867a777577 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:09 np0005603623 nova_compute[226235]: 2026-01-31 08:17:09.520 226239 DEBUG oslo_concurrency.lockutils [req-1fcf4c63-051b-4730-9b9d-e0351a86794c req-178780c5-9206-4694-8509-30867a777577 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:09 np0005603623 nova_compute[226235]: 2026-01-31 08:17:09.520 226239 DEBUG oslo_concurrency.lockutils [req-1fcf4c63-051b-4730-9b9d-e0351a86794c req-178780c5-9206-4694-8509-30867a777577 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:09 np0005603623 nova_compute[226235]: 2026-01-31 08:17:09.520 226239 DEBUG nova.compute.manager [req-1fcf4c63-051b-4730-9b9d-e0351a86794c req-178780c5-9206-4694-8509-30867a777577 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] No waiting events found dispatching network-vif-unplugged-57231250-645c-4a66-ac22-f67044a76f7e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:09 np0005603623 nova_compute[226235]: 2026-01-31 08:17:09.520 226239 DEBUG nova.compute.manager [req-1fcf4c63-051b-4730-9b9d-e0351a86794c req-178780c5-9206-4694-8509-30867a777577 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Received event network-vif-unplugged-57231250-645c-4a66-ac22-f67044a76f7e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:17:09 np0005603623 nova_compute[226235]: 2026-01-31 08:17:09.544 226239 DEBUG nova.network.neutron [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Successfully created port: 3b96fe71-71a3-4cca-8b42-2f004d1114ea _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:17:09 np0005603623 nova_compute[226235]: 2026-01-31 08:17:09.607 226239 INFO nova.compute.manager [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Took 1.43 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:17:09 np0005603623 nova_compute[226235]: 2026-01-31 08:17:09.607 226239 DEBUG oslo.service.loopingcall [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:17:09 np0005603623 nova_compute[226235]: 2026-01-31 08:17:09.607 226239 DEBUG nova.compute.manager [-] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:17:09 np0005603623 nova_compute[226235]: 2026-01-31 08:17:09.607 226239 DEBUG nova.network.neutron [-] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:17:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:09.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:10.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:11.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:12.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.204 226239 DEBUG nova.network.neutron [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Successfully updated port: 3b96fe71-71a3-4cca-8b42-2f004d1114ea _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.322 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "refresh_cache-d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.322 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquired lock "refresh_cache-d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.322 226239 DEBUG nova.network.neutron [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.359 226239 DEBUG nova.compute.manager [req-48775de6-5280-40b1-bcdc-2aa102653c3b req-ab23a9d8-c9ea-410f-9b09-dfa35b48202f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Received event network-vif-plugged-57231250-645c-4a66-ac22-f67044a76f7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.360 226239 DEBUG oslo_concurrency.lockutils [req-48775de6-5280-40b1-bcdc-2aa102653c3b req-ab23a9d8-c9ea-410f-9b09-dfa35b48202f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.360 226239 DEBUG oslo_concurrency.lockutils [req-48775de6-5280-40b1-bcdc-2aa102653c3b req-ab23a9d8-c9ea-410f-9b09-dfa35b48202f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.360 226239 DEBUG oslo_concurrency.lockutils [req-48775de6-5280-40b1-bcdc-2aa102653c3b req-ab23a9d8-c9ea-410f-9b09-dfa35b48202f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.360 226239 DEBUG nova.compute.manager [req-48775de6-5280-40b1-bcdc-2aa102653c3b req-ab23a9d8-c9ea-410f-9b09-dfa35b48202f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] No waiting events found dispatching network-vif-plugged-57231250-645c-4a66-ac22-f67044a76f7e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.360 226239 WARNING nova.compute.manager [req-48775de6-5280-40b1-bcdc-2aa102653c3b req-ab23a9d8-c9ea-410f-9b09-dfa35b48202f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Received unexpected event network-vif-plugged-57231250-645c-4a66-ac22-f67044a76f7e for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.464 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.664 226239 DEBUG nova.network.neutron [-] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.671 226239 DEBUG nova.compute.manager [req-986fbf3c-d5ea-4f65-8ca0-20538e77bd97 req-8e4c851f-e957-424c-bbef-6f265b74e38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Received event network-changed-3b96fe71-71a3-4cca-8b42-2f004d1114ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.672 226239 DEBUG nova.compute.manager [req-986fbf3c-d5ea-4f65-8ca0-20538e77bd97 req-8e4c851f-e957-424c-bbef-6f265b74e38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Refreshing instance network info cache due to event network-changed-3b96fe71-71a3-4cca-8b42-2f004d1114ea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.672 226239 DEBUG oslo_concurrency.lockutils [req-986fbf3c-d5ea-4f65-8ca0-20538e77bd97 req-8e4c851f-e957-424c-bbef-6f265b74e38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.780 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603623 nova_compute[226235]: 2026-01-31 08:17:13.815 226239 INFO nova.compute.manager [-] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Took 4.21 seconds to deallocate network for instance.#033[00m
Jan 31 03:17:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:13.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:14 np0005603623 nova_compute[226235]: 2026-01-31 08:17:14.041 226239 DEBUG oslo_concurrency.lockutils [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:14 np0005603623 nova_compute[226235]: 2026-01-31 08:17:14.041 226239 DEBUG oslo_concurrency.lockutils [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:14 np0005603623 nova_compute[226235]: 2026-01-31 08:17:14.188 226239 DEBUG nova.compute.manager [req-5b6c142d-93a7-45ec-8570-882af79b30e9 req-061a45ea-cacf-4fcc-bb04-6cadfee662aa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Received event network-vif-deleted-57231250-645c-4a66-ac22-f67044a76f7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:14 np0005603623 nova_compute[226235]: 2026-01-31 08:17:14.247 226239 DEBUG oslo_concurrency.processutils [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:14.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:14 np0005603623 nova_compute[226235]: 2026-01-31 08:17:14.392 226239 DEBUG nova.network.neutron [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:17:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:17:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2936630503' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:17:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:17:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/844075393' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:17:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:17:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2936630503' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:17:14 np0005603623 nova_compute[226235]: 2026-01-31 08:17:14.637 226239 DEBUG oslo_concurrency.processutils [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.390s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:14 np0005603623 nova_compute[226235]: 2026-01-31 08:17:14.643 226239 DEBUG nova.compute.provider_tree [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:17:14 np0005603623 nova_compute[226235]: 2026-01-31 08:17:14.719 226239 DEBUG nova.scheduler.client.report [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:17:14 np0005603623 nova_compute[226235]: 2026-01-31 08:17:14.822 226239 DEBUG oslo_concurrency.lockutils [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.780s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:15 np0005603623 nova_compute[226235]: 2026-01-31 08:17:15.019 226239 INFO nova.scheduler.client.report [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Deleted allocations for instance b5409d3b-1f0a-476f-b537-91e30f6543fd#033[00m
Jan 31 03:17:15 np0005603623 nova_compute[226235]: 2026-01-31 08:17:15.225 226239 DEBUG oslo_concurrency.lockutils [None req-8eec0cdf-d9ba-479a-96ee-4b6d784482bd 684cece6993b4999be175313a51a3eac 5c1adda10526412b8a61b51f1d5d947a - - default default] Lock "b5409d3b-1f0a-476f-b537-91e30f6543fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:15.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:16.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:17 np0005603623 nova_compute[226235]: 2026-01-31 08:17:17.358 226239 DEBUG nova.network.neutron [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Updating instance_info_cache with network_info: [{"id": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "address": "fa:16:3e:40:6b:d9", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b96fe71-71", "ovs_interfaceid": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:17:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:17:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:17.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:17:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:18.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:18 np0005603623 nova_compute[226235]: 2026-01-31 08:17:18.466 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:18 np0005603623 nova_compute[226235]: 2026-01-31 08:17:18.781 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.103 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Releasing lock "refresh_cache-d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.103 226239 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Instance network_info: |[{"id": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "address": "fa:16:3e:40:6b:d9", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b96fe71-71", "ovs_interfaceid": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.104 226239 DEBUG oslo_concurrency.lockutils [req-986fbf3c-d5ea-4f65-8ca0-20538e77bd97 req-8e4c851f-e957-424c-bbef-6f265b74e38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.104 226239 DEBUG nova.network.neutron [req-986fbf3c-d5ea-4f65-8ca0-20538e77bd97 req-8e4c851f-e957-424c-bbef-6f265b74e38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Refreshing network info cache for port 3b96fe71-71a3-4cca-8b42-2f004d1114ea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.110 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Start _get_guest_xml network_info=[{"id": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "address": "fa:16:3e:40:6b:d9", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b96fe71-71", "ovs_interfaceid": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.114 226239 WARNING nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.122 226239 DEBUG nova.virt.libvirt.host [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.122 226239 DEBUG nova.virt.libvirt.host [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.127 226239 DEBUG nova.virt.libvirt.host [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.127 226239 DEBUG nova.virt.libvirt.host [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.129 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.129 226239 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.129 226239 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.130 226239 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.130 226239 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.130 226239 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.131 226239 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.131 226239 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.131 226239 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.132 226239 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.132 226239 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.132 226239 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:17:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.136 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:17:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1698759136' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:17:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:17:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1698759136' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:17:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:17:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2446758895' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.617 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.642 226239 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:19 np0005603623 nova_compute[226235]: 2026-01-31 08:17:19.646 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:19.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:17:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3526366918' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.051 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.054 226239 DEBUG nova.virt.libvirt.vif [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:16:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-603829389',display_name='tempest-ListServersNegativeTestJSON-server-603829389-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-603829389-2',id=99,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eafe22d6cfcb41d4b31597087498a565',ramdisk_id='',reservation_id='r-5jowh2i5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-79577656',owner_user_name='tempest-ListServersNegativeTestJSON-79577656-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:17:05Z,user_data=None,user_id='8db5a8acb6d04c988f9dd4f74380c487',uuid=d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "address": "fa:16:3e:40:6b:d9", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b96fe71-71", "ovs_interfaceid": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.054 226239 DEBUG nova.network.os_vif_util [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Converting VIF {"id": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "address": "fa:16:3e:40:6b:d9", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b96fe71-71", "ovs_interfaceid": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.055 226239 DEBUG nova.network.os_vif_util [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:6b:d9,bridge_name='br-int',has_traffic_filtering=True,id=3b96fe71-71a3-4cca-8b42-2f004d1114ea,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b96fe71-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.057 226239 DEBUG nova.objects.instance [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lazy-loading 'pci_devices' on Instance uuid d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.159 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  <uuid>d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1</uuid>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  <name>instance-00000063</name>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <nova:name>tempest-ListServersNegativeTestJSON-server-603829389-2</nova:name>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:17:19</nova:creationTime>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <nova:user uuid="8db5a8acb6d04c988f9dd4f74380c487">tempest-ListServersNegativeTestJSON-79577656-project-member</nova:user>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <nova:project uuid="eafe22d6cfcb41d4b31597087498a565">tempest-ListServersNegativeTestJSON-79577656</nova:project>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <nova:port uuid="3b96fe71-71a3-4cca-8b42-2f004d1114ea">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <entry name="serial">d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1</entry>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <entry name="uuid">d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1</entry>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk.config">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:40:6b:d9"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <target dev="tap3b96fe71-71"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1/console.log" append="off"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:17:20 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:17:20 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:17:20 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:17:20 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.160 226239 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Preparing to wait for external event network-vif-plugged-3b96fe71-71a3-4cca-8b42-2f004d1114ea prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.160 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.161 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.161 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.161 226239 DEBUG nova.virt.libvirt.vif [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:16:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-603829389',display_name='tempest-ListServersNegativeTestJSON-server-603829389-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-603829389-2',id=99,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eafe22d6cfcb41d4b31597087498a565',ramdisk_id='',reservation_id='r-5jowh2i5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-79577656',owner_user_name='tempest-ListServersNegativeTestJSON-79577656-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:17:05Z,user_data=None,user_id='8db5a8acb6d04c988f9dd4f74380c487',uuid=d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "address": "fa:16:3e:40:6b:d9", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b96fe71-71", "ovs_interfaceid": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.162 226239 DEBUG nova.network.os_vif_util [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Converting VIF {"id": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "address": "fa:16:3e:40:6b:d9", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b96fe71-71", "ovs_interfaceid": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.162 226239 DEBUG nova.network.os_vif_util [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:6b:d9,bridge_name='br-int',has_traffic_filtering=True,id=3b96fe71-71a3-4cca-8b42-2f004d1114ea,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b96fe71-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.162 226239 DEBUG os_vif [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:6b:d9,bridge_name='br-int',has_traffic_filtering=True,id=3b96fe71-71a3-4cca-8b42-2f004d1114ea,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b96fe71-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.163 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.163 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.164 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.166 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.166 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b96fe71-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.167 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3b96fe71-71, col_values=(('external_ids', {'iface-id': '3b96fe71-71a3-4cca-8b42-2f004d1114ea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:40:6b:d9', 'vm-uuid': 'd6e7c764-5ddd-4bee-b6b4-a7a93994c7b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.168 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:20 np0005603623 NetworkManager[48970]: <info>  [1769847440.1692] manager: (tap3b96fe71-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.171 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.175 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.176 226239 INFO os_vif [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:6b:d9,bridge_name='br-int',has_traffic_filtering=True,id=3b96fe71-71a3-4cca-8b42-2f004d1114ea,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b96fe71-71')#033[00m
Jan 31 03:17:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:20.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.288 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.289 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.289 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] No VIF found with MAC fa:16:3e:40:6b:d9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.289 226239 INFO nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Using config drive#033[00m
Jan 31 03:17:20 np0005603623 nova_compute[226235]: 2026-01-31 08:17:20.317 226239 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:21.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:22 np0005603623 nova_compute[226235]: 2026-01-31 08:17:22.116 226239 INFO nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Creating config drive at /var/lib/nova/instances/d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1/disk.config#033[00m
Jan 31 03:17:22 np0005603623 nova_compute[226235]: 2026-01-31 08:17:22.119 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpntolsjxs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:22 np0005603623 nova_compute[226235]: 2026-01-31 08:17:22.247 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpntolsjxs" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:22 np0005603623 nova_compute[226235]: 2026-01-31 08:17:22.276 226239 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:22 np0005603623 nova_compute[226235]: 2026-01-31 08:17:22.279 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1/disk.config d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:22.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:22 np0005603623 nova_compute[226235]: 2026-01-31 08:17:22.425 226239 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1/disk.config d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:22 np0005603623 nova_compute[226235]: 2026-01-31 08:17:22.425 226239 INFO nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Deleting local config drive /var/lib/nova/instances/d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1/disk.config because it was imported into RBD.#033[00m
Jan 31 03:17:22 np0005603623 kernel: tap3b96fe71-71: entered promiscuous mode
Jan 31 03:17:22 np0005603623 NetworkManager[48970]: <info>  [1769847442.4673] manager: (tap3b96fe71-71): new Tun device (/org/freedesktop/NetworkManager/Devices/189)
Jan 31 03:17:22 np0005603623 ovn_controller[133449]: 2026-01-31T08:17:22Z|00393|binding|INFO|Claiming lport 3b96fe71-71a3-4cca-8b42-2f004d1114ea for this chassis.
Jan 31 03:17:22 np0005603623 nova_compute[226235]: 2026-01-31 08:17:22.469 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:22 np0005603623 ovn_controller[133449]: 2026-01-31T08:17:22Z|00394|binding|INFO|3b96fe71-71a3-4cca-8b42-2f004d1114ea: Claiming fa:16:3e:40:6b:d9 10.100.0.6
Jan 31 03:17:22 np0005603623 nova_compute[226235]: 2026-01-31 08:17:22.474 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:22 np0005603623 systemd-machined[194379]: New machine qemu-43-instance-00000063.
Jan 31 03:17:22 np0005603623 ovn_controller[133449]: 2026-01-31T08:17:22Z|00395|binding|INFO|Setting lport 3b96fe71-71a3-4cca-8b42-2f004d1114ea ovn-installed in OVS
Jan 31 03:17:22 np0005603623 nova_compute[226235]: 2026-01-31 08:17:22.493 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:22 np0005603623 systemd[1]: Started Virtual Machine qemu-43-instance-00000063.
Jan 31 03:17:22 np0005603623 systemd-udevd[268539]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:17:22 np0005603623 NetworkManager[48970]: <info>  [1769847442.5218] device (tap3b96fe71-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:17:22 np0005603623 NetworkManager[48970]: <info>  [1769847442.5223] device (tap3b96fe71-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:17:23 np0005603623 ovn_controller[133449]: 2026-01-31T08:17:23Z|00396|binding|INFO|Setting lport 3b96fe71-71a3-4cca-8b42-2f004d1114ea up in Southbound
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.294 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:6b:d9 10.100.0.6'], port_security=['fa:16:3e:40:6b:d9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'd6e7c764-5ddd-4bee-b6b4-a7a93994c7b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eafe22d6cfcb41d4b31597087498a565', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2748ecb7-7ea4-47e3-84b3-eb7c4d3ddc31', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27d5be35-4a5e-4d77-b9b0-f9a41d41dd18, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=3b96fe71-71a3-4cca-8b42-2f004d1114ea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.296 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 3b96fe71-71a3-4cca-8b42-2f004d1114ea in datapath 4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 bound to our chassis#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.298 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.305 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d2f003d3-3abf-4383-89f3-64d363f74c81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.306 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4ba4d6d9-c1 in ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.310 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4ba4d6d9-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.310 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c56ef05d-93f8-4999-a4b4-a36ad2a88511]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.311 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bf36270f-f672-456e-8822-26a142f8d0bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.320 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[c1bf7c40-0af7-4ca4-90f9-179d94e3e6e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.329 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[840fa980-72be-4600-a216-ac8fc74cb41b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.356 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[386217f7-4bf8-400c-a05f-e907fa5c244a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 NetworkManager[48970]: <info>  [1769847443.3610] manager: (tap4ba4d6d9-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/190)
Jan 31 03:17:23 np0005603623 systemd-udevd[268543]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.361 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b01da52f-ec9a-4287-b933-ee6cb9adef05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.387 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847443.3863711, d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.387 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] VM Started (Lifecycle Event)#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.388 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[884907d9-fd86-4525-96fe-c8a72ebd077f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.391 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[97556f68-c89c-49b6-93ed-5f0eee0d853b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 NetworkManager[48970]: <info>  [1769847443.4071] device (tap4ba4d6d9-c0): carrier: link connected
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.406 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847428.404279, b5409d3b-1f0a-476f-b537-91e30f6543fd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.407 226239 INFO nova.compute.manager [-] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.408 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[680daad1-61d5-4907-acf3-0c14ecd96d17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.418 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[df345290-23b8-4473-8b16-261f795211b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4ba4d6d9-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:09:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 658383, 'reachable_time': 38191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268614, 'error': None, 'target': 'ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.425 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4a8f1b1b-1496-4c7b-9418-219065869c6b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 658383, 'tstamp': 658383}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 268615, 'error': None, 'target': 'ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.433 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2c184f3e-df8c-412f-bba8-a0f127666b2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4ba4d6d9-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:09:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 658383, 'reachable_time': 38191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 268616, 'error': None, 'target': 'ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.448 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e9f8a7cd-4f74-4d77-b32e-53361d3c7d60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.478 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[64c0c8a5-9b20-4ef4-9ec2-ed754eeccfd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.480 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ba4d6d9-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.480 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.480 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ba4d6d9-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:23 np0005603623 NetworkManager[48970]: <info>  [1769847443.4826] manager: (tap4ba4d6d9-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.482 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:23 np0005603623 kernel: tap4ba4d6d9-c0: entered promiscuous mode
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.484 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4ba4d6d9-c0, col_values=(('external_ids', {'iface-id': 'f9d388c2-0e9b-4991-9c20-42c713a8ba0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:23 np0005603623 ovn_controller[133449]: 2026-01-31T08:17:23Z|00397|binding|INFO|Releasing lport f9d388c2-0e9b-4991-9c20-42c713a8ba0d from this chassis (sb_readonly=1)
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.486 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.486 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.487 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e2daec1b-97d6-463c-8ee8-aaeaca453f55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.488 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9.pid.haproxy
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:17:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:23.488 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'env', 'PROCESS_TAG=haproxy-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.489 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.623 226239 DEBUG nova.compute.manager [None req-74d6fa76-d9f8-4f23-b3c8-c4f16f1ac360 - - - - - -] [instance: b5409d3b-1f0a-476f-b537-91e30f6543fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.639 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.643 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847443.386635, d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.643 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.722 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.725 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.785 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:23 np0005603623 podman[268647]: 2026-01-31 08:17:23.803961541 +0000 UTC m=+0.052491566 container create d7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 03:17:23 np0005603623 nova_compute[226235]: 2026-01-31 08:17:23.838 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:17:23 np0005603623 systemd[1]: Started libpod-conmon-d7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a.scope.
Jan 31 03:17:23 np0005603623 podman[268647]: 2026-01-31 08:17:23.772219751 +0000 UTC m=+0.020749786 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:17:23 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:17:23 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74d91701d5efb46a14cfc36d140a4fab75534a2e4d09801b2a1bbc63499a8c72/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:17:23 np0005603623 podman[268647]: 2026-01-31 08:17:23.890044115 +0000 UTC m=+0.138574170 container init d7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:17:23 np0005603623 podman[268647]: 2026-01-31 08:17:23.894581468 +0000 UTC m=+0.143111503 container start d7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:17:23 np0005603623 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[268663]: [NOTICE]   (268667) : New worker (268669) forked
Jan 31 03:17:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:23.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:23 np0005603623 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[268663]: [NOTICE]   (268667) : Loading success.
Jan 31 03:17:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:24.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:24 np0005603623 nova_compute[226235]: 2026-01-31 08:17:24.755 226239 DEBUG nova.network.neutron [req-986fbf3c-d5ea-4f65-8ca0-20538e77bd97 req-8e4c851f-e957-424c-bbef-6f265b74e38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Updated VIF entry in instance network info cache for port 3b96fe71-71a3-4cca-8b42-2f004d1114ea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:17:24 np0005603623 nova_compute[226235]: 2026-01-31 08:17:24.756 226239 DEBUG nova.network.neutron [req-986fbf3c-d5ea-4f65-8ca0-20538e77bd97 req-8e4c851f-e957-424c-bbef-6f265b74e38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Updating instance_info_cache with network_info: [{"id": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "address": "fa:16:3e:40:6b:d9", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b96fe71-71", "ovs_interfaceid": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:17:24 np0005603623 nova_compute[226235]: 2026-01-31 08:17:24.991 226239 DEBUG nova.compute.manager [req-b8f47595-cb6f-4f5b-bd8e-b30fc24581bd req-2ee4fb0c-62bf-483e-a551-d6ee40303277 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Received event network-vif-plugged-3b96fe71-71a3-4cca-8b42-2f004d1114ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:24 np0005603623 nova_compute[226235]: 2026-01-31 08:17:24.991 226239 DEBUG oslo_concurrency.lockutils [req-b8f47595-cb6f-4f5b-bd8e-b30fc24581bd req-2ee4fb0c-62bf-483e-a551-d6ee40303277 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:24 np0005603623 nova_compute[226235]: 2026-01-31 08:17:24.992 226239 DEBUG oslo_concurrency.lockutils [req-b8f47595-cb6f-4f5b-bd8e-b30fc24581bd req-2ee4fb0c-62bf-483e-a551-d6ee40303277 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:24 np0005603623 nova_compute[226235]: 2026-01-31 08:17:24.992 226239 DEBUG oslo_concurrency.lockutils [req-b8f47595-cb6f-4f5b-bd8e-b30fc24581bd req-2ee4fb0c-62bf-483e-a551-d6ee40303277 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:24 np0005603623 nova_compute[226235]: 2026-01-31 08:17:24.992 226239 DEBUG nova.compute.manager [req-b8f47595-cb6f-4f5b-bd8e-b30fc24581bd req-2ee4fb0c-62bf-483e-a551-d6ee40303277 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Processing event network-vif-plugged-3b96fe71-71a3-4cca-8b42-2f004d1114ea _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:17:24 np0005603623 nova_compute[226235]: 2026-01-31 08:17:24.993 226239 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:17:24 np0005603623 nova_compute[226235]: 2026-01-31 08:17:24.997 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847444.9967985, d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:17:24 np0005603623 nova_compute[226235]: 2026-01-31 08:17:24.997 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:17:24 np0005603623 nova_compute[226235]: 2026-01-31 08:17:24.999 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.002 226239 INFO nova.virt.libvirt.driver [-] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Instance spawned successfully.#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.002 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.071 226239 DEBUG oslo_concurrency.lockutils [req-986fbf3c-d5ea-4f65-8ca0-20538e77bd97 req-8e4c851f-e957-424c-bbef-6f265b74e38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.110 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.110 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.111 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.111 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.112 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.112 226239 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.117 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.120 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.170 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.598 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.691 226239 INFO nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Took 19.37 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.692 226239 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.839 226239 INFO nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Took 21.85 seconds to build instance.#033[00m
Jan 31 03:17:25 np0005603623 nova_compute[226235]: 2026-01-31 08:17:25.913 226239 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:25.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:26.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:27 np0005603623 nova_compute[226235]: 2026-01-31 08:17:27.531 226239 DEBUG nova.compute.manager [req-c67e3184-3735-4b0c-80b9-ae14ce6c0d7b req-9be3a479-9c77-497e-b854-ddb586d39191 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Received event network-vif-plugged-3b96fe71-71a3-4cca-8b42-2f004d1114ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:27 np0005603623 nova_compute[226235]: 2026-01-31 08:17:27.532 226239 DEBUG oslo_concurrency.lockutils [req-c67e3184-3735-4b0c-80b9-ae14ce6c0d7b req-9be3a479-9c77-497e-b854-ddb586d39191 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:27 np0005603623 nova_compute[226235]: 2026-01-31 08:17:27.532 226239 DEBUG oslo_concurrency.lockutils [req-c67e3184-3735-4b0c-80b9-ae14ce6c0d7b req-9be3a479-9c77-497e-b854-ddb586d39191 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:27 np0005603623 nova_compute[226235]: 2026-01-31 08:17:27.533 226239 DEBUG oslo_concurrency.lockutils [req-c67e3184-3735-4b0c-80b9-ae14ce6c0d7b req-9be3a479-9c77-497e-b854-ddb586d39191 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:27 np0005603623 nova_compute[226235]: 2026-01-31 08:17:27.533 226239 DEBUG nova.compute.manager [req-c67e3184-3735-4b0c-80b9-ae14ce6c0d7b req-9be3a479-9c77-497e-b854-ddb586d39191 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] No waiting events found dispatching network-vif-plugged-3b96fe71-71a3-4cca-8b42-2f004d1114ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:27 np0005603623 nova_compute[226235]: 2026-01-31 08:17:27.533 226239 WARNING nova.compute.manager [req-c67e3184-3735-4b0c-80b9-ae14ce6c0d7b req-9be3a479-9c77-497e-b854-ddb586d39191 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Received unexpected event network-vif-plugged-3b96fe71-71a3-4cca-8b42-2f004d1114ea for instance with vm_state active and task_state None.#033[00m
Jan 31 03:17:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:27.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:28.068 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:17:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:28.069 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:17:28 np0005603623 nova_compute[226235]: 2026-01-31 08:17:28.116 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:28.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:28 np0005603623 nova_compute[226235]: 2026-01-31 08:17:28.788 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:28 np0005603623 ovn_controller[133449]: 2026-01-31T08:17:28Z|00398|binding|INFO|Releasing lport f9d388c2-0e9b-4991-9c20-42c713a8ba0d from this chassis (sb_readonly=0)
Jan 31 03:17:28 np0005603623 nova_compute[226235]: 2026-01-31 08:17:28.828 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:29.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:30.109 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:30.110 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:30.110 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:30 np0005603623 nova_compute[226235]: 2026-01-31 08:17:30.173 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:30.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:31 np0005603623 nova_compute[226235]: 2026-01-31 08:17:31.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:31 np0005603623 nova_compute[226235]: 2026-01-31 08:17:31.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:17:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:31.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:32.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:33 np0005603623 nova_compute[226235]: 2026-01-31 08:17:33.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:33 np0005603623 nova_compute[226235]: 2026-01-31 08:17:33.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:17:33 np0005603623 nova_compute[226235]: 2026-01-31 08:17:33.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:17:33 np0005603623 nova_compute[226235]: 2026-01-31 08:17:33.789 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:33.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:34 np0005603623 nova_compute[226235]: 2026-01-31 08:17:34.149 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:17:34 np0005603623 nova_compute[226235]: 2026-01-31 08:17:34.150 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:17:34 np0005603623 nova_compute[226235]: 2026-01-31 08:17:34.151 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:17:34 np0005603623 nova_compute[226235]: 2026-01-31 08:17:34.151 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:17:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:34.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:34 np0005603623 podman[268865]: 2026-01-31 08:17:34.967192374 +0000 UTC m=+0.055993226 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:17:34 np0005603623 podman[268866]: 2026-01-31 08:17:34.995685623 +0000 UTC m=+0.083644228 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 31 03:17:35 np0005603623 nova_compute[226235]: 2026-01-31 08:17:35.175 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:35.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:36.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:17:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:17:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:17:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:17:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:17:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:37.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:17:38.072 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:38.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:38 np0005603623 nova_compute[226235]: 2026-01-31 08:17:38.791 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.048 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Updating instance_info_cache with network_info: [{"id": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "address": "fa:16:3e:40:6b:d9", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b96fe71-71", "ovs_interfaceid": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:17:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.286 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.287 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.287 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.287 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.287 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.288 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.359 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.359 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.360 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.360 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.360 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:39 np0005603623 ovn_controller[133449]: 2026-01-31T08:17:39Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:40:6b:d9 10.100.0.6
Jan 31 03:17:39 np0005603623 ovn_controller[133449]: 2026-01-31T08:17:39Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:40:6b:d9 10.100.0.6
Jan 31 03:17:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:17:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/86268056' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.771 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:39.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.971 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000063 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:17:39 np0005603623 nova_compute[226235]: 2026-01-31 08:17:39.971 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000063 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:17:40 np0005603623 nova_compute[226235]: 2026-01-31 08:17:40.116 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:17:40 np0005603623 nova_compute[226235]: 2026-01-31 08:17:40.117 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4399MB free_disk=20.92306137084961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:17:40 np0005603623 nova_compute[226235]: 2026-01-31 08:17:40.117 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:40 np0005603623 nova_compute[226235]: 2026-01-31 08:17:40.117 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:40 np0005603623 nova_compute[226235]: 2026-01-31 08:17:40.179 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:40.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:40 np0005603623 nova_compute[226235]: 2026-01-31 08:17:40.416 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:17:40 np0005603623 nova_compute[226235]: 2026-01-31 08:17:40.417 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:17:40 np0005603623 nova_compute[226235]: 2026-01-31 08:17:40.417 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:17:40 np0005603623 nova_compute[226235]: 2026-01-31 08:17:40.526 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:17:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1454815784' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:17:40 np0005603623 nova_compute[226235]: 2026-01-31 08:17:40.962 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:40 np0005603623 nova_compute[226235]: 2026-01-31 08:17:40.967 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:17:41 np0005603623 nova_compute[226235]: 2026-01-31 08:17:41.094 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:17:41 np0005603623 nova_compute[226235]: 2026-01-31 08:17:41.401 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:17:41 np0005603623 nova_compute[226235]: 2026-01-31 08:17:41.402 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:41.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:42.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:43 np0005603623 nova_compute[226235]: 2026-01-31 08:17:43.793 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:43.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:44.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:17:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:17:45 np0005603623 nova_compute[226235]: 2026-01-31 08:17:45.182 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:45 np0005603623 nova_compute[226235]: 2026-01-31 08:17:45.269 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:45 np0005603623 nova_compute[226235]: 2026-01-31 08:17:45.270 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:45.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:46.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:47 np0005603623 nova_compute[226235]: 2026-01-31 08:17:47.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:47.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:48.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:48 np0005603623 nova_compute[226235]: 2026-01-31 08:17:48.794 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:17:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:49.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:17:50 np0005603623 nova_compute[226235]: 2026-01-31 08:17:50.184 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:50.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:51.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:52.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:53 np0005603623 nova_compute[226235]: 2026-01-31 08:17:53.851 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:53.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:54.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:55 np0005603623 nova_compute[226235]: 2026-01-31 08:17:55.186 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:17:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:55.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:17:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:56.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:57.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:58.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:58 np0005603623 nova_compute[226235]: 2026-01-31 08:17:58.852 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:59 np0005603623 nova_compute[226235]: 2026-01-31 08:17:59.866 226239 DEBUG oslo_concurrency.lockutils [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:59 np0005603623 nova_compute[226235]: 2026-01-31 08:17:59.867 226239 DEBUG oslo_concurrency.lockutils [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:59 np0005603623 nova_compute[226235]: 2026-01-31 08:17:59.868 226239 DEBUG oslo_concurrency.lockutils [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:59 np0005603623 nova_compute[226235]: 2026-01-31 08:17:59.868 226239 DEBUG oslo_concurrency.lockutils [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:59 np0005603623 nova_compute[226235]: 2026-01-31 08:17:59.868 226239 DEBUG oslo_concurrency.lockutils [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:59 np0005603623 nova_compute[226235]: 2026-01-31 08:17:59.870 226239 INFO nova.compute.manager [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Terminating instance#033[00m
Jan 31 03:17:59 np0005603623 nova_compute[226235]: 2026-01-31 08:17:59.871 226239 DEBUG nova.compute.manager [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:17:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:17:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:59.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:00 np0005603623 kernel: tap3b96fe71-71 (unregistering): left promiscuous mode
Jan 31 03:18:00 np0005603623 NetworkManager[48970]: <info>  [1769847480.0304] device (tap3b96fe71-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:18:00 np0005603623 ovn_controller[133449]: 2026-01-31T08:18:00Z|00399|binding|INFO|Releasing lport 3b96fe71-71a3-4cca-8b42-2f004d1114ea from this chassis (sb_readonly=0)
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.038 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603623 ovn_controller[133449]: 2026-01-31T08:18:00Z|00400|binding|INFO|Setting lport 3b96fe71-71a3-4cca-8b42-2f004d1114ea down in Southbound
Jan 31 03:18:00 np0005603623 ovn_controller[133449]: 2026-01-31T08:18:00Z|00401|binding|INFO|Removing iface tap3b96fe71-71 ovn-installed in OVS
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.040 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.048 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603623 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000063.scope: Deactivated successfully.
Jan 31 03:18:00 np0005603623 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000063.scope: Consumed 13.834s CPU time.
Jan 31 03:18:00 np0005603623 systemd-machined[194379]: Machine qemu-43-instance-00000063 terminated.
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.087 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.089 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.103 226239 INFO nova.virt.libvirt.driver [-] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Instance destroyed successfully.#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.104 226239 DEBUG nova.objects.instance [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lazy-loading 'resources' on Instance uuid d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.188 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.200 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:6b:d9 10.100.0.6'], port_security=['fa:16:3e:40:6b:d9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'd6e7c764-5ddd-4bee-b6b4-a7a93994c7b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eafe22d6cfcb41d4b31597087498a565', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2748ecb7-7ea4-47e3-84b3-eb7c4d3ddc31', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27d5be35-4a5e-4d77-b9b0-f9a41d41dd18, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=3b96fe71-71a3-4cca-8b42-2f004d1114ea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.201 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 3b96fe71-71a3-4cca-8b42-2f004d1114ea in datapath 4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 unbound from our chassis#033[00m
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.203 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.204 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d1bf8890-c5b8-4b97-95e1-b052d6cd31d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.204 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 namespace which is not needed anymore#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.258 226239 DEBUG nova.virt.libvirt.vif [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:16:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-603829389',display_name='tempest-ListServersNegativeTestJSON-server-603829389-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-603829389-2',id=99,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-31T08:17:25Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eafe22d6cfcb41d4b31597087498a565',ramdisk_id='',reservation_id='r-5jowh2i5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-79577656',owner_user_name='tempest-ListServersNegativeTestJSON-79577656-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:17:25Z,user_data=None,user_id='8db5a8acb6d04c988f9dd4f74380c487',uuid=d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "address": "fa:16:3e:40:6b:d9", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b96fe71-71", "ovs_interfaceid": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.259 226239 DEBUG nova.network.os_vif_util [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Converting VIF {"id": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "address": "fa:16:3e:40:6b:d9", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b96fe71-71", "ovs_interfaceid": "3b96fe71-71a3-4cca-8b42-2f004d1114ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.259 226239 DEBUG nova.network.os_vif_util [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:40:6b:d9,bridge_name='br-int',has_traffic_filtering=True,id=3b96fe71-71a3-4cca-8b42-2f004d1114ea,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b96fe71-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.260 226239 DEBUG os_vif [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:40:6b:d9,bridge_name='br-int',has_traffic_filtering=True,id=3b96fe71-71a3-4cca-8b42-2f004d1114ea,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b96fe71-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.261 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.261 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b96fe71-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.262 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.264 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.266 226239 INFO os_vif [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:40:6b:d9,bridge_name='br-int',has_traffic_filtering=True,id=3b96fe71-71a3-4cca-8b42-2f004d1114ea,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b96fe71-71')#033[00m
Jan 31 03:18:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:00.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:00 np0005603623 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[268663]: [NOTICE]   (268667) : haproxy version is 2.8.14-c23fe91
Jan 31 03:18:00 np0005603623 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[268663]: [NOTICE]   (268667) : path to executable is /usr/sbin/haproxy
Jan 31 03:18:00 np0005603623 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[268663]: [WARNING]  (268667) : Exiting Master process...
Jan 31 03:18:00 np0005603623 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[268663]: [ALERT]    (268667) : Current worker (268669) exited with code 143 (Terminated)
Jan 31 03:18:00 np0005603623 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[268663]: [WARNING]  (268667) : All workers exited. Exiting... (0)
Jan 31 03:18:00 np0005603623 systemd[1]: libpod-d7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a.scope: Deactivated successfully.
Jan 31 03:18:00 np0005603623 podman[269098]: 2026-01-31 08:18:00.453325881 +0000 UTC m=+0.183335790 container died d7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:18:00 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a-userdata-shm.mount: Deactivated successfully.
Jan 31 03:18:00 np0005603623 systemd[1]: var-lib-containers-storage-overlay-74d91701d5efb46a14cfc36d140a4fab75534a2e4d09801b2a1bbc63499a8c72-merged.mount: Deactivated successfully.
Jan 31 03:18:00 np0005603623 podman[269098]: 2026-01-31 08:18:00.66927173 +0000 UTC m=+0.399281639 container cleanup d7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 03:18:00 np0005603623 systemd[1]: libpod-conmon-d7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a.scope: Deactivated successfully.
Jan 31 03:18:00 np0005603623 podman[269150]: 2026-01-31 08:18:00.753760203 +0000 UTC m=+0.071155033 container remove d7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.758 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5098f680-c020-4825-b18b-a9105bb3a26d]: (4, ('Sat Jan 31 08:18:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 (d7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a)\nd7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a\nSat Jan 31 08:18:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 (d7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a)\nd7b1338522d2a396bd7ba4cf9c93d7fb0937918ece28351074036c0ddbdf9c2a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.760 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[16e35ca8-2142-4e21-8b36-56cb81351fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.761 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ba4d6d9-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:00 np0005603623 kernel: tap4ba4d6d9-c0: left promiscuous mode
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.763 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.769 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.772 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[64139558-72cf-41ad-b5ff-73b8b2e102d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.788 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[837f11fd-be22-4272-91c3-8f9d66f20537]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.789 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[49307633-33e5-4669-94fb-241516bff5b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.801 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6a44a5d5-6ef6-4043-bbff-d8e9fdf1070f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 658378, 'reachable_time': 28334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 269165, 'error': None, 'target': 'ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:00 np0005603623 systemd[1]: run-netns-ovnmeta\x2d4ba4d6d9\x2dcb51\x2d4c5e\x2d9b78\x2ddca15ca271c9.mount: Deactivated successfully.
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.805 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:18:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:00.805 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4ab183-58fa-40b2-8882-13f9e048c499]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.902 226239 DEBUG nova.compute.manager [req-fbcd150b-7dd5-407c-bf21-b26cda943a91 req-8eec9179-a590-47dd-82c6-3208a9b30fff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Received event network-vif-unplugged-3b96fe71-71a3-4cca-8b42-2f004d1114ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.903 226239 DEBUG oslo_concurrency.lockutils [req-fbcd150b-7dd5-407c-bf21-b26cda943a91 req-8eec9179-a590-47dd-82c6-3208a9b30fff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.904 226239 DEBUG oslo_concurrency.lockutils [req-fbcd150b-7dd5-407c-bf21-b26cda943a91 req-8eec9179-a590-47dd-82c6-3208a9b30fff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.904 226239 DEBUG oslo_concurrency.lockutils [req-fbcd150b-7dd5-407c-bf21-b26cda943a91 req-8eec9179-a590-47dd-82c6-3208a9b30fff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.904 226239 DEBUG nova.compute.manager [req-fbcd150b-7dd5-407c-bf21-b26cda943a91 req-8eec9179-a590-47dd-82c6-3208a9b30fff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] No waiting events found dispatching network-vif-unplugged-3b96fe71-71a3-4cca-8b42-2f004d1114ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:18:00 np0005603623 nova_compute[226235]: 2026-01-31 08:18:00.905 226239 DEBUG nova.compute.manager [req-fbcd150b-7dd5-407c-bf21-b26cda943a91 req-8eec9179-a590-47dd-82c6-3208a9b30fff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Received event network-vif-unplugged-3b96fe71-71a3-4cca-8b42-2f004d1114ea for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 31 03:18:01 np0005603623 nova_compute[226235]: 2026-01-31 08:18:01.895 226239 INFO nova.virt.libvirt.driver [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Deleting instance files /var/lib/nova/instances/d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_del
Jan 31 03:18:01 np0005603623 nova_compute[226235]: 2026-01-31 08:18:01.896 226239 INFO nova.virt.libvirt.driver [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Deletion of /var/lib/nova/instances/d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1_del complete
Jan 31 03:18:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:01.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:02 np0005603623 nova_compute[226235]: 2026-01-31 08:18:02.105 226239 INFO nova.compute.manager [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Took 2.23 seconds to destroy the instance on the hypervisor.
Jan 31 03:18:02 np0005603623 nova_compute[226235]: 2026-01-31 08:18:02.106 226239 DEBUG oslo.service.loopingcall [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 31 03:18:02 np0005603623 nova_compute[226235]: 2026-01-31 08:18:02.106 226239 DEBUG nova.compute.manager [-] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 31 03:18:02 np0005603623 nova_compute[226235]: 2026-01-31 08:18:02.106 226239 DEBUG nova.network.neutron [-] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 31 03:18:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:02.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:03 np0005603623 nova_compute[226235]: 2026-01-31 08:18:03.421 226239 DEBUG nova.compute.manager [req-7f420065-4278-45f8-918f-6e881ab64285 req-563e6d3c-e61a-4466-ab4e-1324645b5630 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Received event network-vif-plugged-3b96fe71-71a3-4cca-8b42-2f004d1114ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:18:03 np0005603623 nova_compute[226235]: 2026-01-31 08:18:03.421 226239 DEBUG oslo_concurrency.lockutils [req-7f420065-4278-45f8-918f-6e881ab64285 req-563e6d3c-e61a-4466-ab4e-1324645b5630 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:18:03 np0005603623 nova_compute[226235]: 2026-01-31 08:18:03.422 226239 DEBUG oslo_concurrency.lockutils [req-7f420065-4278-45f8-918f-6e881ab64285 req-563e6d3c-e61a-4466-ab4e-1324645b5630 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:18:03 np0005603623 nova_compute[226235]: 2026-01-31 08:18:03.422 226239 DEBUG oslo_concurrency.lockutils [req-7f420065-4278-45f8-918f-6e881ab64285 req-563e6d3c-e61a-4466-ab4e-1324645b5630 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:18:03 np0005603623 nova_compute[226235]: 2026-01-31 08:18:03.422 226239 DEBUG nova.compute.manager [req-7f420065-4278-45f8-918f-6e881ab64285 req-563e6d3c-e61a-4466-ab4e-1324645b5630 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] No waiting events found dispatching network-vif-plugged-3b96fe71-71a3-4cca-8b42-2f004d1114ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:18:03 np0005603623 nova_compute[226235]: 2026-01-31 08:18:03.422 226239 WARNING nova.compute.manager [req-7f420065-4278-45f8-918f-6e881ab64285 req-563e6d3c-e61a-4466-ab4e-1324645b5630 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Received unexpected event network-vif-plugged-3b96fe71-71a3-4cca-8b42-2f004d1114ea for instance with vm_state active and task_state deleting.
Jan 31 03:18:03 np0005603623 nova_compute[226235]: 2026-01-31 08:18:03.894 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:03.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:04.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:05 np0005603623 nova_compute[226235]: 2026-01-31 08:18:05.264 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:05 np0005603623 podman[269219]: 2026-01-31 08:18:05.946524461 +0000 UTC m=+0.042625684 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 03:18:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:05.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:05 np0005603623 podman[269220]: 2026-01-31 08:18:05.972384716 +0000 UTC m=+0.068222072 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 31 03:18:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:06.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:07.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:08.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:08 np0005603623 nova_compute[226235]: 2026-01-31 08:18:08.529 226239 DEBUG nova.network.neutron [-] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:18:08 np0005603623 nova_compute[226235]: 2026-01-31 08:18:08.894 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:08 np0005603623 nova_compute[226235]: 2026-01-31 08:18:08.934 226239 INFO nova.compute.manager [-] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Took 6.83 seconds to deallocate network for instance.
Jan 31 03:18:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:09 np0005603623 nova_compute[226235]: 2026-01-31 08:18:09.420 226239 DEBUG oslo_concurrency.lockutils [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:18:09 np0005603623 nova_compute[226235]: 2026-01-31 08:18:09.421 226239 DEBUG oslo_concurrency.lockutils [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:18:09 np0005603623 nova_compute[226235]: 2026-01-31 08:18:09.499 226239 DEBUG oslo_concurrency.processutils [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:18:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:09.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:18:10 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2056771223' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:18:10 np0005603623 nova_compute[226235]: 2026-01-31 08:18:10.123 226239 DEBUG oslo_concurrency.processutils [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.623s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:18:10 np0005603623 nova_compute[226235]: 2026-01-31 08:18:10.129 226239 DEBUG nova.compute.provider_tree [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:18:10 np0005603623 nova_compute[226235]: 2026-01-31 08:18:10.268 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:10 np0005603623 nova_compute[226235]: 2026-01-31 08:18:10.305 226239 DEBUG nova.compute.manager [req-6e44ab8f-d9a7-476f-9a57-45364a8fdf07 req-48007027-c67d-4928-8ec5-df92ad605df0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Received event network-vif-deleted-3b96fe71-71a3-4cca-8b42-2f004d1114ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:18:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:10.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:10 np0005603623 nova_compute[226235]: 2026-01-31 08:18:10.433 226239 DEBUG nova.scheduler.client.report [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:18:10 np0005603623 nova_compute[226235]: 2026-01-31 08:18:10.550 226239 DEBUG oslo_concurrency.lockutils [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:18:10 np0005603623 nova_compute[226235]: 2026-01-31 08:18:10.654 226239 INFO nova.scheduler.client.report [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Deleted allocations for instance d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1
Jan 31 03:18:10 np0005603623 nova_compute[226235]: 2026-01-31 08:18:10.970 226239 DEBUG oslo_concurrency.lockutils [None req-131bc3e4-63fe-4f6f-86cd-6a53b0dd350c 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 11.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:18:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:11.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:12.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:13 np0005603623 nova_compute[226235]: 2026-01-31 08:18:13.895 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:18:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1276310961' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:18:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:18:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1276310961' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:18:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:13.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:14.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:18:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/433660857' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:18:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:18:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/433660857' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:18:15 np0005603623 nova_compute[226235]: 2026-01-31 08:18:15.102 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847480.101783, d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:18:15 np0005603623 nova_compute[226235]: 2026-01-31 08:18:15.103 226239 INFO nova.compute.manager [-] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] VM Stopped (Lifecycle Event)
Jan 31 03:18:15 np0005603623 nova_compute[226235]: 2026-01-31 08:18:15.260 226239 DEBUG nova.compute.manager [None req-c847334c-397e-4a27-8d6e-e93a06bc8d87 - - - - - -] [instance: d6e7c764-5ddd-4bee-b6b4-a7a93994c7b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:18:15 np0005603623 nova_compute[226235]: 2026-01-31 08:18:15.271 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:15.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:16.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:17.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:18.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:18 np0005603623 nova_compute[226235]: 2026-01-31 08:18:18.897 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:19.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:20 np0005603623 nova_compute[226235]: 2026-01-31 08:18:20.274 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:20.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:21.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:22.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:23 np0005603623 nova_compute[226235]: 2026-01-31 08:18:23.898 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:23.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:24.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:24 np0005603623 nova_compute[226235]: 2026-01-31 08:18:24.380 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #88. Immutable memtables: 0.
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:24.706811) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 53] Flushing memtable with next log file: 88
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847504706889, "job": 53, "event": "flush_started", "num_memtables": 1, "num_entries": 2391, "num_deletes": 252, "total_data_size": 5710197, "memory_usage": 5790976, "flush_reason": "Manual Compaction"}
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 53] Level-0 flush table #89: started
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847504785270, "cf_name": "default", "job": 53, "event": "table_file_creation", "file_number": 89, "file_size": 3744132, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 44843, "largest_seqno": 47228, "table_properties": {"data_size": 3734529, "index_size": 6033, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20336, "raw_average_key_size": 20, "raw_value_size": 3715173, "raw_average_value_size": 3760, "num_data_blocks": 263, "num_entries": 988, "num_filter_entries": 988, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847287, "oldest_key_time": 1769847287, "file_creation_time": 1769847504, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 89, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 53] Flush lasted 78503 microseconds, and 6192 cpu microseconds.
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:24.785313) [db/flush_job.cc:967] [default] [JOB 53] Level-0 flush table #89: 3744132 bytes OK
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:24.785332) [db/memtable_list.cc:519] [default] Level-0 commit table #89 started
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:24.819066) [db/memtable_list.cc:722] [default] Level-0 commit table #89: memtable #1 done
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:24.819118) EVENT_LOG_v1 {"time_micros": 1769847504819102, "job": 53, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:24.819143) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 53] Try to delete WAL files size 5699760, prev total WAL file size 5700041, number of live WAL files 2.
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000085.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:24.820150) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033373635' seq:72057594037927935, type:22 .. '7061786F730034303137' seq:0, type:0; will stop at (end)
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 54] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 53 Base level 0, inputs: [89(3656KB)], [87(9481KB)]
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847504820224, "job": 54, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [89], "files_L6": [87], "score": -1, "input_data_size": 13453174, "oldest_snapshot_seqno": -1}
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 54] Generated table #90: 7212 keys, 11566658 bytes, temperature: kUnknown
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847504973696, "cf_name": "default", "job": 54, "event": "table_file_creation", "file_number": 90, "file_size": 11566658, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11517698, "index_size": 29840, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18053, "raw_key_size": 185523, "raw_average_key_size": 25, "raw_value_size": 11388300, "raw_average_value_size": 1579, "num_data_blocks": 1183, "num_entries": 7212, "num_filter_entries": 7212, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769847504, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 90, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:18:24 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:24.973943) [db/compaction/compaction_job.cc:1663] [default] [JOB 54] Compacted 1@0 + 1@6 files to L6 => 11566658 bytes
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:25.010872) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 87.6 rd, 75.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 9.3 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(6.7) write-amplify(3.1) OK, records in: 7736, records dropped: 524 output_compression: NoCompression
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:25.010903) EVENT_LOG_v1 {"time_micros": 1769847505010891, "job": 54, "event": "compaction_finished", "compaction_time_micros": 153546, "compaction_time_cpu_micros": 19688, "output_level": 6, "num_output_files": 1, "total_output_size": 11566658, "num_input_records": 7736, "num_output_records": 7212, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000089.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847505011319, "job": 54, "event": "table_file_deletion", "file_number": 89}
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000087.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847505011889, "job": 54, "event": "table_file_deletion", "file_number": 87}
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:24.820019) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:25.011973) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:25.011979) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:25.011981) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:25.011983) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:18:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:18:25.011985) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:18:25 np0005603623 nova_compute[226235]: 2026-01-31 08:18:25.275 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:25.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:26.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:27.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:28.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:28.529 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:18:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:28.530 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:18:28 np0005603623 nova_compute[226235]: 2026-01-31 08:18:28.582 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:28 np0005603623 nova_compute[226235]: 2026-01-31 08:18:28.901 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:29.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:30.111 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:30.111 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:30.112 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:30 np0005603623 nova_compute[226235]: 2026-01-31 08:18:30.278 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:30.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:31.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:32 np0005603623 nova_compute[226235]: 2026-01-31 08:18:32.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:32 np0005603623 nova_compute[226235]: 2026-01-31 08:18:32.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:18:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:32.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:18:32.532 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:33 np0005603623 nova_compute[226235]: 2026-01-31 08:18:33.902 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:33.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:34.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:35 np0005603623 nova_compute[226235]: 2026-01-31 08:18:35.157 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:35 np0005603623 nova_compute[226235]: 2026-01-31 08:18:35.157 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:18:35 np0005603623 nova_compute[226235]: 2026-01-31 08:18:35.157 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:18:35 np0005603623 nova_compute[226235]: 2026-01-31 08:18:35.272 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:18:35 np0005603623 nova_compute[226235]: 2026-01-31 08:18:35.282 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:35.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:36.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:36 np0005603623 podman[269354]: 2026-01-31 08:18:36.941173896 +0000 UTC m=+0.038950978 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 03:18:36 np0005603623 podman[269355]: 2026-01-31 08:18:36.962027224 +0000 UTC m=+0.056904505 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, 
org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:18:37 np0005603623 nova_compute[226235]: 2026-01-31 08:18:37.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:37.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:38.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:38 np0005603623 nova_compute[226235]: 2026-01-31 08:18:38.905 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.228 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.229 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.229 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.229 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.229 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:18:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:18:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3149648822' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.639 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.770 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.772 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4619MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.772 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:39 np0005603623 nova_compute[226235]: 2026-01-31 08:18:39.772 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:39.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:40 np0005603623 nova_compute[226235]: 2026-01-31 08:18:40.166 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:18:40 np0005603623 nova_compute[226235]: 2026-01-31 08:18:40.166 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:18:40 np0005603623 nova_compute[226235]: 2026-01-31 08:18:40.191 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:18:40 np0005603623 nova_compute[226235]: 2026-01-31 08:18:40.286 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:40.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:18:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/746002587' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:18:40 np0005603623 nova_compute[226235]: 2026-01-31 08:18:40.637 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:18:40 np0005603623 nova_compute[226235]: 2026-01-31 08:18:40.641 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:18:40 np0005603623 nova_compute[226235]: 2026-01-31 08:18:40.685 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:18:40 np0005603623 nova_compute[226235]: 2026-01-31 08:18:40.764 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:18:40 np0005603623 nova_compute[226235]: 2026-01-31 08:18:40.764 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:41 np0005603623 nova_compute[226235]: 2026-01-31 08:18:41.760 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:41.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:42.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:43 np0005603623 nova_compute[226235]: 2026-01-31 08:18:43.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:43 np0005603623 nova_compute[226235]: 2026-01-31 08:18:43.907 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:43.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:44.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:45 np0005603623 nova_compute[226235]: 2026-01-31 08:18:45.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:45 np0005603623 nova_compute[226235]: 2026-01-31 08:18:45.288 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:45.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:46.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:48.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:48.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:48 np0005603623 nova_compute[226235]: 2026-01-31 08:18:48.909 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:18:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:18:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:18:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:18:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:18:49 np0005603623 nova_compute[226235]: 2026-01-31 08:18:49.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:50.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:50 np0005603623 nova_compute[226235]: 2026-01-31 08:18:50.291 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:50.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:52.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:52.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:53 np0005603623 nova_compute[226235]: 2026-01-31 08:18:53.910 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:54.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:18:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:54.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:18:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:55 np0005603623 nova_compute[226235]: 2026-01-31 08:18:55.293 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:56.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:56.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:58.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:18:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:58.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:58 np0005603623 nova_compute[226235]: 2026-01-31 08:18:58.912 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:19:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:00.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:19:00 np0005603623 nova_compute[226235]: 2026-01-31 08:19:00.296 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:00.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:02.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:19:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:02.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:19:03 np0005603623 nova_compute[226235]: 2026-01-31 08:19:03.964 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:04.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:04.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:05 np0005603623 nova_compute[226235]: 2026-01-31 08:19:05.298 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:19:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:06.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:19:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:19:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:19:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:06.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:07 np0005603623 podman[269740]: 2026-01-31 08:19:07.975577411 +0000 UTC m=+0.069301606 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 31 03:19:07 np0005603623 podman[269739]: 2026-01-31 08:19:07.980181996 +0000 UTC m=+0.074157039 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:19:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:19:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:08.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:19:08 np0005603623 nova_compute[226235]: 2026-01-31 08:19:08.384 226239 DEBUG oslo_concurrency.lockutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Acquiring lock "2d301840-aeb7-41c2-8de1-80b4ed9c22bc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:08 np0005603623 nova_compute[226235]: 2026-01-31 08:19:08.384 226239 DEBUG oslo_concurrency.lockutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "2d301840-aeb7-41c2-8de1-80b4ed9c22bc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:08.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:08 np0005603623 nova_compute[226235]: 2026-01-31 08:19:08.456 226239 DEBUG nova.compute.manager [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:19:08 np0005603623 nova_compute[226235]: 2026-01-31 08:19:08.613 226239 DEBUG oslo_concurrency.lockutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:08 np0005603623 nova_compute[226235]: 2026-01-31 08:19:08.613 226239 DEBUG oslo_concurrency.lockutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:08 np0005603623 nova_compute[226235]: 2026-01-31 08:19:08.619 226239 DEBUG nova.virt.hardware [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:19:08 np0005603623 nova_compute[226235]: 2026-01-31 08:19:08.620 226239 INFO nova.compute.claims [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:19:08 np0005603623 nova_compute[226235]: 2026-01-31 08:19:08.882 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:08 np0005603623 nova_compute[226235]: 2026-01-31 08:19:08.967 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:19:09 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1039928736' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:19:09 np0005603623 nova_compute[226235]: 2026-01-31 08:19:09.261 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.378s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:09 np0005603623 nova_compute[226235]: 2026-01-31 08:19:09.265 226239 DEBUG nova.compute.provider_tree [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:19:09 np0005603623 nova_compute[226235]: 2026-01-31 08:19:09.331 226239 DEBUG nova.scheduler.client.report [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:19:09 np0005603623 nova_compute[226235]: 2026-01-31 08:19:09.442 226239 DEBUG oslo_concurrency.lockutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.828s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:19:09 np0005603623 nova_compute[226235]: 2026-01-31 08:19:09.443 226239 DEBUG nova.compute.manager [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:19:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:09 np0005603623 nova_compute[226235]: 2026-01-31 08:19:09.650 226239 DEBUG nova.compute.manager [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.020 226239 INFO nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:19:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:10.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:19:10.267 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:19:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:19:10.268 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.267 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.299 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:19:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:19:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:10.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.444 226239 DEBUG nova.compute.manager [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.742 226239 DEBUG nova.compute.manager [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.743 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.743 226239 INFO nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Creating image(s)
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.768 226239 DEBUG nova.storage.rbd_utils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] rbd image 2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.792 226239 DEBUG nova.storage.rbd_utils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] rbd image 2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.814 226239 DEBUG nova.storage.rbd_utils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] rbd image 2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.818 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.862 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.863 226239 DEBUG oslo_concurrency.lockutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.864 226239 DEBUG oslo_concurrency.lockutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.864 226239 DEBUG oslo_concurrency.lockutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.885 226239 DEBUG nova.storage.rbd_utils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] rbd image 2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:19:10 np0005603623 nova_compute[226235]: 2026-01-31 08:19:10.888 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.295 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:19:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:19:11Z|00402|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.437 226239 DEBUG nova.storage.rbd_utils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] resizing rbd image 2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.591 226239 DEBUG nova.objects.instance [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lazy-loading 'migration_context' on Instance uuid 2d301840-aeb7-41c2-8de1-80b4ed9c22bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.625 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.625 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Ensure instance console log exists: /var/lib/nova/instances/2d301840-aeb7-41c2-8de1-80b4ed9c22bc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.626 226239 DEBUG oslo_concurrency.lockutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.626 226239 DEBUG oslo_concurrency.lockutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.627 226239 DEBUG oslo_concurrency.lockutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.628 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.633 226239 WARNING nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.642 226239 DEBUG nova.virt.libvirt.host [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.643 226239 DEBUG nova.virt.libvirt.host [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.648 226239 DEBUG nova.virt.libvirt.host [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.649 226239 DEBUG nova.virt.libvirt.host [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.650 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.650 226239 DEBUG nova.virt.hardware [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.650 226239 DEBUG nova.virt.hardware [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.651 226239 DEBUG nova.virt.hardware [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.651 226239 DEBUG nova.virt.hardware [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.651 226239 DEBUG nova.virt.hardware [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.651 226239 DEBUG nova.virt.hardware [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.651 226239 DEBUG nova.virt.hardware [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.652 226239 DEBUG nova.virt.hardware [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.652 226239 DEBUG nova.virt.hardware [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.652 226239 DEBUG nova.virt.hardware [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.652 226239 DEBUG nova.virt.hardware [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 03:19:11 np0005603623 nova_compute[226235]: 2026-01-31 08:19:11.655 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:19:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:19:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:12.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:19:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:19:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3369580802' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:19:12 np0005603623 nova_compute[226235]: 2026-01-31 08:19:12.099 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:19:12 np0005603623 nova_compute[226235]: 2026-01-31 08:19:12.122 226239 DEBUG nova.storage.rbd_utils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] rbd image 2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:19:12 np0005603623 nova_compute[226235]: 2026-01-31 08:19:12.125 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:19:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:12.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:19:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3950122414' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:19:12 np0005603623 nova_compute[226235]: 2026-01-31 08:19:12.522 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:19:12 np0005603623 nova_compute[226235]: 2026-01-31 08:19:12.523 226239 DEBUG nova.objects.instance [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lazy-loading 'pci_devices' on Instance uuid 2d301840-aeb7-41c2-8de1-80b4ed9c22bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:19:12 np0005603623 nova_compute[226235]: 2026-01-31 08:19:12.676 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  <uuid>2d301840-aeb7-41c2-8de1-80b4ed9c22bc</uuid>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  <name>instance-00000067</name>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersAaction247Test-server-2101973623</nova:name>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:19:11</nova:creationTime>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <nova:user uuid="124a117e3bc74be7a699df447518bc54">tempest-ServersAaction247Test-777376341-project-member</nova:user>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <nova:project uuid="0470bddbdc05460087ff3c4b7dbb6dcd">tempest-ServersAaction247Test-777376341</nova:project>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <entry name="serial">2d301840-aeb7-41c2-8de1-80b4ed9c22bc</entry>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <entry name="uuid">2d301840-aeb7-41c2-8de1-80b4ed9c22bc</entry>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk.config">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/2d301840-aeb7-41c2-8de1-80b4ed9c22bc/console.log" append="off"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:19:12 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:19:12 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:19:12 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:19:12 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 03:19:12 np0005603623 nova_compute[226235]: 2026-01-31 08:19:12.858 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:19:12 np0005603623 nova_compute[226235]: 2026-01-31 08:19:12.860 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:19:12 np0005603623 nova_compute[226235]: 2026-01-31 08:19:12.860 226239 INFO nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Using config drive
Jan 31 03:19:12 np0005603623 nova_compute[226235]: 2026-01-31 08:19:12.884 226239 DEBUG nova.storage.rbd_utils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] rbd image 2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:19:13 np0005603623 nova_compute[226235]: 2026-01-31 08:19:13.796 226239 INFO nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Creating config drive at /var/lib/nova/instances/2d301840-aeb7-41c2-8de1-80b4ed9c22bc/disk.config
Jan 31 03:19:13 np0005603623 nova_compute[226235]: 2026-01-31 08:19:13.800 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2d301840-aeb7-41c2-8de1-80b4ed9c22bc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpazegrjfd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:19:13 np0005603623 nova_compute[226235]: 2026-01-31 08:19:13.920 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2d301840-aeb7-41c2-8de1-80b4ed9c22bc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpazegrjfd" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:19:13 np0005603623 nova_compute[226235]: 2026-01-31 08:19:13.945 226239 DEBUG nova.storage.rbd_utils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] rbd image 2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:19:13 np0005603623 nova_compute[226235]: 2026-01-31 08:19:13.948 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2d301840-aeb7-41c2-8de1-80b4ed9c22bc/disk.config 2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:19:13 np0005603623 nova_compute[226235]: 2026-01-31 08:19:13.968 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:19:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:19:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:14.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.140 226239 DEBUG oslo_concurrency.processutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2d301840-aeb7-41c2-8de1-80b4ed9c22bc/disk.config 2d301840-aeb7-41c2-8de1-80b4ed9c22bc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.192s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.141 226239 INFO nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Deleting local config drive /var/lib/nova/instances/2d301840-aeb7-41c2-8de1-80b4ed9c22bc/disk.config because it was imported into RBD.
Jan 31 03:19:14 np0005603623 systemd-machined[194379]: New machine qemu-44-instance-00000067.
Jan 31 03:19:14 np0005603623 systemd[1]: Started Virtual Machine qemu-44-instance-00000067.
Jan 31 03:19:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:14.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.716 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847554.7158494, 2d301840-aeb7-41c2-8de1-80b4ed9c22bc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.717 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] VM Resumed (Lifecycle Event)
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.720 226239 DEBUG nova.compute.manager [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.720 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.724 226239 INFO nova.virt.libvirt.driver [-] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Instance spawned successfully.
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.724 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.962 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.968 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.977 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.978 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.979 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.980 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.980 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:19:14 np0005603623 nova_compute[226235]: 2026-01-31 08:19:14.981 226239 DEBUG nova.virt.libvirt.driver [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:19:15 np0005603623 nova_compute[226235]: 2026-01-31 08:19:15.099 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:19:15 np0005603623 nova_compute[226235]: 2026-01-31 08:19:15.100 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847554.7168407, 2d301840-aeb7-41c2-8de1-80b4ed9c22bc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:19:15 np0005603623 nova_compute[226235]: 2026-01-31 08:19:15.100 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] VM Started (Lifecycle Event)
Jan 31 03:19:15 np0005603623 nova_compute[226235]: 2026-01-31 08:19:15.153 226239 INFO nova.compute.manager [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Took 4.41 seconds to spawn the instance on the hypervisor.
Jan 31 03:19:15 np0005603623 nova_compute[226235]: 2026-01-31 08:19:15.154 226239 DEBUG nova.compute.manager [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:19:15 np0005603623 nova_compute[226235]: 2026-01-31 08:19:15.227 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:19:15 np0005603623 nova_compute[226235]: 2026-01-31 08:19:15.229 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:19:15 np0005603623 nova_compute[226235]: 2026-01-31 08:19:15.301 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:19:15 np0005603623 nova_compute[226235]: 2026-01-31 08:19:15.315 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:19:15 np0005603623 nova_compute[226235]: 2026-01-31 08:19:15.364 226239 INFO nova.compute.manager [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Took 6.79 seconds to build instance.
Jan 31 03:19:15 np0005603623 nova_compute[226235]: 2026-01-31 08:19:15.569 226239 DEBUG oslo_concurrency.lockutils [None req-090fd7a1-68f8-4e17-a473-92437a1d04a7 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "2d301840-aeb7-41c2-8de1-80b4ed9c22bc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:19:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:16.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:19:16.270 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:19:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:16.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:19:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:18.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:19:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:18.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:18 np0005603623 nova_compute[226235]: 2026-01-31 08:19:18.969 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:19:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:19 np0005603623 nova_compute[226235]: 2026-01-31 08:19:19.531 226239 DEBUG nova.compute.manager [None req-b67411fa-b74b-495f-86fe-803cd0505101 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:19:19 np0005603623 nova_compute[226235]: 2026-01-31 08:19:19.582 226239 INFO nova.compute.manager [None req-b67411fa-b74b-495f-86fe-803cd0505101 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] instance snapshotting
Jan 31 03:19:19 np0005603623 nova_compute[226235]: 2026-01-31 08:19:19.583 226239 DEBUG nova.objects.instance [None req-b67411fa-b74b-495f-86fe-803cd0505101 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lazy-loading 'flavor' on Instance uuid 2d301840-aeb7-41c2-8de1-80b4ed9c22bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:19:19 np0005603623 nova_compute[226235]: 2026-01-31 08:19:19.971 226239 DEBUG oslo_concurrency.lockutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Acquiring lock "2d301840-aeb7-41c2-8de1-80b4ed9c22bc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:19:19 np0005603623 nova_compute[226235]: 2026-01-31 08:19:19.972 226239 DEBUG oslo_concurrency.lockutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "2d301840-aeb7-41c2-8de1-80b4ed9c22bc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:19:19 np0005603623 nova_compute[226235]: 2026-01-31 08:19:19.972 226239 DEBUG oslo_concurrency.lockutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Acquiring lock "2d301840-aeb7-41c2-8de1-80b4ed9c22bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:19:19 np0005603623 nova_compute[226235]: 2026-01-31 08:19:19.973 226239 DEBUG oslo_concurrency.lockutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "2d301840-aeb7-41c2-8de1-80b4ed9c22bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:19:19 np0005603623 nova_compute[226235]: 2026-01-31 08:19:19.973 226239 DEBUG oslo_concurrency.lockutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "2d301840-aeb7-41c2-8de1-80b4ed9c22bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:19:19 np0005603623 nova_compute[226235]: 2026-01-31 08:19:19.974 226239 INFO nova.compute.manager [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Terminating instance
Jan 31 03:19:19 np0005603623 nova_compute[226235]: 2026-01-31 08:19:19.975 226239 DEBUG oslo_concurrency.lockutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Acquiring lock "refresh_cache-2d301840-aeb7-41c2-8de1-80b4ed9c22bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:19:19 np0005603623 nova_compute[226235]: 2026-01-31 08:19:19.975 226239 DEBUG oslo_concurrency.lockutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Acquired lock "refresh_cache-2d301840-aeb7-41c2-8de1-80b4ed9c22bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:19:19 np0005603623 nova_compute[226235]: 2026-01-31 08:19:19.976 226239 DEBUG nova.network.neutron [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:19:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:20.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:20 np0005603623 nova_compute[226235]: 2026-01-31 08:19:20.304 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:19:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:20.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:20 np0005603623 nova_compute[226235]: 2026-01-31 08:19:20.473 226239 DEBUG nova.network.neutron [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:19:20 np0005603623 nova_compute[226235]: 2026-01-31 08:19:20.509 226239 INFO nova.virt.libvirt.driver [None req-b67411fa-b74b-495f-86fe-803cd0505101 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Beginning live snapshot process
Jan 31 03:19:20 np0005603623 nova_compute[226235]: 2026-01-31 08:19:20.709 226239 DEBUG nova.compute.manager [None req-b67411fa-b74b-495f-86fe-803cd0505101 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Instance disappeared during snapshot _snapshot_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:4390
Jan 31 03:19:21 np0005603623 nova_compute[226235]: 2026-01-31 08:19:21.056 226239 DEBUG nova.network.neutron [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:19:21 np0005603623 nova_compute[226235]: 2026-01-31 08:19:21.093 226239 DEBUG oslo_concurrency.lockutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Releasing lock "refresh_cache-2d301840-aeb7-41c2-8de1-80b4ed9c22bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:19:21 np0005603623 nova_compute[226235]: 2026-01-31 08:19:21.094 226239 DEBUG nova.compute.manager [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 03:19:21 np0005603623 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000067.scope: Deactivated successfully.
Jan 31 03:19:21 np0005603623 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000067.scope: Consumed 7.049s CPU time.
Jan 31 03:19:21 np0005603623 systemd-machined[194379]: Machine qemu-44-instance-00000067 terminated.
Jan 31 03:19:21 np0005603623 nova_compute[226235]: 2026-01-31 08:19:21.313 226239 INFO nova.virt.libvirt.driver [-] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Instance destroyed successfully.#033[00m
Jan 31 03:19:21 np0005603623 nova_compute[226235]: 2026-01-31 08:19:21.314 226239 DEBUG nova.objects.instance [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lazy-loading 'resources' on Instance uuid 2d301840-aeb7-41c2-8de1-80b4ed9c22bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:19:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:22.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:22 np0005603623 nova_compute[226235]: 2026-01-31 08:19:22.040 226239 DEBUG nova.compute.manager [None req-b67411fa-b74b-495f-86fe-803cd0505101 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Found 0 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Jan 31 03:19:22 np0005603623 nova_compute[226235]: 2026-01-31 08:19:22.219 226239 INFO nova.virt.libvirt.driver [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Deleting instance files /var/lib/nova/instances/2d301840-aeb7-41c2-8de1-80b4ed9c22bc_del#033[00m
Jan 31 03:19:22 np0005603623 nova_compute[226235]: 2026-01-31 08:19:22.219 226239 INFO nova.virt.libvirt.driver [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Deletion of /var/lib/nova/instances/2d301840-aeb7-41c2-8de1-80b4ed9c22bc_del complete#033[00m
Jan 31 03:19:22 np0005603623 nova_compute[226235]: 2026-01-31 08:19:22.386 226239 INFO nova.compute.manager [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Took 1.29 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:19:22 np0005603623 nova_compute[226235]: 2026-01-31 08:19:22.387 226239 DEBUG oslo.service.loopingcall [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:19:22 np0005603623 nova_compute[226235]: 2026-01-31 08:19:22.387 226239 DEBUG nova.compute.manager [-] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:19:22 np0005603623 nova_compute[226235]: 2026-01-31 08:19:22.387 226239 DEBUG nova.network.neutron [-] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:19:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:22.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:23 np0005603623 nova_compute[226235]: 2026-01-31 08:19:23.362 226239 DEBUG nova.network.neutron [-] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:19:23 np0005603623 nova_compute[226235]: 2026-01-31 08:19:23.421 226239 DEBUG nova.network.neutron [-] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:19:23 np0005603623 nova_compute[226235]: 2026-01-31 08:19:23.623 226239 INFO nova.compute.manager [-] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Took 1.24 seconds to deallocate network for instance.#033[00m
Jan 31 03:19:23 np0005603623 nova_compute[226235]: 2026-01-31 08:19:23.795 226239 DEBUG oslo_concurrency.lockutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:23 np0005603623 nova_compute[226235]: 2026-01-31 08:19:23.795 226239 DEBUG oslo_concurrency.lockutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:24 np0005603623 nova_compute[226235]: 2026-01-31 08:19:23.999 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:24.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:24.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:24 np0005603623 nova_compute[226235]: 2026-01-31 08:19:24.405 226239 DEBUG oslo_concurrency.processutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:19:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3562314615' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:19:24 np0005603623 nova_compute[226235]: 2026-01-31 08:19:24.812 226239 DEBUG oslo_concurrency.processutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:24 np0005603623 nova_compute[226235]: 2026-01-31 08:19:24.818 226239 DEBUG nova.compute.provider_tree [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:19:24 np0005603623 nova_compute[226235]: 2026-01-31 08:19:24.866 226239 DEBUG nova.scheduler.client.report [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:19:24 np0005603623 nova_compute[226235]: 2026-01-31 08:19:24.958 226239 DEBUG oslo_concurrency.lockutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:25 np0005603623 nova_compute[226235]: 2026-01-31 08:19:25.081 226239 INFO nova.scheduler.client.report [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Deleted allocations for instance 2d301840-aeb7-41c2-8de1-80b4ed9c22bc#033[00m
Jan 31 03:19:25 np0005603623 nova_compute[226235]: 2026-01-31 08:19:25.307 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:25 np0005603623 nova_compute[226235]: 2026-01-31 08:19:25.546 226239 DEBUG oslo_concurrency.lockutils [None req-c02652ec-532d-4829-a5af-0279fd291afb 124a117e3bc74be7a699df447518bc54 0470bddbdc05460087ff3c4b7dbb6dcd - - default default] Lock "2d301840-aeb7-41c2-8de1-80b4ed9c22bc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:26.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:26.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:28.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:28.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:29 np0005603623 nova_compute[226235]: 2026-01-31 08:19:29.002 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:30.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:19:30.112 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:19:30.112 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:19:30.112 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:30 np0005603623 nova_compute[226235]: 2026-01-31 08:19:30.310 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:30.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:32.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:32.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:34 np0005603623 nova_compute[226235]: 2026-01-31 08:19:34.004 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:34.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:34 np0005603623 nova_compute[226235]: 2026-01-31 08:19:34.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:34 np0005603623 nova_compute[226235]: 2026-01-31 08:19:34.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:19:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:34.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:35 np0005603623 nova_compute[226235]: 2026-01-31 08:19:35.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:35 np0005603623 nova_compute[226235]: 2026-01-31 08:19:35.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:19:35 np0005603623 nova_compute[226235]: 2026-01-31 08:19:35.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:19:35 np0005603623 nova_compute[226235]: 2026-01-31 08:19:35.203 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:19:35 np0005603623 nova_compute[226235]: 2026-01-31 08:19:35.313 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:36.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:36 np0005603623 nova_compute[226235]: 2026-01-31 08:19:36.311 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847561.31025, 2d301840-aeb7-41c2-8de1-80b4ed9c22bc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:19:36 np0005603623 nova_compute[226235]: 2026-01-31 08:19:36.312 226239 INFO nova.compute.manager [-] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:19:36 np0005603623 nova_compute[226235]: 2026-01-31 08:19:36.410 226239 DEBUG nova.compute.manager [None req-dcc3b70f-daa1-4d07-91fa-07b5dd324979 - - - - - -] [instance: 2d301840-aeb7-41c2-8de1-80b4ed9c22bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:19:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:36.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:38.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:38.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:38 np0005603623 podman[270259]: 2026-01-31 08:19:38.961125456 +0000 UTC m=+0.050846204 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:19:38 np0005603623 podman[270260]: 2026-01-31 08:19:38.988428587 +0000 UTC m=+0.077197385 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.005 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.260 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.261 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.262 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.262 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.262 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:19:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/797508244' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.710 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.846 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.847 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4581MB free_disk=20.967357635498047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.847 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:39 np0005603623 nova_compute[226235]: 2026-01-31 08:19:39.848 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:40.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:40 np0005603623 nova_compute[226235]: 2026-01-31 08:19:40.133 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:19:40 np0005603623 nova_compute[226235]: 2026-01-31 08:19:40.134 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:19:40 np0005603623 nova_compute[226235]: 2026-01-31 08:19:40.162 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:40 np0005603623 nova_compute[226235]: 2026-01-31 08:19:40.317 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:40.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:19:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1698960645' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:19:40 np0005603623 nova_compute[226235]: 2026-01-31 08:19:40.591 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:40 np0005603623 nova_compute[226235]: 2026-01-31 08:19:40.596 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:19:40 np0005603623 nova_compute[226235]: 2026-01-31 08:19:40.677 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:19:40 np0005603623 nova_compute[226235]: 2026-01-31 08:19:40.989 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:19:40 np0005603623 nova_compute[226235]: 2026-01-31 08:19:40.990 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:41 np0005603623 nova_compute[226235]: 2026-01-31 08:19:41.990 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:41 np0005603623 nova_compute[226235]: 2026-01-31 08:19:41.990 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:42.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:42 np0005603623 nova_compute[226235]: 2026-01-31 08:19:42.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:19:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:42.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:19:44 np0005603623 nova_compute[226235]: 2026-01-31 08:19:44.009 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:44.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:44.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:45 np0005603623 nova_compute[226235]: 2026-01-31 08:19:45.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:45 np0005603623 nova_compute[226235]: 2026-01-31 08:19:45.321 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:19:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:46.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:19:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:46.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:48.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:48.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:49 np0005603623 nova_compute[226235]: 2026-01-31 08:19:49.011 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:50.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:50 np0005603623 nova_compute[226235]: 2026-01-31 08:19:50.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:50 np0005603623 nova_compute[226235]: 2026-01-31 08:19:50.324 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:50.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:52.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:52.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:54 np0005603623 nova_compute[226235]: 2026-01-31 08:19:54.013 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:54.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:54.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:54 np0005603623 nova_compute[226235]: 2026-01-31 08:19:54.940 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:54 np0005603623 nova_compute[226235]: 2026-01-31 08:19:54.940 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:19:54.961 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:19:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:19:54.961 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:19:54 np0005603623 nova_compute[226235]: 2026-01-31 08:19:54.962 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:55 np0005603623 nova_compute[226235]: 2026-01-31 08:19:55.128 226239 DEBUG nova.compute.manager [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:19:55 np0005603623 nova_compute[226235]: 2026-01-31 08:19:55.325 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:55 np0005603623 nova_compute[226235]: 2026-01-31 08:19:55.474 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:55 np0005603623 nova_compute[226235]: 2026-01-31 08:19:55.475 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:55 np0005603623 nova_compute[226235]: 2026-01-31 08:19:55.498 226239 DEBUG nova.virt.hardware [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:19:55 np0005603623 nova_compute[226235]: 2026-01-31 08:19:55.499 226239 INFO nova.compute.claims [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:19:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:19:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:56.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:19:56 np0005603623 nova_compute[226235]: 2026-01-31 08:19:56.229 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:19:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:56.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:19:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:19:56 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1258121493' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:19:56 np0005603623 nova_compute[226235]: 2026-01-31 08:19:56.752 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:56 np0005603623 nova_compute[226235]: 2026-01-31 08:19:56.757 226239 DEBUG nova.compute.provider_tree [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:19:56 np0005603623 nova_compute[226235]: 2026-01-31 08:19:56.878 226239 DEBUG nova.scheduler.client.report [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:19:57 np0005603623 nova_compute[226235]: 2026-01-31 08:19:57.714 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:57 np0005603623 nova_compute[226235]: 2026-01-31 08:19:57.715 226239 DEBUG nova.compute.manager [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:19:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:19:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:58.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:19:58 np0005603623 nova_compute[226235]: 2026-01-31 08:19:58.385 226239 DEBUG nova.compute.manager [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:19:58 np0005603623 nova_compute[226235]: 2026-01-31 08:19:58.386 226239 DEBUG nova.network.neutron [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:19:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:19:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:58.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:59 np0005603623 nova_compute[226235]: 2026-01-31 08:19:59.056 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:59 np0005603623 nova_compute[226235]: 2026-01-31 08:19:59.370 226239 INFO nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:19:59 np0005603623 nova_compute[226235]: 2026-01-31 08:19:59.419 226239 DEBUG nova.policy [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a80ca71875e8413caa2b52e679e1dd40', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '782758ebebe64580accb21a22280e02f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:19:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:59 np0005603623 nova_compute[226235]: 2026-01-31 08:19:59.763 226239 DEBUG nova.compute.manager [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:19:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:19:59.964 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:00.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:00 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.328 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:00.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.827 226239 DEBUG nova.compute.manager [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.829 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.830 226239 INFO nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Creating image(s)#033[00m
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.856 226239 DEBUG nova.storage.rbd_utils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image ce5ab48d-0d12-49ac-92a1-001e91e26553_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.885 226239 DEBUG nova.storage.rbd_utils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image ce5ab48d-0d12-49ac-92a1-001e91e26553_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.910 226239 DEBUG nova.storage.rbd_utils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image ce5ab48d-0d12-49ac-92a1-001e91e26553_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.914 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.967 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.968 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.969 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.970 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:00 np0005603623 nova_compute[226235]: 2026-01-31 08:20:00.997 226239 DEBUG nova.storage.rbd_utils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image ce5ab48d-0d12-49ac-92a1-001e91e26553_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:01 np0005603623 nova_compute[226235]: 2026-01-31 08:20:01.000 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 ce5ab48d-0d12-49ac-92a1-001e91e26553_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:20:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:02.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.155 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.156 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.156 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.156 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.156 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.157 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.324 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.360 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 ce5ab48d-0d12-49ac-92a1-001e91e26553_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.360s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.431 226239 DEBUG nova.storage.rbd_utils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] resizing rbd image ce5ab48d-0d12-49ac-92a1-001e91e26553_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:20:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:02.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.540 226239 DEBUG nova.objects.instance [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'migration_context' on Instance uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.609 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.610 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Image id 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16 yields fingerprint b1c202daae0a5d5b639e0239462ea0d46fe633d6 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.610 226239 INFO nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] image 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16 at (/var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6): checking#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.610 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] image 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16 at (/var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.611 226239 INFO oslo.privsep.daemon [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpa3sd0r52/privsep.sock']#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.612 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.612 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Ensure instance console log exists: /var/lib/nova/instances/ce5ab48d-0d12-49ac-92a1-001e91e26553/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.612 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.612 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:02 np0005603623 nova_compute[226235]: 2026-01-31 08:20:02.613 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.041 226239 DEBUG nova.network.neutron [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Successfully created port: 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.230 226239 INFO oslo.privsep.daemon [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.117 270602 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.120 270602 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.122 270602 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.122 270602 INFO oslo.privsep.daemon [-] privsep daemon running as pid 270602#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.337 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.338 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] ce5ab48d-0d12-49ac-92a1-001e91e26553 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.338 226239 WARNING nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.338 226239 INFO nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Active base files: /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.338 226239 INFO nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Removable base files: /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.338 226239 INFO nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.339 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.339 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.339 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 31 03:20:03 np0005603623 nova_compute[226235]: 2026-01-31 08:20:03.339 226239 INFO nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66#033[00m
Jan 31 03:20:04 np0005603623 nova_compute[226235]: 2026-01-31 08:20:04.057 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:04.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:04.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:05 np0005603623 nova_compute[226235]: 2026-01-31 08:20:05.330 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:05 np0005603623 nova_compute[226235]: 2026-01-31 08:20:05.705 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:05 np0005603623 nova_compute[226235]: 2026-01-31 08:20:05.706 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:06 np0005603623 nova_compute[226235]: 2026-01-31 08:20:06.070 226239 DEBUG nova.compute.manager [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:20:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:06.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:06 np0005603623 nova_compute[226235]: 2026-01-31 08:20:06.401 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:06 np0005603623 nova_compute[226235]: 2026-01-31 08:20:06.402 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:06 np0005603623 nova_compute[226235]: 2026-01-31 08:20:06.410 226239 DEBUG nova.virt.hardware [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:20:06 np0005603623 nova_compute[226235]: 2026-01-31 08:20:06.411 226239 INFO nova.compute.claims [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:20:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:06.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:06 np0005603623 nova_compute[226235]: 2026-01-31 08:20:06.925 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:20:07 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2421923903' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:20:07 np0005603623 nova_compute[226235]: 2026-01-31 08:20:07.338 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:07 np0005603623 nova_compute[226235]: 2026-01-31 08:20:07.344 226239 DEBUG nova.compute.provider_tree [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:20:07 np0005603623 nova_compute[226235]: 2026-01-31 08:20:07.461 226239 DEBUG nova.scheduler.client.report [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:20:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:20:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:20:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:20:07 np0005603623 nova_compute[226235]: 2026-01-31 08:20:07.822 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.420s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:07 np0005603623 nova_compute[226235]: 2026-01-31 08:20:07.823 226239 DEBUG nova.compute.manager [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:20:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:20:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:08.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:20:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:08.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:08 np0005603623 nova_compute[226235]: 2026-01-31 08:20:08.741 226239 DEBUG nova.compute.manager [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:20:08 np0005603623 nova_compute[226235]: 2026-01-31 08:20:08.741 226239 DEBUG nova.network.neutron [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:20:09 np0005603623 nova_compute[226235]: 2026-01-31 08:20:09.058 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:09 np0005603623 nova_compute[226235]: 2026-01-31 08:20:09.104 226239 INFO nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:20:09 np0005603623 nova_compute[226235]: 2026-01-31 08:20:09.287 226239 DEBUG nova.policy [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a80ca71875e8413caa2b52e679e1dd40', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '782758ebebe64580accb21a22280e02f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:20:09 np0005603623 nova_compute[226235]: 2026-01-31 08:20:09.318 226239 DEBUG nova.network.neutron [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Successfully updated port: 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:20:09 np0005603623 nova_compute[226235]: 2026-01-31 08:20:09.428 226239 DEBUG nova.compute.manager [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:20:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:09 np0005603623 nova_compute[226235]: 2026-01-31 08:20:09.551 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "refresh_cache-ce5ab48d-0d12-49ac-92a1-001e91e26553" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:20:09 np0005603623 nova_compute[226235]: 2026-01-31 08:20:09.551 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquired lock "refresh_cache-ce5ab48d-0d12-49ac-92a1-001e91e26553" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:20:09 np0005603623 nova_compute[226235]: 2026-01-31 08:20:09.551 226239 DEBUG nova.network.neutron [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:20:09 np0005603623 podman[270813]: 2026-01-31 08:20:09.973446976 +0000 UTC m=+0.065381102 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 31 03:20:09 np0005603623 podman[270812]: 2026-01-31 08:20:09.979418794 +0000 UTC m=+0.071692350 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127)
Jan 31 03:20:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:10.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.171 226239 DEBUG nova.network.neutron [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.230 226239 DEBUG nova.compute.manager [req-ed884dc3-0207-4559-b41e-bbb27d99bd1f req-ece6ac08-23c3-40d2-91f9-8970b419b821 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received event network-changed-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.231 226239 DEBUG nova.compute.manager [req-ed884dc3-0207-4559-b41e-bbb27d99bd1f req-ece6ac08-23c3-40d2-91f9-8970b419b821 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Refreshing instance network info cache due to event network-changed-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.231 226239 DEBUG oslo_concurrency.lockutils [req-ed884dc3-0207-4559-b41e-bbb27d99bd1f req-ece6ac08-23c3-40d2-91f9-8970b419b821 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-ce5ab48d-0d12-49ac-92a1-001e91e26553" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.332 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:10.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.526 226239 DEBUG nova.compute.manager [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.528 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.528 226239 INFO nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Creating image(s)#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.553 226239 DEBUG nova.storage.rbd_utils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.583 226239 DEBUG nova.storage.rbd_utils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.612 226239 DEBUG nova.storage.rbd_utils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.615 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.665 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.666 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.666 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.667 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.692 226239 DEBUG nova.storage.rbd_utils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:10 np0005603623 nova_compute[226235]: 2026-01-31 08:20:10.696 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:11 np0005603623 nova_compute[226235]: 2026-01-31 08:20:11.317 226239 DEBUG nova.network.neutron [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Successfully created port: 60d954a7-a949-4701-b77d-16de80bc2317 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:20:11 np0005603623 nova_compute[226235]: 2026-01-31 08:20:11.677 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.981s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:11 np0005603623 nova_compute[226235]: 2026-01-31 08:20:11.722 226239 DEBUG nova.network.neutron [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Updating instance_info_cache with network_info: [{"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:20:11 np0005603623 nova_compute[226235]: 2026-01-31 08:20:11.771 226239 DEBUG nova.storage.rbd_utils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] resizing rbd image 4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.072 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Releasing lock "refresh_cache-ce5ab48d-0d12-49ac-92a1-001e91e26553" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.073 226239 DEBUG nova.compute.manager [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Instance network_info: |[{"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.073 226239 DEBUG oslo_concurrency.lockutils [req-ed884dc3-0207-4559-b41e-bbb27d99bd1f req-ece6ac08-23c3-40d2-91f9-8970b419b821 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-ce5ab48d-0d12-49ac-92a1-001e91e26553" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.073 226239 DEBUG nova.network.neutron [req-ed884dc3-0207-4559-b41e-bbb27d99bd1f req-ece6ac08-23c3-40d2-91f9-8970b419b821 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Refreshing network info cache for port 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.076 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Start _get_guest_xml network_info=[{"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.078 226239 WARNING nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.082 226239 DEBUG nova.virt.libvirt.host [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.082 226239 DEBUG nova.virt.libvirt.host [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.084 226239 DEBUG nova.virt.libvirt.host [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.085 226239 DEBUG nova.virt.libvirt.host [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.086 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.086 226239 DEBUG nova.virt.hardware [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.086 226239 DEBUG nova.virt.hardware [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.087 226239 DEBUG nova.virt.hardware [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.087 226239 DEBUG nova.virt.hardware [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.087 226239 DEBUG nova.virt.hardware [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.087 226239 DEBUG nova.virt.hardware [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.087 226239 DEBUG nova.virt.hardware [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:20:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.088 226239 DEBUG nova.virt.hardware [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.088 226239 DEBUG nova.virt.hardware [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.088 226239 DEBUG nova.virt.hardware [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:20:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:12.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.088 226239 DEBUG nova.virt.hardware [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.091 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:12.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:20:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1225590333' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.581 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.695 226239 DEBUG nova.storage.rbd_utils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image ce5ab48d-0d12-49ac-92a1-001e91e26553_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.699 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:12 np0005603623 nova_compute[226235]: 2026-01-31 08:20:12.943 226239 DEBUG nova.objects.instance [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'migration_context' on Instance uuid 4ef2381f-8f5e-4a65-b2fa-c015131646fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.017 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.018 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Ensure instance console log exists: /var/lib/nova/instances/4ef2381f-8f5e-4a65-b2fa-c015131646fb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.018 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.018 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.018 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:20:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3795387269' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.151 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.153 226239 DEBUG nova.virt.libvirt.vif [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:19:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-390925815',display_name='tempest-ListServerFiltersTestJSON-instance-390925815',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-390925815',id=106,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782758ebebe64580accb21a22280e02f',ramdisk_id='',reservation_id='r-f8rcyvc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-166541249',owner_user_name='tempest-
ListServerFiltersTestJSON-166541249-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:19:59Z,user_data=None,user_id='a80ca71875e8413caa2b52e679e1dd40',uuid=ce5ab48d-0d12-49ac-92a1-001e91e26553,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.153 226239 DEBUG nova.network.os_vif_util [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converting VIF {"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.154 226239 DEBUG nova.network.os_vif_util [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.155 226239 DEBUG nova.objects.instance [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'pci_devices' on Instance uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.310 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  <uuid>ce5ab48d-0d12-49ac-92a1-001e91e26553</uuid>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  <name>instance-0000006a</name>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-390925815</nova:name>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:20:12</nova:creationTime>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <nova:user uuid="a80ca71875e8413caa2b52e679e1dd40">tempest-ListServerFiltersTestJSON-166541249-project-member</nova:user>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <nova:project uuid="782758ebebe64580accb21a22280e02f">tempest-ListServerFiltersTestJSON-166541249</nova:project>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <nova:port uuid="36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <entry name="serial">ce5ab48d-0d12-49ac-92a1-001e91e26553</entry>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <entry name="uuid">ce5ab48d-0d12-49ac-92a1-001e91e26553</entry>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/ce5ab48d-0d12-49ac-92a1-001e91e26553_disk">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/ce5ab48d-0d12-49ac-92a1-001e91e26553_disk.config">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:0f:79:ef"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <target dev="tap36d8ec1c-f2"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/ce5ab48d-0d12-49ac-92a1-001e91e26553/console.log" append="off"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:20:13 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:20:13 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:20:13 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:20:13 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.311 226239 DEBUG nova.compute.manager [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Preparing to wait for external event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.312 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.312 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.312 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.313 226239 DEBUG nova.virt.libvirt.vif [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:19:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-390925815',display_name='tempest-ListServerFiltersTestJSON-instance-390925815',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-390925815',id=106,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782758ebebe64580accb21a22280e02f',ramdisk_id='',reservation_id='r-f8rcyvc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-166541249',owner_user_name
='tempest-ListServerFiltersTestJSON-166541249-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:19:59Z,user_data=None,user_id='a80ca71875e8413caa2b52e679e1dd40',uuid=ce5ab48d-0d12-49ac-92a1-001e91e26553,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.313 226239 DEBUG nova.network.os_vif_util [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converting VIF {"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.314 226239 DEBUG nova.network.os_vif_util [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.314 226239 DEBUG os_vif [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.315 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.315 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.316 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.319 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.319 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap36d8ec1c-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.320 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap36d8ec1c-f2, col_values=(('external_ids', {'iface-id': '36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0f:79:ef', 'vm-uuid': 'ce5ab48d-0d12-49ac-92a1-001e91e26553'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.365 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:13 np0005603623 NetworkManager[48970]: <info>  [1769847613.3668] manager: (tap36d8ec1c-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/192)
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.372 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:20:13 np0005603623 nova_compute[226235]: 2026-01-31 08:20:13.373 226239 INFO os_vif [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2')#033[00m
Jan 31 03:20:14 np0005603623 nova_compute[226235]: 2026-01-31 08:20:14.060 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:20:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:14.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:20:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:20:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3055756699' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:20:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:20:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3055756699' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:20:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:14.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:14 np0005603623 nova_compute[226235]: 2026-01-31 08:20:14.481 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:20:14 np0005603623 nova_compute[226235]: 2026-01-31 08:20:14.482 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:20:14 np0005603623 nova_compute[226235]: 2026-01-31 08:20:14.482 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] No VIF found with MAC fa:16:3e:0f:79:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:20:14 np0005603623 nova_compute[226235]: 2026-01-31 08:20:14.483 226239 INFO nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Using config drive#033[00m
Jan 31 03:20:14 np0005603623 nova_compute[226235]: 2026-01-31 08:20:14.505 226239 DEBUG nova.storage.rbd_utils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image ce5ab48d-0d12-49ac-92a1-001e91e26553_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.422 226239 DEBUG nova.network.neutron [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Successfully updated port: 60d954a7-a949-4701-b77d-16de80bc2317 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.646 226239 DEBUG nova.network.neutron [req-ed884dc3-0207-4559-b41e-bbb27d99bd1f req-ece6ac08-23c3-40d2-91f9-8970b419b821 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Updated VIF entry in instance network info cache for port 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.647 226239 DEBUG nova.network.neutron [req-ed884dc3-0207-4559-b41e-bbb27d99bd1f req-ece6ac08-23c3-40d2-91f9-8970b419b821 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Updating instance_info_cache with network_info: [{"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.655 226239 INFO nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Creating config drive at /var/lib/nova/instances/ce5ab48d-0d12-49ac-92a1-001e91e26553/disk.config#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.659 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ce5ab48d-0d12-49ac-92a1-001e91e26553/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpzt88rtnr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.753 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "refresh_cache-4ef2381f-8f5e-4a65-b2fa-c015131646fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.753 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquired lock "refresh_cache-4ef2381f-8f5e-4a65-b2fa-c015131646fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.754 226239 DEBUG nova.network.neutron [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.777 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ce5ab48d-0d12-49ac-92a1-001e91e26553/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpzt88rtnr" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.799 226239 DEBUG nova.storage.rbd_utils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image ce5ab48d-0d12-49ac-92a1-001e91e26553_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.803 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ce5ab48d-0d12-49ac-92a1-001e91e26553/disk.config ce5ab48d-0d12-49ac-92a1-001e91e26553_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.820 226239 DEBUG nova.compute.manager [req-dc26cedc-4a71-4c6c-aa17-184ead884caa req-dbe02f71-fab8-4006-a8f8-56fbc921f9b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Received event network-changed-60d954a7-a949-4701-b77d-16de80bc2317 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.821 226239 DEBUG nova.compute.manager [req-dc26cedc-4a71-4c6c-aa17-184ead884caa req-dbe02f71-fab8-4006-a8f8-56fbc921f9b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Refreshing instance network info cache due to event network-changed-60d954a7-a949-4701-b77d-16de80bc2317. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:20:15 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.821 226239 DEBUG oslo_concurrency.lockutils [req-dc26cedc-4a71-4c6c-aa17-184ead884caa req-dbe02f71-fab8-4006-a8f8-56fbc921f9b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4ef2381f-8f5e-4a65-b2fa-c015131646fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:20:16 np0005603623 nova_compute[226235]: 2026-01-31 08:20:15.999 226239 DEBUG oslo_concurrency.lockutils [req-ed884dc3-0207-4559-b41e-bbb27d99bd1f req-ece6ac08-23c3-40d2-91f9-8970b419b821 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-ce5ab48d-0d12-49ac-92a1-001e91e26553" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:20:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:16.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:16 np0005603623 nova_compute[226235]: 2026-01-31 08:20:16.295 226239 DEBUG nova.network.neutron [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:20:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:20:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:16.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.088 226239 DEBUG oslo_concurrency.processutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ce5ab48d-0d12-49ac-92a1-001e91e26553/disk.config ce5ab48d-0d12-49ac-92a1-001e91e26553_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.285s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.089 226239 INFO nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Deleting local config drive /var/lib/nova/instances/ce5ab48d-0d12-49ac-92a1-001e91e26553/disk.config because it was imported into RBD.#033[00m
Jan 31 03:20:17 np0005603623 kernel: tap36d8ec1c-f2: entered promiscuous mode
Jan 31 03:20:17 np0005603623 NetworkManager[48970]: <info>  [1769847617.1297] manager: (tap36d8ec1c-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/193)
Jan 31 03:20:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:17Z|00403|binding|INFO|Claiming lport 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 for this chassis.
Jan 31 03:20:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:17Z|00404|binding|INFO|36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867: Claiming fa:16:3e:0f:79:ef 10.100.0.10
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.131 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:17 np0005603623 systemd-udevd[271210]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.158 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:17 np0005603623 systemd-machined[194379]: New machine qemu-45-instance-0000006a.
Jan 31 03:20:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:17Z|00405|binding|INFO|Setting lport 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 ovn-installed in OVS
Jan 31 03:20:17 np0005603623 NetworkManager[48970]: <info>  [1769847617.1645] device (tap36d8ec1c-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:20:17 np0005603623 NetworkManager[48970]: <info>  [1769847617.1651] device (tap36d8ec1c-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.164 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:17 np0005603623 systemd[1]: Started Virtual Machine qemu-45-instance-0000006a.
Jan 31 03:20:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:17Z|00406|binding|INFO|Setting lport 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 up in Southbound
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.174 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:79:ef 10.100.0.10'], port_security=['fa:16:3e:0f:79:ef 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ce5ab48d-0d12-49ac-92a1-001e91e26553', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81dd779a-d164-4109-911b-0834e390c815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782758ebebe64580accb21a22280e02f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '16d4de5b-4914-4656-8fe1-e7d7abed377f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f936838-8680-43ea-b7b8-c96b02e037d3, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.176 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 in datapath 81dd779a-d164-4109-911b-0834e390c815 bound to our chassis#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.177 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81dd779a-d164-4109-911b-0834e390c815#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.187 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[223605da-f8d0-465f-8da5-0c6151b175ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.188 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap81dd779a-d1 in ovnmeta-81dd779a-d164-4109-911b-0834e390c815 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.190 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap81dd779a-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.191 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f42e06ee-9d17-4872-a362-63d4fb89c75f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.192 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a68d0cb6-150b-4d4c-91e1-4c5d20d111ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.202 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[26d1ed28-a11d-4454-9a93-1367ae352f6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.212 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f9b849d6-766a-4013-8ce2-278c2e61a79e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.236 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[428b0dcc-e435-44f6-b293-913ab9911831]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.240 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4aff0ad7-223d-42ef-9ce8-60dc07e2a591]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 systemd-udevd[271213]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:20:17 np0005603623 NetworkManager[48970]: <info>  [1769847617.2420] manager: (tap81dd779a-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/194)
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.267 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[7120f93c-be6a-407e-b561-716832a253bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.270 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[21d2c8a4-c859-40bf-8fab-85f0ca9f5fd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 NetworkManager[48970]: <info>  [1769847617.2863] device (tap81dd779a-d0): carrier: link connected
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.290 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[11c96979-1737-434f-967f-ffc5c04cf70d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.305 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[01c593e9-ace5-4737-83ba-c4a8e5b74bb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81dd779a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675771, 'reachable_time': 16773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271244, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.314 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a8704cfc-64ba-42e3-9b57-b7ff7a0013b9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:7ac2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675771, 'tstamp': 675771}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271245, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.328 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c52ef6d6-1f05-4672-a86a-f3ada45b3d88]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81dd779a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675771, 'reachable_time': 16773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 271246, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.345 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[931812e3-6c08-461f-9d76-415af55764ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.388 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[62985967-edba-4057-b41c-d6d2d2672373]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.390 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81dd779a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.390 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.391 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81dd779a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.392 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:17 np0005603623 kernel: tap81dd779a-d0: entered promiscuous mode
Jan 31 03:20:17 np0005603623 NetworkManager[48970]: <info>  [1769847617.3937] manager: (tap81dd779a-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.395 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81dd779a-d0, col_values=(('external_ids', {'iface-id': 'cd015007-f775-4d63-920f-2a0c657e4d70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.396 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:17Z|00407|binding|INFO|Releasing lport cd015007-f775-4d63-920f-2a0c657e4d70 from this chassis (sb_readonly=0)
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.397 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.398 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/81dd779a-d164-4109-911b-0834e390c815.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/81dd779a-d164-4109-911b-0834e390c815.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.399 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cb46f810-4bab-426d-a966-4db35b633c5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.399 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-81dd779a-d164-4109-911b-0834e390c815
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/81dd779a-d164-4109-911b-0834e390c815.pid.haproxy
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 81dd779a-d164-4109-911b-0834e390c815
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:20:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:17.400 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'env', 'PROCESS_TAG=haproxy-81dd779a-d164-4109-911b-0834e390c815', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/81dd779a-d164-4109-911b-0834e390c815.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.402 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:20:17 np0005603623 podman[271311]: 2026-01-31 08:20:17.76540783 +0000 UTC m=+0.076588725 container create df77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.783 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847617.7827399, ce5ab48d-0d12-49ac-92a1-001e91e26553 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.784 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] VM Started (Lifecycle Event)#033[00m
Jan 31 03:20:17 np0005603623 systemd[1]: Started libpod-conmon-df77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb.scope.
Jan 31 03:20:17 np0005603623 podman[271311]: 2026-01-31 08:20:17.713413441 +0000 UTC m=+0.024594366 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:20:17 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:20:17 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/692b5d2c4be40cebb3d302e66665518b2dacfaee66b41372c6bfecbd64aa8df1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:20:17 np0005603623 podman[271311]: 2026-01-31 08:20:17.833388034 +0000 UTC m=+0.144568939 container init df77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:20:17 np0005603623 podman[271311]: 2026-01-31 08:20:17.839439454 +0000 UTC m=+0.150620359 container start df77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:20:17 np0005603623 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[271334]: [NOTICE]   (271338) : New worker (271340) forked
Jan 31 03:20:17 np0005603623 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[271334]: [NOTICE]   (271338) : Loading success.
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.981 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.986 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847617.7831278, ce5ab48d-0d12-49ac-92a1-001e91e26553 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:17 np0005603623 nova_compute[226235]: 2026-01-31 08:20:17.986 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:20:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000064s ======
Jan 31 03:20:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:18.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.155 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.158 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.254 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.356 226239 DEBUG nova.network.neutron [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Updating instance_info_cache with network_info: [{"id": "60d954a7-a949-4701-b77d-16de80bc2317", "address": "fa:16:3e:c1:33:b2", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d954a7-a9", "ovs_interfaceid": "60d954a7-a949-4701-b77d-16de80bc2317", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.365 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:18.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.724 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Releasing lock "refresh_cache-4ef2381f-8f5e-4a65-b2fa-c015131646fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.724 226239 DEBUG nova.compute.manager [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Instance network_info: |[{"id": "60d954a7-a949-4701-b77d-16de80bc2317", "address": "fa:16:3e:c1:33:b2", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d954a7-a9", "ovs_interfaceid": "60d954a7-a949-4701-b77d-16de80bc2317", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.725 226239 DEBUG oslo_concurrency.lockutils [req-dc26cedc-4a71-4c6c-aa17-184ead884caa req-dbe02f71-fab8-4006-a8f8-56fbc921f9b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4ef2381f-8f5e-4a65-b2fa-c015131646fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.725 226239 DEBUG nova.network.neutron [req-dc26cedc-4a71-4c6c-aa17-184ead884caa req-dbe02f71-fab8-4006-a8f8-56fbc921f9b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Refreshing network info cache for port 60d954a7-a949-4701-b77d-16de80bc2317 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.727 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Start _get_guest_xml network_info=[{"id": "60d954a7-a949-4701-b77d-16de80bc2317", "address": "fa:16:3e:c1:33:b2", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d954a7-a9", "ovs_interfaceid": "60d954a7-a949-4701-b77d-16de80bc2317", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.730 226239 WARNING nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.738 226239 DEBUG nova.virt.libvirt.host [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.738 226239 DEBUG nova.virt.libvirt.host [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.744 226239 DEBUG nova.virt.libvirt.host [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.744 226239 DEBUG nova.virt.libvirt.host [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.746 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.746 226239 DEBUG nova.virt.hardware [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f75c4aee-d826-4343-a7e3-f06a4b21de52',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.746 226239 DEBUG nova.virt.hardware [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.747 226239 DEBUG nova.virt.hardware [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.747 226239 DEBUG nova.virt.hardware [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.747 226239 DEBUG nova.virt.hardware [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.747 226239 DEBUG nova.virt.hardware [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.748 226239 DEBUG nova.virt.hardware [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.748 226239 DEBUG nova.virt.hardware [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.748 226239 DEBUG nova.virt.hardware [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.748 226239 DEBUG nova.virt.hardware [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.749 226239 DEBUG nova.virt.hardware [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:20:18 np0005603623 nova_compute[226235]: 2026-01-31 08:20:18.752 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.101 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:20:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2560860442' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.225 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.247 226239 DEBUG nova.storage.rbd_utils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.250 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:20:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4103355945' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.652 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.654 226239 DEBUG nova.virt.libvirt.vif [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:20:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-21873594',display_name='tempest-ListServerFiltersTestJSON-instance-21873594',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-21873594',id=108,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782758ebebe64580accb21a22280e02f',ramdisk_id='',reservation_id='r-0miqnhf5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-166541249',owner_user_name='tempest-ListServerFiltersTestJSON-166541249-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:20:09Z,user_data=None,user_id='a80ca71875e8413caa2b52e679e1dd40',uuid=4ef2381f-8f5e-4a65-b2fa-c015131646fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60d954a7-a949-4701-b77d-16de80bc2317", "address": "fa:16:3e:c1:33:b2", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d954a7-a9", "ovs_interfaceid": "60d954a7-a949-4701-b77d-16de80bc2317", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.654 226239 DEBUG nova.network.os_vif_util [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converting VIF {"id": "60d954a7-a949-4701-b77d-16de80bc2317", "address": "fa:16:3e:c1:33:b2", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d954a7-a9", "ovs_interfaceid": "60d954a7-a949-4701-b77d-16de80bc2317", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.655 226239 DEBUG nova.network.os_vif_util [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:33:b2,bridge_name='br-int',has_traffic_filtering=True,id=60d954a7-a949-4701-b77d-16de80bc2317,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d954a7-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.656 226239 DEBUG nova.objects.instance [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'pci_devices' on Instance uuid 4ef2381f-8f5e-4a65-b2fa-c015131646fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.757 226239 DEBUG nova.compute.manager [req-b3499d58-f61a-4f29-bec8-dc61ab8e5db8 req-eda9d99a-7598-4926-9d5d-0c6b16f9a76d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.757 226239 DEBUG oslo_concurrency.lockutils [req-b3499d58-f61a-4f29-bec8-dc61ab8e5db8 req-eda9d99a-7598-4926-9d5d-0c6b16f9a76d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.758 226239 DEBUG oslo_concurrency.lockutils [req-b3499d58-f61a-4f29-bec8-dc61ab8e5db8 req-eda9d99a-7598-4926-9d5d-0c6b16f9a76d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.758 226239 DEBUG oslo_concurrency.lockutils [req-b3499d58-f61a-4f29-bec8-dc61ab8e5db8 req-eda9d99a-7598-4926-9d5d-0c6b16f9a76d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.758 226239 DEBUG nova.compute.manager [req-b3499d58-f61a-4f29-bec8-dc61ab8e5db8 req-eda9d99a-7598-4926-9d5d-0c6b16f9a76d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Processing event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.759 226239 DEBUG nova.compute.manager [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.762 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847619.761781, ce5ab48d-0d12-49ac-92a1-001e91e26553 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.762 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.764 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.767 226239 INFO nova.virt.libvirt.driver [-] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Instance spawned successfully.#033[00m
Jan 31 03:20:19 np0005603623 nova_compute[226235]: 2026-01-31 08:20:19.768 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.025 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  <uuid>4ef2381f-8f5e-4a65-b2fa-c015131646fb</uuid>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  <name>instance-0000006c</name>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  <memory>196608</memory>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-21873594</nova:name>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:20:18</nova:creationTime>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.micro">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <nova:memory>192</nova:memory>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <nova:user uuid="a80ca71875e8413caa2b52e679e1dd40">tempest-ListServerFiltersTestJSON-166541249-project-member</nova:user>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <nova:project uuid="782758ebebe64580accb21a22280e02f">tempest-ListServerFiltersTestJSON-166541249</nova:project>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <nova:port uuid="60d954a7-a949-4701-b77d-16de80bc2317">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <entry name="serial">4ef2381f-8f5e-4a65-b2fa-c015131646fb</entry>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <entry name="uuid">4ef2381f-8f5e-4a65-b2fa-c015131646fb</entry>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk.config">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:c1:33:b2"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <target dev="tap60d954a7-a9"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/4ef2381f-8f5e-4a65-b2fa-c015131646fb/console.log" append="off"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:20:20 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:20:20 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:20:20 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:20:20 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.027 226239 DEBUG nova.compute.manager [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Preparing to wait for external event network-vif-plugged-60d954a7-a949-4701-b77d-16de80bc2317 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.028 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.029 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.029 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.031 226239 DEBUG nova.virt.libvirt.vif [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:20:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-21873594',display_name='tempest-ListServerFiltersTestJSON-instance-21873594',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-21873594',id=108,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782758ebebe64580accb21a22280e02f',ramdisk_id='',reservation_id='r-0miqnhf5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-166541249',owner_user_name='tempest-ListServerFiltersTestJSON-166541249-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:20:09Z,user_data=None,user_id='a80ca71875e8413caa2b52e679e1dd40',uuid=4ef2381f-8f5e-4a65-b2fa-c015131646fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60d954a7-a949-4701-b77d-16de80bc2317", "address": "fa:16:3e:c1:33:b2", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d954a7-a9", "ovs_interfaceid": "60d954a7-a949-4701-b77d-16de80bc2317", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.031 226239 DEBUG nova.network.os_vif_util [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converting VIF {"id": "60d954a7-a949-4701-b77d-16de80bc2317", "address": "fa:16:3e:c1:33:b2", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d954a7-a9", "ovs_interfaceid": "60d954a7-a949-4701-b77d-16de80bc2317", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.033 226239 DEBUG nova.network.os_vif_util [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:33:b2,bridge_name='br-int',has_traffic_filtering=True,id=60d954a7-a949-4701-b77d-16de80bc2317,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d954a7-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.034 226239 DEBUG os_vif [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:33:b2,bridge_name='br-int',has_traffic_filtering=True,id=60d954a7-a949-4701-b77d-16de80bc2317,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d954a7-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.035 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.036 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.037 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.041 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.041 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60d954a7-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.042 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60d954a7-a9, col_values=(('external_ids', {'iface-id': '60d954a7-a949-4701-b77d-16de80bc2317', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:33:b2', 'vm-uuid': '4ef2381f-8f5e-4a65-b2fa-c015131646fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.045 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:20 np0005603623 NetworkManager[48970]: <info>  [1769847620.0465] manager: (tap60d954a7-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/196)
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.048 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.053 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.054 226239 INFO os_vif [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:33:b2,bridge_name='br-int',has_traffic_filtering=True,id=60d954a7-a949-4701-b77d-16de80bc2317,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d954a7-a9')#033[00m
Jan 31 03:20:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:20.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.206 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.211 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.211 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.212 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.212 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.213 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.213 226239 DEBUG nova.virt.libvirt.driver [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.219 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:20:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:20.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.604 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.750 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.750 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.750 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] No VIF found with MAC fa:16:3e:c1:33:b2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.751 226239 INFO nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Using config drive#033[00m
Jan 31 03:20:20 np0005603623 nova_compute[226235]: 2026-01-31 08:20:20.771 226239 DEBUG nova.storage.rbd_utils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:21 np0005603623 nova_compute[226235]: 2026-01-31 08:20:21.106 226239 INFO nova.compute.manager [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Took 20.28 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:20:21 np0005603623 nova_compute[226235]: 2026-01-31 08:20:21.106 226239 DEBUG nova.compute.manager [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:21 np0005603623 nova_compute[226235]: 2026-01-31 08:20:21.964 226239 INFO nova.compute.manager [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Took 26.55 seconds to build instance.#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.069 226239 DEBUG nova.network.neutron [req-dc26cedc-4a71-4c6c-aa17-184ead884caa req-dbe02f71-fab8-4006-a8f8-56fbc921f9b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Updated VIF entry in instance network info cache for port 60d954a7-a949-4701-b77d-16de80bc2317. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.069 226239 DEBUG nova.network.neutron [req-dc26cedc-4a71-4c6c-aa17-184ead884caa req-dbe02f71-fab8-4006-a8f8-56fbc921f9b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Updating instance_info_cache with network_info: [{"id": "60d954a7-a949-4701-b77d-16de80bc2317", "address": "fa:16:3e:c1:33:b2", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d954a7-a9", "ovs_interfaceid": "60d954a7-a949-4701-b77d-16de80bc2317", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:20:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:22.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.231 226239 DEBUG oslo_concurrency.lockutils [None req-354bb905-a498-4a30-8f97-79bf82f20ca3 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 27.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.257 226239 DEBUG nova.compute.manager [req-0284167a-a2f8-4302-9fbf-345c314435cd req-cba4b533-3a64-4cc7-81fb-b989c305881b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.258 226239 DEBUG oslo_concurrency.lockutils [req-0284167a-a2f8-4302-9fbf-345c314435cd req-cba4b533-3a64-4cc7-81fb-b989c305881b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.258 226239 DEBUG oslo_concurrency.lockutils [req-0284167a-a2f8-4302-9fbf-345c314435cd req-cba4b533-3a64-4cc7-81fb-b989c305881b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.258 226239 DEBUG oslo_concurrency.lockutils [req-0284167a-a2f8-4302-9fbf-345c314435cd req-cba4b533-3a64-4cc7-81fb-b989c305881b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.258 226239 DEBUG nova.compute.manager [req-0284167a-a2f8-4302-9fbf-345c314435cd req-cba4b533-3a64-4cc7-81fb-b989c305881b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] No waiting events found dispatching network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.259 226239 WARNING nova.compute.manager [req-0284167a-a2f8-4302-9fbf-345c314435cd req-cba4b533-3a64-4cc7-81fb-b989c305881b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received unexpected event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.352 226239 INFO nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Creating config drive at /var/lib/nova/instances/4ef2381f-8f5e-4a65-b2fa-c015131646fb/disk.config#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.355 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4ef2381f-8f5e-4a65-b2fa-c015131646fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjwjr0gbj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.450 226239 DEBUG oslo_concurrency.lockutils [req-dc26cedc-4a71-4c6c-aa17-184ead884caa req-dbe02f71-fab8-4006-a8f8-56fbc921f9b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4ef2381f-8f5e-4a65-b2fa-c015131646fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:20:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:22.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.474 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4ef2381f-8f5e-4a65-b2fa-c015131646fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjwjr0gbj" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.496 226239 DEBUG nova.storage.rbd_utils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:22 np0005603623 nova_compute[226235]: 2026-01-31 08:20:22.500 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4ef2381f-8f5e-4a65-b2fa-c015131646fb/disk.config 4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:23 np0005603623 nova_compute[226235]: 2026-01-31 08:20:23.851 226239 DEBUG oslo_concurrency.processutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4ef2381f-8f5e-4a65-b2fa-c015131646fb/disk.config 4ef2381f-8f5e-4a65-b2fa-c015131646fb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.351s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:23 np0005603623 nova_compute[226235]: 2026-01-31 08:20:23.853 226239 INFO nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Deleting local config drive /var/lib/nova/instances/4ef2381f-8f5e-4a65-b2fa-c015131646fb/disk.config because it was imported into RBD.#033[00m
Jan 31 03:20:23 np0005603623 kernel: tap60d954a7-a9: entered promiscuous mode
Jan 31 03:20:23 np0005603623 NetworkManager[48970]: <info>  [1769847623.9016] manager: (tap60d954a7-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/197)
Jan 31 03:20:23 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:23Z|00408|binding|INFO|Claiming lport 60d954a7-a949-4701-b77d-16de80bc2317 for this chassis.
Jan 31 03:20:23 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:23Z|00409|binding|INFO|60d954a7-a949-4701-b77d-16de80bc2317: Claiming fa:16:3e:c1:33:b2 10.100.0.11
Jan 31 03:20:23 np0005603623 nova_compute[226235]: 2026-01-31 08:20:23.903 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:23 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:23Z|00410|binding|INFO|Setting lport 60d954a7-a949-4701-b77d-16de80bc2317 ovn-installed in OVS
Jan 31 03:20:23 np0005603623 nova_compute[226235]: 2026-01-31 08:20:23.909 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:23 np0005603623 nova_compute[226235]: 2026-01-31 08:20:23.911 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:23 np0005603623 systemd-udevd[271485]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:20:23 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:23Z|00411|binding|INFO|Setting lport 60d954a7-a949-4701-b77d-16de80bc2317 up in Southbound
Jan 31 03:20:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:23.932 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:33:b2 10.100.0.11'], port_security=['fa:16:3e:c1:33:b2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4ef2381f-8f5e-4a65-b2fa-c015131646fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81dd779a-d164-4109-911b-0834e390c815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782758ebebe64580accb21a22280e02f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '16d4de5b-4914-4656-8fe1-e7d7abed377f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f936838-8680-43ea-b7b8-c96b02e037d3, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=60d954a7-a949-4701-b77d-16de80bc2317) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:20:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:23.934 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 60d954a7-a949-4701-b77d-16de80bc2317 in datapath 81dd779a-d164-4109-911b-0834e390c815 bound to our chassis#033[00m
Jan 31 03:20:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:23.936 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81dd779a-d164-4109-911b-0834e390c815#033[00m
Jan 31 03:20:23 np0005603623 NetworkManager[48970]: <info>  [1769847623.9399] device (tap60d954a7-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:20:23 np0005603623 NetworkManager[48970]: <info>  [1769847623.9406] device (tap60d954a7-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:20:23 np0005603623 systemd-machined[194379]: New machine qemu-46-instance-0000006c.
Jan 31 03:20:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:23.947 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a786d35b-8803-4dba-9bb3-cebbe837a479]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:23 np0005603623 systemd[1]: Started Virtual Machine qemu-46-instance-0000006c.
Jan 31 03:20:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:23.975 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[0915c6d9-1a14-4ab4-8abf-df802feed1cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:23.980 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[39e79cef-b048-44b2-ae41-49115c9780a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:24.003 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[fd5ef726-1fa6-4d08-8973-89d1fdad956d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:24.019 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[185b2714-2e2f-4ba7-9cf4-6e09e4847b52]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81dd779a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675771, 'reachable_time': 16773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271502, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:24.033 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[88e49eda-5033-4da3-8e06-8f4864a0cb70]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap81dd779a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675779, 'tstamp': 675779}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271503, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap81dd779a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675781, 'tstamp': 675781}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271503, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:24.035 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81dd779a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:24 np0005603623 nova_compute[226235]: 2026-01-31 08:20:24.037 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:24 np0005603623 nova_compute[226235]: 2026-01-31 08:20:24.038 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:24.038 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81dd779a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:24.039 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:24.039 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81dd779a-d0, col_values=(('external_ids', {'iface-id': 'cd015007-f775-4d63-920f-2a0c657e4d70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:24.039 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:24 np0005603623 nova_compute[226235]: 2026-01-31 08:20:24.103 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:20:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:24.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:20:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:24.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:25 np0005603623 nova_compute[226235]: 2026-01-31 08:20:25.046 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:25 np0005603623 nova_compute[226235]: 2026-01-31 08:20:25.248 226239 DEBUG nova.compute.manager [req-2cbda8f1-49f1-4ba0-ac1e-c60ad3b07abc req-e76032a2-83ba-4752-9565-5d1b73a4a278 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Received event network-vif-plugged-60d954a7-a949-4701-b77d-16de80bc2317 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:25 np0005603623 nova_compute[226235]: 2026-01-31 08:20:25.249 226239 DEBUG oslo_concurrency.lockutils [req-2cbda8f1-49f1-4ba0-ac1e-c60ad3b07abc req-e76032a2-83ba-4752-9565-5d1b73a4a278 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:25 np0005603623 nova_compute[226235]: 2026-01-31 08:20:25.249 226239 DEBUG oslo_concurrency.lockutils [req-2cbda8f1-49f1-4ba0-ac1e-c60ad3b07abc req-e76032a2-83ba-4752-9565-5d1b73a4a278 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:25 np0005603623 nova_compute[226235]: 2026-01-31 08:20:25.249 226239 DEBUG oslo_concurrency.lockutils [req-2cbda8f1-49f1-4ba0-ac1e-c60ad3b07abc req-e76032a2-83ba-4752-9565-5d1b73a4a278 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:25 np0005603623 nova_compute[226235]: 2026-01-31 08:20:25.250 226239 DEBUG nova.compute.manager [req-2cbda8f1-49f1-4ba0-ac1e-c60ad3b07abc req-e76032a2-83ba-4752-9565-5d1b73a4a278 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Processing event network-vif-plugged-60d954a7-a949-4701-b77d-16de80bc2317 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:20:25 np0005603623 nova_compute[226235]: 2026-01-31 08:20:25.957 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847625.9572663, 4ef2381f-8f5e-4a65-b2fa-c015131646fb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:25 np0005603623 nova_compute[226235]: 2026-01-31 08:20:25.958 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] VM Started (Lifecycle Event)#033[00m
Jan 31 03:20:25 np0005603623 nova_compute[226235]: 2026-01-31 08:20:25.960 226239 DEBUG nova.compute.manager [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:20:25 np0005603623 nova_compute[226235]: 2026-01-31 08:20:25.962 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:20:25 np0005603623 nova_compute[226235]: 2026-01-31 08:20:25.965 226239 INFO nova.virt.libvirt.driver [-] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Instance spawned successfully.#033[00m
Jan 31 03:20:25 np0005603623 nova_compute[226235]: 2026-01-31 08:20:25.965 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.056 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.060 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.071 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.071 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.072 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.072 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.073 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.073 226239 DEBUG nova.virt.libvirt.driver [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:26.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.132 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.133 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847625.957355, 4ef2381f-8f5e-4a65-b2fa-c015131646fb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.133 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.324 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.328 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847625.9619029, 4ef2381f-8f5e-4a65-b2fa-c015131646fb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.329 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.335 226239 INFO nova.compute.manager [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Took 15.81 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.336 226239 DEBUG nova.compute.manager [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:26.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.643 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.647 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.734 226239 INFO nova.compute.manager [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Took 20.46 seconds to build instance.#033[00m
Jan 31 03:20:26 np0005603623 nova_compute[226235]: 2026-01-31 08:20:26.864 226239 DEBUG oslo_concurrency.lockutils [None req-49b04245-1f5f-4b62-a237-3a0ddaa8f0de a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:27 np0005603623 nova_compute[226235]: 2026-01-31 08:20:27.545 226239 DEBUG nova.compute.manager [req-e4ce72cc-550a-4197-91d4-01050758adbf req-1fa301fc-5883-4568-b9a5-6ae7b7379f7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Received event network-vif-plugged-60d954a7-a949-4701-b77d-16de80bc2317 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:27 np0005603623 nova_compute[226235]: 2026-01-31 08:20:27.545 226239 DEBUG oslo_concurrency.lockutils [req-e4ce72cc-550a-4197-91d4-01050758adbf req-1fa301fc-5883-4568-b9a5-6ae7b7379f7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:27 np0005603623 nova_compute[226235]: 2026-01-31 08:20:27.546 226239 DEBUG oslo_concurrency.lockutils [req-e4ce72cc-550a-4197-91d4-01050758adbf req-1fa301fc-5883-4568-b9a5-6ae7b7379f7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:27 np0005603623 nova_compute[226235]: 2026-01-31 08:20:27.546 226239 DEBUG oslo_concurrency.lockutils [req-e4ce72cc-550a-4197-91d4-01050758adbf req-1fa301fc-5883-4568-b9a5-6ae7b7379f7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:27 np0005603623 nova_compute[226235]: 2026-01-31 08:20:27.546 226239 DEBUG nova.compute.manager [req-e4ce72cc-550a-4197-91d4-01050758adbf req-1fa301fc-5883-4568-b9a5-6ae7b7379f7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] No waiting events found dispatching network-vif-plugged-60d954a7-a949-4701-b77d-16de80bc2317 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:27 np0005603623 nova_compute[226235]: 2026-01-31 08:20:27.546 226239 WARNING nova.compute.manager [req-e4ce72cc-550a-4197-91d4-01050758adbf req-1fa301fc-5883-4568-b9a5-6ae7b7379f7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Received unexpected event network-vif-plugged-60d954a7-a949-4701-b77d-16de80bc2317 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:20:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:20:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:28.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:20:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:28.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:29 np0005603623 nova_compute[226235]: 2026-01-31 08:20:29.104 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:30 np0005603623 nova_compute[226235]: 2026-01-31 08:20:30.048 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:30.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:30.113 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:30.114 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:30.115 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:30.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:32Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0f:79:ef 10.100.0.10
Jan 31 03:20:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:32Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0f:79:ef 10.100.0.10
Jan 31 03:20:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:32.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:32.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:34 np0005603623 nova_compute[226235]: 2026-01-31 08:20:34.106 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:34.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:20:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:34.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:20:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:35 np0005603623 nova_compute[226235]: 2026-01-31 08:20:35.050 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:36.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:36.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:37 np0005603623 nova_compute[226235]: 2026-01-31 08:20:37.339 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:37 np0005603623 nova_compute[226235]: 2026-01-31 08:20:37.339 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:20:37 np0005603623 nova_compute[226235]: 2026-01-31 08:20:37.340 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:20:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:38.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:38.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:38 np0005603623 nova_compute[226235]: 2026-01-31 08:20:38.509 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-ce5ab48d-0d12-49ac-92a1-001e91e26553" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:20:38 np0005603623 nova_compute[226235]: 2026-01-31 08:20:38.510 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-ce5ab48d-0d12-49ac-92a1-001e91e26553" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:20:38 np0005603623 nova_compute[226235]: 2026-01-31 08:20:38.510 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:20:38 np0005603623 nova_compute[226235]: 2026-01-31 08:20:38.510 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:39 np0005603623 nova_compute[226235]: 2026-01-31 08:20:39.108 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:39 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:39Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c1:33:b2 10.100.0.11
Jan 31 03:20:39 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:39Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:33:b2 10.100.0.11
Jan 31 03:20:40 np0005603623 nova_compute[226235]: 2026-01-31 08:20:40.053 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:20:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:40.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:20:40 np0005603623 nova_compute[226235]: 2026-01-31 08:20:40.170 226239 DEBUG oslo_concurrency.lockutils [None req-ddb3ead5-df7b-46a7-bc2f-e47af4156cf4 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:40 np0005603623 nova_compute[226235]: 2026-01-31 08:20:40.170 226239 DEBUG oslo_concurrency.lockutils [None req-ddb3ead5-df7b-46a7-bc2f-e47af4156cf4 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:40 np0005603623 nova_compute[226235]: 2026-01-31 08:20:40.171 226239 DEBUG nova.compute.manager [None req-ddb3ead5-df7b-46a7-bc2f-e47af4156cf4 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:40 np0005603623 nova_compute[226235]: 2026-01-31 08:20:40.174 226239 DEBUG nova.compute.manager [None req-ddb3ead5-df7b-46a7-bc2f-e47af4156cf4 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 31 03:20:40 np0005603623 nova_compute[226235]: 2026-01-31 08:20:40.175 226239 DEBUG nova.objects.instance [None req-ddb3ead5-df7b-46a7-bc2f-e47af4156cf4 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'flavor' on Instance uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:40 np0005603623 nova_compute[226235]: 2026-01-31 08:20:40.458 226239 DEBUG nova.virt.libvirt.driver [None req-ddb3ead5-df7b-46a7-bc2f-e47af4156cf4 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:20:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:40.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:40.510 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:20:40 np0005603623 nova_compute[226235]: 2026-01-31 08:20:40.510 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:40.512 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:20:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e263 e263: 3 total, 3 up, 3 in
Jan 31 03:20:40 np0005603623 podman[271606]: 2026-01-31 08:20:40.968309433 +0000 UTC m=+0.053112826 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:20:40 np0005603623 podman[271607]: 2026-01-31 08:20:40.994733135 +0000 UTC m=+0.079037792 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:20:41 np0005603623 nova_compute[226235]: 2026-01-31 08:20:41.619 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Updating instance_info_cache with network_info: [{"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:20:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e264 e264: 3 total, 3 up, 3 in
Jan 31 03:20:41 np0005603623 nova_compute[226235]: 2026-01-31 08:20:41.837 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-ce5ab48d-0d12-49ac-92a1-001e91e26553" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:20:41 np0005603623 nova_compute[226235]: 2026-01-31 08:20:41.837 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:20:41 np0005603623 nova_compute[226235]: 2026-01-31 08:20:41.837 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:41 np0005603623 nova_compute[226235]: 2026-01-31 08:20:41.837 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:41 np0005603623 nova_compute[226235]: 2026-01-31 08:20:41.838 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:41 np0005603623 nova_compute[226235]: 2026-01-31 08:20:41.838 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:20:41 np0005603623 nova_compute[226235]: 2026-01-31 08:20:41.838 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.064 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.065 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.065 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.065 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.065 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:42.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:20:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/599629800' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:20:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:42.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.481 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.717 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000006a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.717 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000006a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.720 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000006c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.720 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000006c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:20:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e265 e265: 3 total, 3 up, 3 in
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.881 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.882 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4105MB free_disk=20.729415893554688GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.882 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:42 np0005603623 nova_compute[226235]: 2026-01-31 08:20:42.882 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:43 np0005603623 kernel: tap36d8ec1c-f2 (unregistering): left promiscuous mode
Jan 31 03:20:43 np0005603623 NetworkManager[48970]: <info>  [1769847643.1996] device (tap36d8ec1c-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:20:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:43Z|00412|binding|INFO|Releasing lport 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 from this chassis (sb_readonly=0)
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.207 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:43Z|00413|binding|INFO|Setting lport 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 down in Southbound
Jan 31 03:20:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:43Z|00414|binding|INFO|Removing iface tap36d8ec1c-f2 ovn-installed in OVS
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.210 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.217 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:43 np0005603623 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Jan 31 03:20:43 np0005603623 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000006a.scope: Consumed 12.955s CPU time.
Jan 31 03:20:43 np0005603623 systemd-machined[194379]: Machine qemu-45-instance-0000006a terminated.
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.276 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:79:ef 10.100.0.10'], port_security=['fa:16:3e:0f:79:ef 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ce5ab48d-0d12-49ac-92a1-001e91e26553', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81dd779a-d164-4109-911b-0834e390c815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782758ebebe64580accb21a22280e02f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '16d4de5b-4914-4656-8fe1-e7d7abed377f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f936838-8680-43ea-b7b8-c96b02e037d3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.276 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance ce5ab48d-0d12-49ac-92a1-001e91e26553 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.276 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 4ef2381f-8f5e-4a65-b2fa-c015131646fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.276 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.277 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.278 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 in datapath 81dd779a-d164-4109-911b-0834e390c815 unbound from our chassis#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.281 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81dd779a-d164-4109-911b-0834e390c815#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.292 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e0666f33-f7af-4864-b00c-99ed719e970c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.312 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[01f35ab0-7b0b-4e69-ac1d-b66bc79e3f42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.315 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[7eebfb6b-ea5f-4781-b80a-5635808f7129]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.336 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c24ed59e-cae0-462a-a4d9-cac19905c525]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.346 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1bade911-e028-4f47-8fbb-101f6aa30bc8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81dd779a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675771, 'reachable_time': 16773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271687, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.359 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7ccaf871-c8fb-4c61-a1e4-32ecf1166632]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap81dd779a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675779, 'tstamp': 675779}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271688, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap81dd779a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675781, 'tstamp': 675781}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271688, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.360 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81dd779a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.396 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.400 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.401 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81dd779a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.401 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.402 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.402 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81dd779a-d0, col_values=(('external_ids', {'iface-id': 'cd015007-f775-4d63-920f-2a0c657e4d70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.403 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.473 226239 INFO nova.virt.libvirt.driver [None req-ddb3ead5-df7b-46a7-bc2f-e47af4156cf4 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.481 226239 INFO nova.virt.libvirt.driver [-] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Instance destroyed successfully.#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.482 226239 DEBUG nova.objects.instance [None req-ddb3ead5-df7b-46a7-bc2f-e47af4156cf4 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'numa_topology' on Instance uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:43.514 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.524 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.525 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.574 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.611 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:20:43 np0005603623 nova_compute[226235]: 2026-01-31 08:20:43.703 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:44 np0005603623 nova_compute[226235]: 2026-01-31 08:20:44.111 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:44.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1161912560' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:20:44 np0005603623 nova_compute[226235]: 2026-01-31 08:20:44.180 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:44 np0005603623 nova_compute[226235]: 2026-01-31 08:20:44.186 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:20:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:44.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #91. Immutable memtables: 0.
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.616673) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 55] Flushing memtable with next log file: 91
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847644616776, "job": 55, "event": "flush_started", "num_memtables": 1, "num_entries": 1688, "num_deletes": 256, "total_data_size": 3811070, "memory_usage": 3844832, "flush_reason": "Manual Compaction"}
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 55] Level-0 flush table #92: started
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847644641615, "cf_name": "default", "job": 55, "event": "table_file_creation", "file_number": 92, "file_size": 2491355, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 47234, "largest_seqno": 48916, "table_properties": {"data_size": 2484388, "index_size": 3974, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 15083, "raw_average_key_size": 19, "raw_value_size": 2470159, "raw_average_value_size": 3271, "num_data_blocks": 175, "num_entries": 755, "num_filter_entries": 755, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847504, "oldest_key_time": 1769847504, "file_creation_time": 1769847644, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 92, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 55] Flush lasted 25019 microseconds, and 4161 cpu microseconds.
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.641678) [db/flush_job.cc:967] [default] [JOB 55] Level-0 flush table #92: 2491355 bytes OK
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.641724) [db/memtable_list.cc:519] [default] Level-0 commit table #92 started
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.643568) [db/memtable_list.cc:722] [default] Level-0 commit table #92: memtable #1 done
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.643584) EVENT_LOG_v1 {"time_micros": 1769847644643579, "job": 55, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.643601) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 55] Try to delete WAL files size 3803354, prev total WAL file size 3803354, number of live WAL files 2.
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000088.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.644192) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031353036' seq:72057594037927935, type:22 .. '6C6F676D0031373538' seq:0, type:0; will stop at (end)
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 56] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 55 Base level 0, inputs: [92(2432KB)], [90(11MB)]
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847644644239, "job": 56, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [92], "files_L6": [90], "score": -1, "input_data_size": 14058013, "oldest_snapshot_seqno": -1}
Jan 31 03:20:44 np0005603623 nova_compute[226235]: 2026-01-31 08:20:44.675 226239 DEBUG nova.compute.manager [None req-ddb3ead5-df7b-46a7-bc2f-e47af4156cf4 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 56] Generated table #93: 7438 keys, 13922345 bytes, temperature: kUnknown
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847644844500, "cf_name": "default", "job": 56, "event": "table_file_creation", "file_number": 93, "file_size": 13922345, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13869326, "index_size": 33323, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18629, "raw_key_size": 191319, "raw_average_key_size": 25, "raw_value_size": 13733599, "raw_average_value_size": 1846, "num_data_blocks": 1328, "num_entries": 7438, "num_filter_entries": 7438, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769847644, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 93, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.844739) [db/compaction/compaction_job.cc:1663] [default] [JOB 56] Compacted 1@0 + 1@6 files to L6 => 13922345 bytes
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.866823) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 70.2 rd, 69.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 11.0 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(11.2) write-amplify(5.6) OK, records in: 7967, records dropped: 529 output_compression: NoCompression
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.866857) EVENT_LOG_v1 {"time_micros": 1769847644866845, "job": 56, "event": "compaction_finished", "compaction_time_micros": 200332, "compaction_time_cpu_micros": 28257, "output_level": 6, "num_output_files": 1, "total_output_size": 13922345, "num_input_records": 7967, "num_output_records": 7438, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000092.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847644867230, "job": 56, "event": "table_file_deletion", "file_number": 92}
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000090.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847644868174, "job": 56, "event": "table_file_deletion", "file_number": 90}
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.644095) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.868319) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.868330) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.868334) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.868338) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:44 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:44.868342) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:45 np0005603623 nova_compute[226235]: 2026-01-31 08:20:45.056 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:45 np0005603623 nova_compute[226235]: 2026-01-31 08:20:45.186 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:20:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:46.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:46.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:46 np0005603623 nova_compute[226235]: 2026-01-31 08:20:46.724 226239 DEBUG oslo_concurrency.lockutils [None req-ddb3ead5-df7b-46a7-bc2f-e47af4156cf4 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 6.554s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:47 np0005603623 nova_compute[226235]: 2026-01-31 08:20:47.736 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:20:47 np0005603623 nova_compute[226235]: 2026-01-31 08:20:47.736 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.853s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:47 np0005603623 nova_compute[226235]: 2026-01-31 08:20:47.736 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:47 np0005603623 nova_compute[226235]: 2026-01-31 08:20:47.737 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:20:47 np0005603623 nova_compute[226235]: 2026-01-31 08:20:47.755 226239 DEBUG nova.compute.manager [req-9c0a418a-b962-4443-9859-1ce7b84d22f0 req-6a884e41-4663-4d8e-9d09-cb46022bdf28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received event network-vif-unplugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:47 np0005603623 nova_compute[226235]: 2026-01-31 08:20:47.755 226239 DEBUG oslo_concurrency.lockutils [req-9c0a418a-b962-4443-9859-1ce7b84d22f0 req-6a884e41-4663-4d8e-9d09-cb46022bdf28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:47 np0005603623 nova_compute[226235]: 2026-01-31 08:20:47.755 226239 DEBUG oslo_concurrency.lockutils [req-9c0a418a-b962-4443-9859-1ce7b84d22f0 req-6a884e41-4663-4d8e-9d09-cb46022bdf28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:47 np0005603623 nova_compute[226235]: 2026-01-31 08:20:47.755 226239 DEBUG oslo_concurrency.lockutils [req-9c0a418a-b962-4443-9859-1ce7b84d22f0 req-6a884e41-4663-4d8e-9d09-cb46022bdf28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:47 np0005603623 nova_compute[226235]: 2026-01-31 08:20:47.756 226239 DEBUG nova.compute.manager [req-9c0a418a-b962-4443-9859-1ce7b84d22f0 req-6a884e41-4663-4d8e-9d09-cb46022bdf28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] No waiting events found dispatching network-vif-unplugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:47 np0005603623 nova_compute[226235]: 2026-01-31 08:20:47.756 226239 WARNING nova.compute.manager [req-9c0a418a-b962-4443-9859-1ce7b84d22f0 req-6a884e41-4663-4d8e-9d09-cb46022bdf28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received unexpected event network-vif-unplugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 for instance with vm_state stopped and task_state None.#033[00m
Jan 31 03:20:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:48.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:48.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:49 np0005603623 nova_compute[226235]: 2026-01-31 08:20:49.113 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:49 np0005603623 nova_compute[226235]: 2026-01-31 08:20:49.116 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:49 np0005603623 nova_compute[226235]: 2026-01-31 08:20:49.116 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:49 np0005603623 nova_compute[226235]: 2026-01-31 08:20:49.117 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 e266: 3 total, 3 up, 3 in
Jan 31 03:20:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:50 np0005603623 nova_compute[226235]: 2026-01-31 08:20:50.083 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:50.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:50 np0005603623 nova_compute[226235]: 2026-01-31 08:20:50.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:50.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:50 np0005603623 nova_compute[226235]: 2026-01-31 08:20:50.894 226239 DEBUG nova.compute.manager [req-6ccddda5-3b7f-447e-bfb2-3fb098bac72d req-d31f4ac3-c252-4e3d-b076-e1b0bf1cb253 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:50 np0005603623 nova_compute[226235]: 2026-01-31 08:20:50.895 226239 DEBUG oslo_concurrency.lockutils [req-6ccddda5-3b7f-447e-bfb2-3fb098bac72d req-d31f4ac3-c252-4e3d-b076-e1b0bf1cb253 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:50 np0005603623 nova_compute[226235]: 2026-01-31 08:20:50.896 226239 DEBUG oslo_concurrency.lockutils [req-6ccddda5-3b7f-447e-bfb2-3fb098bac72d req-d31f4ac3-c252-4e3d-b076-e1b0bf1cb253 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:50 np0005603623 nova_compute[226235]: 2026-01-31 08:20:50.896 226239 DEBUG oslo_concurrency.lockutils [req-6ccddda5-3b7f-447e-bfb2-3fb098bac72d req-d31f4ac3-c252-4e3d-b076-e1b0bf1cb253 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:50 np0005603623 nova_compute[226235]: 2026-01-31 08:20:50.897 226239 DEBUG nova.compute.manager [req-6ccddda5-3b7f-447e-bfb2-3fb098bac72d req-d31f4ac3-c252-4e3d-b076-e1b0bf1cb253 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] No waiting events found dispatching network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:50 np0005603623 nova_compute[226235]: 2026-01-31 08:20:50.897 226239 WARNING nova.compute.manager [req-6ccddda5-3b7f-447e-bfb2-3fb098bac72d req-d31f4ac3-c252-4e3d-b076-e1b0bf1cb253 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received unexpected event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 31 03:20:50 np0005603623 nova_compute[226235]: 2026-01-31 08:20:50.971 226239 DEBUG nova.objects.instance [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'flavor' on Instance uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:51 np0005603623 nova_compute[226235]: 2026-01-31 08:20:51.066 226239 DEBUG oslo_concurrency.lockutils [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "refresh_cache-ce5ab48d-0d12-49ac-92a1-001e91e26553" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:20:51 np0005603623 nova_compute[226235]: 2026-01-31 08:20:51.067 226239 DEBUG oslo_concurrency.lockutils [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquired lock "refresh_cache-ce5ab48d-0d12-49ac-92a1-001e91e26553" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:20:51 np0005603623 nova_compute[226235]: 2026-01-31 08:20:51.068 226239 DEBUG nova.network.neutron [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:20:51 np0005603623 nova_compute[226235]: 2026-01-31 08:20:51.069 226239 DEBUG nova.objects.instance [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'info_cache' on Instance uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:52.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:52 np0005603623 nova_compute[226235]: 2026-01-31 08:20:52.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:52.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:54 np0005603623 nova_compute[226235]: 2026-01-31 08:20:54.114 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:20:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:54.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:20:54 np0005603623 nova_compute[226235]: 2026-01-31 08:20:54.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #94. Immutable memtables: 0.
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.335582) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 57] Flushing memtable with next log file: 94
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847654335617, "job": 57, "event": "flush_started", "num_memtables": 1, "num_entries": 367, "num_deletes": 252, "total_data_size": 255724, "memory_usage": 262304, "flush_reason": "Manual Compaction"}
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 57] Level-0 flush table #95: started
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847654337967, "cf_name": "default", "job": 57, "event": "table_file_creation", "file_number": 95, "file_size": 167472, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 48921, "largest_seqno": 49283, "table_properties": {"data_size": 165314, "index_size": 322, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 6317, "raw_average_key_size": 20, "raw_value_size": 160767, "raw_average_value_size": 527, "num_data_blocks": 14, "num_entries": 305, "num_filter_entries": 305, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847645, "oldest_key_time": 1769847645, "file_creation_time": 1769847654, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 95, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 57] Flush lasted 2409 microseconds, and 920 cpu microseconds.
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.337995) [db/flush_job.cc:967] [default] [JOB 57] Level-0 flush table #95: 167472 bytes OK
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.338007) [db/memtable_list.cc:519] [default] Level-0 commit table #95 started
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.339000) [db/memtable_list.cc:722] [default] Level-0 commit table #95: memtable #1 done
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.339010) EVENT_LOG_v1 {"time_micros": 1769847654339007, "job": 57, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.339023) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 57] Try to delete WAL files size 253251, prev total WAL file size 253251, number of live WAL files 2.
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000091.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.339360) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031353032' seq:72057594037927935, type:22 .. '6D6772737461740031373534' seq:0, type:0; will stop at (end)
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 58] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 57 Base level 0, inputs: [95(163KB)], [93(13MB)]
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847654339428, "job": 58, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [95], "files_L6": [93], "score": -1, "input_data_size": 14089817, "oldest_snapshot_seqno": -1}
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 58] Generated table #96: 7226 keys, 10248140 bytes, temperature: kUnknown
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847654480430, "cf_name": "default", "job": 58, "event": "table_file_creation", "file_number": 96, "file_size": 10248140, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10201348, "index_size": 27608, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18117, "raw_key_size": 187127, "raw_average_key_size": 25, "raw_value_size": 10074068, "raw_average_value_size": 1394, "num_data_blocks": 1089, "num_entries": 7226, "num_filter_entries": 7226, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769847654, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 96, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.480689) [db/compaction/compaction_job.cc:1663] [default] [JOB 58] Compacted 1@0 + 1@6 files to L6 => 10248140 bytes
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.482275) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 99.9 rd, 72.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 13.3 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(145.3) write-amplify(61.2) OK, records in: 7743, records dropped: 517 output_compression: NoCompression
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.482297) EVENT_LOG_v1 {"time_micros": 1769847654482288, "job": 58, "event": "compaction_finished", "compaction_time_micros": 141091, "compaction_time_cpu_micros": 19790, "output_level": 6, "num_output_files": 1, "total_output_size": 10248140, "num_input_records": 7743, "num_output_records": 7226, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000095.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847654482409, "job": 58, "event": "table_file_deletion", "file_number": 95}
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000093.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847654483772, "job": 58, "event": "table_file_deletion", "file_number": 93}
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.339268) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.483934) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.483941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.483943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.483945) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:20:54.483947) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:20:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:54.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:20:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:55 np0005603623 nova_compute[226235]: 2026-01-31 08:20:55.085 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:20:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:56.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.253 226239 DEBUG nova.network.neutron [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Updating instance_info_cache with network_info: [{"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.345 226239 DEBUG oslo_concurrency.lockutils [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Releasing lock "refresh_cache-ce5ab48d-0d12-49ac-92a1-001e91e26553" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.383 226239 INFO nova.virt.libvirt.driver [-] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Instance destroyed successfully.#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.384 226239 DEBUG nova.objects.instance [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'numa_topology' on Instance uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:56.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.580 226239 DEBUG nova.objects.instance [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'resources' on Instance uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.601 226239 DEBUG nova.virt.libvirt.vif [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-390925815',display_name='tempest-ListServerFiltersTestJSON-instance-390925815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-390925815',id=106,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:20:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='782758ebebe64580accb21a22280e02f',ramdisk_id='',reservation_id='r-f8rcyvc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-166541249',owner_user_name='tempest-ListServerFiltersTestJSON-166541249-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:20:46Z,user_data=None,user_id='a80ca71875e8413caa2b52e679e1dd40',uuid=ce5ab48d-0d12-49ac-92a1-001e91e26553,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.602 226239 DEBUG nova.network.os_vif_util [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converting VIF {"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.603 226239 DEBUG nova.network.os_vif_util [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.603 226239 DEBUG os_vif [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.604 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.604 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36d8ec1c-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.606 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.607 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.609 226239 INFO os_vif [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2')#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.615 226239 DEBUG nova.virt.libvirt.driver [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Start _get_guest_xml network_info=[{"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.618 226239 WARNING nova.virt.libvirt.driver [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.624 226239 DEBUG nova.virt.libvirt.host [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.624 226239 DEBUG nova.virt.libvirt.host [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.627 226239 DEBUG nova.virt.libvirt.host [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.628 226239 DEBUG nova.virt.libvirt.host [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.629 226239 DEBUG nova.virt.libvirt.driver [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.629 226239 DEBUG nova.virt.hardware [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.630 226239 DEBUG nova.virt.hardware [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.630 226239 DEBUG nova.virt.hardware [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.630 226239 DEBUG nova.virt.hardware [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.630 226239 DEBUG nova.virt.hardware [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.631 226239 DEBUG nova.virt.hardware [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.631 226239 DEBUG nova.virt.hardware [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.631 226239 DEBUG nova.virt.hardware [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.631 226239 DEBUG nova.virt.hardware [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.632 226239 DEBUG nova.virt.hardware [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.632 226239 DEBUG nova.virt.hardware [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.632 226239 DEBUG nova.objects.instance [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'vcpu_model' on Instance uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:56 np0005603623 nova_compute[226235]: 2026-01-31 08:20:56.724 226239 DEBUG oslo_concurrency.processutils [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:20:57 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2876617079' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.156 226239 DEBUG oslo_concurrency.processutils [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.195 226239 DEBUG oslo_concurrency.processutils [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:20:57 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/642394387' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.628 226239 DEBUG oslo_concurrency.processutils [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.630 226239 DEBUG nova.virt.libvirt.vif [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-390925815',display_name='tempest-ListServerFiltersTestJSON-instance-390925815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-390925815',id=106,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:20:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='782758ebebe64580accb21a22280e02f',ramdisk_id='',reservation_id='r-f8rcyvc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-166541249',owner_user_name='tempest-ListServerFiltersTestJSON-166541249-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:20:46Z,user_data=None,user_id='a80ca71875e8413caa2b52e679e1dd40',uuid=ce5ab48d-0d12-49ac-92a1-001e91e26553,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.630 226239 DEBUG nova.network.os_vif_util [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converting VIF {"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.632 226239 DEBUG nova.network.os_vif_util [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.633 226239 DEBUG nova.objects.instance [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'pci_devices' on Instance uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.699 226239 DEBUG nova.virt.libvirt.driver [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  <uuid>ce5ab48d-0d12-49ac-92a1-001e91e26553</uuid>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  <name>instance-0000006a</name>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-390925815</nova:name>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:20:56</nova:creationTime>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <nova:user uuid="a80ca71875e8413caa2b52e679e1dd40">tempest-ListServerFiltersTestJSON-166541249-project-member</nova:user>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <nova:project uuid="782758ebebe64580accb21a22280e02f">tempest-ListServerFiltersTestJSON-166541249</nova:project>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <nova:port uuid="36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <entry name="serial">ce5ab48d-0d12-49ac-92a1-001e91e26553</entry>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <entry name="uuid">ce5ab48d-0d12-49ac-92a1-001e91e26553</entry>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/ce5ab48d-0d12-49ac-92a1-001e91e26553_disk">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/ce5ab48d-0d12-49ac-92a1-001e91e26553_disk.config">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:0f:79:ef"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <target dev="tap36d8ec1c-f2"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/ce5ab48d-0d12-49ac-92a1-001e91e26553/console.log" append="off"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <input type="keyboard" bus="usb"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:20:57 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:20:57 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:20:57 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:20:57 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.701 226239 DEBUG nova.virt.libvirt.driver [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] skipping disk for instance-0000006a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.701 226239 DEBUG nova.virt.libvirt.driver [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] skipping disk for instance-0000006a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.702 226239 DEBUG nova.virt.libvirt.vif [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-390925815',display_name='tempest-ListServerFiltersTestJSON-instance-390925815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-390925815',id=106,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:20:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='782758ebebe64580accb21a22280e02f',ramdisk_id='',reservation_id='r-f8rcyvc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-166541249',owner_user_name='tempest-ListServerFiltersTestJSON-166541249-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:20:46Z,user_data=None,user_id='a80ca71875e8413caa2b52e679e1dd40',uuid=ce5ab48d-0d12-49ac-92a1-001e91e26553,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.702 226239 DEBUG nova.network.os_vif_util [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converting VIF {"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.703 226239 DEBUG nova.network.os_vif_util [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.703 226239 DEBUG os_vif [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.705 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.705 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.706 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.710 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.712 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap36d8ec1c-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.713 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap36d8ec1c-f2, col_values=(('external_ids', {'iface-id': '36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0f:79:ef', 'vm-uuid': 'ce5ab48d-0d12-49ac-92a1-001e91e26553'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:57 np0005603623 NetworkManager[48970]: <info>  [1769847657.7165] manager: (tap36d8ec1c-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.716 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.719 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.720 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.721 226239 INFO os_vif [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2')#033[00m
Jan 31 03:20:57 np0005603623 kernel: tap36d8ec1c-f2: entered promiscuous mode
Jan 31 03:20:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:57Z|00415|binding|INFO|Claiming lport 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 for this chassis.
Jan 31 03:20:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:57Z|00416|binding|INFO|36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867: Claiming fa:16:3e:0f:79:ef 10.100.0.10
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.797 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:57 np0005603623 NetworkManager[48970]: <info>  [1769847657.7988] manager: (tap36d8ec1c-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/199)
Jan 31 03:20:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:57Z|00417|binding|INFO|Setting lport 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 ovn-installed in OVS
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.806 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.808 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:57 np0005603623 systemd-udevd[271857]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:20:57 np0005603623 systemd-machined[194379]: New machine qemu-47-instance-0000006a.
Jan 31 03:20:57 np0005603623 NetworkManager[48970]: <info>  [1769847657.8273] device (tap36d8ec1c-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:20:57 np0005603623 NetworkManager[48970]: <info>  [1769847657.8278] device (tap36d8ec1c-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:20:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:20:57Z|00418|binding|INFO|Setting lport 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 up in Southbound
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.841 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:79:ef 10.100.0.10'], port_security=['fa:16:3e:0f:79:ef 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ce5ab48d-0d12-49ac-92a1-001e91e26553', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81dd779a-d164-4109-911b-0834e390c815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782758ebebe64580accb21a22280e02f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '16d4de5b-4914-4656-8fe1-e7d7abed377f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f936838-8680-43ea-b7b8-c96b02e037d3, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.843 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 in datapath 81dd779a-d164-4109-911b-0834e390c815 bound to our chassis#033[00m
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.844 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81dd779a-d164-4109-911b-0834e390c815#033[00m
Jan 31 03:20:57 np0005603623 systemd[1]: Started Virtual Machine qemu-47-instance-0000006a.
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.855 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9afc9bdc-5272-41fd-9557-d37343d1a2e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.879 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b7d997-9a3d-4b01-a6ac-ecb2a8259f31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.882 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ff0ab0-2cd4-4000-b0e9-c1d68d025aaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.904 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[51010c69-7acd-41b1-a934-662d83f977b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.916 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dbf370ee-e6ee-4ce3-bc06-70438376787d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81dd779a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675771, 'reachable_time': 16773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271871, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.928 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a104c60b-f60b-4799-a8e3-6c7cd7feb6ad]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap81dd779a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675779, 'tstamp': 675779}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271872, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap81dd779a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675781, 'tstamp': 675781}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271872, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.930 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81dd779a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.931 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:57 np0005603623 nova_compute[226235]: 2026-01-31 08:20:57.932 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.933 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81dd779a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.933 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.934 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81dd779a-d0, col_values=(('external_ids', {'iface-id': 'cd015007-f775-4d63-920f-2a0c657e4d70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:20:57.934 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:58.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.386 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Removed pending event for ce5ab48d-0d12-49ac-92a1-001e91e26553 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.387 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847658.3856366, ce5ab48d-0d12-49ac-92a1-001e91e26553 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.387 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.389 226239 DEBUG nova.compute.manager [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.393 226239 INFO nova.virt.libvirt.driver [-] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Instance rebooted successfully.#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.394 226239 DEBUG nova.compute.manager [None req-e6b8ad37-5087-4a70-92a6-16611f66c056 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.461 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.465 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:20:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:20:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:20:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:58.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.644 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847658.3868842, ce5ab48d-0d12-49ac-92a1-001e91e26553 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.645 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] VM Started (Lifecycle Event)#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.703 226239 DEBUG nova.compute.manager [req-90255063-f747-4f65-8e79-7bddbb7eaba9 req-439d28ee-a780-4bec-a1a9-c7dec1a63918 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.703 226239 DEBUG oslo_concurrency.lockutils [req-90255063-f747-4f65-8e79-7bddbb7eaba9 req-439d28ee-a780-4bec-a1a9-c7dec1a63918 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.704 226239 DEBUG oslo_concurrency.lockutils [req-90255063-f747-4f65-8e79-7bddbb7eaba9 req-439d28ee-a780-4bec-a1a9-c7dec1a63918 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.704 226239 DEBUG oslo_concurrency.lockutils [req-90255063-f747-4f65-8e79-7bddbb7eaba9 req-439d28ee-a780-4bec-a1a9-c7dec1a63918 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.704 226239 DEBUG nova.compute.manager [req-90255063-f747-4f65-8e79-7bddbb7eaba9 req-439d28ee-a780-4bec-a1a9-c7dec1a63918 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] No waiting events found dispatching network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.705 226239 WARNING nova.compute.manager [req-90255063-f747-4f65-8e79-7bddbb7eaba9 req-439d28ee-a780-4bec-a1a9-c7dec1a63918 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received unexpected event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.762 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:58 np0005603623 nova_compute[226235]: 2026-01-31 08:20:58.766 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:20:59 np0005603623 nova_compute[226235]: 2026-01-31 08:20:59.116 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:00.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:00.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:01 np0005603623 nova_compute[226235]: 2026-01-31 08:21:01.441 226239 DEBUG nova.compute.manager [req-093bfb46-1e37-4663-bab2-ff271706ec40 req-413195b0-1b34-4550-b4e6-9595f1babcb3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:21:01 np0005603623 nova_compute[226235]: 2026-01-31 08:21:01.441 226239 DEBUG oslo_concurrency.lockutils [req-093bfb46-1e37-4663-bab2-ff271706ec40 req-413195b0-1b34-4550-b4e6-9595f1babcb3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:01 np0005603623 nova_compute[226235]: 2026-01-31 08:21:01.441 226239 DEBUG oslo_concurrency.lockutils [req-093bfb46-1e37-4663-bab2-ff271706ec40 req-413195b0-1b34-4550-b4e6-9595f1babcb3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:01 np0005603623 nova_compute[226235]: 2026-01-31 08:21:01.442 226239 DEBUG oslo_concurrency.lockutils [req-093bfb46-1e37-4663-bab2-ff271706ec40 req-413195b0-1b34-4550-b4e6-9595f1babcb3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:01 np0005603623 nova_compute[226235]: 2026-01-31 08:21:01.442 226239 DEBUG nova.compute.manager [req-093bfb46-1e37-4663-bab2-ff271706ec40 req-413195b0-1b34-4550-b4e6-9595f1babcb3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] No waiting events found dispatching network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:21:01 np0005603623 nova_compute[226235]: 2026-01-31 08:21:01.442 226239 WARNING nova.compute.manager [req-093bfb46-1e37-4663-bab2-ff271706ec40 req-413195b0-1b34-4550-b4e6-9595f1babcb3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received unexpected event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:21:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:21:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:02.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:21:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:02.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:02 np0005603623 nova_compute[226235]: 2026-01-31 08:21:02.717 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:04 np0005603623 nova_compute[226235]: 2026-01-31 08:21:04.119 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:04.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:04.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:05 np0005603623 nova_compute[226235]: 2026-01-31 08:21:05.227 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:05 np0005603623 nova_compute[226235]: 2026-01-31 08:21:05.228 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:21:05 np0005603623 nova_compute[226235]: 2026-01-31 08:21:05.284 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:21:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:06.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:21:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:06.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:21:07 np0005603623 ceph-mds[84161]: mds.beacon.cephfs.compute-2.asgtzy missed beacon ack from the monitors
Jan 31 03:21:07 np0005603623 nova_compute[226235]: 2026-01-31 08:21:07.719 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:08.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:08.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:09 np0005603623 nova_compute[226235]: 2026-01-31 08:21:09.120 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:10.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:10.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:21:11Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0f:79:ef 10.100.0.10
Jan 31 03:21:11 np0005603623 podman[271975]: 2026-01-31 08:21:11.989251853 +0000 UTC m=+0.067711280 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:21:11 np0005603623 podman[271974]: 2026-01-31 08:21:11.991908346 +0000 UTC m=+0.070344143 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Jan 31 03:21:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:21:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:12.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:21:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:12.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:12 np0005603623 nova_compute[226235]: 2026-01-31 08:21:12.722 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:13 np0005603623 ovn_controller[133449]: 2026-01-31T08:21:13Z|00419|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.335 226239 DEBUG oslo_concurrency.lockutils [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.336 226239 DEBUG oslo_concurrency.lockutils [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.336 226239 DEBUG oslo_concurrency.lockutils [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.336 226239 DEBUG oslo_concurrency.lockutils [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.336 226239 DEBUG oslo_concurrency.lockutils [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.338 226239 INFO nova.compute.manager [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Terminating instance#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.338 226239 DEBUG nova.compute.manager [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:21:13 np0005603623 kernel: tap60d954a7-a9 (unregistering): left promiscuous mode
Jan 31 03:21:13 np0005603623 NetworkManager[48970]: <info>  [1769847673.5393] device (tap60d954a7-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:21:13 np0005603623 ovn_controller[133449]: 2026-01-31T08:21:13Z|00420|binding|INFO|Releasing lport 60d954a7-a949-4701-b77d-16de80bc2317 from this chassis (sb_readonly=0)
Jan 31 03:21:13 np0005603623 ovn_controller[133449]: 2026-01-31T08:21:13Z|00421|binding|INFO|Setting lport 60d954a7-a949-4701-b77d-16de80bc2317 down in Southbound
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.548 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:13 np0005603623 ovn_controller[133449]: 2026-01-31T08:21:13Z|00422|binding|INFO|Removing iface tap60d954a7-a9 ovn-installed in OVS
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.551 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.555 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:13 np0005603623 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Jan 31 03:21:13 np0005603623 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006c.scope: Consumed 14.096s CPU time.
Jan 31 03:21:13 np0005603623 systemd-machined[194379]: Machine qemu-46-instance-0000006c terminated.
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.712 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:33:b2 10.100.0.11'], port_security=['fa:16:3e:c1:33:b2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4ef2381f-8f5e-4a65-b2fa-c015131646fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81dd779a-d164-4109-911b-0834e390c815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782758ebebe64580accb21a22280e02f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '16d4de5b-4914-4656-8fe1-e7d7abed377f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f936838-8680-43ea-b7b8-c96b02e037d3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=60d954a7-a949-4701-b77d-16de80bc2317) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.714 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 60d954a7-a949-4701-b77d-16de80bc2317 in datapath 81dd779a-d164-4109-911b-0834e390c815 unbound from our chassis#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.715 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81dd779a-d164-4109-911b-0834e390c815#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.727 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[22c88397-7af2-4786-938e-e3f27f1cd221]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.749 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2fb5a8-48b4-43d1-b710-2703c77bca4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.752 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[99ca9b00-e281-4628-ab30-1bdba5dbd83d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.769 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[922f53f5-2bea-40f3-9c46-09ac8f6c2c15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.776 226239 INFO nova.virt.libvirt.driver [-] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Instance destroyed successfully.#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.776 226239 DEBUG nova.objects.instance [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'resources' on Instance uuid 4ef2381f-8f5e-4a65-b2fa-c015131646fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.781 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f6b6ac34-f4c6-4be3-889d-57ae7632430b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81dd779a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675771, 'reachable_time': 16773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272041, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.793 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[da103aca-7f4a-4952-8c53-fe216810a5ca]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap81dd779a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675779, 'tstamp': 675779}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272042, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap81dd779a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675781, 'tstamp': 675781}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272042, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.795 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81dd779a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.796 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.799 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.800 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81dd779a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.800 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.800 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81dd779a-d0, col_values=(('external_ids', {'iface-id': 'cd015007-f775-4d63-920f-2a0c657e4d70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:13.801 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.935 226239 DEBUG nova.virt.libvirt.vif [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:20:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-21873594',display_name='tempest-ListServerFiltersTestJSON-instance-21873594',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-21873594',id=108,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:20:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='782758ebebe64580accb21a22280e02f',ramdisk_id='',reservation_id='r-0miqnhf5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-166541249',owner_user_name='tempest-ListServerFiltersTestJSON-166541249-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:20:26Z,user_data=None,user_id='a80ca71875e8413caa2b52e679e1dd40',uuid=4ef2381f-8f5e-4a65-b2fa-c015131646fb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60d954a7-a949-4701-b77d-16de80bc2317", "address": "fa:16:3e:c1:33:b2", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d954a7-a9", "ovs_interfaceid": "60d954a7-a949-4701-b77d-16de80bc2317", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.935 226239 DEBUG nova.network.os_vif_util [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converting VIF {"id": "60d954a7-a949-4701-b77d-16de80bc2317", "address": "fa:16:3e:c1:33:b2", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d954a7-a9", "ovs_interfaceid": "60d954a7-a949-4701-b77d-16de80bc2317", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.936 226239 DEBUG nova.network.os_vif_util [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:33:b2,bridge_name='br-int',has_traffic_filtering=True,id=60d954a7-a949-4701-b77d-16de80bc2317,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d954a7-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.936 226239 DEBUG os_vif [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:33:b2,bridge_name='br-int',has_traffic_filtering=True,id=60d954a7-a949-4701-b77d-16de80bc2317,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d954a7-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.937 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.938 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60d954a7-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.939 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.940 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:13 np0005603623 nova_compute[226235]: 2026-01-31 08:21:13.942 226239 INFO os_vif [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:33:b2,bridge_name='br-int',has_traffic_filtering=True,id=60d954a7-a949-4701-b77d-16de80bc2317,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d954a7-a9')#033[00m
Jan 31 03:21:14 np0005603623 nova_compute[226235]: 2026-01-31 08:21:14.122 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:14.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:21:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4165705960' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:21:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:21:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4165705960' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:21:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:14.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:15 np0005603623 nova_compute[226235]: 2026-01-31 08:21:15.757 226239 INFO nova.virt.libvirt.driver [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Deleting instance files /var/lib/nova/instances/4ef2381f-8f5e-4a65-b2fa-c015131646fb_del#033[00m
Jan 31 03:21:15 np0005603623 nova_compute[226235]: 2026-01-31 08:21:15.758 226239 INFO nova.virt.libvirt.driver [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Deletion of /var/lib/nova/instances/4ef2381f-8f5e-4a65-b2fa-c015131646fb_del complete#033[00m
Jan 31 03:21:16 np0005603623 nova_compute[226235]: 2026-01-31 08:21:16.062 226239 INFO nova.compute.manager [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Took 2.72 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:21:16 np0005603623 nova_compute[226235]: 2026-01-31 08:21:16.063 226239 DEBUG oslo.service.loopingcall [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:21:16 np0005603623 nova_compute[226235]: 2026-01-31 08:21:16.063 226239 DEBUG nova.compute.manager [-] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:21:16 np0005603623 nova_compute[226235]: 2026-01-31 08:21:16.063 226239 DEBUG nova.network.neutron [-] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:21:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:16.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:16.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:16 np0005603623 podman[272237]: 2026-01-31 08:21:16.990366187 +0000 UTC m=+0.201486496 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:21:17 np0005603623 podman[272237]: 2026-01-31 08:21:17.104194756 +0000 UTC m=+0.315315045 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 03:21:17 np0005603623 nova_compute[226235]: 2026-01-31 08:21:17.995 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:18 np0005603623 podman[272391]: 2026-01-31 08:21:18.007440369 +0000 UTC m=+0.243533789 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:21:18 np0005603623 podman[272413]: 2026-01-31 08:21:18.146212262 +0000 UTC m=+0.126765548 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:21:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:18.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:18 np0005603623 podman[272391]: 2026-01-31 08:21:18.259540555 +0000 UTC m=+0.495633985 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:21:18 np0005603623 nova_compute[226235]: 2026-01-31 08:21:18.375 226239 WARNING nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] While synchronizing instance power states, found 2 instances in the database and 1 instances on the hypervisor.#033[00m
Jan 31 03:21:18 np0005603623 nova_compute[226235]: 2026-01-31 08:21:18.375 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Triggering sync for uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:21:18 np0005603623 nova_compute[226235]: 2026-01-31 08:21:18.375 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Triggering sync for uuid 4ef2381f-8f5e-4a65-b2fa-c015131646fb _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:21:18 np0005603623 nova_compute[226235]: 2026-01-31 08:21:18.375 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:18 np0005603623 nova_compute[226235]: 2026-01-31 08:21:18.376 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:18 np0005603623 nova_compute[226235]: 2026-01-31 08:21:18.376 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:21:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:18.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:21:18 np0005603623 nova_compute[226235]: 2026-01-31 08:21:18.554 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:18 np0005603623 nova_compute[226235]: 2026-01-31 08:21:18.579 226239 DEBUG nova.network.neutron [-] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:21:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:18.585 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:21:18 np0005603623 nova_compute[226235]: 2026-01-31 08:21:18.585 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:18.586 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:21:18 np0005603623 podman[272457]: 2026-01-31 08:21:18.839184651 +0000 UTC m=+0.317334639 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, version=2.2.4, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, release=1793, architecture=x86_64, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public)
Jan 31 03:21:18 np0005603623 nova_compute[226235]: 2026-01-31 08:21:18.939 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:18 np0005603623 nova_compute[226235]: 2026-01-31 08:21:18.943 226239 INFO nova.compute.manager [-] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Took 2.88 seconds to deallocate network for instance.#033[00m
Jan 31 03:21:19 np0005603623 podman[272478]: 2026-01-31 08:21:19.003977674 +0000 UTC m=+0.149738140 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, architecture=x86_64, vcs-type=git, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 31 03:21:19 np0005603623 podman[272457]: 2026-01-31 08:21:19.060881352 +0000 UTC m=+0.539031320 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, distribution-scope=public, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, name=keepalived, architecture=x86_64)
Jan 31 03:21:19 np0005603623 nova_compute[226235]: 2026-01-31 08:21:19.123 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:19 np0005603623 nova_compute[226235]: 2026-01-31 08:21:19.339 226239 DEBUG oslo_concurrency.lockutils [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:19 np0005603623 nova_compute[226235]: 2026-01-31 08:21:19.339 226239 DEBUG oslo_concurrency.lockutils [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:19 np0005603623 nova_compute[226235]: 2026-01-31 08:21:19.434 226239 DEBUG oslo_concurrency.processutils [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:21:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:21:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1867671594' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:21:19 np0005603623 nova_compute[226235]: 2026-01-31 08:21:19.864 226239 DEBUG oslo_concurrency.processutils [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:21:19 np0005603623 nova_compute[226235]: 2026-01-31 08:21:19.871 226239 DEBUG nova.compute.provider_tree [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:21:20 np0005603623 nova_compute[226235]: 2026-01-31 08:21:20.087 226239 DEBUG nova.compute.manager [req-01e21673-31a6-4431-b833-07a789abe39d req-f5d3e365-1931-44cd-ac36-2aad2c444bf4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Received event network-vif-deleted-60d954a7-a949-4701-b77d-16de80bc2317 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:21:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:20.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:20 np0005603623 nova_compute[226235]: 2026-01-31 08:21:20.243 226239 DEBUG nova.scheduler.client.report [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:21:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:21:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:21:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:21:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:21:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:21:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:21:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:20.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:21:20 np0005603623 nova_compute[226235]: 2026-01-31 08:21:20.510 226239 DEBUG oslo_concurrency.lockutils [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:20 np0005603623 nova_compute[226235]: 2026-01-31 08:21:20.766 226239 INFO nova.scheduler.client.report [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Deleted allocations for instance 4ef2381f-8f5e-4a65-b2fa-c015131646fb#033[00m
Jan 31 03:21:21 np0005603623 nova_compute[226235]: 2026-01-31 08:21:21.185 226239 DEBUG oslo_concurrency.lockutils [None req-db4327e2-517f-4320-a6c9-099e6edec286 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.849s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:21 np0005603623 nova_compute[226235]: 2026-01-31 08:21:21.186 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 2.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:21 np0005603623 nova_compute[226235]: 2026-01-31 08:21:21.186 226239 INFO nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 03:21:21 np0005603623 nova_compute[226235]: 2026-01-31 08:21:21.187 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "4ef2381f-8f5e-4a65-b2fa-c015131646fb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:22.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:21:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:22.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:21:23 np0005603623 nova_compute[226235]: 2026-01-31 08:21:23.941 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:24 np0005603623 nova_compute[226235]: 2026-01-31 08:21:24.125 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:21:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:24.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:21:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:21:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:24.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:21:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:24.588 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:26.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:26.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:21:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:21:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:28.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:28.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:28 np0005603623 nova_compute[226235]: 2026-01-31 08:21:28.774 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847673.77339, 4ef2381f-8f5e-4a65-b2fa-c015131646fb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:21:28 np0005603623 nova_compute[226235]: 2026-01-31 08:21:28.775 226239 INFO nova.compute.manager [-] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:21:28 np0005603623 nova_compute[226235]: 2026-01-31 08:21:28.944 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:29 np0005603623 nova_compute[226235]: 2026-01-31 08:21:29.126 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:29 np0005603623 nova_compute[226235]: 2026-01-31 08:21:29.287 226239 DEBUG nova.compute.manager [None req-612b9fe0-d0a3-4d23-88b6-ca0fd80512bb - - - - - -] [instance: 4ef2381f-8f5e-4a65-b2fa-c015131646fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:21:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:30.114 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:30.115 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:30.115 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:30.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:30.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:32.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:32.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:32 np0005603623 nova_compute[226235]: 2026-01-31 08:21:32.798 226239 DEBUG oslo_concurrency.lockutils [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:32 np0005603623 nova_compute[226235]: 2026-01-31 08:21:32.799 226239 DEBUG oslo_concurrency.lockutils [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:32 np0005603623 nova_compute[226235]: 2026-01-31 08:21:32.800 226239 DEBUG oslo_concurrency.lockutils [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:32 np0005603623 nova_compute[226235]: 2026-01-31 08:21:32.800 226239 DEBUG oslo_concurrency.lockutils [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:32 np0005603623 nova_compute[226235]: 2026-01-31 08:21:32.800 226239 DEBUG oslo_concurrency.lockutils [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:32 np0005603623 nova_compute[226235]: 2026-01-31 08:21:32.801 226239 INFO nova.compute.manager [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Terminating instance#033[00m
Jan 31 03:21:32 np0005603623 nova_compute[226235]: 2026-01-31 08:21:32.802 226239 DEBUG nova.compute.manager [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:21:32 np0005603623 kernel: tap36d8ec1c-f2 (unregistering): left promiscuous mode
Jan 31 03:21:32 np0005603623 NetworkManager[48970]: <info>  [1769847692.8636] device (tap36d8ec1c-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:21:32 np0005603623 nova_compute[226235]: 2026-01-31 08:21:32.871 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:21:32Z|00423|binding|INFO|Releasing lport 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 from this chassis (sb_readonly=0)
Jan 31 03:21:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:21:32Z|00424|binding|INFO|Setting lport 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 down in Southbound
Jan 31 03:21:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:21:32Z|00425|binding|INFO|Removing iface tap36d8ec1c-f2 ovn-installed in OVS
Jan 31 03:21:32 np0005603623 nova_compute[226235]: 2026-01-31 08:21:32.881 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:32 np0005603623 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Jan 31 03:21:32 np0005603623 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006a.scope: Consumed 12.946s CPU time.
Jan 31 03:21:32 np0005603623 systemd-machined[194379]: Machine qemu-47-instance-0000006a terminated.
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.002 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:79:ef 10.100.0.10'], port_security=['fa:16:3e:0f:79:ef 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ce5ab48d-0d12-49ac-92a1-001e91e26553', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81dd779a-d164-4109-911b-0834e390c815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782758ebebe64580accb21a22280e02f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '16d4de5b-4914-4656-8fe1-e7d7abed377f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f936838-8680-43ea-b7b8-c96b02e037d3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.004 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 in datapath 81dd779a-d164-4109-911b-0834e390c815 unbound from our chassis#033[00m
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.006 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81dd779a-d164-4109-911b-0834e390c815, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.007 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[49341f2b-048f-4656-84db-fe0537696cfc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.007 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-81dd779a-d164-4109-911b-0834e390c815 namespace which is not needed anymore#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.018 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.021 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.030 226239 INFO nova.virt.libvirt.driver [-] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Instance destroyed successfully.#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.030 226239 DEBUG nova.objects.instance [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'resources' on Instance uuid ce5ab48d-0d12-49ac-92a1-001e91e26553 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:21:33 np0005603623 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[271334]: [NOTICE]   (271338) : haproxy version is 2.8.14-c23fe91
Jan 31 03:21:33 np0005603623 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[271334]: [NOTICE]   (271338) : path to executable is /usr/sbin/haproxy
Jan 31 03:21:33 np0005603623 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[271334]: [WARNING]  (271338) : Exiting Master process...
Jan 31 03:21:33 np0005603623 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[271334]: [WARNING]  (271338) : Exiting Master process...
Jan 31 03:21:33 np0005603623 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[271334]: [ALERT]    (271338) : Current worker (271340) exited with code 143 (Terminated)
Jan 31 03:21:33 np0005603623 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[271334]: [WARNING]  (271338) : All workers exited. Exiting... (0)
Jan 31 03:21:33 np0005603623 systemd[1]: libpod-df77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb.scope: Deactivated successfully.
Jan 31 03:21:33 np0005603623 podman[272787]: 2026-01-31 08:21:33.120222916 +0000 UTC m=+0.044834511 container died df77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 31 03:21:33 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-df77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb-userdata-shm.mount: Deactivated successfully.
Jan 31 03:21:33 np0005603623 systemd[1]: var-lib-containers-storage-overlay-692b5d2c4be40cebb3d302e66665518b2dacfaee66b41372c6bfecbd64aa8df1-merged.mount: Deactivated successfully.
Jan 31 03:21:33 np0005603623 podman[272787]: 2026-01-31 08:21:33.177980012 +0000 UTC m=+0.102591607 container cleanup df77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:21:33 np0005603623 systemd[1]: libpod-conmon-df77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb.scope: Deactivated successfully.
Jan 31 03:21:33 np0005603623 podman[272817]: 2026-01-31 08:21:33.231171815 +0000 UTC m=+0.039977339 container remove df77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.235 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb057a1-261d-4bec-983c-8af4787e83da]: (4, ('Sat Jan 31 08:21:33 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815 (df77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb)\ndf77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb\nSat Jan 31 08:21:33 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815 (df77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb)\ndf77f13a882c53561426d33e397e93075e610ddf34410d40384efce763857ddb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.237 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6644748b-76d9-4e64-923d-cd14832f9b51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.238 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81dd779a-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.240 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:33 np0005603623 kernel: tap81dd779a-d0: left promiscuous mode
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.250 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.253 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a5893f-20a3-473f-b0b6-5db8fdae4f90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.259 226239 DEBUG nova.virt.libvirt.vif [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-390925815',display_name='tempest-ListServerFiltersTestJSON-instance-390925815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-390925815',id=106,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:20:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='782758ebebe64580accb21a22280e02f',ramdisk_id='',reservation_id='r-f8rcyvc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-166541249',owner_user_name='tempest-ListServerFiltersTestJSON-166541249-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:20:58Z,user_data=None,user_id='a80ca71875e8413caa2b52e679e1dd40',uuid=ce5ab48d-0d12-49ac-92a1-001e91e26553,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.260 226239 DEBUG nova.network.os_vif_util [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converting VIF {"id": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "address": "fa:16:3e:0f:79:ef", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36d8ec1c-f2", "ovs_interfaceid": "36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.261 226239 DEBUG nova.network.os_vif_util [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.261 226239 DEBUG os_vif [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.263 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.264 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36d8ec1c-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.265 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.268 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.268 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf8c26d-5776-4f5f-b2d1-4652343e0f92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.270 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ae429bd0-9ea8-4fc5-8129-6eb2c2bc4ab1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.271 226239 INFO os_vif [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:79:ef,bridge_name='br-int',has_traffic_filtering=True,id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36d8ec1c-f2')#033[00m
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.282 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5930915b-a842-4d52-a245-847595401718]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675766, 'reachable_time': 24134, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272837, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:33 np0005603623 systemd[1]: run-netns-ovnmeta\x2d81dd779a\x2dd164\x2d4109\x2d911b\x2d0834e390c815.mount: Deactivated successfully.
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.284 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-81dd779a-d164-4109-911b-0834e390c815 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:21:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:21:33.285 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[5d945c88-f8b5-4ceb-a3ab-74fb49f32c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.832 226239 INFO nova.virt.libvirt.driver [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Deleting instance files /var/lib/nova/instances/ce5ab48d-0d12-49ac-92a1-001e91e26553_del#033[00m
Jan 31 03:21:33 np0005603623 nova_compute[226235]: 2026-01-31 08:21:33.833 226239 INFO nova.virt.libvirt.driver [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Deletion of /var/lib/nova/instances/ce5ab48d-0d12-49ac-92a1-001e91e26553_del complete#033[00m
Jan 31 03:21:34 np0005603623 nova_compute[226235]: 2026-01-31 08:21:34.048 226239 DEBUG nova.compute.manager [req-5078ce55-9dff-4b30-ae36-406edf3f4c12 req-d8b429c1-bbac-47e3-b7c0-72312e1aa7d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received event network-vif-unplugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:21:34 np0005603623 nova_compute[226235]: 2026-01-31 08:21:34.049 226239 DEBUG oslo_concurrency.lockutils [req-5078ce55-9dff-4b30-ae36-406edf3f4c12 req-d8b429c1-bbac-47e3-b7c0-72312e1aa7d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:34 np0005603623 nova_compute[226235]: 2026-01-31 08:21:34.049 226239 DEBUG oslo_concurrency.lockutils [req-5078ce55-9dff-4b30-ae36-406edf3f4c12 req-d8b429c1-bbac-47e3-b7c0-72312e1aa7d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:34 np0005603623 nova_compute[226235]: 2026-01-31 08:21:34.049 226239 DEBUG oslo_concurrency.lockutils [req-5078ce55-9dff-4b30-ae36-406edf3f4c12 req-d8b429c1-bbac-47e3-b7c0-72312e1aa7d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:34 np0005603623 nova_compute[226235]: 2026-01-31 08:21:34.049 226239 DEBUG nova.compute.manager [req-5078ce55-9dff-4b30-ae36-406edf3f4c12 req-d8b429c1-bbac-47e3-b7c0-72312e1aa7d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] No waiting events found dispatching network-vif-unplugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:21:34 np0005603623 nova_compute[226235]: 2026-01-31 08:21:34.050 226239 DEBUG nova.compute.manager [req-5078ce55-9dff-4b30-ae36-406edf3f4c12 req-d8b429c1-bbac-47e3-b7c0-72312e1aa7d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received event network-vif-unplugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:21:34 np0005603623 nova_compute[226235]: 2026-01-31 08:21:34.127 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:21:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:34.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:21:34 np0005603623 nova_compute[226235]: 2026-01-31 08:21:34.401 226239 INFO nova.compute.manager [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Took 1.60 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:21:34 np0005603623 nova_compute[226235]: 2026-01-31 08:21:34.402 226239 DEBUG oslo.service.loopingcall [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:21:34 np0005603623 nova_compute[226235]: 2026-01-31 08:21:34.402 226239 DEBUG nova.compute.manager [-] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:21:34 np0005603623 nova_compute[226235]: 2026-01-31 08:21:34.402 226239 DEBUG nova.network.neutron [-] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:21:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:34.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:36 np0005603623 nova_compute[226235]: 2026-01-31 08:21:36.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:36 np0005603623 nova_compute[226235]: 2026-01-31 08:21:36.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:21:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:36.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:36 np0005603623 nova_compute[226235]: 2026-01-31 08:21:36.510 226239 DEBUG nova.compute.manager [req-108ec188-5fd6-45a7-8ce0-2175f6e1a0d0 req-c65d8a56-a560-4bf2-8f21-206bda5c3bf9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:21:36 np0005603623 nova_compute[226235]: 2026-01-31 08:21:36.510 226239 DEBUG oslo_concurrency.lockutils [req-108ec188-5fd6-45a7-8ce0-2175f6e1a0d0 req-c65d8a56-a560-4bf2-8f21-206bda5c3bf9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:36 np0005603623 nova_compute[226235]: 2026-01-31 08:21:36.511 226239 DEBUG oslo_concurrency.lockutils [req-108ec188-5fd6-45a7-8ce0-2175f6e1a0d0 req-c65d8a56-a560-4bf2-8f21-206bda5c3bf9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:36 np0005603623 nova_compute[226235]: 2026-01-31 08:21:36.511 226239 DEBUG oslo_concurrency.lockutils [req-108ec188-5fd6-45a7-8ce0-2175f6e1a0d0 req-c65d8a56-a560-4bf2-8f21-206bda5c3bf9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:36 np0005603623 nova_compute[226235]: 2026-01-31 08:21:36.511 226239 DEBUG nova.compute.manager [req-108ec188-5fd6-45a7-8ce0-2175f6e1a0d0 req-c65d8a56-a560-4bf2-8f21-206bda5c3bf9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] No waiting events found dispatching network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:21:36 np0005603623 nova_compute[226235]: 2026-01-31 08:21:36.511 226239 WARNING nova.compute.manager [req-108ec188-5fd6-45a7-8ce0-2175f6e1a0d0 req-c65d8a56-a560-4bf2-8f21-206bda5c3bf9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received unexpected event network-vif-plugged-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:21:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:36.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:37 np0005603623 nova_compute[226235]: 2026-01-31 08:21:37.274 226239 DEBUG nova.network.neutron [-] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:21:37 np0005603623 nova_compute[226235]: 2026-01-31 08:21:37.305 226239 DEBUG nova.compute.manager [req-3afb11c8-7c0d-4baa-8dc6-372a515db2ab req-942dd803-59b6-4e94-898b-eacbeeb9a618 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Received event network-vif-deleted-36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:21:37 np0005603623 nova_compute[226235]: 2026-01-31 08:21:37.306 226239 INFO nova.compute.manager [req-3afb11c8-7c0d-4baa-8dc6-372a515db2ab req-942dd803-59b6-4e94-898b-eacbeeb9a618 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Neutron deleted interface 36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:21:37 np0005603623 nova_compute[226235]: 2026-01-31 08:21:37.306 226239 DEBUG nova.network.neutron [req-3afb11c8-7c0d-4baa-8dc6-372a515db2ab req-942dd803-59b6-4e94-898b-eacbeeb9a618 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:21:37 np0005603623 nova_compute[226235]: 2026-01-31 08:21:37.964 226239 INFO nova.compute.manager [-] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Took 3.56 seconds to deallocate network for instance.#033[00m
Jan 31 03:21:37 np0005603623 nova_compute[226235]: 2026-01-31 08:21:37.969 226239 DEBUG nova.compute.manager [req-3afb11c8-7c0d-4baa-8dc6-372a515db2ab req-942dd803-59b6-4e94-898b-eacbeeb9a618 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Detach interface failed, port_id=36d8ec1c-f2a3-4dc4-8d00-a57ae28ac867, reason: Instance ce5ab48d-0d12-49ac-92a1-001e91e26553 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:21:38 np0005603623 nova_compute[226235]: 2026-01-31 08:21:38.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:38 np0005603623 nova_compute[226235]: 2026-01-31 08:21:38.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:21:38 np0005603623 nova_compute[226235]: 2026-01-31 08:21:38.159 226239 DEBUG oslo_concurrency.lockutils [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:38 np0005603623 nova_compute[226235]: 2026-01-31 08:21:38.160 226239 DEBUG oslo_concurrency.lockutils [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:38.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:38 np0005603623 nova_compute[226235]: 2026-01-31 08:21:38.220 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:21:38 np0005603623 nova_compute[226235]: 2026-01-31 08:21:38.221 226239 DEBUG oslo_concurrency.processutils [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:21:38 np0005603623 nova_compute[226235]: 2026-01-31 08:21:38.266 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:21:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:38.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:21:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:21:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/826441816' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:21:38 np0005603623 nova_compute[226235]: 2026-01-31 08:21:38.616 226239 DEBUG oslo_concurrency.processutils [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.395s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:21:38 np0005603623 nova_compute[226235]: 2026-01-31 08:21:38.622 226239 DEBUG nova.compute.provider_tree [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:21:38 np0005603623 nova_compute[226235]: 2026-01-31 08:21:38.755 226239 DEBUG nova.scheduler.client.report [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.103 226239 DEBUG oslo_concurrency.lockutils [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.130 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.361 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.361 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.362 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.362 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.363 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.409 226239 INFO nova.scheduler.client.report [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Deleted allocations for instance ce5ab48d-0d12-49ac-92a1-001e91e26553#033[00m
Jan 31 03:21:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:21:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2131231249' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.772 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.890 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.891 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4521MB free_disk=20.75481414794922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.891 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:39 np0005603623 nova_compute[226235]: 2026-01-31 08:21:39.892 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:40 np0005603623 nova_compute[226235]: 2026-01-31 08:21:40.115 226239 DEBUG oslo_concurrency.lockutils [None req-8b969447-8342-4a1f-b27f-d87d14c45219 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "ce5ab48d-0d12-49ac-92a1-001e91e26553" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:40.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:40.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:40 np0005603623 nova_compute[226235]: 2026-01-31 08:21:40.765 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:21:40 np0005603623 nova_compute[226235]: 2026-01-31 08:21:40.765 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:21:41 np0005603623 nova_compute[226235]: 2026-01-31 08:21:41.610 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:21:41 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #51. Immutable memtables: 8.
Jan 31 03:21:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:21:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1137338519' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:21:42 np0005603623 nova_compute[226235]: 2026-01-31 08:21:42.036 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:21:42 np0005603623 nova_compute[226235]: 2026-01-31 08:21:42.042 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:21:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:21:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:42.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:21:42 np0005603623 nova_compute[226235]: 2026-01-31 08:21:42.382 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:21:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:42.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:42 np0005603623 nova_compute[226235]: 2026-01-31 08:21:42.766 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 03:21:42 np0005603623 nova_compute[226235]: 2026-01-31 08:21:42.767 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:21:42 np0005603623 podman[272929]: 2026-01-31 08:21:42.979023837 +0000 UTC m=+0.075718522 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:21:42 np0005603623 podman[272930]: 2026-01-31 08:21:42.979028807 +0000 UTC m=+0.075745793 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 03:21:43 np0005603623 nova_compute[226235]: 2026-01-31 08:21:43.268 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:21:44 np0005603623 nova_compute[226235]: 2026-01-31 08:21:44.132 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:21:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:21:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:44.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:21:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:44.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:44 np0005603623 nova_compute[226235]: 2026-01-31 08:21:44.767 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:21:44 np0005603623 nova_compute[226235]: 2026-01-31 08:21:44.767 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:21:44 np0005603623 nova_compute[226235]: 2026-01-31 08:21:44.768 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:21:45 np0005603623 nova_compute[226235]: 2026-01-31 08:21:45.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:21:45 np0005603623 nova_compute[226235]: 2026-01-31 08:21:45.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:21:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:46.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:46.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:47 np0005603623 nova_compute[226235]: 2026-01-31 08:21:47.139 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:21:48 np0005603623 nova_compute[226235]: 2026-01-31 08:21:48.029 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847693.0284088, ce5ab48d-0d12-49ac-92a1-001e91e26553 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:21:48 np0005603623 nova_compute[226235]: 2026-01-31 08:21:48.030 226239 INFO nova.compute.manager [-] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] VM Stopped (Lifecycle Event)
Jan 31 03:21:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:48.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:48 np0005603623 nova_compute[226235]: 2026-01-31 08:21:48.244 226239 DEBUG nova.compute.manager [None req-32e8ae0b-b1a0-4882-920f-8025aa93da04 - - - - - -] [instance: ce5ab48d-0d12-49ac-92a1-001e91e26553] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:21:48 np0005603623 nova_compute[226235]: 2026-01-31 08:21:48.270 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:21:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:48.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:49 np0005603623 nova_compute[226235]: 2026-01-31 08:21:49.133 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:21:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:50.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:50.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:52.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:52.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:53 np0005603623 nova_compute[226235]: 2026-01-31 08:21:53.271 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:21:54 np0005603623 nova_compute[226235]: 2026-01-31 08:21:54.143 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:21:54 np0005603623 nova_compute[226235]: 2026-01-31 08:21:54.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:21:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:54.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:54.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:56.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:56.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #97. Immutable memtables: 0.
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.062252) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 59] Flushing memtable with next log file: 97
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847718062321, "job": 59, "event": "flush_started", "num_memtables": 1, "num_entries": 870, "num_deletes": 251, "total_data_size": 1675648, "memory_usage": 1699184, "flush_reason": "Manual Compaction"}
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 59] Level-0 flush table #98: started
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847718075053, "cf_name": "default", "job": 59, "event": "table_file_creation", "file_number": 98, "file_size": 1105091, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49288, "largest_seqno": 50153, "table_properties": {"data_size": 1100998, "index_size": 1809, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9498, "raw_average_key_size": 19, "raw_value_size": 1092719, "raw_average_value_size": 2295, "num_data_blocks": 79, "num_entries": 476, "num_filter_entries": 476, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847654, "oldest_key_time": 1769847654, "file_creation_time": 1769847718, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 98, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 59] Flush lasted 12848 microseconds, and 3174 cpu microseconds.
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.075089) [db/flush_job.cc:967] [default] [JOB 59] Level-0 flush table #98: 1105091 bytes OK
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.075115) [db/memtable_list.cc:519] [default] Level-0 commit table #98 started
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.080154) [db/memtable_list.cc:722] [default] Level-0 commit table #98: memtable #1 done
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.080193) EVENT_LOG_v1 {"time_micros": 1769847718080186, "job": 59, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.080215) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 59] Try to delete WAL files size 1671191, prev total WAL file size 1671191, number of live WAL files 2.
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000094.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.080828) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034303136' seq:72057594037927935, type:22 .. '7061786F730034323638' seq:0, type:0; will stop at (end)
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 60] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 59 Base level 0, inputs: [98(1079KB)], [96(10007KB)]
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847718080915, "job": 60, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [98], "files_L6": [96], "score": -1, "input_data_size": 11353231, "oldest_snapshot_seqno": -1}
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 60] Generated table #99: 7185 keys, 9511971 bytes, temperature: kUnknown
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847718195720, "cf_name": "default", "job": 60, "event": "table_file_creation", "file_number": 99, "file_size": 9511971, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9466180, "index_size": 26735, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17989, "raw_key_size": 187055, "raw_average_key_size": 26, "raw_value_size": 9340268, "raw_average_value_size": 1299, "num_data_blocks": 1047, "num_entries": 7185, "num_filter_entries": 7185, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769847718, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 99, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.195987) [db/compaction/compaction_job.cc:1663] [default] [JOB 60] Compacted 1@0 + 1@6 files to L6 => 9511971 bytes
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.202658) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 98.8 rd, 82.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 9.8 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(18.9) write-amplify(8.6) OK, records in: 7702, records dropped: 517 output_compression: NoCompression
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.202701) EVENT_LOG_v1 {"time_micros": 1769847718202686, "job": 60, "event": "compaction_finished", "compaction_time_micros": 114908, "compaction_time_cpu_micros": 25312, "output_level": 6, "num_output_files": 1, "total_output_size": 9511971, "num_input_records": 7702, "num_output_records": 7185, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000098.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847718203002, "job": 60, "event": "table_file_deletion", "file_number": 98}
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000096.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847718204252, "job": 60, "event": "table_file_deletion", "file_number": 96}
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.080680) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.204311) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.204315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.204316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.204317) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:21:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:21:58.204319) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:21:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:58.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:58 np0005603623 nova_compute[226235]: 2026-01-31 08:21:58.272 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:21:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:21:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:58.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:59 np0005603623 nova_compute[226235]: 2026-01-31 08:21:59.145 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:21:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:00.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:00.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:02.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:02.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:03 np0005603623 nova_compute[226235]: 2026-01-31 08:22:03.324 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:22:04 np0005603623 nova_compute[226235]: 2026-01-31 08:22:04.147 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:22:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:22:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:04.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:22:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:04.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:22:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:06.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:22:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:22:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:06.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:22:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:08.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:08 np0005603623 nova_compute[226235]: 2026-01-31 08:22:08.326 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:08.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:09 np0005603623 nova_compute[226235]: 2026-01-31 08:22:09.187 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:10.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:10.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:12.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:12.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:13 np0005603623 nova_compute[226235]: 2026-01-31 08:22:13.328 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:13 np0005603623 podman[273090]: 2026-01-31 08:22:13.942119759 +0000 UTC m=+0.040493691 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:22:14 np0005603623 podman[273091]: 2026-01-31 08:22:14.000482889 +0000 UTC m=+0.098529800 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller)
Jan 31 03:22:14 np0005603623 nova_compute[226235]: 2026-01-31 08:22:14.190 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:14.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:14.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:22:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2596611071' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:22:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:22:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2596611071' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:22:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:22:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:16.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:22:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:16.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:17 np0005603623 nova_compute[226235]: 2026-01-31 08:22:17.526 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:17 np0005603623 nova_compute[226235]: 2026-01-31 08:22:17.526 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:17 np0005603623 nova_compute[226235]: 2026-01-31 08:22:17.692 226239 DEBUG nova.compute.manager [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:22:17 np0005603623 nova_compute[226235]: 2026-01-31 08:22:17.948 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:17 np0005603623 nova_compute[226235]: 2026-01-31 08:22:17.949 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:17 np0005603623 nova_compute[226235]: 2026-01-31 08:22:17.955 226239 DEBUG nova.virt.hardware [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:22:17 np0005603623 nova_compute[226235]: 2026-01-31 08:22:17.956 226239 INFO nova.compute.claims [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:22:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:22:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:18.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:22:18 np0005603623 nova_compute[226235]: 2026-01-31 08:22:18.300 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:18 np0005603623 nova_compute[226235]: 2026-01-31 08:22:18.329 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:18.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:22:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2076072746' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:22:18 np0005603623 nova_compute[226235]: 2026-01-31 08:22:18.710 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:18 np0005603623 nova_compute[226235]: 2026-01-31 08:22:18.714 226239 DEBUG nova.compute.provider_tree [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:22:18 np0005603623 nova_compute[226235]: 2026-01-31 08:22:18.837 226239 DEBUG nova.scheduler.client.report [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:22:18 np0005603623 nova_compute[226235]: 2026-01-31 08:22:18.991 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:18 np0005603623 nova_compute[226235]: 2026-01-31 08:22:18.992 226239 DEBUG nova.compute.manager [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.128 226239 DEBUG nova.compute.manager [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.129 226239 DEBUG nova.network.neutron [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.232 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.293 226239 INFO nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.486 226239 DEBUG nova.compute.manager [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:22:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:19.491 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:22:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:19.492 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.492 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.876 226239 DEBUG nova.compute.manager [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.878 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.878 226239 INFO nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Creating image(s)#033[00m
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.907 226239 DEBUG nova.storage.rbd_utils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.937 226239 DEBUG nova.storage.rbd_utils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.970 226239 DEBUG nova.storage.rbd_utils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:19 np0005603623 nova_compute[226235]: 2026-01-31 08:22:19.976 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.046 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.047 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.048 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.048 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.073 226239 DEBUG nova.storage.rbd_utils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.076 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:20.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:20.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.594 226239 DEBUG nova.policy [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1d03198d8ab846bda092e089b2d5a6c7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.689 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.613s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.758 226239 DEBUG nova.storage.rbd_utils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] resizing rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.879 226239 DEBUG nova.objects.instance [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'migration_context' on Instance uuid 7b830774-2315-410b-a3ed-585a1d0b6ee2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.942 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.942 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Ensure instance console log exists: /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.943 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.943 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:20 np0005603623 nova_compute[226235]: 2026-01-31 08:22:20.943 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:22.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:22.494 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:22:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:22.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:22 np0005603623 nova_compute[226235]: 2026-01-31 08:22:22.613 226239 DEBUG nova.network.neutron [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Successfully created port: bb98b496-e57f-4e5a-bea6-fc68d9690077 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:22:23 np0005603623 nova_compute[226235]: 2026-01-31 08:22:23.331 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:22:24 np0005603623 nova_compute[226235]: 2026-01-31 08:22:24.233 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:22:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:24.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:24.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.0 total, 600.0 interval#012Cumulative writes: 9975 writes, 50K keys, 9975 commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.03 MB/s#012Cumulative WAL: 9975 writes, 9975 syncs, 1.00 writes per sync, written: 0.10 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1561 writes, 7901 keys, 1561 commit groups, 1.0 writes per commit group, ingest: 15.40 MB, 0.03 MB/s#012Interval WAL: 1561 writes, 1561 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     39.7      1.55              0.15        30    0.052       0      0       0.0       0.0#012  L6      1/0    9.07 MB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   4.4     66.5     55.7      4.88              0.60        29    0.168    174K    16K       0.0       0.0#012 Sum      1/0    9.07 MB   0.0      0.3     0.1      0.3       0.3      0.1       0.0   5.4     50.5     51.9      6.44              0.75        59    0.109    174K    16K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.9     68.2     66.3      1.23              0.18        14    0.088     52K   3633       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   0.0     66.5     55.7      4.88              0.60        29    0.168    174K    16K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     39.7      1.55              0.15        29    0.054       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3600.0 total, 600.0 interval#012Flush(GB): cumulative 0.060, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.33 GB write, 0.09 MB/s write, 0.32 GB read, 0.09 MB/s read, 6.4 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.14 MB/s read, 1.2 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 36.31 MB table_size: 0 occupancy: 18446744073709551615 collections: 7 last_copies: 0 last_secs: 0.000248 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2080,34.99 MB,11.5083%) FilterBlock(59,493.67 KB,0.158586%) IndexBlock(59,859.45 KB,0.276089%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 03:22:25 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:25Z|00426|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 31 03:22:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:22:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:26.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:22:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:26.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:27 np0005603623 nova_compute[226235]: 2026-01-31 08:22:27.609 226239 DEBUG nova.network.neutron [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Successfully updated port: bb98b496-e57f-4e5a-bea6-fc68d9690077 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:22:27 np0005603623 nova_compute[226235]: 2026-01-31 08:22:27.850 226239 DEBUG nova.compute.manager [req-11412ca3-745a-44a0-8069-7f675cbbf42c req-d6972c79-ba9f-460b-9d6a-304d00761a77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received event network-changed-bb98b496-e57f-4e5a-bea6-fc68d9690077 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:22:27 np0005603623 nova_compute[226235]: 2026-01-31 08:22:27.850 226239 DEBUG nova.compute.manager [req-11412ca3-745a-44a0-8069-7f675cbbf42c req-d6972c79-ba9f-460b-9d6a-304d00761a77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Refreshing instance network info cache due to event network-changed-bb98b496-e57f-4e5a-bea6-fc68d9690077. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:22:27 np0005603623 nova_compute[226235]: 2026-01-31 08:22:27.850 226239 DEBUG oslo_concurrency.lockutils [req-11412ca3-745a-44a0-8069-7f675cbbf42c req-d6972c79-ba9f-460b-9d6a-304d00761a77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-7b830774-2315-410b-a3ed-585a1d0b6ee2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:22:27 np0005603623 nova_compute[226235]: 2026-01-31 08:22:27.850 226239 DEBUG oslo_concurrency.lockutils [req-11412ca3-745a-44a0-8069-7f675cbbf42c req-d6972c79-ba9f-460b-9d6a-304d00761a77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-7b830774-2315-410b-a3ed-585a1d0b6ee2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:22:27 np0005603623 nova_compute[226235]: 2026-01-31 08:22:27.851 226239 DEBUG nova.network.neutron [req-11412ca3-745a-44a0-8069-7f675cbbf42c req-d6972c79-ba9f-460b-9d6a-304d00761a77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Refreshing network info cache for port bb98b496-e57f-4e5a-bea6-fc68d9690077 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:22:27 np0005603623 nova_compute[226235]: 2026-01-31 08:22:27.998 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-7b830774-2315-410b-a3ed-585a1d0b6ee2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:22:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:28.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:28 np0005603623 nova_compute[226235]: 2026-01-31 08:22:28.332 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:22:28 np0005603623 nova_compute[226235]: 2026-01-31 08:22:28.467 226239 DEBUG nova.network.neutron [req-11412ca3-745a-44a0-8069-7f675cbbf42c req-d6972c79-ba9f-460b-9d6a-304d00761a77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:22:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:28.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:29 np0005603623 nova_compute[226235]: 2026-01-31 08:22:29.206 226239 DEBUG nova.network.neutron [req-11412ca3-745a-44a0-8069-7f675cbbf42c req-d6972c79-ba9f-460b-9d6a-304d00761a77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:22:29 np0005603623 nova_compute[226235]: 2026-01-31 08:22:29.234 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:22:29 np0005603623 nova_compute[226235]: 2026-01-31 08:22:29.350 226239 DEBUG oslo_concurrency.lockutils [req-11412ca3-745a-44a0-8069-7f675cbbf42c req-d6972c79-ba9f-460b-9d6a-304d00761a77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-7b830774-2315-410b-a3ed-585a1d0b6ee2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:22:29 np0005603623 nova_compute[226235]: 2026-01-31 08:22:29.351 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-7b830774-2315-410b-a3ed-585a1d0b6ee2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:22:29 np0005603623 nova_compute[226235]: 2026-01-31 08:22:29.351 226239 DEBUG nova.network.neutron [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:22:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:22:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:22:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:30 np0005603623 nova_compute[226235]: 2026-01-31 08:22:30.028 226239 DEBUG nova.network.neutron [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:22:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:30.115 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:22:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:30.115 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:22:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:30.115 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:22:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:30.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:30.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.495 226239 DEBUG nova.network.neutron [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Updating instance_info_cache with network_info: [{"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.854 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-7b830774-2315-410b-a3ed-585a1d0b6ee2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.854 226239 DEBUG nova.compute.manager [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Instance network_info: |[{"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.856 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Start _get_guest_xml network_info=[{"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.861 226239 WARNING nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.867 226239 DEBUG nova.virt.libvirt.host [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.867 226239 DEBUG nova.virt.libvirt.host [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.874 226239 DEBUG nova.virt.libvirt.host [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.875 226239 DEBUG nova.virt.libvirt.host [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.876 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.876 226239 DEBUG nova.virt.hardware [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.877 226239 DEBUG nova.virt.hardware [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.877 226239 DEBUG nova.virt.hardware [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.877 226239 DEBUG nova.virt.hardware [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.877 226239 DEBUG nova.virt.hardware [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.878 226239 DEBUG nova.virt.hardware [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.878 226239 DEBUG nova.virt.hardware [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.878 226239 DEBUG nova.virt.hardware [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.879 226239 DEBUG nova.virt.hardware [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.879 226239 DEBUG nova.virt.hardware [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.879 226239 DEBUG nova.virt.hardware [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 03:22:31 np0005603623 nova_compute[226235]: 2026-01-31 08:22:31.882 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:22:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:32.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:22:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3599445619' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.366 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.388 226239 DEBUG nova.storage.rbd_utils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.391 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:32.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:22:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/696872624' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.811 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.813 226239 DEBUG nova.virt.libvirt.vif [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:22:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1897188479',display_name='tempest-tempest.common.compute-instance-1897188479',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1897188479',id=112,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-yq6e639z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerAct
ionsTestJSON-1873947453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:22:19Z,user_data=None,user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=7b830774-2315-410b-a3ed-585a1d0b6ee2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.813 226239 DEBUG nova.network.os_vif_util [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.814 226239 DEBUG nova.network.os_vif_util [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.815 226239 DEBUG nova.objects.instance [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid 7b830774-2315-410b-a3ed-585a1d0b6ee2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.978 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  <uuid>7b830774-2315-410b-a3ed-585a1d0b6ee2</uuid>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  <name>instance-00000070</name>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <nova:name>tempest-tempest.common.compute-instance-1897188479</nova:name>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:22:31</nova:creationTime>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <nova:user uuid="1d03198d8ab846bda092e089b2d5a6c7">tempest-ServerActionsTestJSON-1873947453-project-member</nova:user>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <nova:project uuid="5b87da3b3f42494f96baeeeaf60b54df">tempest-ServerActionsTestJSON-1873947453</nova:project>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <nova:port uuid="bb98b496-e57f-4e5a-bea6-fc68d9690077">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <entry name="serial">7b830774-2315-410b-a3ed-585a1d0b6ee2</entry>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <entry name="uuid">7b830774-2315-410b-a3ed-585a1d0b6ee2</entry>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/7b830774-2315-410b-a3ed-585a1d0b6ee2_disk">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/7b830774-2315-410b-a3ed-585a1d0b6ee2_disk.config">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:fd:11:00"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <target dev="tapbb98b496-e5"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/console.log" append="off"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:22:32 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:22:32 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:22:32 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:22:32 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.980 226239 DEBUG nova.compute.manager [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Preparing to wait for external event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.980 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.980 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.980 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.981 226239 DEBUG nova.virt.libvirt.vif [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:22:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1897188479',display_name='tempest-tempest.common.compute-instance-1897188479',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1897188479',id=112,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-yq6e639z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest
-ServerActionsTestJSON-1873947453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:22:19Z,user_data=None,user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=7b830774-2315-410b-a3ed-585a1d0b6ee2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.981 226239 DEBUG nova.network.os_vif_util [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.982 226239 DEBUG nova.network.os_vif_util [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.982 226239 DEBUG os_vif [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.983 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.983 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.984 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.986 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.986 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbb98b496-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.986 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbb98b496-e5, col_values=(('external_ids', {'iface-id': 'bb98b496-e57f-4e5a-bea6-fc68d9690077', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:11:00', 'vm-uuid': '7b830774-2315-410b-a3ed-585a1d0b6ee2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:32 np0005603623 NetworkManager[48970]: <info>  [1769847752.9894] manager: (tapbb98b496-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.988 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.992 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.994 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:32 np0005603623 nova_compute[226235]: 2026-01-31 08:22:32.994 226239 INFO os_vif [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5')#033[00m
Jan 31 03:22:33 np0005603623 nova_compute[226235]: 2026-01-31 08:22:33.066 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "e83e8f02-79e9-4945-b30e-65624fc06c37" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:33 np0005603623 nova_compute[226235]: 2026-01-31 08:22:33.066 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:33 np0005603623 nova_compute[226235]: 2026-01-31 08:22:33.449 226239 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:22:33 np0005603623 nova_compute[226235]: 2026-01-31 08:22:33.632 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:22:33 np0005603623 nova_compute[226235]: 2026-01-31 08:22:33.632 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:22:33 np0005603623 nova_compute[226235]: 2026-01-31 08:22:33.633 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No VIF found with MAC fa:16:3e:fd:11:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:22:33 np0005603623 nova_compute[226235]: 2026-01-31 08:22:33.633 226239 INFO nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Using config drive#033[00m
Jan 31 03:22:33 np0005603623 nova_compute[226235]: 2026-01-31 08:22:33.656 226239 DEBUG nova.storage.rbd_utils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:33 np0005603623 nova_compute[226235]: 2026-01-31 08:22:33.713 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:33 np0005603623 nova_compute[226235]: 2026-01-31 08:22:33.714 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:33 np0005603623 nova_compute[226235]: 2026-01-31 08:22:33.720 226239 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:22:33 np0005603623 nova_compute[226235]: 2026-01-31 08:22:33.720 226239 INFO nova.compute.claims [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.006 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.236 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:34.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:22:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1672194401' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.425 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.429 226239 DEBUG nova.compute.provider_tree [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.482 226239 INFO nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Creating config drive at /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/disk.config#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.486 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpw599jrj6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.537 226239 DEBUG nova.scheduler.client.report [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:22:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:34.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.607 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpw599jrj6" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.631 226239 DEBUG nova.storage.rbd_utils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.634 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/disk.config 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.741 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.742 226239 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.846 226239 DEBUG oslo_concurrency.processutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/disk.config 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.211s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.847 226239 INFO nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Deleting local config drive /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/disk.config because it was imported into RBD.#033[00m
Jan 31 03:22:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:34 np0005603623 kernel: tapbb98b496-e5: entered promiscuous mode
Jan 31 03:22:34 np0005603623 NetworkManager[48970]: <info>  [1769847754.8843] manager: (tapbb98b496-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/201)
Jan 31 03:22:34 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:34Z|00427|binding|INFO|Claiming lport bb98b496-e57f-4e5a-bea6-fc68d9690077 for this chassis.
Jan 31 03:22:34 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:34Z|00428|binding|INFO|bb98b496-e57f-4e5a-bea6-fc68d9690077: Claiming fa:16:3e:fd:11:00 10.100.0.6
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.885 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.888 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.893 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.897 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.907 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:34 np0005603623 NetworkManager[48970]: <info>  [1769847754.9083] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Jan 31 03:22:34 np0005603623 systemd-udevd[273670]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:22:34 np0005603623 NetworkManager[48970]: <info>  [1769847754.9090] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.919 226239 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.919 226239 DEBUG nova.network.neutron [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:22:34 np0005603623 NetworkManager[48970]: <info>  [1769847754.9214] device (tapbb98b496-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:22:34 np0005603623 NetworkManager[48970]: <info>  [1769847754.9220] device (tapbb98b496-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:22:34 np0005603623 systemd-machined[194379]: New machine qemu-48-instance-00000070.
Jan 31 03:22:34 np0005603623 systemd[1]: Started Virtual Machine qemu-48-instance-00000070.
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.957 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:34 np0005603623 nova_compute[226235]: 2026-01-31 08:22:34.967 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.068 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:11:00 10.100.0.6'], port_security=['fa:16:3e:fd:11:00 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7b830774-2315-410b-a3ed-585a1d0b6ee2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be6219b2-98f8-4804-bad5-369b6bf26a95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=bb98b496-e57f-4e5a-bea6-fc68d9690077) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.070 143258 INFO neutron.agent.ovn.metadata.agent [-] Port bb98b496-e57f-4e5a-bea6-fc68d9690077 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 bound to our chassis#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.071 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7#033[00m
Jan 31 03:22:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:35Z|00429|binding|INFO|Setting lport bb98b496-e57f-4e5a-bea6-fc68d9690077 ovn-installed in OVS
Jan 31 03:22:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:35Z|00430|binding|INFO|Setting lport bb98b496-e57f-4e5a-bea6-fc68d9690077 up in Southbound
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.079 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.082 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca43ea1-4429-4737-8524-8b744b4d3aac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.082 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.082 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.085 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.085 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[55e43b2f-7989-434f-8f24-1ebdc2c401f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.086 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d543333d-9a68-42e5-9776-168c781949b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.095 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[086682eb-ae12-4913-8763-8d7ae09a9ce5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.106 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[07b4bf8a-9699-4cc7-bd91-552d47e66f3d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.125 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f22c73-b3a5-4ffd-a196-da46c2e7f999]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 systemd-udevd[273675]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:22:35 np0005603623 NetworkManager[48970]: <info>  [1769847755.1320] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/204)
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.131 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ecd98e2d-6488-4970-a3fb-5d3ee91314d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.153 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[55309575-a0a7-490b-80c3-3c02a05164bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.156 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[10a5966b-fa46-4601-ae36-269ce29c6008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 NetworkManager[48970]: <info>  [1769847755.1712] device (tap1186b71b-00): carrier: link connected
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.175 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[99d17b24-cdba-404c-8286-817a095b360a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.189 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[30f9ea86-0629-4d78-89ca-f982d34495f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689560, 'reachable_time': 18438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273720, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.203 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ce7d0ce0-ca2a-4ec2-937d-ebab18fa9188]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 689560, 'tstamp': 689560}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273725, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.217 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[359ed59a-b525-4e96-8ae9-3470f5362401]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689560, 'reachable_time': 18438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 273726, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.236 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[21610e4b-1e3b-43e1-baa0-ac2300825ae6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.284 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[362b1ce4-4497-4b45-8e8e-92ddedac110a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.285 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.286 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.286 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.288 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:35 np0005603623 NetworkManager[48970]: <info>  [1769847755.2886] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Jan 31 03:22:35 np0005603623 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.290 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.293 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:35Z|00431|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=1)
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.295 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.296 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.297 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[214bc875-427d-4bff-84f3-9c1089a5563f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.298 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.299 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:35.300 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.356 226239 INFO nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.388 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847755.3884308, 7b830774-2315-410b-a3ed-585a1d0b6ee2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.389 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] VM Started (Lifecycle Event)#033[00m
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.451 226239 DEBUG nova.policy [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c086a82bd0384612a78981006889df41', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '96de645f38844180b404d1a7cf7dd460', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.552 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.559 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847755.3906033, 7b830774-2315-410b-a3ed-585a1d0b6ee2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.559 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.612 226239 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:22:35 np0005603623 podman[273780]: 2026-01-31 08:22:35.600784835 +0000 UTC m=+0.020679109 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.762 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:22:35 np0005603623 podman[273780]: 2026-01-31 08:22:35.763265978 +0000 UTC m=+0.183160232 container create 227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.767 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:22:35 np0005603623 systemd[1]: Started libpod-conmon-227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a.scope.
Jan 31 03:22:35 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:22:35 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/de6fb7c01ac9dd8dfaf6ecda9428be5cf8e8ff5a30d0510b4487f8bde676494e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:22:35 np0005603623 podman[273780]: 2026-01-31 08:22:35.851833425 +0000 UTC m=+0.271727699 container init 227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 03:22:35 np0005603623 podman[273780]: 2026-01-31 08:22:35.855889502 +0000 UTC m=+0.275783746 container start 227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:22:35 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[273795]: [NOTICE]   (273799) : New worker (273801) forked
Jan 31 03:22:35 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[273795]: [NOTICE]   (273799) : Loading success.
Jan 31 03:22:35 np0005603623 nova_compute[226235]: 2026-01-31 08:22:35.898 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.043 226239 DEBUG nova.compute.manager [req-42556a33-3b0c-4651-b742-c26e2ecf0adf req-d5c3f88d-699e-4b4b-830c-9fa11a10b4b7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.044 226239 DEBUG oslo_concurrency.lockutils [req-42556a33-3b0c-4651-b742-c26e2ecf0adf req-d5c3f88d-699e-4b4b-830c-9fa11a10b4b7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.044 226239 DEBUG oslo_concurrency.lockutils [req-42556a33-3b0c-4651-b742-c26e2ecf0adf req-d5c3f88d-699e-4b4b-830c-9fa11a10b4b7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.044 226239 DEBUG oslo_concurrency.lockutils [req-42556a33-3b0c-4651-b742-c26e2ecf0adf req-d5c3f88d-699e-4b4b-830c-9fa11a10b4b7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.045 226239 DEBUG nova.compute.manager [req-42556a33-3b0c-4651-b742-c26e2ecf0adf req-d5c3f88d-699e-4b4b-830c-9fa11a10b4b7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Processing event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.045 226239 DEBUG nova.compute.manager [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.049 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847756.0488837, 7b830774-2315-410b-a3ed-585a1d0b6ee2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.049 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.051 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.054 226239 INFO nova.virt.libvirt.driver [-] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Instance spawned successfully.#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.055 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:22:36 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:36Z|00432|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.146 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:36.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.523 226239 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.525 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.525 226239 INFO nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Creating image(s)#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.546 226239 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image e83e8f02-79e9-4945-b30e-65624fc06c37_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.572 226239 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image e83e8f02-79e9-4945-b30e-65624fc06c37_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:36.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.593 226239 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image e83e8f02-79e9-4945-b30e-65624fc06c37_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.596 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.617 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.623 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.624 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.624 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.625 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.625 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.626 226239 DEBUG nova.virt.libvirt.driver [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.631 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.649 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.650 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.650 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.651 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.676 226239 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image e83e8f02-79e9-4945-b30e-65624fc06c37_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.680 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e83e8f02-79e9-4945-b30e-65624fc06c37_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.699 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.756 226239 INFO nova.compute.manager [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Took 16.88 seconds to spawn the instance on the hypervisor.
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.757 226239 DEBUG nova.compute.manager [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.917 226239 INFO nova.compute.manager [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Took 19.00 seconds to build instance.
Jan 31 03:22:36 np0005603623 nova_compute[226235]: 2026-01-31 08:22:36.993 226239 DEBUG oslo_concurrency.lockutils [None req-73eb804f-d79a-4893-a93a-6ee46ed6f76c 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.467s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:22:37 np0005603623 nova_compute[226235]: 2026-01-31 08:22:37.248 226239 DEBUG nova.network.neutron [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Successfully created port: 7ca23fcb-d932-4244-a5dd-f02ef1fcd06c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:22:37 np0005603623 nova_compute[226235]: 2026-01-31 08:22:37.989 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.000 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e83e8f02-79e9-4945-b30e-65624fc06c37_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.320s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.060 226239 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] resizing rbd image e83e8f02-79e9-4945-b30e-65624fc06c37_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 03:22:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:38.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.277 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.470 226239 DEBUG nova.compute.manager [req-d9a1d8f3-1049-4347-bdcf-eacfdcc3e77a req-aa2cfad7-617c-4d75-a652-60af92c53be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.471 226239 DEBUG oslo_concurrency.lockutils [req-d9a1d8f3-1049-4347-bdcf-eacfdcc3e77a req-aa2cfad7-617c-4d75-a652-60af92c53be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.471 226239 DEBUG oslo_concurrency.lockutils [req-d9a1d8f3-1049-4347-bdcf-eacfdcc3e77a req-aa2cfad7-617c-4d75-a652-60af92c53be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.471 226239 DEBUG oslo_concurrency.lockutils [req-d9a1d8f3-1049-4347-bdcf-eacfdcc3e77a req-aa2cfad7-617c-4d75-a652-60af92c53be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.472 226239 DEBUG nova.compute.manager [req-d9a1d8f3-1049-4347-bdcf-eacfdcc3e77a req-aa2cfad7-617c-4d75-a652-60af92c53be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] No waiting events found dispatching network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.472 226239 WARNING nova.compute.manager [req-d9a1d8f3-1049-4347-bdcf-eacfdcc3e77a req-aa2cfad7-617c-4d75-a652-60af92c53be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received unexpected event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 for instance with vm_state active and task_state None.
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.476 226239 DEBUG nova.objects.instance [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lazy-loading 'migration_context' on Instance uuid e83e8f02-79e9-4945-b30e-65624fc06c37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:22:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:38.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.620 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-7b830774-2315-410b-a3ed-585a1d0b6ee2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.620 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-7b830774-2315-410b-a3ed-585a1d0b6ee2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.620 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 31 03:22:38 np0005603623 nova_compute[226235]: 2026-01-31 08:22:38.620 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7b830774-2315-410b-a3ed-585a1d0b6ee2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:22:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:39 np0005603623 nova_compute[226235]: 2026-01-31 08:22:39.082 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:22:39 np0005603623 nova_compute[226235]: 2026-01-31 08:22:39.083 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Ensure instance console log exists: /var/lib/nova/instances/e83e8f02-79e9-4945-b30e-65624fc06c37/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:22:39 np0005603623 nova_compute[226235]: 2026-01-31 08:22:39.083 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:22:39 np0005603623 nova_compute[226235]: 2026-01-31 08:22:39.083 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:22:39 np0005603623 nova_compute[226235]: 2026-01-31 08:22:39.084 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:22:39 np0005603623 nova_compute[226235]: 2026-01-31 08:22:39.238 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:22:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 03:22:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:40.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:40 np0005603623 nova_compute[226235]: 2026-01-31 08:22:40.426 226239 DEBUG nova.network.neutron [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Successfully updated port: 7ca23fcb-d932-4244-a5dd-f02ef1fcd06c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:22:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:40.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:40 np0005603623 nova_compute[226235]: 2026-01-31 08:22:40.801 226239 DEBUG nova.compute.manager [req-4522bb91-bd98-4ec2-b426-64035dd63d80 req-b61feff3-d165-4a2e-a6be-85142a0c26c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Received event network-changed-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:22:40 np0005603623 nova_compute[226235]: 2026-01-31 08:22:40.802 226239 DEBUG nova.compute.manager [req-4522bb91-bd98-4ec2-b426-64035dd63d80 req-b61feff3-d165-4a2e-a6be-85142a0c26c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Refreshing instance network info cache due to event network-changed-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:22:40 np0005603623 nova_compute[226235]: 2026-01-31 08:22:40.802 226239 DEBUG oslo_concurrency.lockutils [req-4522bb91-bd98-4ec2-b426-64035dd63d80 req-b61feff3-d165-4a2e-a6be-85142a0c26c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e83e8f02-79e9-4945-b30e-65624fc06c37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:22:40 np0005603623 nova_compute[226235]: 2026-01-31 08:22:40.802 226239 DEBUG oslo_concurrency.lockutils [req-4522bb91-bd98-4ec2-b426-64035dd63d80 req-b61feff3-d165-4a2e-a6be-85142a0c26c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e83e8f02-79e9-4945-b30e-65624fc06c37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:22:40 np0005603623 nova_compute[226235]: 2026-01-31 08:22:40.802 226239 DEBUG nova.network.neutron [req-4522bb91-bd98-4ec2-b426-64035dd63d80 req-b61feff3-d165-4a2e-a6be-85142a0c26c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Refreshing network info cache for port 7ca23fcb-d932-4244-a5dd-f02ef1fcd06c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:22:40 np0005603623 nova_compute[226235]: 2026-01-31 08:22:40.941 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "refresh_cache-e83e8f02-79e9-4945-b30e-65624fc06c37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:22:41 np0005603623 nova_compute[226235]: 2026-01-31 08:22:41.548 226239 DEBUG nova.network.neutron [req-4522bb91-bd98-4ec2-b426-64035dd63d80 req-b61feff3-d165-4a2e-a6be-85142a0c26c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:22:42 np0005603623 nova_compute[226235]: 2026-01-31 08:22:42.108 226239 DEBUG nova.network.neutron [req-4522bb91-bd98-4ec2-b426-64035dd63d80 req-b61feff3-d165-4a2e-a6be-85142a0c26c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:22:42 np0005603623 nova_compute[226235]: 2026-01-31 08:22:42.113 226239 INFO nova.compute.manager [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Rebuilding instance
Jan 31 03:22:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:42.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:42 np0005603623 nova_compute[226235]: 2026-01-31 08:22:42.472 226239 DEBUG oslo_concurrency.lockutils [req-4522bb91-bd98-4ec2-b426-64035dd63d80 req-b61feff3-d165-4a2e-a6be-85142a0c26c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e83e8f02-79e9-4945-b30e-65624fc06c37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:22:42 np0005603623 nova_compute[226235]: 2026-01-31 08:22:42.472 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquired lock "refresh_cache-e83e8f02-79e9-4945-b30e-65624fc06c37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:22:42 np0005603623 nova_compute[226235]: 2026-01-31 08:22:42.473 226239 DEBUG nova.network.neutron [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:22:42 np0005603623 nova_compute[226235]: 2026-01-31 08:22:42.487 226239 DEBUG nova.objects.instance [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7b830774-2315-410b-a3ed-585a1d0b6ee2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:22:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:42.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:42 np0005603623 nova_compute[226235]: 2026-01-31 08:22:42.993 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.122 226239 DEBUG nova.compute.manager [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.424 226239 DEBUG nova.objects.instance [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_requests' on Instance uuid 7b830774-2315-410b-a3ed-585a1d0b6ee2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.481 226239 DEBUG nova.objects.instance [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid 7b830774-2315-410b-a3ed-585a1d0b6ee2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.559 226239 DEBUG nova.network.neutron [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.690 226239 DEBUG nova.objects.instance [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'resources' on Instance uuid 7b830774-2315-410b-a3ed-585a1d0b6ee2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.743 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Updating instance_info_cache with network_info: [{"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.932 226239 DEBUG nova.objects.instance [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'migration_context' on Instance uuid 7b830774-2315-410b-a3ed-585a1d0b6ee2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.978 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-7b830774-2315-410b-a3ed-585a1d0b6ee2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.978 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.978 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.979 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.979 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.979 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.979 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.986 226239 DEBUG nova.objects.instance [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 31 03:22:43 np0005603623 nova_compute[226235]: 2026-01-31 08:22:43.989 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.094 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.094 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.094 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.095 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.095 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.240 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:22:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:44.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:22:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1736875600' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.558 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:44.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:44 np0005603623 podman[274054]: 2026-01-31 08:22:44.638529725 +0000 UTC m=+0.045891599 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 31 03:22:44 np0005603623 podman[274055]: 2026-01-31 08:22:44.684116794 +0000 UTC m=+0.091334134 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.721 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000070 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.722 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000070 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.843 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.845 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4351MB free_disk=20.834407806396484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.845 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:44 np0005603623 nova_compute[226235]: 2026-01-31 08:22:44.845 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:45 np0005603623 nova_compute[226235]: 2026-01-31 08:22:45.038 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 7b830774-2315-410b-a3ed-585a1d0b6ee2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:22:45 np0005603623 nova_compute[226235]: 2026-01-31 08:22:45.039 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance e83e8f02-79e9-4945-b30e-65624fc06c37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:22:45 np0005603623 nova_compute[226235]: 2026-01-31 08:22:45.039 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:22:45 np0005603623 nova_compute[226235]: 2026-01-31 08:22:45.039 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:22:45 np0005603623 nova_compute[226235]: 2026-01-31 08:22:45.116 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:22:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1501952894' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:22:45 np0005603623 nova_compute[226235]: 2026-01-31 08:22:45.521 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:45 np0005603623 nova_compute[226235]: 2026-01-31 08:22:45.526 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:22:45 np0005603623 nova_compute[226235]: 2026-01-31 08:22:45.782 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:22:46 np0005603623 nova_compute[226235]: 2026-01-31 08:22:46.110 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:22:46 np0005603623 nova_compute[226235]: 2026-01-31 08:22:46.110 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:22:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:46.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:22:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:46.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.285 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.286 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.286 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.876 226239 DEBUG nova.network.neutron [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Updating instance_info_cache with network_info: [{"id": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "address": "fa:16:3e:dd:cd:7e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca23fcb-d9", "ovs_interfaceid": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.940 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Releasing lock "refresh_cache-e83e8f02-79e9-4945-b30e-65624fc06c37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.941 226239 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Instance network_info: |[{"id": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "address": "fa:16:3e:dd:cd:7e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca23fcb-d9", "ovs_interfaceid": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.944 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Start _get_guest_xml network_info=[{"id": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "address": "fa:16:3e:dd:cd:7e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca23fcb-d9", "ovs_interfaceid": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.948 226239 WARNING nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.951 226239 DEBUG nova.virt.libvirt.host [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.952 226239 DEBUG nova.virt.libvirt.host [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.958 226239 DEBUG nova.virt.libvirt.host [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.958 226239 DEBUG nova.virt.libvirt.host [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.960 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.960 226239 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.960 226239 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.961 226239 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.961 226239 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.961 226239 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.961 226239 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.962 226239 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.962 226239 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.962 226239 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.963 226239 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.963 226239 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.966 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:47 np0005603623 nova_compute[226235]: 2026-01-31 08:22:47.996 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:48.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:22:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4061354220' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.388 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.416 226239 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image e83e8f02-79e9-4945-b30e-65624fc06c37_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.421 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:48Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fd:11:00 10.100.0.6
Jan 31 03:22:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:48Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fd:11:00 10.100.0.6
Jan 31 03:22:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:48.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:22:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3596543568' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.862 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.865 226239 DEBUG nova.virt.libvirt.vif [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:22:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-147995892',display_name='tempest-tempest.common.compute-instance-147995892-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-147995892-2',id=114,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96de645f38844180b404d1a7cf7dd460',ramdisk_id='',reservation_id='r-rktr52i6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-174245429',owner_user_name='tempest-Multiple
CreateTestJSON-174245429-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:22:35Z,user_data=None,user_id='c086a82bd0384612a78981006889df41',uuid=e83e8f02-79e9-4945-b30e-65624fc06c37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "address": "fa:16:3e:dd:cd:7e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca23fcb-d9", "ovs_interfaceid": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.865 226239 DEBUG nova.network.os_vif_util [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converting VIF {"id": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "address": "fa:16:3e:dd:cd:7e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca23fcb-d9", "ovs_interfaceid": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.867 226239 DEBUG nova.network.os_vif_util [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:cd:7e,bridge_name='br-int',has_traffic_filtering=True,id=7ca23fcb-d932-4244-a5dd-f02ef1fcd06c,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca23fcb-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.869 226239 DEBUG nova.objects.instance [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lazy-loading 'pci_devices' on Instance uuid e83e8f02-79e9-4945-b30e-65624fc06c37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.912 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  <uuid>e83e8f02-79e9-4945-b30e-65624fc06c37</uuid>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  <name>instance-00000072</name>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <nova:name>tempest-tempest.common.compute-instance-147995892-2</nova:name>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:22:47</nova:creationTime>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <nova:user uuid="c086a82bd0384612a78981006889df41">tempest-MultipleCreateTestJSON-174245429-project-member</nova:user>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <nova:project uuid="96de645f38844180b404d1a7cf7dd460">tempest-MultipleCreateTestJSON-174245429</nova:project>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <nova:port uuid="7ca23fcb-d932-4244-a5dd-f02ef1fcd06c">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <entry name="serial">e83e8f02-79e9-4945-b30e-65624fc06c37</entry>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <entry name="uuid">e83e8f02-79e9-4945-b30e-65624fc06c37</entry>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/e83e8f02-79e9-4945-b30e-65624fc06c37_disk">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/e83e8f02-79e9-4945-b30e-65624fc06c37_disk.config">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:dd:cd:7e"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <target dev="tap7ca23fcb-d9"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/e83e8f02-79e9-4945-b30e-65624fc06c37/console.log" append="off"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:22:48 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:22:48 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:22:48 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:22:48 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.913 226239 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Preparing to wait for external event network-vif-plugged-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.913 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.914 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.914 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.915 226239 DEBUG nova.virt.libvirt.vif [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:22:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-147995892',display_name='tempest-tempest.common.compute-instance-147995892-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-147995892-2',id=114,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96de645f38844180b404d1a7cf7dd460',ramdisk_id='',reservation_id='r-rktr52i6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-174245429',owner_user_name='tempes
t-MultipleCreateTestJSON-174245429-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:22:35Z,user_data=None,user_id='c086a82bd0384612a78981006889df41',uuid=e83e8f02-79e9-4945-b30e-65624fc06c37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "address": "fa:16:3e:dd:cd:7e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca23fcb-d9", "ovs_interfaceid": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.915 226239 DEBUG nova.network.os_vif_util [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converting VIF {"id": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "address": "fa:16:3e:dd:cd:7e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca23fcb-d9", "ovs_interfaceid": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.916 226239 DEBUG nova.network.os_vif_util [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:cd:7e,bridge_name='br-int',has_traffic_filtering=True,id=7ca23fcb-d932-4244-a5dd-f02ef1fcd06c,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca23fcb-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.916 226239 DEBUG os_vif [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:cd:7e,bridge_name='br-int',has_traffic_filtering=True,id=7ca23fcb-d932-4244-a5dd-f02ef1fcd06c,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca23fcb-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.917 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.917 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.917 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.920 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.920 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ca23fcb-d9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.920 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7ca23fcb-d9, col_values=(('external_ids', {'iface-id': '7ca23fcb-d932-4244-a5dd-f02ef1fcd06c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:cd:7e', 'vm-uuid': 'e83e8f02-79e9-4945-b30e-65624fc06c37'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:48 np0005603623 NetworkManager[48970]: <info>  [1769847768.9712] manager: (tap7ca23fcb-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.970 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.974 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.976 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:48 np0005603623 nova_compute[226235]: 2026-01-31 08:22:48.977 226239 INFO os_vif [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:cd:7e,bridge_name='br-int',has_traffic_filtering=True,id=7ca23fcb-d932-4244-a5dd-f02ef1fcd06c,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca23fcb-d9')#033[00m
Jan 31 03:22:49 np0005603623 nova_compute[226235]: 2026-01-31 08:22:49.077 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:22:49 np0005603623 nova_compute[226235]: 2026-01-31 08:22:49.077 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:22:49 np0005603623 nova_compute[226235]: 2026-01-31 08:22:49.077 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] No VIF found with MAC fa:16:3e:dd:cd:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:22:49 np0005603623 nova_compute[226235]: 2026-01-31 08:22:49.078 226239 INFO nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Using config drive#033[00m
Jan 31 03:22:49 np0005603623 nova_compute[226235]: 2026-01-31 08:22:49.098 226239 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image e83e8f02-79e9-4945-b30e-65624fc06c37_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:49 np0005603623 nova_compute[226235]: 2026-01-31 08:22:49.243 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:50.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:50.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:50 np0005603623 nova_compute[226235]: 2026-01-31 08:22:50.631 226239 INFO nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Creating config drive at /var/lib/nova/instances/e83e8f02-79e9-4945-b30e-65624fc06c37/disk.config#033[00m
Jan 31 03:22:50 np0005603623 nova_compute[226235]: 2026-01-31 08:22:50.634 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e83e8f02-79e9-4945-b30e-65624fc06c37/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp4kk0t229 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:50 np0005603623 nova_compute[226235]: 2026-01-31 08:22:50.754 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e83e8f02-79e9-4945-b30e-65624fc06c37/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp4kk0t229" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:50 np0005603623 nova_compute[226235]: 2026-01-31 08:22:50.779 226239 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image e83e8f02-79e9-4945-b30e-65624fc06c37_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:50 np0005603623 nova_compute[226235]: 2026-01-31 08:22:50.782 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e83e8f02-79e9-4945-b30e-65624fc06c37/disk.config e83e8f02-79e9-4945-b30e-65624fc06c37_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:50 np0005603623 nova_compute[226235]: 2026-01-31 08:22:50.935 226239 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e83e8f02-79e9-4945-b30e-65624fc06c37/disk.config e83e8f02-79e9-4945-b30e-65624fc06c37_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:50 np0005603623 nova_compute[226235]: 2026-01-31 08:22:50.936 226239 INFO nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Deleting local config drive /var/lib/nova/instances/e83e8f02-79e9-4945-b30e-65624fc06c37/disk.config because it was imported into RBD.#033[00m
Jan 31 03:22:50 np0005603623 NetworkManager[48970]: <info>  [1769847770.9637] manager: (tap7ca23fcb-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/207)
Jan 31 03:22:50 np0005603623 kernel: tap7ca23fcb-d9: entered promiscuous mode
Jan 31 03:22:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:50Z|00433|binding|INFO|Claiming lport 7ca23fcb-d932-4244-a5dd-f02ef1fcd06c for this chassis.
Jan 31 03:22:50 np0005603623 nova_compute[226235]: 2026-01-31 08:22:50.966 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:50Z|00434|binding|INFO|7ca23fcb-d932-4244-a5dd-f02ef1fcd06c: Claiming fa:16:3e:dd:cd:7e 10.100.0.7
Jan 31 03:22:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:50Z|00435|binding|INFO|Setting lport 7ca23fcb-d932-4244-a5dd-f02ef1fcd06c ovn-installed in OVS
Jan 31 03:22:50 np0005603623 nova_compute[226235]: 2026-01-31 08:22:50.974 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:50 np0005603623 nova_compute[226235]: 2026-01-31 08:22:50.975 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:50 np0005603623 systemd-udevd[274308]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:22:50 np0005603623 systemd-machined[194379]: New machine qemu-49-instance-00000072.
Jan 31 03:22:50 np0005603623 NetworkManager[48970]: <info>  [1769847770.9960] device (tap7ca23fcb-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:22:50 np0005603623 NetworkManager[48970]: <info>  [1769847770.9965] device (tap7ca23fcb-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:22:51 np0005603623 systemd[1]: Started Virtual Machine qemu-49-instance-00000072.
Jan 31 03:22:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:51Z|00436|binding|INFO|Setting lport 7ca23fcb-d932-4244-a5dd-f02ef1fcd06c up in Southbound
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.022 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:cd:7e 10.100.0.7'], port_security=['fa:16:3e:dd:cd:7e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e83e8f02-79e9-4945-b30e-65624fc06c37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd2feb18-e01d-4084-b50c-13511157dde4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96de645f38844180b404d1a7cf7dd460', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6f7abf9c-ddb4-47da-9619-41273b5c7231', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6f1841d-a97e-4124-981b-627c1dc4d00d, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=7ca23fcb-d932-4244-a5dd-f02ef1fcd06c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.023 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 7ca23fcb-d932-4244-a5dd-f02ef1fcd06c in datapath bd2feb18-e01d-4084-b50c-13511157dde4 bound to our chassis#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.025 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bd2feb18-e01d-4084-b50c-13511157dde4#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.034 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4fd1636a-5cd5-49b1-8f01-83cea39d6abc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.035 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbd2feb18-e1 in ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.037 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbd2feb18-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.037 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[db5812cc-f485-4211-892b-7b254d87c490]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.038 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec98e49-5968-40ad-ba77-e89c857d3199]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.046 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[ea04bd06-9481-41a9-90a6-d4241318b59b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.055 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4cba6f46-f72a-47e2-9207-64cbbd3e0cf9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.073 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ed330b74-510e-47c6-8f4b-d1efacb43de6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.077 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[abf5511f-df23-4540-956b-dc499aa431e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 NetworkManager[48970]: <info>  [1769847771.0785] manager: (tapbd2feb18-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/208)
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.103 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe8835a-9f8b-4267-8caa-dad25a033933]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.106 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[7f12cd25-b3b5-49f0-a341-9954add3e03e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 39K writes, 172K keys, 39K commit groups, 1.0 writes per commit group, ingest: 0.18 GB, 0.05 MB/s#012Cumulative WAL: 39K writes, 12K syncs, 3.10 writes per sync, written: 0.18 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5786 writes, 23K keys, 5786 commit groups, 1.0 writes per commit group, ingest: 24.57 MB, 0.04 MB/s#012Interval WAL: 5786 writes, 2192 syncs, 2.64 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:22:51 np0005603623 NetworkManager[48970]: <info>  [1769847771.1244] device (tapbd2feb18-e0): carrier: link connected
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.127 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4a17280f-03e6-4b47-8cfd-2f26201e7437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.142 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e164c0-6ffe-4685-be0c-65a54b7e0709]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd2feb18-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:e8:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 127], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 691155, 'reachable_time': 23424, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274342, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.155 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[db74f31a-dabf-43ee-9d4c-00e9153d718b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:e8ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 691155, 'tstamp': 691155}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274343, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.169 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[987a06f1-0ba9-4c23-be48-800c8a5bc3b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd2feb18-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:e8:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 127], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 691155, 'reachable_time': 23424, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 274344, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.190 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3c48d8-c632-4428-9384-91a28ef521ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.235 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e83364be-fdb5-4dbd-9bff-25a56850a9ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.237 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd2feb18-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.237 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.238 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd2feb18-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:51 np0005603623 nova_compute[226235]: 2026-01-31 08:22:51.287 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:51 np0005603623 kernel: tapbd2feb18-e0: entered promiscuous mode
Jan 31 03:22:51 np0005603623 NetworkManager[48970]: <info>  [1769847771.2961] manager: (tapbd2feb18-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Jan 31 03:22:51 np0005603623 nova_compute[226235]: 2026-01-31 08:22:51.295 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.297 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbd2feb18-e0, col_values=(('external_ids', {'iface-id': '7b95dd4c-16d2-4ff4-9598-b3ff910c3f1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:51 np0005603623 nova_compute[226235]: 2026-01-31 08:22:51.298 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:51 np0005603623 nova_compute[226235]: 2026-01-31 08:22:51.300 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:51Z|00437|binding|INFO|Releasing lport 7b95dd4c-16d2-4ff4-9598-b3ff910c3f1b from this chassis (sb_readonly=0)
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.300 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bd2feb18-e01d-4084-b50c-13511157dde4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bd2feb18-e01d-4084-b50c-13511157dde4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.302 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[893b7432-2588-4387-bcff-ea249be7b524]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.303 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-bd2feb18-e01d-4084-b50c-13511157dde4
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/bd2feb18-e01d-4084-b50c-13511157dde4.pid.haproxy
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID bd2feb18-e01d-4084-b50c-13511157dde4
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:22:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:51.303 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'env', 'PROCESS_TAG=haproxy-bd2feb18-e01d-4084-b50c-13511157dde4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bd2feb18-e01d-4084-b50c-13511157dde4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:22:51 np0005603623 nova_compute[226235]: 2026-01-31 08:22:51.306 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:51 np0005603623 podman[274376]: 2026-01-31 08:22:51.590862221 +0000 UTC m=+0.016911412 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:22:51 np0005603623 podman[274376]: 2026-01-31 08:22:51.788342321 +0000 UTC m=+0.214391482 container create a6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:22:51 np0005603623 nova_compute[226235]: 2026-01-31 08:22:51.810 226239 DEBUG nova.compute.manager [req-75c326a9-dfd5-4742-9449-3c030cdaf075 req-7fb97ff0-a10c-46dc-955a-a9af007b89ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Received event network-vif-plugged-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:22:51 np0005603623 nova_compute[226235]: 2026-01-31 08:22:51.810 226239 DEBUG oslo_concurrency.lockutils [req-75c326a9-dfd5-4742-9449-3c030cdaf075 req-7fb97ff0-a10c-46dc-955a-a9af007b89ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:51 np0005603623 nova_compute[226235]: 2026-01-31 08:22:51.811 226239 DEBUG oslo_concurrency.lockutils [req-75c326a9-dfd5-4742-9449-3c030cdaf075 req-7fb97ff0-a10c-46dc-955a-a9af007b89ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:51 np0005603623 nova_compute[226235]: 2026-01-31 08:22:51.811 226239 DEBUG oslo_concurrency.lockutils [req-75c326a9-dfd5-4742-9449-3c030cdaf075 req-7fb97ff0-a10c-46dc-955a-a9af007b89ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:51 np0005603623 nova_compute[226235]: 2026-01-31 08:22:51.811 226239 DEBUG nova.compute.manager [req-75c326a9-dfd5-4742-9449-3c030cdaf075 req-7fb97ff0-a10c-46dc-955a-a9af007b89ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Processing event network-vif-plugged-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:22:51 np0005603623 systemd[1]: Started libpod-conmon-a6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb.scope.
Jan 31 03:22:51 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:22:51 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efe4867985d0689e6b5eefcdb906fad80127edfd9066dd785d1764c63e7fa18b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:22:51 np0005603623 podman[274376]: 2026-01-31 08:22:51.925689417 +0000 UTC m=+0.351738618 container init a6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:22:51 np0005603623 podman[274376]: 2026-01-31 08:22:51.9299346 +0000 UTC m=+0.355983771 container start a6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Jan 31 03:22:51 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[274392]: [NOTICE]   (274396) : New worker (274398) forked
Jan 31 03:22:51 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[274392]: [NOTICE]   (274396) : Loading success.
Jan 31 03:22:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:52.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.529 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847772.528822, e83e8f02-79e9-4945-b30e-65624fc06c37 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.529 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] VM Started (Lifecycle Event)#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.533 226239 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.536 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.539 226239 INFO nova.virt.libvirt.driver [-] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Instance spawned successfully.#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.539 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.570 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.573 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:22:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:52.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.606 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.606 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.607 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.607 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.608 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.608 226239 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.663 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.664 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847772.5296018, e83e8f02-79e9-4945-b30e-65624fc06c37 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.664 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.731 226239 INFO nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Took 16.21 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.731 226239 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.733 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.739 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847772.5356457, e83e8f02-79e9-4945-b30e-65624fc06c37 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.739 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.834 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:22:52 np0005603623 nova_compute[226235]: 2026-01-31 08:22:52.840 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:22:53 np0005603623 nova_compute[226235]: 2026-01-31 08:22:53.029 226239 INFO nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Took 19.36 seconds to build instance.#033[00m
Jan 31 03:22:53 np0005603623 nova_compute[226235]: 2026-01-31 08:22:53.065 226239 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.999s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:53 np0005603623 nova_compute[226235]: 2026-01-31 08:22:53.972 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:53 np0005603623 nova_compute[226235]: 2026-01-31 08:22:53.997 226239 DEBUG nova.compute.manager [req-b0a87989-8034-4e9b-be8e-b72b9ac6313f req-022c8bfa-aad4-46a7-b445-10c79fd1c59d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Received event network-vif-plugged-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:22:53 np0005603623 nova_compute[226235]: 2026-01-31 08:22:53.997 226239 DEBUG oslo_concurrency.lockutils [req-b0a87989-8034-4e9b-be8e-b72b9ac6313f req-022c8bfa-aad4-46a7-b445-10c79fd1c59d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:53 np0005603623 nova_compute[226235]: 2026-01-31 08:22:53.997 226239 DEBUG oslo_concurrency.lockutils [req-b0a87989-8034-4e9b-be8e-b72b9ac6313f req-022c8bfa-aad4-46a7-b445-10c79fd1c59d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:53 np0005603623 nova_compute[226235]: 2026-01-31 08:22:53.998 226239 DEBUG oslo_concurrency.lockutils [req-b0a87989-8034-4e9b-be8e-b72b9ac6313f req-022c8bfa-aad4-46a7-b445-10c79fd1c59d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:53 np0005603623 nova_compute[226235]: 2026-01-31 08:22:53.998 226239 DEBUG nova.compute.manager [req-b0a87989-8034-4e9b-be8e-b72b9ac6313f req-022c8bfa-aad4-46a7-b445-10c79fd1c59d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] No waiting events found dispatching network-vif-plugged-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:22:53 np0005603623 nova_compute[226235]: 2026-01-31 08:22:53.998 226239 WARNING nova.compute.manager [req-b0a87989-8034-4e9b-be8e-b72b9ac6313f req-022c8bfa-aad4-46a7-b445-10c79fd1c59d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Received unexpected event network-vif-plugged-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c for instance with vm_state active and task_state None.#033[00m
Jan 31 03:22:54 np0005603623 nova_compute[226235]: 2026-01-31 08:22:54.034 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 03:22:54 np0005603623 nova_compute[226235]: 2026-01-31 08:22:54.245 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:54.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:54.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:55 np0005603623 nova_compute[226235]: 2026-01-31 08:22:55.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:55 np0005603623 nova_compute[226235]: 2026-01-31 08:22:55.190 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:56.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:56.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:56 np0005603623 kernel: tapbb98b496-e5 (unregistering): left promiscuous mode
Jan 31 03:22:56 np0005603623 NetworkManager[48970]: <info>  [1769847776.9024] device (tapbb98b496-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:22:56 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:56Z|00438|binding|INFO|Releasing lport bb98b496-e57f-4e5a-bea6-fc68d9690077 from this chassis (sb_readonly=0)
Jan 31 03:22:56 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:56Z|00439|binding|INFO|Setting lport bb98b496-e57f-4e5a-bea6-fc68d9690077 down in Southbound
Jan 31 03:22:56 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:56Z|00440|binding|INFO|Removing iface tapbb98b496-e5 ovn-installed in OVS
Jan 31 03:22:56 np0005603623 nova_compute[226235]: 2026-01-31 08:22:56.913 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:56 np0005603623 nova_compute[226235]: 2026-01-31 08:22:56.917 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:56 np0005603623 nova_compute[226235]: 2026-01-31 08:22:56.921 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:56.926 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:11:00 10.100.0.6'], port_security=['fa:16:3e:fd:11:00 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7b830774-2315-410b-a3ed-585a1d0b6ee2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be6219b2-98f8-4804-bad5-369b6bf26a95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=bb98b496-e57f-4e5a-bea6-fc68d9690077) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:22:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:56.927 143258 INFO neutron.agent.ovn.metadata.agent [-] Port bb98b496-e57f-4e5a-bea6-fc68d9690077 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:22:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:56.929 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:22:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:56.931 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[061556a6-9744-44f1-89be-97195cc9bcb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:56.932 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore#033[00m
Jan 31 03:22:56 np0005603623 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000070.scope: Deactivated successfully.
Jan 31 03:22:56 np0005603623 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000070.scope: Consumed 11.750s CPU time.
Jan 31 03:22:56 np0005603623 systemd-machined[194379]: Machine qemu-48-instance-00000070 terminated.
Jan 31 03:22:57 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[273795]: [NOTICE]   (273799) : haproxy version is 2.8.14-c23fe91
Jan 31 03:22:57 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[273795]: [NOTICE]   (273799) : path to executable is /usr/sbin/haproxy
Jan 31 03:22:57 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[273795]: [WARNING]  (273799) : Exiting Master process...
Jan 31 03:22:57 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[273795]: [WARNING]  (273799) : Exiting Master process...
Jan 31 03:22:57 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[273795]: [ALERT]    (273799) : Current worker (273801) exited with code 143 (Terminated)
Jan 31 03:22:57 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[273795]: [WARNING]  (273799) : All workers exited. Exiting... (0)
Jan 31 03:22:57 np0005603623 systemd[1]: libpod-227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a.scope: Deactivated successfully.
Jan 31 03:22:57 np0005603623 podman[274475]: 2026-01-31 08:22:57.04437326 +0000 UTC m=+0.038534279 container died 227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:22:57 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a-userdata-shm.mount: Deactivated successfully.
Jan 31 03:22:57 np0005603623 systemd[1]: var-lib-containers-storage-overlay-de6fb7c01ac9dd8dfaf6ecda9428be5cf8e8ff5a30d0510b4487f8bde676494e-merged.mount: Deactivated successfully.
Jan 31 03:22:57 np0005603623 podman[274475]: 2026-01-31 08:22:57.085904052 +0000 UTC m=+0.080065071 container cleanup 227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:22:57 np0005603623 systemd[1]: libpod-conmon-227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a.scope: Deactivated successfully.
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.134 226239 INFO nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Instance shutdown successfully after 13 seconds.#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.143 226239 INFO nova.virt.libvirt.driver [-] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Instance destroyed successfully.#033[00m
Jan 31 03:22:57 np0005603623 podman[274506]: 2026-01-31 08:22:57.144526939 +0000 UTC m=+0.043669239 container remove 227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.148 226239 INFO nova.virt.libvirt.driver [-] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Instance destroyed successfully.#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.149 226239 DEBUG nova.virt.libvirt.vif [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:22:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1897188479',display_name='tempest-ServerActionsTestJSON-server-124585492',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1897188479',id=112,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:22:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-yq6e639z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:22:39Z,user_data=None,user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=7b830774-2315-410b-a3ed-585a1d0b6ee2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.149 226239 DEBUG nova.network.os_vif_util [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.150 226239 DEBUG nova.network.os_vif_util [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.154 226239 DEBUG os_vif [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.156 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.156 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb98b496-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.153 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5ab391bc-17dd-4a36-97ad-d88296397b09]: (4, ('Sat Jan 31 08:22:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a)\n227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a\nSat Jan 31 08:22:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a)\n227e4f05ac9bbfd81c171cc150a532118456e74fb38ffb1a1538e0131660f92a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.157 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.159 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.161 226239 INFO os_vif [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5')#033[00m
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.160 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ec4e95-ab2c-4050-bd63-c91feba3a0ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.165 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:57 np0005603623 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.177 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bf72f73c-ac10-4d11-90dd-1c85461c5089]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.178 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.193 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1b62ac88-014a-4459-adbc-fe760ca28020]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.195 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc69158-7a3f-4372-a1e9-57a7e2b705fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.207 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[eb81a323-ecf0-455c-b800-077b3e89ae43]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689555, 'reachable_time': 23656, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274548, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603623 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.210 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.210 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[58eb1600-fa54-4c47-8590-e11466644de5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.308 226239 DEBUG nova.compute.manager [req-d1c99a09-9b38-47ad-8d8a-2a16c6aef812 req-6848746b-5485-430a-8487-8f4b467b32c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received event network-vif-unplugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.309 226239 DEBUG oslo_concurrency.lockutils [req-d1c99a09-9b38-47ad-8d8a-2a16c6aef812 req-6848746b-5485-430a-8487-8f4b467b32c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.309 226239 DEBUG oslo_concurrency.lockutils [req-d1c99a09-9b38-47ad-8d8a-2a16c6aef812 req-6848746b-5485-430a-8487-8f4b467b32c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.309 226239 DEBUG oslo_concurrency.lockutils [req-d1c99a09-9b38-47ad-8d8a-2a16c6aef812 req-6848746b-5485-430a-8487-8f4b467b32c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.310 226239 DEBUG nova.compute.manager [req-d1c99a09-9b38-47ad-8d8a-2a16c6aef812 req-6848746b-5485-430a-8487-8f4b467b32c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] No waiting events found dispatching network-vif-unplugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.310 226239 WARNING nova.compute.manager [req-d1c99a09-9b38-47ad-8d8a-2a16c6aef812 req-6848746b-5485-430a-8487-8f4b467b32c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received unexpected event network-vif-unplugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.766 226239 DEBUG oslo_concurrency.lockutils [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "e83e8f02-79e9-4945-b30e-65624fc06c37" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.767 226239 DEBUG oslo_concurrency.lockutils [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.767 226239 DEBUG oslo_concurrency.lockutils [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.767 226239 DEBUG oslo_concurrency.lockutils [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.768 226239 DEBUG oslo_concurrency.lockutils [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.769 226239 INFO nova.compute.manager [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Terminating instance#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.770 226239 DEBUG nova.compute.manager [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:22:57 np0005603623 kernel: tap7ca23fcb-d9 (unregistering): left promiscuous mode
Jan 31 03:22:57 np0005603623 NetworkManager[48970]: <info>  [1769847777.8505] device (tap7ca23fcb-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.852 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:57Z|00441|binding|INFO|Releasing lport 7ca23fcb-d932-4244-a5dd-f02ef1fcd06c from this chassis (sb_readonly=0)
Jan 31 03:22:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:57Z|00442|binding|INFO|Setting lport 7ca23fcb-d932-4244-a5dd-f02ef1fcd06c down in Southbound
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.857 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:22:57Z|00443|binding|INFO|Removing iface tap7ca23fcb-d9 ovn-installed in OVS
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.860 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.864 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.867 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:cd:7e 10.100.0.7'], port_security=['fa:16:3e:dd:cd:7e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e83e8f02-79e9-4945-b30e-65624fc06c37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd2feb18-e01d-4084-b50c-13511157dde4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96de645f38844180b404d1a7cf7dd460', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6f7abf9c-ddb4-47da-9619-41273b5c7231', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6f1841d-a97e-4124-981b-627c1dc4d00d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=7ca23fcb-d932-4244-a5dd-f02ef1fcd06c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.869 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 7ca23fcb-d932-4244-a5dd-f02ef1fcd06c in datapath bd2feb18-e01d-4084-b50c-13511157dde4 unbound from our chassis#033[00m
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.871 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bd2feb18-e01d-4084-b50c-13511157dde4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.871 226239 INFO nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Deleting instance files /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2_del#033[00m
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.872 226239 INFO nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Deletion of /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2_del complete#033[00m
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.872 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ba0a55c1-4aa4-4e65-b856-98f85c771e82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:57.873 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 namespace which is not needed anymore#033[00m
Jan 31 03:22:57 np0005603623 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000072.scope: Deactivated successfully.
Jan 31 03:22:57 np0005603623 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000072.scope: Consumed 6.699s CPU time.
Jan 31 03:22:57 np0005603623 systemd-machined[194379]: Machine qemu-49-instance-00000072 terminated.
Jan 31 03:22:57 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[274392]: [NOTICE]   (274396) : haproxy version is 2.8.14-c23fe91
Jan 31 03:22:57 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[274392]: [NOTICE]   (274396) : path to executable is /usr/sbin/haproxy
Jan 31 03:22:57 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[274392]: [WARNING]  (274396) : Exiting Master process...
Jan 31 03:22:57 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[274392]: [WARNING]  (274396) : Exiting Master process...
Jan 31 03:22:57 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[274392]: [ALERT]    (274396) : Current worker (274398) exited with code 143 (Terminated)
Jan 31 03:22:57 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[274392]: [WARNING]  (274396) : All workers exited. Exiting... (0)
Jan 31 03:22:57 np0005603623 systemd[1]: libpod-a6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb.scope: Deactivated successfully.
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.983 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603623 podman[274572]: 2026-01-31 08:22:57.985635287 +0000 UTC m=+0.052095334 container died a6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:22:57 np0005603623 nova_compute[226235]: 2026-01-31 08:22:57.988 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.002 226239 INFO nova.virt.libvirt.driver [-] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Instance destroyed successfully.#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.004 226239 DEBUG nova.objects.instance [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lazy-loading 'resources' on Instance uuid e83e8f02-79e9-4945-b30e-65624fc06c37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:22:58 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb-userdata-shm.mount: Deactivated successfully.
Jan 31 03:22:58 np0005603623 systemd[1]: var-lib-containers-storage-overlay-efe4867985d0689e6b5eefcdb906fad80127edfd9066dd785d1764c63e7fa18b-merged.mount: Deactivated successfully.
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.039 226239 DEBUG nova.virt.libvirt.vif [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:22:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-147995892',display_name='tempest-tempest.common.compute-instance-147995892-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-147995892-2',id=114,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-31T08:22:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='96de645f38844180b404d1a7cf7dd460',ramdisk_id='',reservation_id='r-rktr52i6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vir
tio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-174245429',owner_user_name='tempest-MultipleCreateTestJSON-174245429-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:22:52Z,user_data=None,user_id='c086a82bd0384612a78981006889df41',uuid=e83e8f02-79e9-4945-b30e-65624fc06c37,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "address": "fa:16:3e:dd:cd:7e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca23fcb-d9", "ovs_interfaceid": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.040 226239 DEBUG nova.network.os_vif_util [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converting VIF {"id": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "address": "fa:16:3e:dd:cd:7e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca23fcb-d9", "ovs_interfaceid": "7ca23fcb-d932-4244-a5dd-f02ef1fcd06c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.041 226239 DEBUG nova.network.os_vif_util [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:cd:7e,bridge_name='br-int',has_traffic_filtering=True,id=7ca23fcb-d932-4244-a5dd-f02ef1fcd06c,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca23fcb-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.041 226239 DEBUG os_vif [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:cd:7e,bridge_name='br-int',has_traffic_filtering=True,id=7ca23fcb-d932-4244-a5dd-f02ef1fcd06c,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca23fcb-d9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.042 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.042 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ca23fcb-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.045 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.047 226239 INFO os_vif [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:cd:7e,bridge_name='br-int',has_traffic_filtering=True,id=7ca23fcb-d932-4244-a5dd-f02ef1fcd06c,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca23fcb-d9')#033[00m
Jan 31 03:22:58 np0005603623 podman[274572]: 2026-01-31 08:22:58.060860745 +0000 UTC m=+0.127320792 container cleanup a6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:22:58 np0005603623 systemd[1]: libpod-conmon-a6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb.scope: Deactivated successfully.
Jan 31 03:22:58 np0005603623 podman[274624]: 2026-01-31 08:22:58.176255613 +0000 UTC m=+0.098926622 container remove a6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:22:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:58.180 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[951b9c9b-de06-48a6-8af1-aee0a2c3b0e4]: (4, ('Sat Jan 31 08:22:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 (a6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb)\na6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb\nSat Jan 31 08:22:58 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 (a6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb)\na6b2cab549a8fef70de18b502febc539a0f50ff07f08b05864726a3d20f8f9cb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:58.182 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d20696d2-a9b8-4471-a1f3-2efa1afa51bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:58.182 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd2feb18-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:58 np0005603623 kernel: tapbd2feb18-e0: left promiscuous mode
Jan 31 03:22:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:58.187 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[05752bba-013f-4144-af81-fd7c136875e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.188 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.188 226239 INFO nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Creating image(s)#033[00m
Jan 31 03:22:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:58.196 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e80539-ddee-447f-b027-f52ab71fce13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:58.198 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c6636bb2-092c-41b2-858e-8ecdc713e515]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:58.211 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2944bf5e-8a0d-4427-8683-0502f2fa26bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 691150, 'reachable_time': 24892, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274656, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:58 np0005603623 systemd[1]: run-netns-ovnmeta\x2dbd2feb18\x2de01d\x2d4084\x2db50c\x2d13511157dde4.mount: Deactivated successfully.
Jan 31 03:22:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:58.213 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:22:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:22:58.213 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[4de72c87-4378-4d6b-96d0-bf3ec6e41959]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.217 226239 DEBUG nova.storage.rbd_utils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.242 226239 DEBUG nova.storage.rbd_utils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.266 226239 DEBUG nova.storage.rbd_utils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.270 226239 DEBUG oslo_concurrency.processutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.289 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:58.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.333 226239 DEBUG oslo_concurrency.processutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.334 226239 DEBUG oslo_concurrency.lockutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "365f9823d2619ef09948bdeed685488da63755b5" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.335 226239 DEBUG oslo_concurrency.lockutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.335 226239 DEBUG oslo_concurrency.lockutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.361 226239 DEBUG nova.storage.rbd_utils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.365 226239 DEBUG oslo_concurrency.processutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:22:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:22:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:58.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.696 226239 DEBUG oslo_concurrency.processutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.331s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.728 226239 DEBUG nova.compute.manager [req-98eb72f8-e87b-4008-a659-0e2311a61f48 req-247c5571-e0a8-4dd0-b816-4b47da2dd645 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Received event network-vif-unplugged-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.729 226239 DEBUG oslo_concurrency.lockutils [req-98eb72f8-e87b-4008-a659-0e2311a61f48 req-247c5571-e0a8-4dd0-b816-4b47da2dd645 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.729 226239 DEBUG oslo_concurrency.lockutils [req-98eb72f8-e87b-4008-a659-0e2311a61f48 req-247c5571-e0a8-4dd0-b816-4b47da2dd645 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.729 226239 DEBUG oslo_concurrency.lockutils [req-98eb72f8-e87b-4008-a659-0e2311a61f48 req-247c5571-e0a8-4dd0-b816-4b47da2dd645 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.729 226239 DEBUG nova.compute.manager [req-98eb72f8-e87b-4008-a659-0e2311a61f48 req-247c5571-e0a8-4dd0-b816-4b47da2dd645 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] No waiting events found dispatching network-vif-unplugged-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.730 226239 DEBUG nova.compute.manager [req-98eb72f8-e87b-4008-a659-0e2311a61f48 req-247c5571-e0a8-4dd0-b816-4b47da2dd645 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Received event network-vif-unplugged-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.767 226239 DEBUG nova.storage.rbd_utils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] resizing rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.888 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.889 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Ensure instance console log exists: /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.890 226239 DEBUG oslo_concurrency.lockutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.890 226239 DEBUG oslo_concurrency.lockutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.890 226239 DEBUG oslo_concurrency.lockutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.892 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Start _get_guest_xml network_info=[{"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.897 226239 WARNING nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.904 226239 DEBUG nova.virt.libvirt.host [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.905 226239 DEBUG nova.virt.libvirt.host [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.909 226239 DEBUG nova.virt.libvirt.host [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.909 226239 DEBUG nova.virt.libvirt.host [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.910 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.910 226239 DEBUG nova.virt.hardware [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.911 226239 DEBUG nova.virt.hardware [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.911 226239 DEBUG nova.virt.hardware [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.911 226239 DEBUG nova.virt.hardware [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.912 226239 DEBUG nova.virt.hardware [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.912 226239 DEBUG nova.virt.hardware [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.912 226239 DEBUG nova.virt.hardware [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.912 226239 DEBUG nova.virt.hardware [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.913 226239 DEBUG nova.virt.hardware [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.913 226239 DEBUG nova.virt.hardware [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.913 226239 DEBUG nova.virt.hardware [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.913 226239 DEBUG nova.objects.instance [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7b830774-2315-410b-a3ed-585a1d0b6ee2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.947 226239 DEBUG oslo_concurrency.processutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.973 226239 INFO nova.virt.libvirt.driver [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Deleting instance files /var/lib/nova/instances/e83e8f02-79e9-4945-b30e-65624fc06c37_del#033[00m
Jan 31 03:22:58 np0005603623 nova_compute[226235]: 2026-01-31 08:22:58.974 226239 INFO nova.virt.libvirt.driver [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Deletion of /var/lib/nova/instances/e83e8f02-79e9-4945-b30e-65624fc06c37_del complete#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.116 226239 INFO nova.compute.manager [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Took 1.35 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.117 226239 DEBUG oslo.service.loopingcall [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.117 226239 DEBUG nova.compute.manager [-] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.117 226239 DEBUG nova.network.neutron [-] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.248 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:22:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3961538723' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.405 226239 DEBUG oslo_concurrency.processutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.429 226239 DEBUG nova.storage.rbd_utils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.433 226239 DEBUG oslo_concurrency.processutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.512 226239 DEBUG nova.compute.manager [req-f7126346-44fd-4e42-90c2-8d659a2f7a5c req-b8051c1a-ce13-4303-ab90-0c1f797fa3b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.513 226239 DEBUG oslo_concurrency.lockutils [req-f7126346-44fd-4e42-90c2-8d659a2f7a5c req-b8051c1a-ce13-4303-ab90-0c1f797fa3b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.513 226239 DEBUG oslo_concurrency.lockutils [req-f7126346-44fd-4e42-90c2-8d659a2f7a5c req-b8051c1a-ce13-4303-ab90-0c1f797fa3b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.513 226239 DEBUG oslo_concurrency.lockutils [req-f7126346-44fd-4e42-90c2-8d659a2f7a5c req-b8051c1a-ce13-4303-ab90-0c1f797fa3b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.514 226239 DEBUG nova.compute.manager [req-f7126346-44fd-4e42-90c2-8d659a2f7a5c req-b8051c1a-ce13-4303-ab90-0c1f797fa3b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] No waiting events found dispatching network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.514 226239 WARNING nova.compute.manager [req-f7126346-44fd-4e42-90c2-8d659a2f7a5c req-b8051c1a-ce13-4303-ab90-0c1f797fa3b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received unexpected event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 31 03:22:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:22:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/344680605' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.908 226239 DEBUG oslo_concurrency.processutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.910 226239 DEBUG nova.virt.libvirt.vif [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:22:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1897188479',display_name='tempest-ServerActionsTestJSON-server-124585492',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1897188479',id=112,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:22:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-yq6e639z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_
name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:22:58Z,user_data=None,user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=7b830774-2315-410b-a3ed-585a1d0b6ee2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.910 226239 DEBUG nova.network.os_vif_util [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.911 226239 DEBUG nova.network.os_vif_util [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.913 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  <uuid>7b830774-2315-410b-a3ed-585a1d0b6ee2</uuid>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  <name>instance-00000070</name>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsTestJSON-server-124585492</nova:name>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:22:58</nova:creationTime>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <nova:user uuid="1d03198d8ab846bda092e089b2d5a6c7">tempest-ServerActionsTestJSON-1873947453-project-member</nova:user>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <nova:project uuid="5b87da3b3f42494f96baeeeaf60b54df">tempest-ServerActionsTestJSON-1873947453</nova:project>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="0864ca59-9877-4e6d-adfc-f0a3204ed8f8"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <nova:port uuid="bb98b496-e57f-4e5a-bea6-fc68d9690077">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <entry name="serial">7b830774-2315-410b-a3ed-585a1d0b6ee2</entry>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <entry name="uuid">7b830774-2315-410b-a3ed-585a1d0b6ee2</entry>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/7b830774-2315-410b-a3ed-585a1d0b6ee2_disk">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/7b830774-2315-410b-a3ed-585a1d0b6ee2_disk.config">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:fd:11:00"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <target dev="tapbb98b496-e5"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/console.log" append="off"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:22:59 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:22:59 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:22:59 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:22:59 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.914 226239 DEBUG nova.compute.manager [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Preparing to wait for external event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.914 226239 DEBUG oslo_concurrency.lockutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.914 226239 DEBUG oslo_concurrency.lockutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.914 226239 DEBUG oslo_concurrency.lockutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.915 226239 DEBUG nova.virt.libvirt.vif [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:22:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1897188479',display_name='tempest-ServerActionsTestJSON-server-124585492',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1897188479',id=112,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:22:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-yq6e639z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_
name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:22:58Z,user_data=None,user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=7b830774-2315-410b-a3ed-585a1d0b6ee2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.915 226239 DEBUG nova.network.os_vif_util [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.915 226239 DEBUG nova.network.os_vif_util [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.916 226239 DEBUG os_vif [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.916 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.917 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.917 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.919 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.920 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbb98b496-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.920 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbb98b496-e5, col_values=(('external_ids', {'iface-id': 'bb98b496-e57f-4e5a-bea6-fc68d9690077', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:11:00', 'vm-uuid': '7b830774-2315-410b-a3ed-585a1d0b6ee2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.921 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:59 np0005603623 NetworkManager[48970]: <info>  [1769847779.9225] manager: (tapbb98b496-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.924 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.925 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:59 np0005603623 nova_compute[226235]: 2026-01-31 08:22:59.926 226239 INFO os_vif [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5')#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.018 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.019 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.019 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No VIF found with MAC fa:16:3e:fd:11:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.020 226239 INFO nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Using config drive#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.041 226239 DEBUG nova.storage.rbd_utils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.074 226239 DEBUG nova.objects.instance [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'ec2_ids' on Instance uuid 7b830774-2315-410b-a3ed-585a1d0b6ee2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.128 226239 DEBUG nova.objects.instance [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'keypairs' on Instance uuid 7b830774-2315-410b-a3ed-585a1d0b6ee2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:00.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:00.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.689 226239 DEBUG nova.network.neutron [-] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.751 226239 INFO nova.compute.manager [-] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Took 1.63 seconds to deallocate network for instance.#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.849 226239 DEBUG nova.compute.manager [req-b6bd11e0-5696-4202-ada3-aad296c6a8e0 req-4f530e29-a6e3-464d-b873-16a9547f49bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Received event network-vif-deleted-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.870 226239 DEBUG oslo_concurrency.lockutils [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.871 226239 DEBUG oslo_concurrency.lockutils [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.906 226239 DEBUG nova.compute.manager [req-3429319a-706e-4cbe-aae2-9d3aa2d5d2e6 req-b6f65858-8a75-43a0-8713-4e770bf0a117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Received event network-vif-plugged-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.907 226239 DEBUG oslo_concurrency.lockutils [req-3429319a-706e-4cbe-aae2-9d3aa2d5d2e6 req-b6f65858-8a75-43a0-8713-4e770bf0a117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.907 226239 DEBUG oslo_concurrency.lockutils [req-3429319a-706e-4cbe-aae2-9d3aa2d5d2e6 req-b6f65858-8a75-43a0-8713-4e770bf0a117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.907 226239 DEBUG oslo_concurrency.lockutils [req-3429319a-706e-4cbe-aae2-9d3aa2d5d2e6 req-b6f65858-8a75-43a0-8713-4e770bf0a117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.908 226239 DEBUG nova.compute.manager [req-3429319a-706e-4cbe-aae2-9d3aa2d5d2e6 req-b6f65858-8a75-43a0-8713-4e770bf0a117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] No waiting events found dispatching network-vif-plugged-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.908 226239 WARNING nova.compute.manager [req-3429319a-706e-4cbe-aae2-9d3aa2d5d2e6 req-b6f65858-8a75-43a0-8713-4e770bf0a117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Received unexpected event network-vif-plugged-7ca23fcb-d932-4244-a5dd-f02ef1fcd06c for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.977 226239 INFO nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Creating config drive at /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/disk.config#033[00m
Jan 31 03:23:00 np0005603623 nova_compute[226235]: 2026-01-31 08:23:00.981 226239 DEBUG oslo_concurrency.processutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjbucfvv7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.010 226239 DEBUG oslo_concurrency.processutils [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.106 226239 DEBUG oslo_concurrency.processutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjbucfvv7" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.151 226239 DEBUG nova.storage.rbd_utils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.154 226239 DEBUG oslo_concurrency.processutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/disk.config 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.319 226239 DEBUG oslo_concurrency.processutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/disk.config 7b830774-2315-410b-a3ed-585a1d0b6ee2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.165s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.320 226239 INFO nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Deleting local config drive /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2/disk.config because it was imported into RBD.#033[00m
Jan 31 03:23:01 np0005603623 kernel: tapbb98b496-e5: entered promiscuous mode
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.365 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:01 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:01Z|00444|binding|INFO|Claiming lport bb98b496-e57f-4e5a-bea6-fc68d9690077 for this chassis.
Jan 31 03:23:01 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:01Z|00445|binding|INFO|bb98b496-e57f-4e5a-bea6-fc68d9690077: Claiming fa:16:3e:fd:11:00 10.100.0.6
Jan 31 03:23:01 np0005603623 NetworkManager[48970]: <info>  [1769847781.3682] manager: (tapbb98b496-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/211)
Jan 31 03:23:01 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:01Z|00446|binding|INFO|Setting lport bb98b496-e57f-4e5a-bea6-fc68d9690077 ovn-installed in OVS
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.374 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.377 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:01 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:01Z|00447|binding|INFO|Setting lport bb98b496-e57f-4e5a-bea6-fc68d9690077 up in Southbound
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.383 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:11:00 10.100.0.6'], port_security=['fa:16:3e:fd:11:00 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7b830774-2315-410b-a3ed-585a1d0b6ee2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'be6219b2-98f8-4804-bad5-369b6bf26a95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=bb98b496-e57f-4e5a-bea6-fc68d9690077) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.384 143258 INFO neutron.agent.ovn.metadata.agent [-] Port bb98b496-e57f-4e5a-bea6-fc68d9690077 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 bound to our chassis#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.386 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.394 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[54b9f5a1-a9ef-42f1-9de2-c2624c641896]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.395 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.396 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.396 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e26fa136-8246-4496-bccf-e28fff1be02d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 systemd-machined[194379]: New machine qemu-50-instance-00000070.
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.397 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7e87a85d-d949-4c97-9448-b783dddafb57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 systemd-udevd[274969]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.407 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[88d248f2-b5ef-42ed-8c72-511d24e569c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 NetworkManager[48970]: <info>  [1769847781.4117] device (tapbb98b496-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:23:01 np0005603623 NetworkManager[48970]: <info>  [1769847781.4127] device (tapbb98b496-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:23:01 np0005603623 systemd[1]: Started Virtual Machine qemu-50-instance-00000070.
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.427 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d00d508c-9d11-44d4-a353-48730f3e5161]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.449 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[5e673845-42c8-4d67-b48a-5e3712066f20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3285695630' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.454 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a28e1679-5644-4350-b060-9db667f33a70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 NetworkManager[48970]: <info>  [1769847781.4551] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/212)
Jan 31 03:23:01 np0005603623 systemd-udevd[274973]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.473 226239 DEBUG oslo_concurrency.processutils [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.480 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3508cf0f-25f2-42b6-9451-cf7b285ef88d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.483 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ceaf342a-90b2-4e07-a9f9-f1449d1ccfae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.484 226239 DEBUG nova.compute.provider_tree [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:23:01 np0005603623 NetworkManager[48970]: <info>  [1769847781.5030] device (tap1186b71b-00): carrier: link connected
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.507 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd28eea-6198-4653-aef5-dc5f82091fec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.517 226239 DEBUG nova.scheduler.client.report [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.521 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c54db56a-e5b6-44e7-80bb-b9cf83dec3d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 692193, 'reachable_time': 38052, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275003, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.537 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dd3d48dc-9a70-4284-b027-9062a7ce1160]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 692193, 'tstamp': 692193}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275004, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.547 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[996a776a-8549-4b49-a0d1-f98d6d3c101b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 692193, 'reachable_time': 38052, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 275005, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.568 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fb911e52-4562-4b00-bf2b-e2f256e59052]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.572 226239 DEBUG oslo_concurrency.lockutils [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.700s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.607 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e0cd0cbe-d0bf-4351-aaef-07a8935cb4d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.608 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.609 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.609 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.611 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:01 np0005603623 NetworkManager[48970]: <info>  [1769847781.6128] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/213)
Jan 31 03:23:01 np0005603623 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.614 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:01 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:01Z|00448|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.615 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.619 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.620 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[772dcd14-75a7-4a85-90bf-006a2c9d3933]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.621 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:23:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:01.622 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.622 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.638 226239 INFO nova.scheduler.client.report [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Deleted allocations for instance e83e8f02-79e9-4945-b30e-65624fc06c37#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.752 226239 DEBUG oslo_concurrency.lockutils [None req-97fc2867-5566-413c-95ad-cafd82fb9410 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "e83e8f02-79e9-4945-b30e-65624fc06c37" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.985s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:01 np0005603623 podman[275035]: 2026-01-31 08:23:01.970441224 +0000 UTC m=+0.071710398 container create 1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.977 226239 DEBUG nova.compute.manager [req-09ee1d20-a2ab-4041-b6e1-3277e8e7b850 req-960eb8c0-852b-4afd-8d61-57f7f39e2b16 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.978 226239 DEBUG oslo_concurrency.lockutils [req-09ee1d20-a2ab-4041-b6e1-3277e8e7b850 req-960eb8c0-852b-4afd-8d61-57f7f39e2b16 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.978 226239 DEBUG oslo_concurrency.lockutils [req-09ee1d20-a2ab-4041-b6e1-3277e8e7b850 req-960eb8c0-852b-4afd-8d61-57f7f39e2b16 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.978 226239 DEBUG oslo_concurrency.lockutils [req-09ee1d20-a2ab-4041-b6e1-3277e8e7b850 req-960eb8c0-852b-4afd-8d61-57f7f39e2b16 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:01 np0005603623 nova_compute[226235]: 2026-01-31 08:23:01.979 226239 DEBUG nova.compute.manager [req-09ee1d20-a2ab-4041-b6e1-3277e8e7b850 req-960eb8c0-852b-4afd-8d61-57f7f39e2b16 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Processing event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:23:02 np0005603623 podman[275035]: 2026-01-31 08:23:01.922595834 +0000 UTC m=+0.023865018 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:23:02 np0005603623 systemd[1]: Started libpod-conmon-1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6.scope.
Jan 31 03:23:02 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:23:02 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c8e8c3a44e85d64006f8f5002d4c15d62dd0a8998b594f1d4c99b811fdd7887/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:23:02 np0005603623 podman[275035]: 2026-01-31 08:23:02.063577054 +0000 UTC m=+0.164846238 container init 1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:23:02 np0005603623 podman[275035]: 2026-01-31 08:23:02.068033313 +0000 UTC m=+0.169302477 container start 1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 31 03:23:02 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[275050]: [NOTICE]   (275072) : New worker (275074) forked
Jan 31 03:23:02 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[275050]: [NOTICE]   (275072) : Loading success.
Jan 31 03:23:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:02.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.428 226239 DEBUG nova.compute.manager [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.429 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Removed pending event for 7b830774-2315-410b-a3ed-585a1d0b6ee2 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.429 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847782.4277122, 7b830774-2315-410b-a3ed-585a1d0b6ee2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.430 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] VM Started (Lifecycle Event)#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.432 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.435 226239 INFO nova.virt.libvirt.driver [-] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Instance spawned successfully.#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.435 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.468 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.471 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.550 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.551 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.551 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.551 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.552 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.552 226239 DEBUG nova.virt.libvirt.driver [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.563 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.563 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847782.4278092, 7b830774-2315-410b-a3ed-585a1d0b6ee2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.564 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:23:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:02.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.637 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.642 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847782.4315495, 7b830774-2315-410b-a3ed-585a1d0b6ee2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.642 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.675 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.679 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.683 226239 DEBUG nova.compute.manager [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.765 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.859 226239 DEBUG oslo_concurrency.lockutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.860 226239 DEBUG oslo_concurrency.lockutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.860 226239 DEBUG nova.objects.instance [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 03:23:02 np0005603623 nova_compute[226235]: 2026-01-31 08:23:02.997 226239 DEBUG oslo_concurrency.lockutils [None req-6920d0ca-295d-4ddc-a778-6a915883070f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:23:03 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1553169961' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:23:04 np0005603623 nova_compute[226235]: 2026-01-31 08:23:04.250 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:04.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:04.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:04 np0005603623 nova_compute[226235]: 2026-01-31 08:23:04.921 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:05 np0005603623 nova_compute[226235]: 2026-01-31 08:23:05.764 226239 DEBUG nova.compute.manager [req-1eaadc1e-be90-49fa-8438-b48d29eefc78 req-917bd443-3e95-410c-a76f-cb1fbc8df9c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:05 np0005603623 nova_compute[226235]: 2026-01-31 08:23:05.764 226239 DEBUG oslo_concurrency.lockutils [req-1eaadc1e-be90-49fa-8438-b48d29eefc78 req-917bd443-3e95-410c-a76f-cb1fbc8df9c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:05 np0005603623 nova_compute[226235]: 2026-01-31 08:23:05.765 226239 DEBUG oslo_concurrency.lockutils [req-1eaadc1e-be90-49fa-8438-b48d29eefc78 req-917bd443-3e95-410c-a76f-cb1fbc8df9c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:05 np0005603623 nova_compute[226235]: 2026-01-31 08:23:05.765 226239 DEBUG oslo_concurrency.lockutils [req-1eaadc1e-be90-49fa-8438-b48d29eefc78 req-917bd443-3e95-410c-a76f-cb1fbc8df9c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:05 np0005603623 nova_compute[226235]: 2026-01-31 08:23:05.765 226239 DEBUG nova.compute.manager [req-1eaadc1e-be90-49fa-8438-b48d29eefc78 req-917bd443-3e95-410c-a76f-cb1fbc8df9c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] No waiting events found dispatching network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:05 np0005603623 nova_compute[226235]: 2026-01-31 08:23:05.765 226239 WARNING nova.compute.manager [req-1eaadc1e-be90-49fa-8438-b48d29eefc78 req-917bd443-3e95-410c-a76f-cb1fbc8df9c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received unexpected event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:23:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:06.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:06.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:08.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:08.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:09 np0005603623 nova_compute[226235]: 2026-01-31 08:23:09.251 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:09 np0005603623 nova_compute[226235]: 2026-01-31 08:23:09.924 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:09 np0005603623 nova_compute[226235]: 2026-01-31 08:23:09.988 226239 DEBUG oslo_concurrency.lockutils [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:09 np0005603623 nova_compute[226235]: 2026-01-31 08:23:09.989 226239 DEBUG oslo_concurrency.lockutils [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:09 np0005603623 nova_compute[226235]: 2026-01-31 08:23:09.989 226239 DEBUG oslo_concurrency.lockutils [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:09 np0005603623 nova_compute[226235]: 2026-01-31 08:23:09.989 226239 DEBUG oslo_concurrency.lockutils [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:09 np0005603623 nova_compute[226235]: 2026-01-31 08:23:09.990 226239 DEBUG oslo_concurrency.lockutils [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:09 np0005603623 nova_compute[226235]: 2026-01-31 08:23:09.991 226239 INFO nova.compute.manager [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Terminating instance#033[00m
Jan 31 03:23:09 np0005603623 nova_compute[226235]: 2026-01-31 08:23:09.992 226239 DEBUG nova.compute.manager [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:23:10 np0005603623 kernel: tapbb98b496-e5 (unregistering): left promiscuous mode
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.081 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:10 np0005603623 NetworkManager[48970]: <info>  [1769847790.0827] device (tapbb98b496-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.088 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:10 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:10Z|00449|binding|INFO|Releasing lport bb98b496-e57f-4e5a-bea6-fc68d9690077 from this chassis (sb_readonly=0)
Jan 31 03:23:10 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:10Z|00450|binding|INFO|Setting lport bb98b496-e57f-4e5a-bea6-fc68d9690077 down in Southbound
Jan 31 03:23:10 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:10Z|00451|binding|INFO|Removing iface tapbb98b496-e5 ovn-installed in OVS
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.090 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.096 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:10.110 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:11:00 10.100.0.6'], port_security=['fa:16:3e:fd:11:00 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7b830774-2315-410b-a3ed-585a1d0b6ee2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'be6219b2-98f8-4804-bad5-369b6bf26a95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=bb98b496-e57f-4e5a-bea6-fc68d9690077) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:23:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:10.112 143258 INFO neutron.agent.ovn.metadata.agent [-] Port bb98b496-e57f-4e5a-bea6-fc68d9690077 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:23:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:10.113 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:23:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:10.114 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b0e9caa5-43a6-4501-a26d-d00d56682b99]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:10.115 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore#033[00m
Jan 31 03:23:10 np0005603623 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000070.scope: Deactivated successfully.
Jan 31 03:23:10 np0005603623 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000070.scope: Consumed 8.508s CPU time.
Jan 31 03:23:10 np0005603623 systemd-machined[194379]: Machine qemu-50-instance-00000070 terminated.
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.207 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.210 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.222 226239 INFO nova.virt.libvirt.driver [-] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Instance destroyed successfully.#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.223 226239 DEBUG nova.objects.instance [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'resources' on Instance uuid 7b830774-2315-410b-a3ed-585a1d0b6ee2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.247 226239 DEBUG nova.virt.libvirt.vif [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:22:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1897188479',display_name='tempest-ServerActionsTestJSON-server-124585492',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1897188479',id=112,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:23:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-yq6e639z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:23:02Z,user_data=None,user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=7b830774-2315-410b-a3ed-585a1d0b6ee2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.248 226239 DEBUG nova.network.os_vif_util [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "address": "fa:16:3e:fd:11:00", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb98b496-e5", "ovs_interfaceid": "bb98b496-e57f-4e5a-bea6-fc68d9690077", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.249 226239 DEBUG nova.network.os_vif_util [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.249 226239 DEBUG os_vif [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.250 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.250 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb98b496-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.252 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.253 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:10 np0005603623 nova_compute[226235]: 2026-01-31 08:23:10.255 226239 INFO os_vif [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:11:00,bridge_name='br-int',has_traffic_filtering=True,id=bb98b496-e57f-4e5a-bea6-fc68d9690077,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb98b496-e5')#033[00m
Jan 31 03:23:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:10.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:10 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[275050]: [NOTICE]   (275072) : haproxy version is 2.8.14-c23fe91
Jan 31 03:23:10 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[275050]: [NOTICE]   (275072) : path to executable is /usr/sbin/haproxy
Jan 31 03:23:10 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[275050]: [ALERT]    (275072) : Current worker (275074) exited with code 143 (Terminated)
Jan 31 03:23:10 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[275050]: [WARNING]  (275072) : All workers exited. Exiting... (0)
Jan 31 03:23:10 np0005603623 systemd[1]: libpod-1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6.scope: Deactivated successfully.
Jan 31 03:23:10 np0005603623 podman[275185]: 2026-01-31 08:23:10.43038133 +0000 UTC m=+0.250827003 container died 1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 03:23:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:10.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:10 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6-userdata-shm.mount: Deactivated successfully.
Jan 31 03:23:10 np0005603623 systemd[1]: var-lib-containers-storage-overlay-1c8e8c3a44e85d64006f8f5002d4c15d62dd0a8998b594f1d4c99b811fdd7887-merged.mount: Deactivated successfully.
Jan 31 03:23:11 np0005603623 nova_compute[226235]: 2026-01-31 08:23:11.300 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:11 np0005603623 nova_compute[226235]: 2026-01-31 08:23:11.301 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:11 np0005603623 podman[275185]: 2026-01-31 08:23:11.322720394 +0000 UTC m=+1.143166067 container cleanup 1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 03:23:11 np0005603623 systemd[1]: libpod-conmon-1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6.scope: Deactivated successfully.
Jan 31 03:23:11 np0005603623 nova_compute[226235]: 2026-01-31 08:23:11.375 226239 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:23:11 np0005603623 nova_compute[226235]: 2026-01-31 08:23:11.480 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:11 np0005603623 nova_compute[226235]: 2026-01-31 08:23:11.480 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:11 np0005603623 nova_compute[226235]: 2026-01-31 08:23:11.491 226239 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:23:11 np0005603623 nova_compute[226235]: 2026-01-31 08:23:11.492 226239 INFO nova.compute.claims [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:23:11 np0005603623 podman[275245]: 2026-01-31 08:23:11.7666335 +0000 UTC m=+0.427009607 container remove 1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:23:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:11.771 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9edaa649-bafd-489b-9cee-41cb8e9b2509]: (4, ('Sat Jan 31 08:23:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6)\n1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6\nSat Jan 31 08:23:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6)\n1f8e1751f28f82444f0874543566e5a2ce8bfb4fa062847677027c936dbfb6a6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:11.772 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3a054d52-0f7d-4290-9bb5-5883798ce4f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:11.774 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:11 np0005603623 nova_compute[226235]: 2026-01-31 08:23:11.775 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:11 np0005603623 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:23:11 np0005603623 nova_compute[226235]: 2026-01-31 08:23:11.780 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:11 np0005603623 nova_compute[226235]: 2026-01-31 08:23:11.781 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:11.782 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7c89411b-10c8-4279-ad6f-b5f19ef1fc40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:11.798 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5043b06d-e124-4884-b039-c11d233e9af5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:11.800 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[824451f7-65ab-4468-be41-757ea54d84ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:11.811 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4aee86ef-5008-4ffc-9d3f-3cb45f743ef5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 692187, 'reachable_time': 22181, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275261, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:11.813 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:23:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:11.813 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[6819af35-0dd3-4820-948d-b687223d5e65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:11 np0005603623 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:23:11 np0005603623 nova_compute[226235]: 2026-01-31 08:23:11.851 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4129340396' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:12 np0005603623 nova_compute[226235]: 2026-01-31 08:23:12.294 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:12 np0005603623 nova_compute[226235]: 2026-01-31 08:23:12.299 226239 DEBUG nova.compute.provider_tree [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:23:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:12.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4157534680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:12 np0005603623 nova_compute[226235]: 2026-01-31 08:23:12.394 226239 DEBUG nova.scheduler.client.report [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:23:12 np0005603623 nova_compute[226235]: 2026-01-31 08:23:12.497 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:12 np0005603623 nova_compute[226235]: 2026-01-31 08:23:12.498 226239 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:23:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:12.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:12 np0005603623 nova_compute[226235]: 2026-01-31 08:23:12.703 226239 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:23:12 np0005603623 nova_compute[226235]: 2026-01-31 08:23:12.704 226239 DEBUG nova.network.neutron [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:23:12 np0005603623 nova_compute[226235]: 2026-01-31 08:23:12.782 226239 INFO nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:23:12 np0005603623 nova_compute[226235]: 2026-01-31 08:23:12.872 226239 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:23:12 np0005603623 nova_compute[226235]: 2026-01-31 08:23:12.997 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847777.9963305, e83e8f02-79e9-4945-b30e-65624fc06c37 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:12 np0005603623 nova_compute[226235]: 2026-01-31 08:23:12.997 226239 INFO nova.compute.manager [-] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.049 226239 DEBUG nova.compute.manager [None req-48cbd4b5-5cd5-4da5-b49c-fcc9058eab70 - - - - - -] [instance: e83e8f02-79e9-4945-b30e-65624fc06c37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.184 226239 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.185 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.185 226239 INFO nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Creating image(s)#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.211 226239 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.241 226239 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.272 226239 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.278 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.298 226239 DEBUG nova.policy [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c086a82bd0384612a78981006889df41', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '96de645f38844180b404d1a7cf7dd460', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.305 226239 DEBUG nova.compute.manager [req-aaeebdbc-03a0-404d-b2fb-931abf3872bc req-7b8c75fe-d3a6-4ab2-85b0-6164bb31de59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received event network-vif-unplugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.305 226239 DEBUG oslo_concurrency.lockutils [req-aaeebdbc-03a0-404d-b2fb-931abf3872bc req-7b8c75fe-d3a6-4ab2-85b0-6164bb31de59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.305 226239 DEBUG oslo_concurrency.lockutils [req-aaeebdbc-03a0-404d-b2fb-931abf3872bc req-7b8c75fe-d3a6-4ab2-85b0-6164bb31de59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.306 226239 DEBUG oslo_concurrency.lockutils [req-aaeebdbc-03a0-404d-b2fb-931abf3872bc req-7b8c75fe-d3a6-4ab2-85b0-6164bb31de59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.306 226239 DEBUG nova.compute.manager [req-aaeebdbc-03a0-404d-b2fb-931abf3872bc req-7b8c75fe-d3a6-4ab2-85b0-6164bb31de59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] No waiting events found dispatching network-vif-unplugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.306 226239 DEBUG nova.compute.manager [req-aaeebdbc-03a0-404d-b2fb-931abf3872bc req-7b8c75fe-d3a6-4ab2-85b0-6164bb31de59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received event network-vif-unplugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.306 226239 DEBUG nova.compute.manager [req-aaeebdbc-03a0-404d-b2fb-931abf3872bc req-7b8c75fe-d3a6-4ab2-85b0-6164bb31de59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.307 226239 DEBUG oslo_concurrency.lockutils [req-aaeebdbc-03a0-404d-b2fb-931abf3872bc req-7b8c75fe-d3a6-4ab2-85b0-6164bb31de59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.307 226239 DEBUG oslo_concurrency.lockutils [req-aaeebdbc-03a0-404d-b2fb-931abf3872bc req-7b8c75fe-d3a6-4ab2-85b0-6164bb31de59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.307 226239 DEBUG oslo_concurrency.lockutils [req-aaeebdbc-03a0-404d-b2fb-931abf3872bc req-7b8c75fe-d3a6-4ab2-85b0-6164bb31de59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.307 226239 DEBUG nova.compute.manager [req-aaeebdbc-03a0-404d-b2fb-931abf3872bc req-7b8c75fe-d3a6-4ab2-85b0-6164bb31de59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] No waiting events found dispatching network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.307 226239 WARNING nova.compute.manager [req-aaeebdbc-03a0-404d-b2fb-931abf3872bc req-7b8c75fe-d3a6-4ab2-85b0-6164bb31de59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received unexpected event network-vif-plugged-bb98b496-e57f-4e5a-bea6-fc68d9690077 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.330 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.330 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.331 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.331 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.353 226239 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:13 np0005603623 nova_compute[226235]: 2026-01-31 08:23:13.355 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:14 np0005603623 nova_compute[226235]: 2026-01-31 08:23:14.269 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:14.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:23:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1283595945' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:23:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:23:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1283595945' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:23:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:14.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:14 np0005603623 nova_compute[226235]: 2026-01-31 08:23:14.787 226239 DEBUG nova.network.neutron [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Successfully created port: 4a895f0e-6a61-40cd-a68c-3c5cb76daadb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:23:14 np0005603623 podman[275378]: 2026-01-31 08:23:14.949632873 +0000 UTC m=+0.040760319 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:23:14 np0005603623 podman[275379]: 2026-01-31 08:23:14.97825113 +0000 UTC m=+0.067631211 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 31 03:23:15 np0005603623 nova_compute[226235]: 2026-01-31 08:23:15.252 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:23:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:16.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:23:16 np0005603623 nova_compute[226235]: 2026-01-31 08:23:16.361 226239 DEBUG nova.network.neutron [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Successfully updated port: 4a895f0e-6a61-40cd-a68c-3c5cb76daadb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:23:16 np0005603623 nova_compute[226235]: 2026-01-31 08:23:16.384 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "refresh_cache-78af9e5e-c5fb-4266-bcf6-4f99948aaa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:23:16 np0005603623 nova_compute[226235]: 2026-01-31 08:23:16.385 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquired lock "refresh_cache-78af9e5e-c5fb-4266-bcf6-4f99948aaa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:23:16 np0005603623 nova_compute[226235]: 2026-01-31 08:23:16.385 226239 DEBUG nova.network.neutron [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:23:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:16.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:16 np0005603623 nova_compute[226235]: 2026-01-31 08:23:16.849 226239 DEBUG nova.compute.manager [req-020919ea-6f9b-468e-8c12-82d73044fc27 req-709e9eb7-09e0-4548-b7cd-05b1e557d903 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Received event network-changed-4a895f0e-6a61-40cd-a68c-3c5cb76daadb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:16 np0005603623 nova_compute[226235]: 2026-01-31 08:23:16.850 226239 DEBUG nova.compute.manager [req-020919ea-6f9b-468e-8c12-82d73044fc27 req-709e9eb7-09e0-4548-b7cd-05b1e557d903 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Refreshing instance network info cache due to event network-changed-4a895f0e-6a61-40cd-a68c-3c5cb76daadb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:23:16 np0005603623 nova_compute[226235]: 2026-01-31 08:23:16.850 226239 DEBUG oslo_concurrency.lockutils [req-020919ea-6f9b-468e-8c12-82d73044fc27 req-709e9eb7-09e0-4548-b7cd-05b1e557d903 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-78af9e5e-c5fb-4266-bcf6-4f99948aaa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:23:17 np0005603623 nova_compute[226235]: 2026-01-31 08:23:17.029 226239 DEBUG nova.network.neutron [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:23:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:18.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e267 e267: 3 total, 3 up, 3 in
Jan 31 03:23:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:18.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:19 np0005603623 nova_compute[226235]: 2026-01-31 08:23:19.272 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:19 np0005603623 nova_compute[226235]: 2026-01-31 08:23:19.318 226239 DEBUG nova.network.neutron [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Updating instance_info_cache with network_info: [{"id": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "address": "fa:16:3e:44:27:7a", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a895f0e-6a", "ovs_interfaceid": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:19 np0005603623 nova_compute[226235]: 2026-01-31 08:23:19.393 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Releasing lock "refresh_cache-78af9e5e-c5fb-4266-bcf6-4f99948aaa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:23:19 np0005603623 nova_compute[226235]: 2026-01-31 08:23:19.394 226239 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Instance network_info: |[{"id": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "address": "fa:16:3e:44:27:7a", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a895f0e-6a", "ovs_interfaceid": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:23:19 np0005603623 nova_compute[226235]: 2026-01-31 08:23:19.394 226239 DEBUG oslo_concurrency.lockutils [req-020919ea-6f9b-468e-8c12-82d73044fc27 req-709e9eb7-09e0-4548-b7cd-05b1e557d903 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-78af9e5e-c5fb-4266-bcf6-4f99948aaa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:23:19 np0005603623 nova_compute[226235]: 2026-01-31 08:23:19.394 226239 DEBUG nova.network.neutron [req-020919ea-6f9b-468e-8c12-82d73044fc27 req-709e9eb7-09e0-4548-b7cd-05b1e557d903 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Refreshing network info cache for port 4a895f0e-6a61-40cd-a68c-3c5cb76daadb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:23:19 np0005603623 nova_compute[226235]: 2026-01-31 08:23:19.427 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 6.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:20 np0005603623 nova_compute[226235]: 2026-01-31 08:23:20.046 226239 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] resizing rbd image 78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:23:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:20.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:20 np0005603623 nova_compute[226235]: 2026-01-31 08:23:20.574 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:20.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:20 np0005603623 nova_compute[226235]: 2026-01-31 08:23:20.965 226239 INFO nova.virt.libvirt.driver [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Deleting instance files /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2_del#033[00m
Jan 31 03:23:20 np0005603623 nova_compute[226235]: 2026-01-31 08:23:20.966 226239 INFO nova.virt.libvirt.driver [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Deletion of /var/lib/nova/instances/7b830774-2315-410b-a3ed-585a1d0b6ee2_del complete#033[00m
Jan 31 03:23:21 np0005603623 nova_compute[226235]: 2026-01-31 08:23:21.133 226239 INFO nova.compute.manager [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Took 11.14 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:23:21 np0005603623 nova_compute[226235]: 2026-01-31 08:23:21.133 226239 DEBUG oslo.service.loopingcall [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:23:21 np0005603623 nova_compute[226235]: 2026-01-31 08:23:21.133 226239 DEBUG nova.compute.manager [-] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:23:21 np0005603623 nova_compute[226235]: 2026-01-31 08:23:21.134 226239 DEBUG nova.network.neutron [-] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:23:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.073 226239 DEBUG nova.objects.instance [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lazy-loading 'migration_context' on Instance uuid 78af9e5e-c5fb-4266-bcf6-4f99948aaa57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.105 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.105 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Ensure instance console log exists: /var/lib/nova/instances/78af9e5e-c5fb-4266-bcf6-4f99948aaa57/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.105 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.106 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.106 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.108 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Start _get_guest_xml network_info=[{"id": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "address": "fa:16:3e:44:27:7a", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a895f0e-6a", "ovs_interfaceid": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.113 226239 WARNING nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.120 226239 DEBUG nova.virt.libvirt.host [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.121 226239 DEBUG nova.virt.libvirt.host [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.125 226239 DEBUG nova.virt.libvirt.host [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.125 226239 DEBUG nova.virt.libvirt.host [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.126 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.126 226239 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.127 226239 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.127 226239 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.127 226239 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.127 226239 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.128 226239 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.128 226239 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.128 226239 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.128 226239 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.129 226239 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.129 226239 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.132 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:22.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:23:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2562737025' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.572 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.599 226239 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:22 np0005603623 nova_compute[226235]: 2026-01-31 08:23:22.603 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:22.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.042 226239 DEBUG nova.network.neutron [-] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.110 226239 INFO nova.compute.manager [-] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Took 1.98 seconds to deallocate network for instance.#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.203 226239 DEBUG oslo_concurrency.lockutils [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.204 226239 DEBUG oslo_concurrency.lockutils [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.209 226239 DEBUG nova.compute.manager [req-a8dbb52b-e433-472d-a8b6-bbf7d8dcacd2 req-a75f2f77-d877-41bb-8b1a-f6d9271261dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Received event network-vif-deleted-bb98b496-e57f-4e5a-bea6-fc68d9690077 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:23:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1409794339' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.232 226239 DEBUG nova.network.neutron [req-020919ea-6f9b-468e-8c12-82d73044fc27 req-709e9eb7-09e0-4548-b7cd-05b1e557d903 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Updated VIF entry in instance network info cache for port 4a895f0e-6a61-40cd-a68c-3c5cb76daadb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.233 226239 DEBUG nova.network.neutron [req-020919ea-6f9b-468e-8c12-82d73044fc27 req-709e9eb7-09e0-4548-b7cd-05b1e557d903 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Updating instance_info_cache with network_info: [{"id": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "address": "fa:16:3e:44:27:7a", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a895f0e-6a", "ovs_interfaceid": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.265 226239 DEBUG oslo_concurrency.lockutils [req-020919ea-6f9b-468e-8c12-82d73044fc27 req-709e9eb7-09e0-4548-b7cd-05b1e557d903 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-78af9e5e-c5fb-4266-bcf6-4f99948aaa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.320 226239 DEBUG oslo_concurrency.processutils [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.421 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.818s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.424 226239 DEBUG nova.virt.libvirt.vif [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:23:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1654578525',display_name='tempest-MultipleCreateTestJSON-server-1654578525-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1654578525-2',id=117,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96de645f38844180b404d1a7cf7dd460',ramdisk_id='',reservation_id='r-o4skbs7y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-174245429',owner_user_name='tempest-MultipleCre
ateTestJSON-174245429-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:23:12Z,user_data=None,user_id='c086a82bd0384612a78981006889df41',uuid=78af9e5e-c5fb-4266-bcf6-4f99948aaa57,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "address": "fa:16:3e:44:27:7a", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a895f0e-6a", "ovs_interfaceid": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.424 226239 DEBUG nova.network.os_vif_util [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converting VIF {"id": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "address": "fa:16:3e:44:27:7a", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a895f0e-6a", "ovs_interfaceid": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.425 226239 DEBUG nova.network.os_vif_util [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:27:7a,bridge_name='br-int',has_traffic_filtering=True,id=4a895f0e-6a61-40cd-a68c-3c5cb76daadb,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a895f0e-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.426 226239 DEBUG nova.objects.instance [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lazy-loading 'pci_devices' on Instance uuid 78af9e5e-c5fb-4266-bcf6-4f99948aaa57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.468 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  <uuid>78af9e5e-c5fb-4266-bcf6-4f99948aaa57</uuid>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  <name>instance-00000075</name>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <nova:name>tempest-MultipleCreateTestJSON-server-1654578525-2</nova:name>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:23:22</nova:creationTime>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <nova:user uuid="c086a82bd0384612a78981006889df41">tempest-MultipleCreateTestJSON-174245429-project-member</nova:user>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <nova:project uuid="96de645f38844180b404d1a7cf7dd460">tempest-MultipleCreateTestJSON-174245429</nova:project>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <nova:port uuid="4a895f0e-6a61-40cd-a68c-3c5cb76daadb">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <entry name="serial">78af9e5e-c5fb-4266-bcf6-4f99948aaa57</entry>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <entry name="uuid">78af9e5e-c5fb-4266-bcf6-4f99948aaa57</entry>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk.config">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:44:27:7a"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <target dev="tap4a895f0e-6a"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/78af9e5e-c5fb-4266-bcf6-4f99948aaa57/console.log" append="off"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:23:23 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:23:23 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:23:23 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:23:23 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.469 226239 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Preparing to wait for external event network-vif-plugged-4a895f0e-6a61-40cd-a68c-3c5cb76daadb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.470 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.470 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.471 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.471 226239 DEBUG nova.virt.libvirt.vif [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:23:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1654578525',display_name='tempest-MultipleCreateTestJSON-server-1654578525-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1654578525-2',id=117,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96de645f38844180b404d1a7cf7dd460',ramdisk_id='',reservation_id='r-o4skbs7y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-174245429',owner_user_name='tempest-M
ultipleCreateTestJSON-174245429-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:23:12Z,user_data=None,user_id='c086a82bd0384612a78981006889df41',uuid=78af9e5e-c5fb-4266-bcf6-4f99948aaa57,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "address": "fa:16:3e:44:27:7a", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a895f0e-6a", "ovs_interfaceid": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.472 226239 DEBUG nova.network.os_vif_util [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converting VIF {"id": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "address": "fa:16:3e:44:27:7a", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a895f0e-6a", "ovs_interfaceid": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.472 226239 DEBUG nova.network.os_vif_util [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:27:7a,bridge_name='br-int',has_traffic_filtering=True,id=4a895f0e-6a61-40cd-a68c-3c5cb76daadb,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a895f0e-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.473 226239 DEBUG os_vif [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:27:7a,bridge_name='br-int',has_traffic_filtering=True,id=4a895f0e-6a61-40cd-a68c-3c5cb76daadb,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a895f0e-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.473 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.474 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.474 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.476 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.476 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a895f0e-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.477 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4a895f0e-6a, col_values=(('external_ids', {'iface-id': '4a895f0e-6a61-40cd-a68c-3c5cb76daadb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:27:7a', 'vm-uuid': '78af9e5e-c5fb-4266-bcf6-4f99948aaa57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.478 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:23 np0005603623 NetworkManager[48970]: <info>  [1769847803.4803] manager: (tap4a895f0e-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.481 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.485 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.486 226239 INFO os_vif [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:27:7a,bridge_name='br-int',has_traffic_filtering=True,id=4a895f0e-6a61-40cd-a68c-3c5cb76daadb,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a895f0e-6a')#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.673 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.674 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.674 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] No VIF found with MAC fa:16:3e:44:27:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.675 226239 INFO nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Using config drive#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.703 226239 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1903389409' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.755 226239 DEBUG oslo_concurrency.processutils [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.759 226239 DEBUG nova.compute.provider_tree [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.795 226239 DEBUG nova.scheduler.client.report [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.842 226239 DEBUG oslo_concurrency.lockutils [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:23 np0005603623 nova_compute[226235]: 2026-01-31 08:23:23.934 226239 INFO nova.scheduler.client.report [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Deleted allocations for instance 7b830774-2315-410b-a3ed-585a1d0b6ee2#033[00m
Jan 31 03:23:24 np0005603623 nova_compute[226235]: 2026-01-31 08:23:24.057 226239 DEBUG oslo_concurrency.lockutils [None req-b158b1b2-5c11-4097-b4a7-771af081e735 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "7b830774-2315-410b-a3ed-585a1d0b6ee2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 14.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:24 np0005603623 nova_compute[226235]: 2026-01-31 08:23:24.265 226239 INFO nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Creating config drive at /var/lib/nova/instances/78af9e5e-c5fb-4266-bcf6-4f99948aaa57/disk.config#033[00m
Jan 31 03:23:24 np0005603623 nova_compute[226235]: 2026-01-31 08:23:24.269 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/78af9e5e-c5fb-4266-bcf6-4f99948aaa57/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6iqdtwr4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:24 np0005603623 nova_compute[226235]: 2026-01-31 08:23:24.308 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:24.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:24 np0005603623 nova_compute[226235]: 2026-01-31 08:23:24.389 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/78af9e5e-c5fb-4266-bcf6-4f99948aaa57/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6iqdtwr4" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:24 np0005603623 nova_compute[226235]: 2026-01-31 08:23:24.517 226239 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:24 np0005603623 nova_compute[226235]: 2026-01-31 08:23:24.521 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/78af9e5e-c5fb-4266-bcf6-4f99948aaa57/disk.config 78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:24.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.137 226239 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/78af9e5e-c5fb-4266-bcf6-4f99948aaa57/disk.config 78af9e5e-c5fb-4266-bcf6-4f99948aaa57_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.616s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.138 226239 INFO nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Deleting local config drive /var/lib/nova/instances/78af9e5e-c5fb-4266-bcf6-4f99948aaa57/disk.config because it was imported into RBD.#033[00m
Jan 31 03:23:25 np0005603623 kernel: tap4a895f0e-6a: entered promiscuous mode
Jan 31 03:23:25 np0005603623 NetworkManager[48970]: <info>  [1769847805.1836] manager: (tap4a895f0e-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/215)
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.184 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:25 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:25Z|00452|binding|INFO|Claiming lport 4a895f0e-6a61-40cd-a68c-3c5cb76daadb for this chassis.
Jan 31 03:23:25 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:25Z|00453|binding|INFO|4a895f0e-6a61-40cd-a68c-3c5cb76daadb: Claiming fa:16:3e:44:27:7a 10.100.0.4
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.186 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:25 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:25Z|00454|binding|INFO|Setting lport 4a895f0e-6a61-40cd-a68c-3c5cb76daadb ovn-installed in OVS
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.194 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:25 np0005603623 systemd-machined[194379]: New machine qemu-51-instance-00000075.
Jan 31 03:23:25 np0005603623 systemd[1]: Started Virtual Machine qemu-51-instance-00000075.
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.221 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847790.2213972, 7b830774-2315-410b-a3ed-585a1d0b6ee2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.222 226239 INFO nova.compute.manager [-] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:23:25 np0005603623 systemd-udevd[275657]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:23:25 np0005603623 NetworkManager[48970]: <info>  [1769847805.2368] device (tap4a895f0e-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:23:25 np0005603623 NetworkManager[48970]: <info>  [1769847805.2377] device (tap4a895f0e-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:23:25 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:25Z|00455|binding|INFO|Setting lport 4a895f0e-6a61-40cd-a68c-3c5cb76daadb up in Southbound
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.247 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:27:7a 10.100.0.4'], port_security=['fa:16:3e:44:27:7a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '78af9e5e-c5fb-4266-bcf6-4f99948aaa57', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd2feb18-e01d-4084-b50c-13511157dde4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96de645f38844180b404d1a7cf7dd460', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6f7abf9c-ddb4-47da-9619-41273b5c7231', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6f1841d-a97e-4124-981b-627c1dc4d00d, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=4a895f0e-6a61-40cd-a68c-3c5cb76daadb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.248 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 4a895f0e-6a61-40cd-a68c-3c5cb76daadb in datapath bd2feb18-e01d-4084-b50c-13511157dde4 bound to our chassis#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.250 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bd2feb18-e01d-4084-b50c-13511157dde4#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.257 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[24fcd286-ab87-48bd-87a5-17d8934c3683]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.258 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbd2feb18-e1 in ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.259 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbd2feb18-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.259 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ba8fcaf9-3c4e-49fd-9efa-b9e3e87d2ab6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.259 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3e283fa5-35be-474b-8bb4-e759693d4773]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.268 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[26508ea4-ea28-4c48-b9da-6601cc9d5fa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.279 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3619ee56-1474-43e7-8462-482f06b75238]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.306 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[2d8d9ebe-b23a-4e3f-8710-5cbc496bed99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.311 226239 DEBUG nova.compute.manager [None req-283e1b17-e221-4a18-b6c9-3c6aea04ba8f - - - - - -] [instance: 7b830774-2315-410b-a3ed-585a1d0b6ee2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:25 np0005603623 NetworkManager[48970]: <info>  [1769847805.3156] manager: (tapbd2feb18-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/216)
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.317 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f677140c-f148-44ee-b756-f5172ec02711]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.346 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3fa2f4-e30f-4644-b370-35fbfa2bb9ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.350 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e16aeb22-2974-40b4-b4da-76525216c40f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 NetworkManager[48970]: <info>  [1769847805.3710] device (tapbd2feb18-e0): carrier: link connected
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.378 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f2547667-2a10-42d1-8eea-ba427d2c168e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.392 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cbdc12f8-8912-44e2-9325-7252c5ee9887]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd2feb18-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:e8:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694580, 'reachable_time': 23269, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275690, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.413 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f9980624-8018-4e27-baba-696730897410]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:e8ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 694580, 'tstamp': 694580}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275691, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.426 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[72381ffb-0b09-4bed-ae38-127e6d58d490]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd2feb18-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:e8:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694580, 'reachable_time': 23269, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 275692, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.451 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[11974346-f860-4a92-a501-1318962a76ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.503 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[61ae4b38-2dc6-4e76-ba6f-988ac0d54d86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.505 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd2feb18-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.506 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.506 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd2feb18-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:25 np0005603623 NetworkManager[48970]: <info>  [1769847805.5093] manager: (tapbd2feb18-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Jan 31 03:23:25 np0005603623 kernel: tapbd2feb18-e0: entered promiscuous mode
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.510 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.511 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbd2feb18-e0, col_values=(('external_ids', {'iface-id': '7b95dd4c-16d2-4ff4-9598-b3ff910c3f1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:25 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:25Z|00456|binding|INFO|Releasing lport 7b95dd4c-16d2-4ff4-9598-b3ff910c3f1b from this chassis (sb_readonly=0)
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.512 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.513 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bd2feb18-e01d-4084-b50c-13511157dde4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bd2feb18-e01d-4084-b50c-13511157dde4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.514 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[94426033-63b4-4b7c-8190-5acb8c192b85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.515 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-bd2feb18-e01d-4084-b50c-13511157dde4
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/bd2feb18-e01d-4084-b50c-13511157dde4.pid.haproxy
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID bd2feb18-e01d-4084-b50c-13511157dde4
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:23:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:25.517 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'env', 'PROCESS_TAG=haproxy-bd2feb18-e01d-4084-b50c-13511157dde4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bd2feb18-e01d-4084-b50c-13511157dde4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.518 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.773 226239 DEBUG nova.compute.manager [req-d9a75a3a-2c09-45cf-9fb0-b6d15114001b req-856865fc-1a8c-474f-a36a-61560aae3a36 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Received event network-vif-plugged-4a895f0e-6a61-40cd-a68c-3c5cb76daadb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.774 226239 DEBUG oslo_concurrency.lockutils [req-d9a75a3a-2c09-45cf-9fb0-b6d15114001b req-856865fc-1a8c-474f-a36a-61560aae3a36 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.774 226239 DEBUG oslo_concurrency.lockutils [req-d9a75a3a-2c09-45cf-9fb0-b6d15114001b req-856865fc-1a8c-474f-a36a-61560aae3a36 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.774 226239 DEBUG oslo_concurrency.lockutils [req-d9a75a3a-2c09-45cf-9fb0-b6d15114001b req-856865fc-1a8c-474f-a36a-61560aae3a36 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.775 226239 DEBUG nova.compute.manager [req-d9a75a3a-2c09-45cf-9fb0-b6d15114001b req-856865fc-1a8c-474f-a36a-61560aae3a36 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Processing event network-vif-plugged-4a895f0e-6a61-40cd-a68c-3c5cb76daadb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:23:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e268 e268: 3 total, 3 up, 3 in
Jan 31 03:23:25 np0005603623 nova_compute[226235]: 2026-01-31 08:23:25.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:25 np0005603623 podman[275724]: 2026-01-31 08:23:25.882189492 +0000 UTC m=+0.028705391 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:23:26 np0005603623 podman[275724]: 2026-01-31 08:23:26.080086515 +0000 UTC m=+0.226602404 container create 4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 31 03:23:26 np0005603623 systemd[1]: Started libpod-conmon-4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733.scope.
Jan 31 03:23:26 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:23:26 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f845deeb83e8557fffb9030c5310c73733a44fd067af93dbc5c8c025970ebb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:23:26 np0005603623 podman[275724]: 2026-01-31 08:23:26.26173559 +0000 UTC m=+0.408251499 container init 4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:23:26 np0005603623 podman[275724]: 2026-01-31 08:23:26.268589275 +0000 UTC m=+0.415105154 container start 4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:23:26 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[275739]: [NOTICE]   (275743) : New worker (275745) forked
Jan 31 03:23:26 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[275739]: [NOTICE]   (275743) : Loading success.
Jan 31 03:23:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:26.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:26.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:26.763 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.765 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:26.766 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.875 226239 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.876 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847806.8765373, 78af9e5e-c5fb-4266-bcf6-4f99948aaa57 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.877 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] VM Started (Lifecycle Event)#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.879 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.882 226239 INFO nova.virt.libvirt.driver [-] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Instance spawned successfully.#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.883 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.921 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.922 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.922 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.922 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.923 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.923 226239 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.960 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:26 np0005603623 nova_compute[226235]: 2026-01-31 08:23:26.965 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.005 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.005 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847806.8789537, 78af9e5e-c5fb-4266-bcf6-4f99948aaa57 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.005 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.030 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.034 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847806.8793306, 78af9e5e-c5fb-4266-bcf6-4f99948aaa57 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.035 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.052 226239 INFO nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Took 13.87 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.052 226239 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e269 e269: 3 total, 3 up, 3 in
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.072 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.076 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.113 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.173 226239 INFO nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Took 15.72 seconds to build instance.#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.233 226239 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.932s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.997 226239 DEBUG nova.compute.manager [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Received event network-vif-plugged-4a895f0e-6a61-40cd-a68c-3c5cb76daadb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.998 226239 DEBUG oslo_concurrency.lockutils [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.998 226239 DEBUG oslo_concurrency.lockutils [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.998 226239 DEBUG oslo_concurrency.lockutils [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.998 226239 DEBUG nova.compute.manager [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] No waiting events found dispatching network-vif-plugged-4a895f0e-6a61-40cd-a68c-3c5cb76daadb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:27 np0005603623 nova_compute[226235]: 2026-01-31 08:23:27.998 226239 WARNING nova.compute.manager [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Received unexpected event network-vif-plugged-4a895f0e-6a61-40cd-a68c-3c5cb76daadb for instance with vm_state active and task_state None.#033[00m
Jan 31 03:23:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:28.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:28 np0005603623 nova_compute[226235]: 2026-01-31 08:23:28.480 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:23:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:28.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:23:29 np0005603623 nova_compute[226235]: 2026-01-31 08:23:29.309 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:30.116 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:30.116 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:30.117 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:30.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:30.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:31 np0005603623 radosgw[83781]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 31 03:23:31 np0005603623 nova_compute[226235]: 2026-01-31 08:23:31.499 226239 DEBUG oslo_concurrency.lockutils [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:31 np0005603623 nova_compute[226235]: 2026-01-31 08:23:31.499 226239 DEBUG oslo_concurrency.lockutils [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:31 np0005603623 nova_compute[226235]: 2026-01-31 08:23:31.500 226239 DEBUG oslo_concurrency.lockutils [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:31 np0005603623 nova_compute[226235]: 2026-01-31 08:23:31.500 226239 DEBUG oslo_concurrency.lockutils [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:31 np0005603623 nova_compute[226235]: 2026-01-31 08:23:31.500 226239 DEBUG oslo_concurrency.lockutils [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:31 np0005603623 nova_compute[226235]: 2026-01-31 08:23:31.501 226239 INFO nova.compute.manager [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Terminating instance#033[00m
Jan 31 03:23:31 np0005603623 nova_compute[226235]: 2026-01-31 08:23:31.502 226239 DEBUG nova.compute.manager [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:23:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:31 np0005603623 kernel: tap4a895f0e-6a (unregistering): left promiscuous mode
Jan 31 03:23:31 np0005603623 NetworkManager[48970]: <info>  [1769847811.8686] device (tap4a895f0e-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:23:31 np0005603623 nova_compute[226235]: 2026-01-31 08:23:31.873 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:31 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:31Z|00457|binding|INFO|Releasing lport 4a895f0e-6a61-40cd-a68c-3c5cb76daadb from this chassis (sb_readonly=0)
Jan 31 03:23:31 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:31Z|00458|binding|INFO|Setting lport 4a895f0e-6a61-40cd-a68c-3c5cb76daadb down in Southbound
Jan 31 03:23:31 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:31Z|00459|binding|INFO|Removing iface tap4a895f0e-6a ovn-installed in OVS
Jan 31 03:23:31 np0005603623 nova_compute[226235]: 2026-01-31 08:23:31.875 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:31 np0005603623 nova_compute[226235]: 2026-01-31 08:23:31.882 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:31.884 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:27:7a 10.100.0.4'], port_security=['fa:16:3e:44:27:7a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '78af9e5e-c5fb-4266-bcf6-4f99948aaa57', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd2feb18-e01d-4084-b50c-13511157dde4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96de645f38844180b404d1a7cf7dd460', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6f7abf9c-ddb4-47da-9619-41273b5c7231', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6f1841d-a97e-4124-981b-627c1dc4d00d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=4a895f0e-6a61-40cd-a68c-3c5cb76daadb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:23:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:31.885 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 4a895f0e-6a61-40cd-a68c-3c5cb76daadb in datapath bd2feb18-e01d-4084-b50c-13511157dde4 unbound from our chassis#033[00m
Jan 31 03:23:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:31.887 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bd2feb18-e01d-4084-b50c-13511157dde4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:23:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:31.887 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[318b4e3d-735d-4088-8f93-e04544a68a7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:31.888 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 namespace which is not needed anymore#033[00m
Jan 31 03:23:31 np0005603623 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000075.scope: Deactivated successfully.
Jan 31 03:23:31 np0005603623 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000075.scope: Consumed 6.226s CPU time.
Jan 31 03:23:31 np0005603623 systemd-machined[194379]: Machine qemu-51-instance-00000075 terminated.
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.117 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.120 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.131 226239 INFO nova.virt.libvirt.driver [-] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Instance destroyed successfully.#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.133 226239 DEBUG nova.objects.instance [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lazy-loading 'resources' on Instance uuid 78af9e5e-c5fb-4266-bcf6-4f99948aaa57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.155 226239 DEBUG nova.virt.libvirt.vif [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:23:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1654578525',display_name='tempest-MultipleCreateTestJSON-server-1654578525-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1654578525-2',id=117,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-31T08:23:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='96de645f38844180b404d1a7cf7dd460',ramdisk_id='',reservation_id='r-o4skbs7y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio
',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-174245429',owner_user_name='tempest-MultipleCreateTestJSON-174245429-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:23:27Z,user_data=None,user_id='c086a82bd0384612a78981006889df41',uuid=78af9e5e-c5fb-4266-bcf6-4f99948aaa57,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "address": "fa:16:3e:44:27:7a", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a895f0e-6a", "ovs_interfaceid": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.156 226239 DEBUG nova.network.os_vif_util [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converting VIF {"id": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "address": "fa:16:3e:44:27:7a", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a895f0e-6a", "ovs_interfaceid": "4a895f0e-6a61-40cd-a68c-3c5cb76daadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.157 226239 DEBUG nova.network.os_vif_util [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:27:7a,bridge_name='br-int',has_traffic_filtering=True,id=4a895f0e-6a61-40cd-a68c-3c5cb76daadb,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a895f0e-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.157 226239 DEBUG os_vif [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:27:7a,bridge_name='br-int',has_traffic_filtering=True,id=4a895f0e-6a61-40cd-a68c-3c5cb76daadb,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a895f0e-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.158 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.159 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a895f0e-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.160 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.163 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.166 226239 INFO os_vif [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:27:7a,bridge_name='br-int',has_traffic_filtering=True,id=4a895f0e-6a61-40cd-a68c-3c5cb76daadb,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a895f0e-6a')#033[00m
Jan 31 03:23:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000064s ======
Jan 31 03:23:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:32.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 31 03:23:32 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[275739]: [NOTICE]   (275743) : haproxy version is 2.8.14-c23fe91
Jan 31 03:23:32 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[275739]: [NOTICE]   (275743) : path to executable is /usr/sbin/haproxy
Jan 31 03:23:32 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[275739]: [WARNING]  (275743) : Exiting Master process...
Jan 31 03:23:32 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[275739]: [ALERT]    (275743) : Current worker (275745) exited with code 143 (Terminated)
Jan 31 03:23:32 np0005603623 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[275739]: [WARNING]  (275743) : All workers exited. Exiting... (0)
Jan 31 03:23:32 np0005603623 systemd[1]: libpod-4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733.scope: Deactivated successfully.
Jan 31 03:23:32 np0005603623 podman[275876]: 2026-01-31 08:23:32.363215872 +0000 UTC m=+0.412949275 container died 4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:23:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:32.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:32.769 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:32 np0005603623 nova_compute[226235]: 2026-01-31 08:23:32.895 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:33 np0005603623 systemd[1]: var-lib-containers-storage-overlay-53f845deeb83e8557fffb9030c5310c73733a44fd067af93dbc5c8c025970ebb-merged.mount: Deactivated successfully.
Jan 31 03:23:33 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733-userdata-shm.mount: Deactivated successfully.
Jan 31 03:23:33 np0005603623 nova_compute[226235]: 2026-01-31 08:23:33.130 226239 DEBUG nova.compute.manager [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 31 03:23:33 np0005603623 nova_compute[226235]: 2026-01-31 08:23:33.220 226239 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:33 np0005603623 nova_compute[226235]: 2026-01-31 08:23:33.220 226239 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:33 np0005603623 nova_compute[226235]: 2026-01-31 08:23:33.270 226239 DEBUG nova.objects.instance [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_requests' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:33 np0005603623 nova_compute[226235]: 2026-01-31 08:23:33.293 226239 DEBUG nova.virt.hardware [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:23:33 np0005603623 nova_compute[226235]: 2026-01-31 08:23:33.294 226239 INFO nova.compute.claims [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:23:33 np0005603623 nova_compute[226235]: 2026-01-31 08:23:33.295 226239 DEBUG nova.objects.instance [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'resources' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:33 np0005603623 nova_compute[226235]: 2026-01-31 08:23:33.315 226239 DEBUG nova.objects.instance [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:33 np0005603623 nova_compute[226235]: 2026-01-31 08:23:33.389 226239 INFO nova.compute.resource_tracker [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating resource usage from migration b1852fe9-e291-46e2-a9d1-96408d08f71a#033[00m
Jan 31 03:23:33 np0005603623 nova_compute[226235]: 2026-01-31 08:23:33.389 226239 DEBUG nova.compute.resource_tracker [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Starting to track incoming migration b1852fe9-e291-46e2-a9d1-96408d08f71a with flavor f75c4aee-d826-4343-a7e3-f06a4b21de52 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 31 03:23:33 np0005603623 podman[275876]: 2026-01-31 08:23:33.515725231 +0000 UTC m=+1.565458634 container cleanup 4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:23:33 np0005603623 systemd[1]: libpod-conmon-4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733.scope: Deactivated successfully.
Jan 31 03:23:33 np0005603623 nova_compute[226235]: 2026-01-31 08:23:33.606 226239 DEBUG oslo_concurrency.processutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3490916425' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:34 np0005603623 nova_compute[226235]: 2026-01-31 08:23:34.115 226239 DEBUG oslo_concurrency.processutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:34 np0005603623 nova_compute[226235]: 2026-01-31 08:23:34.121 226239 DEBUG nova.compute.provider_tree [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:23:34 np0005603623 nova_compute[226235]: 2026-01-31 08:23:34.138 226239 DEBUG nova.scheduler.client.report [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:23:34 np0005603623 nova_compute[226235]: 2026-01-31 08:23:34.160 226239 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.939s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:34 np0005603623 nova_compute[226235]: 2026-01-31 08:23:34.160 226239 INFO nova.compute.manager [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Migrating#033[00m
Jan 31 03:23:34 np0005603623 podman[275936]: 2026-01-31 08:23:34.186625404 +0000 UTC m=+0.655043456 container remove 4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:23:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:34.191 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7730b793-383b-43ab-a4c1-fa0b445cbbfc]: (4, ('Sat Jan 31 08:23:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 (4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733)\n4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733\nSat Jan 31 08:23:33 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 (4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733)\n4ed19612dee821ef1abec824ecfe9ff7185a833ffed292041c186ae480bd3733\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:34.193 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[492f1c9a-4ea9-46da-a5a1-37672bc963cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:34.194 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd2feb18-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:34 np0005603623 nova_compute[226235]: 2026-01-31 08:23:34.241 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:34 np0005603623 kernel: tapbd2feb18-e0: left promiscuous mode
Jan 31 03:23:34 np0005603623 nova_compute[226235]: 2026-01-31 08:23:34.247 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:34.249 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3e070761-e338-4e41-8830-7d72762f5fd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:34.264 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d438d86e-c45e-466a-a46a-a5711b469359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:34.266 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9e0dbff8-c622-4ab1-9de9-4ed0f4d71d51]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:34.277 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2508f43c-89ff-45e2-86bc-df202c973ec6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694573, 'reachable_time': 42459, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275974, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:34 np0005603623 systemd[1]: run-netns-ovnmeta\x2dbd2feb18\x2de01d\x2d4084\x2db50c\x2d13511157dde4.mount: Deactivated successfully.
Jan 31 03:23:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:34.281 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:23:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:34.281 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[e0e1736c-82e7-4bee-b329-565f4422ee4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:34 np0005603623 nova_compute[226235]: 2026-01-31 08:23:34.312 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:34.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:34.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e270 e270: 3 total, 3 up, 3 in
Jan 31 03:23:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:36.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:36.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:36 np0005603623 systemd[1]: Created slice User Slice of UID 42436.
Jan 31 03:23:36 np0005603623 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 31 03:23:36 np0005603623 systemd-logind[795]: New session 54 of user nova.
Jan 31 03:23:36 np0005603623 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 31 03:23:36 np0005603623 systemd[1]: Starting User Manager for UID 42436...
Jan 31 03:23:36 np0005603623 systemd[275982]: Queued start job for default target Main User Target.
Jan 31 03:23:36 np0005603623 systemd[275982]: Created slice User Application Slice.
Jan 31 03:23:36 np0005603623 systemd[275982]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:23:36 np0005603623 systemd[275982]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 03:23:36 np0005603623 systemd[275982]: Reached target Paths.
Jan 31 03:23:36 np0005603623 systemd[275982]: Reached target Timers.
Jan 31 03:23:36 np0005603623 systemd[275982]: Starting D-Bus User Message Bus Socket...
Jan 31 03:23:36 np0005603623 systemd[275982]: Starting Create User's Volatile Files and Directories...
Jan 31 03:23:36 np0005603623 systemd[275982]: Listening on D-Bus User Message Bus Socket.
Jan 31 03:23:36 np0005603623 systemd[275982]: Reached target Sockets.
Jan 31 03:23:36 np0005603623 systemd[275982]: Finished Create User's Volatile Files and Directories.
Jan 31 03:23:36 np0005603623 systemd[275982]: Reached target Basic System.
Jan 31 03:23:36 np0005603623 systemd[275982]: Reached target Main User Target.
Jan 31 03:23:36 np0005603623 systemd[275982]: Startup finished in 219ms.
Jan 31 03:23:36 np0005603623 systemd[1]: Started User Manager for UID 42436.
Jan 31 03:23:36 np0005603623 systemd[1]: Started Session 54 of User nova.
Jan 31 03:23:37 np0005603623 systemd[1]: session-54.scope: Deactivated successfully.
Jan 31 03:23:37 np0005603623 systemd-logind[795]: Session 54 logged out. Waiting for processes to exit.
Jan 31 03:23:37 np0005603623 systemd-logind[795]: Removed session 54.
Jan 31 03:23:37 np0005603623 nova_compute[226235]: 2026-01-31 08:23:37.161 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:37 np0005603623 systemd-logind[795]: New session 56 of user nova.
Jan 31 03:23:37 np0005603623 systemd[1]: Started Session 56 of User nova.
Jan 31 03:23:37 np0005603623 systemd[1]: session-56.scope: Deactivated successfully.
Jan 31 03:23:37 np0005603623 systemd-logind[795]: Session 56 logged out. Waiting for processes to exit.
Jan 31 03:23:37 np0005603623 systemd-logind[795]: Removed session 56.
Jan 31 03:23:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:23:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:38.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:23:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:38.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.313 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.551 226239 DEBUG nova.compute.manager [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Received event network-vif-unplugged-4a895f0e-6a61-40cd-a68c-3c5cb76daadb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.551 226239 DEBUG oslo_concurrency.lockutils [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.551 226239 DEBUG oslo_concurrency.lockutils [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.552 226239 DEBUG oslo_concurrency.lockutils [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.552 226239 DEBUG nova.compute.manager [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] No waiting events found dispatching network-vif-unplugged-4a895f0e-6a61-40cd-a68c-3c5cb76daadb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.552 226239 DEBUG nova.compute.manager [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Received event network-vif-unplugged-4a895f0e-6a61-40cd-a68c-3c5cb76daadb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.552 226239 DEBUG nova.compute.manager [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Received event network-vif-plugged-4a895f0e-6a61-40cd-a68c-3c5cb76daadb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.552 226239 DEBUG oslo_concurrency.lockutils [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.552 226239 DEBUG oslo_concurrency.lockutils [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.553 226239 DEBUG oslo_concurrency.lockutils [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.553 226239 DEBUG nova.compute.manager [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] No waiting events found dispatching network-vif-plugged-4a895f0e-6a61-40cd-a68c-3c5cb76daadb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:39 np0005603623 nova_compute[226235]: 2026-01-31 08:23:39.553 226239 WARNING nova.compute.manager [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Received unexpected event network-vif-plugged-4a895f0e-6a61-40cd-a68c-3c5cb76daadb for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:23:40 np0005603623 nova_compute[226235]: 2026-01-31 08:23:40.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:40 np0005603623 nova_compute[226235]: 2026-01-31 08:23:40.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:23:40 np0005603623 nova_compute[226235]: 2026-01-31 08:23:40.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:23:40 np0005603623 nova_compute[226235]: 2026-01-31 08:23:40.202 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 03:23:40 np0005603623 nova_compute[226235]: 2026-01-31 08:23:40.202 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:23:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:40.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:40.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:41 np0005603623 nova_compute[226235]: 2026-01-31 08:23:41.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:41 np0005603623 nova_compute[226235]: 2026-01-31 08:23:41.211 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:41 np0005603623 nova_compute[226235]: 2026-01-31 08:23:41.212 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:41 np0005603623 nova_compute[226235]: 2026-01-31 08:23:41.212 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:41 np0005603623 nova_compute[226235]: 2026-01-31 08:23:41.212 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:23:41 np0005603623 nova_compute[226235]: 2026-01-31 08:23:41.212 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:41 np0005603623 nova_compute[226235]: 2026-01-31 08:23:41.565 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:41 np0005603623 nova_compute[226235]: 2026-01-31 08:23:41.566 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:41 np0005603623 nova_compute[226235]: 2026-01-31 08:23:41.614 226239 DEBUG nova.compute.manager [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:23:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/402229396' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:41 np0005603623 nova_compute[226235]: 2026-01-31 08:23:41.663 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:41 np0005603623 podman[276419]: 2026-01-31 08:23:41.871806223 +0000 UTC m=+0.016998614 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 03:23:42 np0005603623 podman[276419]: 2026-01-31 08:23:42.009768317 +0000 UTC m=+0.154960698 container create 16095d6fb5646b277cc3873ef476d91011a80b72b6880201663dbb8a1f30ba09 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_raman, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 03:23:42 np0005603623 nova_compute[226235]: 2026-01-31 08:23:42.164 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:42 np0005603623 nova_compute[226235]: 2026-01-31 08:23:42.277 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:42 np0005603623 nova_compute[226235]: 2026-01-31 08:23:42.277 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:42 np0005603623 nova_compute[226235]: 2026-01-31 08:23:42.283 226239 DEBUG nova.virt.hardware [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:23:42 np0005603623 nova_compute[226235]: 2026-01-31 08:23:42.283 226239 INFO nova.compute.claims [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:23:42 np0005603623 systemd[1]: Started libpod-conmon-16095d6fb5646b277cc3873ef476d91011a80b72b6880201663dbb8a1f30ba09.scope.
Jan 31 03:23:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:42 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:23:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:42.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:42 np0005603623 podman[276419]: 2026-01-31 08:23:42.556890379 +0000 UTC m=+0.702082770 container init 16095d6fb5646b277cc3873ef476d91011a80b72b6880201663dbb8a1f30ba09 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_raman, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True)
Jan 31 03:23:42 np0005603623 podman[276419]: 2026-01-31 08:23:42.564319232 +0000 UTC m=+0.709511603 container start 16095d6fb5646b277cc3873ef476d91011a80b72b6880201663dbb8a1f30ba09 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_raman, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Jan 31 03:23:42 np0005603623 determined_raman[276435]: 167 167
Jan 31 03:23:42 np0005603623 systemd[1]: libpod-16095d6fb5646b277cc3873ef476d91011a80b72b6880201663dbb8a1f30ba09.scope: Deactivated successfully.
Jan 31 03:23:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:42.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:42 np0005603623 podman[276419]: 2026-01-31 08:23:42.904187357 +0000 UTC m=+1.049379758 container attach 16095d6fb5646b277cc3873ef476d91011a80b72b6880201663dbb8a1f30ba09 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_raman, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True)
Jan 31 03:23:42 np0005603623 podman[276419]: 2026-01-31 08:23:42.904779885 +0000 UTC m=+1.049972266 container died 16095d6fb5646b277cc3873ef476d91011a80b72b6880201663dbb8a1f30ba09 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_raman, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 31 03:23:43 np0005603623 nova_compute[226235]: 2026-01-31 08:23:43.264 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000075 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:23:43 np0005603623 nova_compute[226235]: 2026-01-31 08:23:43.265 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000075 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:23:43 np0005603623 nova_compute[226235]: 2026-01-31 08:23:43.285 226239 INFO nova.virt.libvirt.driver [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Deleting instance files /var/lib/nova/instances/78af9e5e-c5fb-4266-bcf6-4f99948aaa57_del#033[00m
Jan 31 03:23:43 np0005603623 nova_compute[226235]: 2026-01-31 08:23:43.285 226239 INFO nova.virt.libvirt.driver [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Deletion of /var/lib/nova/instances/78af9e5e-c5fb-4266-bcf6-4f99948aaa57_del complete#033[00m
Jan 31 03:23:43 np0005603623 nova_compute[226235]: 2026-01-31 08:23:43.385 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:23:43 np0005603623 nova_compute[226235]: 2026-01-31 08:23:43.386 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4436MB free_disk=20.809818267822266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:23:43 np0005603623 nova_compute[226235]: 2026-01-31 08:23:43.386 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:43 np0005603623 nova_compute[226235]: 2026-01-31 08:23:43.565 226239 INFO nova.compute.manager [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Took 12.06 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:23:43 np0005603623 nova_compute[226235]: 2026-01-31 08:23:43.565 226239 DEBUG oslo.service.loopingcall [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:23:43 np0005603623 nova_compute[226235]: 2026-01-31 08:23:43.566 226239 DEBUG nova.compute.manager [-] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:23:43 np0005603623 nova_compute[226235]: 2026-01-31 08:23:43.566 226239 DEBUG nova.network.neutron [-] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:23:43 np0005603623 nova_compute[226235]: 2026-01-31 08:23:43.684 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:43 np0005603623 systemd[1]: var-lib-containers-storage-overlay-5258bd36b7e0db6ef50c412c57bded747976f5ffcc7caf57e94057dfba3d647f-merged.mount: Deactivated successfully.
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.316 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/442079960' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:44.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.365 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.681s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.369 226239 DEBUG nova.compute.provider_tree [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:23:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:44.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.652 226239 DEBUG nova.scheduler.client.report [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.711 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.712 226239 DEBUG nova.compute.manager [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.716 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 1.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.812 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Applying migration context for instance d0c13002-57d9-4fad-8579-7343af29719d as it has an incoming, in-progress migration b1852fe9-e291-46e2-a9d1-96408d08f71a. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.813 226239 INFO nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating resource usage from migration b1852fe9-e291-46e2-a9d1-96408d08f71a#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.840 226239 DEBUG nova.compute.manager [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.841 226239 DEBUG nova.network.neutron [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:23:44 np0005603623 podman[276419]: 2026-01-31 08:23:44.881343728 +0000 UTC m=+3.026536139 container remove 16095d6fb5646b277cc3873ef476d91011a80b72b6880201663dbb8a1f30ba09 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_raman, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.927 226239 INFO nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.933 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 78af9e5e-c5fb-4266-bcf6-4f99948aaa57 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.933 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance d0c13002-57d9-4fad-8579-7343af29719d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.934 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance f3b36b5b-968c-4775-ac4f-93efc36f40ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.934 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:23:44 np0005603623 nova_compute[226235]: 2026-01-31 08:23:44.935 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:23:44 np0005603623 systemd[1]: libpod-conmon-16095d6fb5646b277cc3873ef476d91011a80b72b6880201663dbb8a1f30ba09.scope: Deactivated successfully.
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.085 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:45 np0005603623 podman[276483]: 2026-01-31 08:23:45.003542259 +0000 UTC m=+0.029433565 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.114 226239 DEBUG nova.compute.manager [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.223 226239 INFO nova.network.neutron [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating port cc59ad05-3242-4d5f-8eec-a2480d285193 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.287 226239 DEBUG nova.policy [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '432ac8867d8240408db455fc25bb5901', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '491937de020742d7b4e847dc3bf57950', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.317 226239 DEBUG nova.compute.manager [req-5f7604b3-7495-47f8-b6af-6a877f1e277e req-58546758-06cc-4e89-a3ec-96bc07040415 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.318 226239 DEBUG oslo_concurrency.lockutils [req-5f7604b3-7495-47f8-b6af-6a877f1e277e req-58546758-06cc-4e89-a3ec-96bc07040415 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.318 226239 DEBUG oslo_concurrency.lockutils [req-5f7604b3-7495-47f8-b6af-6a877f1e277e req-58546758-06cc-4e89-a3ec-96bc07040415 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.319 226239 DEBUG oslo_concurrency.lockutils [req-5f7604b3-7495-47f8-b6af-6a877f1e277e req-58546758-06cc-4e89-a3ec-96bc07040415 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.319 226239 DEBUG nova.compute.manager [req-5f7604b3-7495-47f8-b6af-6a877f1e277e req-58546758-06cc-4e89-a3ec-96bc07040415 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.319 226239 WARNING nova.compute.manager [req-5f7604b3-7495-47f8-b6af-6a877f1e277e req-58546758-06cc-4e89-a3ec-96bc07040415 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:23:45 np0005603623 podman[276483]: 2026-01-31 08:23:45.325198432 +0000 UTC m=+0.351089648 container create dbd14132c120382bfafd6b0fc7c1edc71868929f0f37c10b74f451476b2ed979 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_almeida, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.506 226239 DEBUG nova.compute.manager [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.508 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:23:45 np0005603623 nova_compute[226235]: 2026-01-31 08:23:45.508 226239 INFO nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Creating image(s)#033[00m
Jan 31 03:23:45 np0005603623 systemd[1]: Started libpod-conmon-dbd14132c120382bfafd6b0fc7c1edc71868929f0f37c10b74f451476b2ed979.scope.
Jan 31 03:23:45 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:23:45 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:23:45 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:23:45 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/201bc919d1565d99da95b7cdd55a63ef320fc076dcf32e12ccbc8daa9fcf757b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 03:23:45 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/201bc919d1565d99da95b7cdd55a63ef320fc076dcf32e12ccbc8daa9fcf757b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 03:23:45 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/201bc919d1565d99da95b7cdd55a63ef320fc076dcf32e12ccbc8daa9fcf757b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 03:23:45 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/201bc919d1565d99da95b7cdd55a63ef320fc076dcf32e12ccbc8daa9fcf757b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 03:23:45 np0005603623 podman[276509]: 2026-01-31 08:23:45.781960061 +0000 UTC m=+0.424847989 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:23:46 np0005603623 podman[276483]: 2026-01-31 08:23:46.241908969 +0000 UTC m=+1.267800195 container init dbd14132c120382bfafd6b0fc7c1edc71868929f0f37c10b74f451476b2ed979 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_almeida, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default)
Jan 31 03:23:46 np0005603623 podman[276483]: 2026-01-31 08:23:46.249226348 +0000 UTC m=+1.275117554 container start dbd14132c120382bfafd6b0fc7c1edc71868929f0f37c10b74f451476b2ed979 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_almeida, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 03:23:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:46.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:46.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:46 np0005603623 podman[276483]: 2026-01-31 08:23:46.689847012 +0000 UTC m=+1.715738218 container attach dbd14132c120382bfafd6b0fc7c1edc71868929f0f37c10b74f451476b2ed979 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_almeida, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Jan 31 03:23:46 np0005603623 podman[276508]: 2026-01-31 08:23:46.73379102 +0000 UTC m=+1.376678478 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:23:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.771 226239 DEBUG nova.storage.rbd_utils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.796 226239 DEBUG nova.storage.rbd_utils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.820 226239 DEBUG nova.storage.rbd_utils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.824 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.844 226239 DEBUG nova.compute.manager [req-c3d75999-1ffc-4771-a720-76cc8445ebb7 req-02791dc7-93cb-4cfd-978d-2c99041a0c04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Received event network-vif-deleted-4a895f0e-6a61-40cd-a68c-3c5cb76daadb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.845 226239 INFO nova.compute.manager [req-c3d75999-1ffc-4771-a720-76cc8445ebb7 req-02791dc7-93cb-4cfd-978d-2c99041a0c04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Neutron deleted interface 4a895f0e-6a61-40cd-a68c-3c5cb76daadb; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.845 226239 DEBUG nova.network.neutron [req-c3d75999-1ffc-4771-a720-76cc8445ebb7 req-02791dc7-93cb-4cfd-978d-2c99041a0c04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.870 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.871 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.872 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.872 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.901 226239 DEBUG nova.storage.rbd_utils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.905 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.924 226239 DEBUG nova.network.neutron [-] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.949 226239 DEBUG nova.compute.manager [req-c3d75999-1ffc-4771-a720-76cc8445ebb7 req-02791dc7-93cb-4cfd-978d-2c99041a0c04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Detach interface failed, port_id=4a895f0e-6a61-40cd-a68c-3c5cb76daadb, reason: Instance 78af9e5e-c5fb-4266-bcf6-4f99948aaa57 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:23:46 np0005603623 nova_compute[226235]: 2026-01-31 08:23:46.969 226239 INFO nova.compute.manager [-] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Took 3.40 seconds to deallocate network for instance.#033[00m
Jan 31 03:23:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/235672614' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.098 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.105 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.131 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847812.1300144, 78af9e5e-c5fb-4266-bcf6-4f99948aaa57 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.132 226239 INFO nova.compute.manager [-] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.147 226239 DEBUG oslo_concurrency.lockutils [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.165 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.209 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.264 226239 DEBUG nova.compute.manager [None req-35258553-d495-4b9c-a603-49834249891b - - - - - -] [instance: 78af9e5e-c5fb-4266-bcf6-4f99948aaa57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.284 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.285 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.285 226239 DEBUG oslo_concurrency.lockutils [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:47 np0005603623 magical_almeida[276544]: [
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:    {
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:        "available": false,
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:        "ceph_device": false,
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:        "lsm_data": {},
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:        "lvs": [],
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:        "path": "/dev/sr0",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:        "rejected_reasons": [
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "Insufficient space (<5GB)",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "Has a FileSystem"
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:        ],
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:        "sys_api": {
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "actuators": null,
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "device_nodes": "sr0",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "devname": "sr0",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "human_readable_size": "482.00 KB",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "id_bus": "ata",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "model": "QEMU DVD-ROM",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "nr_requests": "2",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "parent": "/dev/sr0",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "partitions": {},
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "path": "/dev/sr0",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "removable": "1",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "rev": "2.5+",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "ro": "0",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "rotational": "1",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "sas_address": "",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "sas_device_handle": "",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "scheduler_mode": "mq-deadline",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "sectors": 0,
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "sectorsize": "2048",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "size": 493568.0,
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "support_discard": "2048",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "type": "disk",
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:            "vendor": "QEMU"
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:        }
Jan 31 03:23:47 np0005603623 magical_almeida[276544]:    }
Jan 31 03:23:47 np0005603623 magical_almeida[276544]: ]
Jan 31 03:23:47 np0005603623 systemd[1]: libpod-dbd14132c120382bfafd6b0fc7c1edc71868929f0f37c10b74f451476b2ed979.scope: Deactivated successfully.
Jan 31 03:23:47 np0005603623 podman[276483]: 2026-01-31 08:23:47.348968824 +0000 UTC m=+2.374860020 container died dbd14132c120382bfafd6b0fc7c1edc71868929f0f37c10b74f451476b2ed979 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_almeida, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 31 03:23:47 np0005603623 systemd[1]: Stopping User Manager for UID 42436...
Jan 31 03:23:47 np0005603623 systemd[275982]: Activating special unit Exit the Session...
Jan 31 03:23:47 np0005603623 systemd[275982]: Stopped target Main User Target.
Jan 31 03:23:47 np0005603623 systemd[275982]: Stopped target Basic System.
Jan 31 03:23:47 np0005603623 systemd[275982]: Stopped target Paths.
Jan 31 03:23:47 np0005603623 systemd[275982]: Stopped target Sockets.
Jan 31 03:23:47 np0005603623 systemd[275982]: Stopped target Timers.
Jan 31 03:23:47 np0005603623 systemd[275982]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:23:47 np0005603623 systemd[275982]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 31 03:23:47 np0005603623 systemd[275982]: Closed D-Bus User Message Bus Socket.
Jan 31 03:23:47 np0005603623 systemd[275982]: Stopped Create User's Volatile Files and Directories.
Jan 31 03:23:47 np0005603623 systemd[275982]: Removed slice User Application Slice.
Jan 31 03:23:47 np0005603623 systemd[275982]: Reached target Shutdown.
Jan 31 03:23:47 np0005603623 systemd[275982]: Finished Exit the Session.
Jan 31 03:23:47 np0005603623 systemd[275982]: Reached target Exit the Session.
Jan 31 03:23:47 np0005603623 systemd[1]: user@42436.service: Deactivated successfully.
Jan 31 03:23:47 np0005603623 systemd[1]: Stopped User Manager for UID 42436.
Jan 31 03:23:47 np0005603623 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 31 03:23:47 np0005603623 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 31 03:23:47 np0005603623 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 31 03:23:47 np0005603623 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 31 03:23:47 np0005603623 systemd[1]: Removed slice User Slice of UID 42436.
Jan 31 03:23:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.412 226239 DEBUG oslo_concurrency.processutils [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.599 226239 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.601 226239 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.602 226239 DEBUG nova.network.neutron [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.910 226239 DEBUG nova.compute.manager [req-e310c75d-60f6-4276-af92-8288f9238ad8 req-3c77d365-f955-4457-9afc-27157b0a6f84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.911 226239 DEBUG oslo_concurrency.lockutils [req-e310c75d-60f6-4276-af92-8288f9238ad8 req-3c77d365-f955-4457-9afc-27157b0a6f84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.912 226239 DEBUG oslo_concurrency.lockutils [req-e310c75d-60f6-4276-af92-8288f9238ad8 req-3c77d365-f955-4457-9afc-27157b0a6f84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.912 226239 DEBUG oslo_concurrency.lockutils [req-e310c75d-60f6-4276-af92-8288f9238ad8 req-3c77d365-f955-4457-9afc-27157b0a6f84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.912 226239 DEBUG nova.compute.manager [req-e310c75d-60f6-4276-af92-8288f9238ad8 req-3c77d365-f955-4457-9afc-27157b0a6f84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.912 226239 WARNING nova.compute.manager [req-e310c75d-60f6-4276-af92-8288f9238ad8 req-3c77d365-f955-4457-9afc-27157b0a6f84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.918 226239 DEBUG nova.compute.manager [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-changed-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.919 226239 DEBUG nova.compute.manager [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Refreshing instance network info cache due to event network-changed-cc59ad05-3242-4d5f-8eec-a2480d285193. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:23:47 np0005603623 nova_compute[226235]: 2026-01-31 08:23:47.919 226239 DEBUG oslo_concurrency.lockutils [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:23:47 np0005603623 systemd[1]: var-lib-containers-storage-overlay-201bc919d1565d99da95b7cdd55a63ef320fc076dcf32e12ccbc8daa9fcf757b-merged.mount: Deactivated successfully.
Jan 31 03:23:48 np0005603623 nova_compute[226235]: 2026-01-31 08:23:48.154 226239 DEBUG oslo_concurrency.processutils [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.742s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:48 np0005603623 nova_compute[226235]: 2026-01-31 08:23:48.162 226239 DEBUG nova.compute.provider_tree [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:23:48 np0005603623 nova_compute[226235]: 2026-01-31 08:23:48.288 226239 DEBUG nova.scheduler.client.report [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:23:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:48.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:48 np0005603623 nova_compute[226235]: 2026-01-31 08:23:48.368 226239 DEBUG oslo_concurrency.lockutils [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:48 np0005603623 nova_compute[226235]: 2026-01-31 08:23:48.427 226239 INFO nova.scheduler.client.report [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Deleted allocations for instance 78af9e5e-c5fb-4266-bcf6-4f99948aaa57#033[00m
Jan 31 03:23:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:48.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:48 np0005603623 nova_compute[226235]: 2026-01-31 08:23:48.649 226239 DEBUG nova.network.neutron [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Successfully created port: 19a4f194-5514-4b8e-b635-e6fe0255dc1a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:23:48 np0005603623 podman[276483]: 2026-01-31 08:23:48.678465162 +0000 UTC m=+3.704356368 container remove dbd14132c120382bfafd6b0fc7c1edc71868929f0f37c10b74f451476b2ed979 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_almeida, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Jan 31 03:23:48 np0005603623 systemd[1]: libpod-conmon-dbd14132c120382bfafd6b0fc7c1edc71868929f0f37c10b74f451476b2ed979.scope: Deactivated successfully.
Jan 31 03:23:48 np0005603623 nova_compute[226235]: 2026-01-31 08:23:48.777 226239 DEBUG oslo_concurrency.lockutils [None req-4f0daf8a-9d89-4eeb-9fc3-9d4e86517bbf c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "78af9e5e-c5fb-4266-bcf6-4f99948aaa57" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 17.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:49 np0005603623 nova_compute[226235]: 2026-01-31 08:23:49.286 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:49 np0005603623 nova_compute[226235]: 2026-01-31 08:23:49.286 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:49 np0005603623 nova_compute[226235]: 2026-01-31 08:23:49.287 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:49 np0005603623 nova_compute[226235]: 2026-01-31 08:23:49.287 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:49 np0005603623 nova_compute[226235]: 2026-01-31 08:23:49.287 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:49 np0005603623 nova_compute[226235]: 2026-01-31 08:23:49.316 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:49 np0005603623 nova_compute[226235]: 2026-01-31 08:23:49.403 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:49 np0005603623 nova_compute[226235]: 2026-01-31 08:23:49.746 226239 DEBUG nova.storage.rbd_utils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] resizing rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:23:49 np0005603623 nova_compute[226235]: 2026-01-31 08:23:49.869 226239 DEBUG nova.objects.instance [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'migration_context' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:50 np0005603623 nova_compute[226235]: 2026-01-31 08:23:50.225 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:23:50 np0005603623 nova_compute[226235]: 2026-01-31 08:23:50.226 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Ensure instance console log exists: /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:23:50 np0005603623 nova_compute[226235]: 2026-01-31 08:23:50.226 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:50 np0005603623 nova_compute[226235]: 2026-01-31 08:23:50.226 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:50 np0005603623 nova_compute[226235]: 2026-01-31 08:23:50.227 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:50.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:50 np0005603623 nova_compute[226235]: 2026-01-31 08:23:50.446 226239 DEBUG nova.network.neutron [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:50.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:50 np0005603623 nova_compute[226235]: 2026-01-31 08:23:50.817 226239 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:23:50 np0005603623 nova_compute[226235]: 2026-01-31 08:23:50.822 226239 DEBUG oslo_concurrency.lockutils [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:23:50 np0005603623 nova_compute[226235]: 2026-01-31 08:23:50.823 226239 DEBUG nova.network.neutron [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Refreshing network info cache for port cc59ad05-3242-4d5f-8eec-a2480d285193 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:23:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:23:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2374750131' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:51 np0005603623 nova_compute[226235]: 2026-01-31 08:23:51.065 226239 DEBUG nova.network.neutron [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Successfully updated port: 19a4f194-5514-4b8e-b635-e6fe0255dc1a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:23:51 np0005603623 nova_compute[226235]: 2026-01-31 08:23:51.470 226239 DEBUG nova.compute.manager [req-fe67122c-17b7-415e-8d01-9bb2b0abd0b0 req-061881ad-bd5f-4782-a1fb-634d088713cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-changed-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:51 np0005603623 nova_compute[226235]: 2026-01-31 08:23:51.470 226239 DEBUG nova.compute.manager [req-fe67122c-17b7-415e-8d01-9bb2b0abd0b0 req-061881ad-bd5f-4782-a1fb-634d088713cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Refreshing instance network info cache due to event network-changed-19a4f194-5514-4b8e-b635-e6fe0255dc1a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:23:51 np0005603623 nova_compute[226235]: 2026-01-31 08:23:51.470 226239 DEBUG oslo_concurrency.lockutils [req-fe67122c-17b7-415e-8d01-9bb2b0abd0b0 req-061881ad-bd5f-4782-a1fb-634d088713cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:23:51 np0005603623 nova_compute[226235]: 2026-01-31 08:23:51.471 226239 DEBUG oslo_concurrency.lockutils [req-fe67122c-17b7-415e-8d01-9bb2b0abd0b0 req-061881ad-bd5f-4782-a1fb-634d088713cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:23:51 np0005603623 nova_compute[226235]: 2026-01-31 08:23:51.471 226239 DEBUG nova.network.neutron [req-fe67122c-17b7-415e-8d01-9bb2b0abd0b0 req-061881ad-bd5f-4782-a1fb-634d088713cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Refreshing network info cache for port 19a4f194-5514-4b8e-b635-e6fe0255dc1a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:23:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:52 np0005603623 nova_compute[226235]: 2026-01-31 08:23:52.167 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:23:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:52.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:52 np0005603623 nova_compute[226235]: 2026-01-31 08:23:52.639 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:23:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:52.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:52 np0005603623 nova_compute[226235]: 2026-01-31 08:23:52.667 226239 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 31 03:23:52 np0005603623 nova_compute[226235]: 2026-01-31 08:23:52.668 226239 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:23:52 np0005603623 nova_compute[226235]: 2026-01-31 08:23:52.669 226239 INFO nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Creating image(s)#033[00m
Jan 31 03:23:52 np0005603623 nova_compute[226235]: 2026-01-31 08:23:52.714 226239 DEBUG nova.storage.rbd_utils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] creating snapshot(nova-resize) on rbd image(d0c13002-57d9-4fad-8579-7343af29719d_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:23:53 np0005603623 nova_compute[226235]: 2026-01-31 08:23:53.055 226239 DEBUG nova.network.neutron [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updated VIF entry in instance network info cache for port cc59ad05-3242-4d5f-8eec-a2480d285193. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:23:53 np0005603623 nova_compute[226235]: 2026-01-31 08:23:53.055 226239 DEBUG nova.network.neutron [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:54 np0005603623 nova_compute[226235]: 2026-01-31 08:23:54.317 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:54.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e271 e271: 3 total, 3 up, 3 in
Jan 31 03:23:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:54.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:55 np0005603623 nova_compute[226235]: 2026-01-31 08:23:55.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:56 np0005603623 nova_compute[226235]: 2026-01-31 08:23:56.023 226239 DEBUG oslo_concurrency.lockutils [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:23:56 np0005603623 nova_compute[226235]: 2026-01-31 08:23:56.204 226239 DEBUG nova.objects.instance [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'trusted_certs' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:56.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:56 np0005603623 nova_compute[226235]: 2026-01-31 08:23:56.585 226239 DEBUG nova.network.neutron [req-fe67122c-17b7-415e-8d01-9bb2b0abd0b0 req-061881ad-bd5f-4782-a1fb-634d088713cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:23:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:56.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.058 226239 DEBUG nova.network.neutron [req-fe67122c-17b7-415e-8d01-9bb2b0abd0b0 req-061881ad-bd5f-4782-a1fb-634d088713cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.169 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.305 226239 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.305 226239 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Ensure instance console log exists: /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.306 226239 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.306 226239 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.306 226239 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.309 226239 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Start _get_guest_xml network_info=[{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:b9:24:4f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.310 226239 DEBUG oslo_concurrency.lockutils [req-fe67122c-17b7-415e-8d01-9bb2b0abd0b0 req-061881ad-bd5f-4782-a1fb-634d088713cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.311 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquired lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.311 226239 DEBUG nova.network.neutron [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.315 226239 WARNING nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.320 226239 DEBUG nova.virt.libvirt.host [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.321 226239 DEBUG nova.virt.libvirt.host [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.324 226239 DEBUG nova.virt.libvirt.host [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.325 226239 DEBUG nova.virt.libvirt.host [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.326 226239 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.326 226239 DEBUG nova.virt.hardware [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f75c4aee-d826-4343-a7e3-f06a4b21de52',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.327 226239 DEBUG nova.virt.hardware [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.327 226239 DEBUG nova.virt.hardware [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.327 226239 DEBUG nova.virt.hardware [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.327 226239 DEBUG nova.virt.hardware [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.328 226239 DEBUG nova.virt.hardware [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.328 226239 DEBUG nova.virt.hardware [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.328 226239 DEBUG nova.virt.hardware [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.328 226239 DEBUG nova.virt.hardware [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.328 226239 DEBUG nova.virt.hardware [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.329 226239 DEBUG nova.virt.hardware [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.329 226239 DEBUG nova.objects.instance [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'vcpu_model' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.641 226239 DEBUG oslo_concurrency.processutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:57 np0005603623 nova_compute[226235]: 2026-01-31 08:23:57.771 226239 DEBUG nova.network.neutron [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:23:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:23:58 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3799156449' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:23:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:58.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:58 np0005603623 nova_compute[226235]: 2026-01-31 08:23:58.612 226239 DEBUG oslo_concurrency.processutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.970s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:58 np0005603623 nova_compute[226235]: 2026-01-31 08:23:58.649 226239 DEBUG oslo_concurrency.processutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:23:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:23:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:58.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:23:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:23:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2384636833' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.244 226239 DEBUG oslo_concurrency.processutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.595s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.245 226239 DEBUG nova.virt.libvirt.vif [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:19:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:23:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:b9:24:4f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.246 226239 DEBUG nova.network.os_vif_util [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:b9:24:4f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.247 226239 DEBUG nova.network.os_vif_util [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.249 226239 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  <uuid>d0c13002-57d9-4fad-8579-7343af29719d</uuid>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  <name>instance-00000068</name>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  <memory>196608</memory>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsTestJSON-server-922415262</nova:name>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:23:57</nova:creationTime>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.micro">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <nova:memory>192</nova:memory>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <nova:user uuid="1d03198d8ab846bda092e089b2d5a6c7">tempest-ServerActionsTestJSON-1873947453-project-member</nova:user>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <nova:project uuid="5b87da3b3f42494f96baeeeaf60b54df">tempest-ServerActionsTestJSON-1873947453</nova:project>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <nova:port uuid="cc59ad05-3242-4d5f-8eec-a2480d285193">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <entry name="serial">d0c13002-57d9-4fad-8579-7343af29719d</entry>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <entry name="uuid">d0c13002-57d9-4fad-8579-7343af29719d</entry>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/d0c13002-57d9-4fad-8579-7343af29719d_disk">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/d0c13002-57d9-4fad-8579-7343af29719d_disk.config">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:b9:24:4f"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <target dev="tapcc59ad05-32"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/console.log" append="off"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:23:59 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:23:59 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:23:59 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:23:59 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.250 226239 DEBUG nova.virt.libvirt.vif [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:19:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:23:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:b9:24:4f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.251 226239 DEBUG nova.network.os_vif_util [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:b9:24:4f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.251 226239 DEBUG nova.network.os_vif_util [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.252 226239 DEBUG os_vif [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.253 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.253 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.253 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.257 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.257 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc59ad05-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.257 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc59ad05-32, col_values=(('external_ids', {'iface-id': 'cc59ad05-3242-4d5f-8eec-a2480d285193', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:24:4f', 'vm-uuid': 'd0c13002-57d9-4fad-8579-7343af29719d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.258 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:59 np0005603623 NetworkManager[48970]: <info>  [1769847839.2598] manager: (tapcc59ad05-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.261 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.264 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.264 226239 INFO os_vif [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32')#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.318 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.531 226239 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.531 226239 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.532 226239 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No VIF found with MAC fa:16:3e:b9:24:4f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.532 226239 INFO nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Using config drive#033[00m
Jan 31 03:23:59 np0005603623 kernel: tapcc59ad05-32: entered promiscuous mode
Jan 31 03:23:59 np0005603623 NetworkManager[48970]: <info>  [1769847839.5996] manager: (tapcc59ad05-32): new Tun device (/org/freedesktop/NetworkManager/Devices/219)
Jan 31 03:23:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:59Z|00460|binding|INFO|Claiming lport cc59ad05-3242-4d5f-8eec-a2480d285193 for this chassis.
Jan 31 03:23:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:59Z|00461|binding|INFO|cc59ad05-3242-4d5f-8eec-a2480d285193: Claiming fa:16:3e:b9:24:4f 10.100.0.4
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.600 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.602 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:59Z|00462|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 ovn-installed in OVS
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.610 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:59 np0005603623 systemd-udevd[278167]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:23:59 np0005603623 systemd-machined[194379]: New machine qemu-52-instance-00000068.
Jan 31 03:23:59 np0005603623 NetworkManager[48970]: <info>  [1769847839.6386] device (tapcc59ad05-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:23:59 np0005603623 NetworkManager[48970]: <info>  [1769847839.6391] device (tapcc59ad05-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:23:59 np0005603623 systemd[1]: Started Virtual Machine qemu-52-instance-00000068.
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.643 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:24:4f 10.100.0.4'], port_security=['fa:16:3e:b9:24:4f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd0c13002-57d9-4fad-8579-7343af29719d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '12', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=cc59ad05-3242-4d5f-8eec-a2480d285193) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:23:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:59Z|00463|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 up in Southbound
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.644 143258 INFO neutron.agent.ovn.metadata.agent [-] Port cc59ad05-3242-4d5f-8eec-a2480d285193 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 bound to our chassis#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.645 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.651 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[61bcb416-0a96-41e8-8a71-ebcd3a19b132]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.652 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.653 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.653 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[21e59936-bc80-4c5a-94f1-dcdea911bcf4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.654 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[69bb8cfb-1196-48a0-9f28-144191b60856]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.660 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e4eb77-007a-42b1-b7d7-f45d26fdafb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.667 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b7ab2a6e-63b7-4ec8-a302-d3803579da20]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.687 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[077533f2-4712-4410-a9be-828579a2c908]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 NetworkManager[48970]: <info>  [1769847839.6939] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/220)
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.693 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea17cfd-7496-4e85-9a50-2508db1f49ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.714 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[dc80ecd8-bef6-4ad6-961e-6c10ae2c9ef7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.717 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f879eed4-6d62-4938-a4f0-9639ba2ddb05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 NetworkManager[48970]: <info>  [1769847839.7328] device (tap1186b71b-00): carrier: link connected
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.735 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4cac88-e67d-4ba9-b91e-657e15a2a273]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.747 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[aeac729b-a45f-43ae-9f6c-7a1f79b68146]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 698016, 'reachable_time': 27112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278200, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.756 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1a588d5d-188c-40d7-b7af-ee6b65782bc3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 698016, 'tstamp': 698016}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278201, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.769 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f61208b6-a891-414f-a891-14640b7e445d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 698016, 'reachable_time': 27112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 278202, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.792 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ee86f7-8297-41c7-aacc-518a99f457fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.830 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[27a819d6-dbbb-4843-b9a4-197da310d020]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.832 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.832 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.832 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.875 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:59 np0005603623 NetworkManager[48970]: <info>  [1769847839.8762] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Jan 31 03:23:59 np0005603623 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.879 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.879 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:23:59Z|00464|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.880 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.881 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.881 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.882 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b34e2ddd-8791-427f-b89d-9a5d577eb913]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.883 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:23:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:23:59.883 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:23:59 np0005603623 nova_compute[226235]: 2026-01-31 08:23:59.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:00 np0005603623 nova_compute[226235]: 2026-01-31 08:24:00.257 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847840.2567236, d0c13002-57d9-4fad-8579-7343af29719d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:24:00 np0005603623 nova_compute[226235]: 2026-01-31 08:24:00.258 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:24:00 np0005603623 nova_compute[226235]: 2026-01-31 08:24:00.261 226239 DEBUG nova.compute.manager [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:24:00 np0005603623 nova_compute[226235]: 2026-01-31 08:24:00.269 226239 INFO nova.virt.libvirt.driver [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance running successfully.#033[00m
Jan 31 03:24:00 np0005603623 virtqemud[225858]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:24:00 np0005603623 nova_compute[226235]: 2026-01-31 08:24:00.273 226239 DEBUG nova.virt.libvirt.guest [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:24:00 np0005603623 nova_compute[226235]: 2026-01-31 08:24:00.274 226239 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 31 03:24:00 np0005603623 podman[278270]: 2026-01-31 08:24:00.190589199 +0000 UTC m=+0.023960922 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:24:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:00.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:00 np0005603623 nova_compute[226235]: 2026-01-31 08:24:00.458 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:24:00 np0005603623 nova_compute[226235]: 2026-01-31 08:24:00.462 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:24:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:00.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:00 np0005603623 nova_compute[226235]: 2026-01-31 08:24:00.770 226239 DEBUG nova.network.neutron [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating instance_info_cache with network_info: [{"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:24:00 np0005603623 podman[278270]: 2026-01-31 08:24:00.979782389 +0000 UTC m=+0.813154092 container create 8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 31 03:24:00 np0005603623 nova_compute[226235]: 2026-01-31 08:24:00.998 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 31 03:24:00 np0005603623 nova_compute[226235]: 2026-01-31 08:24:00.999 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847840.2578335, d0c13002-57d9-4fad-8579-7343af29719d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:00.999 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Started (Lifecycle Event)#033[00m
Jan 31 03:24:01 np0005603623 systemd[1]: Started libpod-conmon-8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69.scope.
Jan 31 03:24:01 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:24:01 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a20bfbbee6f6857c92576d9dd91a45d85f4b5e608a390340d3a81f31bf7c4001/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.435 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.439 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:24:01 np0005603623 podman[278270]: 2026-01-31 08:24:01.441707179 +0000 UTC m=+1.275078882 container init 8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 31 03:24:01 np0005603623 podman[278270]: 2026-01-31 08:24:01.446308574 +0000 UTC m=+1.279680277 container start 8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:24:01 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[278290]: [NOTICE]   (278294) : New worker (278296) forked
Jan 31 03:24:01 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[278290]: [NOTICE]   (278294) : Loading success.
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.470 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Releasing lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.471 226239 DEBUG nova.compute.manager [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Instance network_info: |[{"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.473 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Start _get_guest_xml network_info=[{"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.477 226239 WARNING nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.483 226239 DEBUG nova.virt.libvirt.host [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.484 226239 DEBUG nova.virt.libvirt.host [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.487 226239 DEBUG nova.virt.libvirt.host [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.487 226239 DEBUG nova.virt.libvirt.host [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.489 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.489 226239 DEBUG nova.virt.hardware [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.489 226239 DEBUG nova.virt.hardware [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.490 226239 DEBUG nova.virt.hardware [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.490 226239 DEBUG nova.virt.hardware [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.490 226239 DEBUG nova.virt.hardware [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.490 226239 DEBUG nova.virt.hardware [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.491 226239 DEBUG nova.virt.hardware [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.491 226239 DEBUG nova.virt.hardware [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.491 226239 DEBUG nova.virt.hardware [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.491 226239 DEBUG nova.virt.hardware [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.492 226239 DEBUG nova.virt.hardware [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:24:01 np0005603623 nova_compute[226235]: 2026-01-31 08:24:01.494 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:02.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:24:02 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1815617876' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:24:02 np0005603623 nova_compute[226235]: 2026-01-31 08:24:02.476 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.982s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:02 np0005603623 nova_compute[226235]: 2026-01-31 08:24:02.502 226239 DEBUG nova.storage.rbd_utils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:24:02 np0005603623 nova_compute[226235]: 2026-01-31 08:24:02.506 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:02.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:24:03 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2561176297' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.167 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.661s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.169 226239 DEBUG nova.virt.libvirt.vif [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:23:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-657799937',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-657799937',id=119,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGuPmLyvW8nH0zjMVVycFJqThAJ40QxmOiqbjQtqD9yxLSlZxgvi2M6cEnwp9NZOC4D7auSCKwZopexRDoMXTIOi6B9+vGiJqB0/tIgguNHeMJkz6XGWm9K9JFV8LaAZqQ==',key_name='tempest-keypair-1582363486',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='491937de020742d7b4e847dc3bf57950',ramdisk_id='',reservation_id='r-mev9f22v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-60119558',owner_user_name='tempest-AttachVolumeShelveTestJSON-60119558-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:23:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='432ac8867d8240408db455fc25bb5901',uuid=f3b36b5b-968c-4775-ac4f-93efc36f40ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.170 226239 DEBUG nova.network.os_vif_util [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converting VIF {"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.170 226239 DEBUG nova.network.os_vif_util [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.172 226239 DEBUG nova.objects.instance [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'pci_devices' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.211 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  <uuid>f3b36b5b-968c-4775-ac4f-93efc36f40ac</uuid>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  <name>instance-00000077</name>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-657799937</nova:name>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:24:01</nova:creationTime>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <nova:user uuid="432ac8867d8240408db455fc25bb5901">tempest-AttachVolumeShelveTestJSON-60119558-project-member</nova:user>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <nova:project uuid="491937de020742d7b4e847dc3bf57950">tempest-AttachVolumeShelveTestJSON-60119558</nova:project>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <nova:port uuid="19a4f194-5514-4b8e-b635-e6fe0255dc1a">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <entry name="serial">f3b36b5b-968c-4775-ac4f-93efc36f40ac</entry>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <entry name="uuid">f3b36b5b-968c-4775-ac4f-93efc36f40ac</entry>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk.config">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:59:9f:8d"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <target dev="tap19a4f194-55"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/console.log" append="off"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:24:03 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:24:03 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:24:03 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:24:03 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.212 226239 DEBUG nova.compute.manager [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Preparing to wait for external event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.212 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.213 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.213 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.214 226239 DEBUG nova.virt.libvirt.vif [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:23:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-657799937',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-657799937',id=119,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGuPmLyvW8nH0zjMVVycFJqThAJ40QxmOiqbjQtqD9yxLSlZxgvi2M6cEnwp9NZOC4D7auSCKwZopexRDoMXTIOi6B9+vGiJqB0/tIgguNHeMJkz6XGWm9K9JFV8LaAZqQ==',key_name='tempest-keypair-1582363486',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='491937de020742d7b4e847dc3bf57950',ramdisk_id='',reservation_id='r-mev9f22v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-60119558',owner_user_name='tempest-AttachVolumeShelveTestJSON-60119558-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:23:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='432ac8867d8240408db455fc25bb5901',uuid=f3b36b5b-968c-4775-ac4f-93efc36f40ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.214 226239 DEBUG nova.network.os_vif_util [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converting VIF {"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.215 226239 DEBUG nova.network.os_vif_util [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.216 226239 DEBUG os_vif [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.216 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.217 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.217 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.220 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.220 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19a4f194-55, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.221 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap19a4f194-55, col_values=(('external_ids', {'iface-id': '19a4f194-5514-4b8e-b635-e6fe0255dc1a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:59:9f:8d', 'vm-uuid': 'f3b36b5b-968c-4775-ac4f-93efc36f40ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.222 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:03 np0005603623 NetworkManager[48970]: <info>  [1769847843.2239] manager: (tap19a4f194-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/222)
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.225 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.228 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.230 226239 INFO os_vif [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55')#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.648 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.649 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.649 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No VIF found with MAC fa:16:3e:59:9f:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.649 226239 INFO nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Using config drive#033[00m
Jan 31 03:24:03 np0005603623 nova_compute[226235]: 2026-01-31 08:24:03.674 226239 DEBUG nova.storage.rbd_utils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.172 226239 INFO nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Creating config drive at /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/disk.config#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.176 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpx0j7vgfb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.299 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpx0j7vgfb" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:04.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.449 226239 DEBUG nova.storage.rbd_utils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.453 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/disk.config f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.466 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:04.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.929 226239 DEBUG nova.compute.manager [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.930 226239 DEBUG oslo_concurrency.lockutils [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.931 226239 DEBUG oslo_concurrency.lockutils [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.931 226239 DEBUG oslo_concurrency.lockutils [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.931 226239 DEBUG nova.compute.manager [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.931 226239 WARNING nova.compute.manager [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.932 226239 DEBUG nova.compute.manager [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.932 226239 DEBUG oslo_concurrency.lockutils [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.932 226239 DEBUG oslo_concurrency.lockutils [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.933 226239 DEBUG oslo_concurrency.lockutils [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.933 226239 DEBUG nova.compute.manager [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:24:04 np0005603623 nova_compute[226235]: 2026-01-31 08:24:04.933 226239 WARNING nova.compute.manager [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:24:05 np0005603623 nova_compute[226235]: 2026-01-31 08:24:05.775 226239 DEBUG oslo_concurrency.processutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/disk.config f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.322s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:05 np0005603623 nova_compute[226235]: 2026-01-31 08:24:05.776 226239 INFO nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Deleting local config drive /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/disk.config because it was imported into RBD.#033[00m
Jan 31 03:24:05 np0005603623 kernel: tap19a4f194-55: entered promiscuous mode
Jan 31 03:24:05 np0005603623 NetworkManager[48970]: <info>  [1769847845.8106] manager: (tap19a4f194-55): new Tun device (/org/freedesktop/NetworkManager/Devices/223)
Jan 31 03:24:05 np0005603623 nova_compute[226235]: 2026-01-31 08:24:05.813 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:05 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:05Z|00465|binding|INFO|Claiming lport 19a4f194-5514-4b8e-b635-e6fe0255dc1a for this chassis.
Jan 31 03:24:05 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:05Z|00466|binding|INFO|19a4f194-5514-4b8e-b635-e6fe0255dc1a: Claiming fa:16:3e:59:9f:8d 10.100.0.6
Jan 31 03:24:05 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:05Z|00467|binding|INFO|Setting lport 19a4f194-5514-4b8e-b635-e6fe0255dc1a ovn-installed in OVS
Jan 31 03:24:05 np0005603623 nova_compute[226235]: 2026-01-31 08:24:05.836 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:05 np0005603623 systemd-machined[194379]: New machine qemu-53-instance-00000077.
Jan 31 03:24:05 np0005603623 nova_compute[226235]: 2026-01-31 08:24:05.846 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:05 np0005603623 systemd[1]: Started Virtual Machine qemu-53-instance-00000077.
Jan 31 03:24:05 np0005603623 systemd-udevd[278443]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:24:05 np0005603623 NetworkManager[48970]: <info>  [1769847845.8773] device (tap19a4f194-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:24:05 np0005603623 NetworkManager[48970]: <info>  [1769847845.8780] device (tap19a4f194-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:24:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:06.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:06.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:06 np0005603623 nova_compute[226235]: 2026-01-31 08:24:06.699 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847846.6989307, f3b36b5b-968c-4775-ac4f-93efc36f40ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:24:06 np0005603623 nova_compute[226235]: 2026-01-31 08:24:06.699 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] VM Started (Lifecycle Event)#033[00m
Jan 31 03:24:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:07 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:07Z|00468|binding|INFO|Setting lport 19a4f194-5514-4b8e-b635-e6fe0255dc1a up in Southbound
Jan 31 03:24:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:07.991 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:9f:8d 10.100.0.6'], port_security=['fa:16:3e:59:9f:8d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f3b36b5b-968c-4775-ac4f-93efc36f40ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6525247d-48b2-4359-a813-d7276403ba32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '491937de020742d7b4e847dc3bf57950', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fbd8fca4-628f-4f27-9bbb-a2cbce3d02c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c7370ba-0307-4b10-bef7-8ff686d828f1, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=19a4f194-5514-4b8e-b635-e6fe0255dc1a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:24:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:07.992 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 19a4f194-5514-4b8e-b635-e6fe0255dc1a in datapath 6525247d-48b2-4359-a813-d7276403ba32 bound to our chassis#033[00m
Jan 31 03:24:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:07.994 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6525247d-48b2-4359-a813-d7276403ba32#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.002 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a9d97d0d-f90b-45c0-98bd-64e57aaec7cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.003 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6525247d-41 in ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.004 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6525247d-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.005 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f603f509-f687-4abb-b063-a856332f59aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.006 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c541cd73-7d8e-496e-b7b0-54af109537e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.013 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[cc31f10b-c36b-43cb-bb19-a4eae8cb56d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.022 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8349b71c-bd42-45a5-a83a-df3912d1015a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.045 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[bcdb9019-66aa-4197-8a94-3fc322432160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 systemd-udevd[278447]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:24:08 np0005603623 NetworkManager[48970]: <info>  [1769847848.0548] manager: (tap6525247d-40): new Veth device (/org/freedesktop/NetworkManager/Devices/224)
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.057 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5d9529-6050-4d87-9ef1-f10ebe3e0614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.080 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[90362eda-c8a7-4e12-85d8-78b44fcccb53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.084 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3bd2a1c9-f4f8-42be-8641-5957fbb3b9ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 NetworkManager[48970]: <info>  [1769847848.1075] device (tap6525247d-40): carrier: link connected
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.114 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4fd86c82-dcfd-4737-88ec-925e0389508e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.132 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0af7c150-80e3-4d8d-a5fb-80977c26d0d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6525247d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:c8:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 698853, 'reachable_time': 25096, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278569, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.136 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.142 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847846.699082, f3b36b5b-968c-4775-ac4f-93efc36f40ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.142 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.145 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2502ed3c-3844-4b1e-afdf-a72a1e6e0bd6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec7:c843'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 698853, 'tstamp': 698853}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278570, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.158 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4b2cc57c-6249-46e2-8de5-0abd164ab56e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6525247d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:c8:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 698853, 'reachable_time': 25096, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 278571, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.180 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9407f450-ba9d-4693-8bdf-63887688fae6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.196 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.198 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.218 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6e418d1b-793c-49b6-8bc2-adac86568716]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.219 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6525247d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.219 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.220 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6525247d-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:08 np0005603623 NetworkManager[48970]: <info>  [1769847848.2222] manager: (tap6525247d-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Jan 31 03:24:08 np0005603623 kernel: tap6525247d-40: entered promiscuous mode
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.221 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.224 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6525247d-40, col_values=(('external_ids', {'iface-id': '044f1919-2550-4bba-9baa-5d3f39f69ec6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:08 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:08Z|00469|binding|INFO|Releasing lport 044f1919-2550-4bba-9baa-5d3f39f69ec6 from this chassis (sb_readonly=0)
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.227 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6525247d-48b2-4359-a813-d7276403ba32.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6525247d-48b2-4359-a813-d7276403ba32.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.228 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.229 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9e2c84ec-7c30-4fc7-a05c-081c595ef702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.230 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-6525247d-48b2-4359-a813-d7276403ba32
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/6525247d-48b2-4359-a813-d7276403ba32.pid.haproxy
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 6525247d-48b2-4359-a813-d7276403ba32
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:24:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:08.230 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'env', 'PROCESS_TAG=haproxy-6525247d-48b2-4359-a813-d7276403ba32', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6525247d-48b2-4359-a813-d7276403ba32.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.236 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.248 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:24:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:08.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.573 226239 DEBUG nova.compute.manager [req-5be15fd6-25cb-405d-824f-2664cc639e4b req-bca564f5-03d4-46e1-9a31-2cd845043808 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.575 226239 DEBUG oslo_concurrency.lockutils [req-5be15fd6-25cb-405d-824f-2664cc639e4b req-bca564f5-03d4-46e1-9a31-2cd845043808 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.575 226239 DEBUG oslo_concurrency.lockutils [req-5be15fd6-25cb-405d-824f-2664cc639e4b req-bca564f5-03d4-46e1-9a31-2cd845043808 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.575 226239 DEBUG oslo_concurrency.lockutils [req-5be15fd6-25cb-405d-824f-2664cc639e4b req-bca564f5-03d4-46e1-9a31-2cd845043808 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.576 226239 DEBUG nova.compute.manager [req-5be15fd6-25cb-405d-824f-2664cc639e4b req-bca564f5-03d4-46e1-9a31-2cd845043808 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Processing event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.576 226239 DEBUG nova.compute.manager [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.580 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847848.579797, f3b36b5b-968c-4775-ac4f-93efc36f40ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.580 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.582 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.585 226239 INFO nova.virt.libvirt.driver [-] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Instance spawned successfully.#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.585 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:24:08 np0005603623 podman[278603]: 2026-01-31 08:24:08.546627458 +0000 UTC m=+0.017886291 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:24:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:08.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.664 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.668 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.679 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.679 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.679 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.680 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.680 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.681 226239 DEBUG nova.virt.libvirt.driver [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.827 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.891 226239 INFO nova.compute.manager [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Took 23.38 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:24:08 np0005603623 nova_compute[226235]: 2026-01-31 08:24:08.892 226239 DEBUG nova.compute.manager [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:24:09 np0005603623 nova_compute[226235]: 2026-01-31 08:24:09.060 226239 INFO nova.compute.manager [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Took 26.86 seconds to build instance.#033[00m
Jan 31 03:24:09 np0005603623 nova_compute[226235]: 2026-01-31 08:24:09.113 226239 DEBUG oslo_concurrency.lockutils [None req-9f2d7b01-039b-4615-acb8-00e24d805f8c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 27.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:09 np0005603623 nova_compute[226235]: 2026-01-31 08:24:09.322 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:09 np0005603623 podman[278603]: 2026-01-31 08:24:09.589325795 +0000 UTC m=+1.060584608 container create b918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:24:09 np0005603623 systemd[1]: Started libpod-conmon-b918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca.scope.
Jan 31 03:24:09 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:24:09 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b340db11277158f09f809355243ab1c467f67e8fb5691e2d8712274f1b1dc204/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:24:10 np0005603623 podman[278603]: 2026-01-31 08:24:10.106107545 +0000 UTC m=+1.577366378 container init b918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 03:24:10 np0005603623 podman[278603]: 2026-01-31 08:24:10.110314438 +0000 UTC m=+1.581573251 container start b918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:24:10 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[278619]: [NOTICE]   (278623) : New worker (278625) forked
Jan 31 03:24:10 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[278619]: [NOTICE]   (278623) : Loading success.
Jan 31 03:24:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:10.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:10.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:11 np0005603623 nova_compute[226235]: 2026-01-31 08:24:11.104 226239 DEBUG nova.compute.manager [req-d1926553-2d43-4d37-a077-d60b61afc8ee req-2dc2fb98-cdb8-49ff-a79b-61c65c9502e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:11 np0005603623 nova_compute[226235]: 2026-01-31 08:24:11.104 226239 DEBUG oslo_concurrency.lockutils [req-d1926553-2d43-4d37-a077-d60b61afc8ee req-2dc2fb98-cdb8-49ff-a79b-61c65c9502e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:11 np0005603623 nova_compute[226235]: 2026-01-31 08:24:11.104 226239 DEBUG oslo_concurrency.lockutils [req-d1926553-2d43-4d37-a077-d60b61afc8ee req-2dc2fb98-cdb8-49ff-a79b-61c65c9502e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:11 np0005603623 nova_compute[226235]: 2026-01-31 08:24:11.105 226239 DEBUG oslo_concurrency.lockutils [req-d1926553-2d43-4d37-a077-d60b61afc8ee req-2dc2fb98-cdb8-49ff-a79b-61c65c9502e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:11 np0005603623 nova_compute[226235]: 2026-01-31 08:24:11.105 226239 DEBUG nova.compute.manager [req-d1926553-2d43-4d37-a077-d60b61afc8ee req-2dc2fb98-cdb8-49ff-a79b-61c65c9502e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] No waiting events found dispatching network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:24:11 np0005603623 nova_compute[226235]: 2026-01-31 08:24:11.105 226239 WARNING nova.compute.manager [req-d1926553-2d43-4d37-a077-d60b61afc8ee req-2dc2fb98-cdb8-49ff-a79b-61c65c9502e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received unexpected event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a for instance with vm_state active and task_state None.#033[00m
Jan 31 03:24:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e272 e272: 3 total, 3 up, 3 in
Jan 31 03:24:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:12 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:24:12 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:24:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:12.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:12.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:12 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:12Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b9:24:4f 10.100.0.4
Jan 31 03:24:13 np0005603623 nova_compute[226235]: 2026-01-31 08:24:13.226 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:13 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:13Z|00470|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:24:13 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:13Z|00471|binding|INFO|Releasing lport 044f1919-2550-4bba-9baa-5d3f39f69ec6 from this chassis (sb_readonly=0)
Jan 31 03:24:13 np0005603623 nova_compute[226235]: 2026-01-31 08:24:13.581 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:14 np0005603623 nova_compute[226235]: 2026-01-31 08:24:14.324 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:14.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:14.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:16 np0005603623 podman[278687]: 2026-01-31 08:24:16.018312245 +0000 UTC m=+0.097423096 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:24:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:16.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:16.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:16 np0005603623 podman[278715]: 2026-01-31 08:24:16.948298978 +0000 UTC m=+0.043356100 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:24:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:17 np0005603623 nova_compute[226235]: 2026-01-31 08:24:17.198 226239 DEBUG oslo_concurrency.lockutils [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:17 np0005603623 nova_compute[226235]: 2026-01-31 08:24:17.199 226239 DEBUG oslo_concurrency.lockutils [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:17 np0005603623 nova_compute[226235]: 2026-01-31 08:24:17.200 226239 DEBUG oslo_concurrency.lockutils [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:17 np0005603623 nova_compute[226235]: 2026-01-31 08:24:17.200 226239 DEBUG oslo_concurrency.lockutils [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:17 np0005603623 nova_compute[226235]: 2026-01-31 08:24:17.200 226239 DEBUG oslo_concurrency.lockutils [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:17 np0005603623 nova_compute[226235]: 2026-01-31 08:24:17.201 226239 INFO nova.compute.manager [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Terminating instance#033[00m
Jan 31 03:24:17 np0005603623 nova_compute[226235]: 2026-01-31 08:24:17.202 226239 DEBUG nova.compute.manager [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:24:17 np0005603623 nova_compute[226235]: 2026-01-31 08:24:17.390 226239 DEBUG nova.compute.manager [req-9a365f7f-8aa5-42d6-91e9-aa26bf815836 req-7398293a-7253-4a45-b36c-4f30947acb23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-changed-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:17 np0005603623 nova_compute[226235]: 2026-01-31 08:24:17.390 226239 DEBUG nova.compute.manager [req-9a365f7f-8aa5-42d6-91e9-aa26bf815836 req-7398293a-7253-4a45-b36c-4f30947acb23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Refreshing instance network info cache due to event network-changed-19a4f194-5514-4b8e-b635-e6fe0255dc1a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:24:17 np0005603623 nova_compute[226235]: 2026-01-31 08:24:17.391 226239 DEBUG oslo_concurrency.lockutils [req-9a365f7f-8aa5-42d6-91e9-aa26bf815836 req-7398293a-7253-4a45-b36c-4f30947acb23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:24:17 np0005603623 nova_compute[226235]: 2026-01-31 08:24:17.391 226239 DEBUG oslo_concurrency.lockutils [req-9a365f7f-8aa5-42d6-91e9-aa26bf815836 req-7398293a-7253-4a45-b36c-4f30947acb23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:24:17 np0005603623 nova_compute[226235]: 2026-01-31 08:24:17.391 226239 DEBUG nova.network.neutron [req-9a365f7f-8aa5-42d6-91e9-aa26bf815836 req-7398293a-7253-4a45-b36c-4f30947acb23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Refreshing network info cache for port 19a4f194-5514-4b8e-b635-e6fe0255dc1a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:24:18 np0005603623 nova_compute[226235]: 2026-01-31 08:24:18.228 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:18.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:18.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:19 np0005603623 kernel: tapcc59ad05-32 (unregistering): left promiscuous mode
Jan 31 03:24:19 np0005603623 NetworkManager[48970]: <info>  [1769847859.1149] device (tapcc59ad05-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:24:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:19Z|00472|binding|INFO|Releasing lport cc59ad05-3242-4d5f-8eec-a2480d285193 from this chassis (sb_readonly=0)
Jan 31 03:24:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:19Z|00473|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 down in Southbound
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.127 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:19Z|00474|binding|INFO|Removing iface tapcc59ad05-32 ovn-installed in OVS
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.131 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.137 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:19.137 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:24:4f 10.100.0.4'], port_security=['fa:16:3e:b9:24:4f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd0c13002-57d9-4fad-8579-7343af29719d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '14', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=cc59ad05-3242-4d5f-8eec-a2480d285193) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:24:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:19.138 143258 INFO neutron.agent.ovn.metadata.agent [-] Port cc59ad05-3242-4d5f-8eec-a2480d285193 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:24:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:19.140 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:24:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:19.141 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ce18af6e-9007-449c-be36-a5be81243e73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:19.141 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore#033[00m
Jan 31 03:24:19 np0005603623 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000068.scope: Deactivated successfully.
Jan 31 03:24:19 np0005603623 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000068.scope: Consumed 12.527s CPU time.
Jan 31 03:24:19 np0005603623 systemd-machined[194379]: Machine qemu-52-instance-00000068 terminated.
Jan 31 03:24:19 np0005603623 NetworkManager[48970]: <info>  [1769847859.2247] manager: (tapcc59ad05-32): new Tun device (/org/freedesktop/NetworkManager/Devices/226)
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.225 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.228 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.234 226239 INFO nova.virt.libvirt.driver [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance destroyed successfully.#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.235 226239 DEBUG nova.objects.instance [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'resources' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.261 226239 DEBUG nova.virt.libvirt.vif [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:24:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:24:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.262 226239 DEBUG nova.network.os_vif_util [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.262 226239 DEBUG nova.network.os_vif_util [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.263 226239 DEBUG os_vif [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.265 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.265 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc59ad05-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.268 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.270 226239 INFO os_vif [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32')#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.326 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.601 226239 DEBUG nova.compute.manager [req-9f21484c-8040-4de2-b066-d9da0d05a20b req-15e22295-d935-4893-80c9-c6f6bcfa6faa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.601 226239 DEBUG oslo_concurrency.lockutils [req-9f21484c-8040-4de2-b066-d9da0d05a20b req-15e22295-d935-4893-80c9-c6f6bcfa6faa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.602 226239 DEBUG oslo_concurrency.lockutils [req-9f21484c-8040-4de2-b066-d9da0d05a20b req-15e22295-d935-4893-80c9-c6f6bcfa6faa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.602 226239 DEBUG oslo_concurrency.lockutils [req-9f21484c-8040-4de2-b066-d9da0d05a20b req-15e22295-d935-4893-80c9-c6f6bcfa6faa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.602 226239 DEBUG nova.compute.manager [req-9f21484c-8040-4de2-b066-d9da0d05a20b req-15e22295-d935-4893-80c9-c6f6bcfa6faa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:24:19 np0005603623 nova_compute[226235]: 2026-01-31 08:24:19.602 226239 DEBUG nova.compute.manager [req-9f21484c-8040-4de2-b066-d9da0d05a20b req-15e22295-d935-4893-80c9-c6f6bcfa6faa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:24:19 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[278290]: [NOTICE]   (278294) : haproxy version is 2.8.14-c23fe91
Jan 31 03:24:19 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[278290]: [NOTICE]   (278294) : path to executable is /usr/sbin/haproxy
Jan 31 03:24:19 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[278290]: [WARNING]  (278294) : Exiting Master process...
Jan 31 03:24:19 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[278290]: [WARNING]  (278294) : Exiting Master process...
Jan 31 03:24:19 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[278290]: [ALERT]    (278294) : Current worker (278296) exited with code 143 (Terminated)
Jan 31 03:24:19 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[278290]: [WARNING]  (278294) : All workers exited. Exiting... (0)
Jan 31 03:24:19 np0005603623 systemd[1]: libpod-8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69.scope: Deactivated successfully.
Jan 31 03:24:19 np0005603623 conmon[278290]: conmon 8b8fbfe7d29eabb47a02 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69.scope/container/memory.events
Jan 31 03:24:19 np0005603623 podman[278761]: 2026-01-31 08:24:19.880139437 +0000 UTC m=+0.665866105 container died 8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 03:24:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:20.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:20 np0005603623 nova_compute[226235]: 2026-01-31 08:24:20.522 226239 DEBUG nova.network.neutron [req-9a365f7f-8aa5-42d6-91e9-aa26bf815836 req-7398293a-7253-4a45-b36c-4f30947acb23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updated VIF entry in instance network info cache for port 19a4f194-5514-4b8e-b635-e6fe0255dc1a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:24:20 np0005603623 nova_compute[226235]: 2026-01-31 08:24:20.522 226239 DEBUG nova.network.neutron [req-9a365f7f-8aa5-42d6-91e9-aa26bf815836 req-7398293a-7253-4a45-b36c-4f30947acb23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating instance_info_cache with network_info: [{"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:24:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:20.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:20 np0005603623 nova_compute[226235]: 2026-01-31 08:24:20.876 226239 DEBUG oslo_concurrency.lockutils [req-9a365f7f-8aa5-42d6-91e9-aa26bf815836 req-7398293a-7253-4a45-b36c-4f30947acb23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:24:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e273 e273: 3 total, 3 up, 3 in
Jan 31 03:24:21 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69-userdata-shm.mount: Deactivated successfully.
Jan 31 03:24:21 np0005603623 systemd[1]: var-lib-containers-storage-overlay-a20bfbbee6f6857c92576d9dd91a45d85f4b5e608a390340d3a81f31bf7c4001-merged.mount: Deactivated successfully.
Jan 31 03:24:21 np0005603623 podman[278761]: 2026-01-31 08:24:21.695545018 +0000 UTC m=+2.481271676 container cleanup 8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 03:24:21 np0005603623 systemd[1]: libpod-conmon-8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69.scope: Deactivated successfully.
Jan 31 03:24:22 np0005603623 nova_compute[226235]: 2026-01-31 08:24:22.386 226239 DEBUG nova.compute.manager [req-d5848560-56dd-4239-9cd2-421754c3466d req-565acc2e-2885-4b88-80df-53dfe6be4ef0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:22 np0005603623 nova_compute[226235]: 2026-01-31 08:24:22.386 226239 DEBUG oslo_concurrency.lockutils [req-d5848560-56dd-4239-9cd2-421754c3466d req-565acc2e-2885-4b88-80df-53dfe6be4ef0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:22 np0005603623 nova_compute[226235]: 2026-01-31 08:24:22.386 226239 DEBUG oslo_concurrency.lockutils [req-d5848560-56dd-4239-9cd2-421754c3466d req-565acc2e-2885-4b88-80df-53dfe6be4ef0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:22 np0005603623 nova_compute[226235]: 2026-01-31 08:24:22.387 226239 DEBUG oslo_concurrency.lockutils [req-d5848560-56dd-4239-9cd2-421754c3466d req-565acc2e-2885-4b88-80df-53dfe6be4ef0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:22 np0005603623 nova_compute[226235]: 2026-01-31 08:24:22.387 226239 DEBUG nova.compute.manager [req-d5848560-56dd-4239-9cd2-421754c3466d req-565acc2e-2885-4b88-80df-53dfe6be4ef0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:24:22 np0005603623 nova_compute[226235]: 2026-01-31 08:24:22.387 226239 WARNING nova.compute.manager [req-d5848560-56dd-4239-9cd2-421754c3466d req-565acc2e-2885-4b88-80df-53dfe6be4ef0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:24:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:22.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:22.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:22 np0005603623 podman[278819]: 2026-01-31 08:24:22.779771616 +0000 UTC m=+1.069576481 container remove 8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:24:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:22.784 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[114d702f-a053-4de4-b54d-89a085d617e7]: (4, ('Sat Jan 31 08:24:19 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69)\n8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69\nSat Jan 31 08:24:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69)\n8b8fbfe7d29eabb47a02fb9cc60a5bf63e6e070514bc5302dc32f53bbe780d69\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:22.786 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5ca77f5b-af75-4766-b662-d305582bd7e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:22.787 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:22 np0005603623 nova_compute[226235]: 2026-01-31 08:24:22.789 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:22 np0005603623 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:24:22 np0005603623 nova_compute[226235]: 2026-01-31 08:24:22.794 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:22.797 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6fd18f7c-1177-4c8c-a2f5-ae8c204245fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:22.826 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[960520f1-2c41-4d73-a939-32869f35e564]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:22.827 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3de12295-2aad-4d59-bec4-cc641be8af5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:22.840 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7846c0e8-e702-4665-b03c-13ed6f0f952c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 698011, 'reachable_time': 43484, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278834, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:22.842 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:24:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:22.842 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[8a236aa4-ab42-45f6-810c-f136d918fdab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:22 np0005603623 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:24:24 np0005603623 nova_compute[226235]: 2026-01-31 08:24:24.310 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:24 np0005603623 nova_compute[226235]: 2026-01-31 08:24:24.328 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:24.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:24.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:26.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:26.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:24:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:28.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:24:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:24:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:28.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:24:29 np0005603623 nova_compute[226235]: 2026-01-31 08:24:29.314 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:29 np0005603623 nova_compute[226235]: 2026-01-31 08:24:29.329 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:30.117 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:30.117 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:30.118 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:30.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:30.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:30Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:59:9f:8d 10.100.0.6
Jan 31 03:24:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:30Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:59:9f:8d 10.100.0.6
Jan 31 03:24:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:31.520 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:24:31 np0005603623 nova_compute[226235]: 2026-01-31 08:24:31.521 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:31.522 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:24:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:32.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:24:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:32.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:24:34 np0005603623 nova_compute[226235]: 2026-01-31 08:24:34.235 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847859.2335927, d0c13002-57d9-4fad-8579-7343af29719d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:24:34 np0005603623 nova_compute[226235]: 2026-01-31 08:24:34.235 226239 INFO nova.compute.manager [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:24:34 np0005603623 nova_compute[226235]: 2026-01-31 08:24:34.313 226239 DEBUG nova.compute.manager [None req-873a1d7c-b89e-4974-a68f-c7d45a2a3bbb - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:24:34 np0005603623 nova_compute[226235]: 2026-01-31 08:24:34.316 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:34 np0005603623 nova_compute[226235]: 2026-01-31 08:24:34.317 226239 DEBUG nova.compute.manager [None req-873a1d7c-b89e-4974-a68f-c7d45a2a3bbb - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: deleting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:24:34 np0005603623 nova_compute[226235]: 2026-01-31 08:24:34.331 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:34 np0005603623 nova_compute[226235]: 2026-01-31 08:24:34.355 226239 INFO nova.compute.manager [None req-873a1d7c-b89e-4974-a68f-c7d45a2a3bbb - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 03:24:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:34.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:34.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:36.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:36 np0005603623 nova_compute[226235]: 2026-01-31 08:24:36.611 226239 INFO nova.virt.libvirt.driver [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Deleting instance files /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d_del#033[00m
Jan 31 03:24:36 np0005603623 nova_compute[226235]: 2026-01-31 08:24:36.612 226239 INFO nova.virt.libvirt.driver [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Deletion of /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d_del complete#033[00m
Jan 31 03:24:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:36.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:37 np0005603623 nova_compute[226235]: 2026-01-31 08:24:37.850 226239 INFO nova.compute.manager [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Took 20.65 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:24:37 np0005603623 nova_compute[226235]: 2026-01-31 08:24:37.851 226239 DEBUG oslo.service.loopingcall [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:24:37 np0005603623 nova_compute[226235]: 2026-01-31 08:24:37.852 226239 DEBUG nova.compute.manager [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:24:37 np0005603623 nova_compute[226235]: 2026-01-31 08:24:37.852 226239 DEBUG nova.network.neutron [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:24:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:38.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:38.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:39 np0005603623 nova_compute[226235]: 2026-01-31 08:24:39.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:39 np0005603623 nova_compute[226235]: 2026-01-31 08:24:39.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:24:39 np0005603623 nova_compute[226235]: 2026-01-31 08:24:39.318 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:39 np0005603623 nova_compute[226235]: 2026-01-31 08:24:39.333 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:39 np0005603623 nova_compute[226235]: 2026-01-31 08:24:39.891 226239 DEBUG nova.network.neutron [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:24:39 np0005603623 nova_compute[226235]: 2026-01-31 08:24:39.923 226239 INFO nova.compute.manager [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Took 2.07 seconds to deallocate network for instance.#033[00m
Jan 31 03:24:40 np0005603623 nova_compute[226235]: 2026-01-31 08:24:40.028 226239 DEBUG oslo_concurrency.lockutils [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:40 np0005603623 nova_compute[226235]: 2026-01-31 08:24:40.029 226239 DEBUG oslo_concurrency.lockutils [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:40 np0005603623 nova_compute[226235]: 2026-01-31 08:24:40.062 226239 DEBUG nova.compute.manager [req-d526d76d-61e0-445b-a5d6-1ba9edc180a1 req-4a53563c-7fe4-442c-9bda-ab59b29807d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-deleted-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:40 np0005603623 nova_compute[226235]: 2026-01-31 08:24:40.118 226239 DEBUG oslo_concurrency.processutils [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:40.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:24:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2109806358' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:24:40 np0005603623 nova_compute[226235]: 2026-01-31 08:24:40.679 226239 DEBUG oslo_concurrency.processutils [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.562s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:40 np0005603623 nova_compute[226235]: 2026-01-31 08:24:40.685 226239 DEBUG nova.compute.provider_tree [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:24:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:40.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:40 np0005603623 nova_compute[226235]: 2026-01-31 08:24:40.778 226239 DEBUG nova.scheduler.client.report [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:24:41 np0005603623 nova_compute[226235]: 2026-01-31 08:24:41.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:41 np0005603623 nova_compute[226235]: 2026-01-31 08:24:41.463 226239 DEBUG oslo_concurrency.lockutils [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:41.524 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:41 np0005603623 nova_compute[226235]: 2026-01-31 08:24:41.666 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:41 np0005603623 nova_compute[226235]: 2026-01-31 08:24:41.667 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:41 np0005603623 nova_compute[226235]: 2026-01-31 08:24:41.667 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:41 np0005603623 nova_compute[226235]: 2026-01-31 08:24:41.668 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:24:41 np0005603623 nova_compute[226235]: 2026-01-31 08:24:41.668 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:42 np0005603623 nova_compute[226235]: 2026-01-31 08:24:42.089 226239 INFO nova.scheduler.client.report [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Deleted allocations for instance d0c13002-57d9-4fad-8579-7343af29719d#033[00m
Jan 31 03:24:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:24:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/682302512' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:24:42 np0005603623 nova_compute[226235]: 2026-01-31 08:24:42.189 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:24:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:42.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:24:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:42.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:42 np0005603623 nova_compute[226235]: 2026-01-31 08:24:42.872 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:43 np0005603623 nova_compute[226235]: 2026-01-31 08:24:43.090 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000077 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:24:43 np0005603623 nova_compute[226235]: 2026-01-31 08:24:43.090 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000077 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:24:43 np0005603623 nova_compute[226235]: 2026-01-31 08:24:43.101 226239 DEBUG oslo_concurrency.lockutils [None req-430430b4-65ad-4593-b486-60ad311c0411 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 25.902s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:43 np0005603623 nova_compute[226235]: 2026-01-31 08:24:43.252 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:24:43 np0005603623 nova_compute[226235]: 2026-01-31 08:24:43.254 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4325MB free_disk=20.851490020751953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:24:43 np0005603623 nova_compute[226235]: 2026-01-31 08:24:43.254 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:43 np0005603623 nova_compute[226235]: 2026-01-31 08:24:43.254 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:44 np0005603623 nova_compute[226235]: 2026-01-31 08:24:44.306 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance f3b36b5b-968c-4775-ac4f-93efc36f40ac actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:24:44 np0005603623 nova_compute[226235]: 2026-01-31 08:24:44.306 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:24:44 np0005603623 nova_compute[226235]: 2026-01-31 08:24:44.307 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:24:44 np0005603623 nova_compute[226235]: 2026-01-31 08:24:44.320 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:44 np0005603623 nova_compute[226235]: 2026-01-31 08:24:44.335 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:44 np0005603623 nova_compute[226235]: 2026-01-31 08:24:44.347 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:44.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:44.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:24:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2033254085' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:24:44 np0005603623 nova_compute[226235]: 2026-01-31 08:24:44.815 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:44 np0005603623 nova_compute[226235]: 2026-01-31 08:24:44.821 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:24:45 np0005603623 nova_compute[226235]: 2026-01-31 08:24:45.008 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:24:45 np0005603623 nova_compute[226235]: 2026-01-31 08:24:45.110 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:24:45 np0005603623 nova_compute[226235]: 2026-01-31 08:24:45.110 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:46 np0005603623 nova_compute[226235]: 2026-01-31 08:24:46.110 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:46 np0005603623 nova_compute[226235]: 2026-01-31 08:24:46.111 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:24:46 np0005603623 nova_compute[226235]: 2026-01-31 08:24:46.111 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:24:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:46.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:46.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:46 np0005603623 podman[278971]: 2026-01-31 08:24:46.994309335 +0000 UTC m=+0.075004822 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:24:47 np0005603623 podman[278997]: 2026-01-31 08:24:47.061374058 +0000 UTC m=+0.045331433 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:24:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:48.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:48.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:49 np0005603623 nova_compute[226235]: 2026-01-31 08:24:49.323 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:49 np0005603623 nova_compute[226235]: 2026-01-31 08:24:49.337 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:50.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:50.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:50 np0005603623 nova_compute[226235]: 2026-01-31 08:24:50.961 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:24:50 np0005603623 nova_compute[226235]: 2026-01-31 08:24:50.962 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:24:50 np0005603623 nova_compute[226235]: 2026-01-31 08:24:50.962 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:24:50 np0005603623 nova_compute[226235]: 2026-01-31 08:24:50.963 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:24:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:52.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:52.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:54 np0005603623 nova_compute[226235]: 2026-01-31 08:24:54.327 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:54 np0005603623 nova_compute[226235]: 2026-01-31 08:24:54.339 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:54.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:54.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:54 np0005603623 nova_compute[226235]: 2026-01-31 08:24:54.871 226239 DEBUG oslo_concurrency.lockutils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:54 np0005603623 nova_compute[226235]: 2026-01-31 08:24:54.873 226239 DEBUG oslo_concurrency.lockutils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:54 np0005603623 nova_compute[226235]: 2026-01-31 08:24:54.873 226239 INFO nova.compute.manager [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Shelving#033[00m
Jan 31 03:24:55 np0005603623 nova_compute[226235]: 2026-01-31 08:24:55.001 226239 DEBUG nova.virt.libvirt.driver [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:24:55 np0005603623 nova_compute[226235]: 2026-01-31 08:24:55.130 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating instance_info_cache with network_info: [{"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:24:55 np0005603623 nova_compute[226235]: 2026-01-31 08:24:55.419 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:24:55 np0005603623 nova_compute[226235]: 2026-01-31 08:24:55.420 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:24:55 np0005603623 nova_compute[226235]: 2026-01-31 08:24:55.421 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:55 np0005603623 nova_compute[226235]: 2026-01-31 08:24:55.421 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:55 np0005603623 nova_compute[226235]: 2026-01-31 08:24:55.421 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:55 np0005603623 nova_compute[226235]: 2026-01-31 08:24:55.421 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:56 np0005603623 nova_compute[226235]: 2026-01-31 08:24:56.049 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:56 np0005603623 nova_compute[226235]: 2026-01-31 08:24:56.050 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:56 np0005603623 nova_compute[226235]: 2026-01-31 08:24:56.119 226239 DEBUG nova.compute.manager [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:24:56 np0005603623 nova_compute[226235]: 2026-01-31 08:24:56.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:56 np0005603623 nova_compute[226235]: 2026-01-31 08:24:56.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:56 np0005603623 nova_compute[226235]: 2026-01-31 08:24:56.319 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:56 np0005603623 nova_compute[226235]: 2026-01-31 08:24:56.319 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:56 np0005603623 nova_compute[226235]: 2026-01-31 08:24:56.326 226239 DEBUG nova.virt.hardware [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:24:56 np0005603623 nova_compute[226235]: 2026-01-31 08:24:56.326 226239 INFO nova.compute.claims [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:24:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:56.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:24:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:56.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:24:57 np0005603623 nova_compute[226235]: 2026-01-31 08:24:57.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:57 np0005603623 nova_compute[226235]: 2026-01-31 08:24:57.430 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:24:57 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3470061622' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:24:57 np0005603623 nova_compute[226235]: 2026-01-31 08:24:57.845 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:57 np0005603623 nova_compute[226235]: 2026-01-31 08:24:57.851 226239 DEBUG nova.compute.provider_tree [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:24:58 np0005603623 nova_compute[226235]: 2026-01-31 08:24:58.020 226239 INFO nova.virt.libvirt.driver [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:24:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:24:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:58.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:24:58 np0005603623 kernel: tap19a4f194-55 (unregistering): left promiscuous mode
Jan 31 03:24:58 np0005603623 NetworkManager[48970]: <info>  [1769847898.5796] device (tap19a4f194-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:24:58 np0005603623 nova_compute[226235]: 2026-01-31 08:24:58.631 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:58 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:58Z|00475|binding|INFO|Releasing lport 19a4f194-5514-4b8e-b635-e6fe0255dc1a from this chassis (sb_readonly=0)
Jan 31 03:24:58 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:58Z|00476|binding|INFO|Setting lport 19a4f194-5514-4b8e-b635-e6fe0255dc1a down in Southbound
Jan 31 03:24:58 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:58Z|00477|binding|INFO|Removing iface tap19a4f194-55 ovn-installed in OVS
Jan 31 03:24:58 np0005603623 nova_compute[226235]: 2026-01-31 08:24:58.634 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:58 np0005603623 nova_compute[226235]: 2026-01-31 08:24:58.639 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:58 np0005603623 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000077.scope: Deactivated successfully.
Jan 31 03:24:58 np0005603623 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000077.scope: Consumed 14.232s CPU time.
Jan 31 03:24:58 np0005603623 systemd-machined[194379]: Machine qemu-53-instance-00000077 terminated.
Jan 31 03:24:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:24:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:58.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:58 np0005603623 nova_compute[226235]: 2026-01-31 08:24:58.765 226239 DEBUG nova.scheduler.client.report [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:24:58 np0005603623 kernel: tap19a4f194-55: entered promiscuous mode
Jan 31 03:24:58 np0005603623 kernel: tap19a4f194-55 (unregistering): left promiscuous mode
Jan 31 03:24:58 np0005603623 NetworkManager[48970]: <info>  [1769847898.8452] manager: (tap19a4f194-55): new Tun device (/org/freedesktop/NetworkManager/Devices/227)
Jan 31 03:24:58 np0005603623 nova_compute[226235]: 2026-01-31 08:24:58.849 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:58 np0005603623 nova_compute[226235]: 2026-01-31 08:24:58.858 226239 INFO nova.virt.libvirt.driver [-] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Instance destroyed successfully.#033[00m
Jan 31 03:24:58 np0005603623 nova_compute[226235]: 2026-01-31 08:24:58.859 226239 DEBUG nova.objects.instance [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'numa_topology' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:24:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:59Z|00478|binding|INFO|Releasing lport 044f1919-2550-4bba-9baa-5d3f39f69ec6 from this chassis (sb_readonly=0)
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.013 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:9f:8d 10.100.0.6'], port_security=['fa:16:3e:59:9f:8d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f3b36b5b-968c-4775-ac4f-93efc36f40ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6525247d-48b2-4359-a813-d7276403ba32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '491937de020742d7b4e847dc3bf57950', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fbd8fca4-628f-4f27-9bbb-a2cbce3d02c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.250'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c7370ba-0307-4b10-bef7-8ff686d828f1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=19a4f194-5514-4b8e-b635-e6fe0255dc1a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.014 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 19a4f194-5514-4b8e-b635-e6fe0255dc1a in datapath 6525247d-48b2-4359-a813-d7276403ba32 unbound from our chassis#033[00m
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.015 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6525247d-48b2-4359-a813-d7276403ba32, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.016 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[91ff419d-ce6e-4805-8e12-8fcf0a2d9d29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.016 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 namespace which is not needed anymore#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.016 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:24:59Z|00479|binding|INFO|Releasing lport 044f1919-2550-4bba-9baa-5d3f39f69ec6 from this chassis (sb_readonly=0)
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.078 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.145 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.826s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.146 226239 DEBUG nova.compute.manager [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:24:59 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[278619]: [NOTICE]   (278623) : haproxy version is 2.8.14-c23fe91
Jan 31 03:24:59 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[278619]: [NOTICE]   (278623) : path to executable is /usr/sbin/haproxy
Jan 31 03:24:59 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[278619]: [WARNING]  (278623) : Exiting Master process...
Jan 31 03:24:59 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[278619]: [WARNING]  (278623) : Exiting Master process...
Jan 31 03:24:59 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[278619]: [ALERT]    (278623) : Current worker (278625) exited with code 143 (Terminated)
Jan 31 03:24:59 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[278619]: [WARNING]  (278623) : All workers exited. Exiting... (0)
Jan 31 03:24:59 np0005603623 systemd[1]: libpod-b918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca.scope: Deactivated successfully.
Jan 31 03:24:59 np0005603623 podman[279131]: 2026-01-31 08:24:59.191406325 +0000 UTC m=+0.111331110 container died b918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:24:59 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca-userdata-shm.mount: Deactivated successfully.
Jan 31 03:24:59 np0005603623 systemd[1]: var-lib-containers-storage-overlay-b340db11277158f09f809355243ab1c467f67e8fb5691e2d8712274f1b1dc204-merged.mount: Deactivated successfully.
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.330 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.340 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:59 np0005603623 podman[279131]: 2026-01-31 08:24:59.345061002 +0000 UTC m=+0.264985787 container cleanup b918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:24:59 np0005603623 systemd[1]: libpod-conmon-b918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca.scope: Deactivated successfully.
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.358 226239 INFO nova.virt.libvirt.driver [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Beginning cold snapshot process#033[00m
Jan 31 03:24:59 np0005603623 podman[279164]: 2026-01-31 08:24:59.406992184 +0000 UTC m=+0.042174004 container remove b918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.410 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5b077eda-0ae2-4563-975e-8022e9c6e3f2]: (4, ('Sat Jan 31 08:24:59 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 (b918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca)\nb918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca\nSat Jan 31 08:24:59 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 (b918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca)\nb918744e90131e523a3d9ed2ee42999ee487e65f3060ea05cdd9a11de6f808ca\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.412 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c12a0d76-840f-482c-ac3f-79d5007a2c5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.413 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6525247d-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.415 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:59 np0005603623 kernel: tap6525247d-40: left promiscuous mode
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.423 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.426 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3c83b657-f580-445a-ab61-fa06653853fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.441 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3a3c2915-c9ec-4a7e-8bfd-079fda498113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.442 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[57a392c8-4290-41dc-9d06-5b18d8ea8f8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.455 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc617d4-97a1-4884-a3f1-94cdbcdf1440]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 698847, 'reachable_time': 29315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279181, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.458 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:24:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:24:59.458 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[605a91eb-85e9-4773-b95b-913e26d94c03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:59 np0005603623 systemd[1]: run-netns-ovnmeta\x2d6525247d\x2d48b2\x2d4359\x2da813\x2dd7276403ba32.mount: Deactivated successfully.
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.473 226239 DEBUG nova.compute.manager [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.474 226239 DEBUG nova.network.neutron [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.695 226239 DEBUG nova.compute.manager [req-2549ad60-672f-499c-ba00-fd0e89feab49 req-2162faa5-9509-4794-9acf-5fbf8a157bca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-vif-unplugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.695 226239 DEBUG oslo_concurrency.lockutils [req-2549ad60-672f-499c-ba00-fd0e89feab49 req-2162faa5-9509-4794-9acf-5fbf8a157bca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.696 226239 DEBUG oslo_concurrency.lockutils [req-2549ad60-672f-499c-ba00-fd0e89feab49 req-2162faa5-9509-4794-9acf-5fbf8a157bca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.696 226239 DEBUG oslo_concurrency.lockutils [req-2549ad60-672f-499c-ba00-fd0e89feab49 req-2162faa5-9509-4794-9acf-5fbf8a157bca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.696 226239 DEBUG nova.compute.manager [req-2549ad60-672f-499c-ba00-fd0e89feab49 req-2162faa5-9509-4794-9acf-5fbf8a157bca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] No waiting events found dispatching network-vif-unplugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.696 226239 WARNING nova.compute.manager [req-2549ad60-672f-499c-ba00-fd0e89feab49 req-2162faa5-9509-4794-9acf-5fbf8a157bca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received unexpected event network-vif-unplugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a for instance with vm_state active and task_state shelving.#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.801 226239 INFO nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:24:59 np0005603623 nova_compute[226235]: 2026-01-31 08:24:59.884 226239 DEBUG nova.compute.manager [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:25:00 np0005603623 nova_compute[226235]: 2026-01-31 08:25:00.103 226239 DEBUG nova.policy [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1d03198d8ab846bda092e089b2d5a6c7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:25:00 np0005603623 nova_compute[226235]: 2026-01-31 08:25:00.191 226239 DEBUG nova.virt.libvirt.imagebackend [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 31 03:25:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:25:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:00.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:25:00 np0005603623 nova_compute[226235]: 2026-01-31 08:25:00.476 226239 DEBUG nova.compute.manager [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:25:00 np0005603623 nova_compute[226235]: 2026-01-31 08:25:00.478 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:25:00 np0005603623 nova_compute[226235]: 2026-01-31 08:25:00.479 226239 INFO nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Creating image(s)#033[00m
Jan 31 03:25:00 np0005603623 nova_compute[226235]: 2026-01-31 08:25:00.506 226239 DEBUG nova.storage.rbd_utils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:25:00 np0005603623 nova_compute[226235]: 2026-01-31 08:25:00.532 226239 DEBUG nova.storage.rbd_utils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:25:00 np0005603623 nova_compute[226235]: 2026-01-31 08:25:00.557 226239 DEBUG nova.storage.rbd_utils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:25:00 np0005603623 nova_compute[226235]: 2026-01-31 08:25:00.562 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:00 np0005603623 nova_compute[226235]: 2026-01-31 08:25:00.588 226239 DEBUG nova.storage.rbd_utils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] creating snapshot(6ef5498d6eaa44a88a6a57ff9388aecf) on rbd image(f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:25:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:25:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:00.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:25:01 np0005603623 nova_compute[226235]: 2026-01-31 08:25:01.272 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.710s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
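The two processutils lines above show Nova probing the cached base image with `qemu-img info --force-share --output=json`, run under oslo.concurrency's prlimit wrapper to cap memory and CPU time. A minimal sketch of consuming that JSON output follows; the sample document is illustrative (shape only, values invented, not taken from this log):

```python
import json

# Illustrative `qemu-img info --output=json` output. The field names
# match qemu-img's JSON schema; the values here are made up.
sample = """
{
  "virtual-size": 117440512,
  "filename": "/var/lib/nova/instances/_base/b1c202d",
  "format": "raw",
  "actual-size": 21430272
}
"""

def parse_qemu_img_info(output: str) -> dict:
    """Extract the fields Nova cares about from qemu-img's JSON report."""
    info = json.loads(output)
    return {
        "format": info["format"],
        # size the guest sees, in bytes (distinct from on-disk actual-size)
        "virtual_size": info["virtual-size"],
    }

result = parse_qemu_img_info(sample)
```

The JSON keys (`virtual-size`, `actual-size`, `format`, `filename`) are qemu-img's own; the helper name is hypothetical.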
Jan 31 03:25:01 np0005603623 nova_compute[226235]: 2026-01-31 08:25:01.273 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:01 np0005603623 nova_compute[226235]: 2026-01-31 08:25:01.274 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:01 np0005603623 nova_compute[226235]: 2026-01-31 08:25:01.274 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:01 np0005603623 nova_compute[226235]: 2026-01-31 08:25:01.296 226239 DEBUG nova.storage.rbd_utils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:25:01 np0005603623 nova_compute[226235]: 2026-01-31 08:25:01.299 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
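The `rbd import` command above is how Nova's RBD image backend copies the cached base image into the Ceph `vms` pool as the new instance disk. A sketch of assembling that exact argv (the helper name is hypothetical; the real code shells out from nova/storage/rbd_utils.py):

```python
def build_rbd_import_cmd(pool, base_path, image_name, ceph_user, conf):
    """Assemble the `rbd import` argv seen in the log (a sketch)."""
    return [
        "rbd", "import",
        "--pool", pool,
        base_path, image_name,
        "--image-format=2",  # format 2 is required for snapshots/clones
        "--id", ceph_user,
        "--conf", conf,
    ]

cmd = build_rbd_import_cmd(
    "vms",
    "/var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6",
    "08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk",
    "openstack",
    "/etc/ceph/ceph.conf",
)
```

Note the later CMD-returned line reports this import taking 3.839s, the slowest single step in this spawn.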
Jan 31 03:25:02 np0005603623 nova_compute[226235]: 2026-01-31 08:25:02.158 226239 DEBUG nova.compute.manager [req-a8b26861-1cfb-42e0-8b0e-39d31840c39f req-b70d6072-800b-4b22-9c09-35e810c1dd1a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:02 np0005603623 nova_compute[226235]: 2026-01-31 08:25:02.158 226239 DEBUG oslo_concurrency.lockutils [req-a8b26861-1cfb-42e0-8b0e-39d31840c39f req-b70d6072-800b-4b22-9c09-35e810c1dd1a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:02 np0005603623 nova_compute[226235]: 2026-01-31 08:25:02.159 226239 DEBUG oslo_concurrency.lockutils [req-a8b26861-1cfb-42e0-8b0e-39d31840c39f req-b70d6072-800b-4b22-9c09-35e810c1dd1a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:02 np0005603623 nova_compute[226235]: 2026-01-31 08:25:02.159 226239 DEBUG oslo_concurrency.lockutils [req-a8b26861-1cfb-42e0-8b0e-39d31840c39f req-b70d6072-800b-4b22-9c09-35e810c1dd1a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:02 np0005603623 nova_compute[226235]: 2026-01-31 08:25:02.159 226239 DEBUG nova.compute.manager [req-a8b26861-1cfb-42e0-8b0e-39d31840c39f req-b70d6072-800b-4b22-9c09-35e810c1dd1a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] No waiting events found dispatching network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:25:02 np0005603623 nova_compute[226235]: 2026-01-31 08:25:02.160 226239 WARNING nova.compute.manager [req-a8b26861-1cfb-42e0-8b0e-39d31840c39f req-b70d6072-800b-4b22-9c09-35e810c1dd1a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received unexpected event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a for instance with vm_state active and task_state shelving_image_uploading.#033[00m
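The sequence above (lock acquired, "No waiting events found", then the WARNING) is Nova's external-event dispatch: a network-vif-plugged event from Neutron only has a consumer if some compute task registered a waiter for it first; here the instance is mid-shelve, so nobody is waiting. A toy model of that pop-under-lock behavior (simplified; names mirror but do not reproduce nova.compute.manager.InstanceEvents):

```python
import threading

class InstanceEvents:
    """Toy waiter registry: external events are 'unexpected' unless a
    task registered interest before the event arrived."""

    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}  # (instance_uuid, event_name) -> waiter

    def register(self, instance_uuid, event_name, waiter):
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = waiter

    def pop(self, instance_uuid, event_name):
        # Mirrors the acquire/release pair logged around _pop_event
        with self._lock:
            return self._waiters.pop((instance_uuid, event_name), None)

events = InstanceEvents()
# No waiter registered -> the event is "unexpected", as warned above.
unexpected = events.pop("f3b36b5b", "network-vif-plugged") is None
```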
Jan 31 03:25:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:02.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:02.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:02 np0005603623 nova_compute[226235]: 2026-01-31 08:25:02.837 226239 DEBUG nova.network.neutron [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Successfully created port: d2ecd824-47f7-4503-a326-2007f56a02e7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:25:04 np0005603623 nova_compute[226235]: 2026-01-31 08:25:04.335 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:04 np0005603623 nova_compute[226235]: 2026-01-31 08:25:04.342 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:04.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:04.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e274 e274: 3 total, 3 up, 3 in
Jan 31 03:25:05 np0005603623 nova_compute[226235]: 2026-01-31 08:25:05.138 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.839s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:05 np0005603623 nova_compute[226235]: 2026-01-31 08:25:05.279 226239 DEBUG nova.storage.rbd_utils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] resizing rbd image 08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
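The resize target 1073741824 above is just the flavor's root disk converted to bytes: the m1.nano flavor later in this log has root_gb=1, and the imported base image is smaller than that, so Nova grows the RBD image up to the flavor size. The arithmetic:

```python
def root_disk_bytes(root_gb: int) -> int:
    """Flavor root_gb -> bytes (1 GiB = 1024**3 bytes), the value Nova
    passes to the RBD resize call."""
    return root_gb * 1024 ** 3

size = root_disk_bytes(1)  # m1.nano: root_gb=1
```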
Jan 31 03:25:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:06.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:06.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:07 np0005603623 nova_compute[226235]: 2026-01-31 08:25:07.052 226239 DEBUG nova.network.neutron [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Successfully updated port: d2ecd824-47f7-4503-a326-2007f56a02e7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:25:07 np0005603623 nova_compute[226235]: 2026-01-31 08:25:07.327 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:25:07 np0005603623 nova_compute[226235]: 2026-01-31 08:25:07.328 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:25:07 np0005603623 nova_compute[226235]: 2026-01-31 08:25:07.328 226239 DEBUG nova.network.neutron [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:25:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:08 np0005603623 nova_compute[226235]: 2026-01-31 08:25:08.257 226239 DEBUG nova.objects.instance [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'migration_context' on Instance uuid 08dc30a4-60ae-4f4e-8b5e-e12610df2120 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:08 np0005603623 nova_compute[226235]: 2026-01-31 08:25:08.437 226239 DEBUG nova.storage.rbd_utils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] cloning vms/f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk@6ef5498d6eaa44a88a6a57ff9388aecf to images/81322d27-2584-47ad-bfcd-7642da2c770e clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:25:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:08.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:08.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:08 np0005603623 nova_compute[226235]: 2026-01-31 08:25:08.830 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:25:08 np0005603623 nova_compute[226235]: 2026-01-31 08:25:08.831 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Ensure instance console log exists: /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:25:08 np0005603623 nova_compute[226235]: 2026-01-31 08:25:08.831 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:08 np0005603623 nova_compute[226235]: 2026-01-31 08:25:08.831 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:08 np0005603623 nova_compute[226235]: 2026-01-31 08:25:08.832 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:09 np0005603623 nova_compute[226235]: 2026-01-31 08:25:09.259 226239 DEBUG nova.network.neutron [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:25:09 np0005603623 nova_compute[226235]: 2026-01-31 08:25:09.339 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:09 np0005603623 nova_compute[226235]: 2026-01-31 08:25:09.343 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:09 np0005603623 nova_compute[226235]: 2026-01-31 08:25:09.839 226239 DEBUG nova.compute.manager [req-ec02fe32-218d-42d6-97ff-f01f5038622c req-135d749e-9e62-4e12-af54-c472b431d68e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-changed-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:09 np0005603623 nova_compute[226235]: 2026-01-31 08:25:09.839 226239 DEBUG nova.compute.manager [req-ec02fe32-218d-42d6-97ff-f01f5038622c req-135d749e-9e62-4e12-af54-c472b431d68e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Refreshing instance network info cache due to event network-changed-d2ecd824-47f7-4503-a326-2007f56a02e7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:25:09 np0005603623 nova_compute[226235]: 2026-01-31 08:25:09.839 226239 DEBUG oslo_concurrency.lockutils [req-ec02fe32-218d-42d6-97ff-f01f5038622c req-135d749e-9e62-4e12-af54-c472b431d68e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:25:10 np0005603623 nova_compute[226235]: 2026-01-31 08:25:10.297 226239 DEBUG nova.storage.rbd_utils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] flattening images/81322d27-2584-47ad-bfcd-7642da2c770e flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
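Interleaved with the spawn, the req-ff20c239 request traces Nova's direct-to-Ceph snapshot upload (here driven by the shelve seen earlier): snapshot the instance disk in the vms pool, clone that snapshot into the images pool under the new Glance image ID, then flatten the clone so it no longer depends on the parent snapshot. A sketch of the ordered operations (the helper is illustrative, not Nova's API):

```python
def rbd_snapshot_upload_steps(src_pool, disk, snap, dst_pool, dest):
    """Ordered RBD operations mirroring the create_snap -> clone ->
    flatten sequence in the log (a sketch)."""
    return [
        ("create_snap", f"{src_pool}/{disk}@{snap}"),
        ("clone", f"{src_pool}/{disk}@{snap}", f"{dst_pool}/{dest}"),
        # flatten copies the parent data into the clone so it stands
        # alone and the source snapshot can later be removed
        ("flatten", f"{dst_pool}/{dest}"),
    ]

steps = rbd_snapshot_upload_steps(
    "vms", "f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk",
    "6ef5498d6eaa44a88a6a57ff9388aecf",
    "images", "81322d27-2584-47ad-bfcd-7642da2c770e",
)
```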
Jan 31 03:25:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:25:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:10.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:25:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:10.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.547 226239 DEBUG nova.network.neutron [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance_info_cache with network_info: [{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.852 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.854 226239 DEBUG nova.compute.manager [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance network_info: |[{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.854 226239 DEBUG oslo_concurrency.lockutils [req-ec02fe32-218d-42d6-97ff-f01f5038622c req-135d749e-9e62-4e12-af54-c472b431d68e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.854 226239 DEBUG nova.network.neutron [req-ec02fe32-218d-42d6-97ff-f01f5038622c req-135d749e-9e62-4e12-af54-c472b431d68e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Refreshing network info cache for port d2ecd824-47f7-4503-a326-2007f56a02e7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.858 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Start _get_guest_xml network_info=[{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.862 226239 WARNING nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.870 226239 DEBUG nova.virt.libvirt.host [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.871 226239 DEBUG nova.virt.libvirt.host [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.874 226239 DEBUG nova.virt.libvirt.host [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.875 226239 DEBUG nova.virt.libvirt.host [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.876 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.877 226239 DEBUG nova.virt.hardware [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.877 226239 DEBUG nova.virt.hardware [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.878 226239 DEBUG nova.virt.hardware [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.878 226239 DEBUG nova.virt.hardware [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.878 226239 DEBUG nova.virt.hardware [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.878 226239 DEBUG nova.virt.hardware [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.878 226239 DEBUG nova.virt.hardware [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.879 226239 DEBUG nova.virt.hardware [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.879 226239 DEBUG nova.virt.hardware [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.879 226239 DEBUG nova.virt.hardware [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.879 226239 DEBUG nova.virt.hardware [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:25:11 np0005603623 nova_compute[226235]: 2026-01-31 08:25:11.882 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:25:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2760648852' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:25:12 np0005603623 nova_compute[226235]: 2026-01-31 08:25:12.344 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:12 np0005603623 nova_compute[226235]: 2026-01-31 08:25:12.372 226239 DEBUG nova.storage.rbd_utils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:25:12 np0005603623 nova_compute[226235]: 2026-01-31 08:25:12.378 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:12.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:12.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:25:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/735466658' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.250 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.872s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.251 226239 DEBUG nova.virt.libvirt.vif [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:24:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1941034472',display_name='tempest-ServerActionsTestJSON-server-1941034472',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1941034472',id=120,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-ra3xi6gb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:25:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=08dc30a4-60ae-4f4e-8b5e-e12610df2120,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.252 226239 DEBUG nova.network.os_vif_util [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.253 226239 DEBUG nova.network.os_vif_util [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.254 226239 DEBUG nova.objects.instance [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid 08dc30a4-60ae-4f4e-8b5e-e12610df2120 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.280 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  <uuid>08dc30a4-60ae-4f4e-8b5e-e12610df2120</uuid>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  <name>instance-00000078</name>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsTestJSON-server-1941034472</nova:name>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:25:11</nova:creationTime>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <nova:user uuid="1d03198d8ab846bda092e089b2d5a6c7">tempest-ServerActionsTestJSON-1873947453-project-member</nova:user>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <nova:project uuid="5b87da3b3f42494f96baeeeaf60b54df">tempest-ServerActionsTestJSON-1873947453</nova:project>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <nova:port uuid="d2ecd824-47f7-4503-a326-2007f56a02e7">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <entry name="serial">08dc30a4-60ae-4f4e-8b5e-e12610df2120</entry>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <entry name="uuid">08dc30a4-60ae-4f4e-8b5e-e12610df2120</entry>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk.config">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:f9:0c:a9"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <target dev="tapd2ecd824-47"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/console.log" append="off"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:25:13 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:25:13 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:25:13 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:25:13 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.281 226239 DEBUG nova.compute.manager [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Preparing to wait for external event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.281 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.282 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.282 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.283 226239 DEBUG nova.virt.libvirt.vif [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:24:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1941034472',display_name='tempest-ServerActionsTestJSON-server-1941034472',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1941034472',id=120,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-ra3xi6gb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:25:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=08dc30a4-60ae-4f4e-8b5e-e12610df2120,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.283 226239 DEBUG nova.network.os_vif_util [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.284 226239 DEBUG nova.network.os_vif_util [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.284 226239 DEBUG os_vif [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.285 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.285 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.286 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.288 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.288 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2ecd824-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.288 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd2ecd824-47, col_values=(('external_ids', {'iface-id': 'd2ecd824-47f7-4503-a326-2007f56a02e7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:0c:a9', 'vm-uuid': '08dc30a4-60ae-4f4e-8b5e-e12610df2120'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.290 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:13 np0005603623 NetworkManager[48970]: <info>  [1769847913.2917] manager: (tapd2ecd824-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/228)
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.292 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.296 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.297 226239 INFO os_vif [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47')#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.859 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847898.8569212, f3b36b5b-968c-4775-ac4f-93efc36f40ac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:25:13 np0005603623 nova_compute[226235]: 2026-01-31 08:25:13.860 226239 INFO nova.compute.manager [-] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:25:14 np0005603623 nova_compute[226235]: 2026-01-31 08:25:14.344 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:14.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:14.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:15 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:15 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:15 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:16.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:16.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:16 np0005603623 nova_compute[226235]: 2026-01-31 08:25:16.817 226239 DEBUG nova.compute.manager [None req-fdd755be-a17a-4eb0-8f25-bcf428c87a31 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:16 np0005603623 nova_compute[226235]: 2026-01-31 08:25:16.820 226239 DEBUG nova.compute.manager [None req-fdd755be-a17a-4eb0-8f25-bcf428c87a31 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: shelving_image_uploading, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:25:16 np0005603623 nova_compute[226235]: 2026-01-31 08:25:16.915 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:25:16 np0005603623 nova_compute[226235]: 2026-01-31 08:25:16.916 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:25:16 np0005603623 nova_compute[226235]: 2026-01-31 08:25:16.916 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No VIF found with MAC fa:16:3e:f9:0c:a9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:25:16 np0005603623 nova_compute[226235]: 2026-01-31 08:25:16.916 226239 INFO nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Using config drive#033[00m
Jan 31 03:25:16 np0005603623 nova_compute[226235]: 2026-01-31 08:25:16.939 226239 DEBUG nova.storage.rbd_utils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:25:17 np0005603623 nova_compute[226235]: 2026-01-31 08:25:17.022 226239 INFO nova.compute.manager [None req-fdd755be-a17a-4eb0-8f25-bcf428c87a31 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] During sync_power_state the instance has a pending task (shelving_image_uploading). Skip.#033[00m
Jan 31 03:25:17 np0005603623 nova_compute[226235]: 2026-01-31 08:25:17.537 226239 INFO nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Creating config drive at /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/disk.config#033[00m
Jan 31 03:25:17 np0005603623 nova_compute[226235]: 2026-01-31 08:25:17.541 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3byy0kcv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 03:25:17 np0005603623 nova_compute[226235]: 2026-01-31 08:25:17.664 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3byy0kcv" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:17 np0005603623 podman[279861]: 2026-01-31 08:25:17.967365104 +0000 UTC m=+0.054405936 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:25:17 np0005603623 nova_compute[226235]: 2026-01-31 08:25:17.968 226239 DEBUG nova.storage.rbd_utils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:25:17 np0005603623 nova_compute[226235]: 2026-01-31 08:25:17.973 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/disk.config 08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:18 np0005603623 podman[279862]: 2026-01-31 08:25:18.02527679 +0000 UTC m=+0.113285443 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 31 03:25:18 np0005603623 nova_compute[226235]: 2026-01-31 08:25:18.305 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:18.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:18 np0005603623 nova_compute[226235]: 2026-01-31 08:25:18.481 226239 DEBUG nova.storage.rbd_utils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] removing snapshot(6ef5498d6eaa44a88a6a57ff9388aecf) on rbd image(f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:25:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 03:25:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 03:25:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:25:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:25:18 np0005603623 nova_compute[226235]: 2026-01-31 08:25:18.699 226239 DEBUG nova.network.neutron [req-ec02fe32-218d-42d6-97ff-f01f5038622c req-135d749e-9e62-4e12-af54-c472b431d68e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updated VIF entry in instance network info cache for port d2ecd824-47f7-4503-a326-2007f56a02e7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:25:18 np0005603623 nova_compute[226235]: 2026-01-31 08:25:18.700 226239 DEBUG nova.network.neutron [req-ec02fe32-218d-42d6-97ff-f01f5038622c req-135d749e-9e62-4e12-af54-c472b431d68e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance_info_cache with network_info: [{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:25:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:18.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:18 np0005603623 nova_compute[226235]: 2026-01-31 08:25:18.958 226239 DEBUG oslo_concurrency.lockutils [req-ec02fe32-218d-42d6-97ff-f01f5038622c req-135d749e-9e62-4e12-af54-c472b431d68e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:25:18 np0005603623 nova_compute[226235]: 2026-01-31 08:25:18.969 226239 DEBUG oslo_concurrency.processutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/disk.config 08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.996s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:18 np0005603623 nova_compute[226235]: 2026-01-31 08:25:18.969 226239 INFO nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Deleting local config drive /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/disk.config because it was imported into RBD.#033[00m
Jan 31 03:25:19 np0005603623 kernel: tapd2ecd824-47: entered promiscuous mode
Jan 31 03:25:19 np0005603623 NetworkManager[48970]: <info>  [1769847919.0199] manager: (tapd2ecd824-47): new Tun device (/org/freedesktop/NetworkManager/Devices/229)
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.023 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:25:19Z|00480|binding|INFO|Claiming lport d2ecd824-47f7-4503-a326-2007f56a02e7 for this chassis.
Jan 31 03:25:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:25:19Z|00481|binding|INFO|d2ecd824-47f7-4503-a326-2007f56a02e7: Claiming fa:16:3e:f9:0c:a9 10.100.0.5
Jan 31 03:25:19 np0005603623 systemd-udevd[279967]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:25:19 np0005603623 systemd-machined[194379]: New machine qemu-54-instance-00000078.
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.053 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:19 np0005603623 NetworkManager[48970]: <info>  [1769847919.0583] device (tapd2ecd824-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:25:19 np0005603623 NetworkManager[48970]: <info>  [1769847919.0591] device (tapd2ecd824-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:25:19 np0005603623 systemd[1]: Started Virtual Machine qemu-54-instance-00000078.
Jan 31 03:25:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:25:19Z|00482|binding|INFO|Setting lport d2ecd824-47f7-4503-a326-2007f56a02e7 ovn-installed in OVS
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.064 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:25:19Z|00483|binding|INFO|Setting lport d2ecd824-47f7-4503-a326-2007f56a02e7 up in Southbound
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.075 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:0c:a9 10.100.0.5'], port_security=['fa:16:3e:f9:0c:a9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '08dc30a4-60ae-4f4e-8b5e-e12610df2120', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=d2ecd824-47f7-4503-a326-2007f56a02e7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.077 143258 INFO neutron.agent.ovn.metadata.agent [-] Port d2ecd824-47f7-4503-a326-2007f56a02e7 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 bound to our chassis#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.078 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.085 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0de9bdd7-46c1-4281-8f23-bda79c57dc55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.086 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.087 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.087 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[275e39d6-0d3c-42a7-b42c-db0c67c7ede9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.088 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b86f5d-e6dc-4550-8503-fdc4aaea8b78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.096 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[4d5cf3f5-d45f-4f15-8c9e-5210367c97d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.106 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[695bdcc5-f28a-47fb-9579-26f3615b90e8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.121 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6f3d65cc-8989-4463-92ad-43bf9bdbcce5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.126 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[97892dab-6be7-4b97-abf8-ac5f31ac5121]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 NetworkManager[48970]: <info>  [1769847919.1270] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/230)
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.148 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd2f0b7-6202-4456-8541-e01a364f7f14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.151 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f87506f1-fe53-400d-9179-d6edad9024bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 NetworkManager[48970]: <info>  [1769847919.1678] device (tap1186b71b-00): carrier: link connected
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.173 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[2efeea33-eb63-4c4d-8840-931773129744]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.183 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[681916ea-8c3d-48e8-bed3-2e9c5079c7c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705959, 'reachable_time': 27335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280006, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.192 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8fca35b5-0983-4fa7-8af5-c877b751815d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 705959, 'tstamp': 705959}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280007, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.204 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e184c41d-5f50-435c-9bf7-f8217b3ed537]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705959, 'reachable_time': 27335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 280008, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.226 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fcf88fbe-dc81-4695-bf71-48889ccaff35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.266 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c7002320-af84-48f8-b2d7-6f99c61c97bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.268 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.268 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.268 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.270 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:19 np0005603623 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:25:19 np0005603623 NetworkManager[48970]: <info>  [1769847919.2713] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.274 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.275 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.276 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:25:19Z|00484|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.282 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.284 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.286 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.287 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e9fdcd98-0a60-4758-b139-a4fb9c21be08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.288 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:25:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:19.289 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.346 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.429 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847919.429211, 08dc30a4-60ae-4f4e-8b5e-e12610df2120 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.430 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] VM Started (Lifecycle Event)#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.531 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.534 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847919.4304316, 08dc30a4-60ae-4f4e-8b5e-e12610df2120 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.535 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:25:19 np0005603623 podman[280087]: 2026-01-31 08:25:19.588026281 +0000 UTC m=+0.018642287 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.709 226239 DEBUG nova.compute.manager [req-76ad5815-be74-40bd-a857-fe6ec4ecd73d req-428b92d4-58cd-4068-8ee6-9a9d80d86184 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.710 226239 DEBUG oslo_concurrency.lockutils [req-76ad5815-be74-40bd-a857-fe6ec4ecd73d req-428b92d4-58cd-4068-8ee6-9a9d80d86184 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.711 226239 DEBUG oslo_concurrency.lockutils [req-76ad5815-be74-40bd-a857-fe6ec4ecd73d req-428b92d4-58cd-4068-8ee6-9a9d80d86184 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.711 226239 DEBUG oslo_concurrency.lockutils [req-76ad5815-be74-40bd-a857-fe6ec4ecd73d req-428b92d4-58cd-4068-8ee6-9a9d80d86184 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.712 226239 DEBUG nova.compute.manager [req-76ad5815-be74-40bd-a857-fe6ec4ecd73d req-428b92d4-58cd-4068-8ee6-9a9d80d86184 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Processing event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.713 226239 DEBUG nova.compute.manager [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.716 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.719 226239 INFO nova.virt.libvirt.driver [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance spawned successfully.#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.719 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.741 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.745 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769847919.716579, 08dc30a4-60ae-4f4e-8b5e-e12610df2120 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.745 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.843 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.847 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.863 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.864 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.864 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.864 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.865 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:25:19 np0005603623 nova_compute[226235]: 2026-01-31 08:25:19.865 226239 DEBUG nova.virt.libvirt.driver [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:25:20 np0005603623 nova_compute[226235]: 2026-01-31 08:25:20.122 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:25:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:20.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:20 np0005603623 nova_compute[226235]: 2026-01-31 08:25:20.470 226239 INFO nova.compute.manager [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Took 19.99 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:25:20 np0005603623 nova_compute[226235]: 2026-01-31 08:25:20.471 226239 DEBUG nova.compute.manager [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:20.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e275 e275: 3 total, 3 up, 3 in
Jan 31 03:25:20 np0005603623 nova_compute[226235]: 2026-01-31 08:25:20.786 226239 INFO nova.compute.manager [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Took 24.50 seconds to build instance.#033[00m
Jan 31 03:25:20 np0005603623 nova_compute[226235]: 2026-01-31 08:25:20.970 226239 DEBUG oslo_concurrency.lockutils [None req-ec188ab2-1357-43d0-9b43-f789f39214d9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.920s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:21 np0005603623 nova_compute[226235]: 2026-01-31 08:25:21.014 226239 DEBUG nova.storage.rbd_utils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] creating snapshot(snap) on rbd image(81322d27-2584-47ad-bfcd-7642da2c770e) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:25:21 np0005603623 podman[280087]: 2026-01-31 08:25:21.050102394 +0000 UTC m=+1.480718380 container create 3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:25:21 np0005603623 systemd[1]: Started libpod-conmon-3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775.scope.
Jan 31 03:25:21 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:25:21 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5f2ba640471292cadf74528f48ddd04fa54a9092df237e6ea583319a2e33134/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:25:22 np0005603623 podman[280087]: 2026-01-31 08:25:22.26598132 +0000 UTC m=+2.696597326 container init 3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Jan 31 03:25:22 np0005603623 podman[280087]: 2026-01-31 08:25:22.273345131 +0000 UTC m=+2.703961117 container start 3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:25:22 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[280122]: [NOTICE]   (280126) : New worker (280128) forked
Jan 31 03:25:22 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[280122]: [NOTICE]   (280126) : Loading success.
Jan 31 03:25:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:22.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:22 np0005603623 nova_compute[226235]: 2026-01-31 08:25:22.675 226239 DEBUG nova.compute.manager [req-97729638-c547-4a83-a184-cea0f89b5dbe req-49949807-fc6b-43a8-a065-8e6448e7a34b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:22 np0005603623 nova_compute[226235]: 2026-01-31 08:25:22.675 226239 DEBUG oslo_concurrency.lockutils [req-97729638-c547-4a83-a184-cea0f89b5dbe req-49949807-fc6b-43a8-a065-8e6448e7a34b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:22 np0005603623 nova_compute[226235]: 2026-01-31 08:25:22.675 226239 DEBUG oslo_concurrency.lockutils [req-97729638-c547-4a83-a184-cea0f89b5dbe req-49949807-fc6b-43a8-a065-8e6448e7a34b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:22 np0005603623 nova_compute[226235]: 2026-01-31 08:25:22.676 226239 DEBUG oslo_concurrency.lockutils [req-97729638-c547-4a83-a184-cea0f89b5dbe req-49949807-fc6b-43a8-a065-8e6448e7a34b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:22 np0005603623 nova_compute[226235]: 2026-01-31 08:25:22.676 226239 DEBUG nova.compute.manager [req-97729638-c547-4a83-a184-cea0f89b5dbe req-49949807-fc6b-43a8-a065-8e6448e7a34b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:25:22 np0005603623 nova_compute[226235]: 2026-01-31 08:25:22.676 226239 WARNING nova.compute.manager [req-97729638-c547-4a83-a184-cea0f89b5dbe req-49949807-fc6b-43a8-a065-8e6448e7a34b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:25:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:22.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:23 np0005603623 nova_compute[226235]: 2026-01-31 08:25:23.309 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:24 np0005603623 nova_compute[226235]: 2026-01-31 08:25:24.348 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:24.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:24.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #100. Immutable memtables: 0.
Jan 31 03:25:25 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:25.048241) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:25:25 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 61] Flushing memtable with next log file: 100
Jan 31 03:25:25 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847925048349, "job": 61, "event": "flush_started", "num_memtables": 1, "num_entries": 2482, "num_deletes": 253, "total_data_size": 6144851, "memory_usage": 6219104, "flush_reason": "Manual Compaction"}
Jan 31 03:25:25 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 61] Level-0 flush table #101: started
Jan 31 03:25:25 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847925290087, "cf_name": "default", "job": 61, "event": "table_file_creation", "file_number": 101, "file_size": 3969027, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 50158, "largest_seqno": 52635, "table_properties": {"data_size": 3958817, "index_size": 6512, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21416, "raw_average_key_size": 20, "raw_value_size": 3938382, "raw_average_value_size": 3812, "num_data_blocks": 282, "num_entries": 1033, "num_filter_entries": 1033, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847719, "oldest_key_time": 1769847719, "file_creation_time": 1769847925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 101, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:25:25 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 61] Flush lasted 241902 microseconds, and 5892 cpu microseconds.
Jan 31 03:25:25 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:25:25 np0005603623 NetworkManager[48970]: <info>  [1769847925.4300] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Jan 31 03:25:25 np0005603623 NetworkManager[48970]: <info>  [1769847925.4307] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Jan 31 03:25:25 np0005603623 nova_compute[226235]: 2026-01-31 08:25:25.429 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:25 np0005603623 nova_compute[226235]: 2026-01-31 08:25:25.448 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:25 np0005603623 ovn_controller[133449]: 2026-01-31T08:25:25Z|00485|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:25:25 np0005603623 nova_compute[226235]: 2026-01-31 08:25:25.465 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:25 np0005603623 nova_compute[226235]: 2026-01-31 08:25:25.950 226239 DEBUG nova.compute.manager [req-44dc6c91-ab2c-4e80-b826-0d559657cf06 req-ae6432fe-793e-472f-afec-4a3500b70840 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-changed-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:25 np0005603623 nova_compute[226235]: 2026-01-31 08:25:25.950 226239 DEBUG nova.compute.manager [req-44dc6c91-ab2c-4e80-b826-0d559657cf06 req-ae6432fe-793e-472f-afec-4a3500b70840 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Refreshing instance network info cache due to event network-changed-d2ecd824-47f7-4503-a326-2007f56a02e7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:25:25 np0005603623 nova_compute[226235]: 2026-01-31 08:25:25.951 226239 DEBUG oslo_concurrency.lockutils [req-44dc6c91-ab2c-4e80-b826-0d559657cf06 req-ae6432fe-793e-472f-afec-4a3500b70840 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:25:25 np0005603623 nova_compute[226235]: 2026-01-31 08:25:25.951 226239 DEBUG oslo_concurrency.lockutils [req-44dc6c91-ab2c-4e80-b826-0d559657cf06 req-ae6432fe-793e-472f-afec-4a3500b70840 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:25:25 np0005603623 nova_compute[226235]: 2026-01-31 08:25:25.951 226239 DEBUG nova.network.neutron [req-44dc6c91-ab2c-4e80-b826-0d559657cf06 req-ae6432fe-793e-472f-afec-4a3500b70840 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Refreshing network info cache for port d2ecd824-47f7-4503-a326-2007f56a02e7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:25.290146) [db/flush_job.cc:967] [default] [JOB 61] Level-0 flush table #101: 3969027 bytes OK
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:25.290165) [db/memtable_list.cc:519] [default] Level-0 commit table #101 started
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:26.060732) [db/memtable_list.cc:722] [default] Level-0 commit table #101: memtable #1 done
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:26.060821) EVENT_LOG_v1 {"time_micros": 1769847926060811, "job": 61, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:26.060846) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 61] Try to delete WAL files size 6133836, prev total WAL file size 6156974, number of live WAL files 2.
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000097.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:26.061951) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034323637' seq:72057594037927935, type:22 .. '7061786F730034353139' seq:0, type:0; will stop at (end)
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 62] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 61 Base level 0, inputs: [101(3876KB)], [99(9289KB)]
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847926061990, "job": 62, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [101], "files_L6": [99], "score": -1, "input_data_size": 13480998, "oldest_snapshot_seqno": -1}
Jan 31 03:25:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:26.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e276 e276: 3 total, 3 up, 3 in
Jan 31 03:25:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:26.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 62] Generated table #102: 7692 keys, 11541069 bytes, temperature: kUnknown
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847926946986, "cf_name": "default", "job": 62, "event": "table_file_creation", "file_number": 102, "file_size": 11541069, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11490094, "index_size": 30639, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19269, "raw_key_size": 198544, "raw_average_key_size": 25, "raw_value_size": 11353691, "raw_average_value_size": 1476, "num_data_blocks": 1207, "num_entries": 7692, "num_filter_entries": 7692, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769847926, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 102, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:25:26 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:26.947342) [db/compaction/compaction_job.cc:1663] [default] [JOB 62] Compacted 1@0 + 1@6 files to L6 => 11541069 bytes
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:27.449205) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 15.2 rd, 13.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 9.1 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(6.3) write-amplify(2.9) OK, records in: 8218, records dropped: 526 output_compression: NoCompression
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:27.449246) EVENT_LOG_v1 {"time_micros": 1769847927449230, "job": 62, "event": "compaction_finished", "compaction_time_micros": 885172, "compaction_time_cpu_micros": 25583, "output_level": 6, "num_output_files": 1, "total_output_size": 11541069, "num_input_records": 8218, "num_output_records": 7692, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000101.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847927450416, "job": 62, "event": "table_file_deletion", "file_number": 101}
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000099.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847927452805, "job": 62, "event": "table_file_deletion", "file_number": 99}
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:26.061850) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:27.452941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:27.452947) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:27.452949) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:27.452951) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:25:27 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:25:27.452953) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:25:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:28 np0005603623 nova_compute[226235]: 2026-01-31 08:25:28.311 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:28.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:28.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:29 np0005603623 nova_compute[226235]: 2026-01-31 08:25:29.351 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:30 np0005603623 nova_compute[226235]: 2026-01-31 08:25:30.006 226239 DEBUG nova.network.neutron [req-44dc6c91-ab2c-4e80-b826-0d559657cf06 req-ae6432fe-793e-472f-afec-4a3500b70840 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updated VIF entry in instance network info cache for port d2ecd824-47f7-4503-a326-2007f56a02e7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:25:30 np0005603623 nova_compute[226235]: 2026-01-31 08:25:30.007 226239 DEBUG nova.network.neutron [req-44dc6c91-ab2c-4e80-b826-0d559657cf06 req-ae6432fe-793e-472f-afec-4a3500b70840 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance_info_cache with network_info: [{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:25:30 np0005603623 nova_compute[226235]: 2026-01-31 08:25:30.063 226239 DEBUG oslo_concurrency.lockutils [req-44dc6c91-ab2c-4e80-b826-0d559657cf06 req-ae6432fe-793e-472f-afec-4a3500b70840 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:25:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:30.118 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:30.118 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:25:30.119 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:25:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:30.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:25:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:30.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e277 e277: 3 total, 3 up, 3 in
Jan 31 03:25:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:32.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:32.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:33 np0005603623 nova_compute[226235]: 2026-01-31 08:25:33.313 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:34 np0005603623 nova_compute[226235]: 2026-01-31 08:25:34.353 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:34.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:34.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:35 np0005603623 nova_compute[226235]: 2026-01-31 08:25:35.962 226239 INFO nova.virt.libvirt.driver [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Snapshot image upload complete#033[00m
Jan 31 03:25:35 np0005603623 nova_compute[226235]: 2026-01-31 08:25:35.963 226239 DEBUG nova.compute.manager [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:36 np0005603623 nova_compute[226235]: 2026-01-31 08:25:36.062 226239 INFO nova.compute.manager [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Shelve offloading#033[00m
Jan 31 03:25:36 np0005603623 nova_compute[226235]: 2026-01-31 08:25:36.067 226239 INFO nova.virt.libvirt.driver [-] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Instance destroyed successfully.#033[00m
Jan 31 03:25:36 np0005603623 nova_compute[226235]: 2026-01-31 08:25:36.067 226239 DEBUG nova.compute.manager [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:36 np0005603623 nova_compute[226235]: 2026-01-31 08:25:36.069 226239 DEBUG oslo_concurrency.lockutils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:25:36 np0005603623 nova_compute[226235]: 2026-01-31 08:25:36.069 226239 DEBUG oslo_concurrency.lockutils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquired lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:25:36 np0005603623 nova_compute[226235]: 2026-01-31 08:25:36.070 226239 DEBUG nova.network.neutron [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:25:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:25:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:36.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:25:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:36.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:38 np0005603623 nova_compute[226235]: 2026-01-31 08:25:38.317 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:38.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:38.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:39 np0005603623 nova_compute[226235]: 2026-01-31 08:25:39.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:39 np0005603623 nova_compute[226235]: 2026-01-31 08:25:39.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:25:39 np0005603623 nova_compute[226235]: 2026-01-31 08:25:39.354 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:39 np0005603623 nova_compute[226235]: 2026-01-31 08:25:39.688 226239 DEBUG nova.network.neutron [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating instance_info_cache with network_info: [{"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:25:39 np0005603623 nova_compute[226235]: 2026-01-31 08:25:39.744 226239 DEBUG oslo_concurrency.lockutils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Releasing lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:25:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:40.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:25:40Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f9:0c:a9 10.100.0.5
Jan 31 03:25:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:25:40Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f9:0c:a9 10.100.0.5
Jan 31 03:25:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:40.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:42 np0005603623 nova_compute[226235]: 2026-01-31 08:25:42.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:42 np0005603623 nova_compute[226235]: 2026-01-31 08:25:42.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:25:42 np0005603623 nova_compute[226235]: 2026-01-31 08:25:42.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:25:42 np0005603623 nova_compute[226235]: 2026-01-31 08:25:42.183 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:25:42 np0005603623 nova_compute[226235]: 2026-01-31 08:25:42.184 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:25:42 np0005603623 nova_compute[226235]: 2026-01-31 08:25:42.184 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:25:42 np0005603623 nova_compute[226235]: 2026-01-31 08:25:42.184 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:42.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:42.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.320 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.640 226239 INFO nova.virt.libvirt.driver [-] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Instance destroyed successfully.#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.640 226239 DEBUG nova.objects.instance [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'resources' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.666 226239 DEBUG nova.virt.libvirt.vif [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:23:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-657799937',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-657799937',id=119,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGuPmLyvW8nH0zjMVVycFJqThAJ40QxmOiqbjQtqD9yxLSlZxgvi2M6cEnwp9NZOC4D7auSCKwZopexRDoMXTIOi6B9+vGiJqB0/tIgguNHeMJkz6XGWm9K9JFV8LaAZqQ==',key_name='tempest-keypair-1582363486',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:24:08Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='491937de020742d7b4e847dc3bf57950',ramdisk_id='',reservation_id='r-mev9f22v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-60119558',owner_user_name='tempest-AttachVolumeShelveTestJSON-60119558-project-member',shelved_at='2026-01-31T08:25:35.963133',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='81322d27-2584-47ad-bfcd-7642da2c770e'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:25:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='432ac8867d8240408db455fc25bb5901',uuid=f3b36b5b-968c-4775-ac4f-93efc36f40ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": 
"6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.667 226239 DEBUG nova.network.os_vif_util [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converting VIF {"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.668 226239 DEBUG nova.network.os_vif_util [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.668 226239 DEBUG os_vif [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.669 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.670 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19a4f194-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.671 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.673 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.675 226239 INFO os_vif [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55')#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.943 226239 DEBUG nova.compute.manager [req-da0b7cce-bb77-4ba4-965e-d7961981fa3a req-6105363a-a30c-4e7e-8305-0dfbfd02149b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-changed-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.944 226239 DEBUG nova.compute.manager [req-da0b7cce-bb77-4ba4-965e-d7961981fa3a req-6105363a-a30c-4e7e-8305-0dfbfd02149b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Refreshing instance network info cache due to event network-changed-19a4f194-5514-4b8e-b635-e6fe0255dc1a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:25:43 np0005603623 nova_compute[226235]: 2026-01-31 08:25:43.944 226239 DEBUG oslo_concurrency.lockutils [req-da0b7cce-bb77-4ba4-965e-d7961981fa3a req-6105363a-a30c-4e7e-8305-0dfbfd02149b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.261 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating instance_info_cache with network_info: [{"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.279 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.280 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.280 226239 DEBUG oslo_concurrency.lockutils [req-da0b7cce-bb77-4ba4-965e-d7961981fa3a req-6105363a-a30c-4e7e-8305-0dfbfd02149b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.280 226239 DEBUG nova.network.neutron [req-da0b7cce-bb77-4ba4-965e-d7961981fa3a req-6105363a-a30c-4e7e-8305-0dfbfd02149b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Refreshing network info cache for port 19a4f194-5514-4b8e-b635-e6fe0255dc1a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.281 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.282 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.282 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.311 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.311 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.311 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.312 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.312 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.355 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:44.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:44.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:25:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/791549478' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.788 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.861 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000078 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.862 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000078 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.864 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000077 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.864 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000077 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.995 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.996 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4255MB free_disk=20.785682678222656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.998 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:44 np0005603623 nova_compute[226235]: 2026-01-31 08:25:44.998 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.211 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance f3b36b5b-968c-4775-ac4f-93efc36f40ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.211 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 08dc30a4-60ae-4f4e-8b5e-e12610df2120 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.211 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.212 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.244 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.358 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.359 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.373 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.408 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.476 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.864 226239 DEBUG nova.network.neutron [req-da0b7cce-bb77-4ba4-965e-d7961981fa3a req-6105363a-a30c-4e7e-8305-0dfbfd02149b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updated VIF entry in instance network info cache for port 19a4f194-5514-4b8e-b635-e6fe0255dc1a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.865 226239 DEBUG nova.network.neutron [req-da0b7cce-bb77-4ba4-965e-d7961981fa3a req-6105363a-a30c-4e7e-8305-0dfbfd02149b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating instance_info_cache with network_info: [{"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": null, "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap19a4f194-55", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:25:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:25:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1044609821' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.961 226239 DEBUG oslo_concurrency.lockutils [req-da0b7cce-bb77-4ba4-965e-d7961981fa3a req-6105363a-a30c-4e7e-8305-0dfbfd02149b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.970 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:45 np0005603623 nova_compute[226235]: 2026-01-31 08:25:45.975 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:25:46 np0005603623 nova_compute[226235]: 2026-01-31 08:25:46.075 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:25:46 np0005603623 nova_compute[226235]: 2026-01-31 08:25:46.158 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:25:46 np0005603623 nova_compute[226235]: 2026-01-31 08:25:46.159 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:25:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:46.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:25:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:46.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:25:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:48.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:25:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:48 np0005603623 nova_compute[226235]: 2026-01-31 08:25:48.672 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:48.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:48 np0005603623 podman[280372]: 2026-01-31 08:25:48.983234456 +0000 UTC m=+0.068236190 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:25:48 np0005603623 podman[280371]: 2026-01-31 08:25:48.988271704 +0000 UTC m=+0.073763254 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 03:25:49 np0005603623 nova_compute[226235]: 2026-01-31 08:25:49.032 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:49 np0005603623 nova_compute[226235]: 2026-01-31 08:25:49.032 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:49 np0005603623 nova_compute[226235]: 2026-01-31 08:25:49.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:49 np0005603623 nova_compute[226235]: 2026-01-31 08:25:49.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:25:49 np0005603623 nova_compute[226235]: 2026-01-31 08:25:49.358 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:50 np0005603623 nova_compute[226235]: 2026-01-31 08:25:50.086 226239 INFO nova.virt.libvirt.driver [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Deleting instance files /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac_del#033[00m
Jan 31 03:25:50 np0005603623 nova_compute[226235]: 2026-01-31 08:25:50.087 226239 INFO nova.virt.libvirt.driver [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Deletion of /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac_del complete#033[00m
Jan 31 03:25:50 np0005603623 nova_compute[226235]: 2026-01-31 08:25:50.228 226239 INFO nova.scheduler.client.report [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Deleted allocations for instance f3b36b5b-968c-4775-ac4f-93efc36f40ac#033[00m
Jan 31 03:25:50 np0005603623 nova_compute[226235]: 2026-01-31 08:25:50.287 226239 DEBUG oslo_concurrency.lockutils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:50 np0005603623 nova_compute[226235]: 2026-01-31 08:25:50.287 226239 DEBUG oslo_concurrency.lockutils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:50 np0005603623 nova_compute[226235]: 2026-01-31 08:25:50.337 226239 DEBUG oslo_concurrency.processutils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:50.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:25:50 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3100315677' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:25:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:50.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:50 np0005603623 nova_compute[226235]: 2026-01-31 08:25:50.763 226239 DEBUG oslo_concurrency.processutils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:50 np0005603623 nova_compute[226235]: 2026-01-31 08:25:50.767 226239 DEBUG nova.compute.provider_tree [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:25:50 np0005603623 nova_compute[226235]: 2026-01-31 08:25:50.793 226239 DEBUG nova.scheduler.client.report [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:25:50 np0005603623 nova_compute[226235]: 2026-01-31 08:25:50.820 226239 DEBUG oslo_concurrency.lockutils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:50 np0005603623 nova_compute[226235]: 2026-01-31 08:25:50.870 226239 DEBUG oslo_concurrency.lockutils [None req-ff20c239-44cd-482d-a3a1-46d0f03641bc 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 55.997s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:51 np0005603623 nova_compute[226235]: 2026-01-31 08:25:51.168 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:52.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:52 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [P] New memtable created with log file: #52. Immutable memtables: 0.
Jan 31 03:25:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:25:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:52.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:25:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:53 np0005603623 nova_compute[226235]: 2026-01-31 08:25:53.676 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:54 np0005603623 nova_compute[226235]: 2026-01-31 08:25:54.361 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:54.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:54.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:56.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:56.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:57 np0005603623 nova_compute[226235]: 2026-01-31 08:25:57.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:58.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:58 np0005603623 nova_compute[226235]: 2026-01-31 08:25:58.716 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:25:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:58.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:59 np0005603623 nova_compute[226235]: 2026-01-31 08:25:59.362 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:00 np0005603623 nova_compute[226235]: 2026-01-31 08:26:00.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:00.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:26:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:00.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:26:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:02.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:02.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:03 np0005603623 nova_compute[226235]: 2026-01-31 08:26:03.719 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:04 np0005603623 nova_compute[226235]: 2026-01-31 08:26:04.389 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:04.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:04.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:05 np0005603623 nova_compute[226235]: 2026-01-31 08:26:05.033 226239 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:26:05 np0005603623 nova_compute[226235]: 2026-01-31 08:26:05.034 226239 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:26:05 np0005603623 nova_compute[226235]: 2026-01-31 08:26:05.034 226239 DEBUG nova.network.neutron [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:26:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:05.934 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:26:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:05.935 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:26:05 np0005603623 nova_compute[226235]: 2026-01-31 08:26:05.935 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:06.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:06.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:06 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:06.937 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:07 np0005603623 nova_compute[226235]: 2026-01-31 08:26:07.211 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:07 np0005603623 nova_compute[226235]: 2026-01-31 08:26:07.211 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:26:07 np0005603623 nova_compute[226235]: 2026-01-31 08:26:07.255 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:26:07 np0005603623 nova_compute[226235]: 2026-01-31 08:26:07.748 226239 DEBUG nova.network.neutron [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance_info_cache with network_info: [{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:07 np0005603623 nova_compute[226235]: 2026-01-31 08:26:07.824 226239 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:26:07 np0005603623 nova_compute[226235]: 2026-01-31 08:26:07.970 226239 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 31 03:26:07 np0005603623 nova_compute[226235]: 2026-01-31 08:26:07.971 226239 DEBUG nova.virt.libvirt.volume.remotefs [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Creating file /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/d40f55de154d4871ab7a893adec210f0.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 31 03:26:07 np0005603623 nova_compute[226235]: 2026-01-31 08:26:07.971 226239 DEBUG oslo_concurrency.processutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/d40f55de154d4871ab7a893adec210f0.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:08 np0005603623 nova_compute[226235]: 2026-01-31 08:26:08.414 226239 DEBUG oslo_concurrency.processutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/d40f55de154d4871ab7a893adec210f0.tmp" returned: 1 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:08 np0005603623 nova_compute[226235]: 2026-01-31 08:26:08.415 226239 DEBUG oslo_concurrency.processutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/d40f55de154d4871ab7a893adec210f0.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 03:26:08 np0005603623 nova_compute[226235]: 2026-01-31 08:26:08.416 226239 DEBUG nova.virt.libvirt.volume.remotefs [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Creating directory /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 31 03:26:08 np0005603623 nova_compute[226235]: 2026-01-31 08:26:08.416 226239 DEBUG oslo_concurrency.processutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:26:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:08.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:26:08 np0005603623 nova_compute[226235]: 2026-01-31 08:26:08.640 226239 DEBUG oslo_concurrency.processutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120" returned: 0 in 0.224s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:08 np0005603623 nova_compute[226235]: 2026-01-31 08:26:08.645 226239 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:26:08 np0005603623 nova_compute[226235]: 2026-01-31 08:26:08.722 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:08.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:09 np0005603623 nova_compute[226235]: 2026-01-31 08:26:09.390 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:10.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:10.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:12.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:12.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:13 np0005603623 nova_compute[226235]: 2026-01-31 08:26:13.726 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:14 np0005603623 nova_compute[226235]: 2026-01-31 08:26:14.393 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:14.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e278 e278: 3 total, 3 up, 3 in
Jan 31 03:26:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:14.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:26:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:16.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:26:16 np0005603623 nova_compute[226235]: 2026-01-31 08:26:16.681 226239 INFO nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance shutdown successfully after 8 seconds.#033[00m
Jan 31 03:26:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:16.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:17 np0005603623 kernel: tapd2ecd824-47 (unregistering): left promiscuous mode
Jan 31 03:26:17 np0005603623 NetworkManager[48970]: <info>  [1769847977.1843] device (tapd2ecd824-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:26:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:26:17Z|00486|binding|INFO|Releasing lport d2ecd824-47f7-4503-a326-2007f56a02e7 from this chassis (sb_readonly=0)
Jan 31 03:26:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:26:17Z|00487|binding|INFO|Setting lport d2ecd824-47f7-4503-a326-2007f56a02e7 down in Southbound
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.186 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:26:17Z|00488|binding|INFO|Removing iface tapd2ecd824-47 ovn-installed in OVS
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.196 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:17.221 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:0c:a9 10.100.0.5'], port_security=['fa:16:3e:f9:0c:a9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '08dc30a4-60ae-4f4e-8b5e-e12610df2120', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=d2ecd824-47f7-4503-a326-2007f56a02e7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:26:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:17.222 143258 INFO neutron.agent.ovn.metadata.agent [-] Port d2ecd824-47f7-4503-a326-2007f56a02e7 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:26:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:17.223 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:26:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:17.225 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0a80eea8-62b1-412d-ab08-4e36094df0eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:17.225 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore#033[00m
Jan 31 03:26:17 np0005603623 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000078.scope: Deactivated successfully.
Jan 31 03:26:17 np0005603623 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000078.scope: Consumed 13.902s CPU time.
Jan 31 03:26:17 np0005603623 systemd-machined[194379]: Machine qemu-54-instance-00000078 terminated.
Jan 31 03:26:17 np0005603623 kernel: tapd2ecd824-47: entered promiscuous mode
Jan 31 03:26:17 np0005603623 kernel: tapd2ecd824-47 (unregistering): left promiscuous mode
Jan 31 03:26:17 np0005603623 NetworkManager[48970]: <info>  [1769847977.2952] manager: (tapd2ecd824-47): new Tun device (/org/freedesktop/NetworkManager/Devices/234)
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.296 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:26:17Z|00489|binding|INFO|Claiming lport d2ecd824-47f7-4503-a326-2007f56a02e7 for this chassis.
Jan 31 03:26:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:26:17Z|00490|binding|INFO|d2ecd824-47f7-4503-a326-2007f56a02e7: Claiming fa:16:3e:f9:0c:a9 10.100.0.5
Jan 31 03:26:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:17.303 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:0c:a9 10.100.0.5'], port_security=['fa:16:3e:f9:0c:a9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '08dc30a4-60ae-4f4e-8b5e-e12610df2120', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=d2ecd824-47f7-4503-a326-2007f56a02e7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:26:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:26:17Z|00491|binding|INFO|Releasing lport d2ecd824-47f7-4503-a326-2007f56a02e7 from this chassis (sb_readonly=0)
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.305 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:17.311 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:0c:a9 10.100.0.5'], port_security=['fa:16:3e:f9:0c:a9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '08dc30a4-60ae-4f4e-8b5e-e12610df2120', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=d2ecd824-47f7-4503-a326-2007f56a02e7) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.313 226239 INFO nova.virt.libvirt.driver [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance destroyed successfully.#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.314 226239 DEBUG nova.virt.libvirt.vif [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:24:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1941034472',display_name='tempest-ServerActionsTestJSON-server-1941034472',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1941034472',id=120,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:25:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-ra3xi6gb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:26:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=08dc30a4-60ae-4f4e-8b5e-e12610df2120,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:f9:0c:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.314 226239 DEBUG nova.network.os_vif_util [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:f9:0c:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.315 226239 DEBUG nova.network.os_vif_util [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.315 226239 DEBUG os_vif [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.317 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.317 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2ecd824-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.320 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.322 226239 INFO os_vif [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47')#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.326 226239 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] skipping disk for instance-00000078 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.326 226239 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] skipping disk for instance-00000078 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:26:17 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[280122]: [NOTICE]   (280126) : haproxy version is 2.8.14-c23fe91
Jan 31 03:26:17 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[280122]: [NOTICE]   (280126) : path to executable is /usr/sbin/haproxy
Jan 31 03:26:17 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[280122]: [WARNING]  (280126) : Exiting Master process...
Jan 31 03:26:17 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[280122]: [WARNING]  (280126) : Exiting Master process...
Jan 31 03:26:17 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[280122]: [ALERT]    (280126) : Current worker (280128) exited with code 143 (Terminated)
Jan 31 03:26:17 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[280122]: [WARNING]  (280126) : All workers exited. Exiting... (0)
Jan 31 03:26:17 np0005603623 systemd[1]: libpod-3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775.scope: Deactivated successfully.
Jan 31 03:26:17 np0005603623 podman[280523]: 2026-01-31 08:26:17.572044062 +0000 UTC m=+0.285183511 container died 3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.739 226239 DEBUG neutronclient.v2_0.client [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port d2ecd824-47f7-4503-a326-2007f56a02e7 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.851 226239 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.851 226239 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.852 226239 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.906 226239 DEBUG nova.compute.manager [req-87157d71-267d-4b7b-aa7d-6309ee1ef914 req-9b98a9c4-8304-44ba-b583-5f306c8138c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.906 226239 DEBUG oslo_concurrency.lockutils [req-87157d71-267d-4b7b-aa7d-6309ee1ef914 req-9b98a9c4-8304-44ba-b583-5f306c8138c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.906 226239 DEBUG oslo_concurrency.lockutils [req-87157d71-267d-4b7b-aa7d-6309ee1ef914 req-9b98a9c4-8304-44ba-b583-5f306c8138c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.907 226239 DEBUG oslo_concurrency.lockutils [req-87157d71-267d-4b7b-aa7d-6309ee1ef914 req-9b98a9c4-8304-44ba-b583-5f306c8138c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.907 226239 DEBUG nova.compute.manager [req-87157d71-267d-4b7b-aa7d-6309ee1ef914 req-9b98a9c4-8304-44ba-b583-5f306c8138c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:26:17 np0005603623 nova_compute[226235]: 2026-01-31 08:26:17.907 226239 WARNING nova.compute.manager [req-87157d71-267d-4b7b-aa7d-6309ee1ef914 req-9b98a9c4-8304-44ba-b583-5f306c8138c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:26:18 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775-userdata-shm.mount: Deactivated successfully.
Jan 31 03:26:18 np0005603623 systemd[1]: var-lib-containers-storage-overlay-b5f2ba640471292cadf74528f48ddd04fa54a9092df237e6ea583319a2e33134-merged.mount: Deactivated successfully.
Jan 31 03:26:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:18.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:18.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:18 np0005603623 podman[280523]: 2026-01-31 08:26:18.975185788 +0000 UTC m=+1.688325257 container cleanup 3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:26:18 np0005603623 systemd[1]: libpod-conmon-3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775.scope: Deactivated successfully.
Jan 31 03:26:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:19 np0005603623 nova_compute[226235]: 2026-01-31 08:26:19.394 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.069 226239 DEBUG nova.compute.manager [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.070 226239 DEBUG oslo_concurrency.lockutils [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.070 226239 DEBUG oslo_concurrency.lockutils [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.070 226239 DEBUG oslo_concurrency.lockutils [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.070 226239 DEBUG nova.compute.manager [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.070 226239 WARNING nova.compute.manager [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.070 226239 DEBUG nova.compute.manager [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-changed-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.071 226239 DEBUG nova.compute.manager [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Refreshing instance network info cache due to event network-changed-d2ecd824-47f7-4503-a326-2007f56a02e7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.071 226239 DEBUG oslo_concurrency.lockutils [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.071 226239 DEBUG oslo_concurrency.lockutils [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.071 226239 DEBUG nova.network.neutron [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Refreshing network info cache for port d2ecd824-47f7-4503-a326-2007f56a02e7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:26:20 np0005603623 podman[280568]: 2026-01-31 08:26:20.248118413 +0000 UTC m=+1.258086121 container remove 3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.253 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb65d10-4c0b-4ecc-9366-3c449c41b470]: (4, ('Sat Jan 31 08:26:17 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775)\n3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775\nSat Jan 31 08:26:18 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775)\n3671472881f5c89fb195cb72278ce1e399183cb16f3f776365380e0373695775\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.254 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b4cd04a8-c5a9-4984-9f43-427a87e68bb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.255 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.290 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:20 np0005603623 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:26:20 np0005603623 nova_compute[226235]: 2026-01-31 08:26:20.295 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.297 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8a661057-bf1a-494d-8c56-aff251f8e227]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.310 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[35d37a09-029c-4928-b927-7dc81cb813d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.311 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef80e94-cd6f-4e69-921f-4d7303b77ec6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.322 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0baf216e-3007-4b95-b123-a0037b4c5d00]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705954, 'reachable_time': 32748, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280620, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:20 np0005603623 podman[280582]: 2026-01-31 08:26:20.325472738 +0000 UTC m=+0.421053620 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:26:20 np0005603623 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.326 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.326 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[235cd1f3-b65a-4b81-96cb-04e4bff64b61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.327 143258 INFO neutron.agent.ovn.metadata.agent [-] Port d2ecd824-47f7-4503-a326-2007f56a02e7 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.328 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.329 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c93e1b28-d56b-489a-81aa-ebdc4d027630]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.330 143258 INFO neutron.agent.ovn.metadata.agent [-] Port d2ecd824-47f7-4503-a326-2007f56a02e7 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:26:20 np0005603623 podman[280581]: 2026-01-31 08:26:20.331868368 +0000 UTC m=+0.427915225 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.331 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:26:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:20.332 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8a3c00f5-edcc-4a51-ba89-77aa910d90cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:20.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:20.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e279 e279: 3 total, 3 up, 3 in
Jan 31 03:26:22 np0005603623 nova_compute[226235]: 2026-01-31 08:26:22.144 226239 DEBUG nova.network.neutron [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updated VIF entry in instance network info cache for port d2ecd824-47f7-4503-a326-2007f56a02e7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:26:22 np0005603623 nova_compute[226235]: 2026-01-31 08:26:22.145 226239 DEBUG nova.network.neutron [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance_info_cache with network_info: [{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:22 np0005603623 nova_compute[226235]: 2026-01-31 08:26:22.162 226239 DEBUG oslo_concurrency.lockutils [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:26:22 np0005603623 nova_compute[226235]: 2026-01-31 08:26:22.319 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:26:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:22.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:26:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:22.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e280 e280: 3 total, 3 up, 3 in
Jan 31 03:26:24 np0005603623 nova_compute[226235]: 2026-01-31 08:26:24.396 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:24.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:26:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:24.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:26:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:26.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e281 e281: 3 total, 3 up, 3 in
Jan 31 03:26:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:26.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:27 np0005603623 nova_compute[226235]: 2026-01-31 08:26:27.320 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:28.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:26:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:28.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:26:29 np0005603623 nova_compute[226235]: 2026-01-31 08:26:29.398 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:30.118 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:30.119 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:30.119 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:30.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:30.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:32 np0005603623 nova_compute[226235]: 2026-01-31 08:26:32.313 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847977.3117373, 08dc30a4-60ae-4f4e-8b5e-e12610df2120 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:26:32 np0005603623 nova_compute[226235]: 2026-01-31 08:26:32.313 226239 INFO nova.compute.manager [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:26:32 np0005603623 nova_compute[226235]: 2026-01-31 08:26:32.322 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:32 np0005603623 nova_compute[226235]: 2026-01-31 08:26:32.339 226239 DEBUG nova.compute.manager [None req-7a649017-f0ba-4d50-b347-a07d07e0e749 - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:26:32 np0005603623 nova_compute[226235]: 2026-01-31 08:26:32.342 226239 DEBUG nova.compute.manager [None req-7a649017-f0ba-4d50-b347-a07d07e0e749 - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:26:32 np0005603623 nova_compute[226235]: 2026-01-31 08:26:32.382 226239 INFO nova.compute.manager [None req-7a649017-f0ba-4d50-b347-a07d07e0e749 - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 31 03:26:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:32.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:26:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:32.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:26:34 np0005603623 nova_compute[226235]: 2026-01-31 08:26:34.400 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:34.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:34.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:36 np0005603623 nova_compute[226235]: 2026-01-31 08:26:36.340 226239 DEBUG nova.compute.manager [req-b38176f4-b40f-405d-a6fe-06db7920afff req-0cfc6673-be2b-42be-b9d3-1e33803659b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:36 np0005603623 nova_compute[226235]: 2026-01-31 08:26:36.340 226239 DEBUG oslo_concurrency.lockutils [req-b38176f4-b40f-405d-a6fe-06db7920afff req-0cfc6673-be2b-42be-b9d3-1e33803659b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:36 np0005603623 nova_compute[226235]: 2026-01-31 08:26:36.340 226239 DEBUG oslo_concurrency.lockutils [req-b38176f4-b40f-405d-a6fe-06db7920afff req-0cfc6673-be2b-42be-b9d3-1e33803659b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:36 np0005603623 nova_compute[226235]: 2026-01-31 08:26:36.341 226239 DEBUG oslo_concurrency.lockutils [req-b38176f4-b40f-405d-a6fe-06db7920afff req-0cfc6673-be2b-42be-b9d3-1e33803659b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:36 np0005603623 nova_compute[226235]: 2026-01-31 08:26:36.341 226239 DEBUG nova.compute.manager [req-b38176f4-b40f-405d-a6fe-06db7920afff req-0cfc6673-be2b-42be-b9d3-1e33803659b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:26:36 np0005603623 nova_compute[226235]: 2026-01-31 08:26:36.341 226239 WARNING nova.compute.manager [req-b38176f4-b40f-405d-a6fe-06db7920afff req-0cfc6673-be2b-42be-b9d3-1e33803659b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 31 03:26:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:36.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:36.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e282 e282: 3 total, 3 up, 3 in
Jan 31 03:26:37 np0005603623 nova_compute[226235]: 2026-01-31 08:26:37.323 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:38 np0005603623 nova_compute[226235]: 2026-01-31 08:26:38.508 226239 DEBUG nova.compute.manager [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:38 np0005603623 nova_compute[226235]: 2026-01-31 08:26:38.509 226239 DEBUG oslo_concurrency.lockutils [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:38 np0005603623 nova_compute[226235]: 2026-01-31 08:26:38.509 226239 DEBUG oslo_concurrency.lockutils [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:38 np0005603623 nova_compute[226235]: 2026-01-31 08:26:38.509 226239 DEBUG oslo_concurrency.lockutils [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:38 np0005603623 nova_compute[226235]: 2026-01-31 08:26:38.509 226239 DEBUG nova.compute.manager [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:26:38 np0005603623 nova_compute[226235]: 2026-01-31 08:26:38.509 226239 WARNING nova.compute.manager [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:26:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:38.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:26:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:38.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:26:39 np0005603623 nova_compute[226235]: 2026-01-31 08:26:39.199 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:39 np0005603623 nova_compute[226235]: 2026-01-31 08:26:39.200 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:26:39 np0005603623 nova_compute[226235]: 2026-01-31 08:26:39.402 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e283 e283: 3 total, 3 up, 3 in
Jan 31 03:26:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:40.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:40.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:42 np0005603623 nova_compute[226235]: 2026-01-31 08:26:42.365 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:42.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:42.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.229 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.230 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.231 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.271 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.272 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.273 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.273 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.274 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:26:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1073206551' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.747 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.876 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000078 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.876 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000078 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.994 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.995 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4456MB free_disk=20.760211944580078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.995 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:43 np0005603623 nova_compute[226235]: 2026-01-31 08:26:43.995 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.132 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration for instance 08dc30a4-60ae-4f4e-8b5e-e12610df2120 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.228 226239 INFO nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating resource usage from migration 0ecfaaa6-30a5-421d-affe-cfddf44ff0c2#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.229 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Starting to track outgoing migration 0ecfaaa6-30a5-421d-affe-cfddf44ff0c2 with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.289 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration 0ecfaaa6-30a5-421d-affe-cfddf44ff0c2 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.289 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.290 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.404 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.408 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.431 226239 DEBUG nova.compute.manager [req-f2bc58e9-e42b-4fff-badf-69d2ca1fb38b req-9fb23e5a-400c-429f-b758-3d2e33b65386 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.432 226239 DEBUG oslo_concurrency.lockutils [req-f2bc58e9-e42b-4fff-badf-69d2ca1fb38b req-9fb23e5a-400c-429f-b758-3d2e33b65386 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.432 226239 DEBUG oslo_concurrency.lockutils [req-f2bc58e9-e42b-4fff-badf-69d2ca1fb38b req-9fb23e5a-400c-429f-b758-3d2e33b65386 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.433 226239 DEBUG oslo_concurrency.lockutils [req-f2bc58e9-e42b-4fff-badf-69d2ca1fb38b req-9fb23e5a-400c-429f-b758-3d2e33b65386 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.433 226239 DEBUG nova.compute.manager [req-f2bc58e9-e42b-4fff-badf-69d2ca1fb38b req-9fb23e5a-400c-429f-b758-3d2e33b65386 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.433 226239 WARNING nova.compute.manager [req-f2bc58e9-e42b-4fff-badf-69d2ca1fb38b req-9fb23e5a-400c-429f-b758-3d2e33b65386 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:26:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:26:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:44.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:26:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:44.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:26:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2287118839' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.887 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.891 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:26:44 np0005603623 nova_compute[226235]: 2026-01-31 08:26:44.914 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:26:45 np0005603623 nova_compute[226235]: 2026-01-31 08:26:45.171 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:26:45 np0005603623 nova_compute[226235]: 2026-01-31 08:26:45.171 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:45 np0005603623 nova_compute[226235]: 2026-01-31 08:26:45.515 226239 INFO nova.compute.manager [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Swapping old allocation on dict_keys(['492dc482-9d1e-49ca-87f3-0104a8508b72']) held by migration 0ecfaaa6-30a5-421d-affe-cfddf44ff0c2 for instance#033[00m
Jan 31 03:26:45 np0005603623 nova_compute[226235]: 2026-01-31 08:26:45.554 226239 DEBUG nova.scheduler.client.report [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Overwriting current allocation {'allocations': {'f7fd90d1-7583-42ff-b709-f5fc55f6e273': {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}, 'generation': 71}}, 'project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'user_id': '1d03198d8ab846bda092e089b2d5a6c7', 'consumer_generation': 1} on consumer 08dc30a4-60ae-4f4e-8b5e-e12610df2120 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 31 03:26:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:26:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:26:46 np0005603623 nova_compute[226235]: 2026-01-31 08:26:46.094 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:46 np0005603623 nova_compute[226235]: 2026-01-31 08:26:46.094 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:46.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:46.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:46 np0005603623 nova_compute[226235]: 2026-01-31 08:26:46.941 226239 INFO nova.network.neutron [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating port d2ecd824-47f7-4503-a326-2007f56a02e7 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 31 03:26:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:46.987 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:26:46 np0005603623 nova_compute[226235]: 2026-01-31 08:26:46.988 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:46.988 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:26:47 np0005603623 nova_compute[226235]: 2026-01-31 08:26:47.159 226239 DEBUG nova.compute.manager [req-184301e4-8407-45fc-bd5e-2f149c9a5d06 req-6ffb6de0-b7be-4bf9-a111-64b62b3326cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:47 np0005603623 nova_compute[226235]: 2026-01-31 08:26:47.160 226239 DEBUG oslo_concurrency.lockutils [req-184301e4-8407-45fc-bd5e-2f149c9a5d06 req-6ffb6de0-b7be-4bf9-a111-64b62b3326cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:47 np0005603623 nova_compute[226235]: 2026-01-31 08:26:47.160 226239 DEBUG oslo_concurrency.lockutils [req-184301e4-8407-45fc-bd5e-2f149c9a5d06 req-6ffb6de0-b7be-4bf9-a111-64b62b3326cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:47 np0005603623 nova_compute[226235]: 2026-01-31 08:26:47.160 226239 DEBUG oslo_concurrency.lockutils [req-184301e4-8407-45fc-bd5e-2f149c9a5d06 req-6ffb6de0-b7be-4bf9-a111-64b62b3326cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:47 np0005603623 nova_compute[226235]: 2026-01-31 08:26:47.160 226239 DEBUG nova.compute.manager [req-184301e4-8407-45fc-bd5e-2f149c9a5d06 req-6ffb6de0-b7be-4bf9-a111-64b62b3326cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:26:47 np0005603623 nova_compute[226235]: 2026-01-31 08:26:47.160 226239 WARNING nova.compute.manager [req-184301e4-8407-45fc-bd5e-2f149c9a5d06 req-6ffb6de0-b7be-4bf9-a111-64b62b3326cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:26:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:26:47Z|00492|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:26:47 np0005603623 nova_compute[226235]: 2026-01-31 08:26:47.367 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:48 np0005603623 nova_compute[226235]: 2026-01-31 08:26:48.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e284 e284: 3 total, 3 up, 3 in
Jan 31 03:26:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:48.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:48.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:48 np0005603623 nova_compute[226235]: 2026-01-31 08:26:48.926 226239 DEBUG oslo_concurrency.lockutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:26:48 np0005603623 nova_compute[226235]: 2026-01-31 08:26:48.926 226239 DEBUG oslo_concurrency.lockutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:26:48 np0005603623 nova_compute[226235]: 2026-01-31 08:26:48.926 226239 DEBUG nova.network.neutron [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:26:49 np0005603623 nova_compute[226235]: 2026-01-31 08:26:49.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:49 np0005603623 nova_compute[226235]: 2026-01-31 08:26:49.296 226239 DEBUG nova.compute.manager [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-changed-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:49 np0005603623 nova_compute[226235]: 2026-01-31 08:26:49.297 226239 DEBUG nova.compute.manager [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Refreshing instance network info cache due to event network-changed-d2ecd824-47f7-4503-a326-2007f56a02e7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:26:49 np0005603623 nova_compute[226235]: 2026-01-31 08:26:49.297 226239 DEBUG oslo_concurrency.lockutils [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:26:49 np0005603623 nova_compute[226235]: 2026-01-31 08:26:49.406 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e285 e285: 3 total, 3 up, 3 in
Jan 31 03:26:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:26:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:50.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:26:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:50.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:50 np0005603623 podman[280922]: 2026-01-31 08:26:50.964045251 +0000 UTC m=+0.060311152 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:26:50 np0005603623 podman[280923]: 2026-01-31 08:26:50.971213806 +0000 UTC m=+0.066125585 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 03:26:51 np0005603623 nova_compute[226235]: 2026-01-31 08:26:51.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:52 np0005603623 nova_compute[226235]: 2026-01-31 08:26:52.369 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:52.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e286 e286: 3 total, 3 up, 3 in
Jan 31 03:26:52 np0005603623 nova_compute[226235]: 2026-01-31 08:26:52.769 226239 DEBUG nova.network.neutron [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance_info_cache with network_info: [{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:52 np0005603623 nova_compute[226235]: 2026-01-31 08:26:52.824 226239 DEBUG oslo_concurrency.lockutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:26:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 31 03:26:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:52.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 31 03:26:52 np0005603623 nova_compute[226235]: 2026-01-31 08:26:52.826 226239 DEBUG nova.virt.libvirt.driver [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 31 03:26:53 np0005603623 nova_compute[226235]: 2026-01-31 08:26:53.313 226239 DEBUG oslo_concurrency.lockutils [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:26:53 np0005603623 nova_compute[226235]: 2026-01-31 08:26:53.314 226239 DEBUG nova.network.neutron [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Refreshing network info cache for port d2ecd824-47f7-4503-a326-2007f56a02e7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:26:53 np0005603623 nova_compute[226235]: 2026-01-31 08:26:53.367 226239 DEBUG nova.storage.rbd_utils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rolling back rbd image(08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Jan 31 03:26:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:26:53.990 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:54 np0005603623 nova_compute[226235]: 2026-01-31 08:26:54.408 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:54.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:54.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:55 np0005603623 nova_compute[226235]: 2026-01-31 08:26:55.016 226239 DEBUG nova.network.neutron [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updated VIF entry in instance network info cache for port d2ecd824-47f7-4503-a326-2007f56a02e7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:26:55 np0005603623 nova_compute[226235]: 2026-01-31 08:26:55.017 226239 DEBUG nova.network.neutron [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance_info_cache with network_info: [{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:55 np0005603623 nova_compute[226235]: 2026-01-31 08:26:55.120 226239 DEBUG oslo_concurrency.lockutils [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:26:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:56.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:56.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:57 np0005603623 nova_compute[226235]: 2026-01-31 08:26:57.371 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:58 np0005603623 nova_compute[226235]: 2026-01-31 08:26:58.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:26:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:58.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:26:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:26:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:26:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:58.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:26:59 np0005603623 nova_compute[226235]: 2026-01-31 08:26:59.409 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e287 e287: 3 total, 3 up, 3 in
Jan 31 03:27:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:27:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:00.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:27:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:00.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:01 np0005603623 nova_compute[226235]: 2026-01-31 08:27:01.375 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:01 np0005603623 nova_compute[226235]: 2026-01-31 08:27:01.377 226239 DEBUG nova.storage.rbd_utils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] removing snapshot(nova-resize) on rbd image(08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:27:02 np0005603623 nova_compute[226235]: 2026-01-31 08:27:02.373 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:02.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e287 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:27:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:02.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:27:04 np0005603623 nova_compute[226235]: 2026-01-31 08:27:04.410 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e288 e288: 3 total, 3 up, 3 in
Jan 31 03:27:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:27:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:04.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:27:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:04.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:06.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:06.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.002 226239 DEBUG nova.virt.libvirt.driver [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Start _get_guest_xml network_info=[{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.005 226239 WARNING nova.virt.libvirt.driver [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.014 226239 DEBUG nova.virt.libvirt.host [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.015 226239 DEBUG nova.virt.libvirt.host [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.017 226239 DEBUG nova.virt.libvirt.host [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.017 226239 DEBUG nova.virt.libvirt.host [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.018 226239 DEBUG nova.virt.libvirt.driver [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.018 226239 DEBUG nova.virt.hardware [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.019 226239 DEBUG nova.virt.hardware [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.019 226239 DEBUG nova.virt.hardware [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.019 226239 DEBUG nova.virt.hardware [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.019 226239 DEBUG nova.virt.hardware [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.020 226239 DEBUG nova.virt.hardware [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.020 226239 DEBUG nova.virt.hardware [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.020 226239 DEBUG nova.virt.hardware [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.020 226239 DEBUG nova.virt.hardware [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.020 226239 DEBUG nova.virt.hardware [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.020 226239 DEBUG nova.virt.hardware [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.020 226239 DEBUG nova.objects.instance [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'vcpu_model' on Instance uuid 08dc30a4-60ae-4f4e-8b5e-e12610df2120 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.081 226239 DEBUG oslo_concurrency.processutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.375 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:27:07 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2513143262' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.489 226239 DEBUG oslo_concurrency.processutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:07 np0005603623 nova_compute[226235]: 2026-01-31 08:27:07.530 226239 DEBUG oslo_concurrency.processutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e288 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:27:07 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1869480627' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.106 226239 DEBUG oslo_concurrency.processutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.577s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.108 226239 DEBUG nova.virt.libvirt.vif [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:24:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1941034472',display_name='tempest-ServerActionsTestJSON-server-1941034472',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1941034472',id=120,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:26:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-ra3xi6gb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:26:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=08dc30a4-60ae-4f4e-8b5e-e12610df2120,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.109 226239 DEBUG nova.network.os_vif_util [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.110 226239 DEBUG nova.network.os_vif_util [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.113 226239 DEBUG nova.virt.libvirt.driver [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  <uuid>08dc30a4-60ae-4f4e-8b5e-e12610df2120</uuid>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  <name>instance-00000078</name>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsTestJSON-server-1941034472</nova:name>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:27:07</nova:creationTime>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <nova:user uuid="1d03198d8ab846bda092e089b2d5a6c7">tempest-ServerActionsTestJSON-1873947453-project-member</nova:user>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <nova:project uuid="5b87da3b3f42494f96baeeeaf60b54df">tempest-ServerActionsTestJSON-1873947453</nova:project>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <nova:port uuid="d2ecd824-47f7-4503-a326-2007f56a02e7">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <entry name="serial">08dc30a4-60ae-4f4e-8b5e-e12610df2120</entry>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <entry name="uuid">08dc30a4-60ae-4f4e-8b5e-e12610df2120</entry>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk.config">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:f9:0c:a9"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <target dev="tapd2ecd824-47"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/console.log" append="off"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <input type="keyboard" bus="usb"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:27:08 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:27:08 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:27:08 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:27:08 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.114 226239 DEBUG nova.compute.manager [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Preparing to wait for external event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.114 226239 DEBUG oslo_concurrency.lockutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.114 226239 DEBUG oslo_concurrency.lockutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.114 226239 DEBUG oslo_concurrency.lockutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.115 226239 DEBUG nova.virt.libvirt.vif [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:24:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1941034472',display_name='tempest-ServerActionsTestJSON-server-1941034472',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1941034472',id=120,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:26:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-ra3xi6gb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:26:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=08dc30a4-60ae-4f4e-8b5e-e12610df2120,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.115 226239 DEBUG nova.network.os_vif_util [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.116 226239 DEBUG nova.network.os_vif_util [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.116 226239 DEBUG os_vif [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.117 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.117 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.118 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.120 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.121 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2ecd824-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.121 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd2ecd824-47, col_values=(('external_ids', {'iface-id': 'd2ecd824-47f7-4503-a326-2007f56a02e7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:0c:a9', 'vm-uuid': '08dc30a4-60ae-4f4e-8b5e-e12610df2120'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.122 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:08 np0005603623 NetworkManager[48970]: <info>  [1769848028.1237] manager: (tapd2ecd824-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.125 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.128 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.129 226239 INFO os_vif [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47')
Jan 31 03:27:08 np0005603623 kernel: tapd2ecd824-47: entered promiscuous mode
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.241 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:08 np0005603623 ovn_controller[133449]: 2026-01-31T08:27:08Z|00493|binding|INFO|Claiming lport d2ecd824-47f7-4503-a326-2007f56a02e7 for this chassis.
Jan 31 03:27:08 np0005603623 ovn_controller[133449]: 2026-01-31T08:27:08Z|00494|binding|INFO|d2ecd824-47f7-4503-a326-2007f56a02e7: Claiming fa:16:3e:f9:0c:a9 10.100.0.5
Jan 31 03:27:08 np0005603623 NetworkManager[48970]: <info>  [1769848028.2436] manager: (tapd2ecd824-47): new Tun device (/org/freedesktop/NetworkManager/Devices/236)
Jan 31 03:27:08 np0005603623 ovn_controller[133449]: 2026-01-31T08:27:08Z|00495|binding|INFO|Setting lport d2ecd824-47f7-4503-a326-2007f56a02e7 ovn-installed in OVS
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.248 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.256 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.258 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:08 np0005603623 systemd-udevd[281156]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:27:08 np0005603623 systemd-machined[194379]: New machine qemu-55-instance-00000078.
Jan 31 03:27:08 np0005603623 NetworkManager[48970]: <info>  [1769848028.2811] device (tapd2ecd824-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:27:08 np0005603623 NetworkManager[48970]: <info>  [1769848028.2820] device (tapd2ecd824-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:27:08 np0005603623 systemd[1]: Started Virtual Machine qemu-55-instance-00000078.
Jan 31 03:27:08 np0005603623 ovn_controller[133449]: 2026-01-31T08:27:08Z|00496|binding|INFO|Setting lport d2ecd824-47f7-4503-a326-2007f56a02e7 up in Southbound
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.293 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:0c:a9 10.100.0.5'], port_security=['fa:16:3e:f9:0c:a9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '08dc30a4-60ae-4f4e-8b5e-e12610df2120', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '10', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=d2ecd824-47f7-4503-a326-2007f56a02e7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.294 143258 INFO neutron.agent.ovn.metadata.agent [-] Port d2ecd824-47f7-4503-a326-2007f56a02e7 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 bound to our chassis
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.295 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.304 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[59407067-d469-4b6a-aadf-48f4b66e730c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.304 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.306 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.306 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3a065530-3533-4f2d-bec1-3ca4f9220d9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.307 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fda620fd-e9e6-4700-a334-b3231a919499]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.318 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[b06b1466-ca5b-40c7-82a0-31cc2e26a9cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.327 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9c047372-94fe-4478-aa81-ab1bbe03ecb9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:27:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:27:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.354 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[7ccff43e-1e30-4070-89a2-a1ba83d1e947]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:08 np0005603623 systemd-udevd[281158]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:27:08 np0005603623 NetworkManager[48970]: <info>  [1769848028.3612] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/237)
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.362 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e3e0cd-34b4-4129-8e1c-0ddc4aa16e6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.385 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[55770af6-7a07-4ed9-b36d-68687848a7fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.389 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[cab7eb72-b534-4db3-8fc4-4aded899957a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:08 np0005603623 NetworkManager[48970]: <info>  [1769848028.4054] device (tap1186b71b-00): carrier: link connected
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.408 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c17fcd73-34ba-4557-9628-2f6c6fd56ad3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.419 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ec79d934-3a73-4dac-8988-939dc27befae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 146], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 716883, 'reachable_time': 41199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281238, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.429 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b47fa3f4-b685-4454-8c5c-ef578e1c4764]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 716883, 'tstamp': 716883}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281241, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.440 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ae60ec9b-e506-4f88-bb17-41fc7346a56f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 146], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 716883, 'reachable_time': 41199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 281242, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.463 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4d670b19-83ae-452b-b202-9c85469f7d87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.499 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a193c042-8b3f-498b-a261-2ab4a0bf62e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.500 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.500 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.501 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.502 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:08 np0005603623 NetworkManager[48970]: <info>  [1769848028.5032] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/238)
Jan 31 03:27:08 np0005603623 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.504 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.509 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:08 np0005603623 ovn_controller[133449]: 2026-01-31T08:27:08Z|00497|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=1)
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.510 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:08 np0005603623 nova_compute[226235]: 2026-01-31 08:27:08.515 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.517 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.518 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fe95c7ab-d0e1-4155-b08a-3baacc656d7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.519 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:27:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:08.519 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:27:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:08.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:08.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:08 np0005603623 podman[281292]: 2026-01-31 08:27:08.799965719 +0000 UTC m=+0.018646005 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:27:09 np0005603623 podman[281292]: 2026-01-31 08:27:09.207099272 +0000 UTC m=+0.425779538 container create ca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 03:27:09 np0005603623 systemd[1]: Started libpod-conmon-ca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d.scope.
Jan 31 03:27:09 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:27:09 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36cde280d8e721fdc392f9f3625181573bb8039da7c1b3b1862e4cc0df1a7c7f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.411 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:09 np0005603623 podman[281292]: 2026-01-31 08:27:09.475772825 +0000 UTC m=+0.694453091 container init ca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Jan 31 03:27:09 np0005603623 podman[281292]: 2026-01-31 08:27:09.480763092 +0000 UTC m=+0.699443348 container start ca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Jan 31 03:27:09 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[281325]: [NOTICE]   (281335) : New worker (281337) forked
Jan 31 03:27:09 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[281325]: [NOTICE]   (281335) : Loading success.
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.498 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848029.4978962, 08dc30a4-60ae-4f4e-8b5e-e12610df2120 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.499 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] VM Started (Lifecycle Event)#033[00m
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.536 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.539 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848029.4981196, 08dc30a4-60ae-4f4e-8b5e-e12610df2120 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.540 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.573 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.577 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.603 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.866 226239 DEBUG nova.compute.manager [req-ab0c0d25-e783-41da-ae77-389e49f2b5a3 req-e849a7a0-e479-4ea5-a8fa-60784d0a660b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.867 226239 DEBUG oslo_concurrency.lockutils [req-ab0c0d25-e783-41da-ae77-389e49f2b5a3 req-e849a7a0-e479-4ea5-a8fa-60784d0a660b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.867 226239 DEBUG oslo_concurrency.lockutils [req-ab0c0d25-e783-41da-ae77-389e49f2b5a3 req-e849a7a0-e479-4ea5-a8fa-60784d0a660b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.867 226239 DEBUG oslo_concurrency.lockutils [req-ab0c0d25-e783-41da-ae77-389e49f2b5a3 req-e849a7a0-e479-4ea5-a8fa-60784d0a660b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.867 226239 DEBUG nova.compute.manager [req-ab0c0d25-e783-41da-ae77-389e49f2b5a3 req-e849a7a0-e479-4ea5-a8fa-60784d0a660b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Processing event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.868 226239 DEBUG nova.compute.manager [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.871 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848029.8715475, 08dc30a4-60ae-4f4e-8b5e-e12610df2120 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.872 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] VM Resumed (Lifecycle Event)
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.876 226239 INFO nova.virt.libvirt.driver [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance running successfully.
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.876 226239 DEBUG nova.virt.libvirt.driver [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.913 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.917 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.966 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 31 03:27:09 np0005603623 nova_compute[226235]: 2026-01-31 08:27:09.992 226239 INFO nova.compute.manager [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance to original state: 'active'
Jan 31 03:27:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:10.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:10.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e289 e289: 3 total, 3 up, 3 in
Jan 31 03:27:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:12.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:12.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:12 np0005603623 nova_compute[226235]: 2026-01-31 08:27:12.867 226239 DEBUG nova.compute.manager [req-e181d6c9-2c8d-4e84-bd35-83f254b379a9 req-320bfb02-3b97-4562-bbf7-2a1916a14810 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:27:12 np0005603623 nova_compute[226235]: 2026-01-31 08:27:12.867 226239 DEBUG oslo_concurrency.lockutils [req-e181d6c9-2c8d-4e84-bd35-83f254b379a9 req-320bfb02-3b97-4562-bbf7-2a1916a14810 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:27:12 np0005603623 nova_compute[226235]: 2026-01-31 08:27:12.867 226239 DEBUG oslo_concurrency.lockutils [req-e181d6c9-2c8d-4e84-bd35-83f254b379a9 req-320bfb02-3b97-4562-bbf7-2a1916a14810 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:27:12 np0005603623 nova_compute[226235]: 2026-01-31 08:27:12.867 226239 DEBUG oslo_concurrency.lockutils [req-e181d6c9-2c8d-4e84-bd35-83f254b379a9 req-320bfb02-3b97-4562-bbf7-2a1916a14810 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:27:12 np0005603623 nova_compute[226235]: 2026-01-31 08:27:12.868 226239 DEBUG nova.compute.manager [req-e181d6c9-2c8d-4e84-bd35-83f254b379a9 req-320bfb02-3b97-4562-bbf7-2a1916a14810 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:27:12 np0005603623 nova_compute[226235]: 2026-01-31 08:27:12.868 226239 WARNING nova.compute.manager [req-e181d6c9-2c8d-4e84-bd35-83f254b379a9 req-320bfb02-3b97-4562-bbf7-2a1916a14810 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state active and task_state None.
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.076 226239 DEBUG oslo_concurrency.lockutils [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.076 226239 DEBUG oslo_concurrency.lockutils [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.077 226239 DEBUG oslo_concurrency.lockutils [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.077 226239 DEBUG oslo_concurrency.lockutils [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.077 226239 DEBUG oslo_concurrency.lockutils [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.078 226239 INFO nova.compute.manager [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Terminating instance
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.079 226239 DEBUG nova.compute.manager [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.124 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:13 np0005603623 kernel: tapd2ecd824-47 (unregistering): left promiscuous mode
Jan 31 03:27:13 np0005603623 NetworkManager[48970]: <info>  [1769848033.7786] device (tapd2ecd824-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:27:13 np0005603623 ovn_controller[133449]: 2026-01-31T08:27:13Z|00498|binding|INFO|Releasing lport d2ecd824-47f7-4503-a326-2007f56a02e7 from this chassis (sb_readonly=0)
Jan 31 03:27:13 np0005603623 ovn_controller[133449]: 2026-01-31T08:27:13Z|00499|binding|INFO|Setting lport d2ecd824-47f7-4503-a326-2007f56a02e7 down in Southbound
Jan 31 03:27:13 np0005603623 ovn_controller[133449]: 2026-01-31T08:27:13Z|00500|binding|INFO|Removing iface tapd2ecd824-47 ovn-installed in OVS
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.784 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.787 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.795 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:13 np0005603623 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000078.scope: Deactivated successfully.
Jan 31 03:27:13 np0005603623 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000078.scope: Consumed 3.878s CPU time.
Jan 31 03:27:13 np0005603623 systemd-machined[194379]: Machine qemu-55-instance-00000078 terminated.
Jan 31 03:27:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:13.858 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:0c:a9 10.100.0.5'], port_security=['fa:16:3e:f9:0c:a9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '08dc30a4-60ae-4f4e-8b5e-e12610df2120', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '12', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=d2ecd824-47f7-4503-a326-2007f56a02e7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:27:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:13.859 143258 INFO neutron.agent.ovn.metadata.agent [-] Port d2ecd824-47f7-4503-a326-2007f56a02e7 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis
Jan 31 03:27:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:13.861 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 31 03:27:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:13.861 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c5434d1c-8424-423f-9a1f-e5c95112ee2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:27:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:13.862 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore
Jan 31 03:27:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e290 e290: 3 total, 3 up, 3 in
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.906 226239 INFO nova.virt.libvirt.driver [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance destroyed successfully.
Jan 31 03:27:13 np0005603623 nova_compute[226235]: 2026-01-31 08:27:13.907 226239 DEBUG nova.objects.instance [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'resources' on Instance uuid 08dc30a4-60ae-4f4e-8b5e-e12610df2120 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:27:14 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[281325]: [NOTICE]   (281335) : haproxy version is 2.8.14-c23fe91
Jan 31 03:27:14 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[281325]: [NOTICE]   (281335) : path to executable is /usr/sbin/haproxy
Jan 31 03:27:14 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[281325]: [WARNING]  (281335) : Exiting Master process...
Jan 31 03:27:14 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[281325]: [WARNING]  (281335) : Exiting Master process...
Jan 31 03:27:14 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[281325]: [ALERT]    (281335) : Current worker (281337) exited with code 143 (Terminated)
Jan 31 03:27:14 np0005603623 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[281325]: [WARNING]  (281335) : All workers exited. Exiting... (0)
Jan 31 03:27:14 np0005603623 systemd[1]: libpod-ca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d.scope: Deactivated successfully.
Jan 31 03:27:14 np0005603623 podman[281383]: 2026-01-31 08:27:14.061211412 +0000 UTC m=+0.131560835 container died ca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 03:27:14 np0005603623 nova_compute[226235]: 2026-01-31 08:27:14.096 226239 DEBUG nova.virt.libvirt.vif [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:24:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1941034472',display_name='tempest-ServerActionsTestJSON-server-1941034472',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1941034472',id=120,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:27:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-ra3xi6gb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:27:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=08dc30a4-60ae-4f4e-8b5e-e12610df2120,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:27:14 np0005603623 nova_compute[226235]: 2026-01-31 08:27:14.097 226239 DEBUG nova.network.os_vif_util [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:27:14 np0005603623 nova_compute[226235]: 2026-01-31 08:27:14.098 226239 DEBUG nova.network.os_vif_util [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:27:14 np0005603623 nova_compute[226235]: 2026-01-31 08:27:14.098 226239 DEBUG os_vif [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 03:27:14 np0005603623 nova_compute[226235]: 2026-01-31 08:27:14.101 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:14 np0005603623 nova_compute[226235]: 2026-01-31 08:27:14.102 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2ecd824-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:27:14 np0005603623 nova_compute[226235]: 2026-01-31 08:27:14.103 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:14 np0005603623 nova_compute[226235]: 2026-01-31 08:27:14.106 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:27:14 np0005603623 nova_compute[226235]: 2026-01-31 08:27:14.106 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:14 np0005603623 nova_compute[226235]: 2026-01-31 08:27:14.109 226239 INFO os_vif [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47')
Jan 31 03:27:14 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d-userdata-shm.mount: Deactivated successfully.
Jan 31 03:27:14 np0005603623 systemd[1]: var-lib-containers-storage-overlay-36cde280d8e721fdc392f9f3625181573bb8039da7c1b3b1862e4cc0df1a7c7f-merged.mount: Deactivated successfully.
Jan 31 03:27:14 np0005603623 nova_compute[226235]: 2026-01-31 08:27:14.412 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:27:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1914699856' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:27:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:27:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1914699856' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:27:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:27:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:14.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:27:14 np0005603623 podman[281383]: 2026-01-31 08:27:14.793653252 +0000 UTC m=+0.864002655 container cleanup ca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:27:14 np0005603623 systemd[1]: libpod-conmon-ca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d.scope: Deactivated successfully.
Jan 31 03:27:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:14.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:15 np0005603623 podman[281433]: 2026-01-31 08:27:15.738440101 +0000 UTC m=+0.928249490 container remove ca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:27:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:15.744 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a1187b48-253e-47a9-a9d6-9fb4bedfa446]: (4, ('Sat Jan 31 08:27:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (ca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d)\nca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d\nSat Jan 31 08:27:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (ca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d)\nca8830c4acd79a0407277d5afff6ca39110deef47e85b0ac85d772372b648b7d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:27:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:15.745 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd1ff68-168b-4ddf-b197-903febf61948]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:27:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:15.746 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:27:15 np0005603623 nova_compute[226235]: 2026-01-31 08:27:15.748 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:15 np0005603623 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:27:15 np0005603623 nova_compute[226235]: 2026-01-31 08:27:15.750 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:15.753 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4d2d33-7249-4784-a3d0-15d8bf9ea276]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:27:15 np0005603623 nova_compute[226235]: 2026-01-31 08:27:15.755 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:15.769 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a6c49529-ed43-4e0a-b69e-95cc906136fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:27:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:15.770 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d068dd16-e88e-4fd2-8ee6-a55ab74a2aec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:27:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:15.784 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e77a33-f71b-4bf5-b611-1a213875755f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 716878, 'reachable_time': 17197, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281449, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:15 np0005603623 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:27:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:15.788 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:27:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:15.788 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[9ecb7d16-0834-4acf-ad09-c3341dacc005]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:16 np0005603623 nova_compute[226235]: 2026-01-31 08:27:16.278 226239 DEBUG nova.compute.manager [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:27:16 np0005603623 nova_compute[226235]: 2026-01-31 08:27:16.278 226239 DEBUG oslo_concurrency.lockutils [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:16 np0005603623 nova_compute[226235]: 2026-01-31 08:27:16.279 226239 DEBUG oslo_concurrency.lockutils [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:16 np0005603623 nova_compute[226235]: 2026-01-31 08:27:16.279 226239 DEBUG oslo_concurrency.lockutils [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:16 np0005603623 nova_compute[226235]: 2026-01-31 08:27:16.280 226239 DEBUG nova.compute.manager [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:27:16 np0005603623 nova_compute[226235]: 2026-01-31 08:27:16.280 226239 DEBUG nova.compute.manager [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:27:16 np0005603623 nova_compute[226235]: 2026-01-31 08:27:16.280 226239 DEBUG nova.compute.manager [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:27:16 np0005603623 nova_compute[226235]: 2026-01-31 08:27:16.280 226239 DEBUG oslo_concurrency.lockutils [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:16 np0005603623 nova_compute[226235]: 2026-01-31 08:27:16.281 226239 DEBUG oslo_concurrency.lockutils [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:16 np0005603623 nova_compute[226235]: 2026-01-31 08:27:16.281 226239 DEBUG oslo_concurrency.lockutils [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:16 np0005603623 nova_compute[226235]: 2026-01-31 08:27:16.281 226239 DEBUG nova.compute.manager [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:27:16 np0005603623 nova_compute[226235]: 2026-01-31 08:27:16.282 226239 WARNING nova.compute.manager [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:27:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:27:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:16.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:27:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:16.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:27:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:18.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:27:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e291 e291: 3 total, 3 up, 3 in
Jan 31 03:27:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:27:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:18.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:27:19 np0005603623 nova_compute[226235]: 2026-01-31 08:27:19.106 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:19 np0005603623 nova_compute[226235]: 2026-01-31 08:27:19.414 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:27:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:20.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:27:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:27:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:20.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:27:21 np0005603623 podman[281454]: 2026-01-31 08:27:21.955280329 +0000 UTC m=+0.047945154 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 31 03:27:21 np0005603623 podman[281455]: 2026-01-31 08:27:21.976147233 +0000 UTC m=+0.068943473 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
config_id=ovn_controller, managed_by=edpm_ansible)
Jan 31 03:27:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:22.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:22.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e292 e292: 3 total, 3 up, 3 in
Jan 31 03:27:24 np0005603623 nova_compute[226235]: 2026-01-31 08:27:24.109 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:24 np0005603623 nova_compute[226235]: 2026-01-31 08:27:24.416 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:24.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:27:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:24.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:27:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e293 e293: 3 total, 3 up, 3 in
Jan 31 03:27:25 np0005603623 nova_compute[226235]: 2026-01-31 08:27:25.690 226239 INFO nova.virt.libvirt.driver [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Deleting instance files /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120_del#033[00m
Jan 31 03:27:25 np0005603623 nova_compute[226235]: 2026-01-31 08:27:25.691 226239 INFO nova.virt.libvirt.driver [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Deletion of /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120_del complete#033[00m
Jan 31 03:27:25 np0005603623 nova_compute[226235]: 2026-01-31 08:27:25.760 226239 INFO nova.compute.manager [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Took 12.68 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:27:25 np0005603623 nova_compute[226235]: 2026-01-31 08:27:25.763 226239 DEBUG oslo.service.loopingcall [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:27:25 np0005603623 nova_compute[226235]: 2026-01-31 08:27:25.764 226239 DEBUG nova.compute.manager [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:27:25 np0005603623 nova_compute[226235]: 2026-01-31 08:27:25.764 226239 DEBUG nova.network.neutron [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:27:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:26.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:27:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:26.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:27:27 np0005603623 nova_compute[226235]: 2026-01-31 08:27:27.251 226239 DEBUG nova.network.neutron [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:27:27 np0005603623 nova_compute[226235]: 2026-01-31 08:27:27.268 226239 INFO nova.compute.manager [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Took 1.50 seconds to deallocate network for instance.#033[00m
Jan 31 03:27:27 np0005603623 nova_compute[226235]: 2026-01-31 08:27:27.309 226239 DEBUG oslo_concurrency.lockutils [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:27 np0005603623 nova_compute[226235]: 2026-01-31 08:27:27.309 226239 DEBUG oslo_concurrency.lockutils [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:27 np0005603623 nova_compute[226235]: 2026-01-31 08:27:27.321 226239 DEBUG oslo_concurrency.lockutils [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:27 np0005603623 nova_compute[226235]: 2026-01-31 08:27:27.336 226239 DEBUG nova.compute.manager [req-3d3452b1-ffa5-45b0-9c39-05f34f304ab2 req-79764792-8b66-4308-adfc-3471a7f4825e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-deleted-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:27:27 np0005603623 nova_compute[226235]: 2026-01-31 08:27:27.359 226239 INFO nova.scheduler.client.report [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Deleted allocations for instance 08dc30a4-60ae-4f4e-8b5e-e12610df2120#033[00m
Jan 31 03:27:27 np0005603623 nova_compute[226235]: 2026-01-31 08:27:27.436 226239 DEBUG oslo_concurrency.lockutils [None req-7f075c9d-d882-4ca4-a912-af118b687648 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 14.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:28.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:27:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:28.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:27:28 np0005603623 nova_compute[226235]: 2026-01-31 08:27:28.905 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848033.9041207, 08dc30a4-60ae-4f4e-8b5e-e12610df2120 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:27:28 np0005603623 nova_compute[226235]: 2026-01-31 08:27:28.905 226239 INFO nova.compute.manager [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:27:28 np0005603623 nova_compute[226235]: 2026-01-31 08:27:28.995 226239 DEBUG nova.compute.manager [None req-6f5b9c95-f2aa-41d1-b8d8-ef2480ac2244 - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:27:29 np0005603623 nova_compute[226235]: 2026-01-31 08:27:29.111 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:29 np0005603623 nova_compute[226235]: 2026-01-31 08:27:29.417 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:30.120 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:30.120 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:30.121 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:30.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:30.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e294 e294: 3 total, 3 up, 3 in
Jan 31 03:27:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:32.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:32.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e295 e295: 3 total, 3 up, 3 in
Jan 31 03:27:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e296 e296: 3 total, 3 up, 3 in
Jan 31 03:27:34 np0005603623 nova_compute[226235]: 2026-01-31 08:27:34.145 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:34 np0005603623 nova_compute[226235]: 2026-01-31 08:27:34.419 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:34.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:34.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:27:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:36.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:27:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:36.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:38 np0005603623 nova_compute[226235]: 2026-01-31 08:27:38.247 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:38 np0005603623 nova_compute[226235]: 2026-01-31 08:27:38.247 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:38 np0005603623 nova_compute[226235]: 2026-01-31 08:27:38.274 226239 DEBUG nova.compute.manager [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:27:38 np0005603623 nova_compute[226235]: 2026-01-31 08:27:38.423 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:38 np0005603623 nova_compute[226235]: 2026-01-31 08:27:38.423 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:38 np0005603623 nova_compute[226235]: 2026-01-31 08:27:38.429 226239 DEBUG nova.virt.hardware [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:27:38 np0005603623 nova_compute[226235]: 2026-01-31 08:27:38.429 226239 INFO nova.compute.claims [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:27:38 np0005603623 nova_compute[226235]: 2026-01-31 08:27:38.555 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:38.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:38.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.148 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.420 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:27:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2511867587' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.533 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.978s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.537 226239 DEBUG nova.compute.provider_tree [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.554 226239 DEBUG nova.scheduler.client.report [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.585 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.586 226239 DEBUG nova.compute.manager [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.664 226239 DEBUG nova.compute.manager [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.665 226239 DEBUG nova.network.neutron [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.777 226239 INFO nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.813 226239 DEBUG nova.compute.manager [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.921 226239 DEBUG nova.compute.manager [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.922 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.922 226239 INFO nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Creating image(s)#033[00m
Jan 31 03:27:39 np0005603623 nova_compute[226235]: 2026-01-31 08:27:39.982 226239 DEBUG nova.storage.rbd_utils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:27:40 np0005603623 nova_compute[226235]: 2026-01-31 08:27:40.005 226239 DEBUG nova.storage.rbd_utils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:27:40 np0005603623 nova_compute[226235]: 2026-01-31 08:27:40.032 226239 DEBUG nova.storage.rbd_utils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:27:40 np0005603623 nova_compute[226235]: 2026-01-31 08:27:40.036 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:40 np0005603623 nova_compute[226235]: 2026-01-31 08:27:40.054 226239 DEBUG nova.policy [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '432ac8867d8240408db455fc25bb5901', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '491937de020742d7b4e847dc3bf57950', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:27:40 np0005603623 nova_compute[226235]: 2026-01-31 08:27:40.087 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:40 np0005603623 nova_compute[226235]: 2026-01-31 08:27:40.088 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:40 np0005603623 nova_compute[226235]: 2026-01-31 08:27:40.088 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:40 np0005603623 nova_compute[226235]: 2026-01-31 08:27:40.089 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:40 np0005603623 nova_compute[226235]: 2026-01-31 08:27:40.116 226239 DEBUG nova.storage.rbd_utils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:27:40 np0005603623 nova_compute[226235]: 2026-01-31 08:27:40.119 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:40.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:27:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:40.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:27:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e297 e297: 3 total, 3 up, 3 in
Jan 31 03:27:41 np0005603623 nova_compute[226235]: 2026-01-31 08:27:41.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:41 np0005603623 nova_compute[226235]: 2026-01-31 08:27:41.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:27:41 np0005603623 nova_compute[226235]: 2026-01-31 08:27:41.866 226239 DEBUG nova.network.neutron [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Successfully created port: 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:27:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:42.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:42 np0005603623 nova_compute[226235]: 2026-01-31 08:27:42.756 226239 DEBUG nova.network.neutron [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Successfully updated port: 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:27:42 np0005603623 nova_compute[226235]: 2026-01-31 08:27:42.791 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:27:42 np0005603623 nova_compute[226235]: 2026-01-31 08:27:42.792 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquired lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:27:42 np0005603623 nova_compute[226235]: 2026-01-31 08:27:42.792 226239 DEBUG nova.network.neutron [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:27:42 np0005603623 nova_compute[226235]: 2026-01-31 08:27:42.869 226239 DEBUG nova.compute.manager [req-0d36795e-5815-493d-8121-50dc0eae3445 req-b6864f41-559e-409d-a65d-6cf9e984743c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-changed-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:27:42 np0005603623 nova_compute[226235]: 2026-01-31 08:27:42.869 226239 DEBUG nova.compute.manager [req-0d36795e-5815-493d-8121-50dc0eae3445 req-b6864f41-559e-409d-a65d-6cf9e984743c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Refreshing instance network info cache due to event network-changed-58bbf6e9-a33b-4f2b-81e8-812adc1221b5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:27:42 np0005603623 nova_compute[226235]: 2026-01-31 08:27:42.869 226239 DEBUG oslo_concurrency.lockutils [req-0d36795e-5815-493d-8121-50dc0eae3445 req-b6864f41-559e-409d-a65d-6cf9e984743c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:27:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:42.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:42 np0005603623 nova_compute[226235]: 2026-01-31 08:27:42.948 226239 DEBUG nova.network.neutron [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.187 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.188 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.188 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.188 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.189 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e297 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:27:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1172759248' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.768 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.579s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.895 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.896 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4400MB free_disk=20.876449584960938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.896 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.897 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.976 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance b9f38b79-63fc-48a1-a367-6998b8d6a9dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.976 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:27:43 np0005603623 nova_compute[226235]: 2026-01-31 08:27:43.977 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:27:44 np0005603623 nova_compute[226235]: 2026-01-31 08:27:44.022 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:44 np0005603623 nova_compute[226235]: 2026-01-31 08:27:44.039 226239 DEBUG nova.network.neutron [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updating instance_info_cache with network_info: [{"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:27:44 np0005603623 nova_compute[226235]: 2026-01-31 08:27:44.063 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Releasing lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:27:44 np0005603623 nova_compute[226235]: 2026-01-31 08:27:44.063 226239 DEBUG nova.compute.manager [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Instance network_info: |[{"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:27:44 np0005603623 nova_compute[226235]: 2026-01-31 08:27:44.064 226239 DEBUG oslo_concurrency.lockutils [req-0d36795e-5815-493d-8121-50dc0eae3445 req-b6864f41-559e-409d-a65d-6cf9e984743c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:27:44 np0005603623 nova_compute[226235]: 2026-01-31 08:27:44.064 226239 DEBUG nova.network.neutron [req-0d36795e-5815-493d-8121-50dc0eae3445 req-b6864f41-559e-409d-a65d-6cf9e984743c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Refreshing network info cache for port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:27:44 np0005603623 nova_compute[226235]: 2026-01-31 08:27:44.151 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:44 np0005603623 nova_compute[226235]: 2026-01-31 08:27:44.425 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:44.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:27:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:44.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:27:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e298 e298: 3 total, 3 up, 3 in
Jan 31 03:27:45 np0005603623 nova_compute[226235]: 2026-01-31 08:27:45.299 226239 DEBUG nova.network.neutron [req-0d36795e-5815-493d-8121-50dc0eae3445 req-b6864f41-559e-409d-a65d-6cf9e984743c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updated VIF entry in instance network info cache for port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:27:45 np0005603623 nova_compute[226235]: 2026-01-31 08:27:45.300 226239 DEBUG nova.network.neutron [req-0d36795e-5815-493d-8121-50dc0eae3445 req-b6864f41-559e-409d-a65d-6cf9e984743c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updating instance_info_cache with network_info: [{"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:27:45 np0005603623 nova_compute[226235]: 2026-01-31 08:27:45.428 226239 DEBUG oslo_concurrency.lockutils [req-0d36795e-5815-493d-8121-50dc0eae3445 req-b6864f41-559e-409d-a65d-6cf9e984743c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:27:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:27:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1773551707' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:27:45 np0005603623 nova_compute[226235]: 2026-01-31 08:27:45.722 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.700s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:45 np0005603623 nova_compute[226235]: 2026-01-31 08:27:45.727 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:27:45 np0005603623 nova_compute[226235]: 2026-01-31 08:27:45.822 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:27:46 np0005603623 nova_compute[226235]: 2026-01-31 08:27:46.055 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:27:46 np0005603623 nova_compute[226235]: 2026-01-31 08:27:46.055 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:46.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:46.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:47 np0005603623 nova_compute[226235]: 2026-01-31 08:27:47.055 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:47 np0005603623 nova_compute[226235]: 2026-01-31 08:27:47.056 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:27:47 np0005603623 nova_compute[226235]: 2026-01-31 08:27:47.056 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:27:47 np0005603623 nova_compute[226235]: 2026-01-31 08:27:47.070 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 6.951s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:47 np0005603623 nova_compute[226235]: 2026-01-31 08:27:47.149 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:27:47 np0005603623 nova_compute[226235]: 2026-01-31 08:27:47.149 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:27:47 np0005603623 nova_compute[226235]: 2026-01-31 08:27:47.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:47 np0005603623 nova_compute[226235]: 2026-01-31 08:27:47.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:47 np0005603623 nova_compute[226235]: 2026-01-31 08:27:47.154 226239 DEBUG nova.storage.rbd_utils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] resizing rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:27:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:48.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.663 226239 DEBUG nova.objects.instance [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'migration_context' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.713 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.714 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Ensure instance console log exists: /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.714 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.715 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.716 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.718 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Start _get_guest_xml network_info=[{"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.721 226239 WARNING nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.726 226239 DEBUG nova.virt.libvirt.host [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.727 226239 DEBUG nova.virt.libvirt.host [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.731 226239 DEBUG nova.virt.libvirt.host [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.731 226239 DEBUG nova.virt.libvirt.host [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.733 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.733 226239 DEBUG nova.virt.hardware [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.733 226239 DEBUG nova.virt.hardware [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.734 226239 DEBUG nova.virt.hardware [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.734 226239 DEBUG nova.virt.hardware [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.735 226239 DEBUG nova.virt.hardware [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.735 226239 DEBUG nova.virt.hardware [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.735 226239 DEBUG nova.virt.hardware [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.736 226239 DEBUG nova.virt.hardware [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.736 226239 DEBUG nova.virt.hardware [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.736 226239 DEBUG nova.virt.hardware [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.737 226239 DEBUG nova.virt.hardware [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:27:48 np0005603623 nova_compute[226235]: 2026-01-31 08:27:48.740 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:48.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.154 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:27:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1914660991' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.344 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.604s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.380 226239 DEBUG nova.storage.rbd_utils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.387 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.425 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:27:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2769671720' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.796 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.798 226239 DEBUG nova.virt.libvirt.vif [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:27:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2092089502',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-2092089502',id=123,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDqJ2u98tnCfusFKrUeql0ngSDNf86DLAElp/RNmhRZkam9aFuB8mUdP/dAMmSCZVQ6AaZGjQO8tc+tThhzKBQRodouufnRusHHQiOXeUQ9hnIPnIcTcQ3b1LbRSS3JzxA==',key_name='tempest-keypair-1914515639',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='491937de020742d7b4e847dc3bf57950',ramdisk_id='',reservation_id='r-zq57k11v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-60119558',owner_user_name='tempest-AttachVolumeShelveTestJSON-60119558-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:27:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='432ac8867d8240408db455fc25bb5901',uuid=b9f38b79-63fc-48a1-a367-6998b8d6a9dc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.798 226239 DEBUG nova.network.os_vif_util [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converting VIF {"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.799 226239 DEBUG nova.network.os_vif_util [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.800 226239 DEBUG nova.objects.instance [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'pci_devices' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.820 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  <uuid>b9f38b79-63fc-48a1-a367-6998b8d6a9dc</uuid>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  <name>instance-0000007b</name>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-2092089502</nova:name>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:27:48</nova:creationTime>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <nova:user uuid="432ac8867d8240408db455fc25bb5901">tempest-AttachVolumeShelveTestJSON-60119558-project-member</nova:user>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <nova:project uuid="491937de020742d7b4e847dc3bf57950">tempest-AttachVolumeShelveTestJSON-60119558</nova:project>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <nova:port uuid="58bbf6e9-a33b-4f2b-81e8-812adc1221b5">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <entry name="serial">b9f38b79-63fc-48a1-a367-6998b8d6a9dc</entry>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <entry name="uuid">b9f38b79-63fc-48a1-a367-6998b8d6a9dc</entry>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk.config">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:ff:fd:8c"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <target dev="tap58bbf6e9-a3"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/console.log" append="off"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:27:49 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:27:49 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:27:49 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:27:49 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.821 226239 DEBUG nova.compute.manager [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Preparing to wait for external event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.821 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.822 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.822 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.822 226239 DEBUG nova.virt.libvirt.vif [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:27:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2092089502',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-2092089502',id=123,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDqJ2u98tnCfusFKrUeql0ngSDNf86DLAElp/RNmhRZkam9aFuB8mUdP/dAMmSCZVQ6AaZGjQO8tc+tThhzKBQRodouufnRusHHQiOXeUQ9hnIPnIcTcQ3b1LbRSS3JzxA==',key_name='tempest-keypair-1914515639',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='491937de020742d7b4e847dc3bf57950',ramdisk_id='',reservation_id='r-zq57k11v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-60119558',owner_user_name='tempest-AttachVolumeShelveTestJSON-60119558-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:27:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='432ac8867d8240408db455fc25bb5901',uuid=b9f38b79-63fc-48a1-a367-6998b8d6a9dc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.823 226239 DEBUG nova.network.os_vif_util [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converting VIF {"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.823 226239 DEBUG nova.network.os_vif_util [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.824 226239 DEBUG os_vif [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.824 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.825 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.825 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.827 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.827 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58bbf6e9-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.827 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap58bbf6e9-a3, col_values=(('external_ids', {'iface-id': '58bbf6e9-a33b-4f2b-81e8-812adc1221b5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:fd:8c', 'vm-uuid': 'b9f38b79-63fc-48a1-a367-6998b8d6a9dc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.828 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:49 np0005603623 NetworkManager[48970]: <info>  [1769848069.8297] manager: (tap58bbf6e9-a3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/239)
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.830 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.834 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.834 226239 INFO os_vif [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3')#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.960 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.961 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.961 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No VIF found with MAC fa:16:3e:ff:fd:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.961 226239 INFO nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Using config drive#033[00m
Jan 31 03:27:49 np0005603623 nova_compute[226235]: 2026-01-31 08:27:49.989 226239 DEBUG nova.storage.rbd_utils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:27:50 np0005603623 nova_compute[226235]: 2026-01-31 08:27:50.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:50 np0005603623 nova_compute[226235]: 2026-01-31 08:27:50.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:50.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:50.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:50 np0005603623 nova_compute[226235]: 2026-01-31 08:27:50.952 226239 INFO nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Creating config drive at /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/disk.config#033[00m
Jan 31 03:27:50 np0005603623 nova_compute[226235]: 2026-01-31 08:27:50.958 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpf69cu7s6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:51 np0005603623 nova_compute[226235]: 2026-01-31 08:27:51.079 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpf69cu7s6" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:51 np0005603623 nova_compute[226235]: 2026-01-31 08:27:51.190 226239 DEBUG nova.storage.rbd_utils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:27:51 np0005603623 nova_compute[226235]: 2026-01-31 08:27:51.194 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/disk.config b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:51.991 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:27:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:27:51.992 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:27:52 np0005603623 nova_compute[226235]: 2026-01-31 08:27:52.038 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:52 np0005603623 nova_compute[226235]: 2026-01-31 08:27:52.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:52.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:52.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:52 np0005603623 podman[281972]: 2026-01-31 08:27:52.955269301 +0000 UTC m=+0.047952684 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 03:27:52 np0005603623 podman[281973]: 2026-01-31 08:27:52.973268166 +0000 UTC m=+0.066123865 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 03:27:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:54 np0005603623 nova_compute[226235]: 2026-01-31 08:27:54.427 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:27:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:54.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:27:54 np0005603623 nova_compute[226235]: 2026-01-31 08:27:54.829 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:27:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:54.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:27:54 np0005603623 nova_compute[226235]: 2026-01-31 08:27:54.903 226239 DEBUG oslo_concurrency.processutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/disk.config b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.709s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:54 np0005603623 nova_compute[226235]: 2026-01-31 08:27:54.904 226239 INFO nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Deleting local config drive /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/disk.config because it was imported into RBD.#033[00m
Jan 31 03:27:54 np0005603623 kernel: tap58bbf6e9-a3: entered promiscuous mode
Jan 31 03:27:54 np0005603623 NetworkManager[48970]: <info>  [1769848074.9468] manager: (tap58bbf6e9-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/240)
Jan 31 03:27:54 np0005603623 nova_compute[226235]: 2026-01-31 08:27:54.948 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:54 np0005603623 ovn_controller[133449]: 2026-01-31T08:27:54Z|00501|binding|INFO|Claiming lport 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 for this chassis.
Jan 31 03:27:54 np0005603623 ovn_controller[133449]: 2026-01-31T08:27:54Z|00502|binding|INFO|58bbf6e9-a33b-4f2b-81e8-812adc1221b5: Claiming fa:16:3e:ff:fd:8c 10.100.0.11
Jan 31 03:27:54 np0005603623 ovn_controller[133449]: 2026-01-31T08:27:54Z|00503|binding|INFO|Setting lport 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 ovn-installed in OVS
Jan 31 03:27:54 np0005603623 nova_compute[226235]: 2026-01-31 08:27:54.958 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:54 np0005603623 nova_compute[226235]: 2026-01-31 08:27:54.961 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:54 np0005603623 systemd-machined[194379]: New machine qemu-56-instance-0000007b.
Jan 31 03:27:54 np0005603623 systemd[1]: Started Virtual Machine qemu-56-instance-0000007b.
Jan 31 03:27:54 np0005603623 systemd-udevd[282031]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:27:55 np0005603623 NetworkManager[48970]: <info>  [1769848075.0036] device (tap58bbf6e9-a3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:27:55 np0005603623 NetworkManager[48970]: <info>  [1769848075.0049] device (tap58bbf6e9-a3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:27:56 np0005603623 nova_compute[226235]: 2026-01-31 08:27:56.078 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848076.0782948, b9f38b79-63fc-48a1-a367-6998b8d6a9dc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:27:56 np0005603623 nova_compute[226235]: 2026-01-31 08:27:56.079 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] VM Started (Lifecycle Event)#033[00m
Jan 31 03:27:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:56.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e299 e299: 3 total, 3 up, 3 in
Jan 31 03:27:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:56.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:58.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:27:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:58.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:59 np0005603623 nova_compute[226235]: 2026-01-31 08:27:59.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:59 np0005603623 nova_compute[226235]: 2026-01-31 08:27:59.466 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e300 e300: 3 total, 3 up, 3 in
Jan 31 03:27:59 np0005603623 nova_compute[226235]: 2026-01-31 08:27:59.831 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:00.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:00.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:01.995 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:02.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:28:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:02.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.313 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:fd:8c 10.100.0.11'], port_security=['fa:16:3e:ff:fd:8c 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b9f38b79-63fc-48a1-a367-6998b8d6a9dc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6525247d-48b2-4359-a813-d7276403ba32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '491937de020742d7b4e847dc3bf57950', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1bba0198-0b6e-463b-bfab-427572262107', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c7370ba-0307-4b10-bef7-8ff686d828f1, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=58bbf6e9-a33b-4f2b-81e8-812adc1221b5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:28:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:28:03Z|00504|binding|INFO|Setting lport 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 up in Southbound
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.314 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 in datapath 6525247d-48b2-4359-a813-d7276403ba32 bound to our chassis#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.339 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.343 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848076.078413, b9f38b79-63fc-48a1-a367-6998b8d6a9dc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.343 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.405 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6525247d-48b2-4359-a813-d7276403ba32#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.415 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[eff5e06a-701d-4594-b5e5-445a154adf47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.416 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6525247d-41 in ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.417 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6525247d-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.417 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ad77bfd0-b889-422e-b8f2-9cde8bc6e9ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.418 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6d7d58db-4cd4-4646-a848-a7cdf5d7e079]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.425 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[31919e49-fd4f-4276-8ed7-d60dcc94bec5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.426 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.430 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.434 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b7112db1-f710-4d55-bbe8-d25af24e7a3f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.454 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f5c6da4f-0769-4701-b16b-0e295942a760]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.460 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[794a19f9-ebce-4480-9362-2fdb8143e5bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 NetworkManager[48970]: <info>  [1769848083.4613] manager: (tap6525247d-40): new Veth device (/org/freedesktop/NetworkManager/Devices/241)
Jan 31 03:28:03 np0005603623 systemd-udevd[282093]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.483 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6a270b23-18e6-4aca-9923-134e21525c0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.486 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[dd10d5c9-0959-402c-8267-9e39182cde87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:03 np0005603623 NetworkManager[48970]: <info>  [1769848083.5012] device (tap6525247d-40): carrier: link connected
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.505 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0db630-73fc-4f94-a506-31bd364a8d29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.520 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2c3433-0ae7-44ee-b613-87172d586fe6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6525247d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:c8:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 722393, 'reachable_time': 31127, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282112, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.529 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2a7148ff-46f5-4710-85f8-cd1c05a39834]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec7:c843'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 722393, 'tstamp': 722393}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 282113, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.539 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[55440685-3673-462c-9677-ad77fa3a25e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6525247d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:c8:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 722393, 'reachable_time': 31127, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 282114, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.543 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.559 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc4ebe5-42ef-4123-8c88-a2f47a616e4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.602 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[261c68a2-a6fa-40dc-9b30-73d32ac167a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.603 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6525247d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.603 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.604 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6525247d-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:03 np0005603623 NetworkManager[48970]: <info>  [1769848083.6062] manager: (tap6525247d-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/242)
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.606 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:03 np0005603623 kernel: tap6525247d-40: entered promiscuous mode
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.609 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6525247d-40, col_values=(('external_ids', {'iface-id': '044f1919-2550-4bba-9baa-5d3f39f69ec6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:28:03Z|00505|binding|INFO|Releasing lport 044f1919-2550-4bba-9baa-5d3f39f69ec6 from this chassis (sb_readonly=0)
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.610 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.611 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6525247d-48b2-4359-a813-d7276403ba32.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6525247d-48b2-4359-a813-d7276403ba32.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.612 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7e84f8c6-d51c-4d9f-ae5e-35f4c45979cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.613 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-6525247d-48b2-4359-a813-d7276403ba32
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/6525247d-48b2-4359-a813-d7276403ba32.pid.haproxy
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 6525247d-48b2-4359-a813-d7276403ba32
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:28:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:03.613 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'env', 'PROCESS_TAG=haproxy-6525247d-48b2-4359-a813-d7276403ba32', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6525247d-48b2-4359-a813-d7276403ba32.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.615 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.964 226239 DEBUG nova.compute.manager [req-1428da3d-acf0-4b8d-a5f5-51fce30205c4 req-0acecb6e-2379-4022-b5cf-7e9574408507 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.964 226239 DEBUG oslo_concurrency.lockutils [req-1428da3d-acf0-4b8d-a5f5-51fce30205c4 req-0acecb6e-2379-4022-b5cf-7e9574408507 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.965 226239 DEBUG oslo_concurrency.lockutils [req-1428da3d-acf0-4b8d-a5f5-51fce30205c4 req-0acecb6e-2379-4022-b5cf-7e9574408507 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.965 226239 DEBUG oslo_concurrency.lockutils [req-1428da3d-acf0-4b8d-a5f5-51fce30205c4 req-0acecb6e-2379-4022-b5cf-7e9574408507 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.965 226239 DEBUG nova.compute.manager [req-1428da3d-acf0-4b8d-a5f5-51fce30205c4 req-0acecb6e-2379-4022-b5cf-7e9574408507 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Processing event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.965 226239 DEBUG nova.compute.manager [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Instance event wait completed in 7 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.968 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848083.9681082, b9f38b79-63fc-48a1-a367-6998b8d6a9dc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.968 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.970 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.973 226239 INFO nova.virt.libvirt.driver [-] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Instance spawned successfully.#033[00m
Jan 31 03:28:03 np0005603623 nova_compute[226235]: 2026-01-31 08:28:03.974 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:28:03 np0005603623 podman[282147]: 2026-01-31 08:28:03.900950762 +0000 UTC m=+0.019418330 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:28:04 np0005603623 podman[282147]: 2026-01-31 08:28:04.040853957 +0000 UTC m=+0.159321495 container create 9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.051 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.055 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.066 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.066 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.067 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.067 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.068 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.068 226239 DEBUG nova.virt.libvirt.driver [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.164 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:28:04 np0005603623 systemd[1]: Started libpod-conmon-9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c.scope.
Jan 31 03:28:04 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:28:04 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3f9fbfe84b5b822c2a575a1599aa6f9797b076043caedcb203ed0e7917b9d04/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.246 226239 INFO nova.compute.manager [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Took 24.32 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.246 226239 DEBUG nova.compute.manager [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:04 np0005603623 podman[282147]: 2026-01-31 08:28:04.313543696 +0000 UTC m=+0.432011254 container init 9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 03:28:04 np0005603623 podman[282147]: 2026-01-31 08:28:04.31814955 +0000 UTC m=+0.436617078 container start 9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:28:04 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[282163]: [NOTICE]   (282167) : New worker (282169) forked
Jan 31 03:28:04 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[282163]: [NOTICE]   (282167) : Loading success.
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.385 226239 INFO nova.compute.manager [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Took 26.00 seconds to build instance.#033[00m
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.430 226239 DEBUG oslo_concurrency.lockutils [None req-ca307143-1d4c-486f-b988-f88b8a914f66 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.468 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:28:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:04.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:28:04 np0005603623 nova_compute[226235]: 2026-01-31 08:28:04.833 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:04.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e301 e301: 3 total, 3 up, 3 in
Jan 31 03:28:06 np0005603623 nova_compute[226235]: 2026-01-31 08:28:06.047 226239 DEBUG nova.compute.manager [req-84e67a0e-83f7-429f-a845-bc2c8a66f7bb req-d2bdd68b-32fd-4489-a339-6703cb1c9fda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:06 np0005603623 nova_compute[226235]: 2026-01-31 08:28:06.048 226239 DEBUG oslo_concurrency.lockutils [req-84e67a0e-83f7-429f-a845-bc2c8a66f7bb req-d2bdd68b-32fd-4489-a339-6703cb1c9fda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:06 np0005603623 nova_compute[226235]: 2026-01-31 08:28:06.048 226239 DEBUG oslo_concurrency.lockutils [req-84e67a0e-83f7-429f-a845-bc2c8a66f7bb req-d2bdd68b-32fd-4489-a339-6703cb1c9fda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:06 np0005603623 nova_compute[226235]: 2026-01-31 08:28:06.048 226239 DEBUG oslo_concurrency.lockutils [req-84e67a0e-83f7-429f-a845-bc2c8a66f7bb req-d2bdd68b-32fd-4489-a339-6703cb1c9fda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:06 np0005603623 nova_compute[226235]: 2026-01-31 08:28:06.048 226239 DEBUG nova.compute.manager [req-84e67a0e-83f7-429f-a845-bc2c8a66f7bb req-d2bdd68b-32fd-4489-a339-6703cb1c9fda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] No waiting events found dispatching network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:06 np0005603623 nova_compute[226235]: 2026-01-31 08:28:06.048 226239 WARNING nova.compute.manager [req-84e67a0e-83f7-429f-a845-bc2c8a66f7bb req-d2bdd68b-32fd-4489-a339-6703cb1c9fda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received unexpected event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:28:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e302 e302: 3 total, 3 up, 3 in
Jan 31 03:28:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:06.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:06.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:07 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #53. Immutable memtables: 9.
Jan 31 03:28:08 np0005603623 ovn_controller[133449]: 2026-01-31T08:28:08Z|00506|binding|INFO|Releasing lport 044f1919-2550-4bba-9baa-5d3f39f69ec6 from this chassis (sb_readonly=0)
Jan 31 03:28:08 np0005603623 nova_compute[226235]: 2026-01-31 08:28:08.363 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:08 np0005603623 nova_compute[226235]: 2026-01-31 08:28:08.545 226239 DEBUG nova.compute.manager [req-ef05c6ec-abab-4ffa-a581-f91a074261ef req-1a31e69d-983b-465f-b408-fcb2bb51f4ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-changed-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:08 np0005603623 nova_compute[226235]: 2026-01-31 08:28:08.546 226239 DEBUG nova.compute.manager [req-ef05c6ec-abab-4ffa-a581-f91a074261ef req-1a31e69d-983b-465f-b408-fcb2bb51f4ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Refreshing instance network info cache due to event network-changed-58bbf6e9-a33b-4f2b-81e8-812adc1221b5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:28:08 np0005603623 nova_compute[226235]: 2026-01-31 08:28:08.546 226239 DEBUG oslo_concurrency.lockutils [req-ef05c6ec-abab-4ffa-a581-f91a074261ef req-1a31e69d-983b-465f-b408-fcb2bb51f4ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:28:08 np0005603623 nova_compute[226235]: 2026-01-31 08:28:08.547 226239 DEBUG oslo_concurrency.lockutils [req-ef05c6ec-abab-4ffa-a581-f91a074261ef req-1a31e69d-983b-465f-b408-fcb2bb51f4ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:28:08 np0005603623 nova_compute[226235]: 2026-01-31 08:28:08.547 226239 DEBUG nova.network.neutron [req-ef05c6ec-abab-4ffa-a581-f91a074261ef req-1a31e69d-983b-465f-b408-fcb2bb51f4ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Refreshing network info cache for port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:28:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:08.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:08.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:09 np0005603623 nova_compute[226235]: 2026-01-31 08:28:09.471 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #103. Immutable memtables: 0.
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:09.657568) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 63] Flushing memtable with next log file: 103
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848089657646, "job": 63, "event": "flush_started", "num_memtables": 1, "num_entries": 1929, "num_deletes": 272, "total_data_size": 4566460, "memory_usage": 4634784, "flush_reason": "Manual Compaction"}
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 63] Level-0 flush table #104: started
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848089683534, "cf_name": "default", "job": 63, "event": "table_file_creation", "file_number": 104, "file_size": 2999456, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 52640, "largest_seqno": 54564, "table_properties": {"data_size": 2990892, "index_size": 5314, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 17994, "raw_average_key_size": 20, "raw_value_size": 2973760, "raw_average_value_size": 3461, "num_data_blocks": 230, "num_entries": 859, "num_filter_entries": 859, "num_deletions": 272, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847925, "oldest_key_time": 1769847925, "file_creation_time": 1769848089, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 104, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 63] Flush lasted 26019 microseconds, and 6033 cpu microseconds.
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:09.683593) [db/flush_job.cc:967] [default] [JOB 63] Level-0 flush table #104: 2999456 bytes OK
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:09.683619) [db/memtable_list.cc:519] [default] Level-0 commit table #104 started
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:09.741862) [db/memtable_list.cc:722] [default] Level-0 commit table #104: memtable #1 done
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:09.741914) EVENT_LOG_v1 {"time_micros": 1769848089741902, "job": 63, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:09.741942) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 63] Try to delete WAL files size 4557461, prev total WAL file size 4557461, number of live WAL files 2.
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000100.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:09.742878) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031373537' seq:72057594037927935, type:22 .. '6C6F676D0032303133' seq:0, type:0; will stop at (end)
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 64] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 63 Base level 0, inputs: [104(2929KB)], [102(11MB)]
Jan 31 03:28:09 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848089742964, "job": 64, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [104], "files_L6": [102], "score": -1, "input_data_size": 14540525, "oldest_snapshot_seqno": -1}
Jan 31 03:28:09 np0005603623 nova_compute[226235]: 2026-01-31 08:28:09.880 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:10 np0005603623 nova_compute[226235]: 2026-01-31 08:28:10.208 226239 DEBUG nova.network.neutron [req-ef05c6ec-abab-4ffa-a581-f91a074261ef req-1a31e69d-983b-465f-b408-fcb2bb51f4ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updated VIF entry in instance network info cache for port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:28:10 np0005603623 nova_compute[226235]: 2026-01-31 08:28:10.209 226239 DEBUG nova.network.neutron [req-ef05c6ec-abab-4ffa-a581-f91a074261ef req-1a31e69d-983b-465f-b408-fcb2bb51f4ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updating instance_info_cache with network_info: [{"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 64] Generated table #105: 7999 keys, 14407633 bytes, temperature: kUnknown
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848090255787, "cf_name": "default", "job": 64, "event": "table_file_creation", "file_number": 105, "file_size": 14407633, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14351100, "index_size": 35429, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20037, "raw_key_size": 206410, "raw_average_key_size": 25, "raw_value_size": 14205785, "raw_average_value_size": 1775, "num_data_blocks": 1406, "num_entries": 7999, "num_filter_entries": 7999, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769848089, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 105, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:28:10 np0005603623 nova_compute[226235]: 2026-01-31 08:28:10.272 226239 DEBUG oslo_concurrency.lockutils [req-ef05c6ec-abab-4ffa-a581-f91a074261ef req-1a31e69d-983b-465f-b408-fcb2bb51f4ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:10.256158) [db/compaction/compaction_job.cc:1663] [default] [JOB 64] Compacted 1@0 + 1@6 files to L6 => 14407633 bytes
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:10.386315) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 28.3 rd, 28.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 11.0 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(9.7) write-amplify(4.8) OK, records in: 8551, records dropped: 552 output_compression: NoCompression
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:10.386362) EVENT_LOG_v1 {"time_micros": 1769848090386343, "job": 64, "event": "compaction_finished", "compaction_time_micros": 513033, "compaction_time_cpu_micros": 31963, "output_level": 6, "num_output_files": 1, "total_output_size": 14407633, "num_input_records": 8551, "num_output_records": 7999, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000104.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848090387096, "job": 64, "event": "table_file_deletion", "file_number": 104}
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000102.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848090389130, "job": 64, "event": "table_file_deletion", "file_number": 102}
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:09.742772) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:10.389229) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:10.389235) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:10.389236) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:10.389237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:28:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:28:10.389239) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:28:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:10.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:10.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:12.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:12 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:28:12 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:28:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:28:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:12.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:28:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:28:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:28:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:28:14 np0005603623 nova_compute[226235]: 2026-01-31 08:28:14.519 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:14.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e303 e303: 3 total, 3 up, 3 in
Jan 31 03:28:14 np0005603623 nova_compute[226235]: 2026-01-31 08:28:14.882 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:28:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:14.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:28:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:16.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:16.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:18.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:28:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:18.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:28:19 np0005603623 nova_compute[226235]: 2026-01-31 08:28:19.521 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:19 np0005603623 nova_compute[226235]: 2026-01-31 08:28:19.884 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:28:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:20.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:28:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:20.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:28:21Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:fd:8c 10.100.0.11
Jan 31 03:28:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:28:21Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:fd:8c 10.100.0.11
Jan 31 03:28:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:22.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:22.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:23 np0005603623 podman[282369]: 2026-01-31 08:28:23.963210524 +0000 UTC m=+0.051512395 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:28:23 np0005603623 podman[282370]: 2026-01-31 08:28:23.992144221 +0000 UTC m=+0.080526755 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:28:24 np0005603623 nova_compute[226235]: 2026-01-31 08:28:24.522 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:24.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:24 np0005603623 nova_compute[226235]: 2026-01-31 08:28:24.924 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:28:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:24.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:28:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:28:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:28:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:26.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:26.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:28.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:28.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:29 np0005603623 nova_compute[226235]: 2026-01-31 08:28:29.524 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:29 np0005603623 nova_compute[226235]: 2026-01-31 08:28:29.970 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:30.121 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:30.121 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:30.122 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:30.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:30.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:28:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:32.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:28:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:32.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:34 np0005603623 nova_compute[226235]: 2026-01-31 08:28:34.569 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:34.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:34.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:34 np0005603623 nova_compute[226235]: 2026-01-31 08:28:34.973 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:36.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:36.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:38.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:38.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:39 np0005603623 nova_compute[226235]: 2026-01-31 08:28:39.570 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:39 np0005603623 nova_compute[226235]: 2026-01-31 08:28:39.974 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:28:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:40.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:28:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:40.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:41 np0005603623 nova_compute[226235]: 2026-01-31 08:28:41.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:41 np0005603623 nova_compute[226235]: 2026-01-31 08:28:41.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:28:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:42.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:42.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:43 np0005603623 nova_compute[226235]: 2026-01-31 08:28:43.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:43 np0005603623 nova_compute[226235]: 2026-01-31 08:28:43.241 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:43 np0005603623 nova_compute[226235]: 2026-01-31 08:28:43.241 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:43 np0005603623 nova_compute[226235]: 2026-01-31 08:28:43.242 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:43 np0005603623 nova_compute[226235]: 2026-01-31 08:28:43.242 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:28:43 np0005603623 nova_compute[226235]: 2026-01-31 08:28:43.242 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:28:43 np0005603623 nova_compute[226235]: 2026-01-31 08:28:43.414 226239 DEBUG oslo_concurrency.lockutils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:43 np0005603623 nova_compute[226235]: 2026-01-31 08:28:43.416 226239 DEBUG oslo_concurrency.lockutils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:43 np0005603623 nova_compute[226235]: 2026-01-31 08:28:43.416 226239 INFO nova.compute.manager [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Shelving#033[00m
Jan 31 03:28:43 np0005603623 nova_compute[226235]: 2026-01-31 08:28:43.469 226239 DEBUG nova.virt.libvirt.driver [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:28:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:28:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2025627057' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:28:44 np0005603623 nova_compute[226235]: 2026-01-31 08:28:44.105 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.863s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:28:44 np0005603623 nova_compute[226235]: 2026-01-31 08:28:44.572 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:44.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:44.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:44 np0005603623 nova_compute[226235]: 2026-01-31 08:28:44.976 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:45 np0005603623 nova_compute[226235]: 2026-01-31 08:28:45.549 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000007b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:28:45 np0005603623 nova_compute[226235]: 2026-01-31 08:28:45.551 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000007b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:28:45 np0005603623 nova_compute[226235]: 2026-01-31 08:28:45.707 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:28:45 np0005603623 nova_compute[226235]: 2026-01-31 08:28:45.708 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4245MB free_disk=20.77987289428711GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:28:45 np0005603623 nova_compute[226235]: 2026-01-31 08:28:45.709 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:45 np0005603623 nova_compute[226235]: 2026-01-31 08:28:45.709 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:45 np0005603623 nova_compute[226235]: 2026-01-31 08:28:45.983 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance b9f38b79-63fc-48a1-a367-6998b8d6a9dc actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:28:45 np0005603623 nova_compute[226235]: 2026-01-31 08:28:45.983 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:28:45 np0005603623 nova_compute[226235]: 2026-01-31 08:28:45.983 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:28:46 np0005603623 nova_compute[226235]: 2026-01-31 08:28:46.132 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:28:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:28:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3804210784' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:28:46 np0005603623 nova_compute[226235]: 2026-01-31 08:28:46.566 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:28:46 np0005603623 nova_compute[226235]: 2026-01-31 08:28:46.571 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:28:46 np0005603623 nova_compute[226235]: 2026-01-31 08:28:46.669 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:28:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:46.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:46 np0005603623 nova_compute[226235]: 2026-01-31 08:28:46.736 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:28:46 np0005603623 nova_compute[226235]: 2026-01-31 08:28:46.736 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:28:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:46.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:28:47 np0005603623 nova_compute[226235]: 2026-01-31 08:28:47.737 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:47 np0005603623 nova_compute[226235]: 2026-01-31 08:28:47.737 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:28:47 np0005603623 nova_compute[226235]: 2026-01-31 08:28:47.738 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:28:47 np0005603623 nova_compute[226235]: 2026-01-31 08:28:47.837 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:28:47 np0005603623 nova_compute[226235]: 2026-01-31 08:28:47.837 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:28:47 np0005603623 nova_compute[226235]: 2026-01-31 08:28:47.837 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:28:47 np0005603623 nova_compute[226235]: 2026-01-31 08:28:47.837 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:47 np0005603623 kernel: tap58bbf6e9-a3 (unregistering): left promiscuous mode
Jan 31 03:28:47 np0005603623 NetworkManager[48970]: <info>  [1769848127.9284] device (tap58bbf6e9-a3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:28:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:28:47Z|00507|binding|INFO|Releasing lport 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 from this chassis (sb_readonly=0)
Jan 31 03:28:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:28:47Z|00508|binding|INFO|Setting lport 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 down in Southbound
Jan 31 03:28:47 np0005603623 nova_compute[226235]: 2026-01-31 08:28:47.938 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:28:47Z|00509|binding|INFO|Removing iface tap58bbf6e9-a3 ovn-installed in OVS
Jan 31 03:28:47 np0005603623 nova_compute[226235]: 2026-01-31 08:28:47.941 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:47 np0005603623 nova_compute[226235]: 2026-01-31 08:28:47.949 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:47 np0005603623 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000007b.scope: Deactivated successfully.
Jan 31 03:28:47 np0005603623 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000007b.scope: Consumed 13.885s CPU time.
Jan 31 03:28:47 np0005603623 systemd-machined[194379]: Machine qemu-56-instance-0000007b terminated.
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.087 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:fd:8c 10.100.0.11'], port_security=['fa:16:3e:ff:fd:8c 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b9f38b79-63fc-48a1-a367-6998b8d6a9dc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6525247d-48b2-4359-a813-d7276403ba32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '491937de020742d7b4e847dc3bf57950', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1bba0198-0b6e-463b-bfab-427572262107', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.192'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c7370ba-0307-4b10-bef7-8ff686d828f1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=58bbf6e9-a33b-4f2b-81e8-812adc1221b5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.089 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 in datapath 6525247d-48b2-4359-a813-d7276403ba32 unbound from our chassis#033[00m
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.090 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6525247d-48b2-4359-a813-d7276403ba32, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.091 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9dfc4b6d-4f19-4f23-bfb7-6ce2bbe7e9e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.091 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 namespace which is not needed anymore#033[00m
Jan 31 03:28:48 np0005603623 nova_compute[226235]: 2026-01-31 08:28:48.155 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:48 np0005603623 nova_compute[226235]: 2026-01-31 08:28:48.158 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:48 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[282163]: [NOTICE]   (282167) : haproxy version is 2.8.14-c23fe91
Jan 31 03:28:48 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[282163]: [NOTICE]   (282167) : path to executable is /usr/sbin/haproxy
Jan 31 03:28:48 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[282163]: [WARNING]  (282167) : Exiting Master process...
Jan 31 03:28:48 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[282163]: [ALERT]    (282167) : Current worker (282169) exited with code 143 (Terminated)
Jan 31 03:28:48 np0005603623 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[282163]: [WARNING]  (282167) : All workers exited. Exiting... (0)
Jan 31 03:28:48 np0005603623 systemd[1]: libpod-9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c.scope: Deactivated successfully.
Jan 31 03:28:48 np0005603623 podman[282599]: 2026-01-31 08:28:48.346238116 +0000 UTC m=+0.181183992 container died 9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:28:48 np0005603623 nova_compute[226235]: 2026-01-31 08:28:48.493 226239 INFO nova.virt.libvirt.driver [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Instance shutdown successfully after 5 seconds.#033[00m
Jan 31 03:28:48 np0005603623 nova_compute[226235]: 2026-01-31 08:28:48.498 226239 INFO nova.virt.libvirt.driver [-] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Instance destroyed successfully.#033[00m
Jan 31 03:28:48 np0005603623 nova_compute[226235]: 2026-01-31 08:28:48.499 226239 DEBUG nova.objects.instance [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'numa_topology' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:48.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:48 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c-userdata-shm.mount: Deactivated successfully.
Jan 31 03:28:48 np0005603623 systemd[1]: var-lib-containers-storage-overlay-f3f9fbfe84b5b822c2a575a1599aa6f9797b076043caedcb203ed0e7917b9d04-merged.mount: Deactivated successfully.
Jan 31 03:28:48 np0005603623 nova_compute[226235]: 2026-01-31 08:28:48.798 226239 INFO nova.virt.libvirt.driver [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Beginning cold snapshot process#033[00m
Jan 31 03:28:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:48 np0005603623 podman[282599]: 2026-01-31 08:28:48.833612144 +0000 UTC m=+0.668558000 container cleanup 9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:28:48 np0005603623 systemd[1]: libpod-conmon-9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c.scope: Deactivated successfully.
Jan 31 03:28:48 np0005603623 podman[282687]: 2026-01-31 08:28:48.896846866 +0000 UTC m=+0.046171359 container remove 9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.900 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[23b86154-8f2e-4f11-bede-17dac7e043d9]: (4, ('Sat Jan 31 08:28:48 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 (9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c)\n9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c\nSat Jan 31 08:28:48 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 (9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c)\n9f9ee12f8b71af06d2e9000b0b8b3e81ea3163ddba5cf099981b6151bf13638c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.901 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[148b5b1a-25f5-459b-b711-7987d64f01e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.902 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6525247d-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:48 np0005603623 nova_compute[226235]: 2026-01-31 08:28:48.904 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:48 np0005603623 kernel: tap6525247d-40: left promiscuous mode
Jan 31 03:28:48 np0005603623 nova_compute[226235]: 2026-01-31 08:28:48.912 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:48 np0005603623 nova_compute[226235]: 2026-01-31 08:28:48.913 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.914 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b63dec2f-ad9a-4e00-858c-1888288f4fb1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.932 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d76d17-8a65-4f51-9e50-4924c2ccf5f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.933 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6864e966-3469-49a8-b99d-547ce808b6c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.943 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a94b0ccf-40c5-4a69-b9ed-bfa92f07f224]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 722388, 'reachable_time': 39079, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282706, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:48 np0005603623 systemd[1]: run-netns-ovnmeta\x2d6525247d\x2d48b2\x2d4359\x2da813\x2dd7276403ba32.mount: Deactivated successfully.
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.946 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:28:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:28:48.946 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[d1edf6a0-ebd4-454d-b84a-e0b225e74245]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:48.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:49 np0005603623 nova_compute[226235]: 2026-01-31 08:28:49.090 226239 DEBUG nova.compute.manager [req-5ffe7247-7033-40c4-9441-e6da49eb15ec req-89b18612-21a2-4be6-ae5a-a2add180227c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-vif-unplugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:49 np0005603623 nova_compute[226235]: 2026-01-31 08:28:49.090 226239 DEBUG oslo_concurrency.lockutils [req-5ffe7247-7033-40c4-9441-e6da49eb15ec req-89b18612-21a2-4be6-ae5a-a2add180227c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:49 np0005603623 nova_compute[226235]: 2026-01-31 08:28:49.090 226239 DEBUG oslo_concurrency.lockutils [req-5ffe7247-7033-40c4-9441-e6da49eb15ec req-89b18612-21a2-4be6-ae5a-a2add180227c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:49 np0005603623 nova_compute[226235]: 2026-01-31 08:28:49.091 226239 DEBUG oslo_concurrency.lockutils [req-5ffe7247-7033-40c4-9441-e6da49eb15ec req-89b18612-21a2-4be6-ae5a-a2add180227c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:49 np0005603623 nova_compute[226235]: 2026-01-31 08:28:49.091 226239 DEBUG nova.compute.manager [req-5ffe7247-7033-40c4-9441-e6da49eb15ec req-89b18612-21a2-4be6-ae5a-a2add180227c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] No waiting events found dispatching network-vif-unplugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:49 np0005603623 nova_compute[226235]: 2026-01-31 08:28:49.091 226239 WARNING nova.compute.manager [req-5ffe7247-7033-40c4-9441-e6da49eb15ec req-89b18612-21a2-4be6-ae5a-a2add180227c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received unexpected event network-vif-unplugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 for instance with vm_state active and task_state shelving_image_pending_upload.#033[00m
Jan 31 03:28:49 np0005603623 nova_compute[226235]: 2026-01-31 08:28:49.242 226239 DEBUG nova.virt.libvirt.imagebackend [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 31 03:28:49 np0005603623 nova_compute[226235]: 2026-01-31 08:28:49.484 226239 DEBUG nova.storage.rbd_utils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] creating snapshot(1c799ab3d8aa46cf91a44111cd1ae41a) on rbd image(b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:28:49 np0005603623 nova_compute[226235]: 2026-01-31 08:28:49.576 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:49 np0005603623 ovn_controller[133449]: 2026-01-31T08:28:49Z|00510|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:28:49 np0005603623 nova_compute[226235]: 2026-01-31 08:28:49.978 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e304 e304: 3 total, 3 up, 3 in
Jan 31 03:28:50 np0005603623 nova_compute[226235]: 2026-01-31 08:28:50.289 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updating instance_info_cache with network_info: [{"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:28:50 np0005603623 nova_compute[226235]: 2026-01-31 08:28:50.329 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:28:50 np0005603623 nova_compute[226235]: 2026-01-31 08:28:50.329 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:28:50 np0005603623 nova_compute[226235]: 2026-01-31 08:28:50.330 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:50 np0005603623 nova_compute[226235]: 2026-01-31 08:28:50.330 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:50.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:50 np0005603623 nova_compute[226235]: 2026-01-31 08:28:50.857 226239 DEBUG nova.storage.rbd_utils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] cloning vms/b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk@1c799ab3d8aa46cf91a44111cd1ae41a to images/0e9c5d39-8319-43cb-ac29-5664ec7c2a71 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:28:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:50.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:51 np0005603623 nova_compute[226235]: 2026-01-31 08:28:51.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:51 np0005603623 nova_compute[226235]: 2026-01-31 08:28:51.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:51 np0005603623 nova_compute[226235]: 2026-01-31 08:28:51.282 226239 DEBUG nova.compute.manager [req-e929895a-738f-46b2-a793-74dcf60dcf95 req-46bf559f-1e71-4a3e-8a06-ed233122f3a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:51 np0005603623 nova_compute[226235]: 2026-01-31 08:28:51.283 226239 DEBUG oslo_concurrency.lockutils [req-e929895a-738f-46b2-a793-74dcf60dcf95 req-46bf559f-1e71-4a3e-8a06-ed233122f3a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:51 np0005603623 nova_compute[226235]: 2026-01-31 08:28:51.283 226239 DEBUG oslo_concurrency.lockutils [req-e929895a-738f-46b2-a793-74dcf60dcf95 req-46bf559f-1e71-4a3e-8a06-ed233122f3a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:51 np0005603623 nova_compute[226235]: 2026-01-31 08:28:51.283 226239 DEBUG oslo_concurrency.lockutils [req-e929895a-738f-46b2-a793-74dcf60dcf95 req-46bf559f-1e71-4a3e-8a06-ed233122f3a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:51 np0005603623 nova_compute[226235]: 2026-01-31 08:28:51.283 226239 DEBUG nova.compute.manager [req-e929895a-738f-46b2-a793-74dcf60dcf95 req-46bf559f-1e71-4a3e-8a06-ed233122f3a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] No waiting events found dispatching network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:51 np0005603623 nova_compute[226235]: 2026-01-31 08:28:51.284 226239 WARNING nova.compute.manager [req-e929895a-738f-46b2-a793-74dcf60dcf95 req-46bf559f-1e71-4a3e-8a06-ed233122f3a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received unexpected event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 31 03:28:51 np0005603623 nova_compute[226235]: 2026-01-31 08:28:51.809 226239 DEBUG nova.storage.rbd_utils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] flattening images/0e9c5d39-8319-43cb-ac29-5664ec7c2a71 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 03:28:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:52.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:52.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:53 np0005603623 nova_compute[226235]: 2026-01-31 08:28:53.157 226239 DEBUG nova.compute.manager [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Stashing vm_state: stopped _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 31 03:28:53 np0005603623 nova_compute[226235]: 2026-01-31 08:28:53.451 226239 DEBUG oslo_concurrency.lockutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:53 np0005603623 nova_compute[226235]: 2026-01-31 08:28:53.452 226239 DEBUG oslo_concurrency.lockutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:53 np0005603623 nova_compute[226235]: 2026-01-31 08:28:53.689 226239 DEBUG nova.objects.instance [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'pci_requests' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:53 np0005603623 nova_compute[226235]: 2026-01-31 08:28:53.738 226239 DEBUG nova.storage.rbd_utils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] removing snapshot(1c799ab3d8aa46cf91a44111cd1ae41a) on rbd image(b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:28:53 np0005603623 nova_compute[226235]: 2026-01-31 08:28:53.780 226239 DEBUG nova.virt.hardware [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:28:53 np0005603623 nova_compute[226235]: 2026-01-31 08:28:53.780 226239 INFO nova.compute.claims [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:28:53 np0005603623 nova_compute[226235]: 2026-01-31 08:28:53.781 226239 DEBUG nova.objects.instance [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'resources' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e304 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:53 np0005603623 nova_compute[226235]: 2026-01-31 08:28:53.815 226239 DEBUG nova.objects.instance [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'pci_devices' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:54 np0005603623 nova_compute[226235]: 2026-01-31 08:28:54.031 226239 INFO nova.compute.resource_tracker [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Updating resource usage from migration 45b6c84f-a4f0-4db4-98c8-5e319b46ded0#033[00m
Jan 31 03:28:54 np0005603623 nova_compute[226235]: 2026-01-31 08:28:54.032 226239 DEBUG nova.compute.resource_tracker [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Starting to track incoming migration 45b6c84f-a4f0-4db4-98c8-5e319b46ded0 with flavor f75c4aee-d826-4343-a7e3-f06a4b21de52 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 31 03:28:54 np0005603623 nova_compute[226235]: 2026-01-31 08:28:54.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:54 np0005603623 nova_compute[226235]: 2026-01-31 08:28:54.186 226239 DEBUG oslo_concurrency.processutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:28:54 np0005603623 nova_compute[226235]: 2026-01-31 08:28:54.575 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:28:54 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3352519338' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:28:54 np0005603623 nova_compute[226235]: 2026-01-31 08:28:54.615 226239 DEBUG oslo_concurrency.processutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:28:54 np0005603623 nova_compute[226235]: 2026-01-31 08:28:54.619 226239 DEBUG nova.compute.provider_tree [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:28:54 np0005603623 nova_compute[226235]: 2026-01-31 08:28:54.689 226239 DEBUG nova.scheduler.client.report [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:28:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:54.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:54 np0005603623 nova_compute[226235]: 2026-01-31 08:28:54.738 226239 DEBUG oslo_concurrency.lockutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 1.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:54 np0005603623 nova_compute[226235]: 2026-01-31 08:28:54.739 226239 INFO nova.compute.manager [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Migrating#033[00m
Jan 31 03:28:54 np0005603623 podman[282856]: 2026-01-31 08:28:54.958134479 +0000 UTC m=+0.049775891 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:28:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:54.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:54 np0005603623 podman[282857]: 2026-01-31 08:28:54.979494118 +0000 UTC m=+0.069894162 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 31 03:28:54 np0005603623 nova_compute[226235]: 2026-01-31 08:28:54.979 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e305 e305: 3 total, 3 up, 3 in
Jan 31 03:28:56 np0005603623 nova_compute[226235]: 2026-01-31 08:28:56.201 226239 DEBUG nova.storage.rbd_utils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] creating snapshot(snap) on rbd image(0e9c5d39-8319-43cb-ac29-5664ec7c2a71) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:28:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:56.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:56 np0005603623 systemd-logind[795]: New session 57 of user nova.
Jan 31 03:28:56 np0005603623 systemd[1]: Created slice User Slice of UID 42436.
Jan 31 03:28:56 np0005603623 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 31 03:28:56 np0005603623 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 31 03:28:56 np0005603623 systemd[1]: Starting User Manager for UID 42436...
Jan 31 03:28:56 np0005603623 systemd[282921]: Queued start job for default target Main User Target.
Jan 31 03:28:56 np0005603623 systemd[282921]: Created slice User Application Slice.
Jan 31 03:28:56 np0005603623 systemd[282921]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:28:56 np0005603623 systemd[282921]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 03:28:56 np0005603623 systemd[282921]: Reached target Paths.
Jan 31 03:28:56 np0005603623 systemd[282921]: Reached target Timers.
Jan 31 03:28:56 np0005603623 systemd[282921]: Starting D-Bus User Message Bus Socket...
Jan 31 03:28:56 np0005603623 systemd[282921]: Starting Create User's Volatile Files and Directories...
Jan 31 03:28:56 np0005603623 systemd[282921]: Listening on D-Bus User Message Bus Socket.
Jan 31 03:28:56 np0005603623 systemd[282921]: Reached target Sockets.
Jan 31 03:28:56 np0005603623 systemd[282921]: Finished Create User's Volatile Files and Directories.
Jan 31 03:28:56 np0005603623 systemd[282921]: Reached target Basic System.
Jan 31 03:28:56 np0005603623 systemd[282921]: Reached target Main User Target.
Jan 31 03:28:56 np0005603623 systemd[282921]: Startup finished in 154ms.
Jan 31 03:28:56 np0005603623 systemd[1]: Started User Manager for UID 42436.
Jan 31 03:28:56 np0005603623 systemd[1]: Started Session 57 of User nova.
Jan 31 03:28:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:56.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:57 np0005603623 systemd[1]: session-57.scope: Deactivated successfully.
Jan 31 03:28:57 np0005603623 systemd-logind[795]: Session 57 logged out. Waiting for processes to exit.
Jan 31 03:28:57 np0005603623 systemd-logind[795]: Removed session 57.
Jan 31 03:28:57 np0005603623 systemd-logind[795]: New session 59 of user nova.
Jan 31 03:28:57 np0005603623 systemd[1]: Started Session 59 of User nova.
Jan 31 03:28:57 np0005603623 systemd[1]: session-59.scope: Deactivated successfully.
Jan 31 03:28:57 np0005603623 systemd-logind[795]: Session 59 logged out. Waiting for processes to exit.
Jan 31 03:28:57 np0005603623 systemd-logind[795]: Removed session 59.
Jan 31 03:28:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e306 e306: 3 total, 3 up, 3 in
Jan 31 03:28:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:28:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:58.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:28:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:58 np0005603623 nova_compute[226235]: 2026-01-31 08:28:58.865 226239 INFO nova.network.neutron [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Updating port 109b6929-6b88-494a-b397-b36c434ed7a7 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 31 03:28:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:28:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:58.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:59 np0005603623 nova_compute[226235]: 2026-01-31 08:28:59.577 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:00 np0005603623 nova_compute[226235]: 2026-01-31 08:29:00.007 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:00.512 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:29:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:00.513 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:29:00 np0005603623 nova_compute[226235]: 2026-01-31 08:29:00.513 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:00.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:00.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:01 np0005603623 nova_compute[226235]: 2026-01-31 08:29:01.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:02.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:02.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:03 np0005603623 nova_compute[226235]: 2026-01-31 08:29:03.176 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848128.1748366, b9f38b79-63fc-48a1-a367-6998b8d6a9dc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:29:03 np0005603623 nova_compute[226235]: 2026-01-31 08:29:03.176 226239 INFO nova.compute.manager [-] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:29:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:04 np0005603623 nova_compute[226235]: 2026-01-31 08:29:04.470 226239 DEBUG nova.compute.manager [None req-b2ff6219-37e6-4930-bf23-1cb91edecd99 - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:29:04 np0005603623 nova_compute[226235]: 2026-01-31 08:29:04.473 226239 DEBUG nova.compute.manager [None req-b2ff6219-37e6-4930-bf23-1cb91edecd99 - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: shelving_image_uploading, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:29:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:04.515 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:29:04 np0005603623 nova_compute[226235]: 2026-01-31 08:29:04.578 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:04 np0005603623 nova_compute[226235]: 2026-01-31 08:29:04.661 226239 INFO nova.compute.manager [None req-b2ff6219-37e6-4930-bf23-1cb91edecd99 - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] During sync_power_state the instance has a pending task (shelving_image_uploading). Skip.#033[00m
Jan 31 03:29:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:04.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e307 e307: 3 total, 3 up, 3 in
Jan 31 03:29:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:04.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:05 np0005603623 nova_compute[226235]: 2026-01-31 08:29:05.008 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:05 np0005603623 nova_compute[226235]: 2026-01-31 08:29:05.068 226239 INFO nova.virt.libvirt.driver [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Snapshot image upload complete#033[00m
Jan 31 03:29:05 np0005603623 nova_compute[226235]: 2026-01-31 08:29:05.068 226239 DEBUG nova.compute.manager [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:29:05 np0005603623 nova_compute[226235]: 2026-01-31 08:29:05.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:05 np0005603623 nova_compute[226235]: 2026-01-31 08:29:05.639 226239 INFO nova.compute.manager [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Shelve offloading#033[00m
Jan 31 03:29:05 np0005603623 nova_compute[226235]: 2026-01-31 08:29:05.645 226239 INFO nova.virt.libvirt.driver [-] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Instance destroyed successfully.#033[00m
Jan 31 03:29:05 np0005603623 nova_compute[226235]: 2026-01-31 08:29:05.646 226239 DEBUG nova.compute.manager [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:29:05 np0005603623 nova_compute[226235]: 2026-01-31 08:29:05.648 226239 DEBUG oslo_concurrency.lockutils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:29:05 np0005603623 nova_compute[226235]: 2026-01-31 08:29:05.648 226239 DEBUG oslo_concurrency.lockutils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquired lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:29:05 np0005603623 nova_compute[226235]: 2026-01-31 08:29:05.648 226239 DEBUG nova.network.neutron [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:29:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:06.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:06.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:07 np0005603623 systemd[1]: Stopping User Manager for UID 42436...
Jan 31 03:29:07 np0005603623 systemd[282921]: Activating special unit Exit the Session...
Jan 31 03:29:07 np0005603623 systemd[282921]: Stopped target Main User Target.
Jan 31 03:29:07 np0005603623 systemd[282921]: Stopped target Basic System.
Jan 31 03:29:07 np0005603623 systemd[282921]: Stopped target Paths.
Jan 31 03:29:07 np0005603623 systemd[282921]: Stopped target Sockets.
Jan 31 03:29:07 np0005603623 systemd[282921]: Stopped target Timers.
Jan 31 03:29:07 np0005603623 systemd[282921]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:29:07 np0005603623 systemd[282921]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 31 03:29:07 np0005603623 systemd[282921]: Closed D-Bus User Message Bus Socket.
Jan 31 03:29:07 np0005603623 systemd[282921]: Stopped Create User's Volatile Files and Directories.
Jan 31 03:29:07 np0005603623 systemd[282921]: Removed slice User Application Slice.
Jan 31 03:29:07 np0005603623 systemd[282921]: Reached target Shutdown.
Jan 31 03:29:07 np0005603623 systemd[282921]: Finished Exit the Session.
Jan 31 03:29:07 np0005603623 systemd[282921]: Reached target Exit the Session.
Jan 31 03:29:07 np0005603623 nova_compute[226235]: 2026-01-31 08:29:07.394 226239 DEBUG oslo_concurrency.lockutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "refresh_cache-cca881fe-18fa-40c1-b9ef-2b1f28855b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:29:07 np0005603623 nova_compute[226235]: 2026-01-31 08:29:07.395 226239 DEBUG oslo_concurrency.lockutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquired lock "refresh_cache-cca881fe-18fa-40c1-b9ef-2b1f28855b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:29:07 np0005603623 nova_compute[226235]: 2026-01-31 08:29:07.395 226239 DEBUG nova.network.neutron [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:29:07 np0005603623 systemd[1]: user@42436.service: Deactivated successfully.
Jan 31 03:29:07 np0005603623 systemd[1]: Stopped User Manager for UID 42436.
Jan 31 03:29:07 np0005603623 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 31 03:29:07 np0005603623 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 31 03:29:07 np0005603623 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 31 03:29:07 np0005603623 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 31 03:29:07 np0005603623 systemd[1]: Removed slice User Slice of UID 42436.
Jan 31 03:29:07 np0005603623 nova_compute[226235]: 2026-01-31 08:29:07.723 226239 DEBUG nova.compute.manager [req-b6b7fd42-7ec5-4c2a-b0bc-ac2254aa38b9 req-ba833d40-3077-4e92-a7a0-8c02770fe94b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Received event network-changed-109b6929-6b88-494a-b397-b36c434ed7a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:29:07 np0005603623 nova_compute[226235]: 2026-01-31 08:29:07.724 226239 DEBUG nova.compute.manager [req-b6b7fd42-7ec5-4c2a-b0bc-ac2254aa38b9 req-ba833d40-3077-4e92-a7a0-8c02770fe94b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Refreshing instance network info cache due to event network-changed-109b6929-6b88-494a-b397-b36c434ed7a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:29:07 np0005603623 nova_compute[226235]: 2026-01-31 08:29:07.725 226239 DEBUG oslo_concurrency.lockutils [req-b6b7fd42-7ec5-4c2a-b0bc-ac2254aa38b9 req-ba833d40-3077-4e92-a7a0-8c02770fe94b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-cca881fe-18fa-40c1-b9ef-2b1f28855b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:29:07 np0005603623 nova_compute[226235]: 2026-01-31 08:29:07.727 226239 DEBUG nova.network.neutron [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updating instance_info_cache with network_info: [{"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:29:07 np0005603623 nova_compute[226235]: 2026-01-31 08:29:07.941 226239 DEBUG oslo_concurrency.lockutils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Releasing lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:29:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:08.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e307 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:29:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:08.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:29:09 np0005603623 nova_compute[226235]: 2026-01-31 08:29:09.579 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:10 np0005603623 nova_compute[226235]: 2026-01-31 08:29:10.011 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:10.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #106. Immutable memtables: 0.
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:10.902533) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 65] Flushing memtable with next log file: 106
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848150902596, "job": 65, "event": "flush_started", "num_memtables": 1, "num_entries": 947, "num_deletes": 253, "total_data_size": 1828272, "memory_usage": 1859880, "flush_reason": "Manual Compaction"}
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 65] Level-0 flush table #107: started
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848150921808, "cf_name": "default", "job": 65, "event": "table_file_creation", "file_number": 107, "file_size": 1205899, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 54569, "largest_seqno": 55511, "table_properties": {"data_size": 1201456, "index_size": 2095, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10360, "raw_average_key_size": 20, "raw_value_size": 1192318, "raw_average_value_size": 2337, "num_data_blocks": 91, "num_entries": 510, "num_filter_entries": 510, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848090, "oldest_key_time": 1769848090, "file_creation_time": 1769848150, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 107, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 65] Flush lasted 19307 microseconds, and 2690 cpu microseconds.
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:10.921849) [db/flush_job.cc:967] [default] [JOB 65] Level-0 flush table #107: 1205899 bytes OK
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:10.921865) [db/memtable_list.cc:519] [default] Level-0 commit table #107 started
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:10.925636) [db/memtable_list.cc:722] [default] Level-0 commit table #107: memtable #1 done
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:10.925649) EVENT_LOG_v1 {"time_micros": 1769848150925645, "job": 65, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:10.925664) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 65] Try to delete WAL files size 1823430, prev total WAL file size 1823430, number of live WAL files 2.
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000103.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:10.926159) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034353138' seq:72057594037927935, type:22 .. '7061786F730034373730' seq:0, type:0; will stop at (end)
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 66] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 65 Base level 0, inputs: [107(1177KB)], [105(13MB)]
Jan 31 03:29:10 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848150926182, "job": 66, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [107], "files_L6": [105], "score": -1, "input_data_size": 15613532, "oldest_snapshot_seqno": -1}
Jan 31 03:29:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:10.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 66] Generated table #108: 7983 keys, 13671487 bytes, temperature: kUnknown
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848151139643, "cf_name": "default", "job": 66, "event": "table_file_creation", "file_number": 108, "file_size": 13671487, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13615588, "index_size": 34859, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19973, "raw_key_size": 206857, "raw_average_key_size": 25, "raw_value_size": 13471078, "raw_average_value_size": 1687, "num_data_blocks": 1375, "num_entries": 7983, "num_filter_entries": 7983, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769848150, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 108, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:11.139869) [db/compaction/compaction_job.cc:1663] [default] [JOB 66] Compacted 1@0 + 1@6 files to L6 => 13671487 bytes
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:11.154626) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 73.1 rd, 64.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 13.7 +0.0 blob) out(13.0 +0.0 blob), read-write-amplify(24.3) write-amplify(11.3) OK, records in: 8509, records dropped: 526 output_compression: NoCompression
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:11.154659) EVENT_LOG_v1 {"time_micros": 1769848151154644, "job": 66, "event": "compaction_finished", "compaction_time_micros": 213532, "compaction_time_cpu_micros": 21238, "output_level": 6, "num_output_files": 1, "total_output_size": 13671487, "num_input_records": 8509, "num_output_records": 7983, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000107.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848151154971, "job": 66, "event": "table_file_deletion", "file_number": 107}
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000105.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848151156360, "job": 66, "event": "table_file_deletion", "file_number": 105}
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:10.926093) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:11.156410) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:11.156415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:11.156417) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:11.156418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:11 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:11.156420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:11 np0005603623 nova_compute[226235]: 2026-01-31 08:29:11.821 226239 DEBUG nova.network.neutron [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Updating instance_info_cache with network_info: [{"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:29:12 np0005603623 nova_compute[226235]: 2026-01-31 08:29:12.021 226239 DEBUG oslo_concurrency.lockutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Releasing lock "refresh_cache-cca881fe-18fa-40c1-b9ef-2b1f28855b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:29:12 np0005603623 nova_compute[226235]: 2026-01-31 08:29:12.025 226239 DEBUG oslo_concurrency.lockutils [req-b6b7fd42-7ec5-4c2a-b0bc-ac2254aa38b9 req-ba833d40-3077-4e92-a7a0-8c02770fe94b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-cca881fe-18fa-40c1-b9ef-2b1f28855b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:29:12 np0005603623 nova_compute[226235]: 2026-01-31 08:29:12.025 226239 DEBUG nova.network.neutron [req-b6b7fd42-7ec5-4c2a-b0bc-ac2254aa38b9 req-ba833d40-3077-4e92-a7a0-8c02770fe94b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Refreshing network info cache for port 109b6929-6b88-494a-b397-b36c434ed7a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:29:12 np0005603623 nova_compute[226235]: 2026-01-31 08:29:12.673 226239 DEBUG nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 31 03:29:12 np0005603623 nova_compute[226235]: 2026-01-31 08:29:12.674 226239 DEBUG nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:29:12 np0005603623 nova_compute[226235]: 2026-01-31 08:29:12.675 226239 INFO nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Creating image(s)#033[00m
Jan 31 03:29:12 np0005603623 nova_compute[226235]: 2026-01-31 08:29:12.708 226239 DEBUG nova.storage.rbd_utils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] creating snapshot(nova-resize) on rbd image(cca881fe-18fa-40c1-b9ef-2b1f28855b53_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:29:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:29:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:12.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:29:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:12.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e307 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.084 226239 INFO nova.virt.libvirt.driver [-] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Instance destroyed successfully.#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.085 226239 DEBUG nova.objects.instance [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'resources' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e308 e308: 3 total, 3 up, 3 in
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.211 226239 DEBUG nova.virt.libvirt.vif [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:27:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2092089502',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-2092089502',id=123,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDqJ2u98tnCfusFKrUeql0ngSDNf86DLAElp/RNmhRZkam9aFuB8mUdP/dAMmSCZVQ6AaZGjQO8tc+tThhzKBQRodouufnRusHHQiOXeUQ9hnIPnIcTcQ3b1LbRSS3JzxA==',key_name='tempest-keypair-1914515639',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:28:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='491937de020742d7b4e847dc3bf57950',ramdisk_id='',reservation_id='r-zq57k11v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-60119558',owner_user_name='tempest-AttachVolumeShelveTestJSON-60119558-project-member',shelved_at='2026-01-31T08:29:05.068624',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='0e9c5d39-8319-43cb-ac29-5664ec7c2a71'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:28:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='432ac8867d8240408db455fc25bb5901',uuid=b9f38b79-63fc-48a1-a367-6998b8d6a9dc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": 
"6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.212 226239 DEBUG nova.network.os_vif_util [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converting VIF {"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.212 226239 DEBUG nova.network.os_vif_util [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.213 226239 DEBUG os_vif [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.214 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.215 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58bbf6e9-a3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.216 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.219 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.221 226239 INFO os_vif [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3')#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.580 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.627 226239 DEBUG nova.compute.manager [req-868c3565-1232-43ea-935e-8ec20122d479 req-76eb8242-3782-4d3f-a45e-0e135ece0bb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-changed-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.627 226239 DEBUG nova.compute.manager [req-868c3565-1232-43ea-935e-8ec20122d479 req-76eb8242-3782-4d3f-a45e-0e135ece0bb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Refreshing instance network info cache due to event network-changed-58bbf6e9-a33b-4f2b-81e8-812adc1221b5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.627 226239 DEBUG oslo_concurrency.lockutils [req-868c3565-1232-43ea-935e-8ec20122d479 req-76eb8242-3782-4d3f-a45e-0e135ece0bb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.628 226239 DEBUG oslo_concurrency.lockutils [req-868c3565-1232-43ea-935e-8ec20122d479 req-76eb8242-3782-4d3f-a45e-0e135ece0bb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:29:14 np0005603623 nova_compute[226235]: 2026-01-31 08:29:14.628 226239 DEBUG nova.network.neutron [req-868c3565-1232-43ea-935e-8ec20122d479 req-76eb8242-3782-4d3f-a45e-0e135ece0bb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Refreshing network info cache for port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:29:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:14.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:14.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:15 np0005603623 nova_compute[226235]: 2026-01-31 08:29:15.844 226239 DEBUG nova.network.neutron [req-b6b7fd42-7ec5-4c2a-b0bc-ac2254aa38b9 req-ba833d40-3077-4e92-a7a0-8c02770fe94b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Updated VIF entry in instance network info cache for port 109b6929-6b88-494a-b397-b36c434ed7a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:29:15 np0005603623 nova_compute[226235]: 2026-01-31 08:29:15.845 226239 DEBUG nova.network.neutron [req-b6b7fd42-7ec5-4c2a-b0bc-ac2254aa38b9 req-ba833d40-3077-4e92-a7a0-8c02770fe94b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Updating instance_info_cache with network_info: [{"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:29:15 np0005603623 nova_compute[226235]: 2026-01-31 08:29:15.906 226239 DEBUG nova.objects.instance [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'trusted_certs' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.318 226239 DEBUG oslo_concurrency.lockutils [req-b6b7fd42-7ec5-4c2a-b0bc-ac2254aa38b9 req-ba833d40-3077-4e92-a7a0-8c02770fe94b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-cca881fe-18fa-40c1-b9ef-2b1f28855b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:29:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:16.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.935 226239 DEBUG nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.936 226239 DEBUG nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Ensure instance console log exists: /var/lib/nova/instances/cca881fe-18fa-40c1-b9ef-2b1f28855b53/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.936 226239 DEBUG oslo_concurrency.lockutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.936 226239 DEBUG oslo_concurrency.lockutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.937 226239 DEBUG oslo_concurrency.lockutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.940 226239 DEBUG nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Start _get_guest_xml network_info=[{"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-2130829654-network", "vif_mac": "fa:16:3e:06:b8:22"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.944 226239 WARNING nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.953 226239 DEBUG nova.virt.libvirt.host [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.954 226239 DEBUG nova.virt.libvirt.host [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.959 226239 DEBUG nova.virt.libvirt.host [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.960 226239 DEBUG nova.virt.libvirt.host [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.961 226239 DEBUG nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.961 226239 DEBUG nova.virt.hardware [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f75c4aee-d826-4343-a7e3-f06a4b21de52',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.961 226239 DEBUG nova.virt.hardware [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.961 226239 DEBUG nova.virt.hardware [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.961 226239 DEBUG nova.virt.hardware [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.962 226239 DEBUG nova.virt.hardware [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.962 226239 DEBUG nova.virt.hardware [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.962 226239 DEBUG nova.virt.hardware [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.962 226239 DEBUG nova.virt.hardware [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.962 226239 DEBUG nova.virt.hardware [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.962 226239 DEBUG nova.virt.hardware [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.962 226239 DEBUG nova.virt.hardware [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:29:16 np0005603623 nova_compute[226235]: 2026-01-31 08:29:16.963 226239 DEBUG nova.objects.instance [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'vcpu_model' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:16.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:17 np0005603623 nova_compute[226235]: 2026-01-31 08:29:17.165 226239 DEBUG oslo_concurrency.processutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:29:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:29:17 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/115897970' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:29:17 np0005603623 nova_compute[226235]: 2026-01-31 08:29:17.613 226239 DEBUG oslo_concurrency.processutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:29:17 np0005603623 nova_compute[226235]: 2026-01-31 08:29:17.649 226239 DEBUG oslo_concurrency.processutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:29:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:29:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/540101779' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.116 226239 DEBUG oslo_concurrency.processutils [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.118 226239 DEBUG nova.virt.libvirt.vif [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:25:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-899650284',display_name='tempest-ServerActionsTestOtherB-server-899650284',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-899650284',id=121,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:25:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-tuc10ywh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:28:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=cca881fe-18fa-40c1-b9ef-2b1f28855b53,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-2130829654-network", "vif_mac": "fa:16:3e:06:b8:22"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.118 226239 DEBUG nova.network.os_vif_util [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-2130829654-network", "vif_mac": "fa:16:3e:06:b8:22"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.119 226239 DEBUG nova.network.os_vif_util [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.122 226239 DEBUG nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  <uuid>cca881fe-18fa-40c1-b9ef-2b1f28855b53</uuid>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  <name>instance-00000079</name>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  <memory>196608</memory>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsTestOtherB-server-899650284</nova:name>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:29:16</nova:creationTime>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.micro">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <nova:memory>192</nova:memory>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <nova:user uuid="ef51681d234a4abc88ff433d0640b6e7">tempest-ServerActionsTestOtherB-1048458052-project-member</nova:user>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <nova:project uuid="953a213fa5cb435ab3c04ad96152685f">tempest-ServerActionsTestOtherB-1048458052</nova:project>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <nova:port uuid="109b6929-6b88-494a-b397-b36c434ed7a7">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <entry name="serial">cca881fe-18fa-40c1-b9ef-2b1f28855b53</entry>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <entry name="uuid">cca881fe-18fa-40c1-b9ef-2b1f28855b53</entry>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/cca881fe-18fa-40c1-b9ef-2b1f28855b53_disk">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/cca881fe-18fa-40c1-b9ef-2b1f28855b53_disk.config">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:06:b8:22"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <target dev="tap109b6929-6b"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/cca881fe-18fa-40c1-b9ef-2b1f28855b53/console.log" append="off"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:29:18 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:29:18 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:29:18 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:29:18 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.123 226239 DEBUG nova.virt.libvirt.vif [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:25:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-899650284',display_name='tempest-ServerActionsTestOtherB-server-899650284',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-899650284',id=121,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:25:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-tuc10ywh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:28:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=cca881fe-18fa-40c1-b9ef-2b1f28855b53,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-2130829654-network", "vif_mac": "fa:16:3e:06:b8:22"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.123 226239 DEBUG nova.network.os_vif_util [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-2130829654-network", "vif_mac": "fa:16:3e:06:b8:22"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.123 226239 DEBUG nova.network.os_vif_util [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.124 226239 DEBUG os_vif [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.124 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.125 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.125 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.127 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.127 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap109b6929-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.127 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap109b6929-6b, col_values=(('external_ids', {'iface-id': '109b6929-6b88-494a-b397-b36c434ed7a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:b8:22', 'vm-uuid': 'cca881fe-18fa-40c1-b9ef-2b1f28855b53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.128 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:18 np0005603623 NetworkManager[48970]: <info>  [1769848158.1297] manager: (tap109b6929-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.131 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.137 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.138 226239 INFO os_vif [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b')#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.435 226239 DEBUG nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.436 226239 DEBUG nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.436 226239 DEBUG nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No VIF found with MAC fa:16:3e:06:b8:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.437 226239 INFO nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Using config drive#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.470 226239 DEBUG nova.compute.manager [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.470 226239 DEBUG nova.virt.libvirt.driver [None req-a5ed510e-3a81-45be-b58e-cb22d8907bb4 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 31 03:29:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:18.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.930 226239 DEBUG nova.network.neutron [req-868c3565-1232-43ea-935e-8ec20122d479 req-76eb8242-3782-4d3f-a45e-0e135ece0bb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updated VIF entry in instance network info cache for port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:29:18 np0005603623 nova_compute[226235]: 2026-01-31 08:29:18.931 226239 DEBUG nova.network.neutron [req-868c3565-1232-43ea-935e-8ec20122d479 req-76eb8242-3782-4d3f-a45e-0e135ece0bb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updating instance_info_cache with network_info: [{"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": null, "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:29:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:29:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:18.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:29:19 np0005603623 nova_compute[226235]: 2026-01-31 08:29:19.169 226239 DEBUG oslo_concurrency.lockutils [req-868c3565-1232-43ea-935e-8ec20122d479 req-76eb8242-3782-4d3f-a45e-0e135ece0bb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:29:19 np0005603623 nova_compute[226235]: 2026-01-31 08:29:19.580 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:20.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:29:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:20.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:29:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:22.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:22.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:23 np0005603623 nova_compute[226235]: 2026-01-31 08:29:23.129 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:24 np0005603623 nova_compute[226235]: 2026-01-31 08:29:24.622 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:24.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:24 np0005603623 nova_compute[226235]: 2026-01-31 08:29:24.899 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:24 np0005603623 nova_compute[226235]: 2026-01-31 08:29:24.948 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:24.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:25 np0005603623 podman[283209]: 2026-01-31 08:29:25.481669046 +0000 UTC m=+0.072116172 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:29:25 np0005603623 podman[283210]: 2026-01-31 08:29:25.487386735 +0000 UTC m=+0.078072238 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:29:25 np0005603623 nova_compute[226235]: 2026-01-31 08:29:25.907 226239 INFO nova.virt.libvirt.driver [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Deleting instance files /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc_del#033[00m
Jan 31 03:29:25 np0005603623 nova_compute[226235]: 2026-01-31 08:29:25.908 226239 INFO nova.virt.libvirt.driver [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Deletion of /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc_del complete#033[00m
Jan 31 03:29:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:26.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:26.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:28 np0005603623 nova_compute[226235]: 2026-01-31 08:29:28.131 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:29:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:28.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:29:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:28 np0005603623 nova_compute[226235]: 2026-01-31 08:29:28.854 226239 INFO nova.scheduler.client.report [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Deleted allocations for instance b9f38b79-63fc-48a1-a367-6998b8d6a9dc#033[00m
Jan 31 03:29:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:29:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:28.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:29:29 np0005603623 nova_compute[226235]: 2026-01-31 08:29:29.593 226239 DEBUG oslo_concurrency.lockutils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:29 np0005603623 nova_compute[226235]: 2026-01-31 08:29:29.593 226239 DEBUG oslo_concurrency.lockutils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:29 np0005603623 nova_compute[226235]: 2026-01-31 08:29:29.624 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:29 np0005603623 nova_compute[226235]: 2026-01-31 08:29:29.688 226239 DEBUG oslo_concurrency.processutils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:29:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:29:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:29:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:29:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:29:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:29:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:29:30 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3070123653' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:29:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:30.121 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:30.122 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:30.122 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:30 np0005603623 nova_compute[226235]: 2026-01-31 08:29:30.125 226239 DEBUG oslo_concurrency.processutils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:29:30 np0005603623 nova_compute[226235]: 2026-01-31 08:29:30.132 226239 DEBUG nova.compute.provider_tree [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:29:30 np0005603623 nova_compute[226235]: 2026-01-31 08:29:30.236 226239 DEBUG nova.scheduler.client.report [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:29:30 np0005603623 nova_compute[226235]: 2026-01-31 08:29:30.396 226239 DEBUG oslo_concurrency.lockutils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:30 np0005603623 nova_compute[226235]: 2026-01-31 08:29:30.560 226239 DEBUG oslo_concurrency.lockutils [None req-a8483a3d-7e55-412c-871d-5cd72bd84235 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 47.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:30.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:29:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:30.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:29:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e309 e309: 3 total, 3 up, 3 in
Jan 31 03:29:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:32.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:29:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:33.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:29:33 np0005603623 nova_compute[226235]: 2026-01-31 08:29:33.134 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:34 np0005603623 nova_compute[226235]: 2026-01-31 08:29:34.625 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:34.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #109. Immutable memtables: 0.
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:34.803935) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 67] Flushing memtable with next log file: 109
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848174804012, "job": 67, "event": "flush_started", "num_memtables": 1, "num_entries": 525, "num_deletes": 250, "total_data_size": 746430, "memory_usage": 757080, "flush_reason": "Manual Compaction"}
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 67] Level-0 flush table #110: started
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848174809850, "cf_name": "default", "job": 67, "event": "table_file_creation", "file_number": 110, "file_size": 400096, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 55516, "largest_seqno": 56036, "table_properties": {"data_size": 397401, "index_size": 731, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 7410, "raw_average_key_size": 20, "raw_value_size": 391745, "raw_average_value_size": 1106, "num_data_blocks": 32, "num_entries": 354, "num_filter_entries": 354, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848152, "oldest_key_time": 1769848152, "file_creation_time": 1769848174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 110, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 67] Flush lasted 5998 microseconds, and 1635 cpu microseconds.
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:34.809940) [db/flush_job.cc:967] [default] [JOB 67] Level-0 flush table #110: 400096 bytes OK
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:34.809965) [db/memtable_list.cc:519] [default] Level-0 commit table #110 started
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:34.815775) [db/memtable_list.cc:722] [default] Level-0 commit table #110: memtable #1 done
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:34.815851) EVENT_LOG_v1 {"time_micros": 1769848174815836, "job": 67, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:34.815890) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 67] Try to delete WAL files size 743321, prev total WAL file size 743321, number of live WAL files 2.
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000106.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:34.816715) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031373533' seq:72057594037927935, type:22 .. '6D6772737461740032303034' seq:0, type:0; will stop at (end)
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 68] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 67 Base level 0, inputs: [110(390KB)], [108(13MB)]
Jan 31 03:29:34 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848174816825, "job": 68, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [110], "files_L6": [108], "score": -1, "input_data_size": 14071583, "oldest_snapshot_seqno": -1}
Jan 31 03:29:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:35.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 68] Generated table #111: 7827 keys, 10299864 bytes, temperature: kUnknown
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848175005337, "cf_name": "default", "job": 68, "event": "table_file_creation", "file_number": 111, "file_size": 10299864, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10249504, "index_size": 29704, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19589, "raw_key_size": 203875, "raw_average_key_size": 26, "raw_value_size": 10112116, "raw_average_value_size": 1291, "num_data_blocks": 1160, "num_entries": 7827, "num_filter_entries": 7827, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769848174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 111, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:35.005639) [db/compaction/compaction_job.cc:1663] [default] [JOB 68] Compacted 1@0 + 1@6 files to L6 => 10299864 bytes
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:35.007159) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 74.6 rd, 54.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 13.0 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(60.9) write-amplify(25.7) OK, records in: 8337, records dropped: 510 output_compression: NoCompression
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:35.007182) EVENT_LOG_v1 {"time_micros": 1769848175007171, "job": 68, "event": "compaction_finished", "compaction_time_micros": 188589, "compaction_time_cpu_micros": 21883, "output_level": 6, "num_output_files": 1, "total_output_size": 10299864, "num_input_records": 8337, "num_output_records": 7827, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000110.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848175007371, "job": 68, "event": "table_file_deletion", "file_number": 110}
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000108.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848175009263, "job": 68, "event": "table_file_deletion", "file_number": 108}
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:34.816537) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:35.009306) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:35.009311) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:35.009313) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:35.009314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:35 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:29:35.009316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:36.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:29:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:37.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:29:38 np0005603623 nova_compute[226235]: 2026-01-31 08:29:38.136 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:38.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:39.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:29:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:29:39 np0005603623 nova_compute[226235]: 2026-01-31 08:29:39.201 226239 DEBUG nova.objects.instance [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'flavor' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:39 np0005603623 nova_compute[226235]: 2026-01-31 08:29:39.342 226239 DEBUG oslo_concurrency.lockutils [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "refresh_cache-cca881fe-18fa-40c1-b9ef-2b1f28855b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:29:39 np0005603623 nova_compute[226235]: 2026-01-31 08:29:39.342 226239 DEBUG oslo_concurrency.lockutils [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquired lock "refresh_cache-cca881fe-18fa-40c1-b9ef-2b1f28855b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:29:39 np0005603623 nova_compute[226235]: 2026-01-31 08:29:39.343 226239 DEBUG nova.network.neutron [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:29:39 np0005603623 nova_compute[226235]: 2026-01-31 08:29:39.343 226239 DEBUG nova.objects.instance [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'info_cache' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:39 np0005603623 nova_compute[226235]: 2026-01-31 08:29:39.626 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e310 e310: 3 total, 3 up, 3 in
Jan 31 03:29:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:40.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:41.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:41 np0005603623 nova_compute[226235]: 2026-01-31 08:29:41.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:41 np0005603623 nova_compute[226235]: 2026-01-31 08:29:41.156 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:29:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:42.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:43.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:43 np0005603623 nova_compute[226235]: 2026-01-31 08:29:43.139 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:44 np0005603623 nova_compute[226235]: 2026-01-31 08:29:44.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:44 np0005603623 nova_compute[226235]: 2026-01-31 08:29:44.245 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:44 np0005603623 nova_compute[226235]: 2026-01-31 08:29:44.246 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:44 np0005603623 nova_compute[226235]: 2026-01-31 08:29:44.247 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:44 np0005603623 nova_compute[226235]: 2026-01-31 08:29:44.247 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:29:44 np0005603623 nova_compute[226235]: 2026-01-31 08:29:44.248 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:29:44 np0005603623 nova_compute[226235]: 2026-01-31 08:29:44.629 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:29:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2183076858' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:29:44 np0005603623 nova_compute[226235]: 2026-01-31 08:29:44.688 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:29:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:44.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:44 np0005603623 nova_compute[226235]: 2026-01-31 08:29:44.890 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000079 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:29:44 np0005603623 nova_compute[226235]: 2026-01-31 08:29:44.891 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000079 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:29:44 np0005603623 nova_compute[226235]: 2026-01-31 08:29:44.999 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:29:45 np0005603623 nova_compute[226235]: 2026-01-31 08:29:45.000 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4417MB free_disk=20.89712142944336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:29:45 np0005603623 nova_compute[226235]: 2026-01-31 08:29:45.001 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:45 np0005603623 nova_compute[226235]: 2026-01-31 08:29:45.001 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:45.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:45 np0005603623 nova_compute[226235]: 2026-01-31 08:29:45.404 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance cca881fe-18fa-40c1-b9ef-2b1f28855b53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:29:45 np0005603623 nova_compute[226235]: 2026-01-31 08:29:45.405 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:29:45 np0005603623 nova_compute[226235]: 2026-01-31 08:29:45.405 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:29:45 np0005603623 nova_compute[226235]: 2026-01-31 08:29:45.732 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:29:45 np0005603623 nova_compute[226235]: 2026-01-31 08:29:45.991 226239 DEBUG nova.network.neutron [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Updating instance_info_cache with network_info: [{"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:29:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:29:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/616915218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.158 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.165 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.190 226239 DEBUG oslo_concurrency.lockutils [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Releasing lock "refresh_cache-cca881fe-18fa-40c1-b9ef-2b1f28855b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.297 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.322 226239 INFO nova.virt.libvirt.driver [-] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Instance destroyed successfully.#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.323 226239 DEBUG nova.objects.instance [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'numa_topology' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.683 226239 DEBUG nova.objects.instance [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'resources' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:29:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:46.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.825 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.826 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.943 226239 DEBUG nova.virt.libvirt.vif [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:25:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-899650284',display_name='tempest-ServerActionsTestOtherB-server-899650284',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-899650284',id=121,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:29:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-tuc10ywh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:29:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=cca881fe-18fa-40c1-b9ef-2b1f28855b53,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.943 226239 DEBUG nova.network.os_vif_util [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.944 226239 DEBUG nova.network.os_vif_util [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.944 226239 DEBUG os_vif [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.946 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.947 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap109b6929-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.948 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.950 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.952 226239 INFO os_vif [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b')#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.959 226239 DEBUG nova.virt.libvirt.driver [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Start _get_guest_xml network_info=[{"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.962 226239 WARNING nova.virt.libvirt.driver [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.969 226239 DEBUG nova.virt.libvirt.host [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.970 226239 DEBUG nova.virt.libvirt.host [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.981 226239 DEBUG nova.virt.libvirt.host [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.982 226239 DEBUG nova.virt.libvirt.host [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.983 226239 DEBUG nova.virt.libvirt.driver [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.984 226239 DEBUG nova.virt.hardware [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f75c4aee-d826-4343-a7e3-f06a4b21de52',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.984 226239 DEBUG nova.virt.hardware [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.985 226239 DEBUG nova.virt.hardware [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.985 226239 DEBUG nova.virt.hardware [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.985 226239 DEBUG nova.virt.hardware [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.985 226239 DEBUG nova.virt.hardware [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.986 226239 DEBUG nova.virt.hardware [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.986 226239 DEBUG nova.virt.hardware [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.986 226239 DEBUG nova.virt.hardware [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.986 226239 DEBUG nova.virt.hardware [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.987 226239 DEBUG nova.virt.hardware [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:29:46 np0005603623 nova_compute[226235]: 2026-01-31 08:29:46.987 226239 DEBUG nova.objects.instance [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'vcpu_model' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:47.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:47 np0005603623 nova_compute[226235]: 2026-01-31 08:29:47.052 226239 DEBUG oslo_concurrency.processutils [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:29:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:29:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1818585318' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:29:47 np0005603623 nova_compute[226235]: 2026-01-31 08:29:47.459 226239 DEBUG oslo_concurrency.processutils [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:29:47 np0005603623 nova_compute[226235]: 2026-01-31 08:29:47.494 226239 DEBUG oslo_concurrency.processutils [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:29:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:29:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2596735318' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:29:47 np0005603623 nova_compute[226235]: 2026-01-31 08:29:47.898 226239 DEBUG oslo_concurrency.processutils [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:29:47 np0005603623 nova_compute[226235]: 2026-01-31 08:29:47.901 226239 DEBUG nova.virt.libvirt.vif [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:25:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-899650284',display_name='tempest-ServerActionsTestOtherB-server-899650284',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-899650284',id=121,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:29:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-tuc10ywh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:29:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=cca881fe-18fa-40c1-b9ef-2b1f28855b53,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:29:47 np0005603623 nova_compute[226235]: 2026-01-31 08:29:47.902 226239 DEBUG nova.network.os_vif_util [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:29:47 np0005603623 nova_compute[226235]: 2026-01-31 08:29:47.902 226239 DEBUG nova.network.os_vif_util [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:29:47 np0005603623 nova_compute[226235]: 2026-01-31 08:29:47.903 226239 DEBUG nova.objects.instance [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'pci_devices' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.025 226239 DEBUG nova.virt.libvirt.driver [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  <uuid>cca881fe-18fa-40c1-b9ef-2b1f28855b53</uuid>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  <name>instance-00000079</name>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  <memory>196608</memory>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsTestOtherB-server-899650284</nova:name>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:29:46</nova:creationTime>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.micro">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <nova:memory>192</nova:memory>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <nova:user uuid="ef51681d234a4abc88ff433d0640b6e7">tempest-ServerActionsTestOtherB-1048458052-project-member</nova:user>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <nova:project uuid="953a213fa5cb435ab3c04ad96152685f">tempest-ServerActionsTestOtherB-1048458052</nova:project>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <nova:port uuid="109b6929-6b88-494a-b397-b36c434ed7a7">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <entry name="serial">cca881fe-18fa-40c1-b9ef-2b1f28855b53</entry>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <entry name="uuid">cca881fe-18fa-40c1-b9ef-2b1f28855b53</entry>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/cca881fe-18fa-40c1-b9ef-2b1f28855b53_disk">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/cca881fe-18fa-40c1-b9ef-2b1f28855b53_disk.config">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:06:b8:22"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <target dev="tap109b6929-6b"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/cca881fe-18fa-40c1-b9ef-2b1f28855b53/console.log" append="off"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <input type="keyboard" bus="usb"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:29:48 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:29:48 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:29:48 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:29:48 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.027 226239 DEBUG nova.virt.libvirt.driver [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] skipping disk for instance-00000079 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.027 226239 DEBUG nova.virt.libvirt.driver [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] skipping disk for instance-00000079 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.028 226239 DEBUG nova.virt.libvirt.vif [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:25:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-899650284',display_name='tempest-ServerActionsTestOtherB-server-899650284',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-899650284',id=121,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:29:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-tuc10ywh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:29:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=cca881fe-18fa-40c1-b9ef-2b1f28855b53,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.028 226239 DEBUG nova.network.os_vif_util [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.029 226239 DEBUG nova.network.os_vif_util [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.029 226239 DEBUG os_vif [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.030 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.031 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.031 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.033 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.033 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap109b6929-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.034 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap109b6929-6b, col_values=(('external_ids', {'iface-id': '109b6929-6b88-494a-b397-b36c434ed7a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:b8:22', 'vm-uuid': 'cca881fe-18fa-40c1-b9ef-2b1f28855b53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.035 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:29:48 np0005603623 NetworkManager[48970]: <info>  [1769848188.0366] manager: (tap109b6929-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.039 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.041 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.042 226239 INFO os_vif [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b')
Jan 31 03:29:48 np0005603623 kernel: tap109b6929-6b: entered promiscuous mode
Jan 31 03:29:48 np0005603623 NetworkManager[48970]: <info>  [1769848188.1129] manager: (tap109b6929-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/245)
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.112 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:29:48Z|00511|binding|INFO|Claiming lport 109b6929-6b88-494a-b397-b36c434ed7a7 for this chassis.
Jan 31 03:29:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:29:48Z|00512|binding|INFO|109b6929-6b88-494a-b397-b36c434ed7a7: Claiming fa:16:3e:06:b8:22 10.100.0.13
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.116 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.120 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:48 np0005603623 NetworkManager[48970]: <info>  [1769848188.1253] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.124 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:48 np0005603623 NetworkManager[48970]: <info>  [1769848188.1259] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Jan 31 03:29:48 np0005603623 systemd-udevd[283613]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:29:48 np0005603623 systemd-machined[194379]: New machine qemu-57-instance-00000079.
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.144 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:b8:22 10.100.0.13'], port_security=['fa:16:3e:06:b8:22 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'cca881fe-18fa-40c1-b9ef-2b1f28855b53', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44469d8b-ad30-4270-88fa-e67c568f3150', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '953a213fa5cb435ab3c04ad96152685f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5b1bd8ad-0d2a-4d57-a00a-9a6b59df86e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.181'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d972fb9d-6d12-4c1c-b135-704d64887b72, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=109b6929-6b88-494a-b397-b36c434ed7a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.146 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 109b6929-6b88-494a-b397-b36c434ed7a7 in datapath 44469d8b-ad30-4270-88fa-e67c568f3150 bound to our chassis#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.147 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44469d8b-ad30-4270-88fa-e67c568f3150#033[00m
Jan 31 03:29:48 np0005603623 NetworkManager[48970]: <info>  [1769848188.1505] device (tap109b6929-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:29:48 np0005603623 NetworkManager[48970]: <info>  [1769848188.1512] device (tap109b6929-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.157 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2f64b1d8-6b32-4f71-91f8-24247ef255cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.158 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44469d8b-a1 in ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.159 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44469d8b-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.159 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[aa1301df-d141-49ea-a44e-9741abd85c28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.160 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[322a731a-a8c9-483d-9208-ac40fb398792]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.170 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[e5956d25-0cf3-4ccd-a248-aa1edcbcfda9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 systemd[1]: Started Virtual Machine qemu-57-instance-00000079.
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.173 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.179 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1f031f91-736b-4cbd-bf21-a26c9cc1bb78]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:29:48Z|00513|binding|INFO|Setting lport 109b6929-6b88-494a-b397-b36c434ed7a7 up in Southbound
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.207 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[dfd36e92-9a84-4da0-aeda-a78b9a7eeb4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:29:48Z|00514|binding|INFO|Setting lport 109b6929-6b88-494a-b397-b36c434ed7a7 ovn-installed in OVS
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.226 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.228 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.230 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[58b92ed8-dd10-4224-b452-53bab6ee05d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 systemd-udevd[283616]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:29:48 np0005603623 NetworkManager[48970]: <info>  [1769848188.2326] manager: (tap44469d8b-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/248)
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.255 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e1dec917-e1d9-4106-8c24-f5f118d3cd65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.258 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3568a630-6529-4a6e-91f4-9e41a0989c72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 NetworkManager[48970]: <info>  [1769848188.2862] device (tap44469d8b-a0): carrier: link connected
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.289 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0a9827-553e-41cd-9890-b958acacb096]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.304 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3a07b4ba-0d5b-4c3d-873c-fa5ef088ddf9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44469d8b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:98:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 732871, 'reachable_time': 23239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283646, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.321 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ac6cce-9729-4047-851a-ad662c7c0f48]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:9820'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 732871, 'tstamp': 732871}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 283647, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.337 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e02cbb83-d966-4f41-b741-f70ab6b6872e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44469d8b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:98:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 732871, 'reachable_time': 23239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 283648, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.359 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d11f3725-9916-4c15-bc61-3467e2937fb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.405 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd72eca-d6ab-43d1-8789-7c0d5f2d08fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.406 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44469d8b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.407 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.407 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44469d8b-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:29:48 np0005603623 NetworkManager[48970]: <info>  [1769848188.4099] manager: (tap44469d8b-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Jan 31 03:29:48 np0005603623 kernel: tap44469d8b-a0: entered promiscuous mode
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.411 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.413 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44469d8b-a0, col_values=(('external_ids', {'iface-id': '7e288124-e200-4c03-8a4a-baab3e3f3d7a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:29:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:29:48Z|00515|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.414 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.417 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.418 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bdcf58cf-0437-4f15-af2c-0e4b4d45c836]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.419 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.419 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-44469d8b-ad30-4270-88fa-e67c568f3150
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 44469d8b-ad30-4270-88fa-e67c568f3150
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:29:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:48.420 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'env', 'PROCESS_TAG=haproxy-44469d8b-ad30-4270-88fa-e67c568f3150', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44469d8b-ad30-4270-88fa-e67c568f3150.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:29:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:48.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:48 np0005603623 podman[283696]: 2026-01-31 08:29:48.717873826 +0000 UTC m=+0.021517415 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.826 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.827 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.827 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:29:48 np0005603623 podman[283696]: 2026-01-31 08:29:48.87623793 +0000 UTC m=+0.179881509 container create 4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.896 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848188.8956614, cca881fe-18fa-40c1-b9ef-2b1f28855b53 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.897 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.899 226239 DEBUG nova.compute.manager [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.905 226239 INFO nova.virt.libvirt.driver [-] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Instance rebooted successfully.#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.906 226239 DEBUG nova.compute.manager [None req-9c082648-f462-4c36-9fe0-c715622044c8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.911 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-cca881fe-18fa-40c1-b9ef-2b1f28855b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.911 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-cca881fe-18fa-40c1-b9ef-2b1f28855b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.911 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:29:48 np0005603623 nova_compute[226235]: 2026-01-31 08:29:48.912 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:48 np0005603623 systemd[1]: Started libpod-conmon-4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24.scope.
Jan 31 03:29:48 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:29:48 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07e4ebefbc3cd87ab6b4a81d476ad872de376f10a6def78363a0ec741d848d82/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.010 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.014 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:29:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:49.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:49 np0005603623 podman[283696]: 2026-01-31 08:29:49.024300193 +0000 UTC m=+0.327943782 container init 4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 31 03:29:49 np0005603623 podman[283696]: 2026-01-31 08:29:49.029841756 +0000 UTC m=+0.333485315 container start 4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:29:49 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[283738]: [NOTICE]   (283767) : New worker (283775) forked
Jan 31 03:29:49 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[283738]: [NOTICE]   (283767) : Loading success.
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.077 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848188.895828, cca881fe-18fa-40c1-b9ef-2b1f28855b53 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.077 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] VM Started (Lifecycle Event)#033[00m
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.221 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.223 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.349 226239 DEBUG nova.compute.manager [req-c3f21d28-a8d1-4e5b-ba6e-07977679152f req-b6b242a3-f94f-4aee-8a7d-8f71ef8ce38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Received event network-vif-plugged-109b6929-6b88-494a-b397-b36c434ed7a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.349 226239 DEBUG oslo_concurrency.lockutils [req-c3f21d28-a8d1-4e5b-ba6e-07977679152f req-b6b242a3-f94f-4aee-8a7d-8f71ef8ce38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.349 226239 DEBUG oslo_concurrency.lockutils [req-c3f21d28-a8d1-4e5b-ba6e-07977679152f req-b6b242a3-f94f-4aee-8a7d-8f71ef8ce38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.349 226239 DEBUG oslo_concurrency.lockutils [req-c3f21d28-a8d1-4e5b-ba6e-07977679152f req-b6b242a3-f94f-4aee-8a7d-8f71ef8ce38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.350 226239 DEBUG nova.compute.manager [req-c3f21d28-a8d1-4e5b-ba6e-07977679152f req-b6b242a3-f94f-4aee-8a7d-8f71ef8ce38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] No waiting events found dispatching network-vif-plugged-109b6929-6b88-494a-b397-b36c434ed7a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.350 226239 WARNING nova.compute.manager [req-c3f21d28-a8d1-4e5b-ba6e-07977679152f req-b6b242a3-f94f-4aee-8a7d-8f71ef8ce38e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Received unexpected event network-vif-plugged-109b6929-6b88-494a-b397-b36c434ed7a7 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:29:49 np0005603623 nova_compute[226235]: 2026-01-31 08:29:49.631 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:50.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:51.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:51 np0005603623 nova_compute[226235]: 2026-01-31 08:29:51.615 226239 DEBUG nova.compute.manager [req-bea57983-0eba-423f-b869-22ae839b38ad req-b9776c64-1ad3-4170-be9e-352e2643242c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Received event network-vif-plugged-109b6929-6b88-494a-b397-b36c434ed7a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:29:51 np0005603623 nova_compute[226235]: 2026-01-31 08:29:51.615 226239 DEBUG oslo_concurrency.lockutils [req-bea57983-0eba-423f-b869-22ae839b38ad req-b9776c64-1ad3-4170-be9e-352e2643242c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:51 np0005603623 nova_compute[226235]: 2026-01-31 08:29:51.616 226239 DEBUG oslo_concurrency.lockutils [req-bea57983-0eba-423f-b869-22ae839b38ad req-b9776c64-1ad3-4170-be9e-352e2643242c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:51 np0005603623 nova_compute[226235]: 2026-01-31 08:29:51.616 226239 DEBUG oslo_concurrency.lockutils [req-bea57983-0eba-423f-b869-22ae839b38ad req-b9776c64-1ad3-4170-be9e-352e2643242c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:51 np0005603623 nova_compute[226235]: 2026-01-31 08:29:51.616 226239 DEBUG nova.compute.manager [req-bea57983-0eba-423f-b869-22ae839b38ad req-b9776c64-1ad3-4170-be9e-352e2643242c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] No waiting events found dispatching network-vif-plugged-109b6929-6b88-494a-b397-b36c434ed7a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:29:51 np0005603623 nova_compute[226235]: 2026-01-31 08:29:51.616 226239 WARNING nova.compute.manager [req-bea57983-0eba-423f-b869-22ae839b38ad req-b9776c64-1ad3-4170-be9e-352e2643242c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Received unexpected event network-vif-plugged-109b6929-6b88-494a-b397-b36c434ed7a7 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:29:52 np0005603623 nova_compute[226235]: 2026-01-31 08:29:52.383 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:52.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:53.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:53 np0005603623 nova_compute[226235]: 2026-01-31 08:29:53.036 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:53 np0005603623 nova_compute[226235]: 2026-01-31 08:29:53.785 226239 DEBUG oslo_concurrency.lockutils [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:53 np0005603623 nova_compute[226235]: 2026-01-31 08:29:53.786 226239 DEBUG oslo_concurrency.lockutils [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:53 np0005603623 nova_compute[226235]: 2026-01-31 08:29:53.786 226239 DEBUG oslo_concurrency.lockutils [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:53 np0005603623 nova_compute[226235]: 2026-01-31 08:29:53.786 226239 DEBUG oslo_concurrency.lockutils [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:53 np0005603623 nova_compute[226235]: 2026-01-31 08:29:53.787 226239 DEBUG oslo_concurrency.lockutils [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:53 np0005603623 nova_compute[226235]: 2026-01-31 08:29:53.788 226239 INFO nova.compute.manager [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Terminating instance#033[00m
Jan 31 03:29:53 np0005603623 nova_compute[226235]: 2026-01-31 08:29:53.789 226239 DEBUG nova.compute.manager [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:29:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:53 np0005603623 kernel: tap109b6929-6b (unregistering): left promiscuous mode
Jan 31 03:29:53 np0005603623 NetworkManager[48970]: <info>  [1769848193.9701] device (tap109b6929-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:29:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:29:53Z|00516|binding|INFO|Releasing lport 109b6929-6b88-494a-b397-b36c434ed7a7 from this chassis (sb_readonly=0)
Jan 31 03:29:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:29:53Z|00517|binding|INFO|Setting lport 109b6929-6b88-494a-b397-b36c434ed7a7 down in Southbound
Jan 31 03:29:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:29:53Z|00518|binding|INFO|Removing iface tap109b6929-6b ovn-installed in OVS
Jan 31 03:29:53 np0005603623 nova_compute[226235]: 2026-01-31 08:29:53.974 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:53 np0005603623 nova_compute[226235]: 2026-01-31 08:29:53.982 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.006 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:b8:22 10.100.0.13'], port_security=['fa:16:3e:06:b8:22 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'cca881fe-18fa-40c1-b9ef-2b1f28855b53', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44469d8b-ad30-4270-88fa-e67c568f3150', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '953a213fa5cb435ab3c04ad96152685f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5b1bd8ad-0d2a-4d57-a00a-9a6b59df86e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.181', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d972fb9d-6d12-4c1c-b135-704d64887b72, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=109b6929-6b88-494a-b397-b36c434ed7a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.008 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 109b6929-6b88-494a-b397-b36c434ed7a7 in datapath 44469d8b-ad30-4270-88fa-e67c568f3150 unbound from our chassis#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.009 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44469d8b-ad30-4270-88fa-e67c568f3150, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.010 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[146d1bef-4e09-4f64-aec2-9f0e35d09b5f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.011 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 namespace which is not needed anymore#033[00m
Jan 31 03:29:54 np0005603623 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000079.scope: Deactivated successfully.
Jan 31 03:29:54 np0005603623 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000079.scope: Consumed 5.773s CPU time.
Jan 31 03:29:54 np0005603623 systemd-machined[194379]: Machine qemu-57-instance-00000079 terminated.
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.037 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Updating instance_info_cache with network_info: [{"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:29:54 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[283738]: [NOTICE]   (283767) : haproxy version is 2.8.14-c23fe91
Jan 31 03:29:54 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[283738]: [NOTICE]   (283767) : path to executable is /usr/sbin/haproxy
Jan 31 03:29:54 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[283738]: [WARNING]  (283767) : Exiting Master process...
Jan 31 03:29:54 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[283738]: [ALERT]    (283767) : Current worker (283775) exited with code 143 (Terminated)
Jan 31 03:29:54 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[283738]: [WARNING]  (283767) : All workers exited. Exiting... (0)
Jan 31 03:29:54 np0005603623 systemd[1]: libpod-4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24.scope: Deactivated successfully.
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.196 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-cca881fe-18fa-40c1-b9ef-2b1f28855b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.196 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.197 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.197 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.197 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.197 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.202 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:54 np0005603623 podman[283831]: 2026-01-31 08:29:54.203833782 +0000 UTC m=+0.122929644 container died 4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.204 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.211 226239 INFO nova.virt.libvirt.driver [-] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Instance destroyed successfully.#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.212 226239 DEBUG nova.objects.instance [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'resources' on Instance uuid cca881fe-18fa-40c1-b9ef-2b1f28855b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.295 226239 DEBUG nova.virt.libvirt.vif [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:25:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-899650284',display_name='tempest-ServerActionsTestOtherB-server-899650284',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-899650284',id=121,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:29:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-tuc10ywh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:29:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=cca881fe-18fa-40c1-b9ef-2b1f28855b53,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.296 226239 DEBUG nova.network.os_vif_util [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "109b6929-6b88-494a-b397-b36c434ed7a7", "address": "fa:16:3e:06:b8:22", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap109b6929-6b", "ovs_interfaceid": "109b6929-6b88-494a-b397-b36c434ed7a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.296 226239 DEBUG nova.network.os_vif_util [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.297 226239 DEBUG os_vif [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.298 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.298 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap109b6929-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.300 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.301 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.303 226239 INFO os_vif [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:b8:22,bridge_name='br-int',has_traffic_filtering=True,id=109b6929-6b88-494a-b397-b36c434ed7a7,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap109b6929-6b')#033[00m
Jan 31 03:29:54 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24-userdata-shm.mount: Deactivated successfully.
Jan 31 03:29:54 np0005603623 systemd[1]: var-lib-containers-storage-overlay-07e4ebefbc3cd87ab6b4a81d476ad872de376f10a6def78363a0ec741d848d82-merged.mount: Deactivated successfully.
Jan 31 03:29:54 np0005603623 podman[283831]: 2026-01-31 08:29:54.589107 +0000 UTC m=+0.508202872 container cleanup 4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:29:54 np0005603623 systemd[1]: libpod-conmon-4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24.scope: Deactivated successfully.
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.632 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:54 np0005603623 podman[283891]: 2026-01-31 08:29:54.723120722 +0000 UTC m=+0.117138483 container remove 4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.729 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[11f93a60-1507-4b3b-8f47-722f4a50699d]: (4, ('Sat Jan 31 08:29:54 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 (4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24)\n4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24\nSat Jan 31 08:29:54 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 (4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24)\n4897dc192fcd5485702e683913e32efe1da2fa46829b60e967f9418dd5891a24\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.732 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6d058493-3334-4ea0-aaa6-705e23ec9598]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.734 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44469d8b-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.737 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:54 np0005603623 kernel: tap44469d8b-a0: left promiscuous mode
Jan 31 03:29:54 np0005603623 nova_compute[226235]: 2026-01-31 08:29:54.743 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.745 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc1f939-9cdd-4e36-9bf5-ed880266242e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.761 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[20cb3e01-3ad3-4dc4-8c25-71684313b99a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.763 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e6ef6a8f-c7ea-4a63-ac5a-058db1c6b1c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.789 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0689f53a-08a9-4d8a-8839-ba74465b52ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 732863, 'reachable_time': 27716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283906, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.794 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:29:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:29:54.794 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[de90e3fc-82d6-401a-ae17-fda6b59232c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:29:54 np0005603623 systemd[1]: run-netns-ovnmeta\x2d44469d8b\x2dad30\x2d4270\x2d88fa\x2de67c568f3150.mount: Deactivated successfully.
Jan 31 03:29:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:29:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:54.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:29:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:55.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:55 np0005603623 podman[283907]: 2026-01-31 08:29:55.949252869 +0000 UTC m=+0.045964993 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:29:55 np0005603623 podman[283908]: 2026-01-31 08:29:55.98434644 +0000 UTC m=+0.076798009 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:29:56 np0005603623 nova_compute[226235]: 2026-01-31 08:29:56.146 226239 DEBUG nova.compute.manager [req-7b40e61c-a520-4808-8df4-000855f465c2 req-ea53bbcc-e84d-4a2b-b83c-11eca5290800 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Received event network-vif-unplugged-109b6929-6b88-494a-b397-b36c434ed7a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:29:56 np0005603623 nova_compute[226235]: 2026-01-31 08:29:56.146 226239 DEBUG oslo_concurrency.lockutils [req-7b40e61c-a520-4808-8df4-000855f465c2 req-ea53bbcc-e84d-4a2b-b83c-11eca5290800 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:56 np0005603623 nova_compute[226235]: 2026-01-31 08:29:56.146 226239 DEBUG oslo_concurrency.lockutils [req-7b40e61c-a520-4808-8df4-000855f465c2 req-ea53bbcc-e84d-4a2b-b83c-11eca5290800 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:56 np0005603623 nova_compute[226235]: 2026-01-31 08:29:56.146 226239 DEBUG oslo_concurrency.lockutils [req-7b40e61c-a520-4808-8df4-000855f465c2 req-ea53bbcc-e84d-4a2b-b83c-11eca5290800 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:56 np0005603623 nova_compute[226235]: 2026-01-31 08:29:56.147 226239 DEBUG nova.compute.manager [req-7b40e61c-a520-4808-8df4-000855f465c2 req-ea53bbcc-e84d-4a2b-b83c-11eca5290800 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] No waiting events found dispatching network-vif-unplugged-109b6929-6b88-494a-b397-b36c434ed7a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:29:56 np0005603623 nova_compute[226235]: 2026-01-31 08:29:56.147 226239 DEBUG nova.compute.manager [req-7b40e61c-a520-4808-8df4-000855f465c2 req-ea53bbcc-e84d-4a2b-b83c-11eca5290800 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Received event network-vif-unplugged-109b6929-6b88-494a-b397-b36c434ed7a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:29:56 np0005603623 nova_compute[226235]: 2026-01-31 08:29:56.147 226239 DEBUG nova.compute.manager [req-7b40e61c-a520-4808-8df4-000855f465c2 req-ea53bbcc-e84d-4a2b-b83c-11eca5290800 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Received event network-vif-plugged-109b6929-6b88-494a-b397-b36c434ed7a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:29:56 np0005603623 nova_compute[226235]: 2026-01-31 08:29:56.147 226239 DEBUG oslo_concurrency.lockutils [req-7b40e61c-a520-4808-8df4-000855f465c2 req-ea53bbcc-e84d-4a2b-b83c-11eca5290800 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:56 np0005603623 nova_compute[226235]: 2026-01-31 08:29:56.147 226239 DEBUG oslo_concurrency.lockutils [req-7b40e61c-a520-4808-8df4-000855f465c2 req-ea53bbcc-e84d-4a2b-b83c-11eca5290800 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:56 np0005603623 nova_compute[226235]: 2026-01-31 08:29:56.148 226239 DEBUG oslo_concurrency.lockutils [req-7b40e61c-a520-4808-8df4-000855f465c2 req-ea53bbcc-e84d-4a2b-b83c-11eca5290800 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:56 np0005603623 nova_compute[226235]: 2026-01-31 08:29:56.148 226239 DEBUG nova.compute.manager [req-7b40e61c-a520-4808-8df4-000855f465c2 req-ea53bbcc-e84d-4a2b-b83c-11eca5290800 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] No waiting events found dispatching network-vif-plugged-109b6929-6b88-494a-b397-b36c434ed7a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:29:56 np0005603623 nova_compute[226235]: 2026-01-31 08:29:56.148 226239 WARNING nova.compute.manager [req-7b40e61c-a520-4808-8df4-000855f465c2 req-ea53bbcc-e84d-4a2b-b83c-11eca5290800 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Received unexpected event network-vif-plugged-109b6929-6b88-494a-b397-b36c434ed7a7 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:29:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:56.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:29:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:57.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:29:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:58.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:29:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:59.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:59 np0005603623 nova_compute[226235]: 2026-01-31 08:29:59.302 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:59 np0005603623 nova_compute[226235]: 2026-01-31 08:29:59.520 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:59 np0005603623 nova_compute[226235]: 2026-01-31 08:29:59.634 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:59 np0005603623 nova_compute[226235]: 2026-01-31 08:29:59.891 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:00.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:01.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:01 np0005603623 nova_compute[226235]: 2026-01-31 08:30:01.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:01 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 03:30:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:02.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:03.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:03 np0005603623 nova_compute[226235]: 2026-01-31 08:30:03.060 226239 INFO nova.virt.libvirt.driver [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Deleting instance files /var/lib/nova/instances/cca881fe-18fa-40c1-b9ef-2b1f28855b53_del#033[00m
Jan 31 03:30:03 np0005603623 nova_compute[226235]: 2026-01-31 08:30:03.061 226239 INFO nova.virt.libvirt.driver [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Deletion of /var/lib/nova/instances/cca881fe-18fa-40c1-b9ef-2b1f28855b53_del complete#033[00m
Jan 31 03:30:03 np0005603623 nova_compute[226235]: 2026-01-31 08:30:03.388 226239 INFO nova.compute.manager [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Took 9.60 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:30:03 np0005603623 nova_compute[226235]: 2026-01-31 08:30:03.389 226239 DEBUG oslo.service.loopingcall [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:30:03 np0005603623 nova_compute[226235]: 2026-01-31 08:30:03.389 226239 DEBUG nova.compute.manager [-] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:30:03 np0005603623 nova_compute[226235]: 2026-01-31 08:30:03.389 226239 DEBUG nova.network.neutron [-] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:30:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:04 np0005603623 nova_compute[226235]: 2026-01-31 08:30:04.304 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:04 np0005603623 nova_compute[226235]: 2026-01-31 08:30:04.636 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:30:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:04.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:30:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:30:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:05.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:30:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:05.779 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:30:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:05.780 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:30:05 np0005603623 nova_compute[226235]: 2026-01-31 08:30:05.780 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:06.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:07.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:08 np0005603623 nova_compute[226235]: 2026-01-31 08:30:08.176 226239 DEBUG nova.network.neutron [-] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:30:08 np0005603623 nova_compute[226235]: 2026-01-31 08:30:08.290 226239 DEBUG nova.compute.manager [req-54007927-3b82-404e-8a1e-802369fec46b req-e91bb89a-1b12-4c47-815d-baa958008877 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Received event network-vif-deleted-109b6929-6b88-494a-b397-b36c434ed7a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:08 np0005603623 nova_compute[226235]: 2026-01-31 08:30:08.291 226239 INFO nova.compute.manager [req-54007927-3b82-404e-8a1e-802369fec46b req-e91bb89a-1b12-4c47-815d-baa958008877 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Neutron deleted interface 109b6929-6b88-494a-b397-b36c434ed7a7; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:30:08 np0005603623 nova_compute[226235]: 2026-01-31 08:30:08.291 226239 DEBUG nova.network.neutron [req-54007927-3b82-404e-8a1e-802369fec46b req-e91bb89a-1b12-4c47-815d-baa958008877 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:30:08 np0005603623 nova_compute[226235]: 2026-01-31 08:30:08.645 226239 INFO nova.compute.manager [-] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Took 5.26 seconds to deallocate network for instance.#033[00m
Jan 31 03:30:08 np0005603623 nova_compute[226235]: 2026-01-31 08:30:08.685 226239 DEBUG nova.compute.manager [req-54007927-3b82-404e-8a1e-802369fec46b req-e91bb89a-1b12-4c47-815d-baa958008877 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Detach interface failed, port_id=109b6929-6b88-494a-b397-b36c434ed7a7, reason: Instance cca881fe-18fa-40c1-b9ef-2b1f28855b53 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:30:08 np0005603623 nova_compute[226235]: 2026-01-31 08:30:08.791 226239 DEBUG oslo_concurrency.lockutils [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:08 np0005603623 nova_compute[226235]: 2026-01-31 08:30:08.792 226239 DEBUG oslo_concurrency.lockutils [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:30:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:08.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:30:08 np0005603623 nova_compute[226235]: 2026-01-31 08:30:08.924 226239 DEBUG oslo_concurrency.processutils [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:09.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:09 np0005603623 nova_compute[226235]: 2026-01-31 08:30:09.210 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848194.2100196, cca881fe-18fa-40c1-b9ef-2b1f28855b53 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:30:09 np0005603623 nova_compute[226235]: 2026-01-31 08:30:09.211 226239 INFO nova.compute.manager [-] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:30:09 np0005603623 nova_compute[226235]: 2026-01-31 08:30:09.253 226239 DEBUG nova.compute.manager [None req-059bef5b-2a15-4abe-be8e-45fa33c83f87 - - - - - -] [instance: cca881fe-18fa-40c1-b9ef-2b1f28855b53] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:09 np0005603623 nova_compute[226235]: 2026-01-31 08:30:09.307 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:30:09 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2492257465' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:30:09 np0005603623 nova_compute[226235]: 2026-01-31 08:30:09.505 226239 DEBUG oslo_concurrency.processutils [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.581s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:09 np0005603623 nova_compute[226235]: 2026-01-31 08:30:09.511 226239 DEBUG nova.compute.provider_tree [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:30:09 np0005603623 nova_compute[226235]: 2026-01-31 08:30:09.544 226239 DEBUG nova.scheduler.client.report [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:30:09 np0005603623 nova_compute[226235]: 2026-01-31 08:30:09.586 226239 DEBUG oslo_concurrency.lockutils [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.793s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:09 np0005603623 nova_compute[226235]: 2026-01-31 08:30:09.637 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:10 np0005603623 nova_compute[226235]: 2026-01-31 08:30:10.307 226239 INFO nova.scheduler.client.report [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Deleted allocations for instance cca881fe-18fa-40c1-b9ef-2b1f28855b53#033[00m
Jan 31 03:30:10 np0005603623 nova_compute[226235]: 2026-01-31 08:30:10.707 226239 DEBUG oslo_concurrency.lockutils [None req-75fcbd3c-8091-41b3-8770-4dde13adf913 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "cca881fe-18fa-40c1-b9ef-2b1f28855b53" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 16.921s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:10.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:11.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:30:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:12.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:30:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:30:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:13.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:30:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:14 np0005603623 nova_compute[226235]: 2026-01-31 08:30:14.310 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e311 e311: 3 total, 3 up, 3 in
Jan 31 03:30:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:30:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/384214357' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:30:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:30:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/384214357' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:30:14 np0005603623 nova_compute[226235]: 2026-01-31 08:30:14.639 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:14.781 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:14.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:15.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:16.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:30:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:17.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:30:17 np0005603623 nova_compute[226235]: 2026-01-31 08:30:17.450 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:18.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:19.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:19 np0005603623 nova_compute[226235]: 2026-01-31 08:30:19.313 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:19 np0005603623 nova_compute[226235]: 2026-01-31 08:30:19.607 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:19 np0005603623 nova_compute[226235]: 2026-01-31 08:30:19.608 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:19 np0005603623 nova_compute[226235]: 2026-01-31 08:30:19.641 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:19 np0005603623 nova_compute[226235]: 2026-01-31 08:30:19.734 226239 DEBUG nova.compute.manager [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:30:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 e312: 3 total, 3 up, 3 in
Jan 31 03:30:19 np0005603623 nova_compute[226235]: 2026-01-31 08:30:19.968 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:19 np0005603623 nova_compute[226235]: 2026-01-31 08:30:19.969 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:19 np0005603623 nova_compute[226235]: 2026-01-31 08:30:19.975 226239 DEBUG nova.virt.hardware [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:30:19 np0005603623 nova_compute[226235]: 2026-01-31 08:30:19.976 226239 INFO nova.compute.claims [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:30:20 np0005603623 nova_compute[226235]: 2026-01-31 08:30:20.306 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:30:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1794909598' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:30:20 np0005603623 nova_compute[226235]: 2026-01-31 08:30:20.754 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:20 np0005603623 nova_compute[226235]: 2026-01-31 08:30:20.760 226239 DEBUG nova.compute.provider_tree [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:30:20 np0005603623 nova_compute[226235]: 2026-01-31 08:30:20.799 226239 DEBUG nova.scheduler.client.report [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:30:20 np0005603623 nova_compute[226235]: 2026-01-31 08:30:20.840 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:20 np0005603623 nova_compute[226235]: 2026-01-31 08:30:20.841 226239 DEBUG nova.compute.manager [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:30:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:20.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:20 np0005603623 nova_compute[226235]: 2026-01-31 08:30:20.916 226239 DEBUG nova.compute.manager [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:30:20 np0005603623 nova_compute[226235]: 2026-01-31 08:30:20.917 226239 DEBUG nova.network.neutron [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:30:20 np0005603623 nova_compute[226235]: 2026-01-31 08:30:20.976 226239 INFO nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:30:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:21.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.092 226239 DEBUG nova.compute.manager [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.260 226239 DEBUG nova.compute.manager [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.261 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.262 226239 INFO nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Creating image(s)#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.317 226239 DEBUG nova.storage.rbd_utils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image a15175ec-85fd-457c-870b-8a6d7c13c906_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.345 226239 DEBUG nova.storage.rbd_utils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image a15175ec-85fd-457c-870b-8a6d7c13c906_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.377 226239 DEBUG nova.storage.rbd_utils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image a15175ec-85fd-457c-870b-8a6d7c13c906_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.380 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.442 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.442 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.443 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.443 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.465 226239 DEBUG nova.storage.rbd_utils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image a15175ec-85fd-457c-870b-8a6d7c13c906_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:21 np0005603623 nova_compute[226235]: 2026-01-31 08:30:21.471 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 a15175ec-85fd-457c-870b-8a6d7c13c906_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:22 np0005603623 nova_compute[226235]: 2026-01-31 08:30:22.378 226239 DEBUG nova.policy [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef51681d234a4abc88ff433d0640b6e7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '953a213fa5cb435ab3c04ad96152685f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:30:22 np0005603623 nova_compute[226235]: 2026-01-31 08:30:22.501 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 a15175ec-85fd-457c-870b-8a6d7c13c906_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:22 np0005603623 nova_compute[226235]: 2026-01-31 08:30:22.588 226239 DEBUG nova.storage.rbd_utils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] resizing rbd image a15175ec-85fd-457c-870b-8a6d7c13c906_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:30:22 np0005603623 nova_compute[226235]: 2026-01-31 08:30:22.761 226239 DEBUG nova.objects.instance [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'migration_context' on Instance uuid a15175ec-85fd-457c-870b-8a6d7c13c906 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:30:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:22.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:22 np0005603623 nova_compute[226235]: 2026-01-31 08:30:22.921 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:30:22 np0005603623 nova_compute[226235]: 2026-01-31 08:30:22.922 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Ensure instance console log exists: /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:30:22 np0005603623 nova_compute[226235]: 2026-01-31 08:30:22.922 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:22 np0005603623 nova_compute[226235]: 2026-01-31 08:30:22.923 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:22 np0005603623 nova_compute[226235]: 2026-01-31 08:30:22.923 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:23.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:24 np0005603623 nova_compute[226235]: 2026-01-31 08:30:24.023 226239 DEBUG nova.network.neutron [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Successfully created port: 02df5608-7a85-4d54-b5ac-628d6c8e8179 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:30:24 np0005603623 nova_compute[226235]: 2026-01-31 08:30:24.316 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:24 np0005603623 nova_compute[226235]: 2026-01-31 08:30:24.644 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:24.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:30:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:25.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:30:26 np0005603623 nova_compute[226235]: 2026-01-31 08:30:26.378 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:26 np0005603623 nova_compute[226235]: 2026-01-31 08:30:26.729 226239 DEBUG nova.network.neutron [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Successfully updated port: 02df5608-7a85-4d54-b5ac-628d6c8e8179 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:30:26 np0005603623 nova_compute[226235]: 2026-01-31 08:30:26.852 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:30:26 np0005603623 nova_compute[226235]: 2026-01-31 08:30:26.852 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquired lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:30:26 np0005603623 nova_compute[226235]: 2026-01-31 08:30:26.852 226239 DEBUG nova.network.neutron [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:30:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:26.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:26 np0005603623 nova_compute[226235]: 2026-01-31 08:30:26.903 226239 DEBUG nova.compute.manager [req-b18c7ba1-22fd-496a-84d7-6e9125c2e2a8 req-0e8ad161-cfb0-4226-91e6-6170ce96a721 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-changed-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:26 np0005603623 nova_compute[226235]: 2026-01-31 08:30:26.903 226239 DEBUG nova.compute.manager [req-b18c7ba1-22fd-496a-84d7-6e9125c2e2a8 req-0e8ad161-cfb0-4226-91e6-6170ce96a721 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Refreshing instance network info cache due to event network-changed-02df5608-7a85-4d54-b5ac-628d6c8e8179. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:30:26 np0005603623 nova_compute[226235]: 2026-01-31 08:30:26.903 226239 DEBUG oslo_concurrency.lockutils [req-b18c7ba1-22fd-496a-84d7-6e9125c2e2a8 req-0e8ad161-cfb0-4226-91e6-6170ce96a721 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:30:26 np0005603623 podman[284230]: 2026-01-31 08:30:26.979179261 +0000 UTC m=+0.075028683 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:30:27 np0005603623 podman[284231]: 2026-01-31 08:30:27.011441983 +0000 UTC m=+0.106540241 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:30:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:30:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:27.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:30:27 np0005603623 nova_compute[226235]: 2026-01-31 08:30:27.196 226239 DEBUG nova.network.neutron [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:30:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:28.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:29.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.319 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.452 226239 DEBUG nova.network.neutron [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating instance_info_cache with network_info: [{"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.646 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.911 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Releasing lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.912 226239 DEBUG nova.compute.manager [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Instance network_info: |[{"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.912 226239 DEBUG oslo_concurrency.lockutils [req-b18c7ba1-22fd-496a-84d7-6e9125c2e2a8 req-0e8ad161-cfb0-4226-91e6-6170ce96a721 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.912 226239 DEBUG nova.network.neutron [req-b18c7ba1-22fd-496a-84d7-6e9125c2e2a8 req-0e8ad161-cfb0-4226-91e6-6170ce96a721 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Refreshing network info cache for port 02df5608-7a85-4d54-b5ac-628d6c8e8179 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.915 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Start _get_guest_xml network_info=[{"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.919 226239 WARNING nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.924 226239 DEBUG nova.virt.libvirt.host [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.925 226239 DEBUG nova.virt.libvirt.host [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.931 226239 DEBUG nova.virt.libvirt.host [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.932 226239 DEBUG nova.virt.libvirt.host [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.933 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.933 226239 DEBUG nova.virt.hardware [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.934 226239 DEBUG nova.virt.hardware [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.934 226239 DEBUG nova.virt.hardware [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.934 226239 DEBUG nova.virt.hardware [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.934 226239 DEBUG nova.virt.hardware [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.935 226239 DEBUG nova.virt.hardware [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.935 226239 DEBUG nova.virt.hardware [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.935 226239 DEBUG nova.virt.hardware [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.936 226239 DEBUG nova.virt.hardware [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.936 226239 DEBUG nova.virt.hardware [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.936 226239 DEBUG nova.virt.hardware [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:30:29 np0005603623 nova_compute[226235]: 2026-01-31 08:30:29.939 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:30.123 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:30.123 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:30.124 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:30:30 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/549330810' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:30:30 np0005603623 nova_compute[226235]: 2026-01-31 08:30:30.388 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:30 np0005603623 nova_compute[226235]: 2026-01-31 08:30:30.411 226239 DEBUG nova.storage.rbd_utils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image a15175ec-85fd-457c-870b-8a6d7c13c906_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:30 np0005603623 nova_compute[226235]: 2026-01-31 08:30:30.417 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:30:30 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4057913033' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:30:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:30.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:30 np0005603623 nova_compute[226235]: 2026-01-31 08:30:30.887 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:30 np0005603623 nova_compute[226235]: 2026-01-31 08:30:30.889 226239 DEBUG nova.virt.libvirt.vif [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:30:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2097097080',display_name='tempest-ServerActionsTestOtherB-server-2097097080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2097097080',id=128,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-41gbj3yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:30:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=a15175ec-85fd-457c-870b-8a6d7c13c906,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:30:30 np0005603623 nova_compute[226235]: 2026-01-31 08:30:30.891 226239 DEBUG nova.network.os_vif_util [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:30:30 np0005603623 nova_compute[226235]: 2026-01-31 08:30:30.892 226239 DEBUG nova.network.os_vif_util [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:30:30 np0005603623 nova_compute[226235]: 2026-01-31 08:30:30.894 226239 DEBUG nova.objects.instance [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'pci_devices' on Instance uuid a15175ec-85fd-457c-870b-8a6d7c13c906 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:30:30 np0005603623 nova_compute[226235]: 2026-01-31 08:30:30.998 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:30:30 np0005603623 nova_compute[226235]:  <uuid>a15175ec-85fd-457c-870b-8a6d7c13c906</uuid>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:  <name>instance-00000080</name>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:30:30 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsTestOtherB-server-2097097080</nova:name>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:30:29</nova:creationTime>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:30:30 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:        <nova:user uuid="ef51681d234a4abc88ff433d0640b6e7">tempest-ServerActionsTestOtherB-1048458052-project-member</nova:user>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:        <nova:project uuid="953a213fa5cb435ab3c04ad96152685f">tempest-ServerActionsTestOtherB-1048458052</nova:project>
Jan 31 03:30:30 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:        <nova:port uuid="02df5608-7a85-4d54-b5ac-628d6c8e8179">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <entry name="serial">a15175ec-85fd-457c-870b-8a6d7c13c906</entry>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <entry name="uuid">a15175ec-85fd-457c-870b-8a6d7c13c906</entry>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/a15175ec-85fd-457c-870b-8a6d7c13c906_disk">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/a15175ec-85fd-457c-870b-8a6d7c13c906_disk.config">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:dd:59:a9"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <target dev="tap02df5608-7a"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/console.log" append="off"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:30:31 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:30:31 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:30:31 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:30:31 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:30.999 226239 DEBUG nova.compute.manager [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Preparing to wait for external event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:30.999 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.000 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.000 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.000 226239 DEBUG nova.virt.libvirt.vif [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:30:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2097097080',display_name='tempest-ServerActionsTestOtherB-server-2097097080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2097097080',id=128,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-41gbj3yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:30:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=a15175ec-85fd-457c-870b-8a6d7c13c906,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.001 226239 DEBUG nova.network.os_vif_util [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.001 226239 DEBUG nova.network.os_vif_util [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.002 226239 DEBUG os_vif [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.002 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.003 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.003 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.006 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.007 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02df5608-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.007 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02df5608-7a, col_values=(('external_ids', {'iface-id': '02df5608-7a85-4d54-b5ac-628d6c8e8179', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:59:a9', 'vm-uuid': 'a15175ec-85fd-457c-870b-8a6d7c13c906'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:31 np0005603623 NetworkManager[48970]: <info>  [1769848231.0102] manager: (tap02df5608-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/250)
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.012 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.015 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.016 226239 INFO os_vif [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a')#033[00m
Jan 31 03:30:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:30:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:31.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.234 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.235 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.235 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No VIF found with MAC fa:16:3e:dd:59:a9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.235 226239 INFO nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Using config drive#033[00m
Jan 31 03:30:31 np0005603623 nova_compute[226235]: 2026-01-31 08:30:31.257 226239 DEBUG nova.storage.rbd_utils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image a15175ec-85fd-457c-870b-8a6d7c13c906_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.063 226239 INFO nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Creating config drive at /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/disk.config#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.069 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp4dpp44r5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.193 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp4dpp44r5" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.220 226239 DEBUG nova.storage.rbd_utils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image a15175ec-85fd-457c-870b-8a6d7c13c906_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.224 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/disk.config a15175ec-85fd-457c-870b-8a6d7c13c906_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.464 226239 DEBUG nova.network.neutron [req-b18c7ba1-22fd-496a-84d7-6e9125c2e2a8 req-0e8ad161-cfb0-4226-91e6-6170ce96a721 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updated VIF entry in instance network info cache for port 02df5608-7a85-4d54-b5ac-628d6c8e8179. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.465 226239 DEBUG nova.network.neutron [req-b18c7ba1-22fd-496a-84d7-6e9125c2e2a8 req-0e8ad161-cfb0-4226-91e6-6170ce96a721 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating instance_info_cache with network_info: [{"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.517 226239 DEBUG oslo_concurrency.processutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/disk.config a15175ec-85fd-457c-870b-8a6d7c13c906_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.293s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.518 226239 INFO nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Deleting local config drive /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/disk.config because it was imported into RBD.#033[00m
Jan 31 03:30:32 np0005603623 kernel: tap02df5608-7a: entered promiscuous mode
Jan 31 03:30:32 np0005603623 NetworkManager[48970]: <info>  [1769848232.5592] manager: (tap02df5608-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/251)
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.559 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:30:32Z|00519|binding|INFO|Claiming lport 02df5608-7a85-4d54-b5ac-628d6c8e8179 for this chassis.
Jan 31 03:30:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:30:32Z|00520|binding|INFO|02df5608-7a85-4d54-b5ac-628d6c8e8179: Claiming fa:16:3e:dd:59:a9 10.100.0.10
Jan 31 03:30:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:30:32Z|00521|binding|INFO|Setting lport 02df5608-7a85-4d54-b5ac-628d6c8e8179 ovn-installed in OVS
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.566 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.569 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:32 np0005603623 systemd-udevd[284462]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:30:32 np0005603623 systemd-machined[194379]: New machine qemu-58-instance-00000080.
Jan 31 03:30:32 np0005603623 NetworkManager[48970]: <info>  [1769848232.5873] device (tap02df5608-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:30:32 np0005603623 NetworkManager[48970]: <info>  [1769848232.5879] device (tap02df5608-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:30:32 np0005603623 systemd[1]: Started Virtual Machine qemu-58-instance-00000080.
Jan 31 03:30:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:30:32Z|00522|binding|INFO|Setting lport 02df5608-7a85-4d54-b5ac-628d6c8e8179 up in Southbound
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.645 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:59:a9 10.100.0.10'], port_security=['fa:16:3e:dd:59:a9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a15175ec-85fd-457c-870b-8a6d7c13c906', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44469d8b-ad30-4270-88fa-e67c568f3150', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '953a213fa5cb435ab3c04ad96152685f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5b1bd8ad-0d2a-4d57-a00a-9a6b59df86e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d972fb9d-6d12-4c1c-b135-704d64887b72, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=02df5608-7a85-4d54-b5ac-628d6c8e8179) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.646 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 02df5608-7a85-4d54-b5ac-628d6c8e8179 in datapath 44469d8b-ad30-4270-88fa-e67c568f3150 bound to our chassis#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.647 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44469d8b-ad30-4270-88fa-e67c568f3150#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.655 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[42c0159e-0980-4352-8104-73119c26664d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.656 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44469d8b-a1 in ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.658 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44469d8b-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.658 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ea3ae38c-4c7c-4d16-8b57-8ef03ce53c7f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.659 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[975d4402-1c97-4eaf-a342-52837cf30cef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.666 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[3606462b-fd56-4e8b-a9c2-95807b4c6107]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.675 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3ddeb6de-a9da-46bd-b527-e54b99a4ac27]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.697 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[28027ca3-9de6-4e94-9bbe-61d1f07c34cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 NetworkManager[48970]: <info>  [1769848232.7038] manager: (tap44469d8b-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/252)
Jan 31 03:30:32 np0005603623 systemd-udevd[284465]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.704 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[503df4c5-c803-4243-a602-7907f6238f19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.711 226239 DEBUG oslo_concurrency.lockutils [req-b18c7ba1-22fd-496a-84d7-6e9125c2e2a8 req-0e8ad161-cfb0-4226-91e6-6170ce96a721 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.727 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[de179282-04e6-4af7-af4e-433348012f83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.730 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff7c0fa-78ba-4a8d-ba1b-43cdf01a56b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 NetworkManager[48970]: <info>  [1769848232.7495] device (tap44469d8b-a0): carrier: link connected
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.753 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3b2d90d2-2e1a-4432-83a2-690a3011859a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.766 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[25133a62-b208-4f5e-b7ab-3bdf62a62480]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44469d8b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:98:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737317, 'reachable_time': 35248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284496, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.776 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ee6bda31-32b8-4e74-88c9-1a7ff8829e5f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:9820'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 737317, 'tstamp': 737317}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 284497, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.795 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b582c182-91b8-446f-8098-90307bfadeb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44469d8b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:98:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737317, 'reachable_time': 35248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 284498, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.820 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ce94a5-01bf-41da-a8ba-6e2545fb2c07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:32.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.873 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[284c5274-710e-4f8c-9714-79eddb1b008a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.875 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44469d8b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.875 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.875 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44469d8b-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.877 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:32 np0005603623 NetworkManager[48970]: <info>  [1769848232.8782] manager: (tap44469d8b-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/253)
Jan 31 03:30:32 np0005603623 kernel: tap44469d8b-a0: entered promiscuous mode
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.880 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44469d8b-a0, col_values=(('external_ids', {'iface-id': '7e288124-e200-4c03-8a4a-baab3e3f3d7a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:30:32Z|00523|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.881 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.882 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.883 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[56b14d02-1913-44b3-bba0-2d18168ebb1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.883 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-44469d8b-ad30-4270-88fa-e67c568f3150
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 44469d8b-ad30-4270-88fa-e67c568f3150
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:30:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:30:32.884 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'env', 'PROCESS_TAG=haproxy-44469d8b-ad30-4270-88fa-e67c568f3150', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44469d8b-ad30-4270-88fa-e67c568f3150.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:30:32 np0005603623 nova_compute[226235]: 2026-01-31 08:30:32.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.018 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848233.017298, a15175ec-85fd-457c-870b-8a6d7c13c906 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.018 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] VM Started (Lifecycle Event)#033[00m
Jan 31 03:30:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:33.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.189 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.193 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848233.0184278, a15175ec-85fd-457c-870b-8a6d7c13c906 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.194 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:30:33 np0005603623 podman[284572]: 2026-01-31 08:30:33.217759912 +0000 UTC m=+0.065731372 container create e38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Jan 31 03:30:33 np0005603623 podman[284572]: 2026-01-31 08:30:33.169325033 +0000 UTC m=+0.017296503 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:30:33 np0005603623 systemd[1]: Started libpod-conmon-e38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a.scope.
Jan 31 03:30:33 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:30:33 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/286fcce50259d70be7775964016cd425c6389db63d3507e4d6f7fdb83f6dcbbe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:30:33 np0005603623 podman[284572]: 2026-01-31 08:30:33.303984885 +0000 UTC m=+0.151956345 container init e38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:30:33 np0005603623 podman[284572]: 2026-01-31 08:30:33.309590811 +0000 UTC m=+0.157562261 container start e38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:30:33 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[284587]: [NOTICE]   (284591) : New worker (284593) forked
Jan 31 03:30:33 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[284587]: [NOTICE]   (284591) : Loading success.
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.349 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.353 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.474 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.655 226239 DEBUG nova.compute.manager [req-002f1c3e-e725-4fca-8951-f3da4b02c9be req-227e1e09-41b7-4b76-a54a-ee28bc2174aa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.656 226239 DEBUG oslo_concurrency.lockutils [req-002f1c3e-e725-4fca-8951-f3da4b02c9be req-227e1e09-41b7-4b76-a54a-ee28bc2174aa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.657 226239 DEBUG oslo_concurrency.lockutils [req-002f1c3e-e725-4fca-8951-f3da4b02c9be req-227e1e09-41b7-4b76-a54a-ee28bc2174aa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.657 226239 DEBUG oslo_concurrency.lockutils [req-002f1c3e-e725-4fca-8951-f3da4b02c9be req-227e1e09-41b7-4b76-a54a-ee28bc2174aa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.657 226239 DEBUG nova.compute.manager [req-002f1c3e-e725-4fca-8951-f3da4b02c9be req-227e1e09-41b7-4b76-a54a-ee28bc2174aa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Processing event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.658 226239 DEBUG nova.compute.manager [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.662 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848233.6617296, a15175ec-85fd-457c-870b-8a6d7c13c906 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.662 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.664 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.666 226239 INFO nova.virt.libvirt.driver [-] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Instance spawned successfully.#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.667 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:30:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.932 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.933 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.933 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.934 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.934 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.934 226239 DEBUG nova.virt.libvirt.driver [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.950 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:33 np0005603623 nova_compute[226235]: 2026-01-31 08:30:33.953 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:30:34 np0005603623 nova_compute[226235]: 2026-01-31 08:30:34.648 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:34 np0005603623 nova_compute[226235]: 2026-01-31 08:30:34.837 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:30:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:34.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:35.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:35 np0005603623 nova_compute[226235]: 2026-01-31 08:30:35.105 226239 INFO nova.compute.manager [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Took 13.84 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:30:35 np0005603623 nova_compute[226235]: 2026-01-31 08:30:35.105 226239 DEBUG nova.compute.manager [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:35 np0005603623 nova_compute[226235]: 2026-01-31 08:30:35.522 226239 INFO nova.compute.manager [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Took 15.59 seconds to build instance.#033[00m
Jan 31 03:30:35 np0005603623 nova_compute[226235]: 2026-01-31 08:30:35.830 226239 DEBUG oslo_concurrency.lockutils [None req-c8dfcffa-bb14-4299-8120-3ae6a1b8d883 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:35 np0005603623 nova_compute[226235]: 2026-01-31 08:30:35.973 226239 DEBUG nova.compute.manager [req-3eb9f5d9-ac98-41ac-84a1-e599d8aa7f3a req-c30fabd4-a508-4da2-b286-1a8d3fdda403 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:35 np0005603623 nova_compute[226235]: 2026-01-31 08:30:35.973 226239 DEBUG oslo_concurrency.lockutils [req-3eb9f5d9-ac98-41ac-84a1-e599d8aa7f3a req-c30fabd4-a508-4da2-b286-1a8d3fdda403 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:35 np0005603623 nova_compute[226235]: 2026-01-31 08:30:35.974 226239 DEBUG oslo_concurrency.lockutils [req-3eb9f5d9-ac98-41ac-84a1-e599d8aa7f3a req-c30fabd4-a508-4da2-b286-1a8d3fdda403 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:35 np0005603623 nova_compute[226235]: 2026-01-31 08:30:35.974 226239 DEBUG oslo_concurrency.lockutils [req-3eb9f5d9-ac98-41ac-84a1-e599d8aa7f3a req-c30fabd4-a508-4da2-b286-1a8d3fdda403 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:35 np0005603623 nova_compute[226235]: 2026-01-31 08:30:35.974 226239 DEBUG nova.compute.manager [req-3eb9f5d9-ac98-41ac-84a1-e599d8aa7f3a req-c30fabd4-a508-4da2-b286-1a8d3fdda403 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] No waiting events found dispatching network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:30:35 np0005603623 nova_compute[226235]: 2026-01-31 08:30:35.974 226239 WARNING nova.compute.manager [req-3eb9f5d9-ac98-41ac-84a1-e599d8aa7f3a req-c30fabd4-a508-4da2-b286-1a8d3fdda403 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received unexpected event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:30:36 np0005603623 nova_compute[226235]: 2026-01-31 08:30:36.010 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:36.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:37.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:38.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:39.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:39 np0005603623 nova_compute[226235]: 2026-01-31 08:30:39.654 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:30:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:30:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:30:40 np0005603623 nova_compute[226235]: 2026-01-31 08:30:40.709 226239 DEBUG nova.compute.manager [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-changed-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:40 np0005603623 nova_compute[226235]: 2026-01-31 08:30:40.709 226239 DEBUG nova.compute.manager [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Refreshing instance network info cache due to event network-changed-02df5608-7a85-4d54-b5ac-628d6c8e8179. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:30:40 np0005603623 nova_compute[226235]: 2026-01-31 08:30:40.710 226239 DEBUG oslo_concurrency.lockutils [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:30:40 np0005603623 nova_compute[226235]: 2026-01-31 08:30:40.710 226239 DEBUG oslo_concurrency.lockutils [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:30:40 np0005603623 nova_compute[226235]: 2026-01-31 08:30:40.710 226239 DEBUG nova.network.neutron [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Refreshing network info cache for port 02df5608-7a85-4d54-b5ac-628d6c8e8179 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:30:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:40.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:41 np0005603623 nova_compute[226235]: 2026-01-31 08:30:41.013 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:30:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:41.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:30:41 np0005603623 nova_compute[226235]: 2026-01-31 08:30:41.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:41 np0005603623 nova_compute[226235]: 2026-01-31 08:30:41.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:30:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:42.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:43.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:44 np0005603623 nova_compute[226235]: 2026-01-31 08:30:44.656 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:44.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:45 np0005603623 nova_compute[226235]: 2026-01-31 08:30:45.055 226239 DEBUG nova.network.neutron [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updated VIF entry in instance network info cache for port 02df5608-7a85-4d54-b5ac-628d6c8e8179. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:30:45 np0005603623 nova_compute[226235]: 2026-01-31 08:30:45.055 226239 DEBUG nova.network.neutron [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating instance_info_cache with network_info: [{"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:30:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:30:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:45.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:30:45 np0005603623 nova_compute[226235]: 2026-01-31 08:30:45.261 226239 DEBUG oslo_concurrency.lockutils [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:30:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:30:45Z|00524|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:30:45 np0005603623 nova_compute[226235]: 2026-01-31 08:30:45.608 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:46 np0005603623 nova_compute[226235]: 2026-01-31 08:30:46.015 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:46 np0005603623 nova_compute[226235]: 2026-01-31 08:30:46.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:46 np0005603623 nova_compute[226235]: 2026-01-31 08:30:46.209 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:46 np0005603623 nova_compute[226235]: 2026-01-31 08:30:46.210 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:46 np0005603623 nova_compute[226235]: 2026-01-31 08:30:46.210 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:46 np0005603623 nova_compute[226235]: 2026-01-31 08:30:46.210 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:30:46 np0005603623 nova_compute[226235]: 2026-01-31 08:30:46.211 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:30:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1578304986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:30:46 np0005603623 nova_compute[226235]: 2026-01-31 08:30:46.696 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:30:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:46.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:30:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:30:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:30:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:47.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:47 np0005603623 nova_compute[226235]: 2026-01-31 08:30:47.225 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:30:47 np0005603623 nova_compute[226235]: 2026-01-31 08:30:47.225 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:30:47 np0005603623 nova_compute[226235]: 2026-01-31 08:30:47.374 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:30:47 np0005603623 nova_compute[226235]: 2026-01-31 08:30:47.375 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4234MB free_disk=20.90091323852539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:30:47 np0005603623 nova_compute[226235]: 2026-01-31 08:30:47.375 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:47 np0005603623 nova_compute[226235]: 2026-01-31 08:30:47.376 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:47 np0005603623 nova_compute[226235]: 2026-01-31 08:30:47.991 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance a15175ec-85fd-457c-870b-8a6d7c13c906 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:30:47 np0005603623 nova_compute[226235]: 2026-01-31 08:30:47.992 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:30:47 np0005603623 nova_compute[226235]: 2026-01-31 08:30:47.992 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:30:48 np0005603623 nova_compute[226235]: 2026-01-31 08:30:48.094 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:30:48 np0005603623 nova_compute[226235]: 2026-01-31 08:30:48.132 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:30:48 np0005603623 nova_compute[226235]: 2026-01-31 08:30:48.133 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:30:48 np0005603623 nova_compute[226235]: 2026-01-31 08:30:48.157 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:30:48 np0005603623 nova_compute[226235]: 2026-01-31 08:30:48.180 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:30:48 np0005603623 nova_compute[226235]: 2026-01-31 08:30:48.224 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:30:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4165603544' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:30:48 np0005603623 nova_compute[226235]: 2026-01-31 08:30:48.650 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:48 np0005603623 nova_compute[226235]: 2026-01-31 08:30:48.658 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:30:48 np0005603623 nova_compute[226235]: 2026-01-31 08:30:48.783 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:30:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:30:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:48.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:30:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:30:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:49.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:30:49 np0005603623 nova_compute[226235]: 2026-01-31 08:30:49.243 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:30:49 np0005603623 nova_compute[226235]: 2026-01-31 08:30:49.244 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.868s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:49 np0005603623 nova_compute[226235]: 2026-01-31 08:30:49.658 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:50 np0005603623 nova_compute[226235]: 2026-01-31 08:30:50.244 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:50 np0005603623 nova_compute[226235]: 2026-01-31 08:30:50.245 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:30:50 np0005603623 nova_compute[226235]: 2026-01-31 08:30:50.245 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:30:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:30:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:50.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:30:50 np0005603623 nova_compute[226235]: 2026-01-31 08:30:50.946 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:30:50 np0005603623 nova_compute[226235]: 2026-01-31 08:30:50.947 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:30:50 np0005603623 nova_compute[226235]: 2026-01-31 08:30:50.948 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:30:50 np0005603623 nova_compute[226235]: 2026-01-31 08:30:50.948 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a15175ec-85fd-457c-870b-8a6d7c13c906 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:30:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:30:50Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dd:59:a9 10.100.0.10
Jan 31 03:30:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:30:50Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:59:a9 10.100.0.10
Jan 31 03:30:51 np0005603623 nova_compute[226235]: 2026-01-31 08:30:51.017 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:30:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:51.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:30:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:52.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:53.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:53 np0005603623 nova_compute[226235]: 2026-01-31 08:30:53.528 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating instance_info_cache with network_info: [{"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:30:53 np0005603623 nova_compute[226235]: 2026-01-31 08:30:53.577 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:30:53 np0005603623 nova_compute[226235]: 2026-01-31 08:30:53.577 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:30:53 np0005603623 nova_compute[226235]: 2026-01-31 08:30:53.578 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:53 np0005603623 nova_compute[226235]: 2026-01-31 08:30:53.578 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:53 np0005603623 nova_compute[226235]: 2026-01-31 08:30:53.578 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:53 np0005603623 nova_compute[226235]: 2026-01-31 08:30:53.578 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:53 np0005603623 nova_compute[226235]: 2026-01-31 08:30:53.579 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:53 np0005603623 nova_compute[226235]: 2026-01-31 08:30:53.579 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:30:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:54 np0005603623 nova_compute[226235]: 2026-01-31 08:30:54.659 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:30:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:54.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:30:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:55.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:56 np0005603623 nova_compute[226235]: 2026-01-31 08:30:56.019 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:30:56 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2677523786' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:30:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:30:56 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2677523786' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:30:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:56.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:57.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:57 np0005603623 podman[284887]: 2026-01-31 08:30:57.985407304 +0000 UTC m=+0.069248035 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:30:58 np0005603623 podman[284888]: 2026-01-31 08:30:58.04991403 +0000 UTC m=+0.125931017 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:30:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:58.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:30:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:59.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:59 np0005603623 nova_compute[226235]: 2026-01-31 08:30:59.638 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:59 np0005603623 nova_compute[226235]: 2026-01-31 08:30:59.662 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:00.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:01 np0005603623 nova_compute[226235]: 2026-01-31 08:31:01.022 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:31:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:01.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:31:01 np0005603623 nova_compute[226235]: 2026-01-31 08:31:01.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:02.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:31:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:03.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:31:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:04 np0005603623 nova_compute[226235]: 2026-01-31 08:31:04.666 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:04.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:31:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:05.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:31:06 np0005603623 nova_compute[226235]: 2026-01-31 08:31:06.024 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:06 np0005603623 nova_compute[226235]: 2026-01-31 08:31:06.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:06.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:07.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:08.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:09.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:09 np0005603623 nova_compute[226235]: 2026-01-31 08:31:09.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:09 np0005603623 nova_compute[226235]: 2026-01-31 08:31:09.667 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:10 np0005603623 nova_compute[226235]: 2026-01-31 08:31:10.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:10.691 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:31:10 np0005603623 nova_compute[226235]: 2026-01-31 08:31:10.692 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:10.692 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:31:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:10.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:11 np0005603623 nova_compute[226235]: 2026-01-31 08:31:11.025 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:11.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:12.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:13.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:13 np0005603623 ovn_controller[133449]: 2026-01-31T08:31:13Z|00525|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:31:13 np0005603623 nova_compute[226235]: 2026-01-31 08:31:13.755 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:14 np0005603623 nova_compute[226235]: 2026-01-31 08:31:14.716 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:14.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:15.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:15.694 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:31:16 np0005603623 nova_compute[226235]: 2026-01-31 08:31:16.028 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:31:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:16.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:31:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:17.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:18.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:31:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:19.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:31:19 np0005603623 nova_compute[226235]: 2026-01-31 08:31:19.171 226239 DEBUG oslo_concurrency.lockutils [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:19 np0005603623 nova_compute[226235]: 2026-01-31 08:31:19.171 226239 DEBUG oslo_concurrency.lockutils [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:19 np0005603623 nova_compute[226235]: 2026-01-31 08:31:19.412 226239 DEBUG nova.objects.instance [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'flavor' on Instance uuid a15175ec-85fd-457c-870b-8a6d7c13c906 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:31:19 np0005603623 nova_compute[226235]: 2026-01-31 08:31:19.719 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:20 np0005603623 nova_compute[226235]: 2026-01-31 08:31:20.301 226239 DEBUG oslo_concurrency.lockutils [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 1.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:31:20 np0005603623 nova_compute[226235]: 2026-01-31 08:31:20.312 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:20 np0005603623 nova_compute[226235]: 2026-01-31 08:31:20.313 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:31:20 np0005603623 nova_compute[226235]: 2026-01-31 08:31:20.428 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:31:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:20.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:21 np0005603623 nova_compute[226235]: 2026-01-31 08:31:21.030 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:31:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:21.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:31:21 np0005603623 nova_compute[226235]: 2026-01-31 08:31:21.708 226239 DEBUG oslo_concurrency.lockutils [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:21 np0005603623 nova_compute[226235]: 2026-01-31 08:31:21.708 226239 DEBUG oslo_concurrency.lockutils [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:21 np0005603623 nova_compute[226235]: 2026-01-31 08:31:21.709 226239 INFO nova.compute.manager [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Attaching volume 901896ec-4cee-48ca-89ea-1ef061e9fbf3 to /dev/vdb#033[00m
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.090 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.226 226239 DEBUG os_brick.utils [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.228 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.239 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.239 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac2e120-02ef-4eef-befe-dccb781aaeaa]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.241 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.250 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.251 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[ede13b19-cec6-4a44-ba92-c993fedc9acd]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.253 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.259 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.260 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[0a26dae8-a642-4936-823f-14df41318d35]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.262 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[33927176-e508-4e24-b224-0d3023a6502f]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.264 226239 DEBUG oslo_concurrency.processutils [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.284 226239 DEBUG oslo_concurrency.processutils [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "nvme version" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.286 226239 DEBUG os_brick.initiator.connectors.lightos [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.287 226239 DEBUG os_brick.initiator.connectors.lightos [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.287 226239 DEBUG os_brick.initiator.connectors.lightos [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.287 226239 DEBUG os_brick.utils [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] <== get_connector_properties: return (61ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Jan 31 03:31:22 np0005603623 nova_compute[226235]: 2026-01-31 08:31:22.288 226239 DEBUG nova.virt.block_device [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating existing volume attachment record: c7f45505-bd56-4ce4-b729-5f7effde355f _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Jan 31 03:31:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:31:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:22.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:31:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:23.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:31:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2238504479' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:31:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:24 np0005603623 nova_compute[226235]: 2026-01-31 08:31:24.720 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:31:24 np0005603623 nova_compute[226235]: 2026-01-31 08:31:24.749 226239 DEBUG nova.objects.instance [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'flavor' on Instance uuid a15175ec-85fd-457c-870b-8a6d7c13c906 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:31:24 np0005603623 nova_compute[226235]: 2026-01-31 08:31:24.795 226239 DEBUG nova.virt.libvirt.driver [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Attempting to attach volume 901896ec-4cee-48ca-89ea-1ef061e9fbf3 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168
Jan 31 03:31:24 np0005603623 nova_compute[226235]: 2026-01-31 08:31:24.798 226239 DEBUG nova.virt.libvirt.guest [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:31:24 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:31:24 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-901896ec-4cee-48ca-89ea-1ef061e9fbf3">
Jan 31 03:31:24 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:31:24 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:31:24 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:31:24 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:31:24 np0005603623 nova_compute[226235]:  <auth username="openstack">
Jan 31 03:31:24 np0005603623 nova_compute[226235]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:31:24 np0005603623 nova_compute[226235]:  </auth>
Jan 31 03:31:24 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:31:24 np0005603623 nova_compute[226235]:  <serial>901896ec-4cee-48ca-89ea-1ef061e9fbf3</serial>
Jan 31 03:31:24 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:31:24 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 31 03:31:24 np0005603623 nova_compute[226235]: 2026-01-31 08:31:24.930 226239 DEBUG nova.virt.libvirt.driver [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:31:24 np0005603623 nova_compute[226235]: 2026-01-31 08:31:24.931 226239 DEBUG nova.virt.libvirt.driver [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:31:24 np0005603623 nova_compute[226235]: 2026-01-31 08:31:24.931 226239 DEBUG nova.virt.libvirt.driver [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:31:24 np0005603623 nova_compute[226235]: 2026-01-31 08:31:24.932 226239 DEBUG nova.virt.libvirt.driver [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No VIF found with MAC fa:16:3e:dd:59:a9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 31 03:31:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:31:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:24.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:31:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:25.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:25 np0005603623 nova_compute[226235]: 2026-01-31 08:31:25.762 226239 DEBUG oslo_concurrency.lockutils [None req-810bdee9-edd6-4578-adbe-7eff584a47e9 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 4.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:31:26 np0005603623 nova_compute[226235]: 2026-01-31 08:31:26.032 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:31:26 np0005603623 nova_compute[226235]: 2026-01-31 08:31:26.055 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:31:26 np0005603623 nova_compute[226235]: 2026-01-31 08:31:26.083 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Triggering sync for uuid a15175ec-85fd-457c-870b-8a6d7c13c906 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 31 03:31:26 np0005603623 nova_compute[226235]: 2026-01-31 08:31:26.083 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:31:26 np0005603623 nova_compute[226235]: 2026-01-31 08:31:26.084 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:31:26 np0005603623 nova_compute[226235]: 2026-01-31 08:31:26.139 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:31:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:26.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:27.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:28.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:28 np0005603623 podman[285026]: 2026-01-31 08:31:28.978486891 +0000 UTC m=+0.061839643 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 31 03:31:28 np0005603623 podman[285025]: 2026-01-31 08:31:28.982991953 +0000 UTC m=+0.070641889 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:31:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:29.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:29 np0005603623 nova_compute[226235]: 2026-01-31 08:31:29.759 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:31:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:30.123 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:31:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:30.124 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:31:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:30.124 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:31:30 np0005603623 nova_compute[226235]: 2026-01-31 08:31:30.314 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquiring lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:31:30 np0005603623 nova_compute[226235]: 2026-01-31 08:31:30.314 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:31:30 np0005603623 nova_compute[226235]: 2026-01-31 08:31:30.356 226239 DEBUG nova.compute.manager [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 03:31:30 np0005603623 nova_compute[226235]: 2026-01-31 08:31:30.545 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:31:30 np0005603623 nova_compute[226235]: 2026-01-31 08:31:30.546 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:31:30 np0005603623 nova_compute[226235]: 2026-01-31 08:31:30.552 226239 DEBUG nova.virt.hardware [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:31:30 np0005603623 nova_compute[226235]: 2026-01-31 08:31:30.553 226239 INFO nova.compute.claims [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Claim successful on node compute-2.ctlplane.example.com
Jan 31 03:31:30 np0005603623 nova_compute[226235]: 2026-01-31 08:31:30.761 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:31:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:30.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.035 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:31:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:31.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:31:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3300367585' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.222 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.227 226239 DEBUG nova.compute.provider_tree [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.268 226239 DEBUG nova.scheduler.client.report [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.309 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.763s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.310 226239 DEBUG nova.compute.manager [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.399 226239 DEBUG nova.compute.manager [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.400 226239 DEBUG nova.network.neutron [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.430 226239 INFO nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.455 226239 DEBUG nova.compute.manager [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.664 226239 DEBUG nova.compute.manager [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.665 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.665 226239 INFO nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Creating image(s)
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.686 226239 DEBUG nova.storage.rbd_utils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.714 226239 DEBUG nova.storage.rbd_utils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.742 226239 DEBUG nova.storage.rbd_utils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.746 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.797 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.798 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.799 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.799 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.826 226239 DEBUG nova.storage.rbd_utils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.829 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:31:31 np0005603623 nova_compute[226235]: 2026-01-31 08:31:31.931 226239 DEBUG nova.policy [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd91ac41a8e444974a11ffbef7b04ddb3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b6a7de05649d42c6acb1aa6e6026b2b4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:31:32 np0005603623 nova_compute[226235]: 2026-01-31 08:31:32.616 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.786s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:31:32 np0005603623 nova_compute[226235]: 2026-01-31 08:31:32.682 226239 DEBUG nova.storage.rbd_utils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] resizing rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:31:32 np0005603623 nova_compute[226235]: 2026-01-31 08:31:32.770 226239 DEBUG nova.objects.instance [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e0408c0-b1a7-4079-ba79-3e737fded2ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:31:32 np0005603623 nova_compute[226235]: 2026-01-31 08:31:32.797 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:31:32 np0005603623 nova_compute[226235]: 2026-01-31 08:31:32.797 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Ensure instance console log exists: /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:31:32 np0005603623 nova_compute[226235]: 2026-01-31 08:31:32.798 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:32 np0005603623 nova_compute[226235]: 2026-01-31 08:31:32.798 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:32 np0005603623 nova_compute[226235]: 2026-01-31 08:31:32.798 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:31:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:32.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:33.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:33 np0005603623 nova_compute[226235]: 2026-01-31 08:31:33.195 226239 DEBUG oslo_concurrency.lockutils [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:31:33 np0005603623 nova_compute[226235]: 2026-01-31 08:31:33.195 226239 DEBUG oslo_concurrency.lockutils [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquired lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:31:33 np0005603623 nova_compute[226235]: 2026-01-31 08:31:33.195 226239 DEBUG nova.network.neutron [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:31:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:34 np0005603623 nova_compute[226235]: 2026-01-31 08:31:34.762 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:34.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:31:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:35.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:31:35 np0005603623 nova_compute[226235]: 2026-01-31 08:31:35.505 226239 DEBUG nova.network.neutron [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Successfully created port: 5c8118c5-4238-4d06-99ff-6e0b763563c7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:31:36 np0005603623 nova_compute[226235]: 2026-01-31 08:31:36.037 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:31:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:36.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:31:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:37.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.189 226239 DEBUG nova.network.neutron [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Successfully updated port: 5c8118c5-4238-4d06-99ff-6e0b763563c7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.288 226239 DEBUG nova.network.neutron [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating instance_info_cache with network_info: [{"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.292 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquiring lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.292 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquired lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.293 226239 DEBUG nova.network.neutron [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.331 226239 DEBUG oslo_concurrency.lockutils [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Releasing lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.378 226239 DEBUG nova.compute.manager [req-a4354fe3-cf3d-4c7e-919a-a77153cacc8a req-47f4c2f5-9007-443b-b416-b489f31fada1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-changed-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.378 226239 DEBUG nova.compute.manager [req-a4354fe3-cf3d-4c7e-919a-a77153cacc8a req-47f4c2f5-9007-443b-b416-b489f31fada1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Refreshing instance network info cache due to event network-changed-5c8118c5-4238-4d06-99ff-6e0b763563c7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.379 226239 DEBUG oslo_concurrency.lockutils [req-a4354fe3-cf3d-4c7e-919a-a77153cacc8a req-47f4c2f5-9007-443b-b416-b489f31fada1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.571 226239 DEBUG nova.virt.libvirt.driver [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.572 226239 DEBUG nova.virt.libvirt.volume.remotefs [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Creating file /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/82bb7d6752184e7dac9a2370f6c183fa.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.573 226239 DEBUG oslo_concurrency.processutils [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/82bb7d6752184e7dac9a2370f6c183fa.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.927 226239 DEBUG oslo_concurrency.processutils [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/82bb7d6752184e7dac9a2370f6c183fa.tmp" returned: 1 in 0.354s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.928 226239 DEBUG oslo_concurrency.processutils [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/82bb7d6752184e7dac9a2370f6c183fa.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.928 226239 DEBUG nova.virt.libvirt.volume.remotefs [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Creating directory /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 31 03:31:37 np0005603623 nova_compute[226235]: 2026-01-31 08:31:37.929 226239 DEBUG oslo_concurrency.processutils [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:31:38 np0005603623 nova_compute[226235]: 2026-01-31 08:31:38.117 226239 DEBUG oslo_concurrency.processutils [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:31:38 np0005603623 nova_compute[226235]: 2026-01-31 08:31:38.120 226239 DEBUG nova.virt.libvirt.driver [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:31:38 np0005603623 nova_compute[226235]: 2026-01-31 08:31:38.131 226239 DEBUG nova.network.neutron [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:31:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:38.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:39.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:39 np0005603623 nova_compute[226235]: 2026-01-31 08:31:39.658 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:39 np0005603623 nova_compute[226235]: 2026-01-31 08:31:39.763 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:40.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.083 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:31:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:41.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.190 226239 DEBUG nova.network.neutron [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updating instance_info_cache with network_info: [{"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.358 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Releasing lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.358 226239 DEBUG nova.compute.manager [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Instance network_info: |[{"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.359 226239 DEBUG oslo_concurrency.lockutils [req-a4354fe3-cf3d-4c7e-919a-a77153cacc8a req-47f4c2f5-9007-443b-b416-b489f31fada1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.359 226239 DEBUG nova.network.neutron [req-a4354fe3-cf3d-4c7e-919a-a77153cacc8a req-47f4c2f5-9007-443b-b416-b489f31fada1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Refreshing network info cache for port 5c8118c5-4238-4d06-99ff-6e0b763563c7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.361 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Start _get_guest_xml network_info=[{"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.365 226239 WARNING nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.371 226239 DEBUG nova.virt.libvirt.host [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.372 226239 DEBUG nova.virt.libvirt.host [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.375 226239 DEBUG nova.virt.libvirt.host [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.376 226239 DEBUG nova.virt.libvirt.host [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.377 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.377 226239 DEBUG nova.virt.hardware [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.378 226239 DEBUG nova.virt.hardware [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.378 226239 DEBUG nova.virt.hardware [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.378 226239 DEBUG nova.virt.hardware [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.378 226239 DEBUG nova.virt.hardware [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.379 226239 DEBUG nova.virt.hardware [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.379 226239 DEBUG nova.virt.hardware [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.379 226239 DEBUG nova.virt.hardware [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.379 226239 DEBUG nova.virt.hardware [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.379 226239 DEBUG nova.virt.hardware [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.380 226239 DEBUG nova.virt.hardware [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.383 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:31:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:31:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2524032096' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.834 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.862 226239 DEBUG nova.storage.rbd_utils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:31:41 np0005603623 nova_compute[226235]: 2026-01-31 08:31:41.866 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:31:42 np0005603623 nova_compute[226235]: 2026-01-31 08:31:42.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:42 np0005603623 nova_compute[226235]: 2026-01-31 08:31:42.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:31:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:31:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1435085495' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:31:42 np0005603623 nova_compute[226235]: 2026-01-31 08:31:42.926 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:31:42 np0005603623 nova_compute[226235]: 2026-01-31 08:31:42.927 226239 DEBUG nova.virt.libvirt.vif [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:31:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-989214032',display_name='tempest-ServerRescueTestJSONUnderV235-server-989214032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-989214032',id=131,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b6a7de05649d42c6acb1aa6e6026b2b4',ramdisk_id='',reservation_id='r-muxmeooe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1941698863',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1941698863-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:31:31Z,user_data=None,user_id='d91ac41a8e444974a11ffbef7b04ddb3',uuid=4e0408c0-b1a7-4079-ba79-3e737fded2ea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:31:42 np0005603623 nova_compute[226235]: 2026-01-31 08:31:42.927 226239 DEBUG nova.network.os_vif_util [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Converting VIF {"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:31:42 np0005603623 nova_compute[226235]: 2026-01-31 08:31:42.928 226239 DEBUG nova.network.os_vif_util [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:ba:04,bridge_name='br-int',has_traffic_filtering=True,id=5c8118c5-4238-4d06-99ff-6e0b763563c7,network=Network(1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c8118c5-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:31:42 np0005603623 nova_compute[226235]: 2026-01-31 08:31:42.929 226239 DEBUG nova.objects.instance [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e0408c0-b1a7-4079-ba79-3e737fded2ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:31:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:42.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.062 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  <uuid>4e0408c0-b1a7-4079-ba79-3e737fded2ea</uuid>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  <name>instance-00000083</name>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-989214032</nova:name>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:31:41</nova:creationTime>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <nova:user uuid="d91ac41a8e444974a11ffbef7b04ddb3">tempest-ServerRescueTestJSONUnderV235-1941698863-project-member</nova:user>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <nova:project uuid="b6a7de05649d42c6acb1aa6e6026b2b4">tempest-ServerRescueTestJSONUnderV235-1941698863</nova:project>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <nova:port uuid="5c8118c5-4238-4d06-99ff-6e0b763563c7">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <entry name="serial">4e0408c0-b1a7-4079-ba79-3e737fded2ea</entry>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <entry name="uuid">4e0408c0-b1a7-4079-ba79-3e737fded2ea</entry>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.config">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:fb:ba:04"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <target dev="tap5c8118c5-42"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/console.log" append="off"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:31:43 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:31:43 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:31:43 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:31:43 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.062 226239 DEBUG nova.compute.manager [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Preparing to wait for external event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.062 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquiring lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.062 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.063 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.063 226239 DEBUG nova.virt.libvirt.vif [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:31:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-989214032',display_name='tempest-ServerRescueTestJSONUnderV235-server-989214032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-989214032',id=131,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b6a7de05649d42c6acb1aa6e6026b2b4',ramdisk_id='',reservation_id='r-muxmeooe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1941698863',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1941698863-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:31:31Z,user_data=None,user_id='d91ac41a8e444974a11ffbef7b04ddb3',uuid=4e0408c0-b1a7-4079-ba79-3e737fded2ea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.063 226239 DEBUG nova.network.os_vif_util [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Converting VIF {"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.064 226239 DEBUG nova.network.os_vif_util [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:ba:04,bridge_name='br-int',has_traffic_filtering=True,id=5c8118c5-4238-4d06-99ff-6e0b763563c7,network=Network(1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c8118c5-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.064 226239 DEBUG os_vif [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:ba:04,bridge_name='br-int',has_traffic_filtering=True,id=5c8118c5-4238-4d06-99ff-6e0b763563c7,network=Network(1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c8118c5-42') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.065 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.065 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.065 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.067 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.068 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c8118c5-42, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.068 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5c8118c5-42, col_values=(('external_ids', {'iface-id': '5c8118c5-4238-4d06-99ff-6e0b763563c7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fb:ba:04', 'vm-uuid': '4e0408c0-b1a7-4079-ba79-3e737fded2ea'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.069 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:43 np0005603623 NetworkManager[48970]: <info>  [1769848303.0702] manager: (tap5c8118c5-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.072 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.074 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.075 226239 INFO os_vif [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:ba:04,bridge_name='br-int',has_traffic_filtering=True,id=5c8118c5-4238-4d06-99ff-6e0b763563c7,network=Network(1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c8118c5-42')#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.142 226239 INFO nova.virt.libvirt.driver [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Instance shutdown successfully after 5 seconds.#033[00m
Jan 31 03:31:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:43.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.401 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.402 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.402 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] No VIF found with MAC fa:16:3e:fb:ba:04, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.402 226239 INFO nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Using config drive#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.552 226239 DEBUG nova.storage.rbd_utils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:31:43 np0005603623 kernel: tap02df5608-7a (unregistering): left promiscuous mode
Jan 31 03:31:43 np0005603623 NetworkManager[48970]: <info>  [1769848303.7942] device (tap02df5608-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.794 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:31:43Z|00526|binding|INFO|Releasing lport 02df5608-7a85-4d54-b5ac-628d6c8e8179 from this chassis (sb_readonly=0)
Jan 31 03:31:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:31:43Z|00527|binding|INFO|Setting lport 02df5608-7a85-4d54-b5ac-628d6c8e8179 down in Southbound
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.802 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:31:43Z|00528|binding|INFO|Removing iface tap02df5608-7a ovn-installed in OVS
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.805 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.810 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:43 np0005603623 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000080.scope: Deactivated successfully.
Jan 31 03:31:43 np0005603623 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000080.scope: Consumed 14.612s CPU time.
Jan 31 03:31:43 np0005603623 systemd-machined[194379]: Machine qemu-58-instance-00000080 terminated.
Jan 31 03:31:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:43.850 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:59:a9 10.100.0.10'], port_security=['fa:16:3e:dd:59:a9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a15175ec-85fd-457c-870b-8a6d7c13c906', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44469d8b-ad30-4270-88fa-e67c568f3150', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '953a213fa5cb435ab3c04ad96152685f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5b1bd8ad-0d2a-4d57-a00a-9a6b59df86e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.181'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d972fb9d-6d12-4c1c-b135-704d64887b72, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=02df5608-7a85-4d54-b5ac-628d6c8e8179) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:31:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:43.852 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 02df5608-7a85-4d54-b5ac-628d6c8e8179 in datapath 44469d8b-ad30-4270-88fa-e67c568f3150 unbound from our chassis#033[00m
Jan 31 03:31:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:43.854 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44469d8b-ad30-4270-88fa-e67c568f3150, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:31:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:43.855 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ae1f9011-ba35-4627-aafc-9987357e9f5c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:31:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:43.855 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 namespace which is not needed anymore#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.972 226239 INFO nova.virt.libvirt.driver [-] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Instance destroyed successfully.#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.973 226239 DEBUG nova.virt.libvirt.vif [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:30:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2097097080',display_name='tempest-ServerActionsTestOtherB-server-2097097080',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2097097080',id=128,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:30:35Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-41gbj3yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:31:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=a15175ec-85fd-457c-870b-8a6d7c13c906,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-2130829654-network", "vif_mac": "fa:16:3e:dd:59:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.974 226239 DEBUG nova.network.os_vif_util [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-2130829654-network", "vif_mac": "fa:16:3e:dd:59:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.975 226239 DEBUG nova.network.os_vif_util [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.975 226239 DEBUG os_vif [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.976 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.977 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02df5608-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.978 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.980 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.983 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.984 226239 INFO os_vif [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a')#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.989 226239 DEBUG nova.virt.libvirt.driver [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.989 226239 DEBUG nova.virt.libvirt.driver [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:31:43 np0005603623 nova_compute[226235]: 2026-01-31 08:31:43.990 226239 DEBUG nova.virt.libvirt.driver [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:31:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:44 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[284587]: [NOTICE]   (284591) : haproxy version is 2.8.14-c23fe91
Jan 31 03:31:44 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[284587]: [NOTICE]   (284591) : path to executable is /usr/sbin/haproxy
Jan 31 03:31:44 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[284587]: [WARNING]  (284591) : Exiting Master process...
Jan 31 03:31:44 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[284587]: [WARNING]  (284591) : Exiting Master process...
Jan 31 03:31:44 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[284587]: [ALERT]    (284591) : Current worker (284593) exited with code 143 (Terminated)
Jan 31 03:31:44 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[284587]: [WARNING]  (284591) : All workers exited. Exiting... (0)
Jan 31 03:31:44 np0005603623 systemd[1]: libpod-e38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a.scope: Deactivated successfully.
Jan 31 03:31:44 np0005603623 podman[285422]: 2026-01-31 08:31:44.387963823 +0000 UTC m=+0.462942981 container died e38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:31:44 np0005603623 nova_compute[226235]: 2026-01-31 08:31:44.637 226239 INFO nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Creating config drive at /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/disk.config#033[00m
Jan 31 03:31:44 np0005603623 nova_compute[226235]: 2026-01-31 08:31:44.641 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpr8540d4m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:31:44 np0005603623 nova_compute[226235]: 2026-01-31 08:31:44.762 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpr8540d4m" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:31:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:44.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:45 np0005603623 nova_compute[226235]: 2026-01-31 08:31:45.101 226239 DEBUG nova.storage.rbd_utils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:31:45 np0005603623 nova_compute[226235]: 2026-01-31 08:31:45.105 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/disk.config 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:31:45 np0005603623 nova_compute[226235]: 2026-01-31 08:31:45.127 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:31:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:45.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:31:45 np0005603623 nova_compute[226235]: 2026-01-31 08:31:45.628 226239 DEBUG nova.compute.manager [req-a950222a-26c7-47bf-8341-3dcb75b457ed req-8db48737-7094-4dfd-bae8-61efb5b2eb92 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-unplugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:31:45 np0005603623 nova_compute[226235]: 2026-01-31 08:31:45.628 226239 DEBUG oslo_concurrency.lockutils [req-a950222a-26c7-47bf-8341-3dcb75b457ed req-8db48737-7094-4dfd-bae8-61efb5b2eb92 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:45 np0005603623 nova_compute[226235]: 2026-01-31 08:31:45.629 226239 DEBUG oslo_concurrency.lockutils [req-a950222a-26c7-47bf-8341-3dcb75b457ed req-8db48737-7094-4dfd-bae8-61efb5b2eb92 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:45 np0005603623 nova_compute[226235]: 2026-01-31 08:31:45.629 226239 DEBUG oslo_concurrency.lockutils [req-a950222a-26c7-47bf-8341-3dcb75b457ed req-8db48737-7094-4dfd-bae8-61efb5b2eb92 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:31:45 np0005603623 nova_compute[226235]: 2026-01-31 08:31:45.629 226239 DEBUG nova.compute.manager [req-a950222a-26c7-47bf-8341-3dcb75b457ed req-8db48737-7094-4dfd-bae8-61efb5b2eb92 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] No waiting events found dispatching network-vif-unplugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:31:45 np0005603623 nova_compute[226235]: 2026-01-31 08:31:45.629 226239 WARNING nova.compute.manager [req-a950222a-26c7-47bf-8341-3dcb75b457ed req-8db48737-7094-4dfd-bae8-61efb5b2eb92 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received unexpected event network-vif-unplugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 31 03:31:46 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a-userdata-shm.mount: Deactivated successfully.
Jan 31 03:31:46 np0005603623 systemd[1]: var-lib-containers-storage-overlay-286fcce50259d70be7775964016cd425c6389db63d3507e4d6f7fdb83f6dcbbe-merged.mount: Deactivated successfully.
Jan 31 03:31:46 np0005603623 nova_compute[226235]: 2026-01-31 08:31:46.141 226239 DEBUG neutronclient.v2_0.client [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 02df5608-7a85-4d54-b5ac-628d6c8e8179 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:31:46 np0005603623 nova_compute[226235]: 2026-01-31 08:31:46.181 226239 DEBUG nova.network.neutron [req-a4354fe3-cf3d-4c7e-919a-a77153cacc8a req-47f4c2f5-9007-443b-b416-b489f31fada1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updated VIF entry in instance network info cache for port 5c8118c5-4238-4d06-99ff-6e0b763563c7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:31:46 np0005603623 nova_compute[226235]: 2026-01-31 08:31:46.182 226239 DEBUG nova.network.neutron [req-a4354fe3-cf3d-4c7e-919a-a77153cacc8a req-47f4c2f5-9007-443b-b416-b489f31fada1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updating instance_info_cache with network_info: [{"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:31:46 np0005603623 nova_compute[226235]: 2026-01-31 08:31:46.652 226239 DEBUG oslo_concurrency.lockutils [req-a4354fe3-cf3d-4c7e-919a-a77153cacc8a req-47f4c2f5-9007-443b-b416-b489f31fada1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:31:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:46.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:47.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:47 np0005603623 podman[285422]: 2026-01-31 08:31:47.259859343 +0000 UTC m=+3.334838501 container cleanup e38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 03:31:47 np0005603623 systemd[1]: libpod-conmon-e38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a.scope: Deactivated successfully.
Jan 31 03:31:47 np0005603623 nova_compute[226235]: 2026-01-31 08:31:47.643 226239 DEBUG oslo_concurrency.lockutils [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:47 np0005603623 nova_compute[226235]: 2026-01-31 08:31:47.644 226239 DEBUG oslo_concurrency.lockutils [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:47 np0005603623 nova_compute[226235]: 2026-01-31 08:31:47.644 226239 DEBUG oslo_concurrency.lockutils [None req-aa86097b-9337-407f-9ec4-ac81e4a5ff8b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.026 226239 DEBUG nova.compute.manager [req-38c9c760-c661-40a0-8ae8-850b662454ee req-6edd6131-1104-4be3-a624-5e4d7727359e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.026 226239 DEBUG oslo_concurrency.lockutils [req-38c9c760-c661-40a0-8ae8-850b662454ee req-6edd6131-1104-4be3-a624-5e4d7727359e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.027 226239 DEBUG oslo_concurrency.lockutils [req-38c9c760-c661-40a0-8ae8-850b662454ee req-6edd6131-1104-4be3-a624-5e4d7727359e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.027 226239 DEBUG oslo_concurrency.lockutils [req-38c9c760-c661-40a0-8ae8-850b662454ee req-6edd6131-1104-4be3-a624-5e4d7727359e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.027 226239 DEBUG nova.compute.manager [req-38c9c760-c661-40a0-8ae8-850b662454ee req-6edd6131-1104-4be3-a624-5e4d7727359e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] No waiting events found dispatching network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.027 226239 WARNING nova.compute.manager [req-38c9c760-c661-40a0-8ae8-850b662454ee req-6edd6131-1104-4be3-a624-5e4d7727359e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received unexpected event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.243 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.243 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.244 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.370 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.371 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.371 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.371 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.371 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:31:48 np0005603623 podman[285617]: 2026-01-31 08:31:48.774161132 +0000 UTC m=+1.497975680 container remove e38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:31:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:48.778 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6be7f7a3-b887-4a86-8efd-cedb94fce024]: (4, ('Sat Jan 31 08:31:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 (e38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a)\ne38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a\nSat Jan 31 08:31:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 (e38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a)\ne38f4f8de56aa760fec11f6cb96156ac695981de3bd097825a9ee19d0a6e765a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:31:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:48.780 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[93ada43d-929f-478d-9c46-7f926e8fabb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:31:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:48.781 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44469d8b-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:31:48 np0005603623 kernel: tap44469d8b-a0: left promiscuous mode
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.783 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.792 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:48.794 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2cff9ddd-83f5-4fec-b309-3dfaac7aa6cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:31:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:48.808 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0a750b-fe79-4475-bd2b-b3ce2ae70601]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:31:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:48.809 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[29e61c97-7362-40e8-aad2-21e130f61773]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:31:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:48.821 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d664fea6-4ab6-4c9a-a7b9-6f750042a12a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737312, 'reachable_time': 35453, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285685, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:31:48 np0005603623 systemd[1]: run-netns-ovnmeta\x2d44469d8b\x2dad30\x2d4270\x2d88fa\x2de67c568f3150.mount: Deactivated successfully.
Jan 31 03:31:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:48.824 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:31:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:48.824 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[2217eaae-bc36-407f-b65c-f9857cef9732]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:31:48 np0005603623 nova_compute[226235]: 2026-01-31 08:31:48.979 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:31:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:48.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:31:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:31:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/548329715' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:31:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:49 np0005603623 nova_compute[226235]: 2026-01-31 08:31:49.059 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.687s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:31:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:49.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:49 np0005603623 podman[285723]: 2026-01-31 08:31:49.461773924 +0000 UTC m=+0.493499066 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 31 03:31:49 np0005603623 nova_compute[226235]: 2026-01-31 08:31:49.622 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:31:49 np0005603623 nova_compute[226235]: 2026-01-31 08:31:49.623 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:31:49 np0005603623 nova_compute[226235]: 2026-01-31 08:31:49.623 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:31:49 np0005603623 nova_compute[226235]: 2026-01-31 08:31:49.625 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000083 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:31:49 np0005603623 nova_compute[226235]: 2026-01-31 08:31:49.626 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000083 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:31:49 np0005603623 nova_compute[226235]: 2026-01-31 08:31:49.737 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:31:49 np0005603623 nova_compute[226235]: 2026-01-31 08:31:49.738 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4386MB free_disk=20.824199676513672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:31:49 np0005603623 nova_compute[226235]: 2026-01-31 08:31:49.739 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:49 np0005603623 nova_compute[226235]: 2026-01-31 08:31:49.739 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:49 np0005603623 nova_compute[226235]: 2026-01-31 08:31:49.767 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:49 np0005603623 podman[285723]: 2026-01-31 08:31:49.829867998 +0000 UTC m=+0.861593140 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Jan 31 03:31:49 np0005603623 nova_compute[226235]: 2026-01-31 08:31:49.982 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration for instance a15175ec-85fd-457c-870b-8a6d7c13c906 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.340 226239 INFO nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating resource usage from migration 3a66ac3c-101b-497a-a72d-758b98e95184
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.341 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Starting to track outgoing migration 3a66ac3c-101b-497a-a72d-758b98e95184 with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.368 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration 3a66ac3c-101b-497a-a72d-758b98e95184 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.368 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 4e0408c0-b1a7-4079-ba79-3e737fded2ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.369 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.369 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.394 226239 DEBUG oslo_concurrency.processutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/disk.config 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 5.289s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.394 226239 INFO nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Deleting local config drive /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/disk.config because it was imported into RBD.
Jan 31 03:31:50 np0005603623 NetworkManager[48970]: <info>  [1769848310.4454] manager: (tap5c8118c5-42): new Tun device (/org/freedesktop/NetworkManager/Devices/255)
Jan 31 03:31:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:31:50Z|00529|binding|INFO|Claiming lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 for this chassis.
Jan 31 03:31:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:31:50Z|00530|binding|INFO|5c8118c5-4238-4d06-99ff-6e0b763563c7: Claiming fa:16:3e:fb:ba:04 10.100.0.3
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.451 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:31:50 np0005603623 kernel: tap5c8118c5-42: entered promiscuous mode
Jan 31 03:31:50 np0005603623 systemd-udevd[285686]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:31:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:31:50Z|00531|binding|INFO|Setting lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 ovn-installed in OVS
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.460 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.464 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:31:50 np0005603623 NetworkManager[48970]: <info>  [1769848310.4653] device (tap5c8118c5-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:31:50 np0005603623 NetworkManager[48970]: <info>  [1769848310.4658] device (tap5c8118c5-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:31:50 np0005603623 systemd-machined[194379]: New machine qemu-59-instance-00000083.
Jan 31 03:31:50 np0005603623 systemd[1]: Started Virtual Machine qemu-59-instance-00000083.
Jan 31 03:31:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:31:50Z|00532|binding|INFO|Setting lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 up in Southbound
Jan 31 03:31:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:50.535 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:ba:04 10.100.0.3'], port_security=['fa:16:3e:fb:ba:04 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '4e0408c0-b1a7-4079-ba79-3e737fded2ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b6a7de05649d42c6acb1aa6e6026b2b4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8a3bb86b-ceb7-476c-96e2-ee30ea8ecd63', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41049c6c-e208-4c6a-ad10-15df89677733, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=5c8118c5-4238-4d06-99ff-6e0b763563c7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:31:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:50.537 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 5c8118c5-4238-4d06-99ff-6e0b763563c7 in datapath 1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7 bound to our chassis
Jan 31 03:31:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:50.538 143258 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 31 03:31:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:31:50.539 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1db96900-9fee-4559-af94-0718e87678b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.548 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.847 226239 DEBUG nova.compute.manager [req-5548df7c-22d0-4646-a4a2-fbccb70fd5ec req-e951292e-cc6b-407a-95c1-3df387e2a1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-changed-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.848 226239 DEBUG nova.compute.manager [req-5548df7c-22d0-4646-a4a2-fbccb70fd5ec req-e951292e-cc6b-407a-95c1-3df387e2a1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Refreshing instance network info cache due to event network-changed-02df5608-7a85-4d54-b5ac-628d6c8e8179. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.848 226239 DEBUG oslo_concurrency.lockutils [req-5548df7c-22d0-4646-a4a2-fbccb70fd5ec req-e951292e-cc6b-407a-95c1-3df387e2a1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.848 226239 DEBUG oslo_concurrency.lockutils [req-5548df7c-22d0-4646-a4a2-fbccb70fd5ec req-e951292e-cc6b-407a-95c1-3df387e2a1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:31:50 np0005603623 nova_compute[226235]: 2026-01-31 08:31:50.848 226239 DEBUG nova.network.neutron [req-5548df7c-22d0-4646-a4a2-fbccb70fd5ec req-e951292e-cc6b-407a-95c1-3df387e2a1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Refreshing network info cache for port 02df5608-7a85-4d54-b5ac-628d6c8e8179 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:31:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:31:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:50.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:31:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:31:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2559064441' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:31:51 np0005603623 nova_compute[226235]: 2026-01-31 08:31:51.145 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.597s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:31:51 np0005603623 nova_compute[226235]: 2026-01-31 08:31:51.150 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:31:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:51.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:51 np0005603623 podman[285964]: 2026-01-31 08:31:51.302210276 +0000 UTC m=+0.529015975 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:31:51 np0005603623 podman[285964]: 2026-01-31 08:31:51.615798668 +0000 UTC m=+0.842604347 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:31:51 np0005603623 nova_compute[226235]: 2026-01-31 08:31:51.855 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:31:52 np0005603623 nova_compute[226235]: 2026-01-31 08:31:52.433 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:31:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:31:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:52.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.097 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848313.0969744, 4e0408c0-b1a7-4079-ba79-3e737fded2ea => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.098 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] VM Started (Lifecycle Event)
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.138 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.138 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.399s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:31:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:53.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.388 226239 DEBUG nova.compute.manager [req-255cd009-a77d-425d-8255-6dbca71a095a req-1e6f4b26-5185-4e5e-97da-5fce927bc810 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.389 226239 DEBUG oslo_concurrency.lockutils [req-255cd009-a77d-425d-8255-6dbca71a095a req-1e6f4b26-5185-4e5e-97da-5fce927bc810 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.389 226239 DEBUG oslo_concurrency.lockutils [req-255cd009-a77d-425d-8255-6dbca71a095a req-1e6f4b26-5185-4e5e-97da-5fce927bc810 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.389 226239 DEBUG oslo_concurrency.lockutils [req-255cd009-a77d-425d-8255-6dbca71a095a req-1e6f4b26-5185-4e5e-97da-5fce927bc810 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.389 226239 DEBUG nova.compute.manager [req-255cd009-a77d-425d-8255-6dbca71a095a req-1e6f4b26-5185-4e5e-97da-5fce927bc810 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Processing event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.390 226239 DEBUG nova.compute.manager [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.393 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.397 226239 INFO nova.virt.libvirt.driver [-] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Instance spawned successfully.
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.397 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.403 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.406 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.587 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.588 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.588 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.589 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.589 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.590 226239 DEBUG nova.virt.libvirt.driver [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.594 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.594 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848313.0977874, 4e0408c0-b1a7-4079-ba79-3e737fded2ea => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.595 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] VM Paused (Lifecycle Event)
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.936 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.940 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848313.392517, 4e0408c0-b1a7-4079-ba79-3e737fded2ea => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.940 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] VM Resumed (Lifecycle Event)
Jan 31 03:31:53 np0005603623 nova_compute[226235]: 2026-01-31 08:31:53.980 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:31:54 np0005603623 nova_compute[226235]: 2026-01-31 08:31:54.048 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:31:54 np0005603623 nova_compute[226235]: 2026-01-31 08:31:54.048 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:31:54 np0005603623 nova_compute[226235]: 2026-01-31 08:31:54.049 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:31:54 np0005603623 nova_compute[226235]: 2026-01-31 08:31:54.049 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:31:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:54 np0005603623 nova_compute[226235]: 2026-01-31 08:31:54.151 226239 INFO nova.compute.manager [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Took 22.49 seconds to spawn the instance on the hypervisor.
Jan 31 03:31:54 np0005603623 nova_compute[226235]: 2026-01-31 08:31:54.152 226239 DEBUG nova.compute.manager [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:31:54 np0005603623 podman[286072]: 2026-01-31 08:31:54.287233993 +0000 UTC m=+0.665238733 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, io.openshift.expose-services=, name=keepalived, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph.)
Jan 31 03:31:54 np0005603623 nova_compute[226235]: 2026-01-31 08:31:54.375 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:31:54 np0005603623 nova_compute[226235]: 2026-01-31 08:31:54.379 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:31:54 np0005603623 nova_compute[226235]: 2026-01-31 08:31:54.541 226239 INFO nova.compute.manager [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Took 24.07 seconds to build instance.
Jan 31 03:31:54 np0005603623 podman[286072]: 2026-01-31 08:31:54.713373892 +0000 UTC m=+1.091378612 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=2.2.4, name=keepalived, release=1793, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, distribution-scope=public, io.buildah.version=1.28.2)
Jan 31 03:31:54 np0005603623 nova_compute[226235]: 2026-01-31 08:31:54.769 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:31:54 np0005603623 nova_compute[226235]: 2026-01-31 08:31:54.889 226239 DEBUG oslo_concurrency.lockutils [None req-5b465e99-0d03-4876-9616-58bb29a04f38 d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:31:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:54.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:55.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:56 np0005603623 nova_compute[226235]: 2026-01-31 08:31:56.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:31:56 np0005603623 nova_compute[226235]: 2026-01-31 08:31:56.164 226239 DEBUG nova.network.neutron [req-5548df7c-22d0-4646-a4a2-fbccb70fd5ec req-e951292e-cc6b-407a-95c1-3df387e2a1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updated VIF entry in instance network info cache for port 02df5608-7a85-4d54-b5ac-628d6c8e8179. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 03:31:56 np0005603623 nova_compute[226235]: 2026-01-31 08:31:56.165 226239 DEBUG nova.network.neutron [req-5548df7c-22d0-4646-a4a2-fbccb70fd5ec req-e951292e-cc6b-407a-95c1-3df387e2a1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating instance_info_cache with network_info: [{"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:31:56 np0005603623 nova_compute[226235]: 2026-01-31 08:31:56.426 226239 DEBUG oslo_concurrency.lockutils [req-5548df7c-22d0-4646-a4a2-fbccb70fd5ec req-e951292e-cc6b-407a-95c1-3df387e2a1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:31:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:56.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:57.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:31:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:31:57 np0005603623 nova_compute[226235]: 2026-01-31 08:31:57.733 226239 DEBUG nova.compute.manager [req-fe60cf61-cbf1-4f9a-b975-6a6f8f0ec2e7 req-0c21264c-1c6f-4324-a4db-7344dba915c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:31:57 np0005603623 nova_compute[226235]: 2026-01-31 08:31:57.734 226239 DEBUG oslo_concurrency.lockutils [req-fe60cf61-cbf1-4f9a-b975-6a6f8f0ec2e7 req-0c21264c-1c6f-4324-a4db-7344dba915c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:31:57 np0005603623 nova_compute[226235]: 2026-01-31 08:31:57.734 226239 DEBUG oslo_concurrency.lockutils [req-fe60cf61-cbf1-4f9a-b975-6a6f8f0ec2e7 req-0c21264c-1c6f-4324-a4db-7344dba915c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:31:57 np0005603623 nova_compute[226235]: 2026-01-31 08:31:57.735 226239 DEBUG oslo_concurrency.lockutils [req-fe60cf61-cbf1-4f9a-b975-6a6f8f0ec2e7 req-0c21264c-1c6f-4324-a4db-7344dba915c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:31:57 np0005603623 nova_compute[226235]: 2026-01-31 08:31:57.735 226239 DEBUG nova.compute.manager [req-fe60cf61-cbf1-4f9a-b975-6a6f8f0ec2e7 req-0c21264c-1c6f-4324-a4db-7344dba915c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] No waiting events found dispatching network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:31:57 np0005603623 nova_compute[226235]: 2026-01-31 08:31:57.735 226239 WARNING nova.compute.manager [req-fe60cf61-cbf1-4f9a-b975-6a6f8f0ec2e7 req-0c21264c-1c6f-4324-a4db-7344dba915c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received unexpected event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 for instance with vm_state active and task_state None.
Jan 31 03:31:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:31:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:31:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:31:58 np0005603623 nova_compute[226235]: 2026-01-31 08:31:58.971 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848303.9695477, a15175ec-85fd-457c-870b-8a6d7c13c906 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:31:58 np0005603623 nova_compute[226235]: 2026-01-31 08:31:58.972 226239 INFO nova.compute.manager [-] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] VM Stopped (Lifecycle Event)
Jan 31 03:31:58 np0005603623 nova_compute[226235]: 2026-01-31 08:31:58.983 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:31:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:31:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:58.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:31:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:31:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:59.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:59 np0005603623 nova_compute[226235]: 2026-01-31 08:31:59.176 226239 DEBUG nova.compute.manager [None req-995e493d-f9d3-48ec-9b5c-8a0193f611f3 - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:31:59 np0005603623 nova_compute[226235]: 2026-01-31 08:31:59.180 226239 DEBUG nova.compute.manager [None req-995e493d-f9d3-48ec-9b5c-8a0193f611f3 - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: resize_migrated, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:31:59 np0005603623 nova_compute[226235]: 2026-01-31 08:31:59.305 226239 INFO nova.compute.manager [None req-995e493d-f9d3-48ec-9b5c-8a0193f611f3 - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com
Jan 31 03:31:59 np0005603623 nova_compute[226235]: 2026-01-31 08:31:59.803 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:31:59 np0005603623 podman[286240]: 2026-01-31 08:31:59.963193654 +0000 UTC m=+0.053945317 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127)
Jan 31 03:31:59 np0005603623 podman[286241]: 2026-01-31 08:31:59.986310956 +0000 UTC m=+0.076668027 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:32:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:01.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:01.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:01 np0005603623 nova_compute[226235]: 2026-01-31 08:32:01.564 226239 INFO nova.compute.manager [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Rescuing
Jan 31 03:32:01 np0005603623 nova_compute[226235]: 2026-01-31 08:32:01.565 226239 DEBUG oslo_concurrency.lockutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquiring lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:32:01 np0005603623 nova_compute[226235]: 2026-01-31 08:32:01.565 226239 DEBUG oslo_concurrency.lockutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquired lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:32:01 np0005603623 nova_compute[226235]: 2026-01-31 08:32:01.565 226239 DEBUG nova.network.neutron [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:32:02 np0005603623 nova_compute[226235]: 2026-01-31 08:32:02.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:32:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:03.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:32:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:03.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:32:03 np0005603623 nova_compute[226235]: 2026-01-31 08:32:03.984 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:32:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:04 np0005603623 nova_compute[226235]: 2026-01-31 08:32:04.805 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:32:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e313 e313: 3 total, 3 up, 3 in
Jan 31 03:32:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:05.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:05.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:06 np0005603623 nova_compute[226235]: 2026-01-31 08:32:06.157 226239 DEBUG nova.network.neutron [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updating instance_info_cache with network_info: [{"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:32:06 np0005603623 nova_compute[226235]: 2026-01-31 08:32:06.289 226239 DEBUG oslo_concurrency.lockutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Releasing lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:32:07 np0005603623 nova_compute[226235]: 2026-01-31 08:32:06.999 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 31 03:32:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:07.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:07.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:08 np0005603623 nova_compute[226235]: 2026-01-31 08:32:08.378 226239 DEBUG nova.compute.manager [req-41d27d7b-3a89-4b76-a82e-5d6c3fb783c6 req-fd777b72-a437-4c15-9e5c-72dec2d22f74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:32:08 np0005603623 nova_compute[226235]: 2026-01-31 08:32:08.379 226239 DEBUG oslo_concurrency.lockutils [req-41d27d7b-3a89-4b76-a82e-5d6c3fb783c6 req-fd777b72-a437-4c15-9e5c-72dec2d22f74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:32:08 np0005603623 nova_compute[226235]: 2026-01-31 08:32:08.379 226239 DEBUG oslo_concurrency.lockutils [req-41d27d7b-3a89-4b76-a82e-5d6c3fb783c6 req-fd777b72-a437-4c15-9e5c-72dec2d22f74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:32:08 np0005603623 nova_compute[226235]: 2026-01-31 08:32:08.380 226239 DEBUG oslo_concurrency.lockutils [req-41d27d7b-3a89-4b76-a82e-5d6c3fb783c6 req-fd777b72-a437-4c15-9e5c-72dec2d22f74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:32:08 np0005603623 nova_compute[226235]: 2026-01-31 08:32:08.380 226239 DEBUG nova.compute.manager [req-41d27d7b-3a89-4b76-a82e-5d6c3fb783c6 req-fd777b72-a437-4c15-9e5c-72dec2d22f74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] No waiting events found dispatching network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:32:08 np0005603623 nova_compute[226235]: 2026-01-31 08:32:08.380 226239 WARNING nova.compute.manager [req-41d27d7b-3a89-4b76-a82e-5d6c3fb783c6 req-fd777b72-a437-4c15-9e5c-72dec2d22f74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received unexpected event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 for instance with vm_state active and task_state resize_finish.
Jan 31 03:32:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:32:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:32:08 np0005603623 nova_compute[226235]: 2026-01-31 08:32:08.987 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:32:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:09.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:09.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:09 np0005603623 nova_compute[226235]: 2026-01-31 08:32:09.806 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:32:11 np0005603623 nova_compute[226235]: 2026-01-31 08:32:11.006 226239 DEBUG nova.compute.manager [req-f7bbbae5-31fd-46dd-a61e-77e896054b84 req-c217e6a1-018d-412c-8962-bcf30f321d35 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:32:11 np0005603623 nova_compute[226235]: 2026-01-31 08:32:11.007 226239 DEBUG oslo_concurrency.lockutils [req-f7bbbae5-31fd-46dd-a61e-77e896054b84 req-c217e6a1-018d-412c-8962-bcf30f321d35 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:32:11 np0005603623 nova_compute[226235]: 2026-01-31 08:32:11.007 226239 DEBUG oslo_concurrency.lockutils [req-f7bbbae5-31fd-46dd-a61e-77e896054b84 req-c217e6a1-018d-412c-8962-bcf30f321d35 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:32:11 np0005603623 nova_compute[226235]: 2026-01-31 08:32:11.007 226239 DEBUG oslo_concurrency.lockutils [req-f7bbbae5-31fd-46dd-a61e-77e896054b84 req-c217e6a1-018d-412c-8962-bcf30f321d35 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:32:11 np0005603623 nova_compute[226235]: 2026-01-31 08:32:11.007 226239 DEBUG nova.compute.manager [req-f7bbbae5-31fd-46dd-a61e-77e896054b84 req-c217e6a1-018d-412c-8962-bcf30f321d35 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] No waiting events found dispatching network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:32:11 np0005603623 nova_compute[226235]: 2026-01-31 08:32:11.007 226239 WARNING nova.compute.manager [req-f7bbbae5-31fd-46dd-a61e-77e896054b84 req-c217e6a1-018d-412c-8962-bcf30f321d35 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received unexpected event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 for instance with vm_state active and task_state resize_finish.
Jan 31 03:32:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:11.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:11 np0005603623 nova_compute[226235]: 2026-01-31 08:32:11.084 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:11.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:12.177 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:32:12 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:12.178 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:32:12 np0005603623 nova_compute[226235]: 2026-01-31 08:32:12.178 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:13.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:13.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:13 np0005603623 nova_compute[226235]: 2026-01-31 08:32:13.989 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:14 np0005603623 nova_compute[226235]: 2026-01-31 08:32:14.807 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:15.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:15.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:17.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:17 np0005603623 nova_compute[226235]: 2026-01-31 08:32:17.040 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 03:32:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:17.181 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:17.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:18 np0005603623 nova_compute[226235]: 2026-01-31 08:32:18.991 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:19.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:19.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:19 np0005603623 nova_compute[226235]: 2026-01-31 08:32:19.810 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:19 np0005603623 nova_compute[226235]: 2026-01-31 08:32:19.823 226239 DEBUG nova.compute.manager [req-3c72f3db-0840-4239-a18e-855945238e68 req-95d9a412-4d08-468a-8b26-59b82390c244 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-unplugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:19 np0005603623 nova_compute[226235]: 2026-01-31 08:32:19.824 226239 DEBUG oslo_concurrency.lockutils [req-3c72f3db-0840-4239-a18e-855945238e68 req-95d9a412-4d08-468a-8b26-59b82390c244 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:19 np0005603623 nova_compute[226235]: 2026-01-31 08:32:19.824 226239 DEBUG oslo_concurrency.lockutils [req-3c72f3db-0840-4239-a18e-855945238e68 req-95d9a412-4d08-468a-8b26-59b82390c244 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:19 np0005603623 nova_compute[226235]: 2026-01-31 08:32:19.825 226239 DEBUG oslo_concurrency.lockutils [req-3c72f3db-0840-4239-a18e-855945238e68 req-95d9a412-4d08-468a-8b26-59b82390c244 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:19 np0005603623 nova_compute[226235]: 2026-01-31 08:32:19.825 226239 DEBUG nova.compute.manager [req-3c72f3db-0840-4239-a18e-855945238e68 req-95d9a412-4d08-468a-8b26-59b82390c244 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] No waiting events found dispatching network-vif-unplugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:32:19 np0005603623 nova_compute[226235]: 2026-01-31 08:32:19.825 226239 WARNING nova.compute.manager [req-3c72f3db-0840-4239-a18e-855945238e68 req-95d9a412-4d08-468a-8b26-59b82390c244 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received unexpected event network-vif-unplugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:32:19 np0005603623 nova_compute[226235]: 2026-01-31 08:32:19.921 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:21.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:21.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:22 np0005603623 nova_compute[226235]: 2026-01-31 08:32:22.768 226239 DEBUG nova.compute.manager [req-8891d946-9608-4206-9af9-04a676c1a601 req-e331167b-71ad-4455-91c6-eb26b0d16f2c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:22 np0005603623 nova_compute[226235]: 2026-01-31 08:32:22.769 226239 DEBUG oslo_concurrency.lockutils [req-8891d946-9608-4206-9af9-04a676c1a601 req-e331167b-71ad-4455-91c6-eb26b0d16f2c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:22 np0005603623 nova_compute[226235]: 2026-01-31 08:32:22.769 226239 DEBUG oslo_concurrency.lockutils [req-8891d946-9608-4206-9af9-04a676c1a601 req-e331167b-71ad-4455-91c6-eb26b0d16f2c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:22 np0005603623 nova_compute[226235]: 2026-01-31 08:32:22.769 226239 DEBUG oslo_concurrency.lockutils [req-8891d946-9608-4206-9af9-04a676c1a601 req-e331167b-71ad-4455-91c6-eb26b0d16f2c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:22 np0005603623 nova_compute[226235]: 2026-01-31 08:32:22.769 226239 DEBUG nova.compute.manager [req-8891d946-9608-4206-9af9-04a676c1a601 req-e331167b-71ad-4455-91c6-eb26b0d16f2c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] No waiting events found dispatching network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:32:22 np0005603623 nova_compute[226235]: 2026-01-31 08:32:22.769 226239 WARNING nova.compute.manager [req-8891d946-9608-4206-9af9-04a676c1a601 req-e331167b-71ad-4455-91c6-eb26b0d16f2c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received unexpected event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:32:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:23.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:23 np0005603623 nova_compute[226235]: 2026-01-31 08:32:23.160 226239 INFO nova.compute.manager [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Swapping old allocation on dict_keys(['492dc482-9d1e-49ca-87f3-0104a8508b72']) held by migration 3a66ac3c-101b-497a-a72d-758b98e95184 for instance#033[00m
Jan 31 03:32:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:23.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:23 np0005603623 nova_compute[226235]: 2026-01-31 08:32:23.230 226239 DEBUG nova.scheduler.client.report [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Overwriting current allocation {'allocations': {'d7116329-87c2-469a-b33a-1e01daf74ceb': {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}, 'generation': 63}}, 'project_id': '953a213fa5cb435ab3c04ad96152685f', 'user_id': 'ef51681d234a4abc88ff433d0640b6e7', 'consumer_generation': 1} on consumer a15175ec-85fd-457c-870b-8a6d7c13c906 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 31 03:32:23 np0005603623 nova_compute[226235]: 2026-01-31 08:32:23.711 226239 INFO nova.network.neutron [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating port 02df5608-7a85-4d54-b5ac-628d6c8e8179 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 31 03:32:23 np0005603623 nova_compute[226235]: 2026-01-31 08:32:23.993 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:24 np0005603623 nova_compute[226235]: 2026-01-31 08:32:24.811 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:25.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:32:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:25.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:32:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:32:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.0 total, 600.0 interval#012Cumulative writes: 11K writes, 57K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.12 GB, 0.03 MB/s#012Cumulative WAL: 11K writes, 11K syncs, 1.00 writes per sync, written: 0.12 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1556 writes, 7368 keys, 1556 commit groups, 1.0 writes per commit group, ingest: 16.35 MB, 0.03 MB/s#012Interval WAL: 1556 writes, 1556 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     37.8      1.85              0.17        34    0.054       0      0       0.0       0.0#012  L6      1/0    9.82 MB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   4.6     56.8     47.9      6.68              0.70        33    0.203    207K    18K       0.0       0.0#012 Sum      1/0    9.82 MB   0.0      0.4     0.1      0.3       0.4      0.1       0.0   5.6     44.5     45.7      8.53              0.87        67    0.127    207K    18K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   6.8     26.3     26.6      2.09              0.12         8    0.262     33K   2114       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   0.0     56.8     47.9      6.68              0.70        33    0.203    207K    18K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     37.9      1.85              0.17        33    0.056       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4200.0 total, 600.0 interval#012Flush(GB): cumulative 0.068, interval 0.008#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.38 GB write, 0.09 MB/s write, 0.37 GB read, 0.09 MB/s read, 8.5 seconds#012Interval compaction: 0.05 GB write, 0.09 MB/s write, 0.05 GB read, 0.09 MB/s read, 2.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 41.72 MB table_size: 0 occupancy: 18446744073709551615 collections: 8 last_copies: 0 last_secs: 0.000234 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2387,40.16 MB,13.209%) FilterBlock(67,581.30 KB,0.186734%) IndexBlock(67,1019.58 KB,0.327527%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 03:32:26 np0005603623 nova_compute[226235]: 2026-01-31 08:32:26.457 226239 DEBUG oslo_concurrency.lockutils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:32:26 np0005603623 nova_compute[226235]: 2026-01-31 08:32:26.457 226239 DEBUG oslo_concurrency.lockutils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquired lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:32:26 np0005603623 nova_compute[226235]: 2026-01-31 08:32:26.457 226239 DEBUG nova.network.neutron [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:32:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:27.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:27.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:27 np0005603623 nova_compute[226235]: 2026-01-31 08:32:27.719 226239 DEBUG nova.compute.manager [req-b3d048e7-a622-4fa9-958f-2eab5483d904 req-8a47665f-bec5-4725-a079-281023b9a616 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-changed-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:27 np0005603623 nova_compute[226235]: 2026-01-31 08:32:27.720 226239 DEBUG nova.compute.manager [req-b3d048e7-a622-4fa9-958f-2eab5483d904 req-8a47665f-bec5-4725-a079-281023b9a616 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Refreshing instance network info cache due to event network-changed-02df5608-7a85-4d54-b5ac-628d6c8e8179. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:32:27 np0005603623 nova_compute[226235]: 2026-01-31 08:32:27.720 226239 DEBUG oslo_concurrency.lockutils [req-b3d048e7-a622-4fa9-958f-2eab5483d904 req-8a47665f-bec5-4725-a079-281023b9a616 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:32:28 np0005603623 nova_compute[226235]: 2026-01-31 08:32:28.085 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 03:32:28 np0005603623 nova_compute[226235]: 2026-01-31 08:32:28.996 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:29.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:29 np0005603623 nova_compute[226235]: 2026-01-31 08:32:29.092 226239 INFO nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Instance shutdown successfully after 22 seconds.#033[00m
Jan 31 03:32:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:29.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:29 np0005603623 nova_compute[226235]: 2026-01-31 08:32:29.814 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:29 np0005603623 nova_compute[226235]: 2026-01-31 08:32:29.895 226239 DEBUG nova.network.neutron [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating instance_info_cache with network_info: [{"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:32:29 np0005603623 kernel: tap5c8118c5-42 (unregistering): left promiscuous mode
Jan 31 03:32:29 np0005603623 NetworkManager[48970]: <info>  [1769848349.9205] device (tap5c8118c5-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:32:29 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:29Z|00533|binding|INFO|Releasing lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 from this chassis (sb_readonly=0)
Jan 31 03:32:29 np0005603623 nova_compute[226235]: 2026-01-31 08:32:29.928 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:29 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:29Z|00534|binding|INFO|Setting lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 down in Southbound
Jan 31 03:32:29 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:29Z|00535|binding|INFO|Removing iface tap5c8118c5-42 ovn-installed in OVS
Jan 31 03:32:29 np0005603623 nova_compute[226235]: 2026-01-31 08:32:29.936 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:29 np0005603623 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000083.scope: Deactivated successfully.
Jan 31 03:32:29 np0005603623 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000083.scope: Consumed 13.283s CPU time.
Jan 31 03:32:29 np0005603623 systemd-machined[194379]: Machine qemu-59-instance-00000083 terminated.
Jan 31 03:32:30 np0005603623 podman[286441]: 2026-01-31 08:32:30.077273374 +0000 UTC m=+0.080444875 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 03:32:30 np0005603623 podman[286464]: 2026-01-31 08:32:30.092705826 +0000 UTC m=+0.069672988 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:32:30 np0005603623 kernel: tap5c8118c5-42: entered promiscuous mode
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.109 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:30 np0005603623 kernel: tap5c8118c5-42 (unregistering): left promiscuous mode
Jan 31 03:32:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:30Z|00536|if_status|INFO|Dropped 2 log messages in last 1107 seconds (most recently, 1107 seconds ago) due to excessive rate
Jan 31 03:32:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:30Z|00537|if_status|INFO|Not updating pb chassis for 5c8118c5-4238-4d06-99ff-6e0b763563c7 now as sb is readonly
Jan 31 03:32:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:30Z|00538|binding|INFO|Claiming lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 for this chassis.
Jan 31 03:32:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:30Z|00539|binding|INFO|5c8118c5-4238-4d06-99ff-6e0b763563c7: Claiming fa:16:3e:fb:ba:04 10.100.0.3
Jan 31 03:32:30 np0005603623 NetworkManager[48970]: <info>  [1769848350.1123] manager: (tap5c8118c5-42): new Tun device (/org/freedesktop/NetworkManager/Devices/256)
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.112 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:ba:04 10.100.0.3'], port_security=['fa:16:3e:fb:ba:04 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '4e0408c0-b1a7-4079-ba79-3e737fded2ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b6a7de05649d42c6acb1aa6e6026b2b4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8a3bb86b-ceb7-476c-96e2-ee30ea8ecd63', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41049c6c-e208-4c6a-ad10-15df89677733, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=5c8118c5-4238-4d06-99ff-6e0b763563c7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.113 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 5c8118c5-4238-4d06-99ff-6e0b763563c7 in datapath 1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7 unbound from our chassis#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.114 143258 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.115 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2c8f9d-df40-4a1a-9834-6d74d87cff3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:30Z|00540|binding|INFO|Setting lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 ovn-installed in OVS
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.120 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:30Z|00541|if_status|INFO|Dropped 6 log messages in last 1523 seconds (most recently, 1523 seconds ago) due to excessive rate
Jan 31 03:32:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:30Z|00542|if_status|INFO|Not setting lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 down as sb is readonly
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.123 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.124 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.125 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.125 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.134 226239 INFO nova.virt.libvirt.driver [-] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Instance destroyed successfully.#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.134 226239 DEBUG nova.objects.instance [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lazy-loading 'numa_topology' on Instance uuid 4e0408c0-b1a7-4079-ba79-3e737fded2ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:30Z|00543|binding|INFO|Releasing lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 from this chassis (sb_readonly=0)
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.141 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:ba:04 10.100.0.3'], port_security=['fa:16:3e:fb:ba:04 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '4e0408c0-b1a7-4079-ba79-3e737fded2ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b6a7de05649d42c6acb1aa6e6026b2b4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8a3bb86b-ceb7-476c-96e2-ee30ea8ecd63', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41049c6c-e208-4c6a-ad10-15df89677733, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=5c8118c5-4238-4d06-99ff-6e0b763563c7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.142 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 5c8118c5-4238-4d06-99ff-6e0b763563c7 in datapath 1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7 bound to our chassis#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.143 143258 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.144 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[deb8b6cb-d927-4676-8b12-2c5cdba65075]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.147 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.288 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:ba:04 10.100.0.3'], port_security=['fa:16:3e:fb:ba:04 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '4e0408c0-b1a7-4079-ba79-3e737fded2ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b6a7de05649d42c6acb1aa6e6026b2b4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8a3bb86b-ceb7-476c-96e2-ee30ea8ecd63', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41049c6c-e208-4c6a-ad10-15df89677733, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=5c8118c5-4238-4d06-99ff-6e0b763563c7) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.289 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 5c8118c5-4238-4d06-99ff-6e0b763563c7 in datapath 1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7 unbound from our chassis#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.290 143258 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 31 03:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:30.291 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c90a968e-84ed-42c2-99e2-379abea41177]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.468 226239 DEBUG oslo_concurrency.lockutils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Releasing lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.469 226239 DEBUG os_brick.utils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.470 226239 DEBUG oslo_concurrency.lockutils [req-b3d048e7-a622-4fa9-958f-2eab5483d904 req-8a47665f-bec5-4725-a079-281023b9a616 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.470 226239 DEBUG nova.network.neutron [req-b3d048e7-a622-4fa9-958f-2eab5483d904 req-8a47665f-bec5-4725-a079-281023b9a616 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Refreshing network info cache for port 02df5608-7a85-4d54-b5ac-628d6c8e8179 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.470 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.475 226239 INFO nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Attempting rescue#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.476 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.479 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.479 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[a97bea0f-6e46-4785-b5a3-55f1ef7df70e]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.481 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.482 226239 INFO nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Creating image(s)#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.537 226239 DEBUG nova.storage.rbd_utils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.541 226239 DEBUG nova.objects.instance [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4e0408c0-b1a7-4079-ba79-3e737fded2ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.481 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.488 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.488 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[d40ff4ad-3cf9-43a2-b2f3-ec2a2dbcd3b7]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.547 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.555 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.555 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[f2dd896b-ef20-4259-9a58-4749faf9283f]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.556 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e93e25-ea41-4e81-b6ff-fcebae649de0]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.557 226239 DEBUG oslo_concurrency.processutils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:30 np0005603623 nova_compute[226235]: 2026-01-31 08:32:30.992 226239 DEBUG nova.storage.rbd_utils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.020 226239 DEBUG nova.storage.rbd_utils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.024 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.044 226239 DEBUG oslo_concurrency.processutils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "nvme version" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.046 226239 DEBUG os_brick.initiator.connectors.lightos [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.047 226239 DEBUG os_brick.initiator.connectors.lightos [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.047 226239 DEBUG os_brick.initiator.connectors.lightos [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.047 226239 DEBUG os_brick.utils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] <== get_connector_properties: return (578ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:32:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:31.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.070 226239 DEBUG nova.compute.manager [req-e8183ccf-921e-4c8f-adbb-9e1cb8c485cf req-edba2271-eba5-4fe2-9d8a-224427fc52e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-vif-unplugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.070 226239 DEBUG oslo_concurrency.lockutils [req-e8183ccf-921e-4c8f-adbb-9e1cb8c485cf req-edba2271-eba5-4fe2-9d8a-224427fc52e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.071 226239 DEBUG oslo_concurrency.lockutils [req-e8183ccf-921e-4c8f-adbb-9e1cb8c485cf req-edba2271-eba5-4fe2-9d8a-224427fc52e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.071 226239 DEBUG oslo_concurrency.lockutils [req-e8183ccf-921e-4c8f-adbb-9e1cb8c485cf req-edba2271-eba5-4fe2-9d8a-224427fc52e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.071 226239 DEBUG nova.compute.manager [req-e8183ccf-921e-4c8f-adbb-9e1cb8c485cf req-edba2271-eba5-4fe2-9d8a-224427fc52e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] No waiting events found dispatching network-vif-unplugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.071 226239 WARNING nova.compute.manager [req-e8183ccf-921e-4c8f-adbb-9e1cb8c485cf req-edba2271-eba5-4fe2-9d8a-224427fc52e7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received unexpected event network-vif-unplugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.075 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.075 226239 DEBUG oslo_concurrency.lockutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.076 226239 DEBUG oslo_concurrency.lockutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.076 226239 DEBUG oslo_concurrency.lockutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.100 226239 DEBUG nova.storage.rbd_utils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:32:31 np0005603623 nova_compute[226235]: 2026-01-31 08:32:31.103 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:32:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:31.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:32:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:32:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2857825263' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:32:32 np0005603623 nova_compute[226235]: 2026-01-31 08:32:32.796 226239 DEBUG nova.virt.libvirt.driver [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 31 03:32:32 np0005603623 nova_compute[226235]: 2026-01-31 08:32:32.968 226239 DEBUG nova.storage.rbd_utils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rolling back rbd image(a15175ec-85fd-457c-870b-8a6d7c13c906_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Jan 31 03:32:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:32:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:33.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.185 226239 DEBUG nova.network.neutron [req-b3d048e7-a622-4fa9-958f-2eab5483d904 req-8a47665f-bec5-4725-a079-281023b9a616 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updated VIF entry in instance network info cache for port 02df5608-7a85-4d54-b5ac-628d6c8e8179. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.185 226239 DEBUG nova.network.neutron [req-b3d048e7-a622-4fa9-958f-2eab5483d904 req-8a47665f-bec5-4725-a079-281023b9a616 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating instance_info_cache with network_info: [{"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:32:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:33.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.326 226239 DEBUG oslo_concurrency.lockutils [req-b3d048e7-a622-4fa9-958f-2eab5483d904 req-8a47665f-bec5-4725-a079-281023b9a616 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-a15175ec-85fd-457c-870b-8a6d7c13c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.361 226239 DEBUG nova.compute.manager [req-82303d5b-c85f-4b0a-92aa-56dd5b7171ff req-efcc059f-83b6-48a4-a3f9-3c98bb28e44a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.361 226239 DEBUG oslo_concurrency.lockutils [req-82303d5b-c85f-4b0a-92aa-56dd5b7171ff req-efcc059f-83b6-48a4-a3f9-3c98bb28e44a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.362 226239 DEBUG oslo_concurrency.lockutils [req-82303d5b-c85f-4b0a-92aa-56dd5b7171ff req-efcc059f-83b6-48a4-a3f9-3c98bb28e44a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.362 226239 DEBUG oslo_concurrency.lockutils [req-82303d5b-c85f-4b0a-92aa-56dd5b7171ff req-efcc059f-83b6-48a4-a3f9-3c98bb28e44a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.362 226239 DEBUG nova.compute.manager [req-82303d5b-c85f-4b0a-92aa-56dd5b7171ff req-efcc059f-83b6-48a4-a3f9-3c98bb28e44a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] No waiting events found dispatching network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.362 226239 WARNING nova.compute.manager [req-82303d5b-c85f-4b0a-92aa-56dd5b7171ff req-efcc059f-83b6-48a4-a3f9-3c98bb28e44a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received unexpected event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.421 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.318s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.421 226239 DEBUG nova.objects.instance [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e0408c0-b1a7-4079-ba79-3e737fded2ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.651 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.652 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Start _get_guest_xml network_info=[{"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "vif_mac": "fa:16:3e:fb:ba:04"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:32:33 np0005603623 nova_compute[226235]: 2026-01-31 08:32:33.653 226239 DEBUG nova.objects.instance [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lazy-loading 'resources' on Instance uuid 4e0408c0-b1a7-4079-ba79-3e737fded2ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.039 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.053 226239 WARNING nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.061 226239 DEBUG nova.virt.libvirt.host [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.062 226239 DEBUG nova.virt.libvirt.host [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.067 226239 DEBUG nova.virt.libvirt.host [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.068 226239 DEBUG nova.virt.libvirt.host [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.070 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.070 226239 DEBUG nova.virt.hardware [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.071 226239 DEBUG nova.virt.hardware [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.072 226239 DEBUG nova.virt.hardware [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.072 226239 DEBUG nova.virt.hardware [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.072 226239 DEBUG nova.virt.hardware [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.073 226239 DEBUG nova.virt.hardware [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.073 226239 DEBUG nova.virt.hardware [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.074 226239 DEBUG nova.virt.hardware [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.074 226239 DEBUG nova.virt.hardware [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.075 226239 DEBUG nova.virt.hardware [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.075 226239 DEBUG nova.virt.hardware [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.076 226239 DEBUG nova.objects.instance [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4e0408c0-b1a7-4079-ba79-3e737fded2ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.227 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:32:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/338382328' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.685 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.686 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:34 np0005603623 nova_compute[226235]: 2026-01-31 08:32:34.816 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:35.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:32:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:35.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:32:35 np0005603623 nova_compute[226235]: 2026-01-31 08:32:35.344 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.658s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:35 np0005603623 nova_compute[226235]: 2026-01-31 08:32:35.347 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:35 np0005603623 nova_compute[226235]: 2026-01-31 08:32:35.366 226239 DEBUG nova.storage.rbd_utils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] removing snapshot(nova-resize) on rbd image(a15175ec-85fd-457c-870b-8a6d7c13c906_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:32:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:32:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4272559740' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:32:35 np0005603623 nova_compute[226235]: 2026-01-31 08:32:35.814 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:35 np0005603623 nova_compute[226235]: 2026-01-31 08:32:35.816 226239 DEBUG nova.virt.libvirt.vif [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:31:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-989214032',display_name='tempest-ServerRescueTestJSONUnderV235-server-989214032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-989214032',id=131,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:31:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b6a7de05649d42c6acb1aa6e6026b2b4',ramdisk_id='',reservation_id='r-muxmeooe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1941698863',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1941698863-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:31:54Z,user_data=None,user_id='d91ac41a8e444974a11ffbef7b04ddb3',uuid=4e0408c0-b1a7-4079-ba79-3e737fded2ea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "vif_mac": "fa:16:3e:fb:ba:04"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:32:35 np0005603623 nova_compute[226235]: 2026-01-31 08:32:35.817 226239 DEBUG nova.network.os_vif_util [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Converting VIF {"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "vif_mac": "fa:16:3e:fb:ba:04"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:32:35 np0005603623 nova_compute[226235]: 2026-01-31 08:32:35.817 226239 DEBUG nova.network.os_vif_util [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fb:ba:04,bridge_name='br-int',has_traffic_filtering=True,id=5c8118c5-4238-4d06-99ff-6e0b763563c7,network=Network(1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c8118c5-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:32:35 np0005603623 nova_compute[226235]: 2026-01-31 08:32:35.818 226239 DEBUG nova.objects.instance [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e0408c0-b1a7-4079-ba79-3e737fded2ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:35 np0005603623 nova_compute[226235]: 2026-01-31 08:32:35.992 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  <uuid>4e0408c0-b1a7-4079-ba79-3e737fded2ea</uuid>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  <name>instance-00000083</name>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-989214032</nova:name>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:32:34</nova:creationTime>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <nova:user uuid="d91ac41a8e444974a11ffbef7b04ddb3">tempest-ServerRescueTestJSONUnderV235-1941698863-project-member</nova:user>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <nova:project uuid="b6a7de05649d42c6acb1aa6e6026b2b4">tempest-ServerRescueTestJSONUnderV235-1941698863</nova:project>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <nova:port uuid="5c8118c5-4238-4d06-99ff-6e0b763563c7">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <entry name="serial">4e0408c0-b1a7-4079-ba79-3e737fded2ea</entry>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <entry name="uuid">4e0408c0-b1a7-4079-ba79-3e737fded2ea</entry>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.rescue">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <target dev="vdb" bus="virtio"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.config.rescue">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:fb:ba:04"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <target dev="tap5c8118c5-42"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/console.log" append="off"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:32:35 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:32:36 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:32:36 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:32:36 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:32:36 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:32:36 np0005603623 nova_compute[226235]: 2026-01-31 08:32:35.998 226239 INFO nova.virt.libvirt.driver [-] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Instance destroyed successfully.#033[00m
Jan 31 03:32:36 np0005603623 nova_compute[226235]: 2026-01-31 08:32:36.560 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:32:36 np0005603623 nova_compute[226235]: 2026-01-31 08:32:36.561 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:32:36 np0005603623 nova_compute[226235]: 2026-01-31 08:32:36.561 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:32:36 np0005603623 nova_compute[226235]: 2026-01-31 08:32:36.561 226239 DEBUG nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] No VIF found with MAC fa:16:3e:fb:ba:04, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:32:36 np0005603623 nova_compute[226235]: 2026-01-31 08:32:36.562 226239 INFO nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Using config drive#033[00m
Jan 31 03:32:36 np0005603623 nova_compute[226235]: 2026-01-31 08:32:36.598 226239 DEBUG nova.storage.rbd_utils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:32:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e314 e314: 3 total, 3 up, 3 in
Jan 31 03:32:36 np0005603623 nova_compute[226235]: 2026-01-31 08:32:36.908 226239 DEBUG nova.objects.instance [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4e0408c0-b1a7-4079-ba79-3e737fded2ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:37.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:37.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.466 226239 DEBUG nova.objects.instance [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lazy-loading 'keypairs' on Instance uuid 4e0408c0-b1a7-4079-ba79-3e737fded2ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.865 226239 DEBUG nova.virt.libvirt.driver [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Start _get_guest_xml network_info=[{"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': '88f848a3-4362-4a8e-8117-5b16a2ea4b45', 'delete_on_termination': False, 'guest_format': None, 'device_type': 'disk', 'boot_index': None, 'disk_bus': 'virtio', 'mount_device': '/dev/vdb', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-901896ec-4cee-48ca-89ea-1ef061e9fbf3', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '901896ec-4cee-48ca-89ea-1ef061e9fbf3', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'attaching', 'instance': 'a15175ec-85fd-457c-870b-8a6d7c13c906', 'attached_at': '2026-01-31T08:32:32.000000', 'detached_at': '', 'volume_id': '901896ec-4cee-48ca-89ea-1ef061e9fbf3', 'serial': '901896ec-4cee-48ca-89ea-1ef061e9fbf3'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.868 226239 WARNING nova.virt.libvirt.driver [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.872 226239 DEBUG nova.virt.libvirt.host [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.873 226239 DEBUG nova.virt.libvirt.host [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.877 226239 DEBUG nova.virt.libvirt.host [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.877 226239 DEBUG nova.virt.libvirt.host [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.878 226239 DEBUG nova.virt.libvirt.driver [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.878 226239 DEBUG nova.virt.hardware [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.879 226239 DEBUG nova.virt.hardware [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.879 226239 DEBUG nova.virt.hardware [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.879 226239 DEBUG nova.virt.hardware [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.879 226239 DEBUG nova.virt.hardware [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.880 226239 DEBUG nova.virt.hardware [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.880 226239 DEBUG nova.virt.hardware [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.880 226239 DEBUG nova.virt.hardware [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.880 226239 DEBUG nova.virt.hardware [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.881 226239 DEBUG nova.virt.hardware [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.881 226239 DEBUG nova.virt.hardware [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:32:37 np0005603623 nova_compute[226235]: 2026-01-31 08:32:37.881 226239 DEBUG nova.objects.instance [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'vcpu_model' on Instance uuid a15175ec-85fd-457c-870b-8a6d7c13c906 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:38 np0005603623 nova_compute[226235]: 2026-01-31 08:32:38.472 226239 DEBUG oslo_concurrency.processutils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:32:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4107697265' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:32:38 np0005603623 nova_compute[226235]: 2026-01-31 08:32:38.888 226239 DEBUG oslo_concurrency.processutils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:38 np0005603623 nova_compute[226235]: 2026-01-31 08:32:38.930 226239 DEBUG oslo_concurrency.processutils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.042 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:39.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:39.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.213 226239 INFO nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Creating config drive at /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/disk.config.rescue#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.216 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpvp1gf9um execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.337 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpvp1gf9um" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3522322541' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.363 226239 DEBUG nova.storage.rbd_utils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] rbd image 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.367 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/disk.config.rescue 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #112. Immutable memtables: 0.
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:39.522679) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 69] Flushing memtable with next log file: 112
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848359522737, "job": 69, "event": "flush_started", "num_memtables": 1, "num_entries": 2140, "num_deletes": 253, "total_data_size": 4935565, "memory_usage": 4996368, "flush_reason": "Manual Compaction"}
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 69] Level-0 flush table #113: started
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.543 226239 DEBUG oslo_concurrency.processutils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.613s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848359558416, "cf_name": "default", "job": 69, "event": "table_file_creation", "file_number": 113, "file_size": 3233228, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 56041, "largest_seqno": 58176, "table_properties": {"data_size": 3224598, "index_size": 5252, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 18800, "raw_average_key_size": 20, "raw_value_size": 3207014, "raw_average_value_size": 3520, "num_data_blocks": 229, "num_entries": 911, "num_filter_entries": 911, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848176, "oldest_key_time": 1769848176, "file_creation_time": 1769848359, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 113, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 69] Flush lasted 35822 microseconds, and 5711 cpu microseconds.
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:39.558497) [db/flush_job.cc:967] [default] [JOB 69] Level-0 flush table #113: 3233228 bytes OK
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:39.558523) [db/memtable_list.cc:519] [default] Level-0 commit table #113 started
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:39.584160) [db/memtable_list.cc:722] [default] Level-0 commit table #113: memtable #1 done
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:39.584207) EVENT_LOG_v1 {"time_micros": 1769848359584198, "job": 69, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:39.584230) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 69] Try to delete WAL files size 4925996, prev total WAL file size 4925996, number of live WAL files 2.
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000109.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:39.585507) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034373639' seq:72057594037927935, type:22 .. '7061786F730035303231' seq:0, type:0; will stop at (end)
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 70] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 69 Base level 0, inputs: [113(3157KB)], [111(10058KB)]
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848359585579, "job": 70, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [113], "files_L6": [111], "score": -1, "input_data_size": 13533092, "oldest_snapshot_seqno": -1}
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.610 226239 DEBUG nova.virt.libvirt.vif [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:30:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2097097080',display_name='tempest-ServerActionsTestOtherB-server-2097097080',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2097097080',id=128,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:32:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-41gbj3yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:32:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=a15175ec-85fd-457c-870b-8a6d7c13c906,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.610 226239 DEBUG nova.network.os_vif_util [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.611 226239 DEBUG nova.network.os_vif_util [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.613 226239 DEBUG nova.virt.libvirt.driver [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  <uuid>a15175ec-85fd-457c-870b-8a6d7c13c906</uuid>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  <name>instance-00000080</name>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsTestOtherB-server-2097097080</nova:name>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:32:37</nova:creationTime>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <nova:user uuid="ef51681d234a4abc88ff433d0640b6e7">tempest-ServerActionsTestOtherB-1048458052-project-member</nova:user>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <nova:project uuid="953a213fa5cb435ab3c04ad96152685f">tempest-ServerActionsTestOtherB-1048458052</nova:project>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <nova:port uuid="02df5608-7a85-4d54-b5ac-628d6c8e8179">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <entry name="serial">a15175ec-85fd-457c-870b-8a6d7c13c906</entry>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <entry name="uuid">a15175ec-85fd-457c-870b-8a6d7c13c906</entry>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/a15175ec-85fd-457c-870b-8a6d7c13c906_disk">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/a15175ec-85fd-457c-870b-8a6d7c13c906_disk.config">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-901896ec-4cee-48ca-89ea-1ef061e9fbf3">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <target dev="vdb" bus="virtio"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <serial>901896ec-4cee-48ca-89ea-1ef061e9fbf3</serial>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:dd:59:a9"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <target dev="tap02df5608-7a"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906/console.log" append="off"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <input type="keyboard" bus="usb"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:32:39 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:32:39 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:32:39 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:32:39 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.614 226239 DEBUG nova.compute.manager [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Preparing to wait for external event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.614 226239 DEBUG oslo_concurrency.lockutils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.615 226239 DEBUG oslo_concurrency.lockutils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.615 226239 DEBUG oslo_concurrency.lockutils [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.615 226239 DEBUG nova.virt.libvirt.vif [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:30:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2097097080',display_name='tempest-ServerActionsTestOtherB-server-2097097080',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2097097080',id=128,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:32:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-41gbj3yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:32:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=a15175ec-85fd-457c-870b-8a6d7c13c906,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.616 226239 DEBUG nova.network.os_vif_util [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.616 226239 DEBUG nova.network.os_vif_util [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.616 226239 DEBUG os_vif [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.617 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.617 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.618 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.620 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.620 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02df5608-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.621 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02df5608-7a, col_values=(('external_ids', {'iface-id': '02df5608-7a85-4d54-b5ac-628d6c8e8179', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:59:a9', 'vm-uuid': 'a15175ec-85fd-457c-870b-8a6d7c13c906'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.622 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:39 np0005603623 NetworkManager[48970]: <info>  [1769848359.6231] manager: (tap02df5608-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/257)
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.624 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.628 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.629 226239 INFO os_vif [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a')#033[00m
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e314 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.818 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:39 np0005603623 kernel: tap02df5608-7a: entered promiscuous mode
Jan 31 03:32:39 np0005603623 NetworkManager[48970]: <info>  [1769848359.8525] manager: (tap02df5608-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/258)
Jan 31 03:32:39 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:39Z|00544|binding|INFO|Claiming lport 02df5608-7a85-4d54-b5ac-628d6c8e8179 for this chassis.
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.856 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:39 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:39Z|00545|binding|INFO|02df5608-7a85-4d54-b5ac-628d6c8e8179: Claiming fa:16:3e:dd:59:a9 10.100.0.10
Jan 31 03:32:39 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:39Z|00546|binding|INFO|Setting lport 02df5608-7a85-4d54-b5ac-628d6c8e8179 ovn-installed in OVS
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.866 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:39 np0005603623 nova_compute[226235]: 2026-01-31 08:32:39.870 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:39 np0005603623 systemd-udevd[286876]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:32:39 np0005603623 systemd-machined[194379]: New machine qemu-60-instance-00000080.
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 70] Generated table #114: 8212 keys, 11456216 bytes, temperature: kUnknown
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848359883009, "cf_name": "default", "job": 70, "event": "table_file_creation", "file_number": 114, "file_size": 11456216, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11402407, "index_size": 32193, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20549, "raw_key_size": 212762, "raw_average_key_size": 25, "raw_value_size": 11257492, "raw_average_value_size": 1370, "num_data_blocks": 1261, "num_entries": 8212, "num_filter_entries": 8212, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769848359, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 114, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:32:39 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:32:39 np0005603623 NetworkManager[48970]: <info>  [1769848359.8837] device (tap02df5608-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:32:39 np0005603623 NetworkManager[48970]: <info>  [1769848359.8844] device (tap02df5608-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:32:39 np0005603623 systemd[1]: Started Virtual Machine qemu-60-instance-00000080.
Jan 31 03:32:39 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:39.883260) [db/compaction/compaction_job.cc:1663] [default] [JOB 70] Compacted 1@0 + 1@6 files to L6 => 11456216 bytes
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:40.032238) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 45.5 rd, 38.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 9.8 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(7.7) write-amplify(3.5) OK, records in: 8738, records dropped: 526 output_compression: NoCompression
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:40.032277) EVENT_LOG_v1 {"time_micros": 1769848360032262, "job": 70, "event": "compaction_finished", "compaction_time_micros": 297517, "compaction_time_cpu_micros": 20839, "output_level": 6, "num_output_files": 1, "total_output_size": 11456216, "num_input_records": 8738, "num_output_records": 8212, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000113.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848360032786, "job": 70, "event": "table_file_deletion", "file_number": 113}
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000111.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848360034037, "job": 70, "event": "table_file_deletion", "file_number": 111}
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:39.585297) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:40.034195) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:40.034200) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:40.034202) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:40.034205) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:32:40 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:32:40.034206) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:32:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:40Z|00547|binding|INFO|Setting lport 02df5608-7a85-4d54-b5ac-628d6c8e8179 up in Southbound
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.069 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:59:a9 10.100.0.10'], port_security=['fa:16:3e:dd:59:a9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a15175ec-85fd-457c-870b-8a6d7c13c906', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44469d8b-ad30-4270-88fa-e67c568f3150', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '953a213fa5cb435ab3c04ad96152685f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '5b1bd8ad-0d2a-4d57-a00a-9a6b59df86e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.181'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d972fb9d-6d12-4c1c-b135-704d64887b72, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=02df5608-7a85-4d54-b5ac-628d6c8e8179) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.071 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 02df5608-7a85-4d54-b5ac-628d6c8e8179 in datapath 44469d8b-ad30-4270-88fa-e67c568f3150 bound to our chassis#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.072 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44469d8b-ad30-4270-88fa-e67c568f3150#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.080 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f6168a43-e07d-46d9-938b-f145b39a1679]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.081 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44469d8b-a1 in ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.083 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44469d8b-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.083 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[93a9fcbb-d7c3-477d-8de6-03ad77f60d19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.084 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[469656e8-c93a-4ce5-b36b-39366098f81b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.096 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[cb65c4b0-ca06-4618-ae09-ba9a9e37724f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.105 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[83eb7e0e-0f40-44ab-8a86-53927d4ddd1a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.127 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[a80a187b-b1d9-4144-ab5a-0b8279e179a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.132 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[63922abe-90a0-4e0b-a162-7c1b62ad64e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 NetworkManager[48970]: <info>  [1769848360.1334] manager: (tap44469d8b-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/259)
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.155 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[03dfac4b-db5a-4f19-8ada-8f7a0bc7d640]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.157 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6e6b5040-e481-468b-a798-0fb6795b5126]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 NetworkManager[48970]: <info>  [1769848360.1727] device (tap44469d8b-a0): carrier: link connected
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.177 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[de8beb3b-ccfb-470d-a046-3b69b78845f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.188 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b759e19c-9edd-499e-8acc-774c19411a8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44469d8b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:98:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 750060, 'reachable_time': 33757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286931, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.197 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cc084546-308c-4fe9-ab73-6567fbc55e86]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:9820'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 750060, 'tstamp': 750060}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286932, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.207 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1b8a680c-13aa-42c0-afac-b3a1c2666f61]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44469d8b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:98:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 750060, 'reachable_time': 33757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 286933, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.224 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e58ccef3-c8f8-4d55-a56f-310835d2342f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.266 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb881a2-c758-4e45-8924-98fe3ebe7eb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.268 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44469d8b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.268 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.269 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44469d8b-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:40 np0005603623 NetworkManager[48970]: <info>  [1769848360.2715] manager: (tap44469d8b-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/260)
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.271 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:40 np0005603623 kernel: tap44469d8b-a0: entered promiscuous mode
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.274 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.276 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44469d8b-a0, col_values=(('external_ids', {'iface-id': '7e288124-e200-4c03-8a4a-baab3e3f3d7a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.277 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:40Z|00548|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.284 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.288 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.289 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.290 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8eb84994-50df-4953-86f2-aa74e6a595a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.290 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-44469d8b-ad30-4270-88fa-e67c568f3150
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 44469d8b-ad30-4270-88fa-e67c568f3150
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:32:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:40.291 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'env', 'PROCESS_TAG=haproxy-44469d8b-ad30-4270-88fa-e67c568f3150', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44469d8b-ad30-4270-88fa-e67c568f3150.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.632 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848360.6313493, a15175ec-85fd-457c-870b-8a6d7c13c906 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.633 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] VM Started (Lifecycle Event)#033[00m
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.692 226239 DEBUG oslo_concurrency.processutils [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/disk.config.rescue 4e0408c0-b1a7-4079-ba79-3e737fded2ea_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.325s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.693 226239 INFO nova.virt.libvirt.driver [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Deleting local config drive /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea/disk.config.rescue because it was imported into RBD.#033[00m
Jan 31 03:32:40 np0005603623 podman[287013]: 2026-01-31 08:32:40.616316028 +0000 UTC m=+0.026422436 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:32:40 np0005603623 podman[287013]: 2026-01-31 08:32:40.724289574 +0000 UTC m=+0.134395952 container create 05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:32:40 np0005603623 kernel: tap5c8118c5-42: entered promiscuous mode
Jan 31 03:32:40 np0005603623 NetworkManager[48970]: <info>  [1769848360.7495] manager: (tap5c8118c5-42): new Tun device (/org/freedesktop/NetworkManager/Devices/261)
Jan 31 03:32:40 np0005603623 systemd-udevd[286904]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.751 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:40Z|00549|binding|INFO|Claiming lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 for this chassis.
Jan 31 03:32:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:40Z|00550|binding|INFO|5c8118c5-4238-4d06-99ff-6e0b763563c7: Claiming fa:16:3e:fb:ba:04 10.100.0.3
Jan 31 03:32:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:40Z|00551|binding|INFO|Removing lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 ovn-installed in OVS
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.754 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:40Z|00552|binding|INFO|Setting lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 ovn-installed in OVS
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.758 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:40 np0005603623 nova_compute[226235]: 2026-01-31 08:32:40.763 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:40 np0005603623 NetworkManager[48970]: <info>  [1769848360.7664] device (tap5c8118c5-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:32:40 np0005603623 NetworkManager[48970]: <info>  [1769848360.7671] device (tap5c8118c5-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:32:40 np0005603623 systemd[1]: Started libpod-conmon-05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5.scope.
Jan 31 03:32:40 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:32:40 np0005603623 systemd-machined[194379]: New machine qemu-61-instance-00000083.
Jan 31 03:32:40 np0005603623 systemd[1]: Started Virtual Machine qemu-61-instance-00000083.
Jan 31 03:32:40 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5621b5ebc871e50dce9107c4fdf188f0b359120cc269aca2f5ed43dcc699edd6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:32:40 np0005603623 podman[287013]: 2026-01-31 08:32:40.895605278 +0000 UTC m=+0.305711696 container init 05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:32:40 np0005603623 podman[287013]: 2026-01-31 08:32:40.901191003 +0000 UTC m=+0.311297391 container start 05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 03:32:40 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[287040]: [NOTICE]   (287053) : New worker (287055) forked
Jan 31 03:32:40 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[287040]: [NOTICE]   (287053) : Loading success.
Jan 31 03:32:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:41.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:41 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:41Z|00553|binding|INFO|Setting lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 up in Southbound
Jan 31 03:32:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:41.072 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:ba:04 10.100.0.3'], port_security=['fa:16:3e:fb:ba:04 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '4e0408c0-b1a7-4079-ba79-3e737fded2ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b6a7de05649d42c6acb1aa6e6026b2b4', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8a3bb86b-ceb7-476c-96e2-ee30ea8ecd63', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41049c6c-e208-4c6a-ad10-15df89677733, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=5c8118c5-4238-4d06-99ff-6e0b763563c7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:32:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:41.074 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 5c8118c5-4238-4d06-99ff-6e0b763563c7 in datapath 1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7 bound to our chassis#033[00m
Jan 31 03:32:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:41.075 143258 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 31 03:32:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:41.076 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f1d9f40a-04b6-4b92-b14c-0dd1030aa633]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:41 np0005603623 nova_compute[226235]: 2026-01-31 08:32:41.092 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:32:41 np0005603623 nova_compute[226235]: 2026-01-31 08:32:41.097 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848360.632744, a15175ec-85fd-457c-870b-8a6d7c13c906 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:32:41 np0005603623 nova_compute[226235]: 2026-01-31 08:32:41.097 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:32:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:32:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:41.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:32:41 np0005603623 nova_compute[226235]: 2026-01-31 08:32:41.410 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:32:41 np0005603623 nova_compute[226235]: 2026-01-31 08:32:41.415 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:32:41 np0005603623 nova_compute[226235]: 2026-01-31 08:32:41.671 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 31 03:32:41 np0005603623 nova_compute[226235]: 2026-01-31 08:32:41.835 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Removed pending event for 4e0408c0-b1a7-4079-ba79-3e737fded2ea due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:32:41 np0005603623 nova_compute[226235]: 2026-01-31 08:32:41.836 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848361.8346753, 4e0408c0-b1a7-4079-ba79-3e737fded2ea => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:32:41 np0005603623 nova_compute[226235]: 2026-01-31 08:32:41.837 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:32:41 np0005603623 nova_compute[226235]: 2026-01-31 08:32:41.844 226239 DEBUG nova.compute.manager [None req-01bf01dc-5023-4910-8300-a0c685055c0d d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:32:41 np0005603623 nova_compute[226235]: 2026-01-31 08:32:41.970 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:32:41 np0005603623 nova_compute[226235]: 2026-01-31 08:32:41.974 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.100 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.100 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848361.8393295, 4e0408c0-b1a7-4079-ba79-3e737fded2ea => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.100 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] VM Started (Lifecycle Event)#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.344 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.348 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.814 226239 DEBUG nova.compute.manager [req-64ba190d-478a-49cc-9cf9-39d1177d9758 req-82298b42-6e03-456f-bd8e-cab36bcb18a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.814 226239 DEBUG oslo_concurrency.lockutils [req-64ba190d-478a-49cc-9cf9-39d1177d9758 req-82298b42-6e03-456f-bd8e-cab36bcb18a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.814 226239 DEBUG oslo_concurrency.lockutils [req-64ba190d-478a-49cc-9cf9-39d1177d9758 req-82298b42-6e03-456f-bd8e-cab36bcb18a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.815 226239 DEBUG oslo_concurrency.lockutils [req-64ba190d-478a-49cc-9cf9-39d1177d9758 req-82298b42-6e03-456f-bd8e-cab36bcb18a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.815 226239 DEBUG nova.compute.manager [req-64ba190d-478a-49cc-9cf9-39d1177d9758 req-82298b42-6e03-456f-bd8e-cab36bcb18a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Processing event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.815 226239 DEBUG nova.compute.manager [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.818 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848362.8186843, a15175ec-85fd-457c-870b-8a6d7c13c906 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.819 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.822 226239 INFO nova.virt.libvirt.driver [-] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Instance running successfully.#033[00m
Jan 31 03:32:42 np0005603623 nova_compute[226235]: 2026-01-31 08:32:42.822 226239 DEBUG nova.virt.libvirt.driver [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Jan 31 03:32:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:43.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:43 np0005603623 nova_compute[226235]: 2026-01-31 08:32:43.163 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:32:43 np0005603623 nova_compute[226235]: 2026-01-31 08:32:43.170 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:32:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:43.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:43 np0005603623 nova_compute[226235]: 2026-01-31 08:32:43.305 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 31 03:32:43 np0005603623 nova_compute[226235]: 2026-01-31 08:32:43.776 226239 INFO nova.compute.manager [None req-8164fb27-073f-45ea-84c4-9cbcd9d0826b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating instance to original state: 'active'#033[00m
Jan 31 03:32:44 np0005603623 nova_compute[226235]: 2026-01-31 08:32:44.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:32:44 np0005603623 nova_compute[226235]: 2026-01-31 08:32:44.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:32:44 np0005603623 nova_compute[226235]: 2026-01-31 08:32:44.623 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e314 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:44 np0005603623 nova_compute[226235]: 2026-01-31 08:32:44.820 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:45.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:45.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.285 226239 DEBUG nova.compute.manager [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.286 226239 DEBUG oslo_concurrency.lockutils [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.287 226239 DEBUG oslo_concurrency.lockutils [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.287 226239 DEBUG oslo_concurrency.lockutils [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.288 226239 DEBUG nova.compute.manager [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] No waiting events found dispatching network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.288 226239 WARNING nova.compute.manager [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received unexpected event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.288 226239 DEBUG nova.compute.manager [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.289 226239 DEBUG oslo_concurrency.lockutils [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.289 226239 DEBUG oslo_concurrency.lockutils [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.289 226239 DEBUG oslo_concurrency.lockutils [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.290 226239 DEBUG nova.compute.manager [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] No waiting events found dispatching network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.290 226239 WARNING nova.compute.manager [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received unexpected event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 for instance with vm_state rescued and task_state None.#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.290 226239 DEBUG nova.compute.manager [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.291 226239 DEBUG oslo_concurrency.lockutils [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.291 226239 DEBUG oslo_concurrency.lockutils [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.291 226239 DEBUG oslo_concurrency.lockutils [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.292 226239 DEBUG nova.compute.manager [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] No waiting events found dispatching network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:32:45 np0005603623 nova_compute[226235]: 2026-01-31 08:32:45.292 226239 WARNING nova.compute.manager [req-b280fc41-3d3c-4f73-8fa7-cc0f3def3006 req-3ff84aa3-f53c-42c4-bf73-070c6f7f21e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received unexpected event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 for instance with vm_state rescued and task_state None.#033[00m
Jan 31 03:32:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 e315: 3 total, 3 up, 3 in
Jan 31 03:32:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:47.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:47.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.197 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.198 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.199 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.199 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.199 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:32:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1178881675' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.686 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.747 226239 DEBUG nova.compute.manager [req-322e24fc-a2b4-4b33-a35b-693a8247db29 req-2637e88e-8dfe-4c18-919e-c02d18eea884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-changed-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.748 226239 DEBUG nova.compute.manager [req-322e24fc-a2b4-4b33-a35b-693a8247db29 req-2637e88e-8dfe-4c18-919e-c02d18eea884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Refreshing instance network info cache due to event network-changed-5c8118c5-4238-4d06-99ff-6e0b763563c7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.748 226239 DEBUG oslo_concurrency.lockutils [req-322e24fc-a2b4-4b33-a35b-693a8247db29 req-2637e88e-8dfe-4c18-919e-c02d18eea884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.749 226239 DEBUG oslo_concurrency.lockutils [req-322e24fc-a2b4-4b33-a35b-693a8247db29 req-2637e88e-8dfe-4c18-919e-c02d18eea884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.749 226239 DEBUG nova.network.neutron [req-322e24fc-a2b4-4b33-a35b-693a8247db29 req-2637e88e-8dfe-4c18-919e-c02d18eea884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Refreshing network info cache for port 5c8118c5-4238-4d06-99ff-6e0b763563c7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.774 226239 DEBUG oslo_concurrency.lockutils [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.775 226239 DEBUG oslo_concurrency.lockutils [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.776 226239 DEBUG oslo_concurrency.lockutils [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.776 226239 DEBUG oslo_concurrency.lockutils [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.776 226239 DEBUG oslo_concurrency.lockutils [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.778 226239 INFO nova.compute.manager [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Terminating instance#033[00m
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.779 226239 DEBUG nova.compute.manager [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:32:48 np0005603623 kernel: tap02df5608-7a (unregistering): left promiscuous mode
Jan 31 03:32:48 np0005603623 NetworkManager[48970]: <info>  [1769848368.8260] device (tap02df5608-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:32:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:48Z|00554|binding|INFO|Releasing lport 02df5608-7a85-4d54-b5ac-628d6c8e8179 from this chassis (sb_readonly=0)
Jan 31 03:32:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:48Z|00555|binding|INFO|Setting lport 02df5608-7a85-4d54-b5ac-628d6c8e8179 down in Southbound
Jan 31 03:32:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:32:48Z|00556|binding|INFO|Removing iface tap02df5608-7a ovn-installed in OVS
Jan 31 03:32:48 np0005603623 nova_compute[226235]: 2026-01-31 08:32:48.848 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:48 np0005603623 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000080.scope: Deactivated successfully.
Jan 31 03:32:48 np0005603623 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000080.scope: Consumed 6.689s CPU time.
Jan 31 03:32:48 np0005603623 systemd-machined[194379]: Machine qemu-60-instance-00000080 terminated.
Jan 31 03:32:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:48.925 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:59:a9 10.100.0.10'], port_security=['fa:16:3e:dd:59:a9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a15175ec-85fd-457c-870b-8a6d7c13c906', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44469d8b-ad30-4270-88fa-e67c568f3150', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '953a213fa5cb435ab3c04ad96152685f', 'neutron:revision_number': '12', 'neutron:security_group_ids': '5b1bd8ad-0d2a-4d57-a00a-9a6b59df86e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.181', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d972fb9d-6d12-4c1c-b135-704d64887b72, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=02df5608-7a85-4d54-b5ac-628d6c8e8179) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:32:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:48.926 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 02df5608-7a85-4d54-b5ac-628d6c8e8179 in datapath 44469d8b-ad30-4270-88fa-e67c568f3150 unbound from our chassis#033[00m
Jan 31 03:32:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:48.928 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44469d8b-ad30-4270-88fa-e67c568f3150, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:32:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:48.929 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2b74a24e-9cf5-4b89-8b82-b50ec20efb3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:48.930 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 namespace which is not needed anymore#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.022 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.023 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.023 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.026 226239 INFO nova.virt.libvirt.driver [-] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Instance destroyed successfully.#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.027 226239 DEBUG nova.objects.instance [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'resources' on Instance uuid a15175ec-85fd-457c-870b-8a6d7c13c906 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.030 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000083 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.030 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000083 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.030 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000083 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:32:49 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[287040]: [NOTICE]   (287053) : haproxy version is 2.8.14-c23fe91
Jan 31 03:32:49 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[287040]: [NOTICE]   (287053) : path to executable is /usr/sbin/haproxy
Jan 31 03:32:49 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[287040]: [WARNING]  (287053) : Exiting Master process...
Jan 31 03:32:49 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[287040]: [ALERT]    (287053) : Current worker (287055) exited with code 143 (Terminated)
Jan 31 03:32:49 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[287040]: [WARNING]  (287053) : All workers exited. Exiting... (0)
Jan 31 03:32:49 np0005603623 systemd[1]: libpod-05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5.scope: Deactivated successfully.
Jan 31 03:32:49 np0005603623 podman[287179]: 2026-01-31 08:32:49.070441233 +0000 UTC m=+0.056053313 container died 05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:32:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:32:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:49.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.090 226239 DEBUG nova.virt.libvirt.vif [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:30:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2097097080',display_name='tempest-ServerActionsTestOtherB-server-2097097080',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2097097080',id=128,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:32:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-41gbj3yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:32:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=a15175ec-85fd-457c-870b-8a6d7c13c906,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.091 226239 DEBUG nova.network.os_vif_util [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "address": "fa:16:3e:dd:59:a9", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02df5608-7a", "ovs_interfaceid": "02df5608-7a85-4d54-b5ac-628d6c8e8179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.092 226239 DEBUG nova.network.os_vif_util [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.092 226239 DEBUG os_vif [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.094 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.094 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02df5608-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.095 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.097 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.103 226239 INFO os_vif [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:59:a9,bridge_name='br-int',has_traffic_filtering=True,id=02df5608-7a85-4d54-b5ac-628d6c8e8179,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02df5608-7a')#033[00m
Jan 31 03:32:49 np0005603623 systemd[1]: var-lib-containers-storage-overlay-5621b5ebc871e50dce9107c4fdf188f0b359120cc269aca2f5ed43dcc699edd6-merged.mount: Deactivated successfully.
Jan 31 03:32:49 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5-userdata-shm.mount: Deactivated successfully.
Jan 31 03:32:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:49.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:49 np0005603623 podman[287179]: 2026-01-31 08:32:49.235238093 +0000 UTC m=+0.220850173 container cleanup 05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:32:49 np0005603623 systemd[1]: libpod-conmon-05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5.scope: Deactivated successfully.
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.271 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.272 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4096MB free_disk=20.785003662109375GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.273 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.273 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:49 np0005603623 podman[287231]: 2026-01-31 08:32:49.308348728 +0000 UTC m=+0.051274493 container remove 05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:32:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:49.313 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b595ec20-1b03-4362-b014-5f20fdb787ef]: (4, ('Sat Jan 31 08:32:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 (05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5)\n05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5\nSat Jan 31 08:32:49 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 (05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5)\n05c8de9b91c10b0f420c359eb12bbcb230651e6cdb4be6fb501453f6a6cbe2a5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:49.315 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[307f174e-247a-419f-b7ee-e710fe4e8525]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:49.316 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44469d8b-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.318 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:49 np0005603623 kernel: tap44469d8b-a0: left promiscuous mode
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.326 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:49.334 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cf5de346-7e1e-439e-b66b-9d8c14eeffd3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:49.354 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[91130e08-92a2-442e-bcf0-7533d961e4f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:49.356 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[94520986-b293-4dc6-8832-26e9453ddab2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:49.366 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[188f1946-ffa0-451a-a06f-0d3bdacdd00f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 750055, 'reachable_time': 36501, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287246, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:49 np0005603623 systemd[1]: run-netns-ovnmeta\x2d44469d8b\x2dad30\x2d4270\x2d88fa\x2de67c568f3150.mount: Deactivated successfully.
Jan 31 03:32:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:49.369 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:32:49 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:32:49.369 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[3cff432e-20ad-44f4-9cd0-cf1a74dc8aa0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.384 226239 DEBUG nova.compute.manager [req-a9fe350c-630e-4e7d-b2f8-ef96f18bf370 req-2e61e882-5343-4a8c-9a61-e9b85937a1f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-unplugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.385 226239 DEBUG oslo_concurrency.lockutils [req-a9fe350c-630e-4e7d-b2f8-ef96f18bf370 req-2e61e882-5343-4a8c-9a61-e9b85937a1f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.385 226239 DEBUG oslo_concurrency.lockutils [req-a9fe350c-630e-4e7d-b2f8-ef96f18bf370 req-2e61e882-5343-4a8c-9a61-e9b85937a1f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.385 226239 DEBUG oslo_concurrency.lockutils [req-a9fe350c-630e-4e7d-b2f8-ef96f18bf370 req-2e61e882-5343-4a8c-9a61-e9b85937a1f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.386 226239 DEBUG nova.compute.manager [req-a9fe350c-630e-4e7d-b2f8-ef96f18bf370 req-2e61e882-5343-4a8c-9a61-e9b85937a1f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] No waiting events found dispatching network-vif-unplugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.386 226239 DEBUG nova.compute.manager [req-a9fe350c-630e-4e7d-b2f8-ef96f18bf370 req-2e61e882-5343-4a8c-9a61-e9b85937a1f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-unplugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.411 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 4e0408c0-b1a7-4079-ba79-3e737fded2ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.411 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance a15175ec-85fd-457c-870b-8a6d7c13c906 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.412 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.412 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.471 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.824 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:32:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3199247516' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.915 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:49 np0005603623 nova_compute[226235]: 2026-01-31 08:32:49.922 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:32:50 np0005603623 nova_compute[226235]: 2026-01-31 08:32:50.061 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:32:50 np0005603623 nova_compute[226235]: 2026-01-31 08:32:50.315 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:32:50 np0005603623 nova_compute[226235]: 2026-01-31 08:32:50.316 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:51.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:32:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 46K writes, 205K keys, 46K commit groups, 1.0 writes per commit group, ingest: 0.21 GB, 0.05 MB/s#012Cumulative WAL: 46K writes, 14K syncs, 3.13 writes per sync, written: 0.21 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6811 writes, 32K keys, 6811 commit groups, 1.0 writes per commit group, ingest: 33.00 MB, 0.06 MB/s#012Interval WAL: 6810 writes, 2072 syncs, 3.29 writes per sync, written: 0.03 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:32:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:51.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.253 226239 DEBUG nova.compute.manager [req-ea959cba-4b49-4d77-8e45-69ed2d149b4f req-348735a7-e83a-4b01-9191-251215ed7e24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-changed-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.254 226239 DEBUG nova.compute.manager [req-ea959cba-4b49-4d77-8e45-69ed2d149b4f req-348735a7-e83a-4b01-9191-251215ed7e24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Refreshing instance network info cache due to event network-changed-5c8118c5-4238-4d06-99ff-6e0b763563c7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.255 226239 DEBUG oslo_concurrency.lockutils [req-ea959cba-4b49-4d77-8e45-69ed2d149b4f req-348735a7-e83a-4b01-9191-251215ed7e24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.317 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.318 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.318 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.358 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.652 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.722 226239 DEBUG nova.compute.manager [req-a46bcf8b-e0bd-4039-b422-4dc8535426a2 req-d2810a21-6a00-479c-8938-b7f2c6934a45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.723 226239 DEBUG oslo_concurrency.lockutils [req-a46bcf8b-e0bd-4039-b422-4dc8535426a2 req-d2810a21-6a00-479c-8938-b7f2c6934a45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.723 226239 DEBUG oslo_concurrency.lockutils [req-a46bcf8b-e0bd-4039-b422-4dc8535426a2 req-d2810a21-6a00-479c-8938-b7f2c6934a45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.723 226239 DEBUG oslo_concurrency.lockutils [req-a46bcf8b-e0bd-4039-b422-4dc8535426a2 req-d2810a21-6a00-479c-8938-b7f2c6934a45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.723 226239 DEBUG nova.compute.manager [req-a46bcf8b-e0bd-4039-b422-4dc8535426a2 req-d2810a21-6a00-479c-8938-b7f2c6934a45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] No waiting events found dispatching network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.724 226239 WARNING nova.compute.manager [req-a46bcf8b-e0bd-4039-b422-4dc8535426a2 req-d2810a21-6a00-479c-8938-b7f2c6934a45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received unexpected event network-vif-plugged-02df5608-7a85-4d54-b5ac-628d6c8e8179 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.996 226239 DEBUG nova.network.neutron [req-322e24fc-a2b4-4b33-a35b-693a8247db29 req-2637e88e-8dfe-4c18-919e-c02d18eea884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updated VIF entry in instance network info cache for port 5c8118c5-4238-4d06-99ff-6e0b763563c7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:32:51 np0005603623 nova_compute[226235]: 2026-01-31 08:32:51.997 226239 DEBUG nova.network.neutron [req-322e24fc-a2b4-4b33-a35b-693a8247db29 req-2637e88e-8dfe-4c18-919e-c02d18eea884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updating instance_info_cache with network_info: [{"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:32:52 np0005603623 nova_compute[226235]: 2026-01-31 08:32:52.125 226239 DEBUG oslo_concurrency.lockutils [req-322e24fc-a2b4-4b33-a35b-693a8247db29 req-2637e88e-8dfe-4c18-919e-c02d18eea884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:32:52 np0005603623 nova_compute[226235]: 2026-01-31 08:32:52.126 226239 DEBUG oslo_concurrency.lockutils [req-ea959cba-4b49-4d77-8e45-69ed2d149b4f req-348735a7-e83a-4b01-9191-251215ed7e24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:32:52 np0005603623 nova_compute[226235]: 2026-01-31 08:32:52.126 226239 DEBUG nova.network.neutron [req-ea959cba-4b49-4d77-8e45-69ed2d149b4f req-348735a7-e83a-4b01-9191-251215ed7e24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Refreshing network info cache for port 5c8118c5-4238-4d06-99ff-6e0b763563c7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:32:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:32:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:53.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:32:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:53.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:54 np0005603623 nova_compute[226235]: 2026-01-31 08:32:54.097 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:54 np0005603623 nova_compute[226235]: 2026-01-31 08:32:54.825 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:55.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:32:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:55.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:32:55 np0005603623 nova_compute[226235]: 2026-01-31 08:32:55.425 226239 INFO nova.virt.libvirt.driver [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Deleting instance files /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906_del#033[00m
Jan 31 03:32:55 np0005603623 nova_compute[226235]: 2026-01-31 08:32:55.426 226239 INFO nova.virt.libvirt.driver [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Deletion of /var/lib/nova/instances/a15175ec-85fd-457c-870b-8a6d7c13c906_del complete#033[00m
Jan 31 03:32:55 np0005603623 nova_compute[226235]: 2026-01-31 08:32:55.878 226239 INFO nova.compute.manager [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Took 7.10 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:32:55 np0005603623 nova_compute[226235]: 2026-01-31 08:32:55.879 226239 DEBUG oslo.service.loopingcall [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:32:55 np0005603623 nova_compute[226235]: 2026-01-31 08:32:55.880 226239 DEBUG nova.compute.manager [-] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:32:55 np0005603623 nova_compute[226235]: 2026-01-31 08:32:55.880 226239 DEBUG nova.network.neutron [-] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:32:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:57.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:32:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:57.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:32:57 np0005603623 nova_compute[226235]: 2026-01-31 08:32:57.395 226239 DEBUG nova.network.neutron [req-ea959cba-4b49-4d77-8e45-69ed2d149b4f req-348735a7-e83a-4b01-9191-251215ed7e24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updated VIF entry in instance network info cache for port 5c8118c5-4238-4d06-99ff-6e0b763563c7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:32:57 np0005603623 nova_compute[226235]: 2026-01-31 08:32:57.396 226239 DEBUG nova.network.neutron [req-ea959cba-4b49-4d77-8e45-69ed2d149b4f req-348735a7-e83a-4b01-9191-251215ed7e24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updating instance_info_cache with network_info: [{"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:32:57 np0005603623 nova_compute[226235]: 2026-01-31 08:32:57.427 226239 DEBUG oslo_concurrency.lockutils [req-ea959cba-4b49-4d77-8e45-69ed2d149b4f req-348735a7-e83a-4b01-9191-251215ed7e24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:32:57 np0005603623 nova_compute[226235]: 2026-01-31 08:32:57.428 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:32:57 np0005603623 nova_compute[226235]: 2026-01-31 08:32:57.428 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:32:57 np0005603623 nova_compute[226235]: 2026-01-31 08:32:57.428 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4e0408c0-b1a7-4079-ba79-3e737fded2ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:59.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:59 np0005603623 nova_compute[226235]: 2026-01-31 08:32:59.100 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:32:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:59.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:59 np0005603623 nova_compute[226235]: 2026-01-31 08:32:59.578 226239 DEBUG nova.network.neutron [-] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:32:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:59 np0005603623 nova_compute[226235]: 2026-01-31 08:32:59.739 226239 INFO nova.compute.manager [-] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Took 3.86 seconds to deallocate network for instance.#033[00m
Jan 31 03:32:59 np0005603623 nova_compute[226235]: 2026-01-31 08:32:59.826 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.021 226239 INFO nova.compute.manager [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Took 0.28 seconds to detach 1 volumes for instance.#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.111 226239 DEBUG oslo_concurrency.lockutils [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.111 226239 DEBUG oslo_concurrency.lockutils [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.200 226239 DEBUG nova.compute.manager [req-2371e9bb-7945-45b3-8b35-edbc2e21ea8c req-72a8deee-2817-40d9-8cd0-be962e843de0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-changed-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.201 226239 DEBUG nova.compute.manager [req-2371e9bb-7945-45b3-8b35-edbc2e21ea8c req-72a8deee-2817-40d9-8cd0-be962e843de0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Refreshing instance network info cache due to event network-changed-5c8118c5-4238-4d06-99ff-6e0b763563c7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.201 226239 DEBUG oslo_concurrency.lockutils [req-2371e9bb-7945-45b3-8b35-edbc2e21ea8c req-72a8deee-2817-40d9-8cd0-be962e843de0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.228 226239 DEBUG oslo_concurrency.processutils [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.586 226239 DEBUG nova.compute.manager [req-4a5116e3-48df-44de-9165-ebfea6139edf req-152feee1-1827-4a94-ab35-843d061d99fb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Received event network-vif-deleted-02df5608-7a85-4d54-b5ac-628d6c8e8179 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:33:00 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1666231277' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.668 226239 DEBUG oslo_concurrency.processutils [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.674 226239 DEBUG nova.compute.provider_tree [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.714 226239 DEBUG nova.scheduler.client.report [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.759 226239 DEBUG oslo_concurrency.lockutils [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.648s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.828 226239 INFO nova.scheduler.client.report [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Deleted allocations for instance a15175ec-85fd-457c-870b-8a6d7c13c906#033[00m
Jan 31 03:33:00 np0005603623 nova_compute[226235]: 2026-01-31 08:33:00.930 226239 DEBUG oslo_concurrency.lockutils [None req-1c387d19-2def-41c3-9cc9-6f2aa4ebebb8 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "a15175ec-85fd-457c-870b-8a6d7c13c906" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 12.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:00 np0005603623 podman[287351]: 2026-01-31 08:33:00.975287676 +0000 UTC m=+0.065650802 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:33:01 np0005603623 podman[287352]: 2026-01-31 08:33:01.00644584 +0000 UTC m=+0.093964017 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 31 03:33:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:33:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:01.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:33:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:01.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:01 np0005603623 nova_compute[226235]: 2026-01-31 08:33:01.470 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updating instance_info_cache with network_info: [{"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:01 np0005603623 nova_compute[226235]: 2026-01-31 08:33:01.554 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:33:01 np0005603623 nova_compute[226235]: 2026-01-31 08:33:01.555 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:33:01 np0005603623 nova_compute[226235]: 2026-01-31 08:33:01.555 226239 DEBUG oslo_concurrency.lockutils [req-2371e9bb-7945-45b3-8b35-edbc2e21ea8c req-72a8deee-2817-40d9-8cd0-be962e843de0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:33:01 np0005603623 nova_compute[226235]: 2026-01-31 08:33:01.555 226239 DEBUG nova.network.neutron [req-2371e9bb-7945-45b3-8b35-edbc2e21ea8c req-72a8deee-2817-40d9-8cd0-be962e843de0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Refreshing network info cache for port 5c8118c5-4238-4d06-99ff-6e0b763563c7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:33:01 np0005603623 nova_compute[226235]: 2026-01-31 08:33:01.556 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:01 np0005603623 nova_compute[226235]: 2026-01-31 08:33:01.557 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:01 np0005603623 nova_compute[226235]: 2026-01-31 08:33:01.557 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:01 np0005603623 nova_compute[226235]: 2026-01-31 08:33:01.557 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:03.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:03 np0005603623 nova_compute[226235]: 2026-01-31 08:33:03.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:03 np0005603623 nova_compute[226235]: 2026-01-31 08:33:03.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:03.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:04 np0005603623 nova_compute[226235]: 2026-01-31 08:33:04.020 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848369.0183258, a15175ec-85fd-457c-870b-8a6d7c13c906 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:33:04 np0005603623 nova_compute[226235]: 2026-01-31 08:33:04.020 226239 INFO nova.compute.manager [-] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:33:04 np0005603623 nova_compute[226235]: 2026-01-31 08:33:04.102 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:04 np0005603623 nova_compute[226235]: 2026-01-31 08:33:04.120 226239 DEBUG nova.compute.manager [None req-a14c110c-a536-4c8f-a2ec-127fa6c27b57 - - - - - -] [instance: a15175ec-85fd-457c-870b-8a6d7c13c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:33:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:04 np0005603623 nova_compute[226235]: 2026-01-31 08:33:04.827 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:05 np0005603623 nova_compute[226235]: 2026-01-31 08:33:05.054 226239 DEBUG nova.network.neutron [req-2371e9bb-7945-45b3-8b35-edbc2e21ea8c req-72a8deee-2817-40d9-8cd0-be962e843de0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updated VIF entry in instance network info cache for port 5c8118c5-4238-4d06-99ff-6e0b763563c7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:33:05 np0005603623 nova_compute[226235]: 2026-01-31 08:33:05.054 226239 DEBUG nova.network.neutron [req-2371e9bb-7945-45b3-8b35-edbc2e21ea8c req-72a8deee-2817-40d9-8cd0-be962e843de0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updating instance_info_cache with network_info: [{"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:05.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:05 np0005603623 nova_compute[226235]: 2026-01-31 08:33:05.115 226239 DEBUG oslo_concurrency.lockutils [req-2371e9bb-7945-45b3-8b35-edbc2e21ea8c req-72a8deee-2817-40d9-8cd0-be962e843de0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:33:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:05.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:07.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:07.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:08 np0005603623 nova_compute[226235]: 2026-01-31 08:33:08.031 226239 DEBUG nova.compute.manager [req-48de3124-8d51-47fc-b95e-e3ba0b470dd0 req-d9cedaf6-d144-4825-bd4f-19cc9e11c3ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-changed-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:08 np0005603623 nova_compute[226235]: 2026-01-31 08:33:08.032 226239 DEBUG nova.compute.manager [req-48de3124-8d51-47fc-b95e-e3ba0b470dd0 req-d9cedaf6-d144-4825-bd4f-19cc9e11c3ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Refreshing instance network info cache due to event network-changed-5c8118c5-4238-4d06-99ff-6e0b763563c7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:33:08 np0005603623 nova_compute[226235]: 2026-01-31 08:33:08.032 226239 DEBUG oslo_concurrency.lockutils [req-48de3124-8d51-47fc-b95e-e3ba0b470dd0 req-d9cedaf6-d144-4825-bd4f-19cc9e11c3ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:33:08 np0005603623 nova_compute[226235]: 2026-01-31 08:33:08.032 226239 DEBUG oslo_concurrency.lockutils [req-48de3124-8d51-47fc-b95e-e3ba0b470dd0 req-d9cedaf6-d144-4825-bd4f-19cc9e11c3ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:33:08 np0005603623 nova_compute[226235]: 2026-01-31 08:33:08.032 226239 DEBUG nova.network.neutron [req-48de3124-8d51-47fc-b95e-e3ba0b470dd0 req-d9cedaf6-d144-4825-bd4f-19cc9e11c3ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Refreshing network info cache for port 5c8118c5-4238-4d06-99ff-6e0b763563c7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.044 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.044 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:33:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:09.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.106 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.126 226239 DEBUG nova.compute.manager [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:09.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.235 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.235 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.241 226239 DEBUG nova.virt.hardware [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.241 226239 INFO nova.compute.claims [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:33:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:33:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:33:09 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.553 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.861 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:33:09 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/241069967' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.968 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:09 np0005603623 nova_compute[226235]: 2026-01-31 08:33:09.976 226239 DEBUG nova.compute.provider_tree [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.269 226239 DEBUG nova.scheduler.client.report [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.342 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.343 226239 DEBUG nova.compute.manager [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.448 226239 DEBUG nova.compute.manager [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.448 226239 DEBUG nova.network.neutron [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.477 226239 INFO nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.510 226239 DEBUG nova.compute.manager [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.633 226239 DEBUG nova.compute.manager [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.634 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.634 226239 INFO nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Creating image(s)#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.654 226239 DEBUG nova.storage.rbd_utils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.677 226239 DEBUG nova.storage.rbd_utils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.699 226239 DEBUG nova.storage.rbd_utils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.703 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.785 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.787 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.788 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.788 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.817 226239 DEBUG nova.storage.rbd_utils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:10 np0005603623 nova_compute[226235]: 2026-01-31 08:33:10.821 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:11 np0005603623 nova_compute[226235]: 2026-01-31 08:33:11.086 226239 DEBUG nova.network.neutron [req-48de3124-8d51-47fc-b95e-e3ba0b470dd0 req-d9cedaf6-d144-4825-bd4f-19cc9e11c3ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updated VIF entry in instance network info cache for port 5c8118c5-4238-4d06-99ff-6e0b763563c7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:33:11 np0005603623 nova_compute[226235]: 2026-01-31 08:33:11.087 226239 DEBUG nova.network.neutron [req-48de3124-8d51-47fc-b95e-e3ba0b470dd0 req-d9cedaf6-d144-4825-bd4f-19cc9e11c3ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updating instance_info_cache with network_info: [{"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:11.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:11 np0005603623 nova_compute[226235]: 2026-01-31 08:33:11.195 226239 DEBUG nova.policy [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef51681d234a4abc88ff433d0640b6e7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '953a213fa5cb435ab3c04ad96152685f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:33:11 np0005603623 nova_compute[226235]: 2026-01-31 08:33:11.200 226239 DEBUG oslo_concurrency.lockutils [req-48de3124-8d51-47fc-b95e-e3ba0b470dd0 req-d9cedaf6-d144-4825-bd4f-19cc9e11c3ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4e0408c0-b1a7-4079-ba79-3e737fded2ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:33:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:33:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:11.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:33:12 np0005603623 nova_compute[226235]: 2026-01-31 08:33:12.114 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.294s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:12 np0005603623 nova_compute[226235]: 2026-01-31 08:33:12.187 226239 DEBUG nova.storage.rbd_utils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] resizing rbd image adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:33:12 np0005603623 nova_compute[226235]: 2026-01-31 08:33:12.463 226239 DEBUG nova.objects.instance [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'migration_context' on Instance uuid adfc4c25-9eb9-45cc-ac90-2029677bcb67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:33:12 np0005603623 nova_compute[226235]: 2026-01-31 08:33:12.485 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:33:12 np0005603623 nova_compute[226235]: 2026-01-31 08:33:12.486 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Ensure instance console log exists: /var/lib/nova/instances/adfc4c25-9eb9-45cc-ac90-2029677bcb67/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:33:12 np0005603623 nova_compute[226235]: 2026-01-31 08:33:12.487 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:12 np0005603623 nova_compute[226235]: 2026-01-31 08:33:12.487 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:12 np0005603623 nova_compute[226235]: 2026-01-31 08:33:12.487 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:13.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:13.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:14 np0005603623 nova_compute[226235]: 2026-01-31 08:33:14.108 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:33:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3811464561' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:33:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:33:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2551216714' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:33:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:33:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2551216714' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:33:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:14.756 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:33:14 np0005603623 nova_compute[226235]: 2026-01-31 08:33:14.757 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:14.758 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:33:14 np0005603623 nova_compute[226235]: 2026-01-31 08:33:14.863 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:15.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:15 np0005603623 nova_compute[226235]: 2026-01-31 08:33:15.133 226239 DEBUG nova.network.neutron [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Successfully created port: ae035cfb-a17b-4578-a506-e2581da09f74 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:33:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:15.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:15 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:33:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:15.760 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.667 226239 DEBUG nova.network.neutron [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Successfully updated port: ae035cfb-a17b-4578-a506-e2581da09f74 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.687 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.687 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquired lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.688 226239 DEBUG nova.network.neutron [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.810 226239 DEBUG oslo_concurrency.lockutils [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquiring lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.811 226239 DEBUG oslo_concurrency.lockutils [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.811 226239 DEBUG oslo_concurrency.lockutils [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquiring lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.811 226239 DEBUG oslo_concurrency.lockutils [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.811 226239 DEBUG oslo_concurrency.lockutils [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.813 226239 INFO nova.compute.manager [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Terminating instance#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.814 226239 DEBUG nova.compute.manager [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:33:16 np0005603623 kernel: tap5c8118c5-42 (unregistering): left promiscuous mode
Jan 31 03:33:16 np0005603623 NetworkManager[48970]: <info>  [1769848396.8913] device (tap5c8118c5-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:33:16 np0005603623 ovn_controller[133449]: 2026-01-31T08:33:16Z|00557|binding|INFO|Releasing lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 from this chassis (sb_readonly=0)
Jan 31 03:33:16 np0005603623 ovn_controller[133449]: 2026-01-31T08:33:16Z|00558|binding|INFO|Setting lport 5c8118c5-4238-4d06-99ff-6e0b763563c7 down in Southbound
Jan 31 03:33:16 np0005603623 ovn_controller[133449]: 2026-01-31T08:33:16Z|00559|binding|INFO|Removing iface tap5c8118c5-42 ovn-installed in OVS
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.898 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.906 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:16.914 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:ba:04 10.100.0.3'], port_security=['fa:16:3e:fb:ba:04 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '4e0408c0-b1a7-4079-ba79-3e737fded2ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b6a7de05649d42c6acb1aa6e6026b2b4', 'neutron:revision_number': '8', 'neutron:security_group_ids': '8a3bb86b-ceb7-476c-96e2-ee30ea8ecd63', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41049c6c-e208-4c6a-ad10-15df89677733, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=5c8118c5-4238-4d06-99ff-6e0b763563c7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:33:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:16.916 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 5c8118c5-4238-4d06-99ff-6e0b763563c7 in datapath 1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7 unbound from our chassis#033[00m
Jan 31 03:33:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:16.917 143258 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 31 03:33:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:16.918 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d7befad6-7ac5-46bb-b75d-744eb91194ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:16 np0005603623 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000083.scope: Deactivated successfully.
Jan 31 03:33:16 np0005603623 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000083.scope: Consumed 13.964s CPU time.
Jan 31 03:33:16 np0005603623 systemd-machined[194379]: Machine qemu-61-instance-00000083 terminated.
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.990 226239 DEBUG nova.compute.manager [req-ca1daed6-d86f-42b6-a480-6b4fa24d19b8 req-74b7c75e-5408-403c-8a52-50021f523ffd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Received event network-changed-ae035cfb-a17b-4578-a506-e2581da09f74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.991 226239 DEBUG nova.compute.manager [req-ca1daed6-d86f-42b6-a480-6b4fa24d19b8 req-74b7c75e-5408-403c-8a52-50021f523ffd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Refreshing instance network info cache due to event network-changed-ae035cfb-a17b-4578-a506-e2581da09f74. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:33:16 np0005603623 nova_compute[226235]: 2026-01-31 08:33:16.991 226239 DEBUG oslo_concurrency.lockutils [req-ca1daed6-d86f-42b6-a480-6b4fa24d19b8 req-74b7c75e-5408-403c-8a52-50021f523ffd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.016 226239 DEBUG nova.network.neutron [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.035 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.039 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.055 226239 INFO nova.virt.libvirt.driver [-] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Instance destroyed successfully.#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.056 226239 DEBUG nova.objects.instance [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lazy-loading 'resources' on Instance uuid 4e0408c0-b1a7-4079-ba79-3e737fded2ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.082 226239 DEBUG nova.virt.libvirt.vif [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:31:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-989214032',display_name='tempest-ServerRescueTestJSONUnderV235-server-989214032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-989214032',id=131,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:32:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b6a7de05649d42c6acb1aa6e6026b2b4',ramdisk_id='',reservation_id='r-muxmeooe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1941698863',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1941698863-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:32:42Z,user_data=None,user_id='d91ac41a8e444974a11ffbef7b04ddb3',uuid=4e0408c0-b1a7-4079-ba79-3e737fded2ea,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.083 226239 DEBUG nova.network.os_vif_util [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Converting VIF {"id": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "address": "fa:16:3e:fb:ba:04", "network": {"id": "1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1418094668-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b6a7de05649d42c6acb1aa6e6026b2b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c8118c5-42", "ovs_interfaceid": "5c8118c5-4238-4d06-99ff-6e0b763563c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.083 226239 DEBUG nova.network.os_vif_util [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fb:ba:04,bridge_name='br-int',has_traffic_filtering=True,id=5c8118c5-4238-4d06-99ff-6e0b763563c7,network=Network(1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c8118c5-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.084 226239 DEBUG os_vif [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fb:ba:04,bridge_name='br-int',has_traffic_filtering=True,id=5c8118c5-4238-4d06-99ff-6e0b763563c7,network=Network(1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c8118c5-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.085 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.085 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c8118c5-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.087 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.089 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.091 226239 INFO os_vif [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fb:ba:04,bridge_name='br-int',has_traffic_filtering=True,id=5c8118c5-4238-4d06-99ff-6e0b763563c7,network=Network(1b5fcbd8-aaa4-4f62-83f3-2dbfdad665d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c8118c5-42')#033[00m
Jan 31 03:33:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:33:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:17.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:33:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:17.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.323 226239 DEBUG nova.compute.manager [req-78b7e45c-0c04-4794-85cd-b5a32cfa0b47 req-49b12ee5-a7d7-4981-958b-9c33d459606d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-vif-unplugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.323 226239 DEBUG oslo_concurrency.lockutils [req-78b7e45c-0c04-4794-85cd-b5a32cfa0b47 req-49b12ee5-a7d7-4981-958b-9c33d459606d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.324 226239 DEBUG oslo_concurrency.lockutils [req-78b7e45c-0c04-4794-85cd-b5a32cfa0b47 req-49b12ee5-a7d7-4981-958b-9c33d459606d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.324 226239 DEBUG oslo_concurrency.lockutils [req-78b7e45c-0c04-4794-85cd-b5a32cfa0b47 req-49b12ee5-a7d7-4981-958b-9c33d459606d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.324 226239 DEBUG nova.compute.manager [req-78b7e45c-0c04-4794-85cd-b5a32cfa0b47 req-49b12ee5-a7d7-4981-958b-9c33d459606d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] No waiting events found dispatching network-vif-unplugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:33:17 np0005603623 nova_compute[226235]: 2026-01-31 08:33:17.324 226239 DEBUG nova.compute.manager [req-78b7e45c-0c04-4794-85cd-b5a32cfa0b47 req-49b12ee5-a7d7-4981-958b-9c33d459606d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-vif-unplugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.572 226239 DEBUG nova.network.neutron [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Updating instance_info_cache with network_info: [{"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.596 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Releasing lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.596 226239 DEBUG nova.compute.manager [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Instance network_info: |[{"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.597 226239 DEBUG oslo_concurrency.lockutils [req-ca1daed6-d86f-42b6-a480-6b4fa24d19b8 req-74b7c75e-5408-403c-8a52-50021f523ffd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.597 226239 DEBUG nova.network.neutron [req-ca1daed6-d86f-42b6-a480-6b4fa24d19b8 req-74b7c75e-5408-403c-8a52-50021f523ffd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Refreshing network info cache for port ae035cfb-a17b-4578-a506-e2581da09f74 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.600 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Start _get_guest_xml network_info=[{"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.607 226239 WARNING nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.612 226239 DEBUG nova.virt.libvirt.host [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.613 226239 DEBUG nova.virt.libvirt.host [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.616 226239 DEBUG nova.virt.libvirt.host [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.617 226239 DEBUG nova.virt.libvirt.host [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.617 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.618 226239 DEBUG nova.virt.hardware [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.618 226239 DEBUG nova.virt.hardware [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.618 226239 DEBUG nova.virt.hardware [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.618 226239 DEBUG nova.virt.hardware [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.619 226239 DEBUG nova.virt.hardware [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.619 226239 DEBUG nova.virt.hardware [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.619 226239 DEBUG nova.virt.hardware [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.619 226239 DEBUG nova.virt.hardware [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.619 226239 DEBUG nova.virt.hardware [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.620 226239 DEBUG nova.virt.hardware [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.620 226239 DEBUG nova.virt.hardware [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:33:18 np0005603623 nova_compute[226235]: 2026-01-31 08:33:18.622 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:33:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3250369912' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.053 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.075 226239 DEBUG nova.storage.rbd_utils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.079 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:19.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:19.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.468 226239 DEBUG nova.compute.manager [req-5b7fab81-2dca-4805-89ea-d3eae90fdf04 req-bf1eaf7d-16a4-4891-8369-752b9852a3d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.469 226239 DEBUG oslo_concurrency.lockutils [req-5b7fab81-2dca-4805-89ea-d3eae90fdf04 req-bf1eaf7d-16a4-4891-8369-752b9852a3d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.469 226239 DEBUG oslo_concurrency.lockutils [req-5b7fab81-2dca-4805-89ea-d3eae90fdf04 req-bf1eaf7d-16a4-4891-8369-752b9852a3d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.469 226239 DEBUG oslo_concurrency.lockutils [req-5b7fab81-2dca-4805-89ea-d3eae90fdf04 req-bf1eaf7d-16a4-4891-8369-752b9852a3d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.469 226239 DEBUG nova.compute.manager [req-5b7fab81-2dca-4805-89ea-d3eae90fdf04 req-bf1eaf7d-16a4-4891-8369-752b9852a3d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] No waiting events found dispatching network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.470 226239 WARNING nova.compute.manager [req-5b7fab81-2dca-4805-89ea-d3eae90fdf04 req-bf1eaf7d-16a4-4891-8369-752b9852a3d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received unexpected event network-vif-plugged-5c8118c5-4238-4d06-99ff-6e0b763563c7 for instance with vm_state rescued and task_state deleting.#033[00m
Jan 31 03:33:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:33:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3952723352' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.612 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.533s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.614 226239 DEBUG nova.virt.libvirt.vif [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:33:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-966483760',display_name='tempest-ServerActionsTestOtherB-server-966483760',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-966483760',id=135,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-3ojgffxo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:33:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=adfc4c25-9eb9-45cc-ac90-2029677bcb67,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.614 226239 DEBUG nova.network.os_vif_util [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.615 226239 DEBUG nova.network.os_vif_util [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:5a:60,bridge_name='br-int',has_traffic_filtering=True,id=ae035cfb-a17b-4578-a506-e2581da09f74,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae035cfb-a1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.616 226239 DEBUG nova.objects.instance [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'pci_devices' on Instance uuid adfc4c25-9eb9-45cc-ac90-2029677bcb67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.643 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  <uuid>adfc4c25-9eb9-45cc-ac90-2029677bcb67</uuid>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  <name>instance-00000087</name>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsTestOtherB-server-966483760</nova:name>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:33:18</nova:creationTime>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <nova:user uuid="ef51681d234a4abc88ff433d0640b6e7">tempest-ServerActionsTestOtherB-1048458052-project-member</nova:user>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <nova:project uuid="953a213fa5cb435ab3c04ad96152685f">tempest-ServerActionsTestOtherB-1048458052</nova:project>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <nova:port uuid="ae035cfb-a17b-4578-a506-e2581da09f74">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <entry name="serial">adfc4c25-9eb9-45cc-ac90-2029677bcb67</entry>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <entry name="uuid">adfc4c25-9eb9-45cc-ac90-2029677bcb67</entry>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk.config">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:30:5a:60"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <target dev="tapae035cfb-a1"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/adfc4c25-9eb9-45cc-ac90-2029677bcb67/console.log" append="off"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:33:19 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:33:19 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:33:19 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:33:19 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.645 226239 DEBUG nova.compute.manager [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Preparing to wait for external event network-vif-plugged-ae035cfb-a17b-4578-a506-e2581da09f74 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.645 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.646 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.646 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.647 226239 DEBUG nova.virt.libvirt.vif [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:33:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-966483760',display_name='tempest-ServerActionsTestOtherB-server-966483760',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-966483760',id=135,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-3ojgffxo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:33:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=adfc4c25-9eb9-45cc-ac90-2029677bcb67,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.647 226239 DEBUG nova.network.os_vif_util [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.648 226239 DEBUG nova.network.os_vif_util [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:5a:60,bridge_name='br-int',has_traffic_filtering=True,id=ae035cfb-a17b-4578-a506-e2581da09f74,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae035cfb-a1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.648 226239 DEBUG os_vif [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:5a:60,bridge_name='br-int',has_traffic_filtering=True,id=ae035cfb-a17b-4578-a506-e2581da09f74,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae035cfb-a1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.649 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.649 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.649 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.652 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.652 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae035cfb-a1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.652 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapae035cfb-a1, col_values=(('external_ids', {'iface-id': 'ae035cfb-a17b-4578-a506-e2581da09f74', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:5a:60', 'vm-uuid': 'adfc4c25-9eb9-45cc-ac90-2029677bcb67'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.654 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:19 np0005603623 NetworkManager[48970]: <info>  [1769848399.6549] manager: (tapae035cfb-a1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.657 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.659 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.660 226239 INFO os_vif [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:5a:60,bridge_name='br-int',has_traffic_filtering=True,id=ae035cfb-a17b-4578-a506-e2581da09f74,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae035cfb-a1')#033[00m
Jan 31 03:33:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.748 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.749 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.749 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No VIF found with MAC fa:16:3e:30:5a:60, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.750 226239 INFO nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Using config drive#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.774 226239 DEBUG nova.storage.rbd_utils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:19 np0005603623 nova_compute[226235]: 2026-01-31 08:33:19.865 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.067 226239 INFO nova.virt.libvirt.driver [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Deleting instance files /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea_del#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.067 226239 INFO nova.virt.libvirt.driver [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Deletion of /var/lib/nova/instances/4e0408c0-b1a7-4079-ba79-3e737fded2ea_del complete#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.177 226239 INFO nova.compute.manager [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Took 3.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.178 226239 DEBUG oslo.service.loopingcall [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.178 226239 DEBUG nova.compute.manager [-] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.178 226239 DEBUG nova.network.neutron [-] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.623 226239 INFO nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Creating config drive at /var/lib/nova/instances/adfc4c25-9eb9-45cc-ac90-2029677bcb67/disk.config#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.628 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/adfc4c25-9eb9-45cc-ac90-2029677bcb67/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6q7xhrg9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.753 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/adfc4c25-9eb9-45cc-ac90-2029677bcb67/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6q7xhrg9" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.780 226239 DEBUG nova.storage.rbd_utils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.783 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/adfc4c25-9eb9-45cc-ac90-2029677bcb67/disk.config adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.930 226239 DEBUG oslo_concurrency.processutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/adfc4c25-9eb9-45cc-ac90-2029677bcb67/disk.config adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.931 226239 INFO nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Deleting local config drive /var/lib/nova/instances/adfc4c25-9eb9-45cc-ac90-2029677bcb67/disk.config because it was imported into RBD.#033[00m
Jan 31 03:33:20 np0005603623 kernel: tapae035cfb-a1: entered promiscuous mode
Jan 31 03:33:20 np0005603623 NetworkManager[48970]: <info>  [1769848400.9712] manager: (tapae035cfb-a1): new Tun device (/org/freedesktop/NetworkManager/Devices/263)
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.972 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:33:20Z|00560|binding|INFO|Claiming lport ae035cfb-a17b-4578-a506-e2581da09f74 for this chassis.
Jan 31 03:33:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:33:20Z|00561|binding|INFO|ae035cfb-a17b-4578-a506-e2581da09f74: Claiming fa:16:3e:30:5a:60 10.100.0.12
Jan 31 03:33:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:33:20Z|00562|binding|INFO|Setting lport ae035cfb-a17b-4578-a506-e2581da09f74 ovn-installed in OVS
Jan 31 03:33:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:20.979 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:5a:60 10.100.0.12'], port_security=['fa:16:3e:30:5a:60 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'adfc4c25-9eb9-45cc-ac90-2029677bcb67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44469d8b-ad30-4270-88fa-e67c568f3150', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '953a213fa5cb435ab3c04ad96152685f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5b1bd8ad-0d2a-4d57-a00a-9a6b59df86e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d972fb9d-6d12-4c1c-b135-704d64887b72, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ae035cfb-a17b-4578-a506-e2581da09f74) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:33:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:33:20Z|00563|binding|INFO|Setting lport ae035cfb-a17b-4578-a506-e2581da09f74 up in Southbound
Jan 31 03:33:20 np0005603623 nova_compute[226235]: 2026-01-31 08:33:20.980 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:20.981 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ae035cfb-a17b-4578-a506-e2581da09f74 in datapath 44469d8b-ad30-4270-88fa-e67c568f3150 bound to our chassis#033[00m
Jan 31 03:33:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:20.983 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44469d8b-ad30-4270-88fa-e67c568f3150#033[00m
Jan 31 03:33:20 np0005603623 systemd-udevd[287998]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:33:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:20.991 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6490a23e-2c5d-4cd6-912a-e8aad3f3a43d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:20.992 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44469d8b-a1 in ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:33:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:20.993 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44469d8b-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:33:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:20.994 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b11fb3e7-2f80-4a20-9203-2137ae56e9d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:20.994 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7c1157-5f7e-4c04-885b-9a8bb4a4f9a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 systemd-machined[194379]: New machine qemu-62-instance-00000087.
Jan 31 03:33:21 np0005603623 NetworkManager[48970]: <info>  [1769848401.0025] device (tapae035cfb-a1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:33:21 np0005603623 NetworkManager[48970]: <info>  [1769848401.0031] device (tapae035cfb-a1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.003 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[bc2a60d8-995d-4a07-b99f-1fb89da70ca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 systemd[1]: Started Virtual Machine qemu-62-instance-00000087.
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.013 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6e3bf46b-7257-4674-bb22-3df8b79648b2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.031 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[8eed5096-9521-415a-ad98-68ffe63c6010]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.035 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7fbe5cd7-9f2f-4d57-9f6e-d727329227f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 NetworkManager[48970]: <info>  [1769848401.0368] manager: (tap44469d8b-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/264)
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.051 226239 DEBUG nova.network.neutron [req-ca1daed6-d86f-42b6-a480-6b4fa24d19b8 req-74b7c75e-5408-403c-8a52-50021f523ffd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Updated VIF entry in instance network info cache for port ae035cfb-a17b-4578-a506-e2581da09f74. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.051 226239 DEBUG nova.network.neutron [req-ca1daed6-d86f-42b6-a480-6b4fa24d19b8 req-74b7c75e-5408-403c-8a52-50021f523ffd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Updating instance_info_cache with network_info: [{"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.054 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4b7a9c-0a6f-43b9-b076-68dd586699d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.057 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[fa3c8d00-09e8-491d-9bc4-8ae716f24bc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 NetworkManager[48970]: <info>  [1769848401.0719] device (tap44469d8b-a0): carrier: link connected
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.076 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4c45cec8-e012-4d7f-aa59-a74736a2620f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.077 226239 DEBUG oslo_concurrency.lockutils [req-ca1daed6-d86f-42b6-a480-6b4fa24d19b8 req-74b7c75e-5408-403c-8a52-50021f523ffd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.088 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[069cc3ad-faff-4b56-a828-a4924413be86]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44469d8b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:98:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 165], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 754150, 'reachable_time': 23106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288032, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.100 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f5c782fd-4adb-4c1a-98bd-ce0675481cc4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:9820'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 754150, 'tstamp': 754150}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288033, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.112 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d79d7dc6-5727-4c93-aa2c-270aa1bd8009]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44469d8b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:98:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 165], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 754150, 'reachable_time': 23106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 288034, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:33:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:21.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.132 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b2e2c5ac-6da1-42a6-89be-8d9eea108097]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.172 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fd6bdd86-ac58-4226-bdae-6184d4e761d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.173 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44469d8b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.173 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.173 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44469d8b-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:21 np0005603623 NetworkManager[48970]: <info>  [1769848401.1754] manager: (tap44469d8b-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Jan 31 03:33:21 np0005603623 kernel: tap44469d8b-a0: entered promiscuous mode
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.178 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44469d8b-a0, col_values=(('external_ids', {'iface-id': '7e288124-e200-4c03-8a4a-baab3e3f3d7a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:33:21Z|00564|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.184 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.183 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.185 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fea521dc-f48d-400f-89dd-e98053da6160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.185 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-44469d8b-ad30-4270-88fa-e67c568f3150
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 44469d8b-ad30-4270-88fa-e67c568f3150
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:33:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:21.186 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'env', 'PROCESS_TAG=haproxy-44469d8b-ad30-4270-88fa-e67c568f3150', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44469d8b-ad30-4270-88fa-e67c568f3150.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:33:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:21.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.259 226239 DEBUG nova.network.neutron [-] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.474 226239 DEBUG nova.compute.manager [req-6ff0f909-b03a-49fa-9b18-1ed8cc5d4f16 req-a9ed17c2-8a3c-4346-b34f-cfd4df74786d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Received event network-vif-deleted-5c8118c5-4238-4d06-99ff-6e0b763563c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.475 226239 INFO nova.compute.manager [req-6ff0f909-b03a-49fa-9b18-1ed8cc5d4f16 req-a9ed17c2-8a3c-4346-b34f-cfd4df74786d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Neutron deleted interface 5c8118c5-4238-4d06-99ff-6e0b763563c7; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.476 226239 DEBUG nova.network.neutron [req-6ff0f909-b03a-49fa-9b18-1ed8cc5d4f16 req-a9ed17c2-8a3c-4346-b34f-cfd4df74786d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.480 226239 DEBUG nova.compute.manager [req-15e89004-eba3-4d0c-821b-e95814147205 req-edee3d76-65c3-4891-87f0-6c98090767ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Received event network-vif-plugged-ae035cfb-a17b-4578-a506-e2581da09f74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.480 226239 DEBUG oslo_concurrency.lockutils [req-15e89004-eba3-4d0c-821b-e95814147205 req-edee3d76-65c3-4891-87f0-6c98090767ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.481 226239 DEBUG oslo_concurrency.lockutils [req-15e89004-eba3-4d0c-821b-e95814147205 req-edee3d76-65c3-4891-87f0-6c98090767ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.482 226239 DEBUG oslo_concurrency.lockutils [req-15e89004-eba3-4d0c-821b-e95814147205 req-edee3d76-65c3-4891-87f0-6c98090767ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.482 226239 DEBUG nova.compute.manager [req-15e89004-eba3-4d0c-821b-e95814147205 req-edee3d76-65c3-4891-87f0-6c98090767ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Processing event network-vif-plugged-ae035cfb-a17b-4578-a506-e2581da09f74 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:33:21 np0005603623 podman[288066]: 2026-01-31 08:33:21.513001443 +0000 UTC m=+0.053258136 container create ae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 03:33:21 np0005603623 systemd[1]: Started libpod-conmon-ae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059.scope.
Jan 31 03:33:21 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:33:21 np0005603623 podman[288066]: 2026-01-31 08:33:21.48219234 +0000 UTC m=+0.022449133 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:33:21 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/267a114d746ed6c5571ecf39454d72bed5d1100967c5b2995bf4370d8ee8064a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:33:21 np0005603623 podman[288066]: 2026-01-31 08:33:21.5884178 +0000 UTC m=+0.128674523 container init ae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:33:21 np0005603623 podman[288066]: 2026-01-31 08:33:21.593482988 +0000 UTC m=+0.133739691 container start ae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:33:21 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[288098]: [NOTICE]   (288120) : New worker (288124) forked
Jan 31 03:33:21 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[288098]: [NOTICE]   (288120) : Loading success.
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.704 226239 DEBUG nova.compute.manager [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.706 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848401.7040596, adfc4c25-9eb9-45cc-ac90-2029677bcb67 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.706 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] VM Started (Lifecycle Event)#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.709 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.712 226239 INFO nova.virt.libvirt.driver [-] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Instance spawned successfully.#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.713 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:33:21 np0005603623 nova_compute[226235]: 2026-01-31 08:33:21.889 226239 INFO nova.compute.manager [-] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Took 1.71 seconds to deallocate network for instance.#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.098 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.100 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.100 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.101 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.101 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.101 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.102 226239 DEBUG nova.virt.libvirt.driver [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.105 226239 DEBUG nova.compute.manager [req-6ff0f909-b03a-49fa-9b18-1ed8cc5d4f16 req-a9ed17c2-8a3c-4346-b34f-cfd4df74786d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Detach interface failed, port_id=5c8118c5-4238-4d06-99ff-6e0b763563c7, reason: Instance 4e0408c0-b1a7-4079-ba79-3e737fded2ea could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.107 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.316 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.317 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848401.705398, adfc4c25-9eb9-45cc-ac90-2029677bcb67 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.317 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.560 226239 INFO nova.compute.manager [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Took 11.93 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.561 226239 DEBUG nova.compute.manager [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.583 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.587 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848401.7088716, adfc4c25-9eb9-45cc-ac90-2029677bcb67 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.588 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.612 226239 DEBUG oslo_concurrency.lockutils [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.612 226239 DEBUG oslo_concurrency.lockutils [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.704 226239 DEBUG oslo_concurrency.processutils [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.838 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.842 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:33:22 np0005603623 nova_compute[226235]: 2026-01-31 08:33:22.904 226239 INFO nova.compute.manager [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Took 13.70 seconds to build instance.#033[00m
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.009 226239 DEBUG oslo_concurrency.lockutils [None req-344fb366-f201-4ef9-8bb3-18bdb783b33b ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:33:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/185406479' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:33:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:23.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.143 226239 DEBUG oslo_concurrency.processutils [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.149 226239 DEBUG nova.compute.provider_tree [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.171 226239 DEBUG nova.scheduler.client.report [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.227 226239 DEBUG oslo_concurrency.lockutils [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:23.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.284 226239 INFO nova.scheduler.client.report [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Deleted allocations for instance 4e0408c0-b1a7-4079-ba79-3e737fded2ea#033[00m
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.370 226239 DEBUG oslo_concurrency.lockutils [None req-428d37e0-36fd-4ba8-ab45-e4d5dacfeaab d91ac41a8e444974a11ffbef7b04ddb3 b6a7de05649d42c6acb1aa6e6026b2b4 - - default default] Lock "4e0408c0-b1a7-4079-ba79-3e737fded2ea" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.635 226239 DEBUG nova.compute.manager [req-70469dc3-7ff6-4f1a-b079-b56fed137398 req-4840cf22-ad3d-482d-914b-76c0bcea22d4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Received event network-vif-plugged-ae035cfb-a17b-4578-a506-e2581da09f74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.635 226239 DEBUG oslo_concurrency.lockutils [req-70469dc3-7ff6-4f1a-b079-b56fed137398 req-4840cf22-ad3d-482d-914b-76c0bcea22d4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.636 226239 DEBUG oslo_concurrency.lockutils [req-70469dc3-7ff6-4f1a-b079-b56fed137398 req-4840cf22-ad3d-482d-914b-76c0bcea22d4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.636 226239 DEBUG oslo_concurrency.lockutils [req-70469dc3-7ff6-4f1a-b079-b56fed137398 req-4840cf22-ad3d-482d-914b-76c0bcea22d4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.636 226239 DEBUG nova.compute.manager [req-70469dc3-7ff6-4f1a-b079-b56fed137398 req-4840cf22-ad3d-482d-914b-76c0bcea22d4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] No waiting events found dispatching network-vif-plugged-ae035cfb-a17b-4578-a506-e2581da09f74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:33:23 np0005603623 nova_compute[226235]: 2026-01-31 08:33:23.637 226239 WARNING nova.compute.manager [req-70469dc3-7ff6-4f1a-b079-b56fed137398 req-4840cf22-ad3d-482d-914b-76c0bcea22d4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Received unexpected event network-vif-plugged-ae035cfb-a17b-4578-a506-e2581da09f74 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:33:24 np0005603623 nova_compute[226235]: 2026-01-31 08:33:24.656 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:24 np0005603623 nova_compute[226235]: 2026-01-31 08:33:24.867 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:25.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:33:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:25.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:33:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:33:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:27.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:33:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:27.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:29.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:29.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:29 np0005603623 nova_compute[226235]: 2026-01-31 08:33:29.520 226239 DEBUG nova.compute.manager [req-de9e1490-f57a-4b4e-83bc-d56e2b453068 req-de7dad74-ca6f-4a06-b5ec-52ff5f42b66a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Received event network-changed-ae035cfb-a17b-4578-a506-e2581da09f74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:29 np0005603623 nova_compute[226235]: 2026-01-31 08:33:29.521 226239 DEBUG nova.compute.manager [req-de9e1490-f57a-4b4e-83bc-d56e2b453068 req-de7dad74-ca6f-4a06-b5ec-52ff5f42b66a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Refreshing instance network info cache due to event network-changed-ae035cfb-a17b-4578-a506-e2581da09f74. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:33:29 np0005603623 nova_compute[226235]: 2026-01-31 08:33:29.521 226239 DEBUG oslo_concurrency.lockutils [req-de9e1490-f57a-4b4e-83bc-d56e2b453068 req-de7dad74-ca6f-4a06-b5ec-52ff5f42b66a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:33:29 np0005603623 nova_compute[226235]: 2026-01-31 08:33:29.521 226239 DEBUG oslo_concurrency.lockutils [req-de9e1490-f57a-4b4e-83bc-d56e2b453068 req-de7dad74-ca6f-4a06-b5ec-52ff5f42b66a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:33:29 np0005603623 nova_compute[226235]: 2026-01-31 08:33:29.521 226239 DEBUG nova.network.neutron [req-de9e1490-f57a-4b4e-83bc-d56e2b453068 req-de7dad74-ca6f-4a06-b5ec-52ff5f42b66a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Refreshing network info cache for port ae035cfb-a17b-4578-a506-e2581da09f74 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:33:29 np0005603623 nova_compute[226235]: 2026-01-31 08:33:29.659 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:29 np0005603623 nova_compute[226235]: 2026-01-31 08:33:29.869 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:30.125 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:30.126 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:33:30.126 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:31.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:31.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:31 np0005603623 ovn_controller[133449]: 2026-01-31T08:33:31Z|00565|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:33:31 np0005603623 nova_compute[226235]: 2026-01-31 08:33:31.550 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:31 np0005603623 podman[288217]: 2026-01-31 08:33:31.96214516 +0000 UTC m=+0.048658052 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 03:33:32 np0005603623 nova_compute[226235]: 2026-01-31 08:33:32.004 226239 DEBUG nova.network.neutron [req-de9e1490-f57a-4b4e-83bc-d56e2b453068 req-de7dad74-ca6f-4a06-b5ec-52ff5f42b66a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Updated VIF entry in instance network info cache for port ae035cfb-a17b-4578-a506-e2581da09f74. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:33:32 np0005603623 nova_compute[226235]: 2026-01-31 08:33:32.005 226239 DEBUG nova.network.neutron [req-de9e1490-f57a-4b4e-83bc-d56e2b453068 req-de7dad74-ca6f-4a06-b5ec-52ff5f42b66a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Updating instance_info_cache with network_info: [{"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:32 np0005603623 podman[288218]: 2026-01-31 08:33:32.042175601 +0000 UTC m=+0.128067434 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:33:32 np0005603623 nova_compute[226235]: 2026-01-31 08:33:32.054 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848397.053217, 4e0408c0-b1a7-4079-ba79-3e737fded2ea => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:33:32 np0005603623 nova_compute[226235]: 2026-01-31 08:33:32.055 226239 INFO nova.compute.manager [-] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:33:32 np0005603623 nova_compute[226235]: 2026-01-31 08:33:32.290 226239 DEBUG nova.compute.manager [None req-2067d11d-c984-4678-8ce1-022c8972d213 - - - - - -] [instance: 4e0408c0-b1a7-4079-ba79-3e737fded2ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:33:32 np0005603623 nova_compute[226235]: 2026-01-31 08:33:32.523 226239 DEBUG oslo_concurrency.lockutils [req-de9e1490-f57a-4b4e-83bc-d56e2b453068 req-de7dad74-ca6f-4a06-b5ec-52ff5f42b66a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:33:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:33.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:33.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:33 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #54. Immutable memtables: 10.
Jan 31 03:33:34 np0005603623 nova_compute[226235]: 2026-01-31 08:33:34.661 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:34 np0005603623 nova_compute[226235]: 2026-01-31 08:33:34.870 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:35.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:35.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:36 np0005603623 ovn_controller[133449]: 2026-01-31T08:33:36Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:30:5a:60 10.100.0.12
Jan 31 03:33:36 np0005603623 ovn_controller[133449]: 2026-01-31T08:33:36Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:30:5a:60 10.100.0.12
Jan 31 03:33:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:37.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:37.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:39.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:33:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:39.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:33:39 np0005603623 nova_compute[226235]: 2026-01-31 08:33:39.664 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:39 np0005603623 nova_compute[226235]: 2026-01-31 08:33:39.871 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:41.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:41.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:33:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:43.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:33:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:43.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:44 np0005603623 nova_compute[226235]: 2026-01-31 08:33:44.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:44 np0005603623 nova_compute[226235]: 2026-01-31 08:33:44.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:33:44 np0005603623 nova_compute[226235]: 2026-01-31 08:33:44.517 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:44 np0005603623 nova_compute[226235]: 2026-01-31 08:33:44.665 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:44 np0005603623 nova_compute[226235]: 2026-01-31 08:33:44.872 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:45.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:45.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:33:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1458537302' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:33:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:33:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1458537302' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:33:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:47.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:33:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:47.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:33:49 np0005603623 nova_compute[226235]: 2026-01-31 08:33:49.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:33:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:49.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:33:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:49.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:49 np0005603623 nova_compute[226235]: 2026-01-31 08:33:49.307 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:49 np0005603623 nova_compute[226235]: 2026-01-31 08:33:49.308 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:49 np0005603623 nova_compute[226235]: 2026-01-31 08:33:49.308 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:49 np0005603623 nova_compute[226235]: 2026-01-31 08:33:49.308 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:33:49 np0005603623 nova_compute[226235]: 2026-01-31 08:33:49.309 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:49 np0005603623 nova_compute[226235]: 2026-01-31 08:33:49.669 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:33:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4279800718' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:33:49 np0005603623 nova_compute[226235]: 2026-01-31 08:33:49.764 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:49 np0005603623 nova_compute[226235]: 2026-01-31 08:33:49.874 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:49 np0005603623 nova_compute[226235]: 2026-01-31 08:33:49.983 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000087 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:33:49 np0005603623 nova_compute[226235]: 2026-01-31 08:33:49.983 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000087 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:33:50 np0005603623 nova_compute[226235]: 2026-01-31 08:33:50.142 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:33:50 np0005603623 nova_compute[226235]: 2026-01-31 08:33:50.143 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4213MB free_disk=20.85165786743164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:33:50 np0005603623 nova_compute[226235]: 2026-01-31 08:33:50.143 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:50 np0005603623 nova_compute[226235]: 2026-01-31 08:33:50.143 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:50 np0005603623 nova_compute[226235]: 2026-01-31 08:33:50.278 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance adfc4c25-9eb9-45cc-ac90-2029677bcb67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:33:50 np0005603623 nova_compute[226235]: 2026-01-31 08:33:50.278 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:33:50 np0005603623 nova_compute[226235]: 2026-01-31 08:33:50.279 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:33:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:33:50 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4177923632' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:33:50 np0005603623 nova_compute[226235]: 2026-01-31 08:33:50.639 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:33:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4008828514' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:33:51 np0005603623 nova_compute[226235]: 2026-01-31 08:33:51.078 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:51 np0005603623 nova_compute[226235]: 2026-01-31 08:33:51.083 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:33:51 np0005603623 nova_compute[226235]: 2026-01-31 08:33:51.141 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:33:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:51.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:51.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:51 np0005603623 nova_compute[226235]: 2026-01-31 08:33:51.304 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:33:51 np0005603623 nova_compute[226235]: 2026-01-31 08:33:51.304 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:51 np0005603623 nova_compute[226235]: 2026-01-31 08:33:51.386 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:53.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:53.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:53 np0005603623 nova_compute[226235]: 2026-01-31 08:33:53.306 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:53 np0005603623 nova_compute[226235]: 2026-01-31 08:33:53.307 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:33:53 np0005603623 nova_compute[226235]: 2026-01-31 08:33:53.307 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:33:53 np0005603623 nova_compute[226235]: 2026-01-31 08:33:53.684 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:33:53 np0005603623 nova_compute[226235]: 2026-01-31 08:33:53.684 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:33:53 np0005603623 nova_compute[226235]: 2026-01-31 08:33:53.684 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:33:53 np0005603623 nova_compute[226235]: 2026-01-31 08:33:53.684 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid adfc4c25-9eb9-45cc-ac90-2029677bcb67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:33:54 np0005603623 nova_compute[226235]: 2026-01-31 08:33:54.594 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "058110ef-426a-46ab-8f57-d84a048d54be" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:54 np0005603623 nova_compute[226235]: 2026-01-31 08:33:54.594 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:54 np0005603623 nova_compute[226235]: 2026-01-31 08:33:54.649 226239 DEBUG nova.compute.manager [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:33:54 np0005603623 nova_compute[226235]: 2026-01-31 08:33:54.672 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:54 np0005603623 nova_compute[226235]: 2026-01-31 08:33:54.740 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:54 np0005603623 nova_compute[226235]: 2026-01-31 08:33:54.740 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:54 np0005603623 nova_compute[226235]: 2026-01-31 08:33:54.751 226239 DEBUG nova.virt.hardware [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:33:54 np0005603623 nova_compute[226235]: 2026-01-31 08:33:54.751 226239 INFO nova.compute.claims [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:33:54 np0005603623 nova_compute[226235]: 2026-01-31 08:33:54.877 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:54 np0005603623 nova_compute[226235]: 2026-01-31 08:33:54.979 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:55.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:55.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:33:55 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1689407442' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:33:55 np0005603623 nova_compute[226235]: 2026-01-31 08:33:55.411 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:55 np0005603623 nova_compute[226235]: 2026-01-31 08:33:55.417 226239 DEBUG nova.compute.provider_tree [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:33:55 np0005603623 nova_compute[226235]: 2026-01-31 08:33:55.461 226239 DEBUG nova.scheduler.client.report [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:33:55 np0005603623 nova_compute[226235]: 2026-01-31 08:33:55.545 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:55 np0005603623 nova_compute[226235]: 2026-01-31 08:33:55.546 226239 DEBUG nova.compute.manager [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:33:55 np0005603623 nova_compute[226235]: 2026-01-31 08:33:55.627 226239 DEBUG nova.compute.manager [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:33:55 np0005603623 nova_compute[226235]: 2026-01-31 08:33:55.627 226239 DEBUG nova.network.neutron [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:33:55 np0005603623 nova_compute[226235]: 2026-01-31 08:33:55.704 226239 INFO nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:33:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:33:55Z|00566|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:33:55 np0005603623 nova_compute[226235]: 2026-01-31 08:33:55.734 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:55 np0005603623 nova_compute[226235]: 2026-01-31 08:33:55.778 226239 DEBUG nova.compute.manager [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:33:55 np0005603623 nova_compute[226235]: 2026-01-31 08:33:55.855 226239 DEBUG nova.policy [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb3f20f0143d465ebfe98f6a13200890', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '40db421b27d84f809f8074c58151327f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.119 226239 DEBUG nova.compute.manager [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.122 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.122 226239 INFO nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Creating image(s)#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.156 226239 DEBUG nova.storage.rbd_utils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 058110ef-426a-46ab-8f57-d84a048d54be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.191 226239 DEBUG nova.storage.rbd_utils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 058110ef-426a-46ab-8f57-d84a048d54be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.221 226239 DEBUG nova.storage.rbd_utils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 058110ef-426a-46ab-8f57-d84a048d54be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.225 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.282 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.283 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.284 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.284 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.310 226239 DEBUG nova.storage.rbd_utils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 058110ef-426a-46ab-8f57-d84a048d54be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.316 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 058110ef-426a-46ab-8f57-d84a048d54be_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.381 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Updating instance_info_cache with network_info: [{"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.479 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.479 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.480 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.480 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.481 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:56 np0005603623 nova_compute[226235]: 2026-01-31 08:33:56.481 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:57 np0005603623 nova_compute[226235]: 2026-01-31 08:33:57.145 226239 DEBUG nova.network.neutron [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Successfully created port: acbebf00-067f-42fc-a3a1-50ffc7af9827 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:33:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:33:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:57.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:33:57 np0005603623 nova_compute[226235]: 2026-01-31 08:33:57.232 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 058110ef-426a-46ab-8f57-d84a048d54be_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.917s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:57.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:57 np0005603623 nova_compute[226235]: 2026-01-31 08:33:57.312 226239 DEBUG nova.storage.rbd_utils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] resizing rbd image 058110ef-426a-46ab-8f57-d84a048d54be_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:33:57 np0005603623 nova_compute[226235]: 2026-01-31 08:33:57.430 226239 DEBUG nova.objects.instance [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lazy-loading 'migration_context' on Instance uuid 058110ef-426a-46ab-8f57-d84a048d54be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:33:57 np0005603623 nova_compute[226235]: 2026-01-31 08:33:57.543 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:33:57 np0005603623 nova_compute[226235]: 2026-01-31 08:33:57.544 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Ensure instance console log exists: /var/lib/nova/instances/058110ef-426a-46ab-8f57-d84a048d54be/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:33:57 np0005603623 nova_compute[226235]: 2026-01-31 08:33:57.544 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:57 np0005603623 nova_compute[226235]: 2026-01-31 08:33:57.545 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:57 np0005603623 nova_compute[226235]: 2026-01-31 08:33:57.545 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:59 np0005603623 nova_compute[226235]: 2026-01-31 08:33:59.133 226239 DEBUG nova.network.neutron [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Successfully updated port: acbebf00-067f-42fc-a3a1-50ffc7af9827 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:33:59 np0005603623 nova_compute[226235]: 2026-01-31 08:33:59.169 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "refresh_cache-058110ef-426a-46ab-8f57-d84a048d54be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:33:59 np0005603623 nova_compute[226235]: 2026-01-31 08:33:59.169 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquired lock "refresh_cache-058110ef-426a-46ab-8f57-d84a048d54be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:33:59 np0005603623 nova_compute[226235]: 2026-01-31 08:33:59.169 226239 DEBUG nova.network.neutron [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:33:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:59.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:59 np0005603623 nova_compute[226235]: 2026-01-31 08:33:59.271 226239 DEBUG nova.compute.manager [req-28b052a2-c5d4-41ca-8981-2e81db6a048f req-d0f687b8-ed6e-4977-b7b0-8261ffb5a1e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Received event network-changed-acbebf00-067f-42fc-a3a1-50ffc7af9827 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:59 np0005603623 nova_compute[226235]: 2026-01-31 08:33:59.272 226239 DEBUG nova.compute.manager [req-28b052a2-c5d4-41ca-8981-2e81db6a048f req-d0f687b8-ed6e-4977-b7b0-8261ffb5a1e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Refreshing instance network info cache due to event network-changed-acbebf00-067f-42fc-a3a1-50ffc7af9827. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:33:59 np0005603623 nova_compute[226235]: 2026-01-31 08:33:59.273 226239 DEBUG oslo_concurrency.lockutils [req-28b052a2-c5d4-41ca-8981-2e81db6a048f req-d0f687b8-ed6e-4977-b7b0-8261ffb5a1e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-058110ef-426a-46ab-8f57-d84a048d54be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:33:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:33:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:59.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:59 np0005603623 nova_compute[226235]: 2026-01-31 08:33:59.325 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:59 np0005603623 nova_compute[226235]: 2026-01-31 08:33:59.344 226239 DEBUG nova.network.neutron [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:33:59 np0005603623 nova_compute[226235]: 2026-01-31 08:33:59.674 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:59 np0005603623 nova_compute[226235]: 2026-01-31 08:33:59.879 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.536 226239 DEBUG nova.network.neutron [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Updating instance_info_cache with network_info: [{"id": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "address": "fa:16:3e:76:70:ab", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacbebf00-06", "ovs_interfaceid": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.683 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Releasing lock "refresh_cache-058110ef-426a-46ab-8f57-d84a048d54be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.683 226239 DEBUG nova.compute.manager [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Instance network_info: |[{"id": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "address": "fa:16:3e:76:70:ab", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacbebf00-06", "ovs_interfaceid": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.683 226239 DEBUG oslo_concurrency.lockutils [req-28b052a2-c5d4-41ca-8981-2e81db6a048f req-d0f687b8-ed6e-4977-b7b0-8261ffb5a1e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-058110ef-426a-46ab-8f57-d84a048d54be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.684 226239 DEBUG nova.network.neutron [req-28b052a2-c5d4-41ca-8981-2e81db6a048f req-d0f687b8-ed6e-4977-b7b0-8261ffb5a1e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Refreshing network info cache for port acbebf00-067f-42fc-a3a1-50ffc7af9827 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.687 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Start _get_guest_xml network_info=[{"id": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "address": "fa:16:3e:76:70:ab", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacbebf00-06", "ovs_interfaceid": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.692 226239 WARNING nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.697 226239 DEBUG nova.virt.libvirt.host [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.698 226239 DEBUG nova.virt.libvirt.host [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.701 226239 DEBUG nova.virt.libvirt.host [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.702 226239 DEBUG nova.virt.libvirt.host [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.703 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.704 226239 DEBUG nova.virt.hardware [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.704 226239 DEBUG nova.virt.hardware [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.705 226239 DEBUG nova.virt.hardware [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.705 226239 DEBUG nova.virt.hardware [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.705 226239 DEBUG nova.virt.hardware [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.706 226239 DEBUG nova.virt.hardware [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.706 226239 DEBUG nova.virt.hardware [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.706 226239 DEBUG nova.virt.hardware [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.707 226239 DEBUG nova.virt.hardware [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.707 226239 DEBUG nova.virt.hardware [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.708 226239 DEBUG nova.virt.hardware [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 03:34:00 np0005603623 nova_compute[226235]: 2026-01-31 08:34:00.712 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:34:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:34:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1520244835' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.154 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:34:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:01.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.202 226239 DEBUG nova.storage.rbd_utils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 058110ef-426a-46ab-8f57-d84a048d54be_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.206 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:34:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:01.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:34:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3994869422' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.640 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.641 226239 DEBUG nova.virt.libvirt.vif [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-947719579',display_name='tempest-ServersTestJSON-server-947719579',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-947719579',id=138,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='40db421b27d84f809f8074c58151327f',ramdisk_id='',reservation_id='r-5oenbp4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1064072764',owner_user_name='tempest-ServersTestJSON-1064072764-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:33:55Z,user_data=None,user_id='fb3f20f0143d465ebfe98f6a13200890',uuid=058110ef-426a-46ab-8f57-d84a048d54be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "address": "fa:16:3e:76:70:ab", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacbebf00-06", "ovs_interfaceid": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.642 226239 DEBUG nova.network.os_vif_util [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converting VIF {"id": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "address": "fa:16:3e:76:70:ab", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacbebf00-06", "ovs_interfaceid": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.643 226239 DEBUG nova.network.os_vif_util [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:70:ab,bridge_name='br-int',has_traffic_filtering=True,id=acbebf00-067f-42fc-a3a1-50ffc7af9827,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacbebf00-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.644 226239 DEBUG nova.objects.instance [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lazy-loading 'pci_devices' on Instance uuid 058110ef-426a-46ab-8f57-d84a048d54be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.693 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  <uuid>058110ef-426a-46ab-8f57-d84a048d54be</uuid>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  <name>instance-0000008a</name>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersTestJSON-server-947719579</nova:name>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:34:00</nova:creationTime>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <nova:user uuid="fb3f20f0143d465ebfe98f6a13200890">tempest-ServersTestJSON-1064072764-project-member</nova:user>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <nova:project uuid="40db421b27d84f809f8074c58151327f">tempest-ServersTestJSON-1064072764</nova:project>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <nova:port uuid="acbebf00-067f-42fc-a3a1-50ffc7af9827">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <entry name="serial">058110ef-426a-46ab-8f57-d84a048d54be</entry>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <entry name="uuid">058110ef-426a-46ab-8f57-d84a048d54be</entry>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/058110ef-426a-46ab-8f57-d84a048d54be_disk">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/058110ef-426a-46ab-8f57-d84a048d54be_disk.config">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:76:70:ab"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <target dev="tapacbebf00-06"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/058110ef-426a-46ab-8f57-d84a048d54be/console.log" append="off"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:34:01 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:34:01 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:34:01 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:34:01 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.694 226239 DEBUG nova.compute.manager [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Preparing to wait for external event network-vif-plugged-acbebf00-067f-42fc-a3a1-50ffc7af9827 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.694 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "058110ef-426a-46ab-8f57-d84a048d54be-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.695 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.695 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.696 226239 DEBUG nova.virt.libvirt.vif [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-947719579',display_name='tempest-ServersTestJSON-server-947719579',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-947719579',id=138,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='40db421b27d84f809f8074c58151327f',ramdisk_id='',reservation_id='r-5oenbp4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1064072764',owner_user_name='tempest-ServersTestJSON-1064072764-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:33:55Z,user_data=None,user_id='fb3f20f0143d465ebfe98f6a13200890',uuid=058110ef-426a-46ab-8f57-d84a048d54be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "address": "fa:16:3e:76:70:ab", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacbebf00-06", "ovs_interfaceid": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.696 226239 DEBUG nova.network.os_vif_util [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converting VIF {"id": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "address": "fa:16:3e:76:70:ab", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacbebf00-06", "ovs_interfaceid": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.697 226239 DEBUG nova.network.os_vif_util [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:70:ab,bridge_name='br-int',has_traffic_filtering=True,id=acbebf00-067f-42fc-a3a1-50ffc7af9827,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacbebf00-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.697 226239 DEBUG os_vif [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:70:ab,bridge_name='br-int',has_traffic_filtering=True,id=acbebf00-067f-42fc-a3a1-50ffc7af9827,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacbebf00-06') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.697 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.698 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.698 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.701 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.701 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapacbebf00-06, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.702 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapacbebf00-06, col_values=(('external_ids', {'iface-id': 'acbebf00-067f-42fc-a3a1-50ffc7af9827', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:76:70:ab', 'vm-uuid': '058110ef-426a-46ab-8f57-d84a048d54be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.703 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:01 np0005603623 NetworkManager[48970]: <info>  [1769848441.7048] manager: (tapacbebf00-06): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.706 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.710 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.711 226239 INFO os_vif [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:70:ab,bridge_name='br-int',has_traffic_filtering=True,id=acbebf00-067f-42fc-a3a1-50ffc7af9827,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacbebf00-06')#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.790 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.790 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.791 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] No VIF found with MAC fa:16:3e:76:70:ab, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.791 226239 INFO nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Using config drive#033[00m
Jan 31 03:34:01 np0005603623 nova_compute[226235]: 2026-01-31 08:34:01.815 226239 DEBUG nova.storage.rbd_utils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 058110ef-426a-46ab-8f57-d84a048d54be_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:02 np0005603623 nova_compute[226235]: 2026-01-31 08:34:02.527 226239 INFO nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Creating config drive at /var/lib/nova/instances/058110ef-426a-46ab-8f57-d84a048d54be/disk.config#033[00m
Jan 31 03:34:02 np0005603623 nova_compute[226235]: 2026-01-31 08:34:02.530 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/058110ef-426a-46ab-8f57-d84a048d54be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpafz_177_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:02 np0005603623 nova_compute[226235]: 2026-01-31 08:34:02.651 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/058110ef-426a-46ab-8f57-d84a048d54be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpafz_177_" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:02 np0005603623 nova_compute[226235]: 2026-01-31 08:34:02.681 226239 DEBUG nova.storage.rbd_utils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 058110ef-426a-46ab-8f57-d84a048d54be_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:02 np0005603623 nova_compute[226235]: 2026-01-31 08:34:02.686 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/058110ef-426a-46ab-8f57-d84a048d54be/disk.config 058110ef-426a-46ab-8f57-d84a048d54be_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:02 np0005603623 nova_compute[226235]: 2026-01-31 08:34:02.703 226239 DEBUG nova.network.neutron [req-28b052a2-c5d4-41ca-8981-2e81db6a048f req-d0f687b8-ed6e-4977-b7b0-8261ffb5a1e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Updated VIF entry in instance network info cache for port acbebf00-067f-42fc-a3a1-50ffc7af9827. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:34:02 np0005603623 nova_compute[226235]: 2026-01-31 08:34:02.704 226239 DEBUG nova.network.neutron [req-28b052a2-c5d4-41ca-8981-2e81db6a048f req-d0f687b8-ed6e-4977-b7b0-8261ffb5a1e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Updating instance_info_cache with network_info: [{"id": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "address": "fa:16:3e:76:70:ab", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacbebf00-06", "ovs_interfaceid": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:34:02 np0005603623 nova_compute[226235]: 2026-01-31 08:34:02.878 226239 DEBUG oslo_concurrency.processutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/058110ef-426a-46ab-8f57-d84a048d54be/disk.config 058110ef-426a-46ab-8f57-d84a048d54be_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.192s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:02 np0005603623 nova_compute[226235]: 2026-01-31 08:34:02.880 226239 INFO nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Deleting local config drive /var/lib/nova/instances/058110ef-426a-46ab-8f57-d84a048d54be/disk.config because it was imported into RBD.#033[00m
Jan 31 03:34:02 np0005603623 NetworkManager[48970]: <info>  [1769848442.9261] manager: (tapacbebf00-06): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Jan 31 03:34:02 np0005603623 kernel: tapacbebf00-06: entered promiscuous mode
Jan 31 03:34:02 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:02Z|00567|binding|INFO|Claiming lport acbebf00-067f-42fc-a3a1-50ffc7af9827 for this chassis.
Jan 31 03:34:02 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:02Z|00568|binding|INFO|acbebf00-067f-42fc-a3a1-50ffc7af9827: Claiming fa:16:3e:76:70:ab 10.100.0.4
Jan 31 03:34:02 np0005603623 nova_compute[226235]: 2026-01-31 08:34:02.956 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:02 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:02Z|00569|binding|INFO|Setting lport acbebf00-067f-42fc-a3a1-50ffc7af9827 ovn-installed in OVS
Jan 31 03:34:02 np0005603623 nova_compute[226235]: 2026-01-31 08:34:02.967 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:02 np0005603623 systemd-udevd[288728]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:34:02 np0005603623 podman[288682]: 2026-01-31 08:34:02.986377578 +0000 UTC m=+0.077401350 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:34:02 np0005603623 systemd-machined[194379]: New machine qemu-63-instance-0000008a.
Jan 31 03:34:02 np0005603623 NetworkManager[48970]: <info>  [1769848442.9983] device (tapacbebf00-06): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:34:02 np0005603623 NetworkManager[48970]: <info>  [1769848442.9991] device (tapacbebf00-06): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:34:03 np0005603623 systemd[1]: Started Virtual Machine qemu-63-instance-0000008a.
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.031 226239 DEBUG oslo_concurrency.lockutils [req-28b052a2-c5d4-41ca-8981-2e81db6a048f req-d0f687b8-ed6e-4977-b7b0-8261ffb5a1e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-058110ef-426a-46ab-8f57-d84a048d54be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:34:03 np0005603623 podman[288685]: 2026-01-31 08:34:03.058620946 +0000 UTC m=+0.146970184 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 31 03:34:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:03Z|00570|binding|INFO|Setting lport acbebf00-067f-42fc-a3a1-50ffc7af9827 up in Southbound
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.069 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:70:ab 10.100.0.4'], port_security=['fa:16:3e:76:70:ab 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '058110ef-426a-46ab-8f57-d84a048d54be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6071a46-64a6-45aa-97c6-06e6c564195b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40db421b27d84f809f8074c58151327f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '986b09c9-4243-429e-9b6e-93ffcacf8cb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=111856e4-2ce2-4b64-a82d-6a5bd7b8a457, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=acbebf00-067f-42fc-a3a1-50ffc7af9827) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.071 143258 INFO neutron.agent.ovn.metadata.agent [-] Port acbebf00-067f-42fc-a3a1-50ffc7af9827 in datapath f6071a46-64a6-45aa-97c6-06e6c564195b bound to our chassis#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.073 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f6071a46-64a6-45aa-97c6-06e6c564195b#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.084 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cd3dae98-1732-4719-be95-3307f06021b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.085 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf6071a46-61 in ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.087 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf6071a46-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.088 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c6153001-29f9-45a3-909c-dd67f087509d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.089 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ca0f2834-c9cb-4505-9c56-8ef0255a8030]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.102 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[18779f0e-42dd-4fb5-85d6-1ebf580a7801]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.114 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae88d00-455d-4418-8165-ae6018a3c03c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.136 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1b0481c7-e79d-40c8-a6fa-82e491dbe97f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 NetworkManager[48970]: <info>  [1769848443.1446] manager: (tapf6071a46-60): new Veth device (/org/freedesktop/NetworkManager/Devices/268)
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.145 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3aac20d2-184e-4f58-a013-568ec3f46996]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 systemd-udevd[288731]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.173 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[a7de21b9-624a-45ea-b6e1-c01307c15d7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.177 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[9777c39a-10c8-46c3-9d46-7f7f5dd05374]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:03.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:03 np0005603623 NetworkManager[48970]: <info>  [1769848443.1965] device (tapf6071a46-60): carrier: link connected
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.200 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[686d9452-937e-4561-a1ec-b3cc91c5fe1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.212 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5a607cca-e0d9-423d-9985-64661cf19b41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6071a46-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8c:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 167], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 758362, 'reachable_time': 28686, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288789, 'error': None, 'target': 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.225 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3bef0934-7ec8-40bd-b2b8-3b08e8b2680f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:8c48'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 758362, 'tstamp': 758362}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288801, 'error': None, 'target': 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.239 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[17b6079d-1058-41af-ae84-78eacb67073d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6071a46-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8c:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 167], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 758362, 'reachable_time': 28686, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 288807, 'error': None, 'target': 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.266 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7b571d6d-ddbc-41bd-bb76-93bbacd42cda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:34:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:03.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.314 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0d8ddd-c76a-4df0-8d50-4671ad10af82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.316 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6071a46-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.316 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.317 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6071a46-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:03 np0005603623 NetworkManager[48970]: <info>  [1769848443.3191] manager: (tapf6071a46-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.318 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:03 np0005603623 kernel: tapf6071a46-60: entered promiscuous mode
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.323 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf6071a46-60, col_values=(('external_ids', {'iface-id': 'e9a7861c-c6ea-4166-9252-dc2aacdf4771'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:03Z|00571|binding|INFO|Releasing lport e9a7861c-c6ea-4166-9252-dc2aacdf4771 from this chassis (sb_readonly=0)
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.325 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.331 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.331 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f6071a46-64a6-45aa-97c6-06e6c564195b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f6071a46-64a6-45aa-97c6-06e6c564195b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.332 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5c0ee13e-c0ea-4220-b63c-f8fedbe9d432]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.333 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-f6071a46-64a6-45aa-97c6-06e6c564195b
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/f6071a46-64a6-45aa-97c6-06e6c564195b.pid.haproxy
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID f6071a46-64a6-45aa-97c6-06e6c564195b
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:34:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:03.333 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'env', 'PROCESS_TAG=haproxy-f6071a46-64a6-45aa-97c6-06e6c564195b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f6071a46-64a6-45aa-97c6-06e6c564195b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.339 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848443.3390129, 058110ef-426a-46ab-8f57-d84a048d54be => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.340 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] VM Started (Lifecycle Event)#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.382 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.385 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848443.3414986, 058110ef-426a-46ab-8f57-d84a048d54be => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.385 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:34:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:03Z|00572|binding|INFO|Releasing lport e9a7861c-c6ea-4166-9252-dc2aacdf4771 from this chassis (sb_readonly=0)
Jan 31 03:34:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:03Z|00573|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.481 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.537 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.542 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.635 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:34:03 np0005603623 podman[288847]: 2026-01-31 08:34:03.638905373 +0000 UTC m=+0.044016787 container create 0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:34:03 np0005603623 systemd[1]: Started libpod-conmon-0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442.scope.
Jan 31 03:34:03 np0005603623 podman[288847]: 2026-01-31 08:34:03.615657376 +0000 UTC m=+0.020768810 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:34:03 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:34:03 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3650bbc214f96e7aa775aebca0b99766fc70071cf51c7092cd380943cf329096/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:34:03 np0005603623 podman[288847]: 2026-01-31 08:34:03.730573988 +0000 UTC m=+0.135685492 container init 0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 03:34:03 np0005603623 podman[288847]: 2026-01-31 08:34:03.735643216 +0000 UTC m=+0.140754670 container start 0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:34:03 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[288862]: [NOTICE]   (288866) : New worker (288868) forked
Jan 31 03:34:03 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[288862]: [NOTICE]   (288866) : Loading success.
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.899 226239 DEBUG nova.compute.manager [req-26490082-0811-41a6-ad61-c2d1fbe92044 req-21b860b3-031f-4157-81ae-f73d5fbf898a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Received event network-vif-plugged-acbebf00-067f-42fc-a3a1-50ffc7af9827 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.899 226239 DEBUG oslo_concurrency.lockutils [req-26490082-0811-41a6-ad61-c2d1fbe92044 req-21b860b3-031f-4157-81ae-f73d5fbf898a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "058110ef-426a-46ab-8f57-d84a048d54be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.899 226239 DEBUG oslo_concurrency.lockutils [req-26490082-0811-41a6-ad61-c2d1fbe92044 req-21b860b3-031f-4157-81ae-f73d5fbf898a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.900 226239 DEBUG oslo_concurrency.lockutils [req-26490082-0811-41a6-ad61-c2d1fbe92044 req-21b860b3-031f-4157-81ae-f73d5fbf898a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.900 226239 DEBUG nova.compute.manager [req-26490082-0811-41a6-ad61-c2d1fbe92044 req-21b860b3-031f-4157-81ae-f73d5fbf898a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Processing event network-vif-plugged-acbebf00-067f-42fc-a3a1-50ffc7af9827 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.900 226239 DEBUG nova.compute.manager [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.905 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848443.9053705, 058110ef-426a-46ab-8f57-d84a048d54be => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.906 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.908 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.911 226239 INFO nova.virt.libvirt.driver [-] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Instance spawned successfully.#033[00m
Jan 31 03:34:03 np0005603623 nova_compute[226235]: 2026-01-31 08:34:03.911 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.145 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.151 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.152 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.152 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.153 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.154 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.155 226239 DEBUG nova.virt.libvirt.driver [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.162 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.427 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.584 226239 INFO nova.compute.manager [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Took 8.46 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.585 226239 DEBUG nova.compute.manager [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.688 226239 INFO nova.compute.manager [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Took 9.97 seconds to build instance.#033[00m
Jan 31 03:34:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.796 226239 DEBUG oslo_concurrency.lockutils [None req-19eb8926-316e-434a-80f0-130191753228 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:04 np0005603623 nova_compute[226235]: 2026-01-31 08:34:04.880 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:05.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:05.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:06 np0005603623 nova_compute[226235]: 2026-01-31 08:34:06.704 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:07.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:07.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:07 np0005603623 nova_compute[226235]: 2026-01-31 08:34:07.298 226239 DEBUG nova.compute.manager [req-5473193e-5bb7-41e8-b204-ec8f46b7537a req-c956fdab-78dd-4554-88cd-fcdaf940d42c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Received event network-vif-plugged-acbebf00-067f-42fc-a3a1-50ffc7af9827 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:07 np0005603623 nova_compute[226235]: 2026-01-31 08:34:07.298 226239 DEBUG oslo_concurrency.lockutils [req-5473193e-5bb7-41e8-b204-ec8f46b7537a req-c956fdab-78dd-4554-88cd-fcdaf940d42c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "058110ef-426a-46ab-8f57-d84a048d54be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:07 np0005603623 nova_compute[226235]: 2026-01-31 08:34:07.298 226239 DEBUG oslo_concurrency.lockutils [req-5473193e-5bb7-41e8-b204-ec8f46b7537a req-c956fdab-78dd-4554-88cd-fcdaf940d42c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:07 np0005603623 nova_compute[226235]: 2026-01-31 08:34:07.299 226239 DEBUG oslo_concurrency.lockutils [req-5473193e-5bb7-41e8-b204-ec8f46b7537a req-c956fdab-78dd-4554-88cd-fcdaf940d42c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:07 np0005603623 nova_compute[226235]: 2026-01-31 08:34:07.299 226239 DEBUG nova.compute.manager [req-5473193e-5bb7-41e8-b204-ec8f46b7537a req-c956fdab-78dd-4554-88cd-fcdaf940d42c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] No waiting events found dispatching network-vif-plugged-acbebf00-067f-42fc-a3a1-50ffc7af9827 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:07 np0005603623 nova_compute[226235]: 2026-01-31 08:34:07.299 226239 WARNING nova.compute.manager [req-5473193e-5bb7-41e8-b204-ec8f46b7537a req-c956fdab-78dd-4554-88cd-fcdaf940d42c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Received unexpected event network-vif-plugged-acbebf00-067f-42fc-a3a1-50ffc7af9827 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:34:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:09.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:09.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e316 e316: 3 total, 3 up, 3 in
Jan 31 03:34:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e316 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:09 np0005603623 nova_compute[226235]: 2026-01-31 08:34:09.884 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e317 e317: 3 total, 3 up, 3 in
Jan 31 03:34:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:11.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:11.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.399 226239 DEBUG oslo_concurrency.lockutils [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "058110ef-426a-46ab-8f57-d84a048d54be" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.401 226239 DEBUG oslo_concurrency.lockutils [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.401 226239 DEBUG oslo_concurrency.lockutils [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "058110ef-426a-46ab-8f57-d84a048d54be-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.401 226239 DEBUG oslo_concurrency.lockutils [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.401 226239 DEBUG oslo_concurrency.lockutils [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.403 226239 INFO nova.compute.manager [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Terminating instance#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.403 226239 DEBUG nova.compute.manager [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:34:11 np0005603623 kernel: tapacbebf00-06 (unregistering): left promiscuous mode
Jan 31 03:34:11 np0005603623 NetworkManager[48970]: <info>  [1769848451.4561] device (tapacbebf00-06): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:34:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:11Z|00574|binding|INFO|Releasing lport acbebf00-067f-42fc-a3a1-50ffc7af9827 from this chassis (sb_readonly=0)
Jan 31 03:34:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:11Z|00575|binding|INFO|Setting lport acbebf00-067f-42fc-a3a1-50ffc7af9827 down in Southbound
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.460 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:11Z|00576|binding|INFO|Removing iface tapacbebf00-06 ovn-installed in OVS
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.462 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.470 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.484 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:70:ab 10.100.0.4'], port_security=['fa:16:3e:76:70:ab 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '058110ef-426a-46ab-8f57-d84a048d54be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6071a46-64a6-45aa-97c6-06e6c564195b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40db421b27d84f809f8074c58151327f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '986b09c9-4243-429e-9b6e-93ffcacf8cb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=111856e4-2ce2-4b64-a82d-6a5bd7b8a457, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=acbebf00-067f-42fc-a3a1-50ffc7af9827) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.487 143258 INFO neutron.agent.ovn.metadata.agent [-] Port acbebf00-067f-42fc-a3a1-50ffc7af9827 in datapath f6071a46-64a6-45aa-97c6-06e6c564195b unbound from our chassis#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.489 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f6071a46-64a6-45aa-97c6-06e6c564195b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.490 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ab44e1ac-b2ff-4514-ae67-d63a8ef954cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.490 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b namespace which is not needed anymore#033[00m
Jan 31 03:34:11 np0005603623 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000008a.scope: Deactivated successfully.
Jan 31 03:34:11 np0005603623 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000008a.scope: Consumed 7.958s CPU time.
Jan 31 03:34:11 np0005603623 systemd-machined[194379]: Machine qemu-63-instance-0000008a terminated.
Jan 31 03:34:11 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[288862]: [NOTICE]   (288866) : haproxy version is 2.8.14-c23fe91
Jan 31 03:34:11 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[288862]: [NOTICE]   (288866) : path to executable is /usr/sbin/haproxy
Jan 31 03:34:11 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[288862]: [WARNING]  (288866) : Exiting Master process...
Jan 31 03:34:11 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[288862]: [ALERT]    (288866) : Current worker (288868) exited with code 143 (Terminated)
Jan 31 03:34:11 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[288862]: [WARNING]  (288866) : All workers exited. Exiting... (0)
Jan 31 03:34:11 np0005603623 systemd[1]: libpod-0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442.scope: Deactivated successfully.
Jan 31 03:34:11 np0005603623 podman[288956]: 2026-01-31 08:34:11.610785292 +0000 UTC m=+0.045152192 container died 0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.635 226239 INFO nova.virt.libvirt.driver [-] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Instance destroyed successfully.#033[00m
Jan 31 03:34:11 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442-userdata-shm.mount: Deactivated successfully.
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.636 226239 DEBUG nova.objects.instance [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lazy-loading 'resources' on Instance uuid 058110ef-426a-46ab-8f57-d84a048d54be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:11 np0005603623 systemd[1]: var-lib-containers-storage-overlay-3650bbc214f96e7aa775aebca0b99766fc70071cf51c7092cd380943cf329096-merged.mount: Deactivated successfully.
Jan 31 03:34:11 np0005603623 podman[288956]: 2026-01-31 08:34:11.65615399 +0000 UTC m=+0.090520860 container cleanup 0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 03:34:11 np0005603623 systemd[1]: libpod-conmon-0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442.scope: Deactivated successfully.
Jan 31 03:34:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e318 e318: 3 total, 3 up, 3 in
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.705 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:11 np0005603623 podman[288996]: 2026-01-31 08:34:11.721313637 +0000 UTC m=+0.048579880 container remove 0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.726 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ad7f606d-1747-47e2-828e-51fd640e13fb]: (4, ('Sat Jan 31 08:34:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b (0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442)\n0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442\nSat Jan 31 08:34:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b (0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442)\n0a46b0cbe0be4b94d64786e9052330659c65699956ba5fbc2b85c72d8dee3442\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.728 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c1f6e587-c52b-41a9-ac6c-e6c17919db01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.729 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6071a46-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.731 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:11 np0005603623 kernel: tapf6071a46-60: left promiscuous mode
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.741 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.742 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.744 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cf909bf6-c7ff-4b23-b1dd-ae64ae0062ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.748 226239 DEBUG nova.virt.libvirt.vif [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:202:202,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-947719579',display_name='tempest-ServersTestJSON-server-947719579',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-947719579',id=138,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:34:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='40db421b27d84f809f8074c58151327f',ramdisk_id='',reservation_id='r-5oenbp4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_dis
k='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1064072764',owner_user_name='tempest-ServersTestJSON-1064072764-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:34:09Z,user_data=None,user_id='fb3f20f0143d465ebfe98f6a13200890',uuid=058110ef-426a-46ab-8f57-d84a048d54be,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "address": "fa:16:3e:76:70:ab", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacbebf00-06", "ovs_interfaceid": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.749 226239 DEBUG nova.network.os_vif_util [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converting VIF {"id": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "address": "fa:16:3e:76:70:ab", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacbebf00-06", "ovs_interfaceid": "acbebf00-067f-42fc-a3a1-50ffc7af9827", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.750 226239 DEBUG nova.network.os_vif_util [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:70:ab,bridge_name='br-int',has_traffic_filtering=True,id=acbebf00-067f-42fc-a3a1-50ffc7af9827,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacbebf00-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.751 226239 DEBUG os_vif [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:70:ab,bridge_name='br-int',has_traffic_filtering=True,id=acbebf00-067f-42fc-a3a1-50ffc7af9827,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacbebf00-06') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.752 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.753 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapacbebf00-06, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.754 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.756 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.756 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:11 np0005603623 nova_compute[226235]: 2026-01-31 08:34:11.758 226239 INFO os_vif [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:70:ab,bridge_name='br-int',has_traffic_filtering=True,id=acbebf00-067f-42fc-a3a1-50ffc7af9827,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacbebf00-06')#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.763 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a5044e50-d18d-4e67-943c-6bf86f04764f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.764 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7499c57f-841d-4a2f-b88c-30f5d6dec75e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.778 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[136fc1cc-a2b7-4613-8221-25319a0020e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 758356, 'reachable_time': 44914, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289022, 'error': None, 'target': 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.780 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:34:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:11.780 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[33c4dee4-d2fa-4c79-b9cb-322d019bdeff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:11 np0005603623 systemd[1]: run-netns-ovnmeta\x2df6071a46\x2d64a6\x2d45aa\x2d97c6\x2d06e6c564195b.mount: Deactivated successfully.
Jan 31 03:34:12 np0005603623 nova_compute[226235]: 2026-01-31 08:34:12.253 226239 DEBUG nova.compute.manager [req-16353585-57e2-4013-bebd-a7a50d35a65e req-877b4851-2d8d-4304-9457-d54d210df7d4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Received event network-vif-unplugged-acbebf00-067f-42fc-a3a1-50ffc7af9827 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:12 np0005603623 nova_compute[226235]: 2026-01-31 08:34:12.254 226239 DEBUG oslo_concurrency.lockutils [req-16353585-57e2-4013-bebd-a7a50d35a65e req-877b4851-2d8d-4304-9457-d54d210df7d4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "058110ef-426a-46ab-8f57-d84a048d54be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:12 np0005603623 nova_compute[226235]: 2026-01-31 08:34:12.254 226239 DEBUG oslo_concurrency.lockutils [req-16353585-57e2-4013-bebd-a7a50d35a65e req-877b4851-2d8d-4304-9457-d54d210df7d4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:12 np0005603623 nova_compute[226235]: 2026-01-31 08:34:12.254 226239 DEBUG oslo_concurrency.lockutils [req-16353585-57e2-4013-bebd-a7a50d35a65e req-877b4851-2d8d-4304-9457-d54d210df7d4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:12 np0005603623 nova_compute[226235]: 2026-01-31 08:34:12.254 226239 DEBUG nova.compute.manager [req-16353585-57e2-4013-bebd-a7a50d35a65e req-877b4851-2d8d-4304-9457-d54d210df7d4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] No waiting events found dispatching network-vif-unplugged-acbebf00-067f-42fc-a3a1-50ffc7af9827 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:12 np0005603623 nova_compute[226235]: 2026-01-31 08:34:12.255 226239 DEBUG nova.compute.manager [req-16353585-57e2-4013-bebd-a7a50d35a65e req-877b4851-2d8d-4304-9457-d54d210df7d4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Received event network-vif-unplugged-acbebf00-067f-42fc-a3a1-50ffc7af9827 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:34:12 np0005603623 nova_compute[226235]: 2026-01-31 08:34:12.449 226239 INFO nova.virt.libvirt.driver [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Deleting instance files /var/lib/nova/instances/058110ef-426a-46ab-8f57-d84a048d54be_del#033[00m
Jan 31 03:34:12 np0005603623 nova_compute[226235]: 2026-01-31 08:34:12.450 226239 INFO nova.virt.libvirt.driver [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Deletion of /var/lib/nova/instances/058110ef-426a-46ab-8f57-d84a048d54be_del complete#033[00m
Jan 31 03:34:12 np0005603623 nova_compute[226235]: 2026-01-31 08:34:12.607 226239 INFO nova.compute.manager [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Took 1.20 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:34:12 np0005603623 nova_compute[226235]: 2026-01-31 08:34:12.607 226239 DEBUG oslo.service.loopingcall [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:34:12 np0005603623 nova_compute[226235]: 2026-01-31 08:34:12.607 226239 DEBUG nova.compute.manager [-] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:34:12 np0005603623 nova_compute[226235]: 2026-01-31 08:34:12.608 226239 DEBUG nova.network.neutron [-] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:34:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:34:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:13.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:34:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:34:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:13.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:34:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e319 e319: 3 total, 3 up, 3 in
Jan 31 03:34:13 np0005603623 nova_compute[226235]: 2026-01-31 08:34:13.905 226239 DEBUG nova.network.neutron [-] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:34:13 np0005603623 nova_compute[226235]: 2026-01-31 08:34:13.981 226239 INFO nova.compute.manager [-] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Took 1.37 seconds to deallocate network for instance.#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.128 226239 DEBUG oslo_concurrency.lockutils [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.129 226239 DEBUG oslo_concurrency.lockutils [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.227 226239 DEBUG oslo_concurrency.processutils [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.471 226239 DEBUG nova.compute.manager [req-b3a42ea4-ec65-4391-9690-9d9a91ee23ce req-8b2d99e8-f000-4fe6-af02-c8e2d7780f5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Received event network-vif-plugged-acbebf00-067f-42fc-a3a1-50ffc7af9827 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.471 226239 DEBUG oslo_concurrency.lockutils [req-b3a42ea4-ec65-4391-9690-9d9a91ee23ce req-8b2d99e8-f000-4fe6-af02-c8e2d7780f5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "058110ef-426a-46ab-8f57-d84a048d54be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.472 226239 DEBUG oslo_concurrency.lockutils [req-b3a42ea4-ec65-4391-9690-9d9a91ee23ce req-8b2d99e8-f000-4fe6-af02-c8e2d7780f5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.472 226239 DEBUG oslo_concurrency.lockutils [req-b3a42ea4-ec65-4391-9690-9d9a91ee23ce req-8b2d99e8-f000-4fe6-af02-c8e2d7780f5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.472 226239 DEBUG nova.compute.manager [req-b3a42ea4-ec65-4391-9690-9d9a91ee23ce req-8b2d99e8-f000-4fe6-af02-c8e2d7780f5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] No waiting events found dispatching network-vif-plugged-acbebf00-067f-42fc-a3a1-50ffc7af9827 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.473 226239 WARNING nova.compute.manager [req-b3a42ea4-ec65-4391-9690-9d9a91ee23ce req-8b2d99e8-f000-4fe6-af02-c8e2d7780f5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Received unexpected event network-vif-plugged-acbebf00-067f-42fc-a3a1-50ffc7af9827 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.473 226239 DEBUG nova.compute.manager [req-b3a42ea4-ec65-4391-9690-9d9a91ee23ce req-8b2d99e8-f000-4fe6-af02-c8e2d7780f5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Received event network-vif-deleted-acbebf00-067f-42fc-a3a1-50ffc7af9827 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:34:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1523538904' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:34:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:34:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1523538904' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:34:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:34:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1908308623' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.664 226239 DEBUG oslo_concurrency.processutils [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.669 226239 DEBUG nova.compute.provider_tree [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.692 226239 DEBUG nova.scheduler.client.report [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.720 226239 DEBUG oslo_concurrency.lockutils [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e319 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.794 226239 INFO nova.scheduler.client.report [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Deleted allocations for instance 058110ef-426a-46ab-8f57-d84a048d54be#033[00m
Jan 31 03:34:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e320 e320: 3 total, 3 up, 3 in
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:14 np0005603623 nova_compute[226235]: 2026-01-31 08:34:14.889 226239 DEBUG oslo_concurrency.lockutils [None req-3647c538-2373-4bef-83ad-267b7425b149 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "058110ef-426a-46ab-8f57-d84a048d54be" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.489s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:15.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:15.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e321 e321: 3 total, 3 up, 3 in
Jan 31 03:34:16 np0005603623 nova_compute[226235]: 2026-01-31 08:34:16.754 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:34:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:34:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:34:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:34:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:17.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:34:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:17.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:18 np0005603623 nova_compute[226235]: 2026-01-31 08:34:18.181 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:18.181 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:34:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:18.182 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:34:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:18.183 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:19.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:19.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:19 np0005603623 nova_compute[226235]: 2026-01-31 08:34:19.650 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:19 np0005603623 nova_compute[226235]: 2026-01-31 08:34:19.650 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:19 np0005603623 nova_compute[226235]: 2026-01-31 08:34:19.685 226239 DEBUG nova.compute.manager [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:34:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:19 np0005603623 nova_compute[226235]: 2026-01-31 08:34:19.790 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:19 np0005603623 nova_compute[226235]: 2026-01-31 08:34:19.791 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:19 np0005603623 nova_compute[226235]: 2026-01-31 08:34:19.796 226239 DEBUG nova.virt.hardware [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:34:19 np0005603623 nova_compute[226235]: 2026-01-31 08:34:19.797 226239 INFO nova.compute.claims [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:34:19 np0005603623 nova_compute[226235]: 2026-01-31 08:34:19.887 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.037 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e322 e322: 3 total, 3 up, 3 in
Jan 31 03:34:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:34:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1419878256' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.466 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.473 226239 DEBUG nova.compute.provider_tree [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.519 226239 DEBUG nova.scheduler.client.report [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.626 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.627 226239 DEBUG nova.compute.manager [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.704 226239 DEBUG nova.compute.manager [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.704 226239 DEBUG nova.network.neutron [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.738 226239 INFO nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.767 226239 DEBUG nova.compute.manager [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.951 226239 DEBUG nova.compute.manager [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.952 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.953 226239 INFO nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Creating image(s)#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.977 226239 DEBUG nova.storage.rbd_utils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:20 np0005603623 nova_compute[226235]: 2026-01-31 08:34:20.997 226239 DEBUG nova.storage.rbd_utils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:21 np0005603623 nova_compute[226235]: 2026-01-31 08:34:21.022 226239 DEBUG nova.storage.rbd_utils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:21 np0005603623 nova_compute[226235]: 2026-01-31 08:34:21.025 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:21 np0005603623 nova_compute[226235]: 2026-01-31 08:34:21.089 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:21 np0005603623 nova_compute[226235]: 2026-01-31 08:34:21.090 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:21 np0005603623 nova_compute[226235]: 2026-01-31 08:34:21.091 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:21 np0005603623 nova_compute[226235]: 2026-01-31 08:34:21.091 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:21 np0005603623 nova_compute[226235]: 2026-01-31 08:34:21.117 226239 DEBUG nova.storage.rbd_utils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:21 np0005603623 nova_compute[226235]: 2026-01-31 08:34:21.120 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:21 np0005603623 nova_compute[226235]: 2026-01-31 08:34:21.138 226239 DEBUG nova.policy [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb3f20f0143d465ebfe98f6a13200890', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '40db421b27d84f809f8074c58151327f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:34:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:21.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:21.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:21 np0005603623 nova_compute[226235]: 2026-01-31 08:34:21.755 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:21 np0005603623 nova_compute[226235]: 2026-01-31 08:34:21.990 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.869s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:22 np0005603623 nova_compute[226235]: 2026-01-31 08:34:22.078 226239 DEBUG nova.storage.rbd_utils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] resizing rbd image 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:34:22 np0005603623 nova_compute[226235]: 2026-01-31 08:34:22.305 226239 DEBUG nova.objects.instance [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lazy-loading 'migration_context' on Instance uuid 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:22 np0005603623 nova_compute[226235]: 2026-01-31 08:34:22.325 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:34:22 np0005603623 nova_compute[226235]: 2026-01-31 08:34:22.325 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Ensure instance console log exists: /var/lib/nova/instances/155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:34:22 np0005603623 nova_compute[226235]: 2026-01-31 08:34:22.326 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:22 np0005603623 nova_compute[226235]: 2026-01-31 08:34:22.326 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:22 np0005603623 nova_compute[226235]: 2026-01-31 08:34:22.326 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:22 np0005603623 nova_compute[226235]: 2026-01-31 08:34:22.576 226239 DEBUG nova.network.neutron [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Successfully created port: 01f28bd1-9c6d-4b95-a258-4642d646f8c3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:34:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:34:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:34:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:23.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:34:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:23.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:34:23 np0005603623 nova_compute[226235]: 2026-01-31 08:34:23.491 226239 DEBUG nova.network.neutron [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Successfully updated port: 01f28bd1-9c6d-4b95-a258-4642d646f8c3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:34:23 np0005603623 nova_compute[226235]: 2026-01-31 08:34:23.514 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "refresh_cache-155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:34:23 np0005603623 nova_compute[226235]: 2026-01-31 08:34:23.515 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquired lock "refresh_cache-155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:34:23 np0005603623 nova_compute[226235]: 2026-01-31 08:34:23.515 226239 DEBUG nova.network.neutron [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:34:23 np0005603623 nova_compute[226235]: 2026-01-31 08:34:23.681 226239 DEBUG nova.compute.manager [req-15c79f83-46cd-4faf-94f6-29ac1ff75208 req-71e46070-71f0-4550-8db0-d3473a5caaa1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Received event network-changed-01f28bd1-9c6d-4b95-a258-4642d646f8c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:23 np0005603623 nova_compute[226235]: 2026-01-31 08:34:23.682 226239 DEBUG nova.compute.manager [req-15c79f83-46cd-4faf-94f6-29ac1ff75208 req-71e46070-71f0-4550-8db0-d3473a5caaa1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Refreshing instance network info cache due to event network-changed-01f28bd1-9c6d-4b95-a258-4642d646f8c3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:34:23 np0005603623 nova_compute[226235]: 2026-01-31 08:34:23.682 226239 DEBUG oslo_concurrency.lockutils [req-15c79f83-46cd-4faf-94f6-29ac1ff75208 req-71e46070-71f0-4550-8db0-d3473a5caaa1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:34:23 np0005603623 nova_compute[226235]: 2026-01-31 08:34:23.691 226239 DEBUG nova.network.neutron [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:34:23 np0005603623 nova_compute[226235]: 2026-01-31 08:34:23.742 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:23 np0005603623 nova_compute[226235]: 2026-01-31 08:34:23.971 226239 DEBUG oslo_concurrency.lockutils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:23 np0005603623 nova_compute[226235]: 2026-01-31 08:34:23.972 226239 DEBUG oslo_concurrency.lockutils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:23 np0005603623 nova_compute[226235]: 2026-01-31 08:34:23.972 226239 INFO nova.compute.manager [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Shelving#033[00m
Jan 31 03:34:24 np0005603623 nova_compute[226235]: 2026-01-31 08:34:24.005 226239 DEBUG nova.virt.libvirt.driver [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:34:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:24 np0005603623 nova_compute[226235]: 2026-01-31 08:34:24.889 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.107 226239 DEBUG nova.network.neutron [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Updating instance_info_cache with network_info: [{"id": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "address": "fa:16:3e:f9:68:3c", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f28bd1-9c", "ovs_interfaceid": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:34:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:25.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:34:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:25.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.342 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Releasing lock "refresh_cache-155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.342 226239 DEBUG nova.compute.manager [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Instance network_info: |[{"id": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "address": "fa:16:3e:f9:68:3c", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f28bd1-9c", "ovs_interfaceid": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.343 226239 DEBUG oslo_concurrency.lockutils [req-15c79f83-46cd-4faf-94f6-29ac1ff75208 req-71e46070-71f0-4550-8db0-d3473a5caaa1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.344 226239 DEBUG nova.network.neutron [req-15c79f83-46cd-4faf-94f6-29ac1ff75208 req-71e46070-71f0-4550-8db0-d3473a5caaa1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Refreshing network info cache for port 01f28bd1-9c6d-4b95-a258-4642d646f8c3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.349 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Start _get_guest_xml network_info=[{"id": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "address": "fa:16:3e:f9:68:3c", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f28bd1-9c", "ovs_interfaceid": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.354 226239 WARNING nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.359 226239 DEBUG nova.virt.libvirt.host [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.360 226239 DEBUG nova.virt.libvirt.host [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.363 226239 DEBUG nova.virt.libvirt.host [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.364 226239 DEBUG nova.virt.libvirt.host [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.366 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.366 226239 DEBUG nova.virt.hardware [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.367 226239 DEBUG nova.virt.hardware [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.367 226239 DEBUG nova.virt.hardware [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.368 226239 DEBUG nova.virt.hardware [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.368 226239 DEBUG nova.virt.hardware [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.369 226239 DEBUG nova.virt.hardware [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.369 226239 DEBUG nova.virt.hardware [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.370 226239 DEBUG nova.virt.hardware [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.370 226239 DEBUG nova.virt.hardware [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.371 226239 DEBUG nova.virt.hardware [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.371 226239 DEBUG nova.virt.hardware [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.376 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:34:25 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3501685897' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.861 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.886 226239 DEBUG nova.storage.rbd_utils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:25 np0005603623 nova_compute[226235]: 2026-01-31 08:34:25.889 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:26 np0005603623 kernel: tapae035cfb-a1 (unregistering): left promiscuous mode
Jan 31 03:34:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:34:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/828410912' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:34:26 np0005603623 NetworkManager[48970]: <info>  [1769848466.3284] device (tapae035cfb-a1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:34:26 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:26Z|00577|binding|INFO|Releasing lport ae035cfb-a17b-4578-a506-e2581da09f74 from this chassis (sb_readonly=0)
Jan 31 03:34:26 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:26Z|00578|binding|INFO|Setting lport ae035cfb-a17b-4578-a506-e2581da09f74 down in Southbound
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.334 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:26Z|00579|binding|INFO|Removing iface tapae035cfb-a1 ovn-installed in OVS
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.350 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.364 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.365 226239 DEBUG nova.virt.libvirt.vif [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:34:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-375371164',display_name='tempest-ServersTestJSON-server-375371164',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-375371164',id=140,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='40db421b27d84f809f8074c58151327f',ramdisk_id='',reservation_id='r-li94zx5u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1064072764',owner_user_name='tempest-ServersTestJSON-1064072764-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:34:20Z,user_data=None,user_id='fb3f20f0143d465ebfe98f6a13200890',uuid=155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "address": "fa:16:3e:f9:68:3c", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f28bd1-9c", "ovs_interfaceid": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.365 226239 DEBUG nova.network.os_vif_util [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converting VIF {"id": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "address": "fa:16:3e:f9:68:3c", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f28bd1-9c", "ovs_interfaceid": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.366 226239 DEBUG nova.network.os_vif_util [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=01f28bd1-9c6d-4b95-a258-4642d646f8c3,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f28bd1-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.367 226239 DEBUG nova.objects.instance [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lazy-loading 'pci_devices' on Instance uuid 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:26 np0005603623 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000087.scope: Deactivated successfully.
Jan 31 03:34:26 np0005603623 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000087.scope: Consumed 14.845s CPU time.
Jan 31 03:34:26 np0005603623 systemd-machined[194379]: Machine qemu-62-instance-00000087 terminated.
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.452 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:5a:60 10.100.0.12'], port_security=['fa:16:3e:30:5a:60 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'adfc4c25-9eb9-45cc-ac90-2029677bcb67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44469d8b-ad30-4270-88fa-e67c568f3150', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '953a213fa5cb435ab3c04ad96152685f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5b1bd8ad-0d2a-4d57-a00a-9a6b59df86e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.181'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d972fb9d-6d12-4c1c-b135-704d64887b72, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ae035cfb-a17b-4578-a506-e2581da09f74) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.453 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ae035cfb-a17b-4578-a506-e2581da09f74 in datapath 44469d8b-ad30-4270-88fa-e67c568f3150 unbound from our chassis#033[00m
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.454 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44469d8b-ad30-4270-88fa-e67c568f3150, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.455 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7781aedd-b3b3-4149-8742-9f0e180dfa8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.455 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 namespace which is not needed anymore#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.550 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.554 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[288098]: [NOTICE]   (288120) : haproxy version is 2.8.14-c23fe91
Jan 31 03:34:26 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[288098]: [NOTICE]   (288120) : path to executable is /usr/sbin/haproxy
Jan 31 03:34:26 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[288098]: [WARNING]  (288120) : Exiting Master process...
Jan 31 03:34:26 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[288098]: [ALERT]    (288120) : Current worker (288124) exited with code 143 (Terminated)
Jan 31 03:34:26 np0005603623 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[288098]: [WARNING]  (288120) : All workers exited. Exiting... (0)
Jan 31 03:34:26 np0005603623 systemd[1]: libpod-ae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059.scope: Deactivated successfully.
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.562 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  <uuid>155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6</uuid>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  <name>instance-0000008c</name>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersTestJSON-server-375371164</nova:name>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:34:25</nova:creationTime>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <nova:user uuid="fb3f20f0143d465ebfe98f6a13200890">tempest-ServersTestJSON-1064072764-project-member</nova:user>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <nova:project uuid="40db421b27d84f809f8074c58151327f">tempest-ServersTestJSON-1064072764</nova:project>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <nova:port uuid="01f28bd1-9c6d-4b95-a258-4642d646f8c3">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <entry name="serial">155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6</entry>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <entry name="uuid">155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6</entry>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk.config">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:f9:68:3c"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <target dev="tap01f28bd1-9c"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6/console.log" append="off"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:34:26 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:34:26 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:34:26 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:34:26 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.563 226239 DEBUG nova.compute.manager [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Preparing to wait for external event network-vif-plugged-01f28bd1-9c6d-4b95-a258-4642d646f8c3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.563 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.563 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.564 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.564 226239 DEBUG nova.virt.libvirt.vif [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:34:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-375371164',display_name='tempest-ServersTestJSON-server-375371164',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-375371164',id=140,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='40db421b27d84f809f8074c58151327f',ramdisk_id='',reservation_id='r-li94zx5u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1064072764',owner_user_name='tempest-ServersTestJSON-1064072764-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:34:20Z,user_data=None,user_id='fb3f20f0143d465ebfe98f6a13200890',uuid=155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "address": "fa:16:3e:f9:68:3c", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f28bd1-9c", "ovs_interfaceid": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.564 226239 DEBUG nova.network.os_vif_util [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converting VIF {"id": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "address": "fa:16:3e:f9:68:3c", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f28bd1-9c", "ovs_interfaceid": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.565 226239 DEBUG nova.network.os_vif_util [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=01f28bd1-9c6d-4b95-a258-4642d646f8c3,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f28bd1-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:34:26 np0005603623 podman[289521]: 2026-01-31 08:34:26.565944934 +0000 UTC m=+0.041507978 container died ae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.565 226239 DEBUG os_vif [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=01f28bd1-9c6d-4b95-a258-4642d646f8c3,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f28bd1-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.566 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.566 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.567 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.569 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.569 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01f28bd1-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.569 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap01f28bd1-9c, col_values=(('external_ids', {'iface-id': '01f28bd1-9c6d-4b95-a258-4642d646f8c3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:68:3c', 'vm-uuid': '155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.571 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603623 NetworkManager[48970]: <info>  [1769848466.5722] manager: (tap01f28bd1-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.573 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.579 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.580 226239 INFO os_vif [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=01f28bd1-9c6d-4b95-a258-4642d646f8c3,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f28bd1-9c')#033[00m
Jan 31 03:34:26 np0005603623 systemd[1]: var-lib-containers-storage-overlay-267a114d746ed6c5571ecf39454d72bed5d1100967c5b2995bf4370d8ee8064a-merged.mount: Deactivated successfully.
Jan 31 03:34:26 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059-userdata-shm.mount: Deactivated successfully.
Jan 31 03:34:26 np0005603623 podman[289521]: 2026-01-31 08:34:26.602108504 +0000 UTC m=+0.077671518 container cleanup ae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:34:26 np0005603623 systemd[1]: libpod-conmon-ae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059.scope: Deactivated successfully.
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.634 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848451.634224, 058110ef-426a-46ab-8f57-d84a048d54be => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.635 226239 INFO nova.compute.manager [-] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:34:26 np0005603623 podman[289559]: 2026-01-31 08:34:26.661977636 +0000 UTC m=+0.042313894 container remove ae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.667 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3fe49a9a-c484-4b90-b6ac-b04e5a1ef0ea]: (4, ('Sat Jan 31 08:34:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 (ae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059)\nae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059\nSat Jan 31 08:34:26 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 (ae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059)\nae62c108476985852c675a6b56444586a2bd2ae3d5469c06bd9ad8db3952e059\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.670 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[04553d04-1638-4f38-8422-56309a36f8f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.671 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44469d8b-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.723 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603623 kernel: tap44469d8b-a0: left promiscuous mode
Jan 31 03:34:26 np0005603623 nova_compute[226235]: 2026-01-31 08:34:26.739 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.742 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8f5268-be0f-4e14-9030-7efa6323184d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.753 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0977be13-9d8c-4778-b873-cf7d02e3cece]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.754 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e3043f-7ed3-4c21-b2df-c1add2740aaf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.765 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[823485a6-f94e-4f76-8237-e378a821bdd9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 754145, 'reachable_time': 39782, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289580, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.767 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:34:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:26.767 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[bcaa4064-6471-4255-9437-1bbed8192c89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603623 systemd[1]: run-netns-ovnmeta\x2d44469d8b\x2dad30\x2d4270\x2d88fa\x2de67c568f3150.mount: Deactivated successfully.
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.020 226239 INFO nova.virt.libvirt.driver [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.024 226239 INFO nova.virt.libvirt.driver [-] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Instance destroyed successfully.#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.025 226239 DEBUG nova.objects.instance [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'numa_topology' on Instance uuid adfc4c25-9eb9-45cc-ac90-2029677bcb67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.060 226239 DEBUG nova.compute.manager [None req-0e13eccb-3a7a-4f5e-9c7f-72e9588d75ff - - - - - -] [instance: 058110ef-426a-46ab-8f57-d84a048d54be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:34:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:27.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:34:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:27.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.329 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.330 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.331 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] No VIF found with MAC fa:16:3e:f9:68:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.331 226239 INFO nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Using config drive#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.358 226239 DEBUG nova.storage.rbd_utils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.416 226239 DEBUG nova.compute.manager [req-ea4d81cb-5fe6-4074-9da0-42cd2cf5c494 req-30a557d6-1141-4ac1-a618-896fb476ba36 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Received event network-vif-unplugged-ae035cfb-a17b-4578-a506-e2581da09f74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.417 226239 DEBUG oslo_concurrency.lockutils [req-ea4d81cb-5fe6-4074-9da0-42cd2cf5c494 req-30a557d6-1141-4ac1-a618-896fb476ba36 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.417 226239 DEBUG oslo_concurrency.lockutils [req-ea4d81cb-5fe6-4074-9da0-42cd2cf5c494 req-30a557d6-1141-4ac1-a618-896fb476ba36 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.418 226239 DEBUG oslo_concurrency.lockutils [req-ea4d81cb-5fe6-4074-9da0-42cd2cf5c494 req-30a557d6-1141-4ac1-a618-896fb476ba36 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.418 226239 DEBUG nova.compute.manager [req-ea4d81cb-5fe6-4074-9da0-42cd2cf5c494 req-30a557d6-1141-4ac1-a618-896fb476ba36 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] No waiting events found dispatching network-vif-unplugged-ae035cfb-a17b-4578-a506-e2581da09f74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.418 226239 WARNING nova.compute.manager [req-ea4d81cb-5fe6-4074-9da0-42cd2cf5c494 req-30a557d6-1141-4ac1-a618-896fb476ba36 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Received unexpected event network-vif-unplugged-ae035cfb-a17b-4578-a506-e2581da09f74 for instance with vm_state active and task_state shelving.#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.562 226239 DEBUG nova.network.neutron [req-15c79f83-46cd-4faf-94f6-29ac1ff75208 req-71e46070-71f0-4550-8db0-d3473a5caaa1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Updated VIF entry in instance network info cache for port 01f28bd1-9c6d-4b95-a258-4642d646f8c3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.563 226239 DEBUG nova.network.neutron [req-15c79f83-46cd-4faf-94f6-29ac1ff75208 req-71e46070-71f0-4550-8db0-d3473a5caaa1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Updating instance_info_cache with network_info: [{"id": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "address": "fa:16:3e:f9:68:3c", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f28bd1-9c", "ovs_interfaceid": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:34:27 np0005603623 nova_compute[226235]: 2026-01-31 08:34:27.687 226239 INFO nova.virt.libvirt.driver [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Beginning cold snapshot process#033[00m
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.063 226239 DEBUG oslo_concurrency.lockutils [req-15c79f83-46cd-4faf-94f6-29ac1ff75208 req-71e46070-71f0-4550-8db0-d3473a5caaa1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.160 226239 INFO nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Creating config drive at /var/lib/nova/instances/155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6/disk.config#033[00m
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.167 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp0aws1xsr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.296 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp0aws1xsr" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.323 226239 DEBUG nova.storage.rbd_utils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.326 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6/disk.config 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.463 226239 DEBUG oslo_concurrency.processutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6/disk.config 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.464 226239 INFO nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Deleting local config drive /var/lib/nova/instances/155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6/disk.config because it was imported into RBD.#033[00m
Jan 31 03:34:28 np0005603623 kernel: tap01f28bd1-9c: entered promiscuous mode
Jan 31 03:34:28 np0005603623 systemd-udevd[289499]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:34:28 np0005603623 NetworkManager[48970]: <info>  [1769848468.5052] manager: (tap01f28bd1-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/271)
Jan 31 03:34:28 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:28Z|00580|binding|INFO|Claiming lport 01f28bd1-9c6d-4b95-a258-4642d646f8c3 for this chassis.
Jan 31 03:34:28 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:28Z|00581|binding|INFO|01f28bd1-9c6d-4b95-a258-4642d646f8c3: Claiming fa:16:3e:f9:68:3c 10.100.0.10
Jan 31 03:34:28 np0005603623 NetworkManager[48970]: <info>  [1769848468.5140] device (tap01f28bd1-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:34:28 np0005603623 NetworkManager[48970]: <info>  [1769848468.5148] device (tap01f28bd1-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:34:28 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:28Z|00582|binding|INFO|Setting lport 01f28bd1-9c6d-4b95-a258-4642d646f8c3 ovn-installed in OVS
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.524 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.529 226239 DEBUG nova.virt.libvirt.imagebackend [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 31 03:34:28 np0005603623 systemd-machined[194379]: New machine qemu-64-instance-0000008c.
Jan 31 03:34:28 np0005603623 systemd[1]: Started Virtual Machine qemu-64-instance-0000008c.
Jan 31 03:34:28 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:28Z|00583|binding|INFO|Setting lport 01f28bd1-9c6d-4b95-a258-4642d646f8c3 up in Southbound
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.643 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:68:3c 10.100.0.10'], port_security=['fa:16:3e:f9:68:3c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6071a46-64a6-45aa-97c6-06e6c564195b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40db421b27d84f809f8074c58151327f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '986b09c9-4243-429e-9b6e-93ffcacf8cb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=111856e4-2ce2-4b64-a82d-6a5bd7b8a457, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=01f28bd1-9c6d-4b95-a258-4642d646f8c3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.644 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 01f28bd1-9c6d-4b95-a258-4642d646f8c3 in datapath f6071a46-64a6-45aa-97c6-06e6c564195b bound to our chassis#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.646 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f6071a46-64a6-45aa-97c6-06e6c564195b#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.653 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2a10ff8b-6d96-488e-a2ea-4757d297e31f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.654 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf6071a46-61 in ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.655 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf6071a46-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.655 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7b6441c3-cfa1-468f-9bf6-096ab137ea21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.656 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b4ab84-f02e-428d-b2b8-2ed67cb696f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.667 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[6578079f-3491-430c-b558-a4302f05f307]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.676 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6aed7ba1-cd91-4244-a6d8-4eb8cca22c3f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.699 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[fc05bd41-8bbb-4ef4-93ff-81d7bfce63a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 NetworkManager[48970]: <info>  [1769848468.7074] manager: (tapf6071a46-60): new Veth device (/org/freedesktop/NetworkManager/Devices/272)
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.705 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd2b22b-7190-4ea3-9237-45bce352d749]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.732 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[068d42e6-be99-44bf-85b8-4d329ef64c17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.735 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[795b824c-b6ea-433d-a5d0-c3878c58203f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 NetworkManager[48970]: <info>  [1769848468.7607] device (tapf6071a46-60): carrier: link connected
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.765 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[8bcb497a-6d9b-4fc9-8d1a-19a339eb0a88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.781 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[87c708ed-360f-4d02-a9b2-c4d341312e96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6071a46-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8c:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 760918, 'reachable_time': 29348, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289723, 'error': None, 'target': 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.787 226239 DEBUG nova.storage.rbd_utils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] creating snapshot(2c6785d4af004b46b00c768e020dc0d3) on rbd image(adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.794 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7936a3a5-e5a4-4a73-825f-a4f9a1ee6a6f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:8c48'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 760918, 'tstamp': 760918}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289724, 'error': None, 'target': 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.808 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[259a744c-3070-46c3-b1a0-4c5f0be27d1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6071a46-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8c:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 760918, 'reachable_time': 29348, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 289732, 'error': None, 'target': 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.835 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fd4599c1-2336-4a92-84c1-ce6ff90ecb64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.883 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0c403ed8-2b02-4951-af67-54fc0d3504fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.885 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6071a46-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.885 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.886 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6071a46-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.887 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:28 np0005603623 NetworkManager[48970]: <info>  [1769848468.8882] manager: (tapf6071a46-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Jan 31 03:34:28 np0005603623 kernel: tapf6071a46-60: entered promiscuous mode
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.893 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf6071a46-60, col_values=(('external_ids', {'iface-id': 'e9a7861c-c6ea-4166-9252-dc2aacdf4771'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.893 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:28 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:28Z|00584|binding|INFO|Releasing lport e9a7861c-c6ea-4166-9252-dc2aacdf4771 from this chassis (sb_readonly=0)
Jan 31 03:34:28 np0005603623 nova_compute[226235]: 2026-01-31 08:34:28.901 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.902 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f6071a46-64a6-45aa-97c6-06e6c564195b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f6071a46-64a6-45aa-97c6-06e6c564195b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.903 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[16336535-ae33-45a4-95dc-a7e9b40791a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.904 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-f6071a46-64a6-45aa-97c6-06e6c564195b
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/f6071a46-64a6-45aa-97c6-06e6c564195b.pid.haproxy
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID f6071a46-64a6-45aa-97c6-06e6c564195b
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:34:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:28.904 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'env', 'PROCESS_TAG=haproxy-f6071a46-64a6-45aa-97c6-06e6c564195b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f6071a46-64a6-45aa-97c6-06e6c564195b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:34:29 np0005603623 nova_compute[226235]: 2026-01-31 08:34:29.049 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848469.0489352, 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:34:29 np0005603623 nova_compute[226235]: 2026-01-31 08:34:29.050 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] VM Started (Lifecycle Event)#033[00m
Jan 31 03:34:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:29.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:29 np0005603623 podman[289821]: 2026-01-31 08:34:29.240611011 +0000 UTC m=+0.042233551 container create 092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 03:34:29 np0005603623 systemd[1]: Started libpod-conmon-092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00.scope.
Jan 31 03:34:29 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:34:29 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0da21ba41bad48b5194d66e18cc9d2130f854337530fc591b96993d8015cad1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:34:29 np0005603623 podman[289821]: 2026-01-31 08:34:29.216663843 +0000 UTC m=+0.018286363 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:34:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:29.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:29 np0005603623 podman[289821]: 2026-01-31 08:34:29.323494041 +0000 UTC m=+0.125116561 container init 092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:34:29 np0005603623 podman[289821]: 2026-01-31 08:34:29.329252941 +0000 UTC m=+0.130875451 container start 092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 03:34:29 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[289836]: [NOTICE]   (289840) : New worker (289842) forked
Jan 31 03:34:29 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[289836]: [NOTICE]   (289840) : Loading success.
Jan 31 03:34:29 np0005603623 nova_compute[226235]: 2026-01-31 08:34:29.407 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:29 np0005603623 nova_compute[226235]: 2026-01-31 08:34:29.412 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848469.0493937, 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:34:29 np0005603623 nova_compute[226235]: 2026-01-31 08:34:29.413 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:34:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e323 e323: 3 total, 3 up, 3 in
Jan 31 03:34:29 np0005603623 nova_compute[226235]: 2026-01-31 08:34:29.491 226239 DEBUG nova.storage.rbd_utils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] cloning vms/adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk@2c6785d4af004b46b00c768e020dc0d3 to images/a445cf05-8653-452b-bc15-8061b7aa6a98 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:34:29 np0005603623 nova_compute[226235]: 2026-01-31 08:34:29.618 226239 DEBUG nova.storage.rbd_utils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] flattening images/a445cf05-8653-452b-bc15-8061b7aa6a98 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 03:34:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:29 np0005603623 nova_compute[226235]: 2026-01-31 08:34:29.891 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:34:30 np0005603623 nova_compute[226235]: 2026-01-31 08:34:30.058 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:34:30 np0005603623 nova_compute[226235]: 2026-01-31 08:34:30.061 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:34:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:30.125 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:34:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:30.126 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:34:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:30.127 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:34:30 np0005603623 nova_compute[226235]: 2026-01-31 08:34:30.137 226239 DEBUG nova.storage.rbd_utils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] removing snapshot(2c6785d4af004b46b00c768e020dc0d3) on rbd image(adfc4c25-9eb9-45cc-ac90-2029677bcb67_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #115. Immutable memtables: 0.
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.344531) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 71] Flushing memtable with next log file: 115
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848470344649, "job": 71, "event": "flush_started", "num_memtables": 1, "num_entries": 1566, "num_deletes": 259, "total_data_size": 3173116, "memory_usage": 3218384, "flush_reason": "Manual Compaction"}
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 71] Level-0 flush table #116: started
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848470363150, "cf_name": "default", "job": 71, "event": "table_file_creation", "file_number": 116, "file_size": 2079860, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 58181, "largest_seqno": 59742, "table_properties": {"data_size": 2073364, "index_size": 3571, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 14618, "raw_average_key_size": 20, "raw_value_size": 2059856, "raw_average_value_size": 2837, "num_data_blocks": 157, "num_entries": 726, "num_filter_entries": 726, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848360, "oldest_key_time": 1769848360, "file_creation_time": 1769848470, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 116, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 71] Flush lasted 18664 microseconds, and 5208 cpu microseconds.
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.363212) [db/flush_job.cc:967] [default] [JOB 71] Level-0 flush table #116: 2079860 bytes OK
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.363243) [db/memtable_list.cc:519] [default] Level-0 commit table #116 started
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.365283) [db/memtable_list.cc:722] [default] Level-0 commit table #116: memtable #1 done
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.365300) EVENT_LOG_v1 {"time_micros": 1769848470365293, "job": 71, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.365325) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 71] Try to delete WAL files size 3165771, prev total WAL file size 3165771, number of live WAL files 2.
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000112.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.366134) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032303132' seq:72057594037927935, type:22 .. '6C6F676D0032323634' seq:0, type:0; will stop at (end)
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 72] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 71 Base level 0, inputs: [116(2031KB)], [114(10MB)]
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848470366200, "job": 72, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [116], "files_L6": [114], "score": -1, "input_data_size": 13536076, "oldest_snapshot_seqno": -1}
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 72] Generated table #117: 8401 keys, 13400414 bytes, temperature: kUnknown
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848470451709, "cf_name": "default", "job": 72, "event": "table_file_creation", "file_number": 117, "file_size": 13400414, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13343044, "index_size": 35242, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21061, "raw_key_size": 217785, "raw_average_key_size": 25, "raw_value_size": 13192577, "raw_average_value_size": 1570, "num_data_blocks": 1387, "num_entries": 8401, "num_filter_entries": 8401, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769848470, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 117, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.451949) [db/compaction/compaction_job.cc:1663] [default] [JOB 72] Compacted 1@0 + 1@6 files to L6 => 13400414 bytes
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.458828) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.2 rd, 156.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 10.9 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(13.0) write-amplify(6.4) OK, records in: 8938, records dropped: 537 output_compression: NoCompression
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.458853) EVENT_LOG_v1 {"time_micros": 1769848470458841, "job": 72, "event": "compaction_finished", "compaction_time_micros": 85582, "compaction_time_cpu_micros": 20419, "output_level": 6, "num_output_files": 1, "total_output_size": 13400414, "num_input_records": 8938, "num_output_records": 8401, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000116.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848470459173, "job": 72, "event": "table_file_deletion", "file_number": 116}
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000114.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848470460565, "job": 72, "event": "table_file_deletion", "file_number": 114}
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.366052) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.460600) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.460604) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.460606) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.460608) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:34:30.460610) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:34:30 np0005603623 nova_compute[226235]: 2026-01-31 08:34:30.560 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:34:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e324 e324: 3 total, 3 up, 3 in
Jan 31 03:34:30 np0005603623 nova_compute[226235]: 2026-01-31 08:34:30.599 226239 DEBUG nova.storage.rbd_utils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] creating snapshot(snap) on rbd image(a445cf05-8653-452b-bc15-8061b7aa6a98) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 31 03:34:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:34:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:31.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:34:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:31.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:31 np0005603623 nova_compute[226235]: 2026-01-31 08:34:31.572 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:34:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e325 e325: 3 total, 3 up, 3 in
Jan 31 03:34:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:33.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.274 226239 DEBUG nova.compute.manager [req-164f7913-5ece-414f-921f-befddd53a323 req-06ae0c27-ed7d-4944-870d-cbe0afa69048 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Received event network-vif-plugged-ae035cfb-a17b-4578-a506-e2581da09f74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.275 226239 DEBUG oslo_concurrency.lockutils [req-164f7913-5ece-414f-921f-befddd53a323 req-06ae0c27-ed7d-4944-870d-cbe0afa69048 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.275 226239 DEBUG oslo_concurrency.lockutils [req-164f7913-5ece-414f-921f-befddd53a323 req-06ae0c27-ed7d-4944-870d-cbe0afa69048 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.275 226239 DEBUG oslo_concurrency.lockutils [req-164f7913-5ece-414f-921f-befddd53a323 req-06ae0c27-ed7d-4944-870d-cbe0afa69048 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.275 226239 DEBUG nova.compute.manager [req-164f7913-5ece-414f-921f-befddd53a323 req-06ae0c27-ed7d-4944-870d-cbe0afa69048 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] No waiting events found dispatching network-vif-plugged-ae035cfb-a17b-4578-a506-e2581da09f74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.275 226239 WARNING nova.compute.manager [req-164f7913-5ece-414f-921f-befddd53a323 req-06ae0c27-ed7d-4944-870d-cbe0afa69048 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Received unexpected event network-vif-plugged-ae035cfb-a17b-4578-a506-e2581da09f74 for instance with vm_state active and task_state shelving_image_uploading.
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.275 226239 DEBUG nova.compute.manager [req-164f7913-5ece-414f-921f-befddd53a323 req-06ae0c27-ed7d-4944-870d-cbe0afa69048 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Received event network-vif-plugged-01f28bd1-9c6d-4b95-a258-4642d646f8c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.276 226239 DEBUG oslo_concurrency.lockutils [req-164f7913-5ece-414f-921f-befddd53a323 req-06ae0c27-ed7d-4944-870d-cbe0afa69048 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.276 226239 DEBUG oslo_concurrency.lockutils [req-164f7913-5ece-414f-921f-befddd53a323 req-06ae0c27-ed7d-4944-870d-cbe0afa69048 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.276 226239 DEBUG oslo_concurrency.lockutils [req-164f7913-5ece-414f-921f-befddd53a323 req-06ae0c27-ed7d-4944-870d-cbe0afa69048 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.276 226239 DEBUG nova.compute.manager [req-164f7913-5ece-414f-921f-befddd53a323 req-06ae0c27-ed7d-4944-870d-cbe0afa69048 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Processing event network-vif-plugged-01f28bd1-9c6d-4b95-a258-4642d646f8c3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.295 226239 DEBUG nova.compute.manager [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.300 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.301 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848473.3009896, 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.301 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] VM Resumed (Lifecycle Event)
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.308 226239 INFO nova.virt.libvirt.driver [-] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Instance spawned successfully.
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.309 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:34:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:33.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.445 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.452 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.453 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.454 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.455 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.456 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.457 226239 DEBUG nova.virt.libvirt.driver [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.465 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.549 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.794 226239 INFO nova.compute.manager [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Took 12.84 seconds to spawn the instance on the hypervisor.
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.794 226239 DEBUG nova.compute.manager [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:34:33 np0005603623 podman[289993]: 2026-01-31 08:34:33.980296838 +0000 UTC m=+0.060019107 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:34:33 np0005603623 nova_compute[226235]: 2026-01-31 08:34:33.996 226239 INFO nova.compute.manager [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Took 14.25 seconds to build instance.
Jan 31 03:34:33 np0005603623 podman[289994]: 2026-01-31 08:34:33.997110014 +0000 UTC m=+0.077901326 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible)
Jan 31 03:34:34 np0005603623 nova_compute[226235]: 2026-01-31 08:34:34.037 226239 DEBUG oslo_concurrency.lockutils [None req-115b1732-501f-4b78-a033-7a98bab8b16a fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:34 np0005603623 nova_compute[226235]: 2026-01-31 08:34:34.689 226239 INFO nova.virt.libvirt.driver [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Snapshot image upload complete#033[00m
Jan 31 03:34:34 np0005603623 nova_compute[226235]: 2026-01-31 08:34:34.691 226239 DEBUG nova.compute.manager [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:34 np0005603623 nova_compute[226235]: 2026-01-31 08:34:34.824 226239 INFO nova.compute.manager [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Shelve offloading#033[00m
Jan 31 03:34:34 np0005603623 nova_compute[226235]: 2026-01-31 08:34:34.830 226239 INFO nova.virt.libvirt.driver [-] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Instance destroyed successfully.#033[00m
Jan 31 03:34:34 np0005603623 nova_compute[226235]: 2026-01-31 08:34:34.830 226239 DEBUG nova.compute.manager [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:34 np0005603623 nova_compute[226235]: 2026-01-31 08:34:34.832 226239 DEBUG oslo_concurrency.lockutils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:34:34 np0005603623 nova_compute[226235]: 2026-01-31 08:34:34.832 226239 DEBUG oslo_concurrency.lockutils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquired lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:34:34 np0005603623 nova_compute[226235]: 2026-01-31 08:34:34.833 226239 DEBUG nova.network.neutron [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:34:34 np0005603623 nova_compute[226235]: 2026-01-31 08:34:34.893 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:35.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:35.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:35 np0005603623 nova_compute[226235]: 2026-01-31 08:34:35.542 226239 DEBUG nova.compute.manager [req-816dedc9-4cc0-4e88-89d4-2ca2412fa42d req-9b94813a-35bb-4766-bf37-47f5e739f232 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Received event network-vif-plugged-01f28bd1-9c6d-4b95-a258-4642d646f8c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:35 np0005603623 nova_compute[226235]: 2026-01-31 08:34:35.543 226239 DEBUG oslo_concurrency.lockutils [req-816dedc9-4cc0-4e88-89d4-2ca2412fa42d req-9b94813a-35bb-4766-bf37-47f5e739f232 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:35 np0005603623 nova_compute[226235]: 2026-01-31 08:34:35.543 226239 DEBUG oslo_concurrency.lockutils [req-816dedc9-4cc0-4e88-89d4-2ca2412fa42d req-9b94813a-35bb-4766-bf37-47f5e739f232 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:35 np0005603623 nova_compute[226235]: 2026-01-31 08:34:35.543 226239 DEBUG oslo_concurrency.lockutils [req-816dedc9-4cc0-4e88-89d4-2ca2412fa42d req-9b94813a-35bb-4766-bf37-47f5e739f232 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:35 np0005603623 nova_compute[226235]: 2026-01-31 08:34:35.544 226239 DEBUG nova.compute.manager [req-816dedc9-4cc0-4e88-89d4-2ca2412fa42d req-9b94813a-35bb-4766-bf37-47f5e739f232 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] No waiting events found dispatching network-vif-plugged-01f28bd1-9c6d-4b95-a258-4642d646f8c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:35 np0005603623 nova_compute[226235]: 2026-01-31 08:34:35.544 226239 WARNING nova.compute.manager [req-816dedc9-4cc0-4e88-89d4-2ca2412fa42d req-9b94813a-35bb-4766-bf37-47f5e739f232 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Received unexpected event network-vif-plugged-01f28bd1-9c6d-4b95-a258-4642d646f8c3 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:34:36 np0005603623 nova_compute[226235]: 2026-01-31 08:34:36.627 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:34:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:37.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:34:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:37.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:37 np0005603623 nova_compute[226235]: 2026-01-31 08:34:37.524 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:39.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:34:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:39.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:34:39 np0005603623 nova_compute[226235]: 2026-01-31 08:34:39.342 226239 DEBUG nova.network.neutron [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Updating instance_info_cache with network_info: [{"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:34:39 np0005603623 nova_compute[226235]: 2026-01-31 08:34:39.424 226239 DEBUG oslo_concurrency.lockutils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Releasing lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:34:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:39 np0005603623 nova_compute[226235]: 2026-01-31 08:34:39.894 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e326 e326: 3 total, 3 up, 3 in
Jan 31 03:34:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:34:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:41.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.320 226239 DEBUG oslo_concurrency.lockutils [None req-cef6d3eb-2256-4ec8-bd4f-544866b52f43 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.321 226239 DEBUG oslo_concurrency.lockutils [None req-cef6d3eb-2256-4ec8-bd4f-544866b52f43 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.321 226239 DEBUG nova.compute.manager [None req-cef6d3eb-2256-4ec8-bd4f-544866b52f43 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.326 226239 DEBUG nova.compute.manager [None req-cef6d3eb-2256-4ec8-bd4f-544866b52f43 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.327 226239 DEBUG nova.objects.instance [None req-cef6d3eb-2256-4ec8-bd4f-544866b52f43 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lazy-loading 'flavor' on Instance uuid 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:41.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.354 226239 DEBUG nova.virt.libvirt.driver [None req-cef6d3eb-2256-4ec8-bd4f-544866b52f43 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.455 226239 INFO nova.virt.libvirt.driver [-] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Instance destroyed successfully.#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.456 226239 DEBUG nova.objects.instance [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'resources' on Instance uuid adfc4c25-9eb9-45cc-ac90-2029677bcb67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.497 226239 DEBUG nova.virt.libvirt.vif [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:33:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-966483760',display_name='tempest-ServerActionsTestOtherB-server-966483760',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-966483760',id=135,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsFGTxapW26dXB/XvUTGcfGzb7/71yMMg1CszLzfnGOAhIU/1lACOYAdVBK40cFjy/2kY258v2iqF8U2lfGaG9JRRfAxw6pRph+THb2i3B9US4SfAm/pgAAiW0mmqeasA==',key_name='tempest-keypair-1440000372',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:33:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-3ojgffxo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member',shelved_at='2026-01-31T08:34:34.691196',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='a445cf05-8653-452b-bc15-8061b7aa6a98'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:34:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=adfc4c25-9eb9-45cc-ac90-2029677bcb67,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": 
"44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.498 226239 DEBUG nova.network.os_vif_util [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae035cfb-a1", "ovs_interfaceid": "ae035cfb-a17b-4578-a506-e2581da09f74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.499 226239 DEBUG nova.network.os_vif_util [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:5a:60,bridge_name='br-int',has_traffic_filtering=True,id=ae035cfb-a17b-4578-a506-e2581da09f74,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae035cfb-a1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.500 226239 DEBUG os_vif [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:5a:60,bridge_name='br-int',has_traffic_filtering=True,id=ae035cfb-a17b-4578-a506-e2581da09f74,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae035cfb-a1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.503 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.504 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae035cfb-a1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.506 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.509 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.513 226239 INFO os_vif [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:5a:60,bridge_name='br-int',has_traffic_filtering=True,id=ae035cfb-a17b-4578-a506-e2581da09f74,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae035cfb-a1')#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.560 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848466.5592682, adfc4c25-9eb9-45cc-ac90-2029677bcb67 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.561 226239 INFO nova.compute.manager [-] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.581 226239 DEBUG nova.compute.manager [req-0cd9fa2c-bc09-47e7-a602-85291d72f82e req-d5f3dee2-1bb8-42ac-8ae8-0f87d69cb7ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Received event network-changed-ae035cfb-a17b-4578-a506-e2581da09f74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.581 226239 DEBUG nova.compute.manager [req-0cd9fa2c-bc09-47e7-a602-85291d72f82e req-d5f3dee2-1bb8-42ac-8ae8-0f87d69cb7ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Refreshing instance network info cache due to event network-changed-ae035cfb-a17b-4578-a506-e2581da09f74. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.581 226239 DEBUG oslo_concurrency.lockutils [req-0cd9fa2c-bc09-47e7-a602-85291d72f82e req-d5f3dee2-1bb8-42ac-8ae8-0f87d69cb7ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.582 226239 DEBUG oslo_concurrency.lockutils [req-0cd9fa2c-bc09-47e7-a602-85291d72f82e req-d5f3dee2-1bb8-42ac-8ae8-0f87d69cb7ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.582 226239 DEBUG nova.network.neutron [req-0cd9fa2c-bc09-47e7-a602-85291d72f82e req-d5f3dee2-1bb8-42ac-8ae8-0f87d69cb7ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Refreshing network info cache for port ae035cfb-a17b-4578-a506-e2581da09f74 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.585 226239 DEBUG nova.compute.manager [None req-e122f6ae-afb8-4b60-8459-4fdd98e1fb68 - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.589 226239 DEBUG nova.compute.manager [None req-e122f6ae-afb8-4b60-8459-4fdd98e1fb68 - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved, current task_state: shelving_offloading, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:34:41 np0005603623 nova_compute[226235]: 2026-01-31 08:34:41.631 226239 INFO nova.compute.manager [None req-e122f6ae-afb8-4b60-8459-4fdd98e1fb68 - - - - - -] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] During sync_power_state the instance has a pending task (shelving_offloading). Skip.#033[00m
Jan 31 03:34:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:43.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:43.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:44 np0005603623 nova_compute[226235]: 2026-01-31 08:34:44.022 226239 INFO nova.virt.libvirt.driver [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Deleting instance files /var/lib/nova/instances/adfc4c25-9eb9-45cc-ac90-2029677bcb67_del#033[00m
Jan 31 03:34:44 np0005603623 nova_compute[226235]: 2026-01-31 08:34:44.023 226239 INFO nova.virt.libvirt.driver [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Deletion of /var/lib/nova/instances/adfc4c25-9eb9-45cc-ac90-2029677bcb67_del complete#033[00m
Jan 31 03:34:44 np0005603623 nova_compute[226235]: 2026-01-31 08:34:44.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:44 np0005603623 nova_compute[226235]: 2026-01-31 08:34:44.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:34:44 np0005603623 nova_compute[226235]: 2026-01-31 08:34:44.441 226239 INFO nova.scheduler.client.report [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Deleted allocations for instance adfc4c25-9eb9-45cc-ac90-2029677bcb67#033[00m
Jan 31 03:34:44 np0005603623 nova_compute[226235]: 2026-01-31 08:34:44.511 226239 DEBUG nova.network.neutron [req-0cd9fa2c-bc09-47e7-a602-85291d72f82e req-d5f3dee2-1bb8-42ac-8ae8-0f87d69cb7ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Updated VIF entry in instance network info cache for port ae035cfb-a17b-4578-a506-e2581da09f74. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:34:44 np0005603623 nova_compute[226235]: 2026-01-31 08:34:44.511 226239 DEBUG nova.network.neutron [req-0cd9fa2c-bc09-47e7-a602-85291d72f82e req-d5f3dee2-1bb8-42ac-8ae8-0f87d69cb7ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: adfc4c25-9eb9-45cc-ac90-2029677bcb67] Updating instance_info_cache with network_info: [{"id": "ae035cfb-a17b-4578-a506-e2581da09f74", "address": "fa:16:3e:30:5a:60", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": null, "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapae035cfb-a1", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:34:44 np0005603623 nova_compute[226235]: 2026-01-31 08:34:44.564 226239 DEBUG oslo_concurrency.lockutils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:44 np0005603623 nova_compute[226235]: 2026-01-31 08:34:44.565 226239 DEBUG oslo_concurrency.lockutils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:44 np0005603623 nova_compute[226235]: 2026-01-31 08:34:44.623 226239 DEBUG oslo_concurrency.lockutils [req-0cd9fa2c-bc09-47e7-a602-85291d72f82e req-d5f3dee2-1bb8-42ac-8ae8-0f87d69cb7ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-adfc4c25-9eb9-45cc-ac90-2029677bcb67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:34:44 np0005603623 nova_compute[226235]: 2026-01-31 08:34:44.642 226239 DEBUG oslo_concurrency.processutils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e326 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:44 np0005603623 nova_compute[226235]: 2026-01-31 08:34:44.896 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:45 np0005603623 nova_compute[226235]: 2026-01-31 08:34:45.175 226239 DEBUG oslo_concurrency.processutils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.533s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:45 np0005603623 nova_compute[226235]: 2026-01-31 08:34:45.183 226239 DEBUG nova.compute.provider_tree [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:34:45 np0005603623 nova_compute[226235]: 2026-01-31 08:34:45.204 226239 DEBUG nova.scheduler.client.report [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:34:45 np0005603623 nova_compute[226235]: 2026-01-31 08:34:45.254 226239 DEBUG oslo_concurrency.lockutils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.689s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:45.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:45 np0005603623 nova_compute[226235]: 2026-01-31 08:34:45.320 226239 DEBUG oslo_concurrency.lockutils [None req-3bddba1b-b4f5-4886-b692-22b399596ba7 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "adfc4c25-9eb9-45cc-ac90-2029677bcb67" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 21.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:45.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:46 np0005603623 nova_compute[226235]: 2026-01-31 08:34:46.550 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:47.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:47.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:48Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f9:68:3c 10.100.0.10
Jan 31 03:34:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:48Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f9:68:3c 10.100.0.10
Jan 31 03:34:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:49.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:49.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e326 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:49 np0005603623 nova_compute[226235]: 2026-01-31 08:34:49.897 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:51 np0005603623 nova_compute[226235]: 2026-01-31 08:34:51.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:51 np0005603623 nova_compute[226235]: 2026-01-31 08:34:51.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:34:51 np0005603623 nova_compute[226235]: 2026-01-31 08:34:51.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:34:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:51.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:51.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:51 np0005603623 nova_compute[226235]: 2026-01-31 08:34:51.393 226239 DEBUG nova.virt.libvirt.driver [None req-cef6d3eb-2256-4ec8-bd4f-544866b52f43 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 03:34:51 np0005603623 nova_compute[226235]: 2026-01-31 08:34:51.471 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:34:51 np0005603623 nova_compute[226235]: 2026-01-31 08:34:51.471 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:34:51 np0005603623 nova_compute[226235]: 2026-01-31 08:34:51.471 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:34:51 np0005603623 nova_compute[226235]: 2026-01-31 08:34:51.471 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:51 np0005603623 nova_compute[226235]: 2026-01-31 08:34:51.553 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:34:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:53.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:34:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:53.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:54 np0005603623 nova_compute[226235]: 2026-01-31 08:34:54.405 226239 INFO nova.virt.libvirt.driver [None req-cef6d3eb-2256-4ec8-bd4f-544866b52f43 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Instance shutdown successfully after 13 seconds.#033[00m
Jan 31 03:34:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e326 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:54 np0005603623 kernel: tap01f28bd1-9c (unregistering): left promiscuous mode
Jan 31 03:34:54 np0005603623 NetworkManager[48970]: <info>  [1769848494.7612] device (tap01f28bd1-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:34:54 np0005603623 nova_compute[226235]: 2026-01-31 08:34:54.775 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:54 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:54Z|00585|binding|INFO|Releasing lport 01f28bd1-9c6d-4b95-a258-4642d646f8c3 from this chassis (sb_readonly=0)
Jan 31 03:34:54 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:54Z|00586|binding|INFO|Setting lport 01f28bd1-9c6d-4b95-a258-4642d646f8c3 down in Southbound
Jan 31 03:34:54 np0005603623 ovn_controller[133449]: 2026-01-31T08:34:54Z|00587|binding|INFO|Removing iface tap01f28bd1-9c ovn-installed in OVS
Jan 31 03:34:54 np0005603623 nova_compute[226235]: 2026-01-31 08:34:54.778 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:54 np0005603623 nova_compute[226235]: 2026-01-31 08:34:54.783 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:54 np0005603623 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000008c.scope: Deactivated successfully.
Jan 31 03:34:54 np0005603623 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000008c.scope: Consumed 13.087s CPU time.
Jan 31 03:34:54 np0005603623 systemd-machined[194379]: Machine qemu-64-instance-0000008c terminated.
Jan 31 03:34:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:54.899 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:68:3c 10.100.0.10'], port_security=['fa:16:3e:f9:68:3c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6071a46-64a6-45aa-97c6-06e6c564195b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40db421b27d84f809f8074c58151327f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '986b09c9-4243-429e-9b6e-93ffcacf8cb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=111856e4-2ce2-4b64-a82d-6a5bd7b8a457, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=01f28bd1-9c6d-4b95-a258-4642d646f8c3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:34:54 np0005603623 nova_compute[226235]: 2026-01-31 08:34:54.900 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:54.901 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 01f28bd1-9c6d-4b95-a258-4642d646f8c3 in datapath f6071a46-64a6-45aa-97c6-06e6c564195b unbound from our chassis#033[00m
Jan 31 03:34:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:54.902 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f6071a46-64a6-45aa-97c6-06e6c564195b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:34:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:54.903 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8f293c48-823a-4857-afe8-e8701f1f98e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:54.904 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b namespace which is not needed anymore#033[00m
Jan 31 03:34:55 np0005603623 nova_compute[226235]: 2026-01-31 08:34:55.037 226239 INFO nova.virt.libvirt.driver [-] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Instance destroyed successfully.#033[00m
Jan 31 03:34:55 np0005603623 nova_compute[226235]: 2026-01-31 08:34:55.039 226239 DEBUG nova.objects.instance [None req-cef6d3eb-2256-4ec8-bd4f-544866b52f43 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lazy-loading 'numa_topology' on Instance uuid 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:55 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[289836]: [NOTICE]   (289840) : haproxy version is 2.8.14-c23fe91
Jan 31 03:34:55 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[289836]: [NOTICE]   (289840) : path to executable is /usr/sbin/haproxy
Jan 31 03:34:55 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[289836]: [WARNING]  (289840) : Exiting Master process...
Jan 31 03:34:55 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[289836]: [ALERT]    (289840) : Current worker (289842) exited with code 143 (Terminated)
Jan 31 03:34:55 np0005603623 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[289836]: [WARNING]  (289840) : All workers exited. Exiting... (0)
Jan 31 03:34:55 np0005603623 systemd[1]: libpod-092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00.scope: Deactivated successfully.
Jan 31 03:34:55 np0005603623 podman[290164]: 2026-01-31 08:34:55.060721073 +0000 UTC m=+0.053307267 container died 092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:34:55 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00-userdata-shm.mount: Deactivated successfully.
Jan 31 03:34:55 np0005603623 systemd[1]: var-lib-containers-storage-overlay-f0da21ba41bad48b5194d66e18cc9d2130f854337530fc591b96993d8015cad1-merged.mount: Deactivated successfully.
Jan 31 03:34:55 np0005603623 podman[290164]: 2026-01-31 08:34:55.102344744 +0000 UTC m=+0.094930928 container cleanup 092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 03:34:55 np0005603623 systemd[1]: libpod-conmon-092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00.scope: Deactivated successfully.
Jan 31 03:34:55 np0005603623 nova_compute[226235]: 2026-01-31 08:34:55.142 226239 DEBUG nova.compute.manager [None req-cef6d3eb-2256-4ec8-bd4f-544866b52f43 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:55 np0005603623 podman[290208]: 2026-01-31 08:34:55.160551163 +0000 UTC m=+0.042352034 container remove 092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:34:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:55.165 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7ed5578d-3106-4704-b1a4-9cf736a92b13]: (4, ('Sat Jan 31 08:34:54 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b (092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00)\n092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00\nSat Jan 31 08:34:55 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b (092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00)\n092037ad7413877c8bbcf787456a5681bbabc6cd860a3827820b54100f567f00\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:55.168 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb7957b-b7c5-461b-af6d-1b776d481c73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:55.169 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6071a46-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:55 np0005603623 nova_compute[226235]: 2026-01-31 08:34:55.172 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:55 np0005603623 kernel: tapf6071a46-60: left promiscuous mode
Jan 31 03:34:55 np0005603623 nova_compute[226235]: 2026-01-31 08:34:55.182 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:55.186 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc2d37b-b4b8-4d77-a038-6862ad6d7e8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:55.198 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7478dcf3-57fc-4854-a325-b3936a501e74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:55.200 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6b7e361d-14d6-42d1-8fdb-98f90d40bdc4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:55.213 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d5dd4d-0740-4ffc-af06-a5fd7b3e1a5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 760912, 'reachable_time': 30508, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290227, 'error': None, 'target': 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:55 np0005603623 systemd[1]: run-netns-ovnmeta\x2df6071a46\x2d64a6\x2d45aa\x2d97c6\x2d06e6c564195b.mount: Deactivated successfully.
Jan 31 03:34:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:55.216 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:34:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:34:55.217 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[6699814a-cc22-4b72-91c6-7c74c9ed0202]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:55.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:34:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:55.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:34:55 np0005603623 nova_compute[226235]: 2026-01-31 08:34:55.614 226239 DEBUG oslo_concurrency.lockutils [None req-cef6d3eb-2256-4ec8-bd4f-544866b52f43 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 14.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:56 np0005603623 nova_compute[226235]: 2026-01-31 08:34:56.555 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.007 226239 DEBUG nova.compute.manager [req-995104a6-3020-4190-b029-721804541cc6 req-d4584e47-146f-41d7-ac70-586c83118eef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Received event network-vif-unplugged-01f28bd1-9c6d-4b95-a258-4642d646f8c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.008 226239 DEBUG oslo_concurrency.lockutils [req-995104a6-3020-4190-b029-721804541cc6 req-d4584e47-146f-41d7-ac70-586c83118eef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.008 226239 DEBUG oslo_concurrency.lockutils [req-995104a6-3020-4190-b029-721804541cc6 req-d4584e47-146f-41d7-ac70-586c83118eef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.008 226239 DEBUG oslo_concurrency.lockutils [req-995104a6-3020-4190-b029-721804541cc6 req-d4584e47-146f-41d7-ac70-586c83118eef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.008 226239 DEBUG nova.compute.manager [req-995104a6-3020-4190-b029-721804541cc6 req-d4584e47-146f-41d7-ac70-586c83118eef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] No waiting events found dispatching network-vif-unplugged-01f28bd1-9c6d-4b95-a258-4642d646f8c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.009 226239 WARNING nova.compute.manager [req-995104a6-3020-4190-b029-721804541cc6 req-d4584e47-146f-41d7-ac70-586c83118eef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Received unexpected event network-vif-unplugged-01f28bd1-9c6d-4b95-a258-4642d646f8c3 for instance with vm_state stopped and task_state None.#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.255 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Updating instance_info_cache with network_info: [{"id": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "address": "fa:16:3e:f9:68:3c", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f28bd1-9c", "ovs_interfaceid": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:34:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:57.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:57.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.641 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.642 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.642 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.642 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.643 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.643 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.907 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.907 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.907 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.908 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:34:57 np0005603623 nova_compute[226235]: 2026-01-31 08:34:57.908 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:34:58 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4031320988' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:34:58 np0005603623 nova_compute[226235]: 2026-01-31 08:34:58.383 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:58 np0005603623 nova_compute[226235]: 2026-01-31 08:34:58.567 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:34:58 np0005603623 nova_compute[226235]: 2026-01-31 08:34:58.568 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:34:58 np0005603623 nova_compute[226235]: 2026-01-31 08:34:58.682 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:34:58 np0005603623 nova_compute[226235]: 2026-01-31 08:34:58.683 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4353MB free_disk=20.785491943359375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:34:58 np0005603623 nova_compute[226235]: 2026-01-31 08:34:58.683 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:58 np0005603623 nova_compute[226235]: 2026-01-31 08:34:58.683 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.207 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.208 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.208 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.230 226239 DEBUG nova.compute.manager [req-239d3d92-b745-47d2-bc4b-a8fa78a816fd req-c6d2cd6b-2f50-41c5-afe3-9210c5aedbc6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Received event network-vif-plugged-01f28bd1-9c6d-4b95-a258-4642d646f8c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.231 226239 DEBUG oslo_concurrency.lockutils [req-239d3d92-b745-47d2-bc4b-a8fa78a816fd req-c6d2cd6b-2f50-41c5-afe3-9210c5aedbc6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.231 226239 DEBUG oslo_concurrency.lockutils [req-239d3d92-b745-47d2-bc4b-a8fa78a816fd req-c6d2cd6b-2f50-41c5-afe3-9210c5aedbc6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.231 226239 DEBUG oslo_concurrency.lockutils [req-239d3d92-b745-47d2-bc4b-a8fa78a816fd req-c6d2cd6b-2f50-41c5-afe3-9210c5aedbc6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.231 226239 DEBUG nova.compute.manager [req-239d3d92-b745-47d2-bc4b-a8fa78a816fd req-c6d2cd6b-2f50-41c5-afe3-9210c5aedbc6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] No waiting events found dispatching network-vif-plugged-01f28bd1-9c6d-4b95-a258-4642d646f8c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.232 226239 WARNING nova.compute.manager [req-239d3d92-b745-47d2-bc4b-a8fa78a816fd req-c6d2cd6b-2f50-41c5-afe3-9210c5aedbc6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Received unexpected event network-vif-plugged-01f28bd1-9c6d-4b95-a258-4642d646f8c3 for instance with vm_state stopped and task_state None.#033[00m
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.241 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:59.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:34:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:59.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:34:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2393733873' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.645 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.650 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:34:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e326 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.780 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:34:59 np0005603623 nova_compute[226235]: 2026-01-31 08:34:59.903 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:00 np0005603623 nova_compute[226235]: 2026-01-31 08:35:00.113 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:35:00 np0005603623 nova_compute[226235]: 2026-01-31 08:35:00.113 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:00.456 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:35:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:00.457 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:35:00 np0005603623 nova_compute[226235]: 2026-01-31 08:35:00.458 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:00 np0005603623 nova_compute[226235]: 2026-01-31 08:35:00.625 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:00 np0005603623 nova_compute[226235]: 2026-01-31 08:35:00.626 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:01.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:01.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.604 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.634 226239 DEBUG oslo_concurrency.lockutils [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.634 226239 DEBUG oslo_concurrency.lockutils [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.635 226239 DEBUG oslo_concurrency.lockutils [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.635 226239 DEBUG oslo_concurrency.lockutils [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.635 226239 DEBUG oslo_concurrency.lockutils [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.636 226239 INFO nova.compute.manager [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Terminating instance#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.637 226239 DEBUG nova.compute.manager [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.643 226239 INFO nova.virt.libvirt.driver [-] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Instance destroyed successfully.#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.643 226239 DEBUG nova.objects.instance [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lazy-loading 'resources' on Instance uuid 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.769 226239 DEBUG nova.virt.libvirt.vif [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:34:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-375371164',display_name='tempest-Íñstáñcé-1118374102',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-375371164',id=140,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:34:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='40db421b27d84f809f8074c58151327f',ramdisk_id='',reservation_id='r-li94zx5u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-ServersTestJSON-1064072764',owner_user_name='tempest-ServersTestJSON-1064072764-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:34:58Z,user_data=None,user_id='fb3f20f0143d465ebfe98f6a13200890',uuid=155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "address": "fa:16:3e:f9:68:3c", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f28bd1-9c", "ovs_interfaceid": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.769 226239 DEBUG nova.network.os_vif_util [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converting VIF {"id": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "address": "fa:16:3e:f9:68:3c", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f28bd1-9c", "ovs_interfaceid": "01f28bd1-9c6d-4b95-a258-4642d646f8c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.770 226239 DEBUG nova.network.os_vif_util [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=01f28bd1-9c6d-4b95-a258-4642d646f8c3,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f28bd1-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.770 226239 DEBUG os_vif [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=01f28bd1-9c6d-4b95-a258-4642d646f8c3,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f28bd1-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.771 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.772 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01f28bd1-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.773 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.775 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:01 np0005603623 nova_compute[226235]: 2026-01-31 08:35:01.777 226239 INFO os_vif [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=01f28bd1-9c6d-4b95-a258-4642d646f8c3,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f28bd1-9c')#033[00m
Jan 31 03:35:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:03.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:03.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:04 np0005603623 nova_compute[226235]: 2026-01-31 08:35:04.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e326 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:04 np0005603623 nova_compute[226235]: 2026-01-31 08:35:04.849 226239 INFO nova.virt.libvirt.driver [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Deleting instance files /var/lib/nova/instances/155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_del#033[00m
Jan 31 03:35:04 np0005603623 nova_compute[226235]: 2026-01-31 08:35:04.850 226239 INFO nova.virt.libvirt.driver [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Deletion of /var/lib/nova/instances/155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6_del complete#033[00m
Jan 31 03:35:04 np0005603623 nova_compute[226235]: 2026-01-31 08:35:04.904 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:04 np0005603623 podman[290297]: 2026-01-31 08:35:04.95914681 +0000 UTC m=+0.047648720 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Jan 31 03:35:04 np0005603623 podman[290298]: 2026-01-31 08:35:04.987201837 +0000 UTC m=+0.067216682 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 31 03:35:05 np0005603623 nova_compute[226235]: 2026-01-31 08:35:05.022 226239 INFO nova.compute.manager [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Took 3.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:35:05 np0005603623 nova_compute[226235]: 2026-01-31 08:35:05.023 226239 DEBUG oslo.service.loopingcall [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:35:05 np0005603623 nova_compute[226235]: 2026-01-31 08:35:05.023 226239 DEBUG nova.compute.manager [-] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:35:05 np0005603623 nova_compute[226235]: 2026-01-31 08:35:05.023 226239 DEBUG nova.network.neutron [-] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:35:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:05.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:05.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:06 np0005603623 nova_compute[226235]: 2026-01-31 08:35:06.535 226239 DEBUG nova.network.neutron [-] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:35:06 np0005603623 nova_compute[226235]: 2026-01-31 08:35:06.774 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:06 np0005603623 nova_compute[226235]: 2026-01-31 08:35:06.823 226239 INFO nova.compute.manager [-] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Took 1.80 seconds to deallocate network for instance.#033[00m
Jan 31 03:35:07 np0005603623 nova_compute[226235]: 2026-01-31 08:35:07.243 226239 DEBUG oslo_concurrency.lockutils [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:07 np0005603623 nova_compute[226235]: 2026-01-31 08:35:07.244 226239 DEBUG oslo_concurrency.lockutils [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:07 np0005603623 nova_compute[226235]: 2026-01-31 08:35:07.285 226239 DEBUG oslo_concurrency.processutils [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:07.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:35:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:07.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:35:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:35:07 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2908702614' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:35:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e327 e327: 3 total, 3 up, 3 in
Jan 31 03:35:07 np0005603623 nova_compute[226235]: 2026-01-31 08:35:07.731 226239 DEBUG oslo_concurrency.processutils [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:07 np0005603623 nova_compute[226235]: 2026-01-31 08:35:07.736 226239 DEBUG nova.compute.provider_tree [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:35:08 np0005603623 nova_compute[226235]: 2026-01-31 08:35:08.905 226239 DEBUG nova.scheduler.client.report [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:35:08 np0005603623 nova_compute[226235]: 2026-01-31 08:35:08.910 226239 DEBUG nova.compute.manager [req-0b7cfadb-0e39-41fd-a279-cdb8fce67cae req-9d098e16-21ec-45f3-ae0e-a0c3cae0cded fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Received event network-vif-deleted-01f28bd1-9c6d-4b95-a258-4642d646f8c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:09 np0005603623 nova_compute[226235]: 2026-01-31 08:35:09.068 226239 DEBUG oslo_concurrency.lockutils [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:09 np0005603623 nova_compute[226235]: 2026-01-31 08:35:09.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:35:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:09.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:35:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:35:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:09.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:35:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:09.460 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:09 np0005603623 nova_compute[226235]: 2026-01-31 08:35:09.906 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:10 np0005603623 nova_compute[226235]: 2026-01-31 08:35:10.035 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848495.034011, 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:35:10 np0005603623 nova_compute[226235]: 2026-01-31 08:35:10.036 226239 INFO nova.compute.manager [-] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:35:10 np0005603623 nova_compute[226235]: 2026-01-31 08:35:10.283 226239 DEBUG nova.compute.manager [None req-54b85486-2567-4aca-904f-c28edfc09f8d - - - - - -] [instance: 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:10 np0005603623 nova_compute[226235]: 2026-01-31 08:35:10.577 226239 INFO nova.scheduler.client.report [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Deleted allocations for instance 155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6#033[00m
Jan 31 03:35:11 np0005603623 nova_compute[226235]: 2026-01-31 08:35:11.068 226239 DEBUG oslo_concurrency.lockutils [None req-3f1c7acc-99c7-4311-a54d-51b78bdc6f71 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "155a3ff7-9eb6-4e3f-9382-3f7fd3d985b6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:11.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:11.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:11 np0005603623 nova_compute[226235]: 2026-01-31 08:35:11.777 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:35:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:13.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:35:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:13.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e328 e328: 3 total, 3 up, 3 in
Jan 31 03:35:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:14 np0005603623 nova_compute[226235]: 2026-01-31 08:35:14.908 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:15.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:15.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e329 e329: 3 total, 3 up, 3 in
Jan 31 03:35:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e330 e330: 3 total, 3 up, 3 in
Jan 31 03:35:16 np0005603623 nova_compute[226235]: 2026-01-31 08:35:16.780 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:35:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:17.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:35:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:17.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e331 e331: 3 total, 3 up, 3 in
Jan 31 03:35:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:19.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:19.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:19 np0005603623 nova_compute[226235]: 2026-01-31 08:35:19.910 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:21.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:21.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:21 np0005603623 nova_compute[226235]: 2026-01-31 08:35:21.782 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:35:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:23.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:35:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:23.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 03:35:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:24 np0005603623 nova_compute[226235]: 2026-01-31 08:35:24.911 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 03:35:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 03:35:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:35:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:35:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:25.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:25.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e332 e332: 3 total, 3 up, 3 in
Jan 31 03:35:25 np0005603623 nova_compute[226235]: 2026-01-31 08:35:25.827 226239 DEBUG nova.compute.manager [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 31 03:35:26 np0005603623 nova_compute[226235]: 2026-01-31 08:35:26.251 226239 DEBUG oslo_concurrency.lockutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:26 np0005603623 nova_compute[226235]: 2026-01-31 08:35:26.252 226239 DEBUG oslo_concurrency.lockutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:26 np0005603623 nova_compute[226235]: 2026-01-31 08:35:26.532 226239 DEBUG nova.objects.instance [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lazy-loading 'pci_requests' on Instance uuid 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:26 np0005603623 nova_compute[226235]: 2026-01-31 08:35:26.705 226239 DEBUG nova.virt.hardware [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:35:26 np0005603623 nova_compute[226235]: 2026-01-31 08:35:26.706 226239 INFO nova.compute.claims [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:35:26 np0005603623 nova_compute[226235]: 2026-01-31 08:35:26.706 226239 DEBUG nova.objects.instance [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lazy-loading 'resources' on Instance uuid 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:26 np0005603623 nova_compute[226235]: 2026-01-31 08:35:26.784 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:27 np0005603623 nova_compute[226235]: 2026-01-31 08:35:27.196 226239 DEBUG nova.objects.instance [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lazy-loading 'numa_topology' on Instance uuid 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:27.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:27.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:27 np0005603623 nova_compute[226235]: 2026-01-31 08:35:27.462 226239 DEBUG nova.objects.instance [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lazy-loading 'pci_devices' on Instance uuid 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:27 np0005603623 nova_compute[226235]: 2026-01-31 08:35:27.881 226239 INFO nova.compute.resource_tracker [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Updating resource usage from migration c515f698-9e01-4a1c-97de-ee3d9443f03e#033[00m
Jan 31 03:35:27 np0005603623 nova_compute[226235]: 2026-01-31 08:35:27.882 226239 DEBUG nova.compute.resource_tracker [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Starting to track incoming migration c515f698-9e01-4a1c-97de-ee3d9443f03e with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 31 03:35:28 np0005603623 nova_compute[226235]: 2026-01-31 08:35:28.975 226239 DEBUG oslo_concurrency.processutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:29.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:35:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2534280391' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:35:29 np0005603623 nova_compute[226235]: 2026-01-31 08:35:29.360 226239 DEBUG oslo_concurrency.processutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.385s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:29 np0005603623 nova_compute[226235]: 2026-01-31 08:35:29.364 226239 DEBUG nova.compute.provider_tree [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:35:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:29.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:29 np0005603623 nova_compute[226235]: 2026-01-31 08:35:29.484 226239 DEBUG nova.scheduler.client.report [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:35:29 np0005603623 nova_compute[226235]: 2026-01-31 08:35:29.549 226239 DEBUG oslo_concurrency.lockutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 3.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:29 np0005603623 nova_compute[226235]: 2026-01-31 08:35:29.550 226239 INFO nova.compute.manager [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Migrating#033[00m
Jan 31 03:35:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:29 np0005603623 nova_compute[226235]: 2026-01-31 08:35:29.912 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:30.126 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:30.127 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:30.127 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:35:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:31.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:35:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:35:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:31.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:35:31 np0005603623 nova_compute[226235]: 2026-01-31 08:35:31.787 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:31 np0005603623 nova_compute[226235]: 2026-01-31 08:35:31.878 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:31 np0005603623 nova_compute[226235]: 2026-01-31 08:35:31.960 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:33 np0005603623 systemd[1]: Created slice User Slice of UID 42436.
Jan 31 03:35:33 np0005603623 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 31 03:35:33 np0005603623 systemd-logind[795]: New session 60 of user nova.
Jan 31 03:35:33 np0005603623 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 31 03:35:33 np0005603623 systemd[1]: Starting User Manager for UID 42436...
Jan 31 03:35:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:35:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:33.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:35:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:33.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:33 np0005603623 systemd[290809]: Queued start job for default target Main User Target.
Jan 31 03:35:33 np0005603623 systemd[290809]: Created slice User Application Slice.
Jan 31 03:35:33 np0005603623 systemd[290809]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:35:33 np0005603623 systemd[290809]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 03:35:33 np0005603623 systemd[290809]: Reached target Paths.
Jan 31 03:35:33 np0005603623 systemd[290809]: Reached target Timers.
Jan 31 03:35:33 np0005603623 systemd[290809]: Starting D-Bus User Message Bus Socket...
Jan 31 03:35:33 np0005603623 systemd[290809]: Starting Create User's Volatile Files and Directories...
Jan 31 03:35:33 np0005603623 systemd[290809]: Listening on D-Bus User Message Bus Socket.
Jan 31 03:35:33 np0005603623 systemd[290809]: Reached target Sockets.
Jan 31 03:35:33 np0005603623 systemd[290809]: Finished Create User's Volatile Files and Directories.
Jan 31 03:35:33 np0005603623 systemd[290809]: Reached target Basic System.
Jan 31 03:35:33 np0005603623 systemd[290809]: Reached target Main User Target.
Jan 31 03:35:33 np0005603623 systemd[290809]: Startup finished in 107ms.
Jan 31 03:35:33 np0005603623 systemd[1]: Started User Manager for UID 42436.
Jan 31 03:35:33 np0005603623 systemd[1]: Started Session 60 of User nova.
Jan 31 03:35:33 np0005603623 systemd[1]: session-60.scope: Deactivated successfully.
Jan 31 03:35:33 np0005603623 systemd-logind[795]: Session 60 logged out. Waiting for processes to exit.
Jan 31 03:35:33 np0005603623 systemd-logind[795]: Removed session 60.
Jan 31 03:35:33 np0005603623 systemd-logind[795]: New session 62 of user nova.
Jan 31 03:35:33 np0005603623 systemd[1]: Started Session 62 of User nova.
Jan 31 03:35:33 np0005603623 systemd[1]: session-62.scope: Deactivated successfully.
Jan 31 03:35:33 np0005603623 systemd-logind[795]: Session 62 logged out. Waiting for processes to exit.
Jan 31 03:35:33 np0005603623 systemd-logind[795]: Removed session 62.
Jan 31 03:35:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:34 np0005603623 nova_compute[226235]: 2026-01-31 08:35:34.914 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:35.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:35.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:35 np0005603623 podman[290832]: 2026-01-31 08:35:35.96091869 +0000 UTC m=+0.051654026 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:35:35 np0005603623 podman[290833]: 2026-01-31 08:35:35.97437296 +0000 UTC m=+0.067580823 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:35:36 np0005603623 nova_compute[226235]: 2026-01-31 08:35:36.789 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:36 np0005603623 nova_compute[226235]: 2026-01-31 08:35:36.826 226239 DEBUG nova.compute.manager [req-5a0bf9cc-61c1-4ee4-8e96-83804cf4228a req-0929bc51-b329-4b0b-92b9-2d5ea202a0ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received event network-vif-unplugged-58956ac4-88cf-49c2-988a-8a3746f1e622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:36 np0005603623 nova_compute[226235]: 2026-01-31 08:35:36.826 226239 DEBUG oslo_concurrency.lockutils [req-5a0bf9cc-61c1-4ee4-8e96-83804cf4228a req-0929bc51-b329-4b0b-92b9-2d5ea202a0ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:36 np0005603623 nova_compute[226235]: 2026-01-31 08:35:36.827 226239 DEBUG oslo_concurrency.lockutils [req-5a0bf9cc-61c1-4ee4-8e96-83804cf4228a req-0929bc51-b329-4b0b-92b9-2d5ea202a0ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:36 np0005603623 nova_compute[226235]: 2026-01-31 08:35:36.827 226239 DEBUG oslo_concurrency.lockutils [req-5a0bf9cc-61c1-4ee4-8e96-83804cf4228a req-0929bc51-b329-4b0b-92b9-2d5ea202a0ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:36 np0005603623 nova_compute[226235]: 2026-01-31 08:35:36.827 226239 DEBUG nova.compute.manager [req-5a0bf9cc-61c1-4ee4-8e96-83804cf4228a req-0929bc51-b329-4b0b-92b9-2d5ea202a0ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] No waiting events found dispatching network-vif-unplugged-58956ac4-88cf-49c2-988a-8a3746f1e622 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:36 np0005603623 nova_compute[226235]: 2026-01-31 08:35:36.827 226239 WARNING nova.compute.manager [req-5a0bf9cc-61c1-4ee4-8e96-83804cf4228a req-0929bc51-b329-4b0b-92b9-2d5ea202a0ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received unexpected event network-vif-unplugged-58956ac4-88cf-49c2-988a-8a3746f1e622 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 31 03:35:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:37.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:37.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:37 np0005603623 nova_compute[226235]: 2026-01-31 08:35:37.753 226239 INFO nova.network.neutron [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Updating port 58956ac4-88cf-49c2-988a-8a3746f1e622 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 31 03:35:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:39.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:39.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:39 np0005603623 nova_compute[226235]: 2026-01-31 08:35:39.699 226239 DEBUG nova.compute.manager [req-2acd6282-deab-46c9-95f4-685f3238b758 req-28ef24f2-6b6f-47ac-bb9e-550568164d21 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received event network-vif-plugged-58956ac4-88cf-49c2-988a-8a3746f1e622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:39 np0005603623 nova_compute[226235]: 2026-01-31 08:35:39.699 226239 DEBUG oslo_concurrency.lockutils [req-2acd6282-deab-46c9-95f4-685f3238b758 req-28ef24f2-6b6f-47ac-bb9e-550568164d21 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:39 np0005603623 nova_compute[226235]: 2026-01-31 08:35:39.699 226239 DEBUG oslo_concurrency.lockutils [req-2acd6282-deab-46c9-95f4-685f3238b758 req-28ef24f2-6b6f-47ac-bb9e-550568164d21 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:39 np0005603623 nova_compute[226235]: 2026-01-31 08:35:39.699 226239 DEBUG oslo_concurrency.lockutils [req-2acd6282-deab-46c9-95f4-685f3238b758 req-28ef24f2-6b6f-47ac-bb9e-550568164d21 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:39 np0005603623 nova_compute[226235]: 2026-01-31 08:35:39.700 226239 DEBUG nova.compute.manager [req-2acd6282-deab-46c9-95f4-685f3238b758 req-28ef24f2-6b6f-47ac-bb9e-550568164d21 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] No waiting events found dispatching network-vif-plugged-58956ac4-88cf-49c2-988a-8a3746f1e622 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:39 np0005603623 nova_compute[226235]: 2026-01-31 08:35:39.700 226239 WARNING nova.compute.manager [req-2acd6282-deab-46c9-95f4-685f3238b758 req-28ef24f2-6b6f-47ac-bb9e-550568164d21 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received unexpected event network-vif-plugged-58956ac4-88cf-49c2-988a-8a3746f1e622 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:35:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:39 np0005603623 nova_compute[226235]: 2026-01-31 08:35:39.915 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:41.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:41.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:41 np0005603623 nova_compute[226235]: 2026-01-31 08:35:41.826 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:42 np0005603623 nova_compute[226235]: 2026-01-31 08:35:42.054 226239 DEBUG oslo_concurrency.lockutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquiring lock "refresh_cache-60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:35:42 np0005603623 nova_compute[226235]: 2026-01-31 08:35:42.055 226239 DEBUG oslo_concurrency.lockutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquired lock "refresh_cache-60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:35:42 np0005603623 nova_compute[226235]: 2026-01-31 08:35:42.055 226239 DEBUG nova.network.neutron [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #118. Immutable memtables: 0.
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:42.747727) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 73] Flushing memtable with next log file: 118
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848542747814, "job": 73, "event": "flush_started", "num_memtables": 1, "num_entries": 1171, "num_deletes": 255, "total_data_size": 2377701, "memory_usage": 2405680, "flush_reason": "Manual Compaction"}
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 73] Level-0 flush table #119: started
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848542821011, "cf_name": "default", "job": 73, "event": "table_file_creation", "file_number": 119, "file_size": 1556740, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 59747, "largest_seqno": 60913, "table_properties": {"data_size": 1551316, "index_size": 2820, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12225, "raw_average_key_size": 20, "raw_value_size": 1540356, "raw_average_value_size": 2610, "num_data_blocks": 122, "num_entries": 590, "num_filter_entries": 590, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848470, "oldest_key_time": 1769848470, "file_creation_time": 1769848542, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 119, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 73] Flush lasted 73334 microseconds, and 3769 cpu microseconds.
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:42.821062) [db/flush_job.cc:967] [default] [JOB 73] Level-0 flush table #119: 1556740 bytes OK
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:42.821084) [db/memtable_list.cc:519] [default] Level-0 commit table #119 started
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:42.847751) [db/memtable_list.cc:722] [default] Level-0 commit table #119: memtable #1 done
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:42.847797) EVENT_LOG_v1 {"time_micros": 1769848542847789, "job": 73, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:42.847821) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 73] Try to delete WAL files size 2371937, prev total WAL file size 2371937, number of live WAL files 2.
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000115.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:42.848509) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035303230' seq:72057594037927935, type:22 .. '7061786F730035323732' seq:0, type:0; will stop at (end)
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 74] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 73 Base level 0, inputs: [119(1520KB)], [117(12MB)]
Jan 31 03:35:42 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848542848554, "job": 74, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [119], "files_L6": [117], "score": -1, "input_data_size": 14957154, "oldest_snapshot_seqno": -1}
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 74] Generated table #120: 8462 keys, 12785344 bytes, temperature: kUnknown
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848543248361, "cf_name": "default", "job": 74, "event": "table_file_creation", "file_number": 120, "file_size": 12785344, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12728184, "index_size": 34904, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21189, "raw_key_size": 219942, "raw_average_key_size": 25, "raw_value_size": 12577227, "raw_average_value_size": 1486, "num_data_blocks": 1366, "num_entries": 8462, "num_filter_entries": 8462, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769848542, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 120, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:35:43 np0005603623 nova_compute[226235]: 2026-01-31 08:35:43.249 226239 DEBUG nova.compute.manager [req-2ce5e9f0-e59e-4819-8633-5b0b462ca53d req-8679fbc9-e527-4a29-a342-b65491e48167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received event network-changed-58956ac4-88cf-49c2-988a-8a3746f1e622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:43 np0005603623 nova_compute[226235]: 2026-01-31 08:35:43.249 226239 DEBUG nova.compute.manager [req-2ce5e9f0-e59e-4819-8633-5b0b462ca53d req-8679fbc9-e527-4a29-a342-b65491e48167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Refreshing instance network info cache due to event network-changed-58956ac4-88cf-49c2-988a-8a3746f1e622. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:35:43 np0005603623 nova_compute[226235]: 2026-01-31 08:35:43.250 226239 DEBUG oslo_concurrency.lockutils [req-2ce5e9f0-e59e-4819-8633-5b0b462ca53d req-8679fbc9-e527-4a29-a342-b65491e48167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:43.248651) [db/compaction/compaction_job.cc:1663] [default] [JOB 74] Compacted 1@0 + 1@6 files to L6 => 12785344 bytes
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:43.278907) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 37.4 rd, 32.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 12.8 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(17.8) write-amplify(8.2) OK, records in: 8991, records dropped: 529 output_compression: NoCompression
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:43.278973) EVENT_LOG_v1 {"time_micros": 1769848543278948, "job": 74, "event": "compaction_finished", "compaction_time_micros": 399916, "compaction_time_cpu_micros": 24666, "output_level": 6, "num_output_files": 1, "total_output_size": 12785344, "num_input_records": 8991, "num_output_records": 8462, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000119.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848543279394, "job": 74, "event": "table_file_deletion", "file_number": 119}
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000117.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848543280240, "job": 74, "event": "table_file_deletion", "file_number": 117}
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:42.848363) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:43.280284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:43.280291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:43.280292) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:43.280293) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:35:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:35:43.280295) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:35:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:43.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:35:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:43.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:35:43 np0005603623 systemd[1]: Stopping User Manager for UID 42436...
Jan 31 03:35:43 np0005603623 systemd[290809]: Activating special unit Exit the Session...
Jan 31 03:35:43 np0005603623 systemd[290809]: Stopped target Main User Target.
Jan 31 03:35:43 np0005603623 systemd[290809]: Stopped target Basic System.
Jan 31 03:35:43 np0005603623 systemd[290809]: Stopped target Paths.
Jan 31 03:35:43 np0005603623 systemd[290809]: Stopped target Sockets.
Jan 31 03:35:43 np0005603623 systemd[290809]: Stopped target Timers.
Jan 31 03:35:43 np0005603623 systemd[290809]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:35:43 np0005603623 systemd[290809]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 31 03:35:43 np0005603623 systemd[290809]: Closed D-Bus User Message Bus Socket.
Jan 31 03:35:43 np0005603623 systemd[290809]: Stopped Create User's Volatile Files and Directories.
Jan 31 03:35:43 np0005603623 systemd[290809]: Removed slice User Application Slice.
Jan 31 03:35:43 np0005603623 systemd[290809]: Reached target Shutdown.
Jan 31 03:35:43 np0005603623 systemd[290809]: Finished Exit the Session.
Jan 31 03:35:43 np0005603623 systemd[290809]: Reached target Exit the Session.
Jan 31 03:35:43 np0005603623 systemd[1]: user@42436.service: Deactivated successfully.
Jan 31 03:35:43 np0005603623 systemd[1]: Stopped User Manager for UID 42436.
Jan 31 03:35:43 np0005603623 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 31 03:35:43 np0005603623 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 31 03:35:43 np0005603623 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 31 03:35:43 np0005603623 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 31 03:35:43 np0005603623 systemd[1]: Removed slice User Slice of UID 42436.
Jan 31 03:35:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:44 np0005603623 nova_compute[226235]: 2026-01-31 08:35:44.917 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:35:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:45.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:35:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:45.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:46 np0005603623 nova_compute[226235]: 2026-01-31 08:35:46.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:46 np0005603623 nova_compute[226235]: 2026-01-31 08:35:46.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:35:46 np0005603623 nova_compute[226235]: 2026-01-31 08:35:46.828 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:47.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:47.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:47 np0005603623 nova_compute[226235]: 2026-01-31 08:35:47.816 226239 DEBUG nova.network.neutron [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Updating instance_info_cache with network_info: [{"id": "58956ac4-88cf-49c2-988a-8a3746f1e622", "address": "fa:16:3e:75:ff:26", "network": {"id": "e45621cc-e984-4d02-a4f7-adf5b5457b33", "bridge": "br-int", "label": "tempest-network-smoke--147789550", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58956ac4-88", "ovs_interfaceid": "58956ac4-88cf-49c2-988a-8a3746f1e622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:35:48 np0005603623 nova_compute[226235]: 2026-01-31 08:35:48.780 226239 DEBUG oslo_concurrency.lockutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Releasing lock "refresh_cache-60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:35:48 np0005603623 nova_compute[226235]: 2026-01-31 08:35:48.783 226239 DEBUG oslo_concurrency.lockutils [req-2ce5e9f0-e59e-4819-8633-5b0b462ca53d req-8679fbc9-e527-4a29-a342-b65491e48167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:35:48 np0005603623 nova_compute[226235]: 2026-01-31 08:35:48.784 226239 DEBUG nova.network.neutron [req-2ce5e9f0-e59e-4819-8633-5b0b462ca53d req-8679fbc9-e527-4a29-a342-b65491e48167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Refreshing network info cache for port 58956ac4-88cf-49c2-988a-8a3746f1e622 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:35:49 np0005603623 nova_compute[226235]: 2026-01-31 08:35:49.194 226239 DEBUG oslo_concurrency.lockutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Acquiring lock "80b18469-4c81-4aa6-b657-efec55ba102b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:49 np0005603623 nova_compute[226235]: 2026-01-31 08:35:49.195 226239 DEBUG oslo_concurrency.lockutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "80b18469-4c81-4aa6-b657-efec55ba102b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:49 np0005603623 nova_compute[226235]: 2026-01-31 08:35:49.286 226239 DEBUG nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 31 03:35:49 np0005603623 nova_compute[226235]: 2026-01-31 08:35:49.288 226239 DEBUG nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:35:49 np0005603623 nova_compute[226235]: 2026-01-31 08:35:49.288 226239 INFO nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Creating image(s)#033[00m
Jan 31 03:35:49 np0005603623 nova_compute[226235]: 2026-01-31 08:35:49.322 226239 DEBUG nova.storage.rbd_utils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] creating snapshot(nova-resize) on rbd image(60462c66-f02d-4ca4-aa2a-b6ea91c8a6af_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:35:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:35:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:49.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:35:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:49.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:49 np0005603623 nova_compute[226235]: 2026-01-31 08:35:49.530 226239 DEBUG nova.compute.manager [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:35:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e333 e333: 3 total, 3 up, 3 in
Jan 31 03:35:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:49 np0005603623 nova_compute[226235]: 2026-01-31 08:35:49.918 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.006 226239 DEBUG oslo_concurrency.lockutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.006 226239 DEBUG oslo_concurrency.lockutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.015 226239 DEBUG nova.virt.hardware [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.015 226239 INFO nova.compute.claims [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.216 226239 DEBUG nova.objects.instance [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.684 226239 DEBUG nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.685 226239 DEBUG nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Ensure instance console log exists: /var/lib/nova/instances/60462c66-f02d-4ca4-aa2a-b6ea91c8a6af/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.685 226239 DEBUG oslo_concurrency.lockutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.685 226239 DEBUG oslo_concurrency.lockutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.685 226239 DEBUG oslo_concurrency.lockutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.687 226239 DEBUG nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Start _get_guest_xml network_info=[{"id": "58956ac4-88cf-49c2-988a-8a3746f1e622", "address": "fa:16:3e:75:ff:26", "network": {"id": "e45621cc-e984-4d02-a4f7-adf5b5457b33", "bridge": "br-int", "label": "tempest-network-smoke--147789550", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--147789550", "vif_mac": "fa:16:3e:75:ff:26"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58956ac4-88", "ovs_interfaceid": "58956ac4-88cf-49c2-988a-8a3746f1e622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.692 226239 WARNING nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.696 226239 DEBUG nova.virt.libvirt.host [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.696 226239 DEBUG nova.virt.libvirt.host [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.700 226239 DEBUG nova.virt.libvirt.host [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.700 226239 DEBUG nova.virt.libvirt.host [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.701 226239 DEBUG nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.701 226239 DEBUG nova.virt.hardware [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.702 226239 DEBUG nova.virt.hardware [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.702 226239 DEBUG nova.virt.hardware [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.702 226239 DEBUG nova.virt.hardware [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.702 226239 DEBUG nova.virt.hardware [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.703 226239 DEBUG nova.virt.hardware [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.703 226239 DEBUG nova.virt.hardware [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.703 226239 DEBUG nova.virt.hardware [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.703 226239 DEBUG nova.virt.hardware [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.703 226239 DEBUG nova.virt.hardware [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.704 226239 DEBUG nova.virt.hardware [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.704 226239 DEBUG nova.objects.instance [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.755 226239 DEBUG oslo_concurrency.processutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:50 np0005603623 nova_compute[226235]: 2026-01-31 08:35:50.997 226239 DEBUG nova.scheduler.client.report [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.117 226239 DEBUG nova.scheduler.client.report [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.118 226239 DEBUG nova.compute.provider_tree [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.148 226239 DEBUG nova.scheduler.client.report [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:35:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:35:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3825252750' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.230 226239 DEBUG nova.scheduler.client.report [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.271 226239 DEBUG oslo_concurrency.processutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.313 226239 DEBUG oslo_concurrency.processutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:51.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.403 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:51.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:35:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3117998316' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.754 226239 DEBUG oslo_concurrency.processutils [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.756 226239 DEBUG nova.virt.libvirt.vif [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:34:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1672201743',display_name='tempest-TestNetworkAdvancedServerOps-server-1672201743',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1672201743',id=141,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUoionOf1jsbgYnjxtSF8S5kbM7WrnC+AvzdWQ5Iv9NrHSu1YTmh7OvNKWVCt94tfduQMP4jFzkhpdFTOQdH6c769sX4vCZIDbSCuBl9lgkWTK5Ks3sTtkCsO2rA5PBWA==',key_name='tempest-TestNetworkAdvancedServerOps-2012991436',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:34:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-4je44sr0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:35:37Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=60462c66-f02d-4ca4-aa2a-b6ea91c8a6af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58956ac4-88cf-49c2-988a-8a3746f1e622", "address": "fa:16:3e:75:ff:26", "network": {"id": "e45621cc-e984-4d02-a4f7-adf5b5457b33", "bridge": "br-int", "label": "tempest-network-smoke--147789550", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--147789550", "vif_mac": "fa:16:3e:75:ff:26"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58956ac4-88", "ovs_interfaceid": "58956ac4-88cf-49c2-988a-8a3746f1e622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.757 226239 DEBUG nova.network.os_vif_util [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Converting VIF {"id": "58956ac4-88cf-49c2-988a-8a3746f1e622", "address": "fa:16:3e:75:ff:26", "network": {"id": "e45621cc-e984-4d02-a4f7-adf5b5457b33", "bridge": "br-int", "label": "tempest-network-smoke--147789550", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--147789550", "vif_mac": "fa:16:3e:75:ff:26"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58956ac4-88", "ovs_interfaceid": "58956ac4-88cf-49c2-988a-8a3746f1e622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.758 226239 DEBUG nova.network.os_vif_util [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:75:ff:26,bridge_name='br-int',has_traffic_filtering=True,id=58956ac4-88cf-49c2-988a-8a3746f1e622,network=Network(e45621cc-e984-4d02-a4f7-adf5b5457b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58956ac4-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.761 226239 DEBUG nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  <uuid>60462c66-f02d-4ca4-aa2a-b6ea91c8a6af</uuid>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  <name>instance-0000008d</name>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1672201743</nova:name>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:35:50</nova:creationTime>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <nova:user uuid="f1c6e7eff11b435a81429826a682b32f">tempest-TestNetworkAdvancedServerOps-840410497-project-member</nova:user>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <nova:project uuid="0bfe11bd9d694684b527666e2c378eed">tempest-TestNetworkAdvancedServerOps-840410497</nova:project>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <nova:port uuid="58956ac4-88cf-49c2-988a-8a3746f1e622">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <entry name="serial">60462c66-f02d-4ca4-aa2a-b6ea91c8a6af</entry>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <entry name="uuid">60462c66-f02d-4ca4-aa2a-b6ea91c8a6af</entry>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/60462c66-f02d-4ca4-aa2a-b6ea91c8a6af_disk">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/60462c66-f02d-4ca4-aa2a-b6ea91c8a6af_disk.config">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:75:ff:26"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <target dev="tap58956ac4-88"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/60462c66-f02d-4ca4-aa2a-b6ea91c8a6af/console.log" append="off"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:35:51 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:35:51 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:35:51 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:35:51 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.763 226239 DEBUG nova.virt.libvirt.vif [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:34:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1672201743',display_name='tempest-TestNetworkAdvancedServerOps-server-1672201743',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1672201743',id=141,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUoionOf1jsbgYnjxtSF8S5kbM7WrnC+AvzdWQ5Iv9NrHSu1YTmh7OvNKWVCt94tfduQMP4jFzkhpdFTOQdH6c769sX4vCZIDbSCuBl9lgkWTK5Ks3sTtkCsO2rA5PBWA==',key_name='tempest-TestNetworkAdvancedServerOps-2012991436',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:34:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-4je44sr0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:35:37Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=60462c66-f02d-4ca4-aa2a-b6ea91c8a6af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58956ac4-88cf-49c2-988a-8a3746f1e622", "address": "fa:16:3e:75:ff:26", "network": {"id": "e45621cc-e984-4d02-a4f7-adf5b5457b33", "bridge": "br-int", "label": "tempest-network-smoke--147789550", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--147789550", "vif_mac": "fa:16:3e:75:ff:26"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58956ac4-88", "ovs_interfaceid": "58956ac4-88cf-49c2-988a-8a3746f1e622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.764 226239 DEBUG nova.network.os_vif_util [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Converting VIF {"id": "58956ac4-88cf-49c2-988a-8a3746f1e622", "address": "fa:16:3e:75:ff:26", "network": {"id": "e45621cc-e984-4d02-a4f7-adf5b5457b33", "bridge": "br-int", "label": "tempest-network-smoke--147789550", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--147789550", "vif_mac": "fa:16:3e:75:ff:26"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58956ac4-88", "ovs_interfaceid": "58956ac4-88cf-49c2-988a-8a3746f1e622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.764 226239 DEBUG nova.network.os_vif_util [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:75:ff:26,bridge_name='br-int',has_traffic_filtering=True,id=58956ac4-88cf-49c2-988a-8a3746f1e622,network=Network(e45621cc-e984-4d02-a4f7-adf5b5457b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58956ac4-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.765 226239 DEBUG os_vif [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:75:ff:26,bridge_name='br-int',has_traffic_filtering=True,id=58956ac4-88cf-49c2-988a-8a3746f1e622,network=Network(e45621cc-e984-4d02-a4f7-adf5b5457b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58956ac4-88') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.766 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.766 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.767 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.770 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.770 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58956ac4-88, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.771 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap58956ac4-88, col_values=(('external_ids', {'iface-id': '58956ac4-88cf-49c2-988a-8a3746f1e622', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:75:ff:26', 'vm-uuid': '60462c66-f02d-4ca4-aa2a-b6ea91c8a6af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.773 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603623 NetworkManager[48970]: <info>  [1769848551.7738] manager: (tap58956ac4-88): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.775 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.779 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.780 226239 INFO os_vif [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:75:ff:26,bridge_name='br-int',has_traffic_filtering=True,id=58956ac4-88cf-49c2-988a-8a3746f1e622,network=Network(e45621cc-e984-4d02-a4f7-adf5b5457b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58956ac4-88')#033[00m
Jan 31 03:35:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:35:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1718812366' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.822 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:51 np0005603623 nova_compute[226235]: 2026-01-31 08:35:51.828 226239 DEBUG nova.compute.provider_tree [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.051 226239 DEBUG nova.scheduler.client.report [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.090 226239 DEBUG nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.090 226239 DEBUG nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.091 226239 DEBUG nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] No VIF found with MAC fa:16:3e:75:ff:26, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.092 226239 INFO nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Using config drive#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:35:52 np0005603623 kernel: tap58956ac4-88: entered promiscuous mode
Jan 31 03:35:52 np0005603623 NetworkManager[48970]: <info>  [1769848552.1747] manager: (tap58956ac4-88): new Tun device (/org/freedesktop/NetworkManager/Devices/275)
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.175 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:35:52Z|00588|binding|INFO|Claiming lport 58956ac4-88cf-49c2-988a-8a3746f1e622 for this chassis.
Jan 31 03:35:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:35:52Z|00589|binding|INFO|58956ac4-88cf-49c2-988a-8a3746f1e622: Claiming fa:16:3e:75:ff:26 10.100.0.7
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.179 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.181 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.186 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:52 np0005603623 NetworkManager[48970]: <info>  [1769848552.1903] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Jan 31 03:35:52 np0005603623 NetworkManager[48970]: <info>  [1769848552.1911] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Jan 31 03:35:52 np0005603623 systemd-udevd[291124]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:35:52 np0005603623 systemd-machined[194379]: New machine qemu-65-instance-0000008d.
Jan 31 03:35:52 np0005603623 NetworkManager[48970]: <info>  [1769848552.2105] device (tap58956ac4-88): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:35:52 np0005603623 NetworkManager[48970]: <info>  [1769848552.2113] device (tap58956ac4-88): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.230 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:52 np0005603623 systemd[1]: Started Virtual Machine qemu-65-instance-0000008d.
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.233 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.241 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.248 226239 DEBUG oslo_concurrency.lockutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.249 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:ff:26 10.100.0.7'], port_security=['fa:16:3e:75:ff:26 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '60462c66-f02d-4ca4-aa2a-b6ea91c8a6af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e45621cc-e984-4d02-a4f7-adf5b5457b33', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1bca5a82-b0f2-4237-92f5-d7d2dbf4afe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.183'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e036da0b-b229-4d68-8cb9-77eeebb375fb, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=58956ac4-88cf-49c2-988a-8a3746f1e622) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.249 226239 DEBUG nova.compute.manager [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.250 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 58956ac4-88cf-49c2-988a-8a3746f1e622 in datapath e45621cc-e984-4d02-a4f7-adf5b5457b33 bound to our chassis#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.252 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e45621cc-e984-4d02-a4f7-adf5b5457b33#033[00m
Jan 31 03:35:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:35:52Z|00590|binding|INFO|Setting lport 58956ac4-88cf-49c2-988a-8a3746f1e622 ovn-installed in OVS
Jan 31 03:35:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:35:52Z|00591|binding|INFO|Setting lport 58956ac4-88cf-49c2-988a-8a3746f1e622 up in Southbound
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.257 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.261 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.261 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0f34a1db-a609-4863-8f95-bfb8e0a09346]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.262 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape45621cc-e1 in ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.264 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape45621cc-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.264 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[815d89d5-50d5-4af9-86e9-1b9ba8b148ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.265 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e4586e-9e9c-46a8-8091-aa9f554f27ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.274 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[fb233e20-2d14-4f16-8837-e193b5a507ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.282 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[121d7e16-593c-4230-87ec-095129795de8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.285 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.286 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.304 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[db148d61-5e6b-4e94-a656-e3f4463c818f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 NetworkManager[48970]: <info>  [1769848552.3124] manager: (tape45621cc-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/278)
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.312 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1d7715ae-b0ca-47d0-a5b3-5cd2d418ea0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.333 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4f99e0-be7d-43b7-ba65-7878648c1826]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.336 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[9e9da2ec-619f-4ec8-8b58-dfb4fb303e32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 NetworkManager[48970]: <info>  [1769848552.3507] device (tape45621cc-e0): carrier: link connected
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.355 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[496065e7-0b14-4f9c-b23c-12ce2a302304]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.361 226239 DEBUG nova.compute.manager [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.366 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c1faf02c-afc4-467b-b891-4044d77a6fa3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape45621cc-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:e8:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 769278, 'reachable_time': 25186, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291159, 'error': None, 'target': 'ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.380 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[893e442f-a294-40e0-997f-c60685a5af09]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:e862'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 769278, 'tstamp': 769278}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291160, 'error': None, 'target': 'ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.392 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e755bc2d-ea46-4d91-b81d-d2376b9d0f63]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape45621cc-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:e8:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 769278, 'reachable_time': 25186, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 291161, 'error': None, 'target': 'ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.414 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[01aa6d55-5349-416a-9b80-1f4c498f63a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.456 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[033b1a59-443c-433b-814d-add6a9ad7007]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.458 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape45621cc-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.458 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.459 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape45621cc-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:52 np0005603623 NetworkManager[48970]: <info>  [1769848552.4610] manager: (tape45621cc-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Jan 31 03:35:52 np0005603623 kernel: tape45621cc-e0: entered promiscuous mode
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.460 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.462 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape45621cc-e0, col_values=(('external_ids', {'iface-id': '98bdd03c-3803-4f50-b99f-a5baefc4ec8a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.464 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e45621cc-e984-4d02-a4f7-adf5b5457b33.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e45621cc-e984-4d02-a4f7-adf5b5457b33.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:35:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:35:52Z|00592|binding|INFO|Releasing lport 98bdd03c-3803-4f50-b99f-a5baefc4ec8a from this chassis (sb_readonly=0)
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.464 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.465 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c056dc4e-ad0d-4ede-9359-cf40fec464e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.466 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-e45621cc-e984-4d02-a4f7-adf5b5457b33
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/e45621cc-e984-4d02-a4f7-adf5b5457b33.pid.haproxy
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID e45621cc-e984-4d02-a4f7-adf5b5457b33
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:35:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:35:52.467 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33', 'env', 'PROCESS_TAG=haproxy-e45621cc-e984-4d02-a4f7-adf5b5457b33', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e45621cc-e984-4d02-a4f7-adf5b5457b33.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.469 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.532 226239 DEBUG nova.network.neutron [req-2ce5e9f0-e59e-4819-8633-5b0b462ca53d req-8679fbc9-e527-4a29-a342-b65491e48167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Updated VIF entry in instance network info cache for port 58956ac4-88cf-49c2-988a-8a3746f1e622. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.532 226239 DEBUG nova.network.neutron [req-2ce5e9f0-e59e-4819-8633-5b0b462ca53d req-8679fbc9-e527-4a29-a342-b65491e48167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Updating instance_info_cache with network_info: [{"id": "58956ac4-88cf-49c2-988a-8a3746f1e622", "address": "fa:16:3e:75:ff:26", "network": {"id": "e45621cc-e984-4d02-a4f7-adf5b5457b33", "bridge": "br-int", "label": "tempest-network-smoke--147789550", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58956ac4-88", "ovs_interfaceid": "58956ac4-88cf-49c2-988a-8a3746f1e622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.542 226239 INFO nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.803 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848552.8036304, 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.804 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.805 226239 DEBUG nova.compute.manager [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.808 226239 INFO nova.virt.libvirt.driver [-] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Instance running successfully.#033[00m
Jan 31 03:35:52 np0005603623 virtqemud[225858]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.810 226239 DEBUG nova.virt.libvirt.guest [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:35:52 np0005603623 nova_compute[226235]: 2026-01-31 08:35:52.810 226239 DEBUG nova.virt.libvirt.driver [None req-7fad474e-c634-4e22-8175-4b33c26a0248 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 31 03:35:52 np0005603623 podman[291234]: 2026-01-31 08:35:52.788069609 +0000 UTC m=+0.018199099 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:35:52 np0005603623 podman[291234]: 2026-01-31 08:35:52.977423647 +0000 UTC m=+0.207553117 container create 1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 03:35:53 np0005603623 systemd[1]: Started libpod-conmon-1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8.scope.
Jan 31 03:35:53 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:35:53 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cefa4ab7f7559fbd57a17ddf1fc213012691c5275abfad0067c1df7e00426a26/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:35:53 np0005603623 podman[291234]: 2026-01-31 08:35:53.105426999 +0000 UTC m=+0.335556489 container init 1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 03:35:53 np0005603623 podman[291234]: 2026-01-31 08:35:53.110636832 +0000 UTC m=+0.340766302 container start 1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:35:53 np0005603623 neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33[291251]: [NOTICE]   (291255) : New worker (291257) forked
Jan 31 03:35:53 np0005603623 neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33[291251]: [NOTICE]   (291255) : Loading success.
Jan 31 03:35:53 np0005603623 nova_compute[226235]: 2026-01-31 08:35:53.318 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:53 np0005603623 nova_compute[226235]: 2026-01-31 08:35:53.320 226239 DEBUG oslo_concurrency.lockutils [req-2ce5e9f0-e59e-4819-8633-5b0b462ca53d req-8679fbc9-e527-4a29-a342-b65491e48167 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:35:53 np0005603623 nova_compute[226235]: 2026-01-31 08:35:53.320 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:35:53 np0005603623 nova_compute[226235]: 2026-01-31 08:35:53.321 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:35:53 np0005603623 nova_compute[226235]: 2026-01-31 08:35:53.321 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:53 np0005603623 nova_compute[226235]: 2026-01-31 08:35:53.324 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:35:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:53.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:53.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:53 np0005603623 nova_compute[226235]: 2026-01-31 08:35:53.582 226239 DEBUG nova.compute.manager [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:35:53 np0005603623 nova_compute[226235]: 2026-01-31 08:35:53.640 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 31 03:35:53 np0005603623 nova_compute[226235]: 2026-01-31 08:35:53.641 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848552.8063319, 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:35:53 np0005603623 nova_compute[226235]: 2026-01-31 08:35:53.641 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] VM Started (Lifecycle Event)#033[00m
Jan 31 03:35:53 np0005603623 nova_compute[226235]: 2026-01-31 08:35:53.943 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:53 np0005603623 nova_compute[226235]: 2026-01-31 08:35:53.946 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.132 226239 DEBUG nova.compute.manager [req-e6c99750-a036-4e7c-9a2b-7c17e78bdc74 req-82077e0c-a972-42b1-8ca0-7cf663b143af fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received event network-vif-plugged-58956ac4-88cf-49c2-988a-8a3746f1e622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.133 226239 DEBUG oslo_concurrency.lockutils [req-e6c99750-a036-4e7c-9a2b-7c17e78bdc74 req-82077e0c-a972-42b1-8ca0-7cf663b143af fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.133 226239 DEBUG oslo_concurrency.lockutils [req-e6c99750-a036-4e7c-9a2b-7c17e78bdc74 req-82077e0c-a972-42b1-8ca0-7cf663b143af fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.133 226239 DEBUG oslo_concurrency.lockutils [req-e6c99750-a036-4e7c-9a2b-7c17e78bdc74 req-82077e0c-a972-42b1-8ca0-7cf663b143af fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.134 226239 DEBUG nova.compute.manager [req-e6c99750-a036-4e7c-9a2b-7c17e78bdc74 req-82077e0c-a972-42b1-8ca0-7cf663b143af fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] No waiting events found dispatching network-vif-plugged-58956ac4-88cf-49c2-988a-8a3746f1e622 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.134 226239 WARNING nova.compute.manager [req-e6c99750-a036-4e7c-9a2b-7c17e78bdc74 req-82077e0c-a972-42b1-8ca0-7cf663b143af fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received unexpected event network-vif-plugged-58956ac4-88cf-49c2-988a-8a3746f1e622 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.703 226239 DEBUG nova.compute.manager [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.705 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.705 226239 INFO nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Creating image(s)#033[00m
Jan 31 03:35:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.865 226239 DEBUG nova.storage.rbd_utils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.900 226239 DEBUG nova.storage.rbd_utils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.929 226239 DEBUG nova.storage.rbd_utils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.932 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.947 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.979 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.980 226239 DEBUG oslo_concurrency.lockutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.980 226239 DEBUG oslo_concurrency.lockutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:54 np0005603623 nova_compute[226235]: 2026-01-31 08:35:54.981 226239 DEBUG oslo_concurrency.lockutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:55 np0005603623 nova_compute[226235]: 2026-01-31 08:35:55.004 226239 DEBUG nova.storage.rbd_utils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:55 np0005603623 nova_compute[226235]: 2026-01-31 08:35:55.007 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 80b18469-4c81-4aa6-b657-efec55ba102b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:55.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:35:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:55.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:35:56 np0005603623 nova_compute[226235]: 2026-01-31 08:35:56.806 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:57.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:57.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:57 np0005603623 nova_compute[226235]: 2026-01-31 08:35:57.585 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 80b18469-4c81-4aa6-b657-efec55ba102b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.578s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:57 np0005603623 nova_compute[226235]: 2026-01-31 08:35:57.866 226239 DEBUG nova.compute.manager [req-9b7d1204-994a-405e-bd9e-62d20640772f req-bef38eb4-7af8-4ad5-9558-5ddd16cab4dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received event network-vif-plugged-58956ac4-88cf-49c2-988a-8a3746f1e622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:57 np0005603623 nova_compute[226235]: 2026-01-31 08:35:57.867 226239 DEBUG oslo_concurrency.lockutils [req-9b7d1204-994a-405e-bd9e-62d20640772f req-bef38eb4-7af8-4ad5-9558-5ddd16cab4dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:57 np0005603623 nova_compute[226235]: 2026-01-31 08:35:57.867 226239 DEBUG oslo_concurrency.lockutils [req-9b7d1204-994a-405e-bd9e-62d20640772f req-bef38eb4-7af8-4ad5-9558-5ddd16cab4dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:57 np0005603623 nova_compute[226235]: 2026-01-31 08:35:57.867 226239 DEBUG oslo_concurrency.lockutils [req-9b7d1204-994a-405e-bd9e-62d20640772f req-bef38eb4-7af8-4ad5-9558-5ddd16cab4dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:57 np0005603623 nova_compute[226235]: 2026-01-31 08:35:57.868 226239 DEBUG nova.compute.manager [req-9b7d1204-994a-405e-bd9e-62d20640772f req-bef38eb4-7af8-4ad5-9558-5ddd16cab4dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] No waiting events found dispatching network-vif-plugged-58956ac4-88cf-49c2-988a-8a3746f1e622 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:57 np0005603623 nova_compute[226235]: 2026-01-31 08:35:57.868 226239 WARNING nova.compute.manager [req-9b7d1204-994a-405e-bd9e-62d20640772f req-bef38eb4-7af8-4ad5-9558-5ddd16cab4dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received unexpected event network-vif-plugged-58956ac4-88cf-49c2-988a-8a3746f1e622 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:35:57 np0005603623 nova_compute[226235]: 2026-01-31 08:35:57.909 226239 DEBUG nova.storage.rbd_utils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] resizing rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.119 226239 DEBUG nova.objects.instance [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lazy-loading 'migration_context' on Instance uuid 80b18469-4c81-4aa6-b657-efec55ba102b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.169 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.170 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Ensure instance console log exists: /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.170 226239 DEBUG oslo_concurrency.lockutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.170 226239 DEBUG oslo_concurrency.lockutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.171 226239 DEBUG oslo_concurrency.lockutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.172 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.176 226239 WARNING nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.180 226239 DEBUG nova.virt.libvirt.host [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.181 226239 DEBUG nova.virt.libvirt.host [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.184 226239 DEBUG nova.virt.libvirt.host [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.184 226239 DEBUG nova.virt.libvirt.host [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.185 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.188 226239 DEBUG nova.virt.hardware [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.189 226239 DEBUG nova.virt.hardware [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.189 226239 DEBUG nova.virt.hardware [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.189 226239 DEBUG nova.virt.hardware [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.189 226239 DEBUG nova.virt.hardware [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.190 226239 DEBUG nova.virt.hardware [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.190 226239 DEBUG nova.virt.hardware [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.191 226239 DEBUG nova.virt.hardware [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.191 226239 DEBUG nova.virt.hardware [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.191 226239 DEBUG nova.virt.hardware [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.191 226239 DEBUG nova.virt.hardware [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.194 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:35:59Z|00593|binding|INFO|Releasing lport 98bdd03c-3803-4f50-b99f-a5baefc4ec8a from this chassis (sb_readonly=0)
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.221 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:59.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:35:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:35:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:59.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.527 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Updating instance_info_cache with network_info: [{"id": "58956ac4-88cf-49c2-988a-8a3746f1e622", "address": "fa:16:3e:75:ff:26", "network": {"id": "e45621cc-e984-4d02-a4f7-adf5b5457b33", "bridge": "br-int", "label": "tempest-network-smoke--147789550", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58956ac4-88", "ovs_interfaceid": "58956ac4-88cf-49c2-988a-8a3746f1e622", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:35:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:35:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3227668991' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.698 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.698 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.699 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.699 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.699 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.700 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.700 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.809 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.809 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.809 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.810 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.810 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.923 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.945 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.751s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.972 226239 DEBUG nova.storage.rbd_utils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:59 np0005603623 nova_compute[226235]: 2026-01-31 08:35:59.977 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:36:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:36:00 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/859189101' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:36:00 np0005603623 nova_compute[226235]: 2026-01-31 08:36:00.254 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:36:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:36:00 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1029663368' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:36:00 np0005603623 nova_compute[226235]: 2026-01-31 08:36:00.459 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:36:00 np0005603623 nova_compute[226235]: 2026-01-31 08:36:00.460 226239 DEBUG nova.objects.instance [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lazy-loading 'pci_devices' on Instance uuid 80b18469-4c81-4aa6-b657-efec55ba102b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:36:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:01.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:01.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:01 np0005603623 nova_compute[226235]: 2026-01-31 08:36:01.852 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:01 np0005603623 nova_compute[226235]: 2026-01-31 08:36:01.879 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  <uuid>80b18469-4c81-4aa6-b657-efec55ba102b</uuid>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  <name>instance-0000008f</name>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerShowV257Test-server-382383596</nova:name>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:35:59</nova:creationTime>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <nova:user uuid="59bd928a3bb24a89af6a53a8392bd344">tempest-ServerShowV257Test-135657040-project-member</nova:user>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <nova:project uuid="2c23fe6507444c32a4c0254003f2c6cb">tempest-ServerShowV257Test-135657040</nova:project>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <entry name="serial">80b18469-4c81-4aa6-b657-efec55ba102b</entry>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <entry name="uuid">80b18469-4c81-4aa6-b657-efec55ba102b</entry>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/80b18469-4c81-4aa6-b657-efec55ba102b_disk">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/80b18469-4c81-4aa6-b657-efec55ba102b_disk.config">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/console.log" append="off"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:36:01 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:36:01 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:36:01 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:36:01 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 03:36:01 np0005603623 nova_compute[226235]: 2026-01-31 08:36:01.890 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000008d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:36:01 np0005603623 nova_compute[226235]: 2026-01-31 08:36:01.891 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000008d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.045 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.046 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4172MB free_disk=20.84573745727539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.046 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.046 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.063 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.064 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.064 226239 INFO nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Using config drive
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.143 226239 DEBUG nova.storage.rbd_utils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.230 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Applying migration context for instance 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af as it has an incoming, in-progress migration c515f698-9e01-4a1c-97de-ee3d9443f03e. Migration status is confirming _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.231 226239 INFO nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Updating resource usage from migration c515f698-9e01-4a1c-97de-ee3d9443f03e
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.264 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.264 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 80b18469-4c81-4aa6-b657-efec55ba102b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.264 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.265 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.365 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.532 226239 INFO nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Creating config drive at /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/disk.config
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.537 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpx9jjq5bb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.658 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpx9jjq5bb" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.697 226239 DEBUG nova.storage.rbd_utils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.704 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/disk.config 80b18469-4c81-4aa6-b657-efec55ba102b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:36:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:36:02 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1201397376' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.768 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.776 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.834 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.926 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.926 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.880s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.927 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:36:02 np0005603623 nova_compute[226235]: 2026-01-31 08:36:02.928 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 31 03:36:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:03.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:03.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:04 np0005603623 nova_compute[226235]: 2026-01-31 08:36:04.925 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:36:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:05.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:36:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:05.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:06 np0005603623 nova_compute[226235]: 2026-01-31 08:36:06.596 226239 DEBUG oslo_concurrency.processutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/disk.config 80b18469-4c81-4aa6-b657-efec55ba102b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.892s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:36:06 np0005603623 nova_compute[226235]: 2026-01-31 08:36:06.596 226239 INFO nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Deleting local config drive /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/disk.config because it was imported into RBD.
Jan 31 03:36:06 np0005603623 systemd-machined[194379]: New machine qemu-66-instance-0000008f.
Jan 31 03:36:06 np0005603623 systemd[1]: Started Virtual Machine qemu-66-instance-0000008f.
Jan 31 03:36:06 np0005603623 podman[291610]: 2026-01-31 08:36:06.722471917 +0000 UTC m=+0.060292295 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:36:06 np0005603623 podman[291611]: 2026-01-31 08:36:06.7897418 +0000 UTC m=+0.127850028 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:36:06 np0005603623 nova_compute[226235]: 2026-01-31 08:36:06.854 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:07.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:07.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e334 e334: 3 total, 3 up, 3 in
Jan 31 03:36:08 np0005603623 ovn_controller[133449]: 2026-01-31T08:36:08Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:75:ff:26 10.100.0.7
Jan 31 03:36:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:08.243 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.244 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:08.245 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.555 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.556 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.616 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848568.6152985, 80b18469-4c81-4aa6-b657-efec55ba102b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.617 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.621 226239 DEBUG nova.compute.manager [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.621 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.627 226239 INFO nova.virt.libvirt.driver [-] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Instance spawned successfully.#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.629 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.744 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.745 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.745 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.745 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.746 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.746 226239 DEBUG nova.virt.libvirt.driver [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.751 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.753 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.864 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.864 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848568.616858, 80b18469-4c81-4aa6-b657-efec55ba102b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.865 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] VM Started (Lifecycle Event)#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.922 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.925 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:36:08 np0005603623 nova_compute[226235]: 2026-01-31 08:36:08.995 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:36:09 np0005603623 nova_compute[226235]: 2026-01-31 08:36:09.051 226239 INFO nova.compute.manager [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Took 14.35 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:36:09 np0005603623 nova_compute[226235]: 2026-01-31 08:36:09.051 226239 DEBUG nova.compute.manager [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:36:09 np0005603623 nova_compute[226235]: 2026-01-31 08:36:09.214 226239 INFO nova.compute.manager [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Took 19.22 seconds to build instance.#033[00m
Jan 31 03:36:09 np0005603623 nova_compute[226235]: 2026-01-31 08:36:09.293 226239 DEBUG oslo_concurrency.lockutils [None req-d0cdcf21-f445-4ab8-81cb-4452b1dfe02e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "80b18469-4c81-4aa6-b657-efec55ba102b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:36:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:09.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:09.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e334 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:09 np0005603623 nova_compute[226235]: 2026-01-31 08:36:09.927 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:11.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:11.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:11 np0005603623 nova_compute[226235]: 2026-01-31 08:36:11.856 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:13 np0005603623 nova_compute[226235]: 2026-01-31 08:36:13.040 226239 INFO nova.compute.manager [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Rebuilding instance#033[00m
Jan 31 03:36:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:13.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:13.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:13 np0005603623 nova_compute[226235]: 2026-01-31 08:36:13.456 226239 DEBUG nova.objects.instance [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lazy-loading 'trusted_certs' on Instance uuid 80b18469-4c81-4aa6-b657-efec55ba102b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:36:13 np0005603623 nova_compute[226235]: 2026-01-31 08:36:13.505 226239 DEBUG nova.compute.manager [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:36:13 np0005603623 nova_compute[226235]: 2026-01-31 08:36:13.683 226239 DEBUG nova.objects.instance [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lazy-loading 'pci_requests' on Instance uuid 80b18469-4c81-4aa6-b657-efec55ba102b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:36:13 np0005603623 nova_compute[226235]: 2026-01-31 08:36:13.868 226239 DEBUG nova.objects.instance [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lazy-loading 'pci_devices' on Instance uuid 80b18469-4c81-4aa6-b657-efec55ba102b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:36:13 np0005603623 nova_compute[226235]: 2026-01-31 08:36:13.933 226239 DEBUG nova.objects.instance [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lazy-loading 'resources' on Instance uuid 80b18469-4c81-4aa6-b657-efec55ba102b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:36:14 np0005603623 nova_compute[226235]: 2026-01-31 08:36:14.125 226239 DEBUG nova.objects.instance [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lazy-loading 'migration_context' on Instance uuid 80b18469-4c81-4aa6-b657-efec55ba102b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:36:14 np0005603623 nova_compute[226235]: 2026-01-31 08:36:14.274 226239 DEBUG nova.objects.instance [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 03:36:14 np0005603623 nova_compute[226235]: 2026-01-31 08:36:14.279 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:36:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e334 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:14 np0005603623 nova_compute[226235]: 2026-01-31 08:36:14.930 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:15.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:15.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e335 e335: 3 total, 3 up, 3 in
Jan 31 03:36:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:16.247 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:36:16 np0005603623 nova_compute[226235]: 2026-01-31 08:36:16.860 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:17 np0005603623 nova_compute[226235]: 2026-01-31 08:36:17.125 226239 INFO nova.compute.manager [None req-01babfb1-a383-4ff5-ae31-aa2768d222f5 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Get console output#033[00m
Jan 31 03:36:17 np0005603623 nova_compute[226235]: 2026-01-31 08:36:17.227 270602 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 03:36:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:17.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:17.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:19.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:19.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:19 np0005603623 nova_compute[226235]: 2026-01-31 08:36:19.780 226239 DEBUG nova.compute.manager [req-feedea87-c59d-46ed-9a02-35b7b92dc1fe req-aa00b4f2-94e0-4cf5-9024-85321183a515 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received event network-changed-58956ac4-88cf-49c2-988a-8a3746f1e622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:36:19 np0005603623 nova_compute[226235]: 2026-01-31 08:36:19.782 226239 DEBUG nova.compute.manager [req-feedea87-c59d-46ed-9a02-35b7b92dc1fe req-aa00b4f2-94e0-4cf5-9024-85321183a515 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Refreshing instance network info cache due to event network-changed-58956ac4-88cf-49c2-988a-8a3746f1e622. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:36:19 np0005603623 nova_compute[226235]: 2026-01-31 08:36:19.783 226239 DEBUG oslo_concurrency.lockutils [req-feedea87-c59d-46ed-9a02-35b7b92dc1fe req-aa00b4f2-94e0-4cf5-9024-85321183a515 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:36:19 np0005603623 nova_compute[226235]: 2026-01-31 08:36:19.783 226239 DEBUG oslo_concurrency.lockutils [req-feedea87-c59d-46ed-9a02-35b7b92dc1fe req-aa00b4f2-94e0-4cf5-9024-85321183a515 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:36:19 np0005603623 nova_compute[226235]: 2026-01-31 08:36:19.783 226239 DEBUG nova.network.neutron [req-feedea87-c59d-46ed-9a02-35b7b92dc1fe req-aa00b4f2-94e0-4cf5-9024-85321183a515 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Refreshing network info cache for port 58956ac4-88cf-49c2-988a-8a3746f1e622 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:36:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:19 np0005603623 nova_compute[226235]: 2026-01-31 08:36:19.931 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.722 226239 DEBUG oslo_concurrency.lockutils [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.723 226239 DEBUG oslo_concurrency.lockutils [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.723 226239 DEBUG oslo_concurrency.lockutils [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.723 226239 DEBUG oslo_concurrency.lockutils [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.724 226239 DEBUG oslo_concurrency.lockutils [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.725 226239 INFO nova.compute.manager [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Terminating instance#033[00m
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.726 226239 DEBUG nova.compute.manager [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:36:20 np0005603623 kernel: tap58956ac4-88 (unregistering): left promiscuous mode
Jan 31 03:36:20 np0005603623 NetworkManager[48970]: <info>  [1769848580.8010] device (tap58956ac4-88): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.807 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:36:20Z|00594|binding|INFO|Releasing lport 58956ac4-88cf-49c2-988a-8a3746f1e622 from this chassis (sb_readonly=0)
Jan 31 03:36:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:36:20Z|00595|binding|INFO|Setting lport 58956ac4-88cf-49c2-988a-8a3746f1e622 down in Southbound
Jan 31 03:36:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:36:20Z|00596|binding|INFO|Removing iface tap58956ac4-88 ovn-installed in OVS
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.809 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.819 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:20.843 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:ff:26 10.100.0.7'], port_security=['fa:16:3e:75:ff:26 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '60462c66-f02d-4ca4-aa2a-b6ea91c8a6af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e45621cc-e984-4d02-a4f7-adf5b5457b33', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1bca5a82-b0f2-4237-92f5-d7d2dbf4afe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e036da0b-b229-4d68-8cb9-77eeebb375fb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=58956ac4-88cf-49c2-988a-8a3746f1e622) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:36:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:20.844 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 58956ac4-88cf-49c2-988a-8a3746f1e622 in datapath e45621cc-e984-4d02-a4f7-adf5b5457b33 unbound from our chassis#033[00m
Jan 31 03:36:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:20.846 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e45621cc-e984-4d02-a4f7-adf5b5457b33, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:36:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:20.847 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3f62851d-6b7e-4496-9826-f4e43316e5db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:36:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:20.848 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33 namespace which is not needed anymore#033[00m
Jan 31 03:36:20 np0005603623 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000008d.scope: Deactivated successfully.
Jan 31 03:36:20 np0005603623 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000008d.scope: Consumed 12.823s CPU time.
Jan 31 03:36:20 np0005603623 systemd-machined[194379]: Machine qemu-65-instance-0000008d terminated.
Jan 31 03:36:20 np0005603623 kernel: tap58956ac4-88: entered promiscuous mode
Jan 31 03:36:20 np0005603623 kernel: tap58956ac4-88 (unregistering): left promiscuous mode
Jan 31 03:36:20 np0005603623 NetworkManager[48970]: <info>  [1769848580.9474] manager: (tap58956ac4-88): new Tun device (/org/freedesktop/NetworkManager/Devices/280)
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.952 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.960 226239 INFO nova.virt.libvirt.driver [-] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Instance destroyed successfully.#033[00m
Jan 31 03:36:20 np0005603623 nova_compute[226235]: 2026-01-31 08:36:20.960 226239 DEBUG nova.objects.instance [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'resources' on Instance uuid 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:36:20 np0005603623 neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33[291251]: [NOTICE]   (291255) : haproxy version is 2.8.14-c23fe91
Jan 31 03:36:20 np0005603623 neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33[291251]: [NOTICE]   (291255) : path to executable is /usr/sbin/haproxy
Jan 31 03:36:20 np0005603623 neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33[291251]: [WARNING]  (291255) : Exiting Master process...
Jan 31 03:36:20 np0005603623 neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33[291251]: [ALERT]    (291255) : Current worker (291257) exited with code 143 (Terminated)
Jan 31 03:36:20 np0005603623 neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33[291251]: [WARNING]  (291255) : All workers exited. Exiting... (0)
Jan 31 03:36:20 np0005603623 systemd[1]: libpod-1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8.scope: Deactivated successfully.
Jan 31 03:36:20 np0005603623 podman[291790]: 2026-01-31 08:36:20.995807638 +0000 UTC m=+0.062073251 container died 1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:36:21 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8-userdata-shm.mount: Deactivated successfully.
Jan 31 03:36:21 np0005603623 systemd[1]: var-lib-containers-storage-overlay-cefa4ab7f7559fbd57a17ddf1fc213012691c5275abfad0067c1df7e00426a26-merged.mount: Deactivated successfully.
Jan 31 03:36:21 np0005603623 podman[291790]: 2026-01-31 08:36:21.064156144 +0000 UTC m=+0.130421757 container cleanup 1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:36:21 np0005603623 systemd[1]: libpod-conmon-1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8.scope: Deactivated successfully.
Jan 31 03:36:21 np0005603623 podman[291828]: 2026-01-31 08:36:21.124014275 +0000 UTC m=+0.043838111 container remove 1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:36:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:21.128 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d18825-ddb6-4c65-ae22-6ebea17a3604]: (4, ('Sat Jan 31 08:36:20 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33 (1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8)\n1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8\nSat Jan 31 08:36:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33 (1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8)\n1a2a6c91c870112ffd805cd0bcd681d28df8f6527e5020f2ff7f303e5c59e0a8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:36:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:21.130 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a6937816-7d2f-476b-9565-4290caf041e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:36:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:21.131 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape45621cc-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.132 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:21 np0005603623 kernel: tape45621cc-e0: left promiscuous mode
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.142 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:21.144 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[46076833-dbd8-49ac-af28-e699aec4bf6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.150 226239 DEBUG nova.virt.libvirt.vif [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:34:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1672201743',display_name='tempest-TestNetworkAdvancedServerOps-server-1672201743',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1672201743',id=141,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUoionOf1jsbgYnjxtSF8S5kbM7WrnC+AvzdWQ5Iv9NrHSu1YTmh7OvNKWVCt94tfduQMP4jFzkhpdFTOQdH6c769sX4vCZIDbSCuBl9lgkWTK5Ks3sTtkCsO2rA5PBWA==',key_name='tempest-TestNetworkAdvancedServerOps-2012991436',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:35:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-4je44sr0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:36:09Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=60462c66-f02d-4ca4-aa2a-b6ea91c8a6af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58956ac4-88cf-49c2-988a-8a3746f1e622", "address": "fa:16:3e:75:ff:26", "network": {"id": "e45621cc-e984-4d02-a4f7-adf5b5457b33", "bridge": "br-int", "label": "tempest-network-smoke--147789550", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58956ac4-88", "ovs_interfaceid": "58956ac4-88cf-49c2-988a-8a3746f1e622", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.150 226239 DEBUG nova.network.os_vif_util [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "58956ac4-88cf-49c2-988a-8a3746f1e622", "address": "fa:16:3e:75:ff:26", "network": {"id": "e45621cc-e984-4d02-a4f7-adf5b5457b33", "bridge": "br-int", "label": "tempest-network-smoke--147789550", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58956ac4-88", "ovs_interfaceid": "58956ac4-88cf-49c2-988a-8a3746f1e622", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.151 226239 DEBUG nova.network.os_vif_util [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:75:ff:26,bridge_name='br-int',has_traffic_filtering=True,id=58956ac4-88cf-49c2-988a-8a3746f1e622,network=Network(e45621cc-e984-4d02-a4f7-adf5b5457b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58956ac4-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.151 226239 DEBUG os_vif [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:75:ff:26,bridge_name='br-int',has_traffic_filtering=True,id=58956ac4-88cf-49c2-988a-8a3746f1e622,network=Network(e45621cc-e984-4d02-a4f7-adf5b5457b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58956ac4-88') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.153 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.153 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58956ac4-88, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.155 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.158 226239 INFO os_vif [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:75:ff:26,bridge_name='br-int',has_traffic_filtering=True,id=58956ac4-88cf-49c2-988a-8a3746f1e622,network=Network(e45621cc-e984-4d02-a4f7-adf5b5457b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58956ac4-88')#033[00m
Jan 31 03:36:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:21.164 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[84599c97-33fa-4243-84b1-09d131d6fad8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:36:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:21.166 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0f62ad50-5bd7-4864-8ce0-3fcba55ffa2b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:36:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:21.181 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a7fdd5a0-263a-461a-9806-70097d57f642]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 769273, 'reachable_time': 31675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291862, 'error': None, 'target': 'ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:36:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:21.183 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e45621cc-e984-4d02-a4f7-adf5b5457b33 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:36:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:21.183 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[72d15527-bf2a-42cb-8f9d-80a2d5bdbf43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:36:21 np0005603623 systemd[1]: run-netns-ovnmeta\x2de45621cc\x2de984\x2d4d02\x2da4f7\x2dadf5b5457b33.mount: Deactivated successfully.
Jan 31 03:36:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:36:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:21.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:36:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:21.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.714 226239 INFO nova.virt.libvirt.driver [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Deleting instance files /var/lib/nova/instances/60462c66-f02d-4ca4-aa2a-b6ea91c8a6af_del#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.715 226239 INFO nova.virt.libvirt.driver [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Deletion of /var/lib/nova/instances/60462c66-f02d-4ca4-aa2a-b6ea91c8a6af_del complete#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.972 226239 INFO nova.compute.manager [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Took 1.25 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.973 226239 DEBUG oslo.service.loopingcall [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.973 226239 DEBUG nova.compute.manager [-] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:36:21 np0005603623 nova_compute[226235]: 2026-01-31 08:36:21.973 226239 DEBUG nova.network.neutron [-] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:36:22 np0005603623 nova_compute[226235]: 2026-01-31 08:36:22.043 226239 DEBUG nova.compute.manager [req-f532482e-048b-4dc3-a377-6804a71e60bb req-833d2d3b-11b7-44d6-b712-f212ffb393fa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received event network-vif-unplugged-58956ac4-88cf-49c2-988a-8a3746f1e622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:36:22 np0005603623 nova_compute[226235]: 2026-01-31 08:36:22.044 226239 DEBUG oslo_concurrency.lockutils [req-f532482e-048b-4dc3-a377-6804a71e60bb req-833d2d3b-11b7-44d6-b712-f212ffb393fa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:36:22 np0005603623 nova_compute[226235]: 2026-01-31 08:36:22.044 226239 DEBUG oslo_concurrency.lockutils [req-f532482e-048b-4dc3-a377-6804a71e60bb req-833d2d3b-11b7-44d6-b712-f212ffb393fa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:36:22 np0005603623 nova_compute[226235]: 2026-01-31 08:36:22.044 226239 DEBUG oslo_concurrency.lockutils [req-f532482e-048b-4dc3-a377-6804a71e60bb req-833d2d3b-11b7-44d6-b712-f212ffb393fa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:36:22 np0005603623 nova_compute[226235]: 2026-01-31 08:36:22.044 226239 DEBUG nova.compute.manager [req-f532482e-048b-4dc3-a377-6804a71e60bb req-833d2d3b-11b7-44d6-b712-f212ffb393fa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] No waiting events found dispatching network-vif-unplugged-58956ac4-88cf-49c2-988a-8a3746f1e622 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:36:22 np0005603623 nova_compute[226235]: 2026-01-31 08:36:22.044 226239 DEBUG nova.compute.manager [req-f532482e-048b-4dc3-a377-6804a71e60bb req-833d2d3b-11b7-44d6-b712-f212ffb393fa fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received event network-vif-unplugged-58956ac4-88cf-49c2-988a-8a3746f1e622 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:36:22 np0005603623 nova_compute[226235]: 2026-01-31 08:36:22.250 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:22 np0005603623 nova_compute[226235]: 2026-01-31 08:36:22.580 226239 DEBUG nova.network.neutron [req-feedea87-c59d-46ed-9a02-35b7b92dc1fe req-aa00b4f2-94e0-4cf5-9024-85321183a515 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Updated VIF entry in instance network info cache for port 58956ac4-88cf-49c2-988a-8a3746f1e622. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:36:22 np0005603623 nova_compute[226235]: 2026-01-31 08:36:22.580 226239 DEBUG nova.network.neutron [req-feedea87-c59d-46ed-9a02-35b7b92dc1fe req-aa00b4f2-94e0-4cf5-9024-85321183a515 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Updating instance_info_cache with network_info: [{"id": "58956ac4-88cf-49c2-988a-8a3746f1e622", "address": "fa:16:3e:75:ff:26", "network": {"id": "e45621cc-e984-4d02-a4f7-adf5b5457b33", "bridge": "br-int", "label": "tempest-network-smoke--147789550", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58956ac4-88", "ovs_interfaceid": "58956ac4-88cf-49c2-988a-8a3746f1e622", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:36:23 np0005603623 nova_compute[226235]: 2026-01-31 08:36:23.219 226239 DEBUG oslo_concurrency.lockutils [req-feedea87-c59d-46ed-9a02-35b7b92dc1fe req-aa00b4f2-94e0-4cf5-9024-85321183a515 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:36:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:23.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:23.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:23 np0005603623 nova_compute[226235]: 2026-01-31 08:36:23.458 226239 DEBUG nova.network.neutron [-] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:36:23 np0005603623 nova_compute[226235]: 2026-01-31 08:36:23.519 226239 INFO nova.compute.manager [-] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Took 1.55 seconds to deallocate network for instance.#033[00m
Jan 31 03:36:23 np0005603623 nova_compute[226235]: 2026-01-31 08:36:23.589 226239 DEBUG nova.compute.manager [req-873ea2c8-c989-4842-9ffe-8dcf4ec97bdb req-ed4fa462-0ee8-412b-9651-004e185c1266 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received event network-vif-deleted-58956ac4-88cf-49c2-988a-8a3746f1e622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:36:23 np0005603623 nova_compute[226235]: 2026-01-31 08:36:23.656 226239 DEBUG oslo_concurrency.lockutils [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:36:23 np0005603623 nova_compute[226235]: 2026-01-31 08:36:23.656 226239 DEBUG oslo_concurrency.lockutils [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:36:23 np0005603623 nova_compute[226235]: 2026-01-31 08:36:23.738 226239 DEBUG oslo_concurrency.processutils [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:36:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:36:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/46380977' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.148 226239 DEBUG oslo_concurrency.processutils [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.157 226239 DEBUG nova.compute.provider_tree [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.318 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.441 226239 DEBUG nova.compute.manager [req-b06db59b-11ff-47b4-85fb-d1e3d891d88c req-4c8170ea-93c1-461d-9c0d-857e02547e04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received event network-vif-plugged-58956ac4-88cf-49c2-988a-8a3746f1e622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.441 226239 DEBUG oslo_concurrency.lockutils [req-b06db59b-11ff-47b4-85fb-d1e3d891d88c req-4c8170ea-93c1-461d-9c0d-857e02547e04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.442 226239 DEBUG oslo_concurrency.lockutils [req-b06db59b-11ff-47b4-85fb-d1e3d891d88c req-4c8170ea-93c1-461d-9c0d-857e02547e04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.442 226239 DEBUG oslo_concurrency.lockutils [req-b06db59b-11ff-47b4-85fb-d1e3d891d88c req-4c8170ea-93c1-461d-9c0d-857e02547e04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.442 226239 DEBUG nova.compute.manager [req-b06db59b-11ff-47b4-85fb-d1e3d891d88c req-4c8170ea-93c1-461d-9c0d-857e02547e04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] No waiting events found dispatching network-vif-plugged-58956ac4-88cf-49c2-988a-8a3746f1e622 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.442 226239 WARNING nova.compute.manager [req-b06db59b-11ff-47b4-85fb-d1e3d891d88c req-4c8170ea-93c1-461d-9c0d-857e02547e04 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Received unexpected event network-vif-plugged-58956ac4-88cf-49c2-988a-8a3746f1e622 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.480 226239 DEBUG nova.scheduler.client.report [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.851 226239 DEBUG oslo_concurrency.lockutils [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:36:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.933 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:24 np0005603623 nova_compute[226235]: 2026-01-31 08:36:24.971 226239 INFO nova.scheduler.client.report [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Deleted allocations for instance 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af#033[00m
Jan 31 03:36:25 np0005603623 nova_compute[226235]: 2026-01-31 08:36:25.187 226239 DEBUG oslo_concurrency.lockutils [None req-e868e7f8-3d0b-4d50-bac8-1b6a22342689 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "60462c66-f02d-4ca4-aa2a-b6ea91c8a6af" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.464s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:36:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:25.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:25.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:26 np0005603623 nova_compute[226235]: 2026-01-31 08:36:26.154 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:27 np0005603623 nova_compute[226235]: 2026-01-31 08:36:27.252 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:27 np0005603623 nova_compute[226235]: 2026-01-31 08:36:27.332 226239 INFO nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Instance shutdown successfully after 13 seconds.#033[00m
Jan 31 03:36:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:27.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:27 np0005603623 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008f.scope: Deactivated successfully.
Jan 31 03:36:27 np0005603623 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008f.scope: Consumed 13.852s CPU time.
Jan 31 03:36:27 np0005603623 systemd-machined[194379]: Machine qemu-66-instance-0000008f terminated.
Jan 31 03:36:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:27.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:27 np0005603623 nova_compute[226235]: 2026-01-31 08:36:27.547 226239 INFO nova.virt.libvirt.driver [-] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Instance destroyed successfully.#033[00m
Jan 31 03:36:27 np0005603623 nova_compute[226235]: 2026-01-31 08:36:27.551 226239 INFO nova.virt.libvirt.driver [-] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Instance destroyed successfully.#033[00m
Jan 31 03:36:29 np0005603623 nova_compute[226235]: 2026-01-31 08:36:29.135 226239 INFO nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Deleting instance files /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b_del#033[00m
Jan 31 03:36:29 np0005603623 nova_compute[226235]: 2026-01-31 08:36:29.136 226239 INFO nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Deletion of /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b_del complete#033[00m
Jan 31 03:36:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:36:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:29.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:36:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:36:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:29.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:36:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:29 np0005603623 nova_compute[226235]: 2026-01-31 08:36:29.934 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:30.127 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:36:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:30.128 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:36:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:36:30.128 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:36:31 np0005603623 nova_compute[226235]: 2026-01-31 08:36:31.155 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:31.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:31.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:36:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:36:32 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:36:32 np0005603623 nova_compute[226235]: 2026-01-31 08:36:32.484 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:36:32 np0005603623 nova_compute[226235]: 2026-01-31 08:36:32.485 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:36:32 np0005603623 nova_compute[226235]: 2026-01-31 08:36:32.666 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:36:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:33.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:33.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:33 np0005603623 nova_compute[226235]: 2026-01-31 08:36:33.913 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:36:33 np0005603623 nova_compute[226235]: 2026-01-31 08:36:33.914 226239 INFO nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Creating image(s)#033[00m
Jan 31 03:36:33 np0005603623 nova_compute[226235]: 2026-01-31 08:36:33.938 226239 DEBUG nova.storage.rbd_utils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:36:33 np0005603623 nova_compute[226235]: 2026-01-31 08:36:33.964 226239 DEBUG nova.storage.rbd_utils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:36:33 np0005603623 nova_compute[226235]: 2026-01-31 08:36:33.998 226239 DEBUG nova.storage.rbd_utils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:36:34 np0005603623 nova_compute[226235]: 2026-01-31 08:36:34.002 226239 DEBUG oslo_concurrency.processutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:36:34 np0005603623 nova_compute[226235]: 2026-01-31 08:36:34.075 226239 DEBUG oslo_concurrency.processutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:36:34 np0005603623 nova_compute[226235]: 2026-01-31 08:36:34.076 226239 DEBUG oslo_concurrency.lockutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Acquiring lock "365f9823d2619ef09948bdeed685488da63755b5" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:36:34 np0005603623 nova_compute[226235]: 2026-01-31 08:36:34.077 226239 DEBUG oslo_concurrency.lockutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:36:34 np0005603623 nova_compute[226235]: 2026-01-31 08:36:34.077 226239 DEBUG oslo_concurrency.lockutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:36:34 np0005603623 nova_compute[226235]: 2026-01-31 08:36:34.104 226239 DEBUG nova.storage.rbd_utils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:36:34 np0005603623 nova_compute[226235]: 2026-01-31 08:36:34.108 226239 DEBUG oslo_concurrency.processutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 80b18469-4c81-4aa6-b657-efec55ba102b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:36:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:34 np0005603623 nova_compute[226235]: 2026-01-31 08:36:34.936 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e336 e336: 3 total, 3 up, 3 in
Jan 31 03:36:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:35.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:35.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:35 np0005603623 nova_compute[226235]: 2026-01-31 08:36:35.959 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848580.957851, 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:36:35 np0005603623 nova_compute[226235]: 2026-01-31 08:36:35.959 226239 INFO nova.compute.manager [-] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.156 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.278 226239 DEBUG oslo_concurrency.processutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 80b18469-4c81-4aa6-b657-efec55ba102b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.170s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.351 226239 DEBUG nova.storage.rbd_utils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] resizing rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.517 226239 DEBUG nova.compute.manager [None req-102669ff-1db5-4184-a12b-e6675afd015f - - - - - -] [instance: 60462c66-f02d-4ca4-aa2a-b6ea91c8a6af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.668 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.669 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Ensure instance console log exists: /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.670 226239 DEBUG oslo_concurrency.lockutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.670 226239 DEBUG oslo_concurrency.lockutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.671 226239 DEBUG oslo_concurrency.lockutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.673 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.679 226239 WARNING nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.686 226239 DEBUG nova.virt.libvirt.host [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.687 226239 DEBUG nova.virt.libvirt.host [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.691 226239 DEBUG nova.virt.libvirt.host [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.692 226239 DEBUG nova.virt.libvirt.host [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.693 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.693 226239 DEBUG nova.virt.hardware [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.694 226239 DEBUG nova.virt.hardware [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.694 226239 DEBUG nova.virt.hardware [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.695 226239 DEBUG nova.virt.hardware [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.695 226239 DEBUG nova.virt.hardware [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.695 226239 DEBUG nova.virt.hardware [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.695 226239 DEBUG nova.virt.hardware [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.696 226239 DEBUG nova.virt.hardware [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.696 226239 DEBUG nova.virt.hardware [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.696 226239 DEBUG nova.virt.hardware [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.697 226239 DEBUG nova.virt.hardware [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:36:36 np0005603623 nova_compute[226235]: 2026-01-31 08:36:36.698 226239 DEBUG nova.objects.instance [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lazy-loading 'vcpu_model' on Instance uuid 80b18469-4c81-4aa6-b657-efec55ba102b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:36:36 np0005603623 podman[292267]: 2026-01-31 08:36:36.96965941 +0000 UTC m=+0.054533876 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Jan 31 03:36:36 np0005603623 podman[292268]: 2026-01-31 08:36:36.995417874 +0000 UTC m=+0.078546726 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 31 03:36:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:37.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:37 np0005603623 nova_compute[226235]: 2026-01-31 08:36:37.427 226239 DEBUG oslo_concurrency.processutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:36:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:37.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:37 np0005603623 nova_compute[226235]: 2026-01-31 08:36:37.809 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:36:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3722433258' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:36:37 np0005603623 nova_compute[226235]: 2026-01-31 08:36:37.862 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:37 np0005603623 nova_compute[226235]: 2026-01-31 08:36:37.881 226239 DEBUG oslo_concurrency.processutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:36:37 np0005603623 nova_compute[226235]: 2026-01-31 08:36:37.911 226239 DEBUG nova.storage.rbd_utils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:36:37 np0005603623 nova_compute[226235]: 2026-01-31 08:36:37.915 226239 DEBUG oslo_concurrency.processutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:36:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e337 e337: 3 total, 3 up, 3 in
Jan 31 03:36:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:36:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1461701787' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:36:38 np0005603623 nova_compute[226235]: 2026-01-31 08:36:38.349 226239 DEBUG oslo_concurrency.processutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:36:38 np0005603623 nova_compute[226235]: 2026-01-31 08:36:38.352 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  <uuid>80b18469-4c81-4aa6-b657-efec55ba102b</uuid>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  <name>instance-0000008f</name>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerShowV257Test-server-382383596</nova:name>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:36:36</nova:creationTime>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <nova:user uuid="59bd928a3bb24a89af6a53a8392bd344">tempest-ServerShowV257Test-135657040-project-member</nova:user>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <nova:project uuid="2c23fe6507444c32a4c0254003f2c6cb">tempest-ServerShowV257Test-135657040</nova:project>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="0864ca59-9877-4e6d-adfc-f0a3204ed8f8"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <entry name="serial">80b18469-4c81-4aa6-b657-efec55ba102b</entry>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <entry name="uuid">80b18469-4c81-4aa6-b657-efec55ba102b</entry>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/80b18469-4c81-4aa6-b657-efec55ba102b_disk">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/80b18469-4c81-4aa6-b657-efec55ba102b_disk.config">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/console.log" append="off"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:36:38 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:36:38 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:36:38 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:36:38 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 03:36:38 np0005603623 nova_compute[226235]: 2026-01-31 08:36:38.613 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:36:38 np0005603623 nova_compute[226235]: 2026-01-31 08:36:38.613 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:36:38 np0005603623 nova_compute[226235]: 2026-01-31 08:36:38.615 226239 INFO nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Using config drive
Jan 31 03:36:38 np0005603623 nova_compute[226235]: 2026-01-31 08:36:38.648 226239 DEBUG nova.storage.rbd_utils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:36:38 np0005603623 nova_compute[226235]: 2026-01-31 08:36:38.881 226239 DEBUG nova.objects.instance [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lazy-loading 'ec2_ids' on Instance uuid 80b18469-4c81-4aa6-b657-efec55ba102b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:36:39 np0005603623 nova_compute[226235]: 2026-01-31 08:36:39.083 226239 DEBUG nova.objects.instance [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lazy-loading 'keypairs' on Instance uuid 80b18469-4c81-4aa6-b657-efec55ba102b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:36:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e338 e338: 3 total, 3 up, 3 in
Jan 31 03:36:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:36:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:36:39 np0005603623 nova_compute[226235]: 2026-01-31 08:36:39.391 226239 INFO nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Creating config drive at /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/disk.config
Jan 31 03:36:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:39.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:39 np0005603623 nova_compute[226235]: 2026-01-31 08:36:39.395 226239 DEBUG oslo_concurrency.processutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpnerqpuq5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:36:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:39.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:39 np0005603623 nova_compute[226235]: 2026-01-31 08:36:39.531 226239 DEBUG oslo_concurrency.processutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpnerqpuq5" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:36:39 np0005603623 nova_compute[226235]: 2026-01-31 08:36:39.560 226239 DEBUG nova.storage.rbd_utils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] rbd image 80b18469-4c81-4aa6-b657-efec55ba102b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:36:39 np0005603623 nova_compute[226235]: 2026-01-31 08:36:39.565 226239 DEBUG oslo_concurrency.processutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/disk.config 80b18469-4c81-4aa6-b657-efec55ba102b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:36:39 np0005603623 nova_compute[226235]: 2026-01-31 08:36:39.848 226239 DEBUG oslo_concurrency.processutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/disk.config 80b18469-4c81-4aa6-b657-efec55ba102b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.283s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:36:39 np0005603623 nova_compute[226235]: 2026-01-31 08:36:39.849 226239 INFO nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Deleting local config drive /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b/disk.config because it was imported into RBD.
Jan 31 03:36:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e338 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:39 np0005603623 systemd-machined[194379]: New machine qemu-67-instance-0000008f.
Jan 31 03:36:39 np0005603623 systemd[1]: Started Virtual Machine qemu-67-instance-0000008f.
Jan 31 03:36:39 np0005603623 nova_compute[226235]: 2026-01-31 08:36:39.937 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.672 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Removed pending event for 80b18469-4c81-4aa6-b657-efec55ba102b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.673 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848600.6722088, 80b18469-4c81-4aa6-b657-efec55ba102b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.673 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] VM Resumed (Lifecycle Event)
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.675 226239 DEBUG nova.compute.manager [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.676 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.680 226239 INFO nova.virt.libvirt.driver [-] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Instance spawned successfully.
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.681 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.766 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.770 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.797 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.797 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.798 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.798 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.798 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.799 226239 DEBUG nova.virt.libvirt.driver [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.902 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.902 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848600.6723564, 80b18469-4c81-4aa6-b657-efec55ba102b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:36:40 np0005603623 nova_compute[226235]: 2026-01-31 08:36:40.903 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] VM Started (Lifecycle Event)
Jan 31 03:36:41 np0005603623 nova_compute[226235]: 2026-01-31 08:36:41.111 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:36:41 np0005603623 nova_compute[226235]: 2026-01-31 08:36:41.115 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:36:41 np0005603623 nova_compute[226235]: 2026-01-31 08:36:41.158 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:41 np0005603623 nova_compute[226235]: 2026-01-31 08:36:41.239 226239 DEBUG nova.compute.manager [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:36:41 np0005603623 nova_compute[226235]: 2026-01-31 08:36:41.268 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 31 03:36:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:41.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:41.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:41 np0005603623 nova_compute[226235]: 2026-01-31 08:36:41.505 226239 DEBUG oslo_concurrency.lockutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:36:41 np0005603623 nova_compute[226235]: 2026-01-31 08:36:41.506 226239 DEBUG oslo_concurrency.lockutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:36:41 np0005603623 nova_compute[226235]: 2026-01-31 08:36:41.506 226239 DEBUG nova.objects.instance [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 31 03:36:42 np0005603623 nova_compute[226235]: 2026-01-31 08:36:42.098 226239 DEBUG oslo_concurrency.lockutils [None req-a887a450-3316-4222-86c3-d99ea80ef01c 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:36:43 np0005603623 nova_compute[226235]: 2026-01-31 08:36:43.141 226239 DEBUG oslo_concurrency.lockutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Acquiring lock "80b18469-4c81-4aa6-b657-efec55ba102b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:36:43 np0005603623 nova_compute[226235]: 2026-01-31 08:36:43.142 226239 DEBUG oslo_concurrency.lockutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "80b18469-4c81-4aa6-b657-efec55ba102b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:36:43 np0005603623 nova_compute[226235]: 2026-01-31 08:36:43.142 226239 DEBUG oslo_concurrency.lockutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Acquiring lock "80b18469-4c81-4aa6-b657-efec55ba102b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:36:43 np0005603623 nova_compute[226235]: 2026-01-31 08:36:43.142 226239 DEBUG oslo_concurrency.lockutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "80b18469-4c81-4aa6-b657-efec55ba102b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:36:43 np0005603623 nova_compute[226235]: 2026-01-31 08:36:43.143 226239 DEBUG oslo_concurrency.lockutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "80b18469-4c81-4aa6-b657-efec55ba102b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:36:43 np0005603623 nova_compute[226235]: 2026-01-31 08:36:43.144 226239 INFO nova.compute.manager [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Terminating instance
Jan 31 03:36:43 np0005603623 nova_compute[226235]: 2026-01-31 08:36:43.144 226239 DEBUG oslo_concurrency.lockutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Acquiring lock "refresh_cache-80b18469-4c81-4aa6-b657-efec55ba102b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:36:43 np0005603623 nova_compute[226235]: 2026-01-31 08:36:43.145 226239 DEBUG oslo_concurrency.lockutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Acquired lock "refresh_cache-80b18469-4c81-4aa6-b657-efec55ba102b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:36:43 np0005603623 nova_compute[226235]: 2026-01-31 08:36:43.145 226239 DEBUG nova.network.neutron [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:36:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:43.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:43.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e338 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:44 np0005603623 nova_compute[226235]: 2026-01-31 08:36:44.940 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:36:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:45.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:36:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:36:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:45.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:36:46 np0005603623 nova_compute[226235]: 2026-01-31 08:36:46.160 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 e339: 3 total, 3 up, 3 in
Jan 31 03:36:47 np0005603623 nova_compute[226235]: 2026-01-31 08:36:47.336 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:36:47 np0005603623 nova_compute[226235]: 2026-01-31 08:36:47.336 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 03:36:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:47.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:47.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:47 np0005603623 nova_compute[226235]: 2026-01-31 08:36:47.799 226239 DEBUG nova.network.neutron [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:36:48 np0005603623 nova_compute[226235]: 2026-01-31 08:36:48.384 226239 DEBUG nova.network.neutron [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:36:48 np0005603623 nova_compute[226235]: 2026-01-31 08:36:48.494 226239 DEBUG oslo_concurrency.lockutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Releasing lock "refresh_cache-80b18469-4c81-4aa6-b657-efec55ba102b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:36:48 np0005603623 nova_compute[226235]: 2026-01-31 08:36:48.495 226239 DEBUG nova.compute.manager [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 03:36:48 np0005603623 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000008f.scope: Deactivated successfully.
Jan 31 03:36:48 np0005603623 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000008f.scope: Consumed 8.756s CPU time.
Jan 31 03:36:48 np0005603623 systemd-machined[194379]: Machine qemu-67-instance-0000008f terminated.
Jan 31 03:36:48 np0005603623 nova_compute[226235]: 2026-01-31 08:36:48.910 226239 INFO nova.virt.libvirt.driver [-] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Instance destroyed successfully.
Jan 31 03:36:48 np0005603623 nova_compute[226235]: 2026-01-31 08:36:48.910 226239 DEBUG nova.objects.instance [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lazy-loading 'resources' on Instance uuid 80b18469-4c81-4aa6-b657-efec55ba102b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:36:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:49.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:49.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:49 np0005603623 nova_compute[226235]: 2026-01-31 08:36:49.941 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:51 np0005603623 nova_compute[226235]: 2026-01-31 08:36:51.163 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:51.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:51.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:36:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:53.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:36:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:53.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:54 np0005603623 nova_compute[226235]: 2026-01-31 08:36:54.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:36:54 np0005603623 nova_compute[226235]: 2026-01-31 08:36:54.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 03:36:54 np0005603623 nova_compute[226235]: 2026-01-31 08:36:54.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 03:36:54 np0005603623 nova_compute[226235]: 2026-01-31 08:36:54.472 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 31 03:36:54 np0005603623 nova_compute[226235]: 2026-01-31 08:36:54.472 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 31 03:36:54 np0005603623 nova_compute[226235]: 2026-01-31 08:36:54.473 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:36:54 np0005603623 nova_compute[226235]: 2026-01-31 08:36:54.622 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:36:54 np0005603623 nova_compute[226235]: 2026-01-31 08:36:54.623 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:36:54 np0005603623 nova_compute[226235]: 2026-01-31 08:36:54.623 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:36:54 np0005603623 nova_compute[226235]: 2026-01-31 08:36:54.623 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 31 03:36:54 np0005603623 nova_compute[226235]: 2026-01-31 08:36:54.624 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:36:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:54 np0005603623 nova_compute[226235]: 2026-01-31 08:36:54.942 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:36:55 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1878260077' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:36:55 np0005603623 nova_compute[226235]: 2026-01-31 08:36:55.069 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:36:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:55.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:55.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:55 np0005603623 nova_compute[226235]: 2026-01-31 08:36:55.533 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000008f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:36:55 np0005603623 nova_compute[226235]: 2026-01-31 08:36:55.533 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000008f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:36:55 np0005603623 nova_compute[226235]: 2026-01-31 08:36:55.656 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:36:55 np0005603623 nova_compute[226235]: 2026-01-31 08:36:55.657 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4339MB free_disk=20.838668823242188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:36:55 np0005603623 nova_compute[226235]: 2026-01-31 08:36:55.657 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:36:55 np0005603623 nova_compute[226235]: 2026-01-31 08:36:55.658 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:36:56 np0005603623 nova_compute[226235]: 2026-01-31 08:36:56.164 226239 INFO nova.virt.libvirt.driver [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Deleting instance files /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b_del#033[00m
Jan 31 03:36:56 np0005603623 nova_compute[226235]: 2026-01-31 08:36:56.165 226239 INFO nova.virt.libvirt.driver [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Deletion of /var/lib/nova/instances/80b18469-4c81-4aa6-b657-efec55ba102b_del complete#033[00m
Jan 31 03:36:56 np0005603623 nova_compute[226235]: 2026-01-31 08:36:56.167 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:56 np0005603623 nova_compute[226235]: 2026-01-31 08:36:56.316 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 80b18469-4c81-4aa6-b657-efec55ba102b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:36:56 np0005603623 nova_compute[226235]: 2026-01-31 08:36:56.317 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:36:56 np0005603623 nova_compute[226235]: 2026-01-31 08:36:56.317 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:36:56 np0005603623 nova_compute[226235]: 2026-01-31 08:36:56.363 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:36:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:36:56 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/700229161' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:36:56 np0005603623 nova_compute[226235]: 2026-01-31 08:36:56.808 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:36:56 np0005603623 nova_compute[226235]: 2026-01-31 08:36:56.813 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:36:57 np0005603623 nova_compute[226235]: 2026-01-31 08:36:57.381 226239 INFO nova.compute.manager [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Took 8.89 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:36:57 np0005603623 nova_compute[226235]: 2026-01-31 08:36:57.382 226239 DEBUG oslo.service.loopingcall [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:36:57 np0005603623 nova_compute[226235]: 2026-01-31 08:36:57.382 226239 DEBUG nova.compute.manager [-] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:36:57 np0005603623 nova_compute[226235]: 2026-01-31 08:36:57.382 226239 DEBUG nova.network.neutron [-] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:36:57 np0005603623 nova_compute[226235]: 2026-01-31 08:36:57.390 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:36:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:57.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:57.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:57 np0005603623 nova_compute[226235]: 2026-01-31 08:36:57.588 226239 DEBUG nova.network.neutron [-] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:36:58 np0005603623 nova_compute[226235]: 2026-01-31 08:36:58.518 226239 DEBUG nova.network.neutron [-] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:36:58 np0005603623 nova_compute[226235]: 2026-01-31 08:36:58.877 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:36:58 np0005603623 nova_compute[226235]: 2026-01-31 08:36:58.878 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:36:59 np0005603623 nova_compute[226235]: 2026-01-31 08:36:59.083 226239 INFO nova.compute.manager [-] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Took 1.70 seconds to deallocate network for instance.#033[00m
Jan 31 03:36:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:36:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:59.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:36:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:36:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:59.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:59 np0005603623 nova_compute[226235]: 2026-01-31 08:36:59.531 226239 DEBUG oslo_concurrency.lockutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:36:59 np0005603623 nova_compute[226235]: 2026-01-31 08:36:59.532 226239 DEBUG oslo_concurrency.lockutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:36:59 np0005603623 nova_compute[226235]: 2026-01-31 08:36:59.622 226239 DEBUG oslo_concurrency.processutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:36:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:59 np0005603623 nova_compute[226235]: 2026-01-31 08:36:59.945 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:37:00 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3657109350' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:37:00 np0005603623 nova_compute[226235]: 2026-01-31 08:37:00.070 226239 DEBUG oslo_concurrency.processutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:00 np0005603623 nova_compute[226235]: 2026-01-31 08:37:00.075 226239 DEBUG nova.compute.provider_tree [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:37:00 np0005603623 nova_compute[226235]: 2026-01-31 08:37:00.494 226239 DEBUG nova.scheduler.client.report [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:37:00 np0005603623 nova_compute[226235]: 2026-01-31 08:37:00.559 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:00 np0005603623 nova_compute[226235]: 2026-01-31 08:37:00.559 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:00 np0005603623 nova_compute[226235]: 2026-01-31 08:37:00.560 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:00 np0005603623 nova_compute[226235]: 2026-01-31 08:37:00.560 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:00 np0005603623 nova_compute[226235]: 2026-01-31 08:37:00.560 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:00 np0005603623 nova_compute[226235]: 2026-01-31 08:37:00.760 226239 DEBUG oslo_concurrency.lockutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:37:01 np0005603623 nova_compute[226235]: 2026-01-31 08:37:01.168 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:37:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:01.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:37:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:01.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:01 np0005603623 nova_compute[226235]: 2026-01-31 08:37:01.613 226239 INFO nova.scheduler.client.report [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Deleted allocations for instance 80b18469-4c81-4aa6-b657-efec55ba102b#033[00m
Jan 31 03:37:02 np0005603623 nova_compute[226235]: 2026-01-31 08:37:02.209 226239 DEBUG oslo_concurrency.lockutils [None req-32a8029e-9b69-4fea-88f8-7d9225ce1f3e 59bd928a3bb24a89af6a53a8392bd344 2c23fe6507444c32a4c0254003f2c6cb - - default default] Lock "80b18469-4c81-4aa6-b657-efec55ba102b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 19.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:37:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:37:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:03.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:37:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:03.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:03 np0005603623 nova_compute[226235]: 2026-01-31 08:37:03.908 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848608.9075267, 80b18469-4c81-4aa6-b657-efec55ba102b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:37:03 np0005603623 nova_compute[226235]: 2026-01-31 08:37:03.909 226239 INFO nova.compute.manager [-] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:37:04 np0005603623 nova_compute[226235]: 2026-01-31 08:37:04.145 226239 DEBUG nova.compute.manager [None req-eb781428-b6ea-428b-b097-9d07dab4d6b5 - - - - - -] [instance: 80b18469-4c81-4aa6-b657-efec55ba102b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:37:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:04 np0005603623 nova_compute[226235]: 2026-01-31 08:37:04.947 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:37:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:05.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:37:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:05.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:06 np0005603623 nova_compute[226235]: 2026-01-31 08:37:06.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:06 np0005603623 nova_compute[226235]: 2026-01-31 08:37:06.170 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:07.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:07.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:07 np0005603623 podman[292692]: 2026-01-31 08:37:07.952935768 +0000 UTC m=+0.044167951 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 03:37:08 np0005603623 podman[292693]: 2026-01-31 08:37:08.012298304 +0000 UTC m=+0.105216769 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:37:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:37:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:09.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:37:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:09.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:09 np0005603623 nova_compute[226235]: 2026-01-31 08:37:09.948 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:10 np0005603623 nova_compute[226235]: 2026-01-31 08:37:10.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:11 np0005603623 nova_compute[226235]: 2026-01-31 08:37:11.171 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:37:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:11.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:37:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:11.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:37:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:13.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:37:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:13.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:37:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2981826607' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:37:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:37:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2981826607' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:37:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:14 np0005603623 nova_compute[226235]: 2026-01-31 08:37:14.950 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:15.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:15.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:16 np0005603623 nova_compute[226235]: 2026-01-31 08:37:16.173 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:16 np0005603623 nova_compute[226235]: 2026-01-31 08:37:16.316 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:37:16 np0005603623 nova_compute[226235]: 2026-01-31 08:37:16.316 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:37:16 np0005603623 nova_compute[226235]: 2026-01-31 08:37:16.546 226239 DEBUG nova.compute.manager [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:37:17 np0005603623 nova_compute[226235]: 2026-01-31 08:37:17.152 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:37:17 np0005603623 nova_compute[226235]: 2026-01-31 08:37:17.153 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:37:17 np0005603623 nova_compute[226235]: 2026-01-31 08:37:17.162 226239 DEBUG nova.virt.hardware [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:37:17 np0005603623 nova_compute[226235]: 2026-01-31 08:37:17.163 226239 INFO nova.compute.claims [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:37:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:37:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:17.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:37:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:37:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:17.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:37:17 np0005603623 nova_compute[226235]: 2026-01-31 08:37:17.523 226239 DEBUG oslo_concurrency.processutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:37:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:37:17 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3579971505' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:37:17 np0005603623 nova_compute[226235]: 2026-01-31 08:37:17.931 226239 DEBUG oslo_concurrency.processutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:17 np0005603623 nova_compute[226235]: 2026-01-31 08:37:17.938 226239 DEBUG nova.compute.provider_tree [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:37:18 np0005603623 nova_compute[226235]: 2026-01-31 08:37:18.306 226239 DEBUG nova.scheduler.client.report [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:37:18 np0005603623 nova_compute[226235]: 2026-01-31 08:37:18.440 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:37:18 np0005603623 nova_compute[226235]: 2026-01-31 08:37:18.441 226239 DEBUG nova.compute.manager [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:37:18 np0005603623 nova_compute[226235]: 2026-01-31 08:37:18.815 226239 DEBUG nova.compute.manager [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:37:18 np0005603623 nova_compute[226235]: 2026-01-31 08:37:18.815 226239 DEBUG nova.network.neutron [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:37:18 np0005603623 nova_compute[226235]: 2026-01-31 08:37:18.963 226239 INFO nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:37:19 np0005603623 nova_compute[226235]: 2026-01-31 08:37:19.042 226239 DEBUG nova.compute.manager [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:37:19 np0005603623 nova_compute[226235]: 2026-01-31 08:37:19.193 226239 INFO nova.virt.block_device [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Booting with volume 7ff7e47c-5991-4995-b62d-9010ad81e5bf at /dev/vda#033[00m
Jan 31 03:37:19 np0005603623 nova_compute[226235]: 2026-01-31 08:37:19.357 226239 DEBUG nova.policy [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c13cac16cdef424b8050c4bea5a7e9c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fcc39962194d44e5b37cad3fb1adc6c4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:37:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:37:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:19.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:37:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:19.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:19 np0005603623 nova_compute[226235]: 2026-01-31 08:37:19.953 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.051 226239 DEBUG os_brick.utils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.052 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.061 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.061 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[ef2966bc-f9bf-4de2-9968-ef869c01819e]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.062 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.067 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.005s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.067 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e85c2e-4aa1-49d2-a2ee-a78da5b43aff]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.069 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.075 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.076 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[70fc1700-e639-4a71-9bb3-0cb11993f1b7]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.077 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a4d411-c9e3-4643-a228-5a0520a8494b]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.078 226239 DEBUG oslo_concurrency.processutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.098 226239 DEBUG oslo_concurrency.processutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CMD "nvme version" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.100 226239 DEBUG os_brick.initiator.connectors.lightos [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.100 226239 DEBUG os_brick.initiator.connectors.lightos [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.101 226239 DEBUG os_brick.initiator.connectors.lightos [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.101 226239 DEBUG os_brick.utils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] <== get_connector_properties: return (49ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:37:20 np0005603623 nova_compute[226235]: 2026-01-31 08:37:20.101 226239 DEBUG nova.virt.block_device [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Updating existing volume attachment record: dcaca1fd-2764-4e7e-a587-8401069935cb _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:37:21 np0005603623 nova_compute[226235]: 2026-01-31 08:37:21.175 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:21.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:21.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:22 np0005603623 nova_compute[226235]: 2026-01-31 08:37:22.036 226239 DEBUG nova.compute.manager [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:37:22 np0005603623 nova_compute[226235]: 2026-01-31 08:37:22.038 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:37:22 np0005603623 nova_compute[226235]: 2026-01-31 08:37:22.038 226239 INFO nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Creating image(s)#033[00m
Jan 31 03:37:22 np0005603623 nova_compute[226235]: 2026-01-31 08:37:22.039 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:37:22 np0005603623 nova_compute[226235]: 2026-01-31 08:37:22.039 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Ensure instance console log exists: /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:37:22 np0005603623 nova_compute[226235]: 2026-01-31 08:37:22.039 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:37:22 np0005603623 nova_compute[226235]: 2026-01-31 08:37:22.039 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:37:22 np0005603623 nova_compute[226235]: 2026-01-31 08:37:22.040 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:37:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:22.128 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:37:22 np0005603623 nova_compute[226235]: 2026-01-31 08:37:22.128 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:22.130 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:37:23 np0005603623 nova_compute[226235]: 2026-01-31 08:37:23.333 226239 DEBUG nova.network.neutron [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Successfully created port: ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:37:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:23.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:23.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:24.132 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:37:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:24 np0005603623 nova_compute[226235]: 2026-01-31 08:37:24.956 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:25 np0005603623 nova_compute[226235]: 2026-01-31 08:37:25.398 226239 DEBUG nova.network.neutron [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Successfully updated port: ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:37:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:25.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:25 np0005603623 nova_compute[226235]: 2026-01-31 08:37:25.491 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Acquiring lock "refresh_cache-91f04a93-ce38-4962-919b-e1ac0677b4da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:37:25 np0005603623 nova_compute[226235]: 2026-01-31 08:37:25.491 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Acquired lock "refresh_cache-91f04a93-ce38-4962-919b-e1ac0677b4da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:37:25 np0005603623 nova_compute[226235]: 2026-01-31 08:37:25.491 226239 DEBUG nova.network.neutron [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:37:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:37:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:25.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:37:25 np0005603623 nova_compute[226235]: 2026-01-31 08:37:25.644 226239 DEBUG nova.compute.manager [req-bb0f4406-bf91-4104-a66a-23a3f79241ec req-c06c257f-2f57-4441-9211-9f211edd548b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event network-changed-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:37:25 np0005603623 nova_compute[226235]: 2026-01-31 08:37:25.644 226239 DEBUG nova.compute.manager [req-bb0f4406-bf91-4104-a66a-23a3f79241ec req-c06c257f-2f57-4441-9211-9f211edd548b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Refreshing instance network info cache due to event network-changed-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:37:25 np0005603623 nova_compute[226235]: 2026-01-31 08:37:25.644 226239 DEBUG oslo_concurrency.lockutils [req-bb0f4406-bf91-4104-a66a-23a3f79241ec req-c06c257f-2f57-4441-9211-9f211edd548b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-91f04a93-ce38-4962-919b-e1ac0677b4da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:37:25 np0005603623 nova_compute[226235]: 2026-01-31 08:37:25.895 226239 DEBUG nova.network.neutron [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:37:26 np0005603623 nova_compute[226235]: 2026-01-31 08:37:26.178 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:27.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:27.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:29.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:29.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:29 np0005603623 nova_compute[226235]: 2026-01-31 08:37:29.958 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:30.128 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:37:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:30.128 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:37:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:30.128 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.862 226239 DEBUG nova.network.neutron [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Updating instance_info_cache with network_info: [{"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.940 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Releasing lock "refresh_cache-91f04a93-ce38-4962-919b-e1ac0677b4da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.941 226239 DEBUG nova.compute.manager [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Instance network_info: |[{"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.941 226239 DEBUG oslo_concurrency.lockutils [req-bb0f4406-bf91-4104-a66a-23a3f79241ec req-c06c257f-2f57-4441-9211-9f211edd548b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-91f04a93-ce38-4962-919b-e1ac0677b4da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.941 226239 DEBUG nova.network.neutron [req-bb0f4406-bf91-4104-a66a-23a3f79241ec req-c06c257f-2f57-4441-9211-9f211edd548b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Refreshing network info cache for port ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.944 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Start _get_guest_xml network_info=[{"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': 'dcaca1fd-2764-4e7e-a587-8401069935cb', 'delete_on_termination': True, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-7ff7e47c-5991-4995-b62d-9010ad81e5bf', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '7ff7e47c-5991-4995-b62d-9010ad81e5bf', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '91f04a93-ce38-4962-919b-e1ac0677b4da', 'attached_at': '', 'detached_at': '', 'volume_id': '7ff7e47c-5991-4995-b62d-9010ad81e5bf', 'serial': '7ff7e47c-5991-4995-b62d-9010ad81e5bf'}, 'volume_type': None}], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.947 226239 WARNING nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.957 226239 DEBUG nova.virt.libvirt.host [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.958 226239 DEBUG nova.virt.libvirt.host [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.962 226239 DEBUG nova.virt.libvirt.host [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.963 226239 DEBUG nova.virt.libvirt.host [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.964 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.965 226239 DEBUG nova.virt.hardware [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.965 226239 DEBUG nova.virt.hardware [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.965 226239 DEBUG nova.virt.hardware [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.966 226239 DEBUG nova.virt.hardware [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.966 226239 DEBUG nova.virt.hardware [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.966 226239 DEBUG nova.virt.hardware [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.966 226239 DEBUG nova.virt.hardware [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.967 226239 DEBUG nova.virt.hardware [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.967 226239 DEBUG nova.virt.hardware [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.967 226239 DEBUG nova.virt.hardware [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.967 226239 DEBUG nova.virt.hardware [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.989 226239 DEBUG nova.storage.rbd_utils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] rbd image 91f04a93-ce38-4962-919b-e1ac0677b4da_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:37:30 np0005603623 nova_compute[226235]: 2026-01-31 08:37:30.992 226239 DEBUG oslo_concurrency.processutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.179 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:37:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4031166104' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:37:31 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.399 226239 DEBUG oslo_concurrency.processutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:31.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:31.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.537 226239 DEBUG nova.virt.libvirt.vif [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:37:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-781350970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-781350970',id=145,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH5gkx3V7BznMz2Ml1SKHQUD/7nRL++Tb8fgaa8FBib96ELGecwEzyV24CRvXhcVt4CV7M8yM6O/exmt6u050Gx0p7dChbKR6iHlWT/0AVPPaPVFOCFMpSshNGBUdPtSXg==',key_name='tempest-keypair-1508930548',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fcc39962194d44e5b37cad3fb1adc6c4',ramdisk_id='',reservation_id='r-i2540rfe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_
name='tempest-ServerActionsV293TestJSON-769176929',owner_user_name='tempest-ServerActionsV293TestJSON-769176929-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:37:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c13cac16cdef424b8050c4bea5a7e9c3',uuid=91f04a93-ce38-4962-919b-e1ac0677b4da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.537 226239 DEBUG nova.network.os_vif_util [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Converting VIF {"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.538 226239 DEBUG nova.network.os_vif_util [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.539 226239 DEBUG nova.objects.instance [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 91f04a93-ce38-4962-919b-e1ac0677b4da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.638 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  <uuid>91f04a93-ce38-4962-919b-e1ac0677b4da</uuid>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  <name>instance-00000091</name>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsV293TestJSON-server-781350970</nova:name>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:37:30</nova:creationTime>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <nova:user uuid="c13cac16cdef424b8050c4bea5a7e9c3">tempest-ServerActionsV293TestJSON-769176929-project-member</nova:user>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <nova:project uuid="fcc39962194d44e5b37cad3fb1adc6c4">tempest-ServerActionsV293TestJSON-769176929</nova:project>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <nova:port uuid="ff4db50a-8c68-47ef-a6b0-c8caeab94ae5">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <entry name="serial">91f04a93-ce38-4962-919b-e1ac0677b4da</entry>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <entry name="uuid">91f04a93-ce38-4962-919b-e1ac0677b4da</entry>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/91f04a93-ce38-4962-919b-e1ac0677b4da_disk.config">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-7ff7e47c-5991-4995-b62d-9010ad81e5bf">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <serial>7ff7e47c-5991-4995-b62d-9010ad81e5bf</serial>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:7b:82:ab"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <target dev="tapff4db50a-8c"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/console.log" append="off"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:37:31 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:37:31 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:37:31 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:37:31 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.639 226239 DEBUG nova.compute.manager [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Preparing to wait for external event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.639 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.640 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.640 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.641 226239 DEBUG nova.virt.libvirt.vif [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:37:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-781350970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-781350970',id=145,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH5gkx3V7BznMz2Ml1SKHQUD/7nRL++Tb8fgaa8FBib96ELGecwEzyV24CRvXhcVt4CV7M8yM6O/exmt6u050Gx0p7dChbKR6iHlWT/0AVPPaPVFOCFMpSshNGBUdPtSXg==',key_name='tempest-keypair-1508930548',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fcc39962194d44e5b37cad3fb1adc6c4',ramdisk_id='',reservation_id='r-i2540rfe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-ServerActionsV293TestJSON-769176929',owner_user_name='tempest-ServerActionsV293TestJSON-769176929-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:37:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c13cac16cdef424b8050c4bea5a7e9c3',uuid=91f04a93-ce38-4962-919b-e1ac0677b4da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.641 226239 DEBUG nova.network.os_vif_util [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Converting VIF {"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.642 226239 DEBUG nova.network.os_vif_util [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.642 226239 DEBUG os_vif [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.643 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.643 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.644 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.647 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.647 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff4db50a-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.647 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapff4db50a-8c, col_values=(('external_ids', {'iface-id': 'ff4db50a-8c68-47ef-a6b0-c8caeab94ae5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:82:ab', 'vm-uuid': '91f04a93-ce38-4962-919b-e1ac0677b4da'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.649 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:31 np0005603623 NetworkManager[48970]: <info>  [1769848651.6498] manager: (tapff4db50a-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.651 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.654 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.655 226239 INFO os_vif [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c')#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.777 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.777 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.777 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] No VIF found with MAC fa:16:3e:7b:82:ab, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.778 226239 INFO nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Using config drive#033[00m
Jan 31 03:37:31 np0005603623 nova_compute[226235]: 2026-01-31 08:37:31.797 226239 DEBUG nova.storage.rbd_utils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] rbd image 91f04a93-ce38-4962-919b-e1ac0677b4da_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:37:33 np0005603623 nova_compute[226235]: 2026-01-31 08:37:33.344 226239 INFO nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Creating config drive at /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/disk.config#033[00m
Jan 31 03:37:33 np0005603623 nova_compute[226235]: 2026-01-31 08:37:33.349 226239 DEBUG oslo_concurrency.processutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpsgdjs_uy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:37:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:33.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:33 np0005603623 nova_compute[226235]: 2026-01-31 08:37:33.471 226239 DEBUG oslo_concurrency.processutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpsgdjs_uy" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:33 np0005603623 nova_compute[226235]: 2026-01-31 08:37:33.496 226239 DEBUG nova.storage.rbd_utils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] rbd image 91f04a93-ce38-4962-919b-e1ac0677b4da_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:37:33 np0005603623 nova_compute[226235]: 2026-01-31 08:37:33.500 226239 DEBUG oslo_concurrency.processutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/disk.config 91f04a93-ce38-4962-919b-e1ac0677b4da_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:37:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.005000158s ======
Jan 31 03:37:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:33.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000158s
Jan 31 03:37:33 np0005603623 nova_compute[226235]: 2026-01-31 08:37:33.655 226239 DEBUG oslo_concurrency.processutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/disk.config 91f04a93-ce38-4962-919b-e1ac0677b4da_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:33 np0005603623 nova_compute[226235]: 2026-01-31 08:37:33.657 226239 INFO nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Deleting local config drive /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/disk.config because it was imported into RBD.#033[00m
Jan 31 03:37:33 np0005603623 kernel: tapff4db50a-8c: entered promiscuous mode
Jan 31 03:37:33 np0005603623 NetworkManager[48970]: <info>  [1769848653.6931] manager: (tapff4db50a-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/282)
Jan 31 03:37:33 np0005603623 ovn_controller[133449]: 2026-01-31T08:37:33Z|00597|binding|INFO|Claiming lport ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 for this chassis.
Jan 31 03:37:33 np0005603623 ovn_controller[133449]: 2026-01-31T08:37:33Z|00598|binding|INFO|ff4db50a-8c68-47ef-a6b0-c8caeab94ae5: Claiming fa:16:3e:7b:82:ab 10.100.0.8
Jan 31 03:37:33 np0005603623 nova_compute[226235]: 2026-01-31 08:37:33.693 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:33 np0005603623 nova_compute[226235]: 2026-01-31 08:37:33.698 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:33 np0005603623 systemd-machined[194379]: New machine qemu-68-instance-00000091.
Jan 31 03:37:33 np0005603623 nova_compute[226235]: 2026-01-31 08:37:33.718 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:33 np0005603623 ovn_controller[133449]: 2026-01-31T08:37:33Z|00599|binding|INFO|Setting lport ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 ovn-installed in OVS
Jan 31 03:37:33 np0005603623 nova_compute[226235]: 2026-01-31 08:37:33.721 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:33 np0005603623 systemd[1]: Started Virtual Machine qemu-68-instance-00000091.
Jan 31 03:37:33 np0005603623 systemd-udevd[292994]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:37:33 np0005603623 NetworkManager[48970]: <info>  [1769848653.7585] device (tapff4db50a-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:37:33 np0005603623 NetworkManager[48970]: <info>  [1769848653.7592] device (tapff4db50a-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.135 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848654.1345832, 91f04a93-ce38-4962-919b-e1ac0677b4da => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.136 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] VM Started (Lifecycle Event)#033[00m
Jan 31 03:37:34 np0005603623 ovn_controller[133449]: 2026-01-31T08:37:34Z|00600|binding|INFO|Setting lport ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 up in Southbound
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.209 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:82:ab 10.100.0.8'], port_security=['fa:16:3e:7b:82:ab 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '91f04a93-ce38-4962-919b-e1ac0677b4da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fcc39962194d44e5b37cad3fb1adc6c4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '71fde71b-c204-4df2-b3b2-d40465166772', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2a1d055-9eac-4b84-8a08-5dfefe7d7d79, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.210 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 in datapath 9ade8f79-180d-4cd9-82d7-f3d41cab1210 bound to our chassis#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.211 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9ade8f79-180d-4cd9-82d7-f3d41cab1210#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.220 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[006ce847-9b61-4a61-916a-5b92ca71467b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.221 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9ade8f79-11 in ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.222 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9ade8f79-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.222 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba69b52-2a77-4e35-8945-925a77f0fb11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.223 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[722458d3-141d-436f-bad3-52155bb2949d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.235 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[3539c61e-edb3-4a64-876d-7d01cd36bc44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.255 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c079b40c-28f3-46be-abc8-688204d6d6e2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.277 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1c1d2d78-955b-4f9f-b5c4-0ee214ce8d0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.281 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c398b510-db14-47b1-8e3a-0f4021936eea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 NetworkManager[48970]: <info>  [1769848654.2830] manager: (tap9ade8f79-10): new Veth device (/org/freedesktop/NetworkManager/Devices/283)
Jan 31 03:37:34 np0005603623 systemd-udevd[292998]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.291 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.298 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848654.1356883, 91f04a93-ce38-4962-919b-e1ac0677b4da => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.298 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.304 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[fc42ae8f-db5d-436a-8e4f-cb92430808e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.306 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[22217cb5-27e9-4d22-9d97-af3e12908f34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 NetworkManager[48970]: <info>  [1769848654.3223] device (tap9ade8f79-10): carrier: link connected
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.326 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[dd24e555-67dd-40d8-9cff-5f49383a3d0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.338 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ba31eec2-e8c2-4253-be2a-6b2f277575b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9ade8f79-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:8e:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 779475, 'reachable_time': 37380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293069, 'error': None, 'target': 'ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.349 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3f0cf726-0242-413c-9da3-b51dd3fda150]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:8e96'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 779475, 'tstamp': 779475}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 293070, 'error': None, 'target': 'ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.361 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[08f7695d-dda2-4e64-a4e4-c9d72aa5c555]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9ade8f79-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:8e:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 779475, 'reachable_time': 37380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 293071, 'error': None, 'target': 'ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.389 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6b67b2c7-43f5-4697-9c65-1cf20da9889a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.442 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dc10b980-6c11-4e2d-9b16-000c1dd657f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.444 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ade8f79-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.444 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.444 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ade8f79-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.446 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:34 np0005603623 kernel: tap9ade8f79-10: entered promiscuous mode
Jan 31 03:37:34 np0005603623 NetworkManager[48970]: <info>  [1769848654.4472] manager: (tap9ade8f79-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.448 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.450 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9ade8f79-10, col_values=(('external_ids', {'iface-id': '376b816c-9608-42bc-a912-f97f5e7fdb78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:37:34 np0005603623 ovn_controller[133449]: 2026-01-31T08:37:34Z|00601|binding|INFO|Releasing lport 376b816c-9608-42bc-a912-f97f5e7fdb78 from this chassis (sb_readonly=0)
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.451 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.452 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9ade8f79-180d-4cd9-82d7-f3d41cab1210.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9ade8f79-180d-4cd9-82d7-f3d41cab1210.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.453 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[65124267-ef76-477b-9af2-57b0972b985d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.454 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-9ade8f79-180d-4cd9-82d7-f3d41cab1210
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/9ade8f79-180d-4cd9-82d7-f3d41cab1210.pid.haproxy
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 9ade8f79-180d-4cd9-82d7-f3d41cab1210
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:37:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:37:34.454 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'env', 'PROCESS_TAG=haproxy-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9ade8f79-180d-4cd9-82d7-f3d41cab1210.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.456 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.623 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.626 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.709 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:37:34 np0005603623 podman[293104]: 2026-01-31 08:37:34.749605827 +0000 UTC m=+0.019523581 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:37:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:34 np0005603623 podman[293104]: 2026-01-31 08:37:34.933813564 +0000 UTC m=+0.203731308 container create 248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:37:34 np0005603623 nova_compute[226235]: 2026-01-31 08:37:34.960 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:35 np0005603623 systemd[1]: Started libpod-conmon-248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9.scope.
Jan 31 03:37:35 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:37:35 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fa60ecdc99345a913f6b19439d5f2be67b99ce0a3847bbe4d0c48c2047481a4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:37:35 np0005603623 nova_compute[226235]: 2026-01-31 08:37:35.090 226239 DEBUG nova.network.neutron [req-bb0f4406-bf91-4104-a66a-23a3f79241ec req-c06c257f-2f57-4441-9211-9f211edd548b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Updated VIF entry in instance network info cache for port ff4db50a-8c68-47ef-a6b0-c8caeab94ae5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:37:35 np0005603623 nova_compute[226235]: 2026-01-31 08:37:35.091 226239 DEBUG nova.network.neutron [req-bb0f4406-bf91-4104-a66a-23a3f79241ec req-c06c257f-2f57-4441-9211-9f211edd548b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Updating instance_info_cache with network_info: [{"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:37:35 np0005603623 podman[293104]: 2026-01-31 08:37:35.111836578 +0000 UTC m=+0.381754352 container init 248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:37:35 np0005603623 podman[293104]: 2026-01-31 08:37:35.116975478 +0000 UTC m=+0.386893232 container start 248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Jan 31 03:37:35 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[293120]: [NOTICE]   (293124) : New worker (293126) forked
Jan 31 03:37:35 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[293120]: [NOTICE]   (293124) : Loading success.
Jan 31 03:37:35 np0005603623 nova_compute[226235]: 2026-01-31 08:37:35.156 226239 DEBUG oslo_concurrency.lockutils [req-bb0f4406-bf91-4104-a66a-23a3f79241ec req-c06c257f-2f57-4441-9211-9f211edd548b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-91f04a93-ce38-4962-919b-e1ac0677b4da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:37:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:35.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:35.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:36 np0005603623 nova_compute[226235]: 2026-01-31 08:37:36.649 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:37:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:37.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:37:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:37.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:37 np0005603623 nova_compute[226235]: 2026-01-31 08:37:37.659 226239 DEBUG nova.compute.manager [req-2525bae9-e93e-4ebc-9b5f-79332484d818 req-b4a1f65c-ca9e-4bb3-9b48-5511f4a9d070 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:37:37 np0005603623 nova_compute[226235]: 2026-01-31 08:37:37.659 226239 DEBUG oslo_concurrency.lockutils [req-2525bae9-e93e-4ebc-9b5f-79332484d818 req-b4a1f65c-ca9e-4bb3-9b48-5511f4a9d070 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:37:37 np0005603623 nova_compute[226235]: 2026-01-31 08:37:37.660 226239 DEBUG oslo_concurrency.lockutils [req-2525bae9-e93e-4ebc-9b5f-79332484d818 req-b4a1f65c-ca9e-4bb3-9b48-5511f4a9d070 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:37:37 np0005603623 nova_compute[226235]: 2026-01-31 08:37:37.660 226239 DEBUG oslo_concurrency.lockutils [req-2525bae9-e93e-4ebc-9b5f-79332484d818 req-b4a1f65c-ca9e-4bb3-9b48-5511f4a9d070 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:37:37 np0005603623 nova_compute[226235]: 2026-01-31 08:37:37.660 226239 DEBUG nova.compute.manager [req-2525bae9-e93e-4ebc-9b5f-79332484d818 req-b4a1f65c-ca9e-4bb3-9b48-5511f4a9d070 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Processing event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 31 03:37:37 np0005603623 nova_compute[226235]: 2026-01-31 08:37:37.660 226239 DEBUG nova.compute.manager [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:37:37 np0005603623 nova_compute[226235]: 2026-01-31 08:37:37.665 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:37:37 np0005603623 nova_compute[226235]: 2026-01-31 08:37:37.665 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848657.6650753, 91f04a93-ce38-4962-919b-e1ac0677b4da => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:37:37 np0005603623 nova_compute[226235]: 2026-01-31 08:37:37.666 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] VM Resumed (Lifecycle Event)
Jan 31 03:37:37 np0005603623 nova_compute[226235]: 2026-01-31 08:37:37.670 226239 INFO nova.virt.libvirt.driver [-] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Instance spawned successfully.
Jan 31 03:37:37 np0005603623 nova_compute[226235]: 2026-01-31 08:37:37.671 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:37:38 np0005603623 nova_compute[226235]: 2026-01-31 08:37:38.398 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:37:38 np0005603623 nova_compute[226235]: 2026-01-31 08:37:38.403 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:37:38 np0005603623 nova_compute[226235]: 2026-01-31 08:37:38.405 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:37:38 np0005603623 nova_compute[226235]: 2026-01-31 08:37:38.406 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:37:38 np0005603623 nova_compute[226235]: 2026-01-31 08:37:38.406 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:37:38 np0005603623 nova_compute[226235]: 2026-01-31 08:37:38.406 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:37:38 np0005603623 nova_compute[226235]: 2026-01-31 08:37:38.407 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:37:38 np0005603623 nova_compute[226235]: 2026-01-31 08:37:38.407 226239 DEBUG nova.virt.libvirt.driver [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:37:38 np0005603623 podman[293137]: 2026-01-31 08:37:38.94935737 +0000 UTC m=+0.043231583 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:37:38 np0005603623 podman[293138]: 2026-01-31 08:37:38.968681193 +0000 UTC m=+0.063968631 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 31 03:37:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:39.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:39.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:39 np0005603623 nova_compute[226235]: 2026-01-31 08:37:39.963 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:37:40 np0005603623 nova_compute[226235]: 2026-01-31 08:37:40.101 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:37:40 np0005603623 nova_compute[226235]: 2026-01-31 08:37:40.960 226239 DEBUG nova.compute.manager [req-ea6859ac-4979-4290-95d4-8b226282da50 req-90d9fd19-c0e2-4aed-accc-3c00765367de fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:37:40 np0005603623 nova_compute[226235]: 2026-01-31 08:37:40.961 226239 DEBUG oslo_concurrency.lockutils [req-ea6859ac-4979-4290-95d4-8b226282da50 req-90d9fd19-c0e2-4aed-accc-3c00765367de fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:37:40 np0005603623 nova_compute[226235]: 2026-01-31 08:37:40.961 226239 DEBUG oslo_concurrency.lockutils [req-ea6859ac-4979-4290-95d4-8b226282da50 req-90d9fd19-c0e2-4aed-accc-3c00765367de fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:37:40 np0005603623 nova_compute[226235]: 2026-01-31 08:37:40.961 226239 DEBUG oslo_concurrency.lockutils [req-ea6859ac-4979-4290-95d4-8b226282da50 req-90d9fd19-c0e2-4aed-accc-3c00765367de fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:37:40 np0005603623 nova_compute[226235]: 2026-01-31 08:37:40.961 226239 DEBUG nova.compute.manager [req-ea6859ac-4979-4290-95d4-8b226282da50 req-90d9fd19-c0e2-4aed-accc-3c00765367de fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] No waiting events found dispatching network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:37:40 np0005603623 nova_compute[226235]: 2026-01-31 08:37:40.962 226239 WARNING nova.compute.manager [req-ea6859ac-4979-4290-95d4-8b226282da50 req-90d9fd19-c0e2-4aed-accc-3c00765367de fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received unexpected event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 for instance with vm_state building and task_state spawning.
Jan 31 03:37:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:37:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:37:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:37:41 np0005603623 nova_compute[226235]: 2026-01-31 08:37:41.269 226239 INFO nova.compute.manager [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Took 19.23 seconds to spawn the instance on the hypervisor.
Jan 31 03:37:41 np0005603623 nova_compute[226235]: 2026-01-31 08:37:41.270 226239 DEBUG nova.compute.manager [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:37:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:37:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:41.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:37:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:41.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:41 np0005603623 nova_compute[226235]: 2026-01-31 08:37:41.652 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:37:42 np0005603623 nova_compute[226235]: 2026-01-31 08:37:42.732 226239 INFO nova.compute.manager [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Took 25.84 seconds to build instance.
Jan 31 03:37:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:43.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:37:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:43.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:37:44 np0005603623 nova_compute[226235]: 2026-01-31 08:37:44.077 226239 DEBUG oslo_concurrency.lockutils [None req-dd5bd3c4-1f12-4fa6-ab90-9b4d8ee00417 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 27.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:37:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:44 np0005603623 nova_compute[226235]: 2026-01-31 08:37:44.965 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:37:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:45.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:45.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:46 np0005603623 nova_compute[226235]: 2026-01-31 08:37:46.655 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:37:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:47.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:47.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:49 np0005603623 nova_compute[226235]: 2026-01-31 08:37:49.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:37:49 np0005603623 nova_compute[226235]: 2026-01-31 08:37:49.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 03:37:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:37:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:49.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:37:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:49.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:37:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:37:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:49 np0005603623 nova_compute[226235]: 2026-01-31 08:37:49.965 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:37:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:37:50Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:82:ab 10.100.0.8
Jan 31 03:37:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:37:50Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:82:ab 10.100.0.8
Jan 31 03:37:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:51.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:51.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:51 np0005603623 nova_compute[226235]: 2026-01-31 08:37:51.657 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:37:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:37:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:53.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:37:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:53.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:54 np0005603623 nova_compute[226235]: 2026-01-31 08:37:54.967 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:37:55 np0005603623 nova_compute[226235]: 2026-01-31 08:37:55.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:37:55 np0005603623 nova_compute[226235]: 2026-01-31 08:37:55.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 03:37:55 np0005603623 nova_compute[226235]: 2026-01-31 08:37:55.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 03:37:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:37:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:55.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:37:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:37:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:55.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:37:56 np0005603623 nova_compute[226235]: 2026-01-31 08:37:56.659 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:37:57 np0005603623 ceph-mgr[77391]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 03:37:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:37:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:57.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:37:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:57.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:59.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:37:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:59.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:59 np0005603623 nova_compute[226235]: 2026-01-31 08:37:59.969 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:38:00 np0005603623 nova_compute[226235]: 2026-01-31 08:38:00.096 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-91f04a93-ce38-4962-919b-e1ac0677b4da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:38:00 np0005603623 nova_compute[226235]: 2026-01-31 08:38:00.097 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-91f04a93-ce38-4962-919b-e1ac0677b4da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:38:00 np0005603623 nova_compute[226235]: 2026-01-31 08:38:00.097 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 31 03:38:00 np0005603623 nova_compute[226235]: 2026-01-31 08:38:00.097 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 91f04a93-ce38-4962-919b-e1ac0677b4da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:38:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:01.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:01.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:01 np0005603623 nova_compute[226235]: 2026-01-31 08:38:01.792 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #121. Immutable memtables: 0.
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.095037) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 75] Flushing memtable with next log file: 121
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848682095118, "job": 75, "event": "flush_started", "num_memtables": 1, "num_entries": 1627, "num_deletes": 252, "total_data_size": 3711468, "memory_usage": 3746280, "flush_reason": "Manual Compaction"}
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 75] Level-0 flush table #122: started
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848682121109, "cf_name": "default", "job": 75, "event": "table_file_creation", "file_number": 122, "file_size": 1489167, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 60918, "largest_seqno": 62540, "table_properties": {"data_size": 1483842, "index_size": 2593, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 14077, "raw_average_key_size": 21, "raw_value_size": 1472163, "raw_average_value_size": 2213, "num_data_blocks": 115, "num_entries": 665, "num_filter_entries": 665, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848544, "oldest_key_time": 1769848544, "file_creation_time": 1769848682, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 122, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 75] Flush lasted 26122 microseconds, and 3965 cpu microseconds.
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.121157) [db/flush_job.cc:967] [default] [JOB 75] Level-0 flush table #122: 1489167 bytes OK
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.121178) [db/memtable_list.cc:519] [default] Level-0 commit table #122 started
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.131350) [db/memtable_list.cc:722] [default] Level-0 commit table #122: memtable #1 done
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.131388) EVENT_LOG_v1 {"time_micros": 1769848682131381, "job": 75, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.131409) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 75] Try to delete WAL files size 3703979, prev total WAL file size 3720030, number of live WAL files 2.
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000118.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.132233) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032303033' seq:72057594037927935, type:22 .. '6D6772737461740032323534' seq:0, type:0; will stop at (end)
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 76] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 75 Base level 0, inputs: [122(1454KB)], [120(12MB)]
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848682132297, "job": 76, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [122], "files_L6": [120], "score": -1, "input_data_size": 14274511, "oldest_snapshot_seqno": -1}
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 76] Generated table #123: 8669 keys, 11381354 bytes, temperature: kUnknown
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848682326744, "cf_name": "default", "job": 76, "event": "table_file_creation", "file_number": 123, "file_size": 11381354, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11325547, "index_size": 33039, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21701, "raw_key_size": 224582, "raw_average_key_size": 25, "raw_value_size": 11173829, "raw_average_value_size": 1288, "num_data_blocks": 1291, "num_entries": 8669, "num_filter_entries": 8669, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769848682, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 123, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.327005) [db/compaction/compaction_job.cc:1663] [default] [JOB 76] Compacted 1@0 + 1@6 files to L6 => 11381354 bytes
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.329720) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 73.4 rd, 58.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 12.2 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(17.2) write-amplify(7.6) OK, records in: 9127, records dropped: 458 output_compression: NoCompression
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.329741) EVENT_LOG_v1 {"time_micros": 1769848682329731, "job": 76, "event": "compaction_finished", "compaction_time_micros": 194524, "compaction_time_cpu_micros": 25450, "output_level": 6, "num_output_files": 1, "total_output_size": 11381354, "num_input_records": 9127, "num_output_records": 8669, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000122.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848682330044, "job": 76, "event": "table_file_deletion", "file_number": 122}
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000120.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848682331276, "job": 76, "event": "table_file_deletion", "file_number": 120}
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.132145) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.331416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.331424) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.331426) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.331428) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:02 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:02.331430) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:03.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:03.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:04 np0005603623 nova_compute[226235]: 2026-01-31 08:38:04.295 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:04 np0005603623 NetworkManager[48970]: <info>  [1769848684.2960] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Jan 31 03:38:04 np0005603623 NetworkManager[48970]: <info>  [1769848684.2971] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/286)
Jan 31 03:38:04 np0005603623 nova_compute[226235]: 2026-01-31 08:38:04.331 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:04Z|00602|binding|INFO|Releasing lport 376b816c-9608-42bc-a912-f97f5e7fdb78 from this chassis (sb_readonly=0)
Jan 31 03:38:04 np0005603623 nova_compute[226235]: 2026-01-31 08:38:04.347 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:04 np0005603623 nova_compute[226235]: 2026-01-31 08:38:04.970 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:05.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:05.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:06 np0005603623 nova_compute[226235]: 2026-01-31 08:38:06.796 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:07.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:07.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:09 np0005603623 nova_compute[226235]: 2026-01-31 08:38:09.256 226239 DEBUG nova.compute.manager [req-58a4ea75-0315-4901-a514-2023caf0dac1 req-4e2203b7-6b4d-44c3-9ac5-8f8b790af27b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event network-changed-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:38:09 np0005603623 nova_compute[226235]: 2026-01-31 08:38:09.256 226239 DEBUG nova.compute.manager [req-58a4ea75-0315-4901-a514-2023caf0dac1 req-4e2203b7-6b4d-44c3-9ac5-8f8b790af27b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Refreshing instance network info cache due to event network-changed-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:38:09 np0005603623 nova_compute[226235]: 2026-01-31 08:38:09.256 226239 DEBUG oslo_concurrency.lockutils [req-58a4ea75-0315-4901-a514-2023caf0dac1 req-4e2203b7-6b4d-44c3-9ac5-8f8b790af27b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-91f04a93-ce38-4962-919b-e1ac0677b4da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:38:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:09.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:09.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:09 np0005603623 nova_compute[226235]: 2026-01-31 08:38:09.878 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Updating instance_info_cache with network_info: [{"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:38:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:09 np0005603623 podman[293429]: 2026-01-31 08:38:09.967322094 +0000 UTC m=+0.048199528 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 03:38:09 np0005603623 nova_compute[226235]: 2026-01-31 08:38:09.972 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:10 np0005603623 podman[293430]: 2026-01-31 08:38:10.01519853 +0000 UTC m=+0.095970431 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 31 03:38:10 np0005603623 nova_compute[226235]: 2026-01-31 08:38:10.809 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-91f04a93-ce38-4962-919b-e1ac0677b4da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:38:10 np0005603623 nova_compute[226235]: 2026-01-31 08:38:10.809 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:38:10 np0005603623 nova_compute[226235]: 2026-01-31 08:38:10.809 226239 DEBUG oslo_concurrency.lockutils [req-58a4ea75-0315-4901-a514-2023caf0dac1 req-4e2203b7-6b4d-44c3-9ac5-8f8b790af27b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-91f04a93-ce38-4962-919b-e1ac0677b4da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:38:10 np0005603623 nova_compute[226235]: 2026-01-31 08:38:10.810 226239 DEBUG nova.network.neutron [req-58a4ea75-0315-4901-a514-2023caf0dac1 req-4e2203b7-6b4d-44c3-9ac5-8f8b790af27b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Refreshing network info cache for port ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:38:10 np0005603623 nova_compute[226235]: 2026-01-31 08:38:10.810 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:10 np0005603623 nova_compute[226235]: 2026-01-31 08:38:10.811 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:10 np0005603623 nova_compute[226235]: 2026-01-31 08:38:10.811 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:10 np0005603623 nova_compute[226235]: 2026-01-31 08:38:10.812 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:10 np0005603623 nova_compute[226235]: 2026-01-31 08:38:10.812 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:10 np0005603623 nova_compute[226235]: 2026-01-31 08:38:10.812 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:11 np0005603623 nova_compute[226235]: 2026-01-31 08:38:11.036 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:11 np0005603623 nova_compute[226235]: 2026-01-31 08:38:11.036 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:11 np0005603623 nova_compute[226235]: 2026-01-31 08:38:11.037 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:11 np0005603623 nova_compute[226235]: 2026-01-31 08:38:11.037 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:38:11 np0005603623 nova_compute[226235]: 2026-01-31 08:38:11.037 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:38:11 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3123031748' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:38:11 np0005603623 nova_compute[226235]: 2026-01-31 08:38:11.480 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:11.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:11.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:11 np0005603623 nova_compute[226235]: 2026-01-31 08:38:11.798 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:12 np0005603623 nova_compute[226235]: 2026-01-31 08:38:12.706 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000091 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:38:12 np0005603623 nova_compute[226235]: 2026-01-31 08:38:12.706 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000091 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:38:12 np0005603623 nova_compute[226235]: 2026-01-31 08:38:12.833 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:38:12 np0005603623 nova_compute[226235]: 2026-01-31 08:38:12.835 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4164MB free_disk=20.830692291259766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:38:12 np0005603623 nova_compute[226235]: 2026-01-31 08:38:12.835 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:12 np0005603623 nova_compute[226235]: 2026-01-31 08:38:12.835 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:13.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:38:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:13.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:38:13 np0005603623 nova_compute[226235]: 2026-01-31 08:38:13.674 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:14 np0005603623 nova_compute[226235]: 2026-01-31 08:38:14.976 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:15 np0005603623 nova_compute[226235]: 2026-01-31 08:38:15.451 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 91f04a93-ce38-4962-919b-e1ac0677b4da actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:38:15 np0005603623 nova_compute[226235]: 2026-01-31 08:38:15.451 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:38:15 np0005603623 nova_compute[226235]: 2026-01-31 08:38:15.451 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:38:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:38:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:15.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:38:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:38:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:15.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:38:15 np0005603623 nova_compute[226235]: 2026-01-31 08:38:15.823 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:38:16 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3141412446' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:38:16 np0005603623 nova_compute[226235]: 2026-01-31 08:38:16.246 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:16 np0005603623 nova_compute[226235]: 2026-01-31 08:38:16.251 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:38:16 np0005603623 nova_compute[226235]: 2026-01-31 08:38:16.331 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:38:16 np0005603623 nova_compute[226235]: 2026-01-31 08:38:16.417 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:38:16 np0005603623 nova_compute[226235]: 2026-01-31 08:38:16.418 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:16 np0005603623 nova_compute[226235]: 2026-01-31 08:38:16.775 226239 DEBUG nova.network.neutron [req-58a4ea75-0315-4901-a514-2023caf0dac1 req-4e2203b7-6b4d-44c3-9ac5-8f8b790af27b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Updated VIF entry in instance network info cache for port ff4db50a-8c68-47ef-a6b0-c8caeab94ae5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:38:16 np0005603623 nova_compute[226235]: 2026-01-31 08:38:16.776 226239 DEBUG nova.network.neutron [req-58a4ea75-0315-4901-a514-2023caf0dac1 req-4e2203b7-6b4d-44c3-9ac5-8f8b790af27b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Updating instance_info_cache with network_info: [{"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:38:16 np0005603623 nova_compute[226235]: 2026-01-31 08:38:16.800 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:16 np0005603623 nova_compute[226235]: 2026-01-31 08:38:16.824 226239 DEBUG oslo_concurrency.lockutils [req-58a4ea75-0315-4901-a514-2023caf0dac1 req-4e2203b7-6b4d-44c3-9ac5-8f8b790af27b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-91f04a93-ce38-4962-919b-e1ac0677b4da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:38:17 np0005603623 nova_compute[226235]: 2026-01-31 08:38:17.017 226239 INFO nova.compute.manager [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Rebuilding instance#033[00m
Jan 31 03:38:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:17.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:38:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:17.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:38:18 np0005603623 nova_compute[226235]: 2026-01-31 08:38:18.010 226239 DEBUG nova.objects.instance [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 91f04a93-ce38-4962-919b-e1ac0677b4da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:38:18 np0005603623 nova_compute[226235]: 2026-01-31 08:38:18.144 226239 DEBUG nova.compute.manager [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:38:18 np0005603623 nova_compute[226235]: 2026-01-31 08:38:18.344 226239 DEBUG nova.objects.instance [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lazy-loading 'pci_requests' on Instance uuid 91f04a93-ce38-4962-919b-e1ac0677b4da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:38:18 np0005603623 nova_compute[226235]: 2026-01-31 08:38:18.466 226239 DEBUG nova.objects.instance [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 91f04a93-ce38-4962-919b-e1ac0677b4da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:38:18 np0005603623 nova_compute[226235]: 2026-01-31 08:38:18.542 226239 DEBUG nova.objects.instance [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lazy-loading 'resources' on Instance uuid 91f04a93-ce38-4962-919b-e1ac0677b4da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:38:18 np0005603623 nova_compute[226235]: 2026-01-31 08:38:18.655 226239 DEBUG nova.objects.instance [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lazy-loading 'migration_context' on Instance uuid 91f04a93-ce38-4962-919b-e1ac0677b4da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:38:18 np0005603623 nova_compute[226235]: 2026-01-31 08:38:18.690 226239 DEBUG nova.objects.instance [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 03:38:18 np0005603623 nova_compute[226235]: 2026-01-31 08:38:18.695 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:38:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:19.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:38:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:19.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:38:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:19 np0005603623 nova_compute[226235]: 2026-01-31 08:38:19.978 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:21 np0005603623 kernel: tapff4db50a-8c (unregistering): left promiscuous mode
Jan 31 03:38:21 np0005603623 NetworkManager[48970]: <info>  [1769848701.2794] device (tapff4db50a-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:38:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:21Z|00603|binding|INFO|Releasing lport ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 from this chassis (sb_readonly=0)
Jan 31 03:38:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:21Z|00604|binding|INFO|Setting lport ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 down in Southbound
Jan 31 03:38:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:21Z|00605|binding|INFO|Removing iface tapff4db50a-8c ovn-installed in OVS
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.286 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.298 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.337 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:82:ab 10.100.0.8'], port_security=['fa:16:3e:7b:82:ab 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '91f04a93-ce38-4962-919b-e1ac0677b4da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fcc39962194d44e5b37cad3fb1adc6c4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '71fde71b-c204-4df2-b3b2-d40465166772', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.241'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2a1d055-9eac-4b84-8a08-5dfefe7d7d79, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.339 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 in datapath 9ade8f79-180d-4cd9-82d7-f3d41cab1210 unbound from our chassis#033[00m
Jan 31 03:38:21 np0005603623 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000091.scope: Deactivated successfully.
Jan 31 03:38:21 np0005603623 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000091.scope: Consumed 13.558s CPU time.
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.342 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9ade8f79-180d-4cd9-82d7-f3d41cab1210, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.343 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7880f88a-90b6-49f3-a107-269a841ed31d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.344 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210 namespace which is not needed anymore#033[00m
Jan 31 03:38:21 np0005603623 systemd-machined[194379]: Machine qemu-68-instance-00000091 terminated.
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.413 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:21 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[293120]: [NOTICE]   (293124) : haproxy version is 2.8.14-c23fe91
Jan 31 03:38:21 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[293120]: [NOTICE]   (293124) : path to executable is /usr/sbin/haproxy
Jan 31 03:38:21 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[293120]: [WARNING]  (293124) : Exiting Master process...
Jan 31 03:38:21 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[293120]: [ALERT]    (293124) : Current worker (293126) exited with code 143 (Terminated)
Jan 31 03:38:21 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[293120]: [WARNING]  (293124) : All workers exited. Exiting... (0)
Jan 31 03:38:21 np0005603623 systemd[1]: libpod-248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9.scope: Deactivated successfully.
Jan 31 03:38:21 np0005603623 podman[293599]: 2026-01-31 08:38:21.487009589 +0000 UTC m=+0.064957311 container died 248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:38:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:21.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:21 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9-userdata-shm.mount: Deactivated successfully.
Jan 31 03:38:21 np0005603623 systemd[1]: var-lib-containers-storage-overlay-6fa60ecdc99345a913f6b19439d5f2be67b99ce0a3847bbe4d0c48c2047481a4-merged.mount: Deactivated successfully.
Jan 31 03:38:21 np0005603623 podman[293599]: 2026-01-31 08:38:21.546608992 +0000 UTC m=+0.124556724 container cleanup 248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:38:21 np0005603623 systemd[1]: libpod-conmon-248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9.scope: Deactivated successfully.
Jan 31 03:38:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:38:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:21.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:38:21 np0005603623 podman[293640]: 2026-01-31 08:38:21.594613583 +0000 UTC m=+0.031835417 container remove 248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.597 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[99400381-300b-4d71-a8d7-66a0043326c1]: (4, ('Sat Jan 31 08:38:21 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210 (248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9)\n248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9\nSat Jan 31 08:38:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210 (248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9)\n248ecfa6fbb4fc33eaffb01a309d45b9ec41bdef230198c9fb7331a79be41cd9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.599 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f92827-e87b-4723-b8ac-b51a41e747b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.600 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ade8f79-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.601 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:21 np0005603623 kernel: tap9ade8f79-10: left promiscuous mode
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.608 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.612 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[aacd028b-ae26-4ee6-8b5a-5495bbef81cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.626 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[493e93ad-2e41-483e-8dc1-96a873fc9818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.628 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d146f2a4-83a7-4478-99a3-cafa0273842d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.637 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1e99d2a2-5afc-4788-9569-0d9a7b46e1a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 779470, 'reachable_time': 28452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293659, 'error': None, 'target': 'ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.639 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:38:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:21.639 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[1932f8de-3515-456f-aaec-96a7e2f7e575]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:21 np0005603623 systemd[1]: run-netns-ovnmeta\x2d9ade8f79\x2d180d\x2d4cd9\x2d82d7\x2df3d41cab1210.mount: Deactivated successfully.
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.716 226239 INFO nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.720 226239 INFO nova.virt.libvirt.driver [-] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Instance destroyed successfully.#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.724 226239 INFO nova.virt.libvirt.driver [-] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Instance destroyed successfully.#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.725 226239 DEBUG nova.virt.libvirt.vif [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:37:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-331887853',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-781350970',id=145,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH5gkx3V7BznMz2Ml1SKHQUD/7nRL++Tb8fgaa8FBib96ELGecwEzyV24CRvXhcVt4CV7M8yM6O/exmt6u050Gx0p7dChbKR6iHlWT/0AVPPaPVFOCFMpSshNGBUdPtSXg==',key_name='tempest-keypair-1508930548',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:37:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fcc39962194d44e5b37cad3fb1adc6c4',ramdisk_id='',reservation_id='r-i2540rfe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsV293TestJSON-769176929',owner_user_name='tempest-ServerActionsV293TestJSON-769176929-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:38:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c13cac16cdef424b8050c4bea5a7e9c3',uuid=91f04a93-ce38-4962-919b-e1ac0677b4da,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.725 226239 DEBUG nova.network.os_vif_util [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Converting VIF {"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.726 226239 DEBUG nova.network.os_vif_util [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.726 226239 DEBUG os_vif [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.728 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.729 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff4db50a-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.730 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.731 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:21 np0005603623 nova_compute[226235]: 2026-01-31 08:38:21.733 226239 INFO os_vif [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c')#033[00m
Jan 31 03:38:22 np0005603623 nova_compute[226235]: 2026-01-31 08:38:22.181 226239 INFO nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Deleting instance files /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da_del#033[00m
Jan 31 03:38:22 np0005603623 nova_compute[226235]: 2026-01-31 08:38:22.182 226239 INFO nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Deletion of /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da_del complete#033[00m
Jan 31 03:38:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:23.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:38:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:23.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:38:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:24 np0005603623 nova_compute[226235]: 2026-01-31 08:38:24.980 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:25.062 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:38:25 np0005603623 nova_compute[226235]: 2026-01-31 08:38:25.063 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:25.064 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:38:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:38:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:25.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:38:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:38:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:25.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:38:25 np0005603623 nova_compute[226235]: 2026-01-31 08:38:25.643 226239 WARNING nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] During detach_volume, instance disappeared.: nova.exception.InstanceNotFound: Instance 91f04a93-ce38-4962-919b-e1ac0677b4da could not be found.#033[00m
Jan 31 03:38:25 np0005603623 nova_compute[226235]: 2026-01-31 08:38:25.941 226239 DEBUG nova.compute.manager [req-dcd5c27d-fa52-463d-9be7-bb3182d96178 req-7f7a0252-6a29-4978-b7ae-36e27af23a76 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event network-vif-unplugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:38:25 np0005603623 nova_compute[226235]: 2026-01-31 08:38:25.942 226239 DEBUG oslo_concurrency.lockutils [req-dcd5c27d-fa52-463d-9be7-bb3182d96178 req-7f7a0252-6a29-4978-b7ae-36e27af23a76 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:25 np0005603623 nova_compute[226235]: 2026-01-31 08:38:25.942 226239 DEBUG oslo_concurrency.lockutils [req-dcd5c27d-fa52-463d-9be7-bb3182d96178 req-7f7a0252-6a29-4978-b7ae-36e27af23a76 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:25 np0005603623 nova_compute[226235]: 2026-01-31 08:38:25.942 226239 DEBUG oslo_concurrency.lockutils [req-dcd5c27d-fa52-463d-9be7-bb3182d96178 req-7f7a0252-6a29-4978-b7ae-36e27af23a76 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:25 np0005603623 nova_compute[226235]: 2026-01-31 08:38:25.943 226239 DEBUG nova.compute.manager [req-dcd5c27d-fa52-463d-9be7-bb3182d96178 req-7f7a0252-6a29-4978-b7ae-36e27af23a76 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] No waiting events found dispatching network-vif-unplugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:38:25 np0005603623 nova_compute[226235]: 2026-01-31 08:38:25.943 226239 WARNING nova.compute.manager [req-dcd5c27d-fa52-463d-9be7-bb3182d96178 req-7f7a0252-6a29-4978-b7ae-36e27af23a76 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received unexpected event network-vif-unplugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 31 03:38:26 np0005603623 nova_compute[226235]: 2026-01-31 08:38:26.784 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:26 np0005603623 nova_compute[226235]: 2026-01-31 08:38:26.796 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:26 np0005603623 nova_compute[226235]: 2026-01-31 08:38:26.797 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:26 np0005603623 nova_compute[226235]: 2026-01-31 08:38:26.930 226239 DEBUG nova.compute.manager [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:38:27 np0005603623 nova_compute[226235]: 2026-01-31 08:38:27.125 226239 DEBUG nova.compute.manager [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Preparing to wait for external event volume-reimaged-7ff7e47c-5991-4995-b62d-9010ad81e5bf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:38:27 np0005603623 nova_compute[226235]: 2026-01-31 08:38:27.125 226239 DEBUG oslo_concurrency.lockutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:27 np0005603623 nova_compute[226235]: 2026-01-31 08:38:27.125 226239 DEBUG oslo_concurrency.lockutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:27 np0005603623 nova_compute[226235]: 2026-01-31 08:38:27.126 226239 DEBUG oslo_concurrency.lockutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:27 np0005603623 nova_compute[226235]: 2026-01-31 08:38:27.135 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:27 np0005603623 nova_compute[226235]: 2026-01-31 08:38:27.135 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:27 np0005603623 nova_compute[226235]: 2026-01-31 08:38:27.142 226239 DEBUG nova.virt.hardware [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:38:27 np0005603623 nova_compute[226235]: 2026-01-31 08:38:27.142 226239 INFO nova.compute.claims [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:38:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:27.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:38:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:27.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:38:28 np0005603623 nova_compute[226235]: 2026-01-31 08:38:28.063 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:28 np0005603623 nova_compute[226235]: 2026-01-31 08:38:28.346 226239 DEBUG nova.compute.manager [req-34f16f27-9105-449e-a910-10372528373e req-0c6db01e-0b39-4474-b09c-90c4e2fde1ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:38:28 np0005603623 nova_compute[226235]: 2026-01-31 08:38:28.347 226239 DEBUG oslo_concurrency.lockutils [req-34f16f27-9105-449e-a910-10372528373e req-0c6db01e-0b39-4474-b09c-90c4e2fde1ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:28 np0005603623 nova_compute[226235]: 2026-01-31 08:38:28.347 226239 DEBUG oslo_concurrency.lockutils [req-34f16f27-9105-449e-a910-10372528373e req-0c6db01e-0b39-4474-b09c-90c4e2fde1ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:28 np0005603623 nova_compute[226235]: 2026-01-31 08:38:28.347 226239 DEBUG oslo_concurrency.lockutils [req-34f16f27-9105-449e-a910-10372528373e req-0c6db01e-0b39-4474-b09c-90c4e2fde1ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:28 np0005603623 nova_compute[226235]: 2026-01-31 08:38:28.348 226239 DEBUG nova.compute.manager [req-34f16f27-9105-449e-a910-10372528373e req-0c6db01e-0b39-4474-b09c-90c4e2fde1ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] No event matching network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 in dict_keys([('volume-reimaged', '7ff7e47c-5991-4995-b62d-9010ad81e5bf')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 31 03:38:28 np0005603623 nova_compute[226235]: 2026-01-31 08:38:28.348 226239 WARNING nova.compute.manager [req-34f16f27-9105-449e-a910-10372528373e req-0c6db01e-0b39-4474-b09c-90c4e2fde1ee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received unexpected event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 31 03:38:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:38:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3411593780' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:38:28 np0005603623 nova_compute[226235]: 2026-01-31 08:38:28.466 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:28 np0005603623 nova_compute[226235]: 2026-01-31 08:38:28.472 226239 DEBUG nova.compute.provider_tree [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:38:28 np0005603623 nova_compute[226235]: 2026-01-31 08:38:28.595 226239 DEBUG nova.scheduler.client.report [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:38:28 np0005603623 nova_compute[226235]: 2026-01-31 08:38:28.697 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:28 np0005603623 nova_compute[226235]: 2026-01-31 08:38:28.699 226239 DEBUG nova.compute.manager [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:38:29 np0005603623 nova_compute[226235]: 2026-01-31 08:38:29.393 226239 DEBUG nova.compute.manager [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:38:29 np0005603623 nova_compute[226235]: 2026-01-31 08:38:29.394 226239 DEBUG nova.network.neutron [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:38:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:29.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:38:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:29.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:38:29 np0005603623 nova_compute[226235]: 2026-01-31 08:38:29.616 226239 DEBUG nova.policy [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '516e093a00a44667ba1308900be70d8d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '621c17d53cba46d386de8efb560a988e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:38:29 np0005603623 nova_compute[226235]: 2026-01-31 08:38:29.646 226239 INFO nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:38:29 np0005603623 nova_compute[226235]: 2026-01-31 08:38:29.855 226239 DEBUG nova.compute.manager [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:38:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:29 np0005603623 nova_compute[226235]: 2026-01-31 08:38:29.983 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:30.128 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:30.129 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:30.129 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.241 226239 DEBUG nova.compute.manager [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.242 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.243 226239 INFO nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Creating image(s)#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.266 226239 DEBUG nova.storage.rbd_utils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.292 226239 DEBUG nova.storage.rbd_utils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.319 226239 DEBUG nova.storage.rbd_utils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.322 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.373 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.374 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.374 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.374 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.398 226239 DEBUG nova.storage.rbd_utils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:38:30 np0005603623 nova_compute[226235]: 2026-01-31 08:38:30.401 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:31.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:38:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:31.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:38:31 np0005603623 nova_compute[226235]: 2026-01-31 08:38:31.575 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:31 np0005603623 nova_compute[226235]: 2026-01-31 08:38:31.641 226239 DEBUG nova.storage.rbd_utils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] resizing rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:38:31 np0005603623 nova_compute[226235]: 2026-01-31 08:38:31.736 226239 DEBUG nova.objects.instance [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'migration_context' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:38:31 np0005603623 nova_compute[226235]: 2026-01-31 08:38:31.786 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:31 np0005603623 nova_compute[226235]: 2026-01-31 08:38:31.818 226239 DEBUG nova.network.neutron [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Successfully created port: b122a11a-5b9d-4b27-a9c3-8327cb8162ae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:38:31 np0005603623 nova_compute[226235]: 2026-01-31 08:38:31.943 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:38:31 np0005603623 nova_compute[226235]: 2026-01-31 08:38:31.944 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Ensure instance console log exists: /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:38:31 np0005603623 nova_compute[226235]: 2026-01-31 08:38:31.944 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:31 np0005603623 nova_compute[226235]: 2026-01-31 08:38:31.944 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:31 np0005603623 nova_compute[226235]: 2026-01-31 08:38:31.945 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 31 03:38:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:33.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 31 03:38:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:33.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:34.065 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:34 np0005603623 nova_compute[226235]: 2026-01-31 08:38:34.985 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:35 np0005603623 nova_compute[226235]: 2026-01-31 08:38:35.067 226239 DEBUG nova.network.neutron [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Successfully updated port: b122a11a-5b9d-4b27-a9c3-8327cb8162ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:38:35 np0005603623 nova_compute[226235]: 2026-01-31 08:38:35.121 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:38:35 np0005603623 nova_compute[226235]: 2026-01-31 08:38:35.121 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquired lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:38:35 np0005603623 nova_compute[226235]: 2026-01-31 08:38:35.121 226239 DEBUG nova.network.neutron [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:38:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:38:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:35.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:38:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:35.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:35 np0005603623 nova_compute[226235]: 2026-01-31 08:38:35.663 226239 DEBUG nova.network.neutron [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:38:36 np0005603623 nova_compute[226235]: 2026-01-31 08:38:36.281 226239 DEBUG nova.compute.manager [req-6a7f3730-3455-4609-8354-5e805301a7a3 req-f0fb7605-7a72-4a51-b5df-ac8c33446886 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-changed-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:38:36 np0005603623 nova_compute[226235]: 2026-01-31 08:38:36.282 226239 DEBUG nova.compute.manager [req-6a7f3730-3455-4609-8354-5e805301a7a3 req-f0fb7605-7a72-4a51-b5df-ac8c33446886 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Refreshing instance network info cache due to event network-changed-b122a11a-5b9d-4b27-a9c3-8327cb8162ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:38:36 np0005603623 nova_compute[226235]: 2026-01-31 08:38:36.282 226239 DEBUG oslo_concurrency.lockutils [req-6a7f3730-3455-4609-8354-5e805301a7a3 req-f0fb7605-7a72-4a51-b5df-ac8c33446886 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:38:36 np0005603623 nova_compute[226235]: 2026-01-31 08:38:36.521 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848701.520652, 91f04a93-ce38-4962-919b-e1ac0677b4da => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:38:36 np0005603623 nova_compute[226235]: 2026-01-31 08:38:36.522 226239 INFO nova.compute.manager [-] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:38:36 np0005603623 nova_compute[226235]: 2026-01-31 08:38:36.609 226239 DEBUG nova.compute.manager [None req-9a5b91d4-b423-4079-8eae-882d06818c16 - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:38:36 np0005603623 nova_compute[226235]: 2026-01-31 08:38:36.788 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:38:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:37.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:38:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:37.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.783 226239 DEBUG nova.network.neutron [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.886 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Releasing lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.886 226239 DEBUG nova.compute.manager [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance network_info: |[{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.887 226239 DEBUG oslo_concurrency.lockutils [req-6a7f3730-3455-4609-8354-5e805301a7a3 req-f0fb7605-7a72-4a51-b5df-ac8c33446886 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.887 226239 DEBUG nova.network.neutron [req-6a7f3730-3455-4609-8354-5e805301a7a3 req-f0fb7605-7a72-4a51-b5df-ac8c33446886 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Refreshing network info cache for port b122a11a-5b9d-4b27-a9c3-8327cb8162ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.890 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Start _get_guest_xml network_info=[{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.895 226239 WARNING nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.901 226239 DEBUG nova.virt.libvirt.host [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.902 226239 DEBUG nova.virt.libvirt.host [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.908 226239 DEBUG nova.virt.libvirt.host [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.909 226239 DEBUG nova.virt.libvirt.host [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.910 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
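The two probe messages above come from Nova checking the legacy cgroups v1 hierarchy first and then the unified v2 hierarchy for a `cpu` controller. On a cgroups-v2 host that check reduces to looking for `cpu` in `/sys/fs/cgroup/cgroup.controllers`; the sketch below is illustrative, not Nova's actual code (the function name and sample file contents are assumptions):

```python
def has_cgroupsv2_cpu_controller(controllers_line: str) -> bool:
    """Return True when the space-separated cgroups v2 controllers list
    (the content of /sys/fs/cgroup/cgroup.controllers) includes 'cpu'."""
    return "cpu" in controllers_line.split()

# Typical content of cgroup.controllers on a cgroups-v2 host:
sample = "cpuset cpu io memory hugetlb pids rdma misc\n"
print(has_cgroupsv2_cpu_controller(sample))          # True
print(has_cgroupsv2_cpu_controller("memory pids\n")) # False
```

Using `split()` rather than a substring test matters: `"cpu" in "cpuset ..."` would match `cpuset` and report a false positive.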
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.911 226239 DEBUG nova.virt.hardware [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.911 226239 DEBUG nova.virt.hardware [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.911 226239 DEBUG nova.virt.hardware [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.912 226239 DEBUG nova.virt.hardware [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.912 226239 DEBUG nova.virt.hardware [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.912 226239 DEBUG nova.virt.hardware [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.912 226239 DEBUG nova.virt.hardware [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.913 226239 DEBUG nova.virt.hardware [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.913 226239 DEBUG nova.virt.hardware [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.913 226239 DEBUG nova.virt.hardware [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.913 226239 DEBUG nova.virt.hardware [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
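The "Build topologies" / "Possible topologies" sequence above is Nova enumerating every sockets x cores x threads factorization of the vCPU count that fits the per-dimension maxima (65536 each here, with no preference set). For a 1-vCPU m1.nano flavor the only candidate is 1:1:1, which is what the log reports. A simplified, brute-force sketch of that enumeration (Nova's real `_get_possible_cpu_topologies` is more involved, so treat this as an approximation):

```python
from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals the
    vCPU count, respecting the per-dimension maxima."""
    topologies = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    topologies.append(VirtCPUTopology(s, c, t))
    return topologies

print(possible_topologies(1))       # [VirtCPUTopology(sockets=1, cores=1, threads=1)]
print(len(possible_topologies(4)))  # 6 ordered factorizations of 4
```

With 4 vCPUs the six candidates are the ordered factorizations 4·1·1, 1·4·1, 1·1·4, 2·2·1, 2·1·2 and 1·2·2; the preferred/maximum constraints from the flavor and image then filter and sort that list.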
Jan 31 03:38:37 np0005603623 nova_compute[226235]: 2026-01-31 08:38:37.916 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:38:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:38:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2489682929' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.359 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.384 226239 DEBUG nova.storage.rbd_utils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.388 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:38:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:38:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1614227337' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.823 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
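Nova runs `ceph mon dump --format=json` here to learn the monitor addresses that later appear as the `<host>` elements of the RBD disks in the generated domain XML. Roughly, the JSON is parsed into (host, port) pairs; the sketch below uses an abbreviated, hand-written sample of the dump (the real output carries many more fields such as epoch, fsid and quorum features), so treat both the sample and the helper name as assumptions:

```python
import json

# Abbreviated, illustrative sample of `ceph mon dump --format=json` output.
mon_dump = json.loads("""
{"mons": [
  {"name": "a", "addr": "192.168.122.100:6789/0"},
  {"name": "b", "addr": "192.168.122.102:6789/0"},
  {"name": "c", "addr": "192.168.122.101:6789/0"}
]}
""")

def monitor_hosts(dump):
    """Extract (host, port) pairs from a monitor map, mirroring how the
    libvirt <host name=... port=...> elements get filled in."""
    pairs = []
    for mon in dump["mons"]:
        hostport = mon["addr"].split("/")[0]   # drop the trailing /nonce
        host, port = hostport.rsplit(":", 1)   # rsplit keeps IPv6 colons intact
        pairs.append((host, int(port)))
    return pairs

print(monitor_hosts(mon_dump))
# [('192.168.122.100', 6789), ('192.168.122.102', 6789), ('192.168.122.101', 6789)]
```

Note the two subprocess invocations in the log each cost ~0.44 s; Nova repeats the dump once per RBD-backed disk it configures.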
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.825 226239 DEBUG nova.virt.libvirt.vif [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:38:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1679208816',display_name='tempest-ServersNegativeTestJSON-server-1679208816',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1679208816',id=148,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='621c17d53cba46d386de8efb560a988e',ramdisk_id='',reservation_id='r-kclqkoza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-183161027',owner_user_name='tempest-ServersNegativeTestJSON-183161027-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:38:29Z,user_data=None,user_id='516e093a00a44667ba1308900be70d8d',uuid=bd87e542-0f7b-453e-b8d1-643ad6fb64f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.825 226239 DEBUG nova.network.os_vif_util [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converting VIF {"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.826 226239 DEBUG nova.network.os_vif_util [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.827 226239 DEBUG nova.objects.instance [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'pci_devices' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.881 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  <uuid>bd87e542-0f7b-453e-b8d1-643ad6fb64f0</uuid>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  <name>instance-00000094</name>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServersNegativeTestJSON-server-1679208816</nova:name>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:38:37</nova:creationTime>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <nova:user uuid="516e093a00a44667ba1308900be70d8d">tempest-ServersNegativeTestJSON-183161027-project-member</nova:user>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <nova:project uuid="621c17d53cba46d386de8efb560a988e">tempest-ServersNegativeTestJSON-183161027</nova:project>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <nova:port uuid="b122a11a-5b9d-4b27-a9c3-8327cb8162ae">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <entry name="serial">bd87e542-0f7b-453e-b8d1-643ad6fb64f0</entry>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <entry name="uuid">bd87e542-0f7b-453e-b8d1-643ad6fb64f0</entry>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk.config">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:c1:83:50"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <target dev="tapb122a11a-5b"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/console.log" append="off"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:38:38 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:38:38 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:38:38 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:38:38 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
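The domain XML that `_get_guest_xml` just emitted can be inspected with the Python standard library; the `nova:` metadata block lives in its own XML namespace, so lookups into it need a namespace map. The sketch below operates on a trimmed copy of the document above (only a subset of elements is reproduced, so the structure is an excerpt, not the full domain):

```python
import xml.etree.ElementTree as ET

# Trimmed excerpt of the generated domain XML logged above.
DOMAIN_XML = """
<domain type="kvm">
  <uuid>bd87e542-0f7b-453e-b8d1-643ad6fb64f0</uuid>
  <memory>131072</memory>
  <vcpu>1</vcpu>
  <metadata>
    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
      <nova:flavor name="m1.nano"/>
    </nova:instance>
  </metadata>
  <devices>
    <disk type="network" device="disk">
      <source protocol="rbd" name="vms/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk">
        <host name="192.168.122.100" port="6789"/>
        <host name="192.168.122.102" port="6789"/>
      </source>
    </disk>
  </devices>
</domain>
"""

NOVA_NS = {"nova": "http://openstack.org/xmlns/libvirt/nova/1.1"}

root = ET.fromstring(DOMAIN_XML)
print(root.findtext("uuid"))         # bd87e542-0f7b-453e-b8d1-643ad6fb64f0
print(int(root.findtext("memory")))  # 131072 (libvirt expresses this in KiB)
flavor = root.find("metadata/nova:instance/nova:flavor", NOVA_NS)
print(flavor.get("name"))            # m1.nano
print([h.get("name") for h in root.iter("host")])  # rbd monitor addresses
```

The same namespace-qualified lookup is what any tooling consuming `virsh dumpxml` output has to do to read Nova's instance metadata.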
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.882 226239 DEBUG nova.compute.manager [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Preparing to wait for external event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.882 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.883 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.883 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.883 226239 DEBUG nova.virt.libvirt.vif [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:38:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1679208816',display_name='tempest-ServersNegativeTestJSON-server-1679208816',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1679208816',id=148,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='621c17d53cba46d386de8efb560a988e',ramdisk_id='',reservation_id='r-kclqkoza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-183161027',owner_user_name='tempest-ServersNegativeTestJSON-183161027-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:38:29Z,user_data=None,user_id='516e093a00a44667ba1308900be70d8d',uuid=bd87e542-0f7b-453e-b8d1-643ad6fb64f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.884 226239 DEBUG nova.network.os_vif_util [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converting VIF {"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.884 226239 DEBUG nova.network.os_vif_util [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.885 226239 DEBUG os_vif [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.885 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.886 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.886 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.891 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.891 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb122a11a-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.891 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb122a11a-5b, col_values=(('external_ids', {'iface-id': 'b122a11a-5b9d-4b27-a9c3-8327cb8162ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:83:50', 'vm-uuid': 'bd87e542-0f7b-453e-b8d1-643ad6fb64f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.893 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.895 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:38:38 np0005603623 NetworkManager[48970]: <info>  [1769848718.8950] manager: (tapb122a11a-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.899 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:38 np0005603623 nova_compute[226235]: 2026-01-31 08:38:38.900 226239 INFO os_vif [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b')#033[00m
Jan 31 03:38:39 np0005603623 nova_compute[226235]: 2026-01-31 08:38:39.061 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:38:39 np0005603623 nova_compute[226235]: 2026-01-31 08:38:39.062 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:38:39 np0005603623 nova_compute[226235]: 2026-01-31 08:38:39.062 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] No VIF found with MAC fa:16:3e:c1:83:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:38:39 np0005603623 nova_compute[226235]: 2026-01-31 08:38:39.063 226239 INFO nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Using config drive#033[00m
Jan 31 03:38:39 np0005603623 nova_compute[226235]: 2026-01-31 08:38:39.205 226239 DEBUG nova.storage.rbd_utils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:38:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:39.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:38:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:39.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:38:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:39 np0005603623 nova_compute[226235]: 2026-01-31 08:38:39.986 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.093 226239 INFO nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Creating config drive at /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/disk.config#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.096 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpt4aqnszz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.221 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpt4aqnszz" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.248 226239 DEBUG nova.storage.rbd_utils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.252 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/disk.config bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.397 226239 DEBUG oslo_concurrency.processutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/disk.config bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.398 226239 INFO nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Deleting local config drive /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/disk.config because it was imported into RBD.#033[00m
Jan 31 03:38:40 np0005603623 kernel: tapb122a11a-5b: entered promiscuous mode
Jan 31 03:38:40 np0005603623 NetworkManager[48970]: <info>  [1769848720.4375] manager: (tapb122a11a-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/288)
Jan 31 03:38:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:40Z|00606|binding|INFO|Claiming lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae for this chassis.
Jan 31 03:38:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:40Z|00607|binding|INFO|b122a11a-5b9d-4b27-a9c3-8327cb8162ae: Claiming fa:16:3e:c1:83:50 10.100.0.11
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.438 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:40Z|00608|binding|INFO|Setting lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae ovn-installed in OVS
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.444 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.445 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:40 np0005603623 systemd-machined[194379]: New machine qemu-69-instance-00000094.
Jan 31 03:38:40 np0005603623 systemd[1]: Started Virtual Machine qemu-69-instance-00000094.
Jan 31 03:38:40 np0005603623 systemd-udevd[294088]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:38:40 np0005603623 NetworkManager[48970]: <info>  [1769848720.5032] device (tapb122a11a-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:38:40 np0005603623 NetworkManager[48970]: <info>  [1769848720.5036] device (tapb122a11a-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:38:40 np0005603623 podman[294061]: 2026-01-31 08:38:40.535594971 +0000 UTC m=+0.071496456 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible)
Jan 31 03:38:40 np0005603623 podman[294062]: 2026-01-31 08:38:40.571177742 +0000 UTC m=+0.107071186 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.648 226239 DEBUG nova.network.neutron [req-6a7f3730-3455-4609-8354-5e805301a7a3 req-f0fb7605-7a72-4a51-b5df-ac8c33446886 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updated VIF entry in instance network info cache for port b122a11a-5b9d-4b27-a9c3-8327cb8162ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.649 226239 DEBUG nova.network.neutron [req-6a7f3730-3455-4609-8354-5e805301a7a3 req-f0fb7605-7a72-4a51-b5df-ac8c33446886 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:38:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:40Z|00609|binding|INFO|Setting lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae up in Southbound
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.706 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:83:50 10.100.0.11'], port_security=['fa:16:3e:c1:83:50 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bd87e542-0f7b-453e-b8d1-643ad6fb64f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '621c17d53cba46d386de8efb560a988e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1c8dcf47-c169-4871-843e-ae38c0fc69f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bda2ce92-ce79-4f8b-b120-fd83adc645ef, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=b122a11a-5b9d-4b27-a9c3-8327cb8162ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.707 143258 INFO neutron.agent.ovn.metadata.agent [-] Port b122a11a-5b9d-4b27-a9c3-8327cb8162ae in datapath 550cf3a2-62ab-424d-afc0-3148a4a687ee bound to our chassis#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.709 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 550cf3a2-62ab-424d-afc0-3148a4a687ee#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.715 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4c22a254-e038-4765-8e4d-612eb05f77b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.716 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap550cf3a2-61 in ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.718 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap550cf3a2-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.718 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2037484f-c1dc-49c6-89b5-65e075214247]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.719 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2bc2d796-52cd-408c-a0dd-d2c8567f6845]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.726 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[9c472b91-d6c7-4301-8332-4e3ff7d84c59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.736 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[912d6d3e-1c91-4b5a-a03a-2b8654ebebc6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.756 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[601a3e48-6dbf-4f1b-97be-6bb624eeaa5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.757 226239 DEBUG oslo_concurrency.lockutils [req-6a7f3730-3455-4609-8354-5e805301a7a3 req-f0fb7605-7a72-4a51-b5df-ac8c33446886 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.763 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5d5dfeba-ff9f-403a-b712-fa38ab2f06ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 NetworkManager[48970]: <info>  [1769848720.7641] manager: (tap550cf3a2-60): new Veth device (/org/freedesktop/NetworkManager/Devices/289)
Jan 31 03:38:40 np0005603623 systemd-udevd[294099]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.790 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1950c02d-b4a6-4ed7-9e16-8a0db4919435]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.793 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[8522ba00-d02f-4c5e-af8a-2b54a7cc1498]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 NetworkManager[48970]: <info>  [1769848720.8136] device (tap550cf3a2-60): carrier: link connected
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.817 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[5c038bb3-3793-423c-a6ad-ed9b9abac2fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.829 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c171d588-e256-43cb-81c2-1ea2c8103073]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap550cf3a2-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:fc:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786124, 'reachable_time': 20790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294171, 'error': None, 'target': 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.842 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[afe1323b-1106-4253-9998-19108b7f49bd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:fc48'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 786124, 'tstamp': 786124}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294175, 'error': None, 'target': 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.858 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[397865a1-564d-4fa2-88b2-ace573972645]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap550cf3a2-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:fc:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786124, 'reachable_time': 20790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294177, 'error': None, 'target': 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.881 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c2413f3a-61cd-4f41-8da6-7d69948d3eb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.927 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[54f0d801-9a18-49b1-8f0d-0fa4bd992d3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.928 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap550cf3a2-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.928 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.929 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap550cf3a2-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.929 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848720.9292989, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.930 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Started (Lifecycle Event)#033[00m
Jan 31 03:38:40 np0005603623 NetworkManager[48970]: <info>  [1769848720.9312] manager: (tap550cf3a2-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/290)
Jan 31 03:38:40 np0005603623 kernel: tap550cf3a2-60: entered promiscuous mode
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.932 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.934 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap550cf3a2-60, col_values=(('external_ids', {'iface-id': '9f1ac82b-bf6c-400f-a03c-b15ad5392890'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.935 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:40 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:40Z|00610|binding|INFO|Releasing lport 9f1ac82b-bf6c-400f-a03c-b15ad5392890 from this chassis (sb_readonly=0)
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.937 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.937 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/550cf3a2-62ab-424d-afc0-3148a4a687ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/550cf3a2-62ab-424d-afc0-3148a4a687ee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.938 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[682c650a-c969-48ae-a942-db942a838e9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.939 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-550cf3a2-62ab-424d-afc0-3148a4a687ee
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/550cf3a2-62ab-424d-afc0-3148a4a687ee.pid.haproxy
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 550cf3a2-62ab-424d-afc0-3148a4a687ee
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:38:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:40.940 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'env', 'PROCESS_TAG=haproxy-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/550cf3a2-62ab-424d-afc0-3148a4a687ee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:38:40 np0005603623 nova_compute[226235]: 2026-01-31 08:38:40.941 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:41 np0005603623 nova_compute[226235]: 2026-01-31 08:38:41.131 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:38:41 np0005603623 nova_compute[226235]: 2026-01-31 08:38:41.134 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848720.9294024, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:38:41 np0005603623 nova_compute[226235]: 2026-01-31 08:38:41.134 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:38:41 np0005603623 podman[294213]: 2026-01-31 08:38:41.235356031 +0000 UTC m=+0.040563578 container create cad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:38:41 np0005603623 systemd[1]: Started libpod-conmon-cad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292.scope.
Jan 31 03:38:41 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:38:41 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f97f863101af7dab351d9b243b175beb9926797e921cc32decf95811d3debceb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:38:41 np0005603623 podman[294213]: 2026-01-31 08:38:41.286751908 +0000 UTC m=+0.091959465 container init cad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 03:38:41 np0005603623 podman[294213]: 2026-01-31 08:38:41.290786894 +0000 UTC m=+0.095994441 container start cad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 03:38:41 np0005603623 podman[294213]: 2026-01-31 08:38:41.213624842 +0000 UTC m=+0.018832409 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:38:41 np0005603623 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[294228]: [NOTICE]   (294232) : New worker (294234) forked
Jan 31 03:38:41 np0005603623 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[294228]: [NOTICE]   (294232) : Loading success.
Jan 31 03:38:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:41.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:41.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:41 np0005603623 nova_compute[226235]: 2026-01-31 08:38:41.967 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:38:41 np0005603623 nova_compute[226235]: 2026-01-31 08:38:41.971 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:38:42 np0005603623 nova_compute[226235]: 2026-01-31 08:38:42.916 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:38:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:43.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:43.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:43 np0005603623 nova_compute[226235]: 2026-01-31 08:38:43.893 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.682 226239 DEBUG nova.compute.manager [req-8bde392d-f1a9-44fe-a7f0-7a28dc553fd6 req-abcc67b2-e228-42c3-9816-7f8436c0b5b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.683 226239 DEBUG oslo_concurrency.lockutils [req-8bde392d-f1a9-44fe-a7f0-7a28dc553fd6 req-abcc67b2-e228-42c3-9816-7f8436c0b5b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.683 226239 DEBUG oslo_concurrency.lockutils [req-8bde392d-f1a9-44fe-a7f0-7a28dc553fd6 req-abcc67b2-e228-42c3-9816-7f8436c0b5b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.683 226239 DEBUG oslo_concurrency.lockutils [req-8bde392d-f1a9-44fe-a7f0-7a28dc553fd6 req-abcc67b2-e228-42c3-9816-7f8436c0b5b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.683 226239 DEBUG nova.compute.manager [req-8bde392d-f1a9-44fe-a7f0-7a28dc553fd6 req-abcc67b2-e228-42c3-9816-7f8436c0b5b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Processing event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.684 226239 DEBUG nova.compute.manager [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.687 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848724.6874094, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.687 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.689 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.692 226239 INFO nova.virt.libvirt.driver [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance spawned successfully.#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.692 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.789 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.793 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.793 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.793 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.794 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.794 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.794 226239 DEBUG nova.virt.libvirt.driver [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.798 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:38:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.938 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:38:44 np0005603623 nova_compute[226235]: 2026-01-31 08:38:44.988 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:45 np0005603623 nova_compute[226235]: 2026-01-31 08:38:45.299 226239 INFO nova.compute.manager [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Took 15.06 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:38:45 np0005603623 nova_compute[226235]: 2026-01-31 08:38:45.300 226239 DEBUG nova.compute.manager [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:38:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:45.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:45 np0005603623 nova_compute[226235]: 2026-01-31 08:38:45.570 226239 DEBUG nova.compute.manager [req-0561b464-8b14-41da-8c27-c76b9e73af38 req-580a0455-8070-4120-80ed-a39d06c813ee 9ec13114bad64754bf2a77073de5e283 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event volume-reimaged-7ff7e47c-5991-4995-b62d-9010ad81e5bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:38:45 np0005603623 nova_compute[226235]: 2026-01-31 08:38:45.571 226239 DEBUG oslo_concurrency.lockutils [req-0561b464-8b14-41da-8c27-c76b9e73af38 req-580a0455-8070-4120-80ed-a39d06c813ee 9ec13114bad64754bf2a77073de5e283 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:45 np0005603623 nova_compute[226235]: 2026-01-31 08:38:45.571 226239 DEBUG oslo_concurrency.lockutils [req-0561b464-8b14-41da-8c27-c76b9e73af38 req-580a0455-8070-4120-80ed-a39d06c813ee 9ec13114bad64754bf2a77073de5e283 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:45 np0005603623 nova_compute[226235]: 2026-01-31 08:38:45.571 226239 DEBUG oslo_concurrency.lockutils [req-0561b464-8b14-41da-8c27-c76b9e73af38 req-580a0455-8070-4120-80ed-a39d06c813ee 9ec13114bad64754bf2a77073de5e283 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:45 np0005603623 nova_compute[226235]: 2026-01-31 08:38:45.572 226239 DEBUG nova.compute.manager [req-0561b464-8b14-41da-8c27-c76b9e73af38 req-580a0455-8070-4120-80ed-a39d06c813ee 9ec13114bad64754bf2a77073de5e283 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Processing event volume-reimaged-7ff7e47c-5991-4995-b62d-9010ad81e5bf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:38:45 np0005603623 nova_compute[226235]: 2026-01-31 08:38:45.573 226239 DEBUG nova.compute.manager [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Instance event wait completed in 16 seconds for volume-reimaged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:38:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:45.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:45 np0005603623 nova_compute[226235]: 2026-01-31 08:38:45.815 226239 INFO nova.compute.manager [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Took 18.73 seconds to build instance.#033[00m
Jan 31 03:38:45 np0005603623 nova_compute[226235]: 2026-01-31 08:38:45.918 226239 INFO nova.virt.block_device [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Booting with volume 7ff7e47c-5991-4995-b62d-9010ad81e5bf at /dev/vda#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.125 226239 DEBUG oslo_concurrency.lockutils [None req-1cfb3bc5-3f4f-4b9e-93d2-b94da52ab61d 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.151 226239 DEBUG os_brick.utils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.152 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.162 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.162 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[5874916b-52fa-4863-9274-ef371bbf1713]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.164 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.170 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.170 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[1d46ae94-a5e1-4957-abf3-98408233711c]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.171 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.177 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.178 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[63ed7097-bc86-4f3d-95ce-a5da4fc21f8f]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.179 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[239e630c-dcf9-45de-aee4-da15354f4584]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.179 226239 DEBUG oslo_concurrency.processutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.197 226239 DEBUG oslo_concurrency.processutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CMD "nvme version" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.199 226239 DEBUG os_brick.initiator.connectors.lightos [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.199 226239 DEBUG os_brick.initiator.connectors.lightos [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.199 226239 DEBUG os_brick.initiator.connectors.lightos [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.200 226239 DEBUG os_brick.utils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] <== get_connector_properties: return (48ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:38:46 np0005603623 nova_compute[226235]: 2026-01-31 08:38:46.200 226239 DEBUG nova.virt.block_device [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Updating existing volume attachment record: 762750a1-b67d-4b15-a2ca-e926ab33383d _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:38:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:38:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:47.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:38:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:47.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:47 np0005603623 nova_compute[226235]: 2026-01-31 08:38:47.974 226239 DEBUG nova.compute.manager [req-2f55d546-e3be-47cc-9314-2f8fb2fd6cab req-a178b46b-b96a-4ce9-ab8f-7f1ea232658c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:38:47 np0005603623 nova_compute[226235]: 2026-01-31 08:38:47.975 226239 DEBUG oslo_concurrency.lockutils [req-2f55d546-e3be-47cc-9314-2f8fb2fd6cab req-a178b46b-b96a-4ce9-ab8f-7f1ea232658c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:47 np0005603623 nova_compute[226235]: 2026-01-31 08:38:47.975 226239 DEBUG oslo_concurrency.lockutils [req-2f55d546-e3be-47cc-9314-2f8fb2fd6cab req-a178b46b-b96a-4ce9-ab8f-7f1ea232658c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:47 np0005603623 nova_compute[226235]: 2026-01-31 08:38:47.975 226239 DEBUG oslo_concurrency.lockutils [req-2f55d546-e3be-47cc-9314-2f8fb2fd6cab req-a178b46b-b96a-4ce9-ab8f-7f1ea232658c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:47 np0005603623 nova_compute[226235]: 2026-01-31 08:38:47.976 226239 DEBUG nova.compute.manager [req-2f55d546-e3be-47cc-9314-2f8fb2fd6cab req-a178b46b-b96a-4ce9-ab8f-7f1ea232658c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] No waiting events found dispatching network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:38:47 np0005603623 nova_compute[226235]: 2026-01-31 08:38:47.976 226239 WARNING nova.compute.manager [req-2f55d546-e3be-47cc-9314-2f8fb2fd6cab req-a178b46b-b96a-4ce9-ab8f-7f1ea232658c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received unexpected event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae for instance with vm_state active and task_state None.#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.894 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.957 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.957 226239 INFO nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Creating image(s)#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.958 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.958 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Ensure instance console log exists: /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.958 226239 DEBUG oslo_concurrency.lockutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.959 226239 DEBUG oslo_concurrency.lockutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.959 226239 DEBUG oslo_concurrency.lockutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.962 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Start _get_guest_xml network_info=[{"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': '762750a1-b67d-4b15-a2ca-e926ab33383d', 'delete_on_termination': True, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-7ff7e47c-5991-4995-b62d-9010ad81e5bf', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '7ff7e47c-5991-4995-b62d-9010ad81e5bf', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '91f04a93-ce38-4962-919b-e1ac0677b4da', 'attached_at': '', 'detached_at': '', 'volume_id': '7ff7e47c-5991-4995-b62d-9010ad81e5bf', 'serial': '7ff7e47c-5991-4995-b62d-9010ad81e5bf'}, 'volume_type': None}], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.967 226239 WARNING nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.975 226239 DEBUG nova.virt.libvirt.host [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.976 226239 DEBUG nova.virt.libvirt.host [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.983 226239 DEBUG nova.virt.libvirt.host [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.984 226239 DEBUG nova.virt.libvirt.host [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.985 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.985 226239 DEBUG nova.virt.hardware [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.986 226239 DEBUG nova.virt.hardware [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.986 226239 DEBUG nova.virt.hardware [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.986 226239 DEBUG nova.virt.hardware [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.987 226239 DEBUG nova.virt.hardware [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.987 226239 DEBUG nova.virt.hardware [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.987 226239 DEBUG nova.virt.hardware [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.987 226239 DEBUG nova.virt.hardware [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.988 226239 DEBUG nova.virt.hardware [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.988 226239 DEBUG nova.virt.hardware [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.988 226239 DEBUG nova.virt.hardware [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:38:48 np0005603623 nova_compute[226235]: 2026-01-31 08:38:48.988 226239 DEBUG nova.objects.instance [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 91f04a93-ce38-4962-919b-e1ac0677b4da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.083 226239 DEBUG nova.storage.rbd_utils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] rbd image 91f04a93-ce38-4962-919b-e1ac0677b4da_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.087 226239 DEBUG oslo_concurrency.processutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:38:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1416927381' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.508 226239 DEBUG oslo_concurrency.processutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:49.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:49.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.825 226239 DEBUG nova.virt.libvirt.vif [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:37:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-331887853',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-781350970',id=145,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH5gkx3V7BznMz2Ml1SKHQUD/7nRL++Tb8fgaa8FBib96ELGecwEzyV24CRvXhcVt4CV7M8yM6O/exmt6u050Gx0p7dChbKR6iHlWT/0AVPPaPVFOCFMpSshNGBUdPtSXg==',key_name='tempest-keypair-1508930548',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:37:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fcc39962194d44e5b37cad3fb1adc6c4',ramdisk_id='',reservation_id='r-i2540rfe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsV293TestJSON-769176929',owner_user_name='tempest-ServerActionsV293TestJSON-769176929-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:38:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c13cac16cdef424b8050c4bea5a7e9c3',uuid=91f04a93-ce38-4962-919b-e1ac0677b4da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.826 226239 DEBUG nova.network.os_vif_util [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Converting VIF {"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.827 226239 DEBUG nova.network.os_vif_util [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.830 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  <uuid>91f04a93-ce38-4962-919b-e1ac0677b4da</uuid>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  <name>instance-00000091</name>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerActionsV293TestJSON-server-331887853</nova:name>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:38:48</nova:creationTime>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <nova:user uuid="c13cac16cdef424b8050c4bea5a7e9c3">tempest-ServerActionsV293TestJSON-769176929-project-member</nova:user>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <nova:project uuid="fcc39962194d44e5b37cad3fb1adc6c4">tempest-ServerActionsV293TestJSON-769176929</nova:project>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <nova:port uuid="ff4db50a-8c68-47ef-a6b0-c8caeab94ae5">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <entry name="serial">91f04a93-ce38-4962-919b-e1ac0677b4da</entry>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <entry name="uuid">91f04a93-ce38-4962-919b-e1ac0677b4da</entry>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/91f04a93-ce38-4962-919b-e1ac0677b4da_disk.config">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-7ff7e47c-5991-4995-b62d-9010ad81e5bf">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <serial>7ff7e47c-5991-4995-b62d-9010ad81e5bf</serial>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:7b:82:ab"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <target dev="tapff4db50a-8c"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/console.log" append="off"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:38:49 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:38:49 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:38:49 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:38:49 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.832 226239 DEBUG nova.virt.libvirt.vif [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:37:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-331887853',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-781350970',id=145,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH5gkx3V7BznMz2Ml1SKHQUD/7nRL++Tb8fgaa8FBib96ELGecwEzyV24CRvXhcVt4CV7M8yM6O/exmt6u050Gx0p7dChbKR6iHlWT/0AVPPaPVFOCFMpSshNGBUdPtSXg==',key_name='tempest-keypair-1508930548',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:37:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fcc39962194d44e5b37cad3fb1adc6c4',ramdisk_id='',reservation_id='r-i2540rfe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsV293TestJSON-769176929',owner_user_name='tempest-ServerActionsV293TestJSON-769176929-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:38:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c13cac16cdef424b8050c4bea5a7e9c3',uuid=91f04a93-ce38-4962-919b-e1ac0677b4da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.832 226239 DEBUG nova.network.os_vif_util [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Converting VIF {"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.833 226239 DEBUG nova.network.os_vif_util [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.833 226239 DEBUG os_vif [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.834 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.834 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.835 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.838 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.838 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff4db50a-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.838 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapff4db50a-8c, col_values=(('external_ids', {'iface-id': 'ff4db50a-8c68-47ef-a6b0-c8caeab94ae5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:82:ab', 'vm-uuid': '91f04a93-ce38-4962-919b-e1ac0677b4da'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.840 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:49 np0005603623 NetworkManager[48970]: <info>  [1769848729.8415] manager: (tapff4db50a-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.842 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.846 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.847 226239 INFO os_vif [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c')#033[00m
Jan 31 03:38:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:49 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #55. Immutable memtables: 11.
Jan 31 03:38:49 np0005603623 nova_compute[226235]: 2026-01-31 08:38:49.989 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:50 np0005603623 nova_compute[226235]: 2026-01-31 08:38:50.250 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:38:50 np0005603623 nova_compute[226235]: 2026-01-31 08:38:50.250 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:38:50 np0005603623 nova_compute[226235]: 2026-01-31 08:38:50.251 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] No VIF found with MAC fa:16:3e:7b:82:ab, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:38:50 np0005603623 nova_compute[226235]: 2026-01-31 08:38:50.252 226239 INFO nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Using config drive#033[00m
Jan 31 03:38:50 np0005603623 nova_compute[226235]: 2026-01-31 08:38:50.290 226239 DEBUG nova.storage.rbd_utils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] rbd image 91f04a93-ce38-4962-919b-e1ac0677b4da_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:38:50 np0005603623 nova_compute[226235]: 2026-01-31 08:38:50.384 226239 DEBUG nova.objects.instance [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 91f04a93-ce38-4962-919b-e1ac0677b4da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:38:50 np0005603623 nova_compute[226235]: 2026-01-31 08:38:50.461 226239 DEBUG nova.objects.instance [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lazy-loading 'keypairs' on Instance uuid 91f04a93-ce38-4962-919b-e1ac0677b4da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:38:51 np0005603623 nova_compute[226235]: 2026-01-31 08:38:51.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:51 np0005603623 nova_compute[226235]: 2026-01-31 08:38:51.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:38:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:51.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:51.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:51 np0005603623 nova_compute[226235]: 2026-01-31 08:38:51.676 226239 INFO nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Creating config drive at /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/disk.config#033[00m
Jan 31 03:38:51 np0005603623 nova_compute[226235]: 2026-01-31 08:38:51.680 226239 DEBUG oslo_concurrency.processutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp4eteo4p0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:51 np0005603623 nova_compute[226235]: 2026-01-31 08:38:51.803 226239 DEBUG oslo_concurrency.processutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp4eteo4p0" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:51 np0005603623 nova_compute[226235]: 2026-01-31 08:38:51.826 226239 DEBUG nova.storage.rbd_utils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] rbd image 91f04a93-ce38-4962-919b-e1ac0677b4da_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:38:51 np0005603623 nova_compute[226235]: 2026-01-31 08:38:51.830 226239 DEBUG oslo_concurrency.processutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/disk.config 91f04a93-ce38-4962-919b-e1ac0677b4da_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:38:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:38:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:38:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:38:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:38:51 np0005603623 nova_compute[226235]: 2026-01-31 08:38:51.968 226239 DEBUG oslo_concurrency.processutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/disk.config 91f04a93-ce38-4962-919b-e1ac0677b4da_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:51 np0005603623 nova_compute[226235]: 2026-01-31 08:38:51.969 226239 INFO nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Deleting local config drive /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da/disk.config because it was imported into RBD.#033[00m
Jan 31 03:38:52 np0005603623 kernel: tapff4db50a-8c: entered promiscuous mode
Jan 31 03:38:52 np0005603623 NetworkManager[48970]: <info>  [1769848732.0018] manager: (tapff4db50a-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/292)
Jan 31 03:38:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:52Z|00611|binding|INFO|Claiming lport ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 for this chassis.
Jan 31 03:38:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:52Z|00612|binding|INFO|ff4db50a-8c68-47ef-a6b0-c8caeab94ae5: Claiming fa:16:3e:7b:82:ab 10.100.0.8
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.005 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:52Z|00613|binding|INFO|Setting lport ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 ovn-installed in OVS
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.021 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:52 np0005603623 systemd-udevd[294500]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.025 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:52 np0005603623 systemd-machined[194379]: New machine qemu-70-instance-00000091.
Jan 31 03:38:52 np0005603623 NetworkManager[48970]: <info>  [1769848732.0356] device (tapff4db50a-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:38:52 np0005603623 NetworkManager[48970]: <info>  [1769848732.0366] device (tapff4db50a-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:38:52 np0005603623 systemd[1]: Started Virtual Machine qemu-70-instance-00000091.
Jan 31 03:38:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:52Z|00614|binding|INFO|Setting lport ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 up in Southbound
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.051 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:82:ab 10.100.0.8'], port_security=['fa:16:3e:7b:82:ab 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '91f04a93-ce38-4962-919b-e1ac0677b4da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fcc39962194d44e5b37cad3fb1adc6c4', 'neutron:revision_number': '5', 'neutron:security_group_ids': '71fde71b-c204-4df2-b3b2-d40465166772', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.241'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2a1d055-9eac-4b84-8a08-5dfefe7d7d79, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.052 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 in datapath 9ade8f79-180d-4cd9-82d7-f3d41cab1210 bound to our chassis#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.054 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9ade8f79-180d-4cd9-82d7-f3d41cab1210#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.062 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b33dc9f2-af0c-4dc9-98c8-31d8a0ade51e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.063 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9ade8f79-11 in ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.065 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9ade8f79-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.065 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[78c8dcb6-2520-4983-a1e3-58a8f8bd9fab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.065 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[97339752-13be-4f38-9057-1b0fab6f7643]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.074 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[4430ffba-54ad-41da-8d2d-5229bb25e14f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.084 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d000867e-b648-48da-9ffb-e02de3c421b7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.106 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[09c44887-7a42-4fba-a9d3-9cb9aec50165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 systemd-udevd[294502]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:38:52 np0005603623 NetworkManager[48970]: <info>  [1769848732.1210] manager: (tap9ade8f79-10): new Veth device (/org/freedesktop/NetworkManager/Devices/293)
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.120 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3d3855a7-705e-4d7f-8f71-8c50c92e809b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.146 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[7e565293-533a-4909-a45d-bcd9b1855921]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.151 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[2217bc0d-16c5-40a1-b3b7-0347b78c7d77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 NetworkManager[48970]: <info>  [1769848732.1636] device (tap9ade8f79-10): carrier: link connected
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.168 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[790162c8-258e-400d-bcd2-1863f371aaff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.178 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[131376ad-8c1b-4e3c-aeaf-be303d41fcaa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9ade8f79-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:8e:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 787259, 'reachable_time': 16687, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294533, 'error': None, 'target': 'ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.189 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[665ab016-462b-4706-8705-825273a59a0e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:8e96'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 787259, 'tstamp': 787259}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294534, 'error': None, 'target': 'ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.199 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[375eb205-58c1-413b-b206-c9167f941400]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9ade8f79-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:8e:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 787259, 'reachable_time': 16687, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294535, 'error': None, 'target': 'ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.221 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[03a5a55b-c9c4-442d-b65a-5687d9e94ba8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.281 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9d321042-ee58-452a-8800-b317cbef6718]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 kernel: tap9ade8f79-10: entered promiscuous mode
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.283 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ade8f79-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.283 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.283 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ade8f79-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:52 np0005603623 NetworkManager[48970]: <info>  [1769848732.2872] manager: (tap9ade8f79-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.287 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.288 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9ade8f79-10, col_values=(('external_ids', {'iface-id': '376b816c-9608-42bc-a912-f97f5e7fdb78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:52Z|00615|binding|INFO|Releasing lport 376b816c-9608-42bc-a912-f97f5e7fdb78 from this chassis (sb_readonly=0)
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.294 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9ade8f79-180d-4cd9-82d7-f3d41cab1210.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9ade8f79-180d-4cd9-82d7-f3d41cab1210.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.295 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.295 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[71cc2a41-53b7-4ce5-a11c-c9b3ee5e0f07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.297 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-9ade8f79-180d-4cd9-82d7-f3d41cab1210
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/9ade8f79-180d-4cd9-82d7-f3d41cab1210.pid.haproxy
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 9ade8f79-180d-4cd9-82d7-f3d41cab1210
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:38:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:38:52.298 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'env', 'PROCESS_TAG=haproxy-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9ade8f79-180d-4cd9-82d7-f3d41cab1210.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.570 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848732.5698051, 91f04a93-ce38-4962-919b-e1ac0677b4da => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.571 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.573 226239 DEBUG nova.compute.manager [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.574 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.578 226239 INFO nova.virt.libvirt.driver [-] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Instance spawned successfully.#033[00m
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.578 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:38:52 np0005603623 podman[294658]: 2026-01-31 08:38:52.638698332 +0000 UTC m=+0.051057617 container create 8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:38:52 np0005603623 systemd[1]: Started libpod-conmon-8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04.scope.
Jan 31 03:38:52 np0005603623 podman[294658]: 2026-01-31 08:38:52.610755238 +0000 UTC m=+0.023114543 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:38:52 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:38:52 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33200478a22694555f2ae52ce197798ea4076b37e9eab38b0edf273bf0c13756/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.726 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.729 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:38:52 np0005603623 podman[294658]: 2026-01-31 08:38:52.729962234 +0000 UTC m=+0.142321519 container init 8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:38:52 np0005603623 podman[294658]: 2026-01-31 08:38:52.738251233 +0000 UTC m=+0.150610518 container start 8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.744 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.745 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.745 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.745 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.746 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.746 226239 DEBUG nova.virt.libvirt.driver [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:38:52 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[294673]: [NOTICE]   (294677) : New worker (294679) forked
Jan 31 03:38:52 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[294673]: [NOTICE]   (294677) : Loading success.
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.770 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.771 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848732.5707319, 91f04a93-ce38-4962-919b-e1ac0677b4da => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.771 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] VM Started (Lifecycle Event)
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.870 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.873 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.950 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.996 226239 DEBUG nova.compute.manager [req-a630294c-5eb9-457d-8e8a-487edadc821c req-2551d624-1f2e-4a8e-b016-1cd061b656db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.997 226239 DEBUG oslo_concurrency.lockutils [req-a630294c-5eb9-457d-8e8a-487edadc821c req-2551d624-1f2e-4a8e-b016-1cd061b656db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.997 226239 DEBUG oslo_concurrency.lockutils [req-a630294c-5eb9-457d-8e8a-487edadc821c req-2551d624-1f2e-4a8e-b016-1cd061b656db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.997 226239 DEBUG oslo_concurrency.lockutils [req-a630294c-5eb9-457d-8e8a-487edadc821c req-2551d624-1f2e-4a8e-b016-1cd061b656db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.998 226239 DEBUG nova.compute.manager [req-a630294c-5eb9-457d-8e8a-487edadc821c req-2551d624-1f2e-4a8e-b016-1cd061b656db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] No waiting events found dispatching network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:38:52 np0005603623 nova_compute[226235]: 2026-01-31 08:38:52.998 226239 WARNING nova.compute.manager [req-a630294c-5eb9-457d-8e8a-487edadc821c req-2551d624-1f2e-4a8e-b016-1cd061b656db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received unexpected event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 for instance with vm_state active and task_state rebuild_spawning.
Jan 31 03:38:53 np0005603623 nova_compute[226235]: 2026-01-31 08:38:53.018 226239 DEBUG nova.compute.manager [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:38:53 np0005603623 nova_compute[226235]: 2026-01-31 08:38:53.232 226239 DEBUG oslo_concurrency.lockutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:38:53 np0005603623 nova_compute[226235]: 2026-01-31 08:38:53.234 226239 DEBUG oslo_concurrency.lockutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:38:53 np0005603623 nova_compute[226235]: 2026-01-31 08:38:53.234 226239 DEBUG nova.objects.instance [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 31 03:38:53 np0005603623 nova_compute[226235]: 2026-01-31 08:38:53.397 226239 DEBUG oslo_concurrency.lockutils [None req-0561b464-8b14-41da-8c27-c76b9e73af38 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:38:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:53.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:53.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:54 np0005603623 nova_compute[226235]: 2026-01-31 08:38:54.841 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:38:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:54 np0005603623 nova_compute[226235]: 2026-01-31 08:38:54.991 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.206 226239 DEBUG nova.compute.manager [req-2f015db5-1b7c-41f5-b31d-6d4e9d2d1715 req-5da667ec-38c6-436c-94d7-34969766a68d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.207 226239 DEBUG oslo_concurrency.lockutils [req-2f015db5-1b7c-41f5-b31d-6d4e9d2d1715 req-5da667ec-38c6-436c-94d7-34969766a68d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.207 226239 DEBUG oslo_concurrency.lockutils [req-2f015db5-1b7c-41f5-b31d-6d4e9d2d1715 req-5da667ec-38c6-436c-94d7-34969766a68d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.207 226239 DEBUG oslo_concurrency.lockutils [req-2f015db5-1b7c-41f5-b31d-6d4e9d2d1715 req-5da667ec-38c6-436c-94d7-34969766a68d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.208 226239 DEBUG nova.compute.manager [req-2f015db5-1b7c-41f5-b31d-6d4e9d2d1715 req-5da667ec-38c6-436c-94d7-34969766a68d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] No waiting events found dispatching network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.208 226239 WARNING nova.compute.manager [req-2f015db5-1b7c-41f5-b31d-6d4e9d2d1715 req-5da667ec-38c6-436c-94d7-34969766a68d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received unexpected event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 for instance with vm_state active and task_state None.
Jan 31 03:38:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:38:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:55.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:38:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:55.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.654 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.654 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.655 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 31 03:38:55 np0005603623 nova_compute[226235]: 2026-01-31 08:38:55.655 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #124. Immutable memtables: 0.
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.792332) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 77] Flushing memtable with next log file: 124
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848735792390, "job": 77, "event": "flush_started", "num_memtables": 1, "num_entries": 800, "num_deletes": 250, "total_data_size": 1491976, "memory_usage": 1509792, "flush_reason": "Manual Compaction"}
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 77] Level-0 flush table #125: started
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848735801109, "cf_name": "default", "job": 77, "event": "table_file_creation", "file_number": 125, "file_size": 985411, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 62545, "largest_seqno": 63340, "table_properties": {"data_size": 981599, "index_size": 1592, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 7780, "raw_average_key_size": 17, "raw_value_size": 973989, "raw_average_value_size": 2178, "num_data_blocks": 71, "num_entries": 447, "num_filter_entries": 447, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848682, "oldest_key_time": 1769848682, "file_creation_time": 1769848735, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 125, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 77] Flush lasted 8813 microseconds, and 2247 cpu microseconds.
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.801151) [db/flush_job.cc:967] [default] [JOB 77] Level-0 flush table #125: 985411 bytes OK
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.801170) [db/memtable_list.cc:519] [default] Level-0 commit table #125 started
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.804390) [db/memtable_list.cc:722] [default] Level-0 commit table #125: memtable #1 done
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.804414) EVENT_LOG_v1 {"time_micros": 1769848735804407, "job": 77, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.804463) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 77] Try to delete WAL files size 1487782, prev total WAL file size 1487782, number of live WAL files 2.
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000121.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.805038) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323530' seq:72057594037927935, type:22 .. '6B7600353031' seq:0, type:0; will stop at (end)
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 78] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 77 Base level 0, inputs: [125(962KB)], [123(10MB)]
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848735805099, "job": 78, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [125], "files_L6": [123], "score": -1, "input_data_size": 12366765, "oldest_snapshot_seqno": -1}
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 78] Generated table #126: 8603 keys, 11294965 bytes, temperature: kUnknown
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848735951364, "cf_name": "default", "job": 78, "event": "table_file_creation", "file_number": 126, "file_size": 11294965, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11239622, "index_size": 32703, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21573, "raw_key_size": 224939, "raw_average_key_size": 26, "raw_value_size": 11088858, "raw_average_value_size": 1288, "num_data_blocks": 1260, "num_entries": 8603, "num_filter_entries": 8603, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769848735, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 126, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.951730) [db/compaction/compaction_job.cc:1663] [default] [JOB 78] Compacted 1@0 + 1@6 files to L6 => 11294965 bytes
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.955934) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 84.5 rd, 77.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 10.9 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(24.0) write-amplify(11.5) OK, records in: 9116, records dropped: 513 output_compression: NoCompression
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.955983) EVENT_LOG_v1 {"time_micros": 1769848735955962, "job": 78, "event": "compaction_finished", "compaction_time_micros": 146416, "compaction_time_cpu_micros": 31683, "output_level": 6, "num_output_files": 1, "total_output_size": 11294965, "num_input_records": 9116, "num_output_records": 8603, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000125.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848735956314, "job": 78, "event": "table_file_deletion", "file_number": 125}
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000123.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848735958000, "job": 78, "event": "table_file_deletion", "file_number": 123}
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.804979) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.958039) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.958047) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.958052) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.958059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:55 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:38:55.958062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:57.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:57.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:57Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c1:83:50 10.100.0.11
Jan 31 03:38:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:38:57Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:83:50 10.100.0.11
Jan 31 03:38:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:38:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:38:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:38:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:59.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:38:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:38:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:38:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:59.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:38:59 np0005603623 nova_compute[226235]: 2026-01-31 08:38:59.844 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:59 np0005603623 nova_compute[226235]: 2026-01-31 08:38:59.993 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:00 np0005603623 nova_compute[226235]: 2026-01-31 08:39:00.050 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:39:00 np0005603623 nova_compute[226235]: 2026-01-31 08:39:00.236 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:39:00 np0005603623 nova_compute[226235]: 2026-01-31 08:39:00.236 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:39:00 np0005603623 nova_compute[226235]: 2026-01-31 08:39:00.236 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:00 np0005603623 nova_compute[226235]: 2026-01-31 08:39:00.237 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:00 np0005603623 nova_compute[226235]: 2026-01-31 08:39:00.237 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:00 np0005603623 nova_compute[226235]: 2026-01-31 08:39:00.387 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:00 np0005603623 nova_compute[226235]: 2026-01-31 08:39:00.387 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:00 np0005603623 nova_compute[226235]: 2026-01-31 08:39:00.387 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:00 np0005603623 nova_compute[226235]: 2026-01-31 08:39:00.387 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:39:00 np0005603623 nova_compute[226235]: 2026-01-31 08:39:00.387 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:39:00 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2196067898' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:39:00 np0005603623 nova_compute[226235]: 2026-01-31 08:39:00.844 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.152 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000091 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.152 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000091 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.155 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000094 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.155 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000094 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.318 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.320 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3985MB free_disk=20.700511932373047GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.321 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.321 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.544 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 91f04a93-ce38-4962-919b-e1ac0677b4da actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.547 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance bd87e542-0f7b-453e-b8d1-643ad6fb64f0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.547 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.547 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:39:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:01.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:01.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:01 np0005603623 nova_compute[226235]: 2026-01-31 08:39:01.647 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:39:02 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4106164928' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:39:02 np0005603623 nova_compute[226235]: 2026-01-31 08:39:02.058 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:02 np0005603623 nova_compute[226235]: 2026-01-31 08:39:02.063 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:39:02 np0005603623 nova_compute[226235]: 2026-01-31 08:39:02.118 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:39:02 np0005603623 nova_compute[226235]: 2026-01-31 08:39:02.163 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:39:02 np0005603623 nova_compute[226235]: 2026-01-31 08:39:02.164 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.843s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:03 np0005603623 nova_compute[226235]: 2026-01-31 08:39:03.533 226239 DEBUG nova.compute.manager [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 31 03:39:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:03.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:03.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:03 np0005603623 nova_compute[226235]: 2026-01-31 08:39:03.743 226239 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:03 np0005603623 nova_compute[226235]: 2026-01-31 08:39:03.744 226239 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:03 np0005603623 nova_compute[226235]: 2026-01-31 08:39:03.787 226239 DEBUG nova.objects.instance [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lazy-loading 'pci_requests' on Instance uuid 815ef28e-2297-49ba-88a1-23f722c3fa0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:39:03 np0005603623 nova_compute[226235]: 2026-01-31 08:39:03.829 226239 DEBUG nova.virt.hardware [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:39:03 np0005603623 nova_compute[226235]: 2026-01-31 08:39:03.830 226239 INFO nova.compute.claims [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:39:03 np0005603623 nova_compute[226235]: 2026-01-31 08:39:03.830 226239 DEBUG nova.objects.instance [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lazy-loading 'resources' on Instance uuid 815ef28e-2297-49ba-88a1-23f722c3fa0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:39:03 np0005603623 nova_compute[226235]: 2026-01-31 08:39:03.897 226239 DEBUG nova.objects.instance [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lazy-loading 'numa_topology' on Instance uuid 815ef28e-2297-49ba-88a1-23f722c3fa0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.082 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.082 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.083 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.149 226239 DEBUG nova.objects.instance [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lazy-loading 'pci_devices' on Instance uuid 815ef28e-2297-49ba-88a1-23f722c3fa0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.240 226239 INFO nova.compute.resource_tracker [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating resource usage from migration cf86d346-be26-4774-9a47-8138c96edea8#033[00m
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.241 226239 DEBUG nova.compute.resource_tracker [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Starting to track incoming migration cf86d346-be26-4774-9a47-8138c96edea8 with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.362 226239 DEBUG oslo_concurrency.processutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:39:04 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3527933210' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.800 226239 DEBUG oslo_concurrency.processutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.806 226239 DEBUG nova.compute.provider_tree [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.847 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.872 226239 DEBUG nova.scheduler.client.report [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:39:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.921 226239 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 1.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:04 np0005603623 nova_compute[226235]: 2026-01-31 08:39:04.921 226239 INFO nova.compute.manager [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Migrating#033[00m
Jan 31 03:39:05 np0005603623 nova_compute[226235]: 2026-01-31 08:39:05.030 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:05.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:05.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:06Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:82:ab 10.100.0.8
Jan 31 03:39:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:06Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:82:ab 10.100.0.8
Jan 31 03:39:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:07.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e340 e340: 3 total, 3 up, 3 in
Jan 31 03:39:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:07.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:09.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:09.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:09 np0005603623 nova_compute[226235]: 2026-01-31 08:39:09.850 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:10 np0005603623 nova_compute[226235]: 2026-01-31 08:39:10.033 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:10 np0005603623 nova_compute[226235]: 2026-01-31 08:39:10.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:10 np0005603623 nova_compute[226235]: 2026-01-31 08:39:10.192 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:10 np0005603623 systemd[1]: Created slice User Slice of UID 42436.
Jan 31 03:39:10 np0005603623 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 31 03:39:10 np0005603623 systemd-logind[795]: New session 63 of user nova.
Jan 31 03:39:10 np0005603623 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 31 03:39:10 np0005603623 systemd[1]: Starting User Manager for UID 42436...
Jan 31 03:39:10 np0005603623 systemd[294817]: Queued start job for default target Main User Target.
Jan 31 03:39:10 np0005603623 systemd[294817]: Created slice User Application Slice.
Jan 31 03:39:10 np0005603623 systemd[294817]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:39:10 np0005603623 systemd[294817]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 03:39:10 np0005603623 systemd[294817]: Reached target Paths.
Jan 31 03:39:10 np0005603623 systemd[294817]: Reached target Timers.
Jan 31 03:39:10 np0005603623 systemd[294817]: Starting D-Bus User Message Bus Socket...
Jan 31 03:39:10 np0005603623 systemd[294817]: Starting Create User's Volatile Files and Directories...
Jan 31 03:39:10 np0005603623 systemd[294817]: Listening on D-Bus User Message Bus Socket.
Jan 31 03:39:10 np0005603623 systemd[294817]: Finished Create User's Volatile Files and Directories.
Jan 31 03:39:10 np0005603623 systemd[294817]: Reached target Sockets.
Jan 31 03:39:10 np0005603623 systemd[294817]: Reached target Basic System.
Jan 31 03:39:10 np0005603623 systemd[294817]: Reached target Main User Target.
Jan 31 03:39:10 np0005603623 systemd[294817]: Startup finished in 132ms.
Jan 31 03:39:10 np0005603623 systemd[1]: Started User Manager for UID 42436.
Jan 31 03:39:10 np0005603623 systemd[1]: Started Session 63 of User nova.
Jan 31 03:39:10 np0005603623 systemd[1]: session-63.scope: Deactivated successfully.
Jan 31 03:39:10 np0005603623 systemd-logind[795]: Session 63 logged out. Waiting for processes to exit.
Jan 31 03:39:10 np0005603623 systemd-logind[795]: Removed session 63.
Jan 31 03:39:10 np0005603623 systemd-logind[795]: New session 65 of user nova.
Jan 31 03:39:10 np0005603623 systemd[1]: Started Session 65 of User nova.
Jan 31 03:39:10 np0005603623 systemd[1]: session-65.scope: Deactivated successfully.
Jan 31 03:39:10 np0005603623 systemd-logind[795]: Session 65 logged out. Waiting for processes to exit.
Jan 31 03:39:10 np0005603623 systemd-logind[795]: Removed session 65.
Jan 31 03:39:10 np0005603623 podman[294837]: 2026-01-31 08:39:10.781271913 +0000 UTC m=+0.063738280 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Jan 31 03:39:10 np0005603623 podman[294839]: 2026-01-31 08:39:10.829696083 +0000 UTC m=+0.112766609 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 03:39:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:11.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:11.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e341 e341: 3 total, 3 up, 3 in
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #127. Immutable memtables: 0.
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.374948) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 79] Flushing memtable with next log file: 127
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848753375266, "job": 79, "event": "flush_started", "num_memtables": 1, "num_entries": 491, "num_deletes": 251, "total_data_size": 622385, "memory_usage": 631688, "flush_reason": "Manual Compaction"}
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 79] Level-0 flush table #128: started
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e342 e342: 3 total, 3 up, 3 in
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848753380200, "cf_name": "default", "job": 79, "event": "table_file_creation", "file_number": 128, "file_size": 410188, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 63345, "largest_seqno": 63831, "table_properties": {"data_size": 407517, "index_size": 706, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6695, "raw_average_key_size": 19, "raw_value_size": 402015, "raw_average_value_size": 1158, "num_data_blocks": 30, "num_entries": 347, "num_filter_entries": 347, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848736, "oldest_key_time": 1769848736, "file_creation_time": 1769848753, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 128, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 79] Flush lasted 5323 microseconds, and 2907 cpu microseconds.
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.380268) [db/flush_job.cc:967] [default] [JOB 79] Level-0 flush table #128: 410188 bytes OK
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.380292) [db/memtable_list.cc:519] [default] Level-0 commit table #128 started
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.381746) [db/memtable_list.cc:722] [default] Level-0 commit table #128: memtable #1 done
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.381758) EVENT_LOG_v1 {"time_micros": 1769848753381754, "job": 79, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.381773) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 79] Try to delete WAL files size 619405, prev total WAL file size 619446, number of live WAL files 2.
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000124.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.382109) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035323731' seq:72057594037927935, type:22 .. '7061786F730035353233' seq:0, type:0; will stop at (end)
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 80] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 79 Base level 0, inputs: [128(400KB)], [126(10MB)]
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848753382141, "job": 80, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [128], "files_L6": [126], "score": -1, "input_data_size": 11705153, "oldest_snapshot_seqno": -1}
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 80] Generated table #129: 8433 keys, 9792545 bytes, temperature: kUnknown
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848753498022, "cf_name": "default", "job": 80, "event": "table_file_creation", "file_number": 129, "file_size": 9792545, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9739631, "index_size": 30721, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21125, "raw_key_size": 222139, "raw_average_key_size": 26, "raw_value_size": 9593017, "raw_average_value_size": 1137, "num_data_blocks": 1170, "num_entries": 8433, "num_filter_entries": 8433, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769848753, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 129, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.498345) [db/compaction/compaction_job.cc:1663] [default] [JOB 80] Compacted 1@0 + 1@6 files to L6 => 9792545 bytes
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.500174) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 100.9 rd, 84.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 10.8 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(52.4) write-amplify(23.9) OK, records in: 8950, records dropped: 517 output_compression: NoCompression
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.500217) EVENT_LOG_v1 {"time_micros": 1769848753500198, "job": 80, "event": "compaction_finished", "compaction_time_micros": 115974, "compaction_time_cpu_micros": 21109, "output_level": 6, "num_output_files": 1, "total_output_size": 9792545, "num_input_records": 8950, "num_output_records": 8433, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000128.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848753500415, "job": 80, "event": "table_file_deletion", "file_number": 128}
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000126.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848753502237, "job": 80, "event": "table_file_deletion", "file_number": 126}
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.382057) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.502308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.502314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.502315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.502317) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:39:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:39:13.502318) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:39:13 np0005603623 nova_compute[226235]: 2026-01-31 08:39:13.554 226239 DEBUG oslo_concurrency.lockutils [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:13 np0005603623 nova_compute[226235]: 2026-01-31 08:39:13.554 226239 DEBUG oslo_concurrency.lockutils [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:13 np0005603623 nova_compute[226235]: 2026-01-31 08:39:13.555 226239 DEBUG oslo_concurrency.lockutils [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:13 np0005603623 nova_compute[226235]: 2026-01-31 08:39:13.555 226239 DEBUG oslo_concurrency.lockutils [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:13 np0005603623 nova_compute[226235]: 2026-01-31 08:39:13.556 226239 DEBUG oslo_concurrency.lockutils [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:13 np0005603623 nova_compute[226235]: 2026-01-31 08:39:13.558 226239 INFO nova.compute.manager [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Terminating instance#033[00m
Jan 31 03:39:13 np0005603623 nova_compute[226235]: 2026-01-31 08:39:13.560 226239 DEBUG nova.compute.manager [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:39:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:39:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:13.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:39:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:39:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:13.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:39:14 np0005603623 kernel: tapff4db50a-8c (unregistering): left promiscuous mode
Jan 31 03:39:14 np0005603623 NetworkManager[48970]: <info>  [1769848754.1768] device (tapff4db50a-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.215 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:14 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:14Z|00616|binding|INFO|Releasing lport ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 from this chassis (sb_readonly=0)
Jan 31 03:39:14 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:14Z|00617|binding|INFO|Setting lport ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 down in Southbound
Jan 31 03:39:14 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:14Z|00618|binding|INFO|Removing iface tapff4db50a-8c ovn-installed in OVS
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.225 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:14 np0005603623 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000091.scope: Deactivated successfully.
Jan 31 03:39:14 np0005603623 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000091.scope: Consumed 12.910s CPU time.
Jan 31 03:39:14 np0005603623 systemd-machined[194379]: Machine qemu-70-instance-00000091 terminated.
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.306 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:82:ab 10.100.0.8'], port_security=['fa:16:3e:7b:82:ab 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '91f04a93-ce38-4962-919b-e1ac0677b4da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fcc39962194d44e5b37cad3fb1adc6c4', 'neutron:revision_number': '6', 'neutron:security_group_ids': '71fde71b-c204-4df2-b3b2-d40465166772', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.241', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2a1d055-9eac-4b84-8a08-5dfefe7d7d79, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.308 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 in datapath 9ade8f79-180d-4cd9-82d7-f3d41cab1210 unbound from our chassis#033[00m
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.311 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9ade8f79-180d-4cd9-82d7-f3d41cab1210, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.313 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3b50122a-87ed-4d62-927a-1c349b207ac2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.313 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210 namespace which is not needed anymore#033[00m
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.395 226239 INFO nova.virt.libvirt.driver [-] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Instance destroyed successfully.#033[00m
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.396 226239 DEBUG nova.objects.instance [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lazy-loading 'resources' on Instance uuid 91f04a93-ce38-4962-919b-e1ac0677b4da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:39:14 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[294673]: [NOTICE]   (294677) : haproxy version is 2.8.14-c23fe91
Jan 31 03:39:14 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[294673]: [NOTICE]   (294677) : path to executable is /usr/sbin/haproxy
Jan 31 03:39:14 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[294673]: [WARNING]  (294677) : Exiting Master process...
Jan 31 03:39:14 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[294673]: [WARNING]  (294677) : Exiting Master process...
Jan 31 03:39:14 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[294673]: [ALERT]    (294677) : Current worker (294679) exited with code 143 (Terminated)
Jan 31 03:39:14 np0005603623 neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210[294673]: [WARNING]  (294677) : All workers exited. Exiting... (0)
Jan 31 03:39:14 np0005603623 systemd[1]: libpod-8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04.scope: Deactivated successfully.
Jan 31 03:39:14 np0005603623 conmon[294673]: conmon 8ab641208845b111b953 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04.scope/container/memory.events
Jan 31 03:39:14 np0005603623 podman[294964]: 2026-01-31 08:39:14.436100188 +0000 UTC m=+0.044702273 container died 8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.452 226239 DEBUG nova.virt.libvirt.vif [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:37:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-331887853',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-781350970',id=145,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH5gkx3V7BznMz2Ml1SKHQUD/7nRL++Tb8fgaa8FBib96ELGecwEzyV24CRvXhcVt4CV7M8yM6O/exmt6u050Gx0p7dChbKR6iHlWT/0AVPPaPVFOCFMpSshNGBUdPtSXg==',key_name='tempest-keypair-1508930548',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:38:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fcc39962194d44e5b37cad3fb1adc6c4',ramdisk_id='',reservation_id='r-i2540rfe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='vi
rtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsV293TestJSON-769176929',owner_user_name='tempest-ServerActionsV293TestJSON-769176929-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:38:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c13cac16cdef424b8050c4bea5a7e9c3',uuid=91f04a93-ce38-4962-919b-e1ac0677b4da,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.452 226239 DEBUG nova.network.os_vif_util [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Converting VIF {"id": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "address": "fa:16:3e:7b:82:ab", "network": {"id": "9ade8f79-180d-4cd9-82d7-f3d41cab1210", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1618969782-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fcc39962194d44e5b37cad3fb1adc6c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4db50a-8c", "ovs_interfaceid": "ff4db50a-8c68-47ef-a6b0-c8caeab94ae5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.453 226239 DEBUG nova.network.os_vif_util [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.453 226239 DEBUG os_vif [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.455 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.456 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff4db50a-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.457 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.459 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.460 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.463 226239 INFO os_vif [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:82:ab,bridge_name='br-int',has_traffic_filtering=True,id=ff4db50a-8c68-47ef-a6b0-c8caeab94ae5,network=Network(9ade8f79-180d-4cd9-82d7-f3d41cab1210),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff4db50a-8c')#033[00m
Jan 31 03:39:14 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04-userdata-shm.mount: Deactivated successfully.
Jan 31 03:39:14 np0005603623 systemd[1]: var-lib-containers-storage-overlay-33200478a22694555f2ae52ce197798ea4076b37e9eab38b0edf273bf0c13756-merged.mount: Deactivated successfully.
Jan 31 03:39:14 np0005603623 podman[294964]: 2026-01-31 08:39:14.480244983 +0000 UTC m=+0.088847068 container cleanup 8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:39:14 np0005603623 systemd[1]: libpod-conmon-8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04.scope: Deactivated successfully.
Jan 31 03:39:14 np0005603623 podman[295009]: 2026-01-31 08:39:14.531598484 +0000 UTC m=+0.035712281 container remove 8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.535 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[959a0af8-2f7f-42c2-ad8d-fac6fa97a598]: (4, ('Sat Jan 31 08:39:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210 (8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04)\n8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04\nSat Jan 31 08:39:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210 (8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04)\n8ab641208845b111b953a73b9640515da63c31a5d6ce262d7a542f3f52758d04\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.536 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9dc5b48f-1e8d-4677-85c8-9710710b43c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.537 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ade8f79-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.539 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:14 np0005603623 kernel: tap9ade8f79-10: left promiscuous mode
Jan 31 03:39:14 np0005603623 nova_compute[226235]: 2026-01-31 08:39:14.544 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.546 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe15a1d-6780-4b08-bed2-84a07202e211]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.560 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3c2bca4e-78d8-4bbb-81ae-085f1b69c50d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.562 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d2f26ae6-bc90-4bb1-a687-cee9ae3671b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.572 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3b7e3e13-e5ba-4269-b5d7-e93dd425692e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 787253, 'reachable_time': 34433, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295036, 'error': None, 'target': 'ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.573 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9ade8f79-180d-4cd9-82d7-f3d41cab1210 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:39:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:14.573 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[1c0463b2-b667-4c00-b970-9e0cbaf44467]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:14 np0005603623 systemd[1]: run-netns-ovnmeta\x2d9ade8f79\x2d180d\x2d4cd9\x2d82d7\x2df3d41cab1210.mount: Deactivated successfully.
Jan 31 03:39:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:39:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/205876414' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:39:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:39:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/205876414' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:39:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:15 np0005603623 nova_compute[226235]: 2026-01-31 08:39:15.034 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:15 np0005603623 nova_compute[226235]: 2026-01-31 08:39:15.070 226239 DEBUG nova.compute.manager [req-aaba77bc-c5cb-43e7-92b7-ed45d8948f34 req-36d93f30-a927-4f2a-b26b-13067b66629e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:15 np0005603623 nova_compute[226235]: 2026-01-31 08:39:15.070 226239 DEBUG oslo_concurrency.lockutils [req-aaba77bc-c5cb-43e7-92b7-ed45d8948f34 req-36d93f30-a927-4f2a-b26b-13067b66629e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:15 np0005603623 nova_compute[226235]: 2026-01-31 08:39:15.070 226239 DEBUG oslo_concurrency.lockutils [req-aaba77bc-c5cb-43e7-92b7-ed45d8948f34 req-36d93f30-a927-4f2a-b26b-13067b66629e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:15 np0005603623 nova_compute[226235]: 2026-01-31 08:39:15.071 226239 DEBUG oslo_concurrency.lockutils [req-aaba77bc-c5cb-43e7-92b7-ed45d8948f34 req-36d93f30-a927-4f2a-b26b-13067b66629e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:15 np0005603623 nova_compute[226235]: 2026-01-31 08:39:15.071 226239 DEBUG nova.compute.manager [req-aaba77bc-c5cb-43e7-92b7-ed45d8948f34 req-36d93f30-a927-4f2a-b26b-13067b66629e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:15 np0005603623 nova_compute[226235]: 2026-01-31 08:39:15.071 226239 WARNING nova.compute.manager [req-aaba77bc-c5cb-43e7-92b7-ed45d8948f34 req-36d93f30-a927-4f2a-b26b-13067b66629e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 31 03:39:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:15.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:15.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:15 np0005603623 nova_compute[226235]: 2026-01-31 08:39:15.699 226239 INFO nova.network.neutron [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating port b62616fc-dd91-4cc2-b323-70fffebab4fb with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 31 03:39:15 np0005603623 nova_compute[226235]: 2026-01-31 08:39:15.902 226239 INFO nova.virt.libvirt.driver [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Deleting instance files /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da_del#033[00m
Jan 31 03:39:15 np0005603623 nova_compute[226235]: 2026-01-31 08:39:15.903 226239 INFO nova.virt.libvirt.driver [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Deletion of /var/lib/nova/instances/91f04a93-ce38-4962-919b-e1ac0677b4da_del complete#033[00m
Jan 31 03:39:16 np0005603623 nova_compute[226235]: 2026-01-31 08:39:16.021 226239 DEBUG nova.compute.manager [req-08cc7b9a-4c5e-436d-aca6-bacbb0500978 req-a0f3ab2c-0db8-4ca1-b2d2-cf12fbbf0ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event network-vif-unplugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:16 np0005603623 nova_compute[226235]: 2026-01-31 08:39:16.022 226239 DEBUG oslo_concurrency.lockutils [req-08cc7b9a-4c5e-436d-aca6-bacbb0500978 req-a0f3ab2c-0db8-4ca1-b2d2-cf12fbbf0ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:16 np0005603623 nova_compute[226235]: 2026-01-31 08:39:16.022 226239 DEBUG oslo_concurrency.lockutils [req-08cc7b9a-4c5e-436d-aca6-bacbb0500978 req-a0f3ab2c-0db8-4ca1-b2d2-cf12fbbf0ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:16 np0005603623 nova_compute[226235]: 2026-01-31 08:39:16.022 226239 DEBUG oslo_concurrency.lockutils [req-08cc7b9a-4c5e-436d-aca6-bacbb0500978 req-a0f3ab2c-0db8-4ca1-b2d2-cf12fbbf0ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:16 np0005603623 nova_compute[226235]: 2026-01-31 08:39:16.022 226239 DEBUG nova.compute.manager [req-08cc7b9a-4c5e-436d-aca6-bacbb0500978 req-a0f3ab2c-0db8-4ca1-b2d2-cf12fbbf0ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] No waiting events found dispatching network-vif-unplugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:16 np0005603623 nova_compute[226235]: 2026-01-31 08:39:16.022 226239 DEBUG nova.compute.manager [req-08cc7b9a-4c5e-436d-aca6-bacbb0500978 req-a0f3ab2c-0db8-4ca1-b2d2-cf12fbbf0ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event network-vif-unplugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:39:16 np0005603623 nova_compute[226235]: 2026-01-31 08:39:16.038 226239 INFO nova.compute.manager [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Took 2.48 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:39:16 np0005603623 nova_compute[226235]: 2026-01-31 08:39:16.040 226239 DEBUG oslo.service.loopingcall [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:39:16 np0005603623 nova_compute[226235]: 2026-01-31 08:39:16.041 226239 DEBUG nova.compute.manager [-] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:39:16 np0005603623 nova_compute[226235]: 2026-01-31 08:39:16.041 226239 DEBUG nova.network.neutron [-] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:39:17 np0005603623 nova_compute[226235]: 2026-01-31 08:39:17.173 226239 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquiring lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:39:17 np0005603623 nova_compute[226235]: 2026-01-31 08:39:17.174 226239 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquired lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:39:17 np0005603623 nova_compute[226235]: 2026-01-31 08:39:17.174 226239 DEBUG nova.network.neutron [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:39:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:39:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:17.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:39:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:17.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:17 np0005603623 nova_compute[226235]: 2026-01-31 08:39:17.966 226239 DEBUG nova.compute.manager [req-8f8a69e3-9aec-4a8d-ab09-213505569e77 req-429b3b12-ee13-4a10-b06b-7f882a1ea8e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:17 np0005603623 nova_compute[226235]: 2026-01-31 08:39:17.966 226239 DEBUG oslo_concurrency.lockutils [req-8f8a69e3-9aec-4a8d-ab09-213505569e77 req-429b3b12-ee13-4a10-b06b-7f882a1ea8e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:17 np0005603623 nova_compute[226235]: 2026-01-31 08:39:17.966 226239 DEBUG oslo_concurrency.lockutils [req-8f8a69e3-9aec-4a8d-ab09-213505569e77 req-429b3b12-ee13-4a10-b06b-7f882a1ea8e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:17 np0005603623 nova_compute[226235]: 2026-01-31 08:39:17.967 226239 DEBUG oslo_concurrency.lockutils [req-8f8a69e3-9aec-4a8d-ab09-213505569e77 req-429b3b12-ee13-4a10-b06b-7f882a1ea8e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:17 np0005603623 nova_compute[226235]: 2026-01-31 08:39:17.967 226239 DEBUG nova.compute.manager [req-8f8a69e3-9aec-4a8d-ab09-213505569e77 req-429b3b12-ee13-4a10-b06b-7f882a1ea8e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:17 np0005603623 nova_compute[226235]: 2026-01-31 08:39:17.967 226239 WARNING nova.compute.manager [req-8f8a69e3-9aec-4a8d-ab09-213505569e77 req-429b3b12-ee13-4a10-b06b-7f882a1ea8e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:39:18 np0005603623 nova_compute[226235]: 2026-01-31 08:39:18.101 226239 DEBUG nova.network.neutron [-] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:39:18 np0005603623 nova_compute[226235]: 2026-01-31 08:39:18.211 226239 INFO nova.compute.manager [-] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Took 2.17 seconds to deallocate network for instance.#033[00m
Jan 31 03:39:18 np0005603623 nova_compute[226235]: 2026-01-31 08:39:18.334 226239 DEBUG nova.compute.manager [req-6f0d21bd-efcf-4ee2-affa-744ede133e0c req-eaef525f-6bc3-4292-973d-1354151f089f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:18 np0005603623 nova_compute[226235]: 2026-01-31 08:39:18.335 226239 DEBUG oslo_concurrency.lockutils [req-6f0d21bd-efcf-4ee2-affa-744ede133e0c req-eaef525f-6bc3-4292-973d-1354151f089f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:18 np0005603623 nova_compute[226235]: 2026-01-31 08:39:18.335 226239 DEBUG oslo_concurrency.lockutils [req-6f0d21bd-efcf-4ee2-affa-744ede133e0c req-eaef525f-6bc3-4292-973d-1354151f089f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:18 np0005603623 nova_compute[226235]: 2026-01-31 08:39:18.335 226239 DEBUG oslo_concurrency.lockutils [req-6f0d21bd-efcf-4ee2-affa-744ede133e0c req-eaef525f-6bc3-4292-973d-1354151f089f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:18 np0005603623 nova_compute[226235]: 2026-01-31 08:39:18.335 226239 DEBUG nova.compute.manager [req-6f0d21bd-efcf-4ee2-affa-744ede133e0c req-eaef525f-6bc3-4292-973d-1354151f089f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] No waiting events found dispatching network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:18 np0005603623 nova_compute[226235]: 2026-01-31 08:39:18.336 226239 WARNING nova.compute.manager [req-6f0d21bd-efcf-4ee2-affa-744ede133e0c req-eaef525f-6bc3-4292-973d-1354151f089f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received unexpected event network-vif-plugged-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:39:18 np0005603623 nova_compute[226235]: 2026-01-31 08:39:18.336 226239 DEBUG nova.compute.manager [req-6f0d21bd-efcf-4ee2-affa-744ede133e0c req-eaef525f-6bc3-4292-973d-1354151f089f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Received event network-vif-deleted-ff4db50a-8c68-47ef-a6b0-c8caeab94ae5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:18 np0005603623 nova_compute[226235]: 2026-01-31 08:39:18.633 226239 INFO nova.compute.manager [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Took 0.42 seconds to detach 1 volumes for instance.#033[00m
Jan 31 03:39:18 np0005603623 nova_compute[226235]: 2026-01-31 08:39:18.635 226239 DEBUG nova.compute.manager [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Deleting volume: 7ff7e47c-5991-4995-b62d-9010ad81e5bf _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 31 03:39:19 np0005603623 nova_compute[226235]: 2026-01-31 08:39:19.244 226239 DEBUG oslo_concurrency.lockutils [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:19 np0005603623 nova_compute[226235]: 2026-01-31 08:39:19.245 226239 DEBUG oslo_concurrency.lockutils [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:19 np0005603623 nova_compute[226235]: 2026-01-31 08:39:19.458 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:19 np0005603623 nova_compute[226235]: 2026-01-31 08:39:19.471 226239 DEBUG oslo_concurrency.processutils [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:19.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:19.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:39:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2240426231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:39:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:19 np0005603623 nova_compute[226235]: 2026-01-31 08:39:19.907 226239 DEBUG oslo_concurrency.processutils [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:19 np0005603623 nova_compute[226235]: 2026-01-31 08:39:19.913 226239 DEBUG nova.compute.provider_tree [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:39:19 np0005603623 nova_compute[226235]: 2026-01-31 08:39:19.972 226239 DEBUG nova.scheduler.client.report [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.036 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.057 226239 DEBUG oslo_concurrency.lockutils [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.813s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.133 226239 INFO nova.scheduler.client.report [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Deleted allocations for instance 91f04a93-ce38-4962-919b-e1ac0677b4da#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.428 226239 DEBUG nova.compute.manager [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.428 226239 DEBUG nova.compute.manager [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing instance network info cache due to event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.428 226239 DEBUG oslo_concurrency.lockutils [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.529 226239 DEBUG oslo_concurrency.lockutils [None req-8e318f8a-b691-47af-b58a-a67507441698 c13cac16cdef424b8050c4bea5a7e9c3 fcc39962194d44e5b37cad3fb1adc6c4 - - default default] Lock "91f04a93-ce38-4962-919b-e1ac0677b4da" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.975s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.657 226239 DEBUG nova.network.neutron [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.743 226239 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Releasing lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.748 226239 DEBUG oslo_concurrency.lockutils [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.748 226239 DEBUG nova.network.neutron [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:39:20 np0005603623 systemd[1]: Stopping User Manager for UID 42436...
Jan 31 03:39:20 np0005603623 systemd[294817]: Activating special unit Exit the Session...
Jan 31 03:39:20 np0005603623 systemd[294817]: Stopped target Main User Target.
Jan 31 03:39:20 np0005603623 systemd[294817]: Stopped target Basic System.
Jan 31 03:39:20 np0005603623 systemd[294817]: Stopped target Paths.
Jan 31 03:39:20 np0005603623 systemd[294817]: Stopped target Sockets.
Jan 31 03:39:20 np0005603623 systemd[294817]: Stopped target Timers.
Jan 31 03:39:20 np0005603623 systemd[294817]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:39:20 np0005603623 systemd[294817]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 31 03:39:20 np0005603623 systemd[294817]: Closed D-Bus User Message Bus Socket.
Jan 31 03:39:20 np0005603623 systemd[294817]: Stopped Create User's Volatile Files and Directories.
Jan 31 03:39:20 np0005603623 systemd[294817]: Removed slice User Application Slice.
Jan 31 03:39:20 np0005603623 systemd[294817]: Reached target Shutdown.
Jan 31 03:39:20 np0005603623 systemd[294817]: Finished Exit the Session.
Jan 31 03:39:20 np0005603623 systemd[294817]: Reached target Exit the Session.
Jan 31 03:39:20 np0005603623 systemd[1]: user@42436.service: Deactivated successfully.
Jan 31 03:39:20 np0005603623 systemd[1]: Stopped User Manager for UID 42436.
Jan 31 03:39:20 np0005603623 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 31 03:39:20 np0005603623 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 31 03:39:20 np0005603623 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 31 03:39:20 np0005603623 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 31 03:39:20 np0005603623 systemd[1]: Removed slice User Slice of UID 42436.
Jan 31 03:39:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e343 e343: 3 total, 3 up, 3 in
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.974 226239 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.975 226239 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:39:20 np0005603623 nova_compute[226235]: 2026-01-31 08:39:20.975 226239 INFO nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Creating image(s)#033[00m
Jan 31 03:39:21 np0005603623 nova_compute[226235]: 2026-01-31 08:39:21.135 226239 DEBUG nova.storage.rbd_utils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] creating snapshot(nova-resize) on rbd image(815ef28e-2297-49ba-88a1-23f722c3fa0a_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:39:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:21.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:21.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e344 e344: 3 total, 3 up, 3 in
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.515 226239 DEBUG nova.objects.instance [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 815ef28e-2297-49ba-88a1-23f722c3fa0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.673 226239 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.673 226239 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Ensure instance console log exists: /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.674 226239 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.674 226239 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.674 226239 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.677 226239 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Start _get_guest_xml network_info=[{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1478468357", "vif_mac": "fa:16:3e:44:41:0a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.680 226239 WARNING nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.693 226239 DEBUG nova.virt.libvirt.host [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.694 226239 DEBUG nova.virt.libvirt.host [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.697 226239 DEBUG nova.virt.libvirt.host [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.698 226239 DEBUG nova.virt.libvirt.host [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.699 226239 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.699 226239 DEBUG nova.virt.hardware [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.699 226239 DEBUG nova.virt.hardware [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.700 226239 DEBUG nova.virt.hardware [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.700 226239 DEBUG nova.virt.hardware [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.700 226239 DEBUG nova.virt.hardware [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.700 226239 DEBUG nova.virt.hardware [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.700 226239 DEBUG nova.virt.hardware [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.701 226239 DEBUG nova.virt.hardware [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.701 226239 DEBUG nova.virt.hardware [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.701 226239 DEBUG nova.virt.hardware [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.701 226239 DEBUG nova.virt.hardware [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.702 226239 DEBUG nova.objects.instance [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 815ef28e-2297-49ba-88a1-23f722c3fa0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:39:22 np0005603623 nova_compute[226235]: 2026-01-31 08:39:22.769 226239 DEBUG oslo_concurrency.processutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:39:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1049171139' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.206 226239 DEBUG oslo_concurrency.processutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.247 226239 DEBUG oslo_concurrency.processutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:23.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:23.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:39:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4292125182' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.663 226239 DEBUG oslo_concurrency.processutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.665 226239 DEBUG nova.virt.libvirt.vif [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:37:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2115362642',display_name='tempest-TestNetworkAdvancedServerOps-server-2115362642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2115362642',id=146,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNl3t7Tpw2ZGAKx5tF0XRvkA5V1qQ9xszE6olcVe0qbeqbBI1oq6Zjq+3DsZDE5JpsKvfdWgNEpJ9rXaTzL6wLNNLR+GbnRbZpWjtFLfeYVgQUQ4VVzMWaZiV7/jRigBGA==',key_name='tempest-TestNetworkAdvancedServerOps-1092980412',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:38:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-4f40bqke',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:39:15Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=815ef28e-2297-49ba-88a1-23f722c3fa0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1478468357", "vif_mac": "fa:16:3e:44:41:0a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.665 226239 DEBUG nova.network.os_vif_util [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Converting VIF {"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1478468357", "vif_mac": "fa:16:3e:44:41:0a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.666 226239 DEBUG nova.network.os_vif_util [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.668 226239 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  <uuid>815ef28e-2297-49ba-88a1-23f722c3fa0a</uuid>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  <name>instance-00000092</name>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-2115362642</nova:name>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:39:22</nova:creationTime>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <nova:user uuid="f1c6e7eff11b435a81429826a682b32f">tempest-TestNetworkAdvancedServerOps-840410497-project-member</nova:user>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <nova:project uuid="0bfe11bd9d694684b527666e2c378eed">tempest-TestNetworkAdvancedServerOps-840410497</nova:project>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <nova:port uuid="b62616fc-dd91-4cc2-b323-70fffebab4fb">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <entry name="serial">815ef28e-2297-49ba-88a1-23f722c3fa0a</entry>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <entry name="uuid">815ef28e-2297-49ba-88a1-23f722c3fa0a</entry>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/815ef28e-2297-49ba-88a1-23f722c3fa0a_disk">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/815ef28e-2297-49ba-88a1-23f722c3fa0a_disk.config">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:44:41:0a"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <target dev="tapb62616fc-dd"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/console.log" append="off"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:39:23 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:39:23 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:39:23 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:39:23 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.670 226239 DEBUG nova.virt.libvirt.vif [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:37:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2115362642',display_name='tempest-TestNetworkAdvancedServerOps-server-2115362642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2115362642',id=146,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNl3t7Tpw2ZGAKx5tF0XRvkA5V1qQ9xszE6olcVe0qbeqbBI1oq6Zjq+3DsZDE5JpsKvfdWgNEpJ9rXaTzL6wLNNLR+GbnRbZpWjtFLfeYVgQUQ4VVzMWaZiV7/jRigBGA==',key_name='tempest-TestNetworkAdvancedServerOps-1092980412',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:38:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-4f40bqke',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:39:15Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=815ef28e-2297-49ba-88a1-23f722c3fa0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1478468357", "vif_mac": "fa:16:3e:44:41:0a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.670 226239 DEBUG nova.network.os_vif_util [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Converting VIF {"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1478468357", "vif_mac": "fa:16:3e:44:41:0a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.670 226239 DEBUG nova.network.os_vif_util [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.671 226239 DEBUG os_vif [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.671 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.672 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.672 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.674 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.674 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb62616fc-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.674 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb62616fc-dd, col_values=(('external_ids', {'iface-id': 'b62616fc-dd91-4cc2-b323-70fffebab4fb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:41:0a', 'vm-uuid': '815ef28e-2297-49ba-88a1-23f722c3fa0a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:23 np0005603623 NetworkManager[48970]: <info>  [1769848763.6769] manager: (tapb62616fc-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.679 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.681 226239 INFO os_vif [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd')#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.978 226239 DEBUG nova.network.neutron [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updated VIF entry in instance network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:39:23 np0005603623 nova_compute[226235]: 2026-01-31 08:39:23.979 226239 DEBUG nova.network.neutron [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:39:24 np0005603623 nova_compute[226235]: 2026-01-31 08:39:24.127 226239 DEBUG oslo_concurrency.lockutils [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:39:24 np0005603623 nova_compute[226235]: 2026-01-31 08:39:24.129 226239 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:39:24 np0005603623 nova_compute[226235]: 2026-01-31 08:39:24.129 226239 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:39:24 np0005603623 nova_compute[226235]: 2026-01-31 08:39:24.129 226239 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] No VIF found with MAC fa:16:3e:44:41:0a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:39:24 np0005603623 nova_compute[226235]: 2026-01-31 08:39:24.130 226239 INFO nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Using config drive#033[00m
Jan 31 03:39:24 np0005603623 kernel: tapb62616fc-dd: entered promiscuous mode
Jan 31 03:39:24 np0005603623 NetworkManager[48970]: <info>  [1769848764.2431] manager: (tapb62616fc-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/296)
Jan 31 03:39:24 np0005603623 nova_compute[226235]: 2026-01-31 08:39:24.242 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:24 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:24Z|00619|binding|INFO|Claiming lport b62616fc-dd91-4cc2-b323-70fffebab4fb for this chassis.
Jan 31 03:39:24 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:24Z|00620|binding|INFO|b62616fc-dd91-4cc2-b323-70fffebab4fb: Claiming fa:16:3e:44:41:0a 10.100.0.3
Jan 31 03:39:24 np0005603623 nova_compute[226235]: 2026-01-31 08:39:24.251 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:24 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:24Z|00621|binding|INFO|Setting lport b62616fc-dd91-4cc2-b323-70fffebab4fb ovn-installed in OVS
Jan 31 03:39:24 np0005603623 systemd-udevd[295231]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:39:24 np0005603623 systemd-machined[194379]: New machine qemu-71-instance-00000092.
Jan 31 03:39:24 np0005603623 NetworkManager[48970]: <info>  [1769848764.2765] device (tapb62616fc-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:39:24 np0005603623 NetworkManager[48970]: <info>  [1769848764.2774] device (tapb62616fc-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:39:24 np0005603623 systemd[1]: Started Virtual Machine qemu-71-instance-00000092.
Jan 31 03:39:24 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:24Z|00622|binding|INFO|Setting lport b62616fc-dd91-4cc2-b323-70fffebab4fb up in Southbound
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.434 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:41:0a 10.100.0.3'], port_security=['fa:16:3e:44:41:0a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '815ef28e-2297-49ba-88a1-23f722c3fa0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d238e24-9954-4b32-b589-6db6c8760a3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'e09d3566-f99d-4e7a-854e-68c93732c8e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09414a4b-4852-4431-b971-0c29958bdb7a, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=b62616fc-dd91-4cc2-b323-70fffebab4fb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.435 143258 INFO neutron.agent.ovn.metadata.agent [-] Port b62616fc-dd91-4cc2-b323-70fffebab4fb in datapath 3d238e24-9954-4b32-b589-6db6c8760a3f bound to our chassis#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.437 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3d238e24-9954-4b32-b589-6db6c8760a3f#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.444 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c6431301-864c-48f2-8059-4988a3bcc0cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.445 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3d238e24-91 in ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.447 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3d238e24-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.447 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7a93d8e6-d9f2-43fa-94cf-51db43ee4f15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.448 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e53df6-1918-4a42-a717-7b678d312e43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.457 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[4251001f-2770-4d32-8d23-74f69b4a87eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.465 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d430094a-661a-4625-94c8-b68418d04ce2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.492 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d5dcf052-e5bb-4d1d-87b1-9d7e1408ff93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 systemd-udevd[295234]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.496 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[55e15827-65ee-4ff8-9790-f58027dbbe59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 NetworkManager[48970]: <info>  [1769848764.4973] manager: (tap3d238e24-90): new Veth device (/org/freedesktop/NetworkManager/Devices/297)
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.519 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[a9124fca-d687-4ad4-ad8b-0a276440a72e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.522 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c8bf26a4-6876-42b9-9311-3dea4fc32a5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 NetworkManager[48970]: <info>  [1769848764.5374] device (tap3d238e24-90): carrier: link connected
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.541 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc50e5b-19b5-4693-8544-80708690bb24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.555 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3b251229-2af2-425c-af9e-31324ee0877b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d238e24-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:78:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 185], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 790496, 'reachable_time': 23853, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295266, 'error': None, 'target': 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.565 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ed2796c2-d102-41a7-a336-81f2d93a8728]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedf:782a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 790496, 'tstamp': 790496}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 295267, 'error': None, 'target': 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.574 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dcebdba8-a313-4eaf-9932-6411ae1ca4bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d238e24-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:78:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 185], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 790496, 'reachable_time': 23853, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 295268, 'error': None, 'target': 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.596 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b826c7e8-496e-4dce-938a-ac9810bc4516]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.636 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ef0d2255-eab5-4f0f-bfe5-27b2e7743970]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.638 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d238e24-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.638 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.638 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3d238e24-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:24 np0005603623 nova_compute[226235]: 2026-01-31 08:39:24.640 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:24 np0005603623 NetworkManager[48970]: <info>  [1769848764.6407] manager: (tap3d238e24-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Jan 31 03:39:24 np0005603623 kernel: tap3d238e24-90: entered promiscuous mode
Jan 31 03:39:24 np0005603623 nova_compute[226235]: 2026-01-31 08:39:24.641 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.642 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3d238e24-90, col_values=(('external_ids', {'iface-id': '7571c123-c1d9-4ad9-a9c7-718eca889c7b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:24 np0005603623 nova_compute[226235]: 2026-01-31 08:39:24.643 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:24 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:24Z|00623|binding|INFO|Releasing lport 7571c123-c1d9-4ad9-a9c7-718eca889c7b from this chassis (sb_readonly=1)
Jan 31 03:39:24 np0005603623 nova_compute[226235]: 2026-01-31 08:39:24.644 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.644 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3d238e24-9954-4b32-b589-6db6c8760a3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3d238e24-9954-4b32-b589-6db6c8760a3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.645 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[862976c8-94cb-41d7-ac98-427fc81acec9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.646 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-3d238e24-9954-4b32-b589-6db6c8760a3f
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/3d238e24-9954-4b32-b589-6db6c8760a3f.pid.haproxy
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 3d238e24-9954-4b32-b589-6db6c8760a3f
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:39:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:24.648 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'env', 'PROCESS_TAG=haproxy-3d238e24-9954-4b32-b589-6db6c8760a3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3d238e24-9954-4b32-b589-6db6c8760a3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:39:24 np0005603623 nova_compute[226235]: 2026-01-31 08:39:24.648 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:24 np0005603623 podman[295300]: 2026-01-31 08:39:24.962649415 +0000 UTC m=+0.041062430 container create 973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:39:24 np0005603623 systemd[1]: Started libpod-conmon-973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad.scope.
Jan 31 03:39:25 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:39:25 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c2db3e6bd362e594c99610791ad887b9dd2bb1175d9bf61bdfc03b9878c8969/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:39:25 np0005603623 podman[295300]: 2026-01-31 08:39:24.938203848 +0000 UTC m=+0.016616893 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:39:25 np0005603623 podman[295300]: 2026-01-31 08:39:25.035272343 +0000 UTC m=+0.113685358 container init 973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:39:25 np0005603623 nova_compute[226235]: 2026-01-31 08:39:25.037 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:25 np0005603623 podman[295300]: 2026-01-31 08:39:25.041919751 +0000 UTC m=+0.120332766 container start 973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 03:39:25 np0005603623 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[295330]: [NOTICE]   (295353) : New worker (295357) forked
Jan 31 03:39:25 np0005603623 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[295330]: [NOTICE]   (295353) : Loading success.
Jan 31 03:39:25 np0005603623 nova_compute[226235]: 2026-01-31 08:39:25.187 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848765.187184, 815ef28e-2297-49ba-88a1-23f722c3fa0a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:39:25 np0005603623 nova_compute[226235]: 2026-01-31 08:39:25.187 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:39:25 np0005603623 nova_compute[226235]: 2026-01-31 08:39:25.190 226239 DEBUG nova.compute.manager [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:39:25 np0005603623 nova_compute[226235]: 2026-01-31 08:39:25.193 226239 INFO nova.virt.libvirt.driver [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance running successfully.#033[00m
Jan 31 03:39:25 np0005603623 virtqemud[225858]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:39:25 np0005603623 nova_compute[226235]: 2026-01-31 08:39:25.196 226239 DEBUG nova.virt.libvirt.guest [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:39:25 np0005603623 nova_compute[226235]: 2026-01-31 08:39:25.196 226239 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 31 03:39:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:25.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:25.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:26 np0005603623 nova_compute[226235]: 2026-01-31 08:39:26.105 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:39:26 np0005603623 nova_compute[226235]: 2026-01-31 08:39:26.108 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:39:26 np0005603623 nova_compute[226235]: 2026-01-31 08:39:26.495 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 31 03:39:26 np0005603623 nova_compute[226235]: 2026-01-31 08:39:26.496 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848765.1885202, 815ef28e-2297-49ba-88a1-23f722c3fa0a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:39:26 np0005603623 nova_compute[226235]: 2026-01-31 08:39:26.496 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] VM Started (Lifecycle Event)#033[00m
Jan 31 03:39:27 np0005603623 nova_compute[226235]: 2026-01-31 08:39:27.112 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:39:27 np0005603623 nova_compute[226235]: 2026-01-31 08:39:27.118 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:39:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:27.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:27.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:28 np0005603623 nova_compute[226235]: 2026-01-31 08:39:28.676 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:29 np0005603623 nova_compute[226235]: 2026-01-31 08:39:29.392 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848754.3916075, 91f04a93-ce38-4962-919b-e1ac0677b4da => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:39:29 np0005603623 nova_compute[226235]: 2026-01-31 08:39:29.393 226239 INFO nova.compute.manager [-] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:39:29 np0005603623 nova_compute[226235]: 2026-01-31 08:39:29.465 226239 DEBUG nova.compute.manager [None req-17ce0538-d91e-41b0-9e1b-3c179a0ae167 - - - - - -] [instance: 91f04a93-ce38-4962-919b-e1ac0677b4da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:39:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:29.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:29.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:30 np0005603623 nova_compute[226235]: 2026-01-31 08:39:30.040 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:30.131 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:30.131 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:30.132 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:30 np0005603623 nova_compute[226235]: 2026-01-31 08:39:30.318 226239 DEBUG nova.compute.manager [req-de314086-56f0-492e-a792-931e705c5f62 req-5eb199ab-7a61-46c8-ab1d-9b486527b68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:30 np0005603623 nova_compute[226235]: 2026-01-31 08:39:30.318 226239 DEBUG oslo_concurrency.lockutils [req-de314086-56f0-492e-a792-931e705c5f62 req-5eb199ab-7a61-46c8-ab1d-9b486527b68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:30 np0005603623 nova_compute[226235]: 2026-01-31 08:39:30.319 226239 DEBUG oslo_concurrency.lockutils [req-de314086-56f0-492e-a792-931e705c5f62 req-5eb199ab-7a61-46c8-ab1d-9b486527b68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:30 np0005603623 nova_compute[226235]: 2026-01-31 08:39:30.319 226239 DEBUG oslo_concurrency.lockutils [req-de314086-56f0-492e-a792-931e705c5f62 req-5eb199ab-7a61-46c8-ab1d-9b486527b68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:30 np0005603623 nova_compute[226235]: 2026-01-31 08:39:30.319 226239 DEBUG nova.compute.manager [req-de314086-56f0-492e-a792-931e705c5f62 req-5eb199ab-7a61-46c8-ab1d-9b486527b68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:30 np0005603623 nova_compute[226235]: 2026-01-31 08:39:30.319 226239 WARNING nova.compute.manager [req-de314086-56f0-492e-a792-931e705c5f62 req-5eb199ab-7a61-46c8-ab1d-9b486527b68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:39:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:31.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:31.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:33.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:33.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:33 np0005603623 nova_compute[226235]: 2026-01-31 08:39:33.680 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:33 np0005603623 nova_compute[226235]: 2026-01-31 08:39:33.786 226239 DEBUG nova.compute.manager [req-19bd9bde-436f-4789-ba19-107be588c9fe req-063757dc-48a8-4a7b-8ce2-25cc3ec32aa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:33 np0005603623 nova_compute[226235]: 2026-01-31 08:39:33.787 226239 DEBUG oslo_concurrency.lockutils [req-19bd9bde-436f-4789-ba19-107be588c9fe req-063757dc-48a8-4a7b-8ce2-25cc3ec32aa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:33 np0005603623 nova_compute[226235]: 2026-01-31 08:39:33.787 226239 DEBUG oslo_concurrency.lockutils [req-19bd9bde-436f-4789-ba19-107be588c9fe req-063757dc-48a8-4a7b-8ce2-25cc3ec32aa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:33 np0005603623 nova_compute[226235]: 2026-01-31 08:39:33.787 226239 DEBUG oslo_concurrency.lockutils [req-19bd9bde-436f-4789-ba19-107be588c9fe req-063757dc-48a8-4a7b-8ce2-25cc3ec32aa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:33 np0005603623 nova_compute[226235]: 2026-01-31 08:39:33.788 226239 DEBUG nova.compute.manager [req-19bd9bde-436f-4789-ba19-107be588c9fe req-063757dc-48a8-4a7b-8ce2-25cc3ec32aa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:33 np0005603623 nova_compute[226235]: 2026-01-31 08:39:33.788 226239 WARNING nova.compute.manager [req-19bd9bde-436f-4789-ba19-107be588c9fe req-063757dc-48a8-4a7b-8ce2-25cc3ec32aa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:39:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:35 np0005603623 nova_compute[226235]: 2026-01-31 08:39:35.041 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:35 np0005603623 nova_compute[226235]: 2026-01-31 08:39:35.156 226239 DEBUG nova.network.neutron [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Port b62616fc-dd91-4cc2-b323-70fffebab4fb binding to destination host compute-2.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Jan 31 03:39:35 np0005603623 nova_compute[226235]: 2026-01-31 08:39:35.157 226239 DEBUG oslo_concurrency.lockutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:39:35 np0005603623 nova_compute[226235]: 2026-01-31 08:39:35.157 226239 DEBUG oslo_concurrency.lockutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquired lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:39:35 np0005603623 nova_compute[226235]: 2026-01-31 08:39:35.157 226239 DEBUG nova.network.neutron [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:39:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:39:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:35.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:39:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:35.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:39:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1564915178' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:39:37 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:37Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:44:41:0a 10.100.0.3
Jan 31 03:39:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:37.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:37.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:38 np0005603623 nova_compute[226235]: 2026-01-31 08:39:38.685 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:39.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:39.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:39 np0005603623 nova_compute[226235]: 2026-01-31 08:39:39.774 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:39 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:39.774 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:39:39 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:39.776 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:39:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:39 np0005603623 nova_compute[226235]: 2026-01-31 08:39:39.919 226239 DEBUG nova.network.neutron [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:39:40 np0005603623 nova_compute[226235]: 2026-01-31 08:39:40.042 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:40 np0005603623 nova_compute[226235]: 2026-01-31 08:39:40.329 226239 DEBUG oslo_concurrency.lockutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Releasing lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:39:40 np0005603623 podman[295432]: 2026-01-31 08:39:40.955157592 +0000 UTC m=+0.052295721 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:39:40 np0005603623 podman[295433]: 2026-01-31 08:39:40.977227394 +0000 UTC m=+0.073451394 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:39:41 np0005603623 kernel: tapb62616fc-dd (unregistering): left promiscuous mode
Jan 31 03:39:41 np0005603623 NetworkManager[48970]: <info>  [1769848781.0376] device (tapb62616fc-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:39:41 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:41Z|00624|binding|INFO|Releasing lport b62616fc-dd91-4cc2-b323-70fffebab4fb from this chassis (sb_readonly=0)
Jan 31 03:39:41 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:41Z|00625|binding|INFO|Setting lport b62616fc-dd91-4cc2-b323-70fffebab4fb down in Southbound
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.044 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:41 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:41Z|00626|binding|INFO|Removing iface tapb62616fc-dd ovn-installed in OVS
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.046 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.051 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:41 np0005603623 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000092.scope: Deactivated successfully.
Jan 31 03:39:41 np0005603623 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000092.scope: Consumed 12.443s CPU time.
Jan 31 03:39:41 np0005603623 systemd-machined[194379]: Machine qemu-71-instance-00000092 terminated.
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.109 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.112 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.129 226239 INFO nova.virt.libvirt.driver [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance destroyed successfully.#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.130 226239 DEBUG nova.objects.instance [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'resources' on Instance uuid 815ef28e-2297-49ba-88a1-23f722c3fa0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.296 226239 DEBUG nova.virt.libvirt.vif [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:37:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2115362642',display_name='tempest-TestNetworkAdvancedServerOps-server-2115362642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2115362642',id=146,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNl3t7Tpw2ZGAKx5tF0XRvkA5V1qQ9xszE6olcVe0qbeqbBI1oq6Zjq+3DsZDE5JpsKvfdWgNEpJ9rXaTzL6wLNNLR+GbnRbZpWjtFLfeYVgQUQ4VVzMWaZiV7/jRigBGA==',key_name='tempest-TestNetworkAdvancedServerOps-1092980412',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:39:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-4f40bqke',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:39:27Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=815ef28e-2297-49ba-88a1-23f722c3fa0a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.297 226239 DEBUG nova.network.os_vif_util [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.298 226239 DEBUG nova.network.os_vif_util [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.298 226239 DEBUG os_vif [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.299 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.300 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb62616fc-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.301 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.302 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.305 226239 INFO os_vif [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd')#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.309 226239 DEBUG oslo_concurrency.lockutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.309 226239 DEBUG oslo_concurrency.lockutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:41.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:41.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:41 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:41Z|00627|binding|INFO|Releasing lport 9f1ac82b-bf6c-400f-a03c-b15ad5392890 from this chassis (sb_readonly=0)
Jan 31 03:39:41 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:41Z|00628|binding|INFO|Releasing lport 7571c123-c1d9-4ad9-a9c7-718eca889c7b from this chassis (sb_readonly=0)
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.676 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:41:0a 10.100.0.3'], port_security=['fa:16:3e:44:41:0a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '815ef28e-2297-49ba-88a1-23f722c3fa0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d238e24-9954-4b32-b589-6db6c8760a3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'e09d3566-f99d-4e7a-854e-68c93732c8e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09414a4b-4852-4431-b971-0c29958bdb7a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=b62616fc-dd91-4cc2-b323-70fffebab4fb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.677 143258 INFO neutron.agent.ovn.metadata.agent [-] Port b62616fc-dd91-4cc2-b323-70fffebab4fb in datapath 3d238e24-9954-4b32-b589-6db6c8760a3f unbound from our chassis#033[00m
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.679 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3d238e24-9954-4b32-b589-6db6c8760a3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.680 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[27212d5b-2795-4470-bc67-5c20afa5cad1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.680 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f namespace which is not needed anymore#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.683 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:41 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:41Z|00629|binding|INFO|Releasing lport 9f1ac82b-bf6c-400f-a03c-b15ad5392890 from this chassis (sb_readonly=0)
Jan 31 03:39:41 np0005603623 ovn_controller[133449]: 2026-01-31T08:39:41Z|00630|binding|INFO|Releasing lport 7571c123-c1d9-4ad9-a9c7-718eca889c7b from this chassis (sb_readonly=0)
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.733 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:41 np0005603623 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[295330]: [NOTICE]   (295353) : haproxy version is 2.8.14-c23fe91
Jan 31 03:39:41 np0005603623 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[295330]: [NOTICE]   (295353) : path to executable is /usr/sbin/haproxy
Jan 31 03:39:41 np0005603623 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[295330]: [WARNING]  (295353) : Exiting Master process...
Jan 31 03:39:41 np0005603623 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[295330]: [WARNING]  (295353) : Exiting Master process...
Jan 31 03:39:41 np0005603623 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[295330]: [ALERT]    (295353) : Current worker (295357) exited with code 143 (Terminated)
Jan 31 03:39:41 np0005603623 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[295330]: [WARNING]  (295353) : All workers exited. Exiting... (0)
Jan 31 03:39:41 np0005603623 systemd[1]: libpod-973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad.scope: Deactivated successfully.
Jan 31 03:39:41 np0005603623 podman[295508]: 2026-01-31 08:39:41.785924134 +0000 UTC m=+0.039633974 container died 973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 03:39:41 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad-userdata-shm.mount: Deactivated successfully.
Jan 31 03:39:41 np0005603623 systemd[1]: var-lib-containers-storage-overlay-0c2db3e6bd362e594c99610791ad887b9dd2bb1175d9bf61bdfc03b9878c8969-merged.mount: Deactivated successfully.
Jan 31 03:39:41 np0005603623 podman[295508]: 2026-01-31 08:39:41.830640597 +0000 UTC m=+0.084350437 container cleanup 973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 03:39:41 np0005603623 systemd[1]: libpod-conmon-973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad.scope: Deactivated successfully.
Jan 31 03:39:41 np0005603623 podman[295540]: 2026-01-31 08:39:41.882477853 +0000 UTC m=+0.036431294 container remove 973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.886 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c07aa2-f9f0-4da1-9e05-c36dc04e60c8]: (4, ('Sat Jan 31 08:39:41 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f (973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad)\n973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad\nSat Jan 31 08:39:41 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f (973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad)\n973c084048bf76ed4ad0d58425120ac82be48a7c8028b29321de9d15b22cebad\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.887 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4adc93-475c-4013-989a-e2d67b97cffe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.888 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d238e24-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.889 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:41 np0005603623 kernel: tap3d238e24-90: left promiscuous mode
Jan 31 03:39:41 np0005603623 nova_compute[226235]: 2026-01-31 08:39:41.894 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.897 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e43a5e-8019-411f-9efe-901029c108d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.912 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b76bf982-305f-45d4-a476-b14e7fd76a58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.914 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[44c90bc5-2d1e-4370-b4da-96d1f391a9fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.924 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6e9afb1a-9ec3-4665-89a1-8eecfffede3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 790491, 'reachable_time': 39940, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295555, 'error': None, 'target': 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.928 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:39:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:41.928 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[c750a1c8-96df-416b-8de8-a2b52ef1a53f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:41 np0005603623 systemd[1]: run-netns-ovnmeta\x2d3d238e24\x2d9954\x2d4b32\x2db589\x2d6db6c8760a3f.mount: Deactivated successfully.
Jan 31 03:39:42 np0005603623 nova_compute[226235]: 2026-01-31 08:39:42.125 226239 DEBUG nova.objects.instance [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'migration_context' on Instance uuid 815ef28e-2297-49ba-88a1-23f722c3fa0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:39:43 np0005603623 nova_compute[226235]: 2026-01-31 08:39:43.206 226239 DEBUG oslo_concurrency.processutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:39:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4024809991' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:39:43 np0005603623 nova_compute[226235]: 2026-01-31 08:39:43.606 226239 DEBUG oslo_concurrency.processutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:39:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:43.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:39:43 np0005603623 nova_compute[226235]: 2026-01-31 08:39:43.611 226239 DEBUG nova.compute.provider_tree [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:39:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:43.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:43 np0005603623 nova_compute[226235]: 2026-01-31 08:39:43.800 226239 DEBUG nova.scheduler.client.report [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:39:44 np0005603623 nova_compute[226235]: 2026-01-31 08:39:44.043 226239 DEBUG nova.compute.manager [req-85a37662-60c1-4594-9e60-0b499c628068 req-d5ea5a43-99da-4108-8eb9-f8bc4a41f9fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:44 np0005603623 nova_compute[226235]: 2026-01-31 08:39:44.043 226239 DEBUG oslo_concurrency.lockutils [req-85a37662-60c1-4594-9e60-0b499c628068 req-d5ea5a43-99da-4108-8eb9-f8bc4a41f9fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:44 np0005603623 nova_compute[226235]: 2026-01-31 08:39:44.044 226239 DEBUG oslo_concurrency.lockutils [req-85a37662-60c1-4594-9e60-0b499c628068 req-d5ea5a43-99da-4108-8eb9-f8bc4a41f9fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:44 np0005603623 nova_compute[226235]: 2026-01-31 08:39:44.044 226239 DEBUG oslo_concurrency.lockutils [req-85a37662-60c1-4594-9e60-0b499c628068 req-d5ea5a43-99da-4108-8eb9-f8bc4a41f9fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:44 np0005603623 nova_compute[226235]: 2026-01-31 08:39:44.045 226239 DEBUG nova.compute.manager [req-85a37662-60c1-4594-9e60-0b499c628068 req-d5ea5a43-99da-4108-8eb9-f8bc4a41f9fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:44 np0005603623 nova_compute[226235]: 2026-01-31 08:39:44.045 226239 WARNING nova.compute.manager [req-85a37662-60c1-4594-9e60-0b499c628068 req-d5ea5a43-99da-4108-8eb9-f8bc4a41f9fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:39:44 np0005603623 nova_compute[226235]: 2026-01-31 08:39:44.138 226239 DEBUG oslo_concurrency.lockutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 2.829s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:45 np0005603623 nova_compute[226235]: 2026-01-31 08:39:45.044 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:45.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:45.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:46 np0005603623 nova_compute[226235]: 2026-01-31 08:39:46.301 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:47.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:47.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:39:48.779 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:49.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:49.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:50 np0005603623 nova_compute[226235]: 2026-01-31 08:39:50.045 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:50 np0005603623 nova_compute[226235]: 2026-01-31 08:39:50.872 226239 DEBUG nova.compute.manager [req-84087c1f-385c-4f5f-8c3c-3b8c2b2924c7 req-74a26f3c-764e-4bfe-bdfc-1952c1b6e265 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:50 np0005603623 nova_compute[226235]: 2026-01-31 08:39:50.873 226239 DEBUG oslo_concurrency.lockutils [req-84087c1f-385c-4f5f-8c3c-3b8c2b2924c7 req-74a26f3c-764e-4bfe-bdfc-1952c1b6e265 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:50 np0005603623 nova_compute[226235]: 2026-01-31 08:39:50.873 226239 DEBUG oslo_concurrency.lockutils [req-84087c1f-385c-4f5f-8c3c-3b8c2b2924c7 req-74a26f3c-764e-4bfe-bdfc-1952c1b6e265 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:50 np0005603623 nova_compute[226235]: 2026-01-31 08:39:50.873 226239 DEBUG oslo_concurrency.lockutils [req-84087c1f-385c-4f5f-8c3c-3b8c2b2924c7 req-74a26f3c-764e-4bfe-bdfc-1952c1b6e265 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:50 np0005603623 nova_compute[226235]: 2026-01-31 08:39:50.873 226239 DEBUG nova.compute.manager [req-84087c1f-385c-4f5f-8c3c-3b8c2b2924c7 req-74a26f3c-764e-4bfe-bdfc-1952c1b6e265 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:50 np0005603623 nova_compute[226235]: 2026-01-31 08:39:50.873 226239 WARNING nova.compute.manager [req-84087c1f-385c-4f5f-8c3c-3b8c2b2924c7 req-74a26f3c-764e-4bfe-bdfc-1952c1b6e265 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:39:51 np0005603623 nova_compute[226235]: 2026-01-31 08:39:51.303 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:51.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:51.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:53 np0005603623 nova_compute[226235]: 2026-01-31 08:39:53.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:53 np0005603623 nova_compute[226235]: 2026-01-31 08:39:53.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:39:53 np0005603623 nova_compute[226235]: 2026-01-31 08:39:53.487 226239 DEBUG nova.compute.manager [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:53 np0005603623 nova_compute[226235]: 2026-01-31 08:39:53.487 226239 DEBUG nova.compute.manager [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing instance network info cache due to event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:39:53 np0005603623 nova_compute[226235]: 2026-01-31 08:39:53.488 226239 DEBUG oslo_concurrency.lockutils [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:39:53 np0005603623 nova_compute[226235]: 2026-01-31 08:39:53.488 226239 DEBUG oslo_concurrency.lockutils [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:39:53 np0005603623 nova_compute[226235]: 2026-01-31 08:39:53.488 226239 DEBUG nova.network.neutron [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:39:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:53.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:53.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:55 np0005603623 nova_compute[226235]: 2026-01-31 08:39:55.049 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:55.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:55.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.129 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848781.127722, 815ef28e-2297-49ba-88a1-23f722c3fa0a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.129 226239 INFO nova.compute.manager [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.153 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.237 226239 DEBUG nova.compute.manager [None req-3cf691e2-9b88-406d-89a0-233b3751e99d - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.305 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.408 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.409 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.791 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.792 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.792 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.792 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:39:56 np0005603623 nova_compute[226235]: 2026-01-31 08:39:56.792 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:39:57 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1820445459' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:39:57 np0005603623 nova_compute[226235]: 2026-01-31 08:39:57.221 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:57 np0005603623 nova_compute[226235]: 2026-01-31 08:39:57.370 226239 DEBUG nova.network.neutron [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updated VIF entry in instance network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:39:57 np0005603623 nova_compute[226235]: 2026-01-31 08:39:57.370 226239 DEBUG nova.network.neutron [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:39:57 np0005603623 nova_compute[226235]: 2026-01-31 08:39:57.586 226239 DEBUG oslo_concurrency.lockutils [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:39:57 np0005603623 nova_compute[226235]: 2026-01-31 08:39:57.620 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000094 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:39:57 np0005603623 nova_compute[226235]: 2026-01-31 08:39:57.620 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000094 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:39:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:39:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:57.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:39:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:39:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:57.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:39:57 np0005603623 nova_compute[226235]: 2026-01-31 08:39:57.771 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:39:57 np0005603623 nova_compute[226235]: 2026-01-31 08:39:57.772 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4086MB free_disk=20.714786529541016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:39:57 np0005603623 nova_compute[226235]: 2026-01-31 08:39:57.772 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:57 np0005603623 nova_compute[226235]: 2026-01-31 08:39:57.772 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e345 e345: 3 total, 3 up, 3 in
Jan 31 03:39:58 np0005603623 nova_compute[226235]: 2026-01-31 08:39:58.162 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance bd87e542-0f7b-453e-b8d1-643ad6fb64f0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:39:58 np0005603623 nova_compute[226235]: 2026-01-31 08:39:58.163 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:39:58 np0005603623 nova_compute[226235]: 2026-01-31 08:39:58.163 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:39:58 np0005603623 nova_compute[226235]: 2026-01-31 08:39:58.231 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:39:58 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1398649482' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:39:58 np0005603623 nova_compute[226235]: 2026-01-31 08:39:58.640 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:58 np0005603623 nova_compute[226235]: 2026-01-31 08:39:58.645 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:39:58 np0005603623 nova_compute[226235]: 2026-01-31 08:39:58.804 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:39:59 np0005603623 nova_compute[226235]: 2026-01-31 08:39:59.309 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:39:59 np0005603623 nova_compute[226235]: 2026-01-31 08:39:59.310 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:59.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:39:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:39:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:59.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:39:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e345 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:39:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:39:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:39:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:39:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:40:00 np0005603623 nova_compute[226235]: 2026-01-31 08:40:00.050 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:00 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 03:40:01 np0005603623 nova_compute[226235]: 2026-01-31 08:40:01.307 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:01 np0005603623 nova_compute[226235]: 2026-01-31 08:40:01.630 226239 DEBUG nova.compute.manager [req-eb9e934f-2387-4f4a-a0f1-9ef89cb75816 req-7b38a019-995b-4b04-a596-9a412be80bb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:40:01 np0005603623 nova_compute[226235]: 2026-01-31 08:40:01.630 226239 DEBUG oslo_concurrency.lockutils [req-eb9e934f-2387-4f4a-a0f1-9ef89cb75816 req-7b38a019-995b-4b04-a596-9a412be80bb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:01 np0005603623 nova_compute[226235]: 2026-01-31 08:40:01.630 226239 DEBUG oslo_concurrency.lockutils [req-eb9e934f-2387-4f4a-a0f1-9ef89cb75816 req-7b38a019-995b-4b04-a596-9a412be80bb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:01 np0005603623 nova_compute[226235]: 2026-01-31 08:40:01.630 226239 DEBUG oslo_concurrency.lockutils [req-eb9e934f-2387-4f4a-a0f1-9ef89cb75816 req-7b38a019-995b-4b04-a596-9a412be80bb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:01 np0005603623 nova_compute[226235]: 2026-01-31 08:40:01.630 226239 DEBUG nova.compute.manager [req-eb9e934f-2387-4f4a-a0f1-9ef89cb75816 req-7b38a019-995b-4b04-a596-9a412be80bb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:40:01 np0005603623 nova_compute[226235]: 2026-01-31 08:40:01.631 226239 WARNING nova.compute.manager [req-eb9e934f-2387-4f4a-a0f1-9ef89cb75816 req-7b38a019-995b-4b04-a596-9a412be80bb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:40:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:01.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:01.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:02 np0005603623 nova_compute[226235]: 2026-01-31 08:40:02.056 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:02 np0005603623 nova_compute[226235]: 2026-01-31 08:40:02.057 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:02 np0005603623 nova_compute[226235]: 2026-01-31 08:40:02.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:02 np0005603623 nova_compute[226235]: 2026-01-31 08:40:02.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:03 np0005603623 nova_compute[226235]: 2026-01-31 08:40:03.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:03 np0005603623 nova_compute[226235]: 2026-01-31 08:40:03.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:03.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:03.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e345 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:05 np0005603623 nova_compute[226235]: 2026-01-31 08:40:05.052 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:05.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:05.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e346 e346: 3 total, 3 up, 3 in
Jan 31 03:40:06 np0005603623 nova_compute[226235]: 2026-01-31 08:40:06.309 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:40:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:40:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:07.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:07.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:09.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:40:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:09.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:40:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:10 np0005603623 nova_compute[226235]: 2026-01-31 08:40:10.054 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:10 np0005603623 nova_compute[226235]: 2026-01-31 08:40:10.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:11 np0005603623 nova_compute[226235]: 2026-01-31 08:40:11.311 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:11.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:11.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:11 np0005603623 podman[295869]: 2026-01-31 08:40:11.962375103 +0000 UTC m=+0.048940417 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Jan 31 03:40:11 np0005603623 podman[295870]: 2026-01-31 08:40:11.986436977 +0000 UTC m=+0.073304120 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible)
Jan 31 03:40:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:13.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:13.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:40:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2784090521' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:40:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:40:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2784090521' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:40:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:15 np0005603623 nova_compute[226235]: 2026-01-31 08:40:15.055 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:15.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:15.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:16 np0005603623 nova_compute[226235]: 2026-01-31 08:40:16.312 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:17.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:17.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:19.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:19.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:20 np0005603623 nova_compute[226235]: 2026-01-31 08:40:20.100 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:21 np0005603623 nova_compute[226235]: 2026-01-31 08:40:21.314 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:21.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:21.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:23.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:23.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:25 np0005603623 nova_compute[226235]: 2026-01-31 08:40:25.101 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:25.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:25.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:26 np0005603623 nova_compute[226235]: 2026-01-31 08:40:26.316 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:27.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:27.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:29.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:40:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:29.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:40:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:29 np0005603623 nova_compute[226235]: 2026-01-31 08:40:29.991 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Acquiring lock "1f710a83-a591-405d-a7f3-d80ab66f0b94" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:29 np0005603623 nova_compute[226235]: 2026-01-31 08:40:29.991 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:30 np0005603623 nova_compute[226235]: 2026-01-31 08:40:30.102 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:30.132 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:30.132 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:30.133 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:30 np0005603623 nova_compute[226235]: 2026-01-31 08:40:30.212 226239 DEBUG nova.compute.manager [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:40:30 np0005603623 nova_compute[226235]: 2026-01-31 08:40:30.414 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:30 np0005603623 nova_compute[226235]: 2026-01-31 08:40:30.415 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:30 np0005603623 nova_compute[226235]: 2026-01-31 08:40:30.421 226239 DEBUG nova.virt.hardware [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:40:30 np0005603623 nova_compute[226235]: 2026-01-31 08:40:30.421 226239 INFO nova.compute.claims [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:40:30 np0005603623 nova_compute[226235]: 2026-01-31 08:40:30.773 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:40:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:40:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4286798509' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:40:31 np0005603623 nova_compute[226235]: 2026-01-31 08:40:31.198 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:40:31 np0005603623 nova_compute[226235]: 2026-01-31 08:40:31.205 226239 DEBUG nova.compute.provider_tree [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:40:31 np0005603623 nova_compute[226235]: 2026-01-31 08:40:31.319 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:31 np0005603623 nova_compute[226235]: 2026-01-31 08:40:31.441 226239 DEBUG nova.scheduler.client.report [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:40:31 np0005603623 nova_compute[226235]: 2026-01-31 08:40:31.648 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:31 np0005603623 nova_compute[226235]: 2026-01-31 08:40:31.648 226239 DEBUG nova.compute.manager [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:40:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:40:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:31.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:40:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:31.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:32 np0005603623 nova_compute[226235]: 2026-01-31 08:40:32.338 226239 DEBUG nova.compute.manager [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:40:32 np0005603623 nova_compute[226235]: 2026-01-31 08:40:32.339 226239 DEBUG nova.network.neutron [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:40:32 np0005603623 nova_compute[226235]: 2026-01-31 08:40:32.484 226239 INFO nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:40:32 np0005603623 nova_compute[226235]: 2026-01-31 08:40:32.671 226239 DEBUG nova.compute.manager [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.087 226239 DEBUG nova.compute.manager [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.089 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.089 226239 INFO nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Creating image(s)#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.130 226239 DEBUG nova.storage.rbd_utils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] rbd image 1f710a83-a591-405d-a7f3-d80ab66f0b94_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.155 226239 DEBUG nova.storage.rbd_utils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] rbd image 1f710a83-a591-405d-a7f3-d80ab66f0b94_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.177 226239 DEBUG nova.storage.rbd_utils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] rbd image 1f710a83-a591-405d-a7f3-d80ab66f0b94_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.180 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.230 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.231 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.232 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.232 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.252 226239 DEBUG nova.storage.rbd_utils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] rbd image 1f710a83-a591-405d-a7f3-d80ab66f0b94_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.255 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 1f710a83-a591-405d-a7f3-d80ab66f0b94_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:40:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:33.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:33.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:33 np0005603623 nova_compute[226235]: 2026-01-31 08:40:33.756 226239 DEBUG nova.policy [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c6c60cceb0414556a99e4009200ad565', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0026cad467ff4524ae5b675192e66666', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:40:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:35 np0005603623 nova_compute[226235]: 2026-01-31 08:40:35.104 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:35 np0005603623 nova_compute[226235]: 2026-01-31 08:40:35.501 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 1f710a83-a591-405d-a7f3-d80ab66f0b94_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.246s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:40:35 np0005603623 nova_compute[226235]: 2026-01-31 08:40:35.555 226239 DEBUG nova.storage.rbd_utils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] resizing rbd image 1f710a83-a591-405d-a7f3-d80ab66f0b94_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:40:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:35.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:35.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:36 np0005603623 nova_compute[226235]: 2026-01-31 08:40:36.019 226239 DEBUG nova.objects.instance [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lazy-loading 'migration_context' on Instance uuid 1f710a83-a591-405d-a7f3-d80ab66f0b94 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:40:36 np0005603623 nova_compute[226235]: 2026-01-31 08:40:36.165 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:40:36 np0005603623 nova_compute[226235]: 2026-01-31 08:40:36.165 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Ensure instance console log exists: /var/lib/nova/instances/1f710a83-a591-405d-a7f3-d80ab66f0b94/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:40:36 np0005603623 nova_compute[226235]: 2026-01-31 08:40:36.166 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:36 np0005603623 nova_compute[226235]: 2026-01-31 08:40:36.166 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:36 np0005603623 nova_compute[226235]: 2026-01-31 08:40:36.166 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:36 np0005603623 nova_compute[226235]: 2026-01-31 08:40:36.321 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:36 np0005603623 nova_compute[226235]: 2026-01-31 08:40:36.975 226239 DEBUG nova.network.neutron [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Successfully created port: b8b7c630-442b-4a07-8813-bf635675e452 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:40:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:37.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8546f0 =====
Jan 31 03:40:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8546f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8546f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:37.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:39.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8546f0 =====
Jan 31 03:40:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8546f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8546f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:39.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:40 np0005603623 nova_compute[226235]: 2026-01-31 08:40:40.105 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:41 np0005603623 nova_compute[226235]: 2026-01-31 08:40:41.322 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:41 np0005603623 nova_compute[226235]: 2026-01-31 08:40:41.606 226239 DEBUG nova.network.neutron [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Successfully updated port: b8b7c630-442b-4a07-8813-bf635675e452 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:40:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8546f0 =====
Jan 31 03:40:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:41.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8546f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8546f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:41.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:41 np0005603623 nova_compute[226235]: 2026-01-31 08:40:41.971 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:42 np0005603623 nova_compute[226235]: 2026-01-31 08:40:42.224 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Acquiring lock "refresh_cache-1f710a83-a591-405d-a7f3-d80ab66f0b94" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:40:42 np0005603623 nova_compute[226235]: 2026-01-31 08:40:42.224 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Acquired lock "refresh_cache-1f710a83-a591-405d-a7f3-d80ab66f0b94" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:40:42 np0005603623 nova_compute[226235]: 2026-01-31 08:40:42.225 226239 DEBUG nova.network.neutron [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:40:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:42.289 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:40:42 np0005603623 nova_compute[226235]: 2026-01-31 08:40:42.289 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:42 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:42.290 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:40:42 np0005603623 nova_compute[226235]: 2026-01-31 08:40:42.567 226239 DEBUG nova.compute.manager [req-3f8e60f8-3a86-4354-95e0-9dc99a181ed2 req-9625baf3-642c-47ca-a544-0d604a0cec83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Received event network-changed-b8b7c630-442b-4a07-8813-bf635675e452 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:40:42 np0005603623 nova_compute[226235]: 2026-01-31 08:40:42.567 226239 DEBUG nova.compute.manager [req-3f8e60f8-3a86-4354-95e0-9dc99a181ed2 req-9625baf3-642c-47ca-a544-0d604a0cec83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Refreshing instance network info cache due to event network-changed-b8b7c630-442b-4a07-8813-bf635675e452. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:40:42 np0005603623 nova_compute[226235]: 2026-01-31 08:40:42.568 226239 DEBUG oslo_concurrency.lockutils [req-3f8e60f8-3a86-4354-95e0-9dc99a181ed2 req-9625baf3-642c-47ca-a544-0d604a0cec83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-1f710a83-a591-405d-a7f3-d80ab66f0b94" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:40:42 np0005603623 podman[296222]: 2026-01-31 08:40:42.981328752 +0000 UTC m=+0.073474957 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:40:42 np0005603623 podman[296221]: 2026-01-31 08:40:42.983561401 +0000 UTC m=+0.074512198 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent)
Jan 31 03:40:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:43.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:43.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:43 np0005603623 nova_compute[226235]: 2026-01-31 08:40:43.735 226239 DEBUG nova.network.neutron [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:40:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:44.293 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:40:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:45 np0005603623 nova_compute[226235]: 2026-01-31 08:40:45.107 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:45.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:45.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e347 e347: 3 total, 3 up, 3 in
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.112 226239 DEBUG nova.network.neutron [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Updating instance_info_cache with network_info: [{"id": "b8b7c630-442b-4a07-8813-bf635675e452", "address": "fa:16:3e:2e:3e:ef", "network": {"id": "124757ba-89c0-4223-923e-3c1484eeaccd", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-847601166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0026cad467ff4524ae5b675192e66666", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8b7c630-44", "ovs_interfaceid": "b8b7c630-442b-4a07-8813-bf635675e452", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.323 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.405 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Releasing lock "refresh_cache-1f710a83-a591-405d-a7f3-d80ab66f0b94" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.405 226239 DEBUG nova.compute.manager [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Instance network_info: |[{"id": "b8b7c630-442b-4a07-8813-bf635675e452", "address": "fa:16:3e:2e:3e:ef", "network": {"id": "124757ba-89c0-4223-923e-3c1484eeaccd", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-847601166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0026cad467ff4524ae5b675192e66666", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8b7c630-44", "ovs_interfaceid": "b8b7c630-442b-4a07-8813-bf635675e452", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.406 226239 DEBUG oslo_concurrency.lockutils [req-3f8e60f8-3a86-4354-95e0-9dc99a181ed2 req-9625baf3-642c-47ca-a544-0d604a0cec83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-1f710a83-a591-405d-a7f3-d80ab66f0b94" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.406 226239 DEBUG nova.network.neutron [req-3f8e60f8-3a86-4354-95e0-9dc99a181ed2 req-9625baf3-642c-47ca-a544-0d604a0cec83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Refreshing network info cache for port b8b7c630-442b-4a07-8813-bf635675e452 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.408 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Start _get_guest_xml network_info=[{"id": "b8b7c630-442b-4a07-8813-bf635675e452", "address": "fa:16:3e:2e:3e:ef", "network": {"id": "124757ba-89c0-4223-923e-3c1484eeaccd", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-847601166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0026cad467ff4524ae5b675192e66666", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8b7c630-44", "ovs_interfaceid": "b8b7c630-442b-4a07-8813-bf635675e452", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.413 226239 WARNING nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.423 226239 DEBUG nova.virt.libvirt.host [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.423 226239 DEBUG nova.virt.libvirt.host [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.427 226239 DEBUG nova.virt.libvirt.host [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.428 226239 DEBUG nova.virt.libvirt.host [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.429 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.429 226239 DEBUG nova.virt.hardware [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.429 226239 DEBUG nova.virt.hardware [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.430 226239 DEBUG nova.virt.hardware [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.430 226239 DEBUG nova.virt.hardware [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.430 226239 DEBUG nova.virt.hardware [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.430 226239 DEBUG nova.virt.hardware [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.430 226239 DEBUG nova.virt.hardware [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.431 226239 DEBUG nova.virt.hardware [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.431 226239 DEBUG nova.virt.hardware [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.431 226239 DEBUG nova.virt.hardware [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.431 226239 DEBUG nova.virt.hardware [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.434 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:40:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:40:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/241161619' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.856 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.877 226239 DEBUG nova.storage.rbd_utils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] rbd image 1f710a83-a591-405d-a7f3-d80ab66f0b94_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:40:46 np0005603623 nova_compute[226235]: 2026-01-31 08:40:46.882 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:40:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:40:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2631265997' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.286 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.288 226239 DEBUG nova.virt.libvirt.vif [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:40:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-221933362',display_name='tempest-ServerAddressesTestJSON-server-221933362',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-221933362',id=150,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0026cad467ff4524ae5b675192e66666',ramdisk_id='',reservation_id='r-0ck9pueu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-254892063',owner_user_name='tempest-ServerAddresse
sTestJSON-254892063-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:40:32Z,user_data=None,user_id='c6c60cceb0414556a99e4009200ad565',uuid=1f710a83-a591-405d-a7f3-d80ab66f0b94,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b8b7c630-442b-4a07-8813-bf635675e452", "address": "fa:16:3e:2e:3e:ef", "network": {"id": "124757ba-89c0-4223-923e-3c1484eeaccd", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-847601166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0026cad467ff4524ae5b675192e66666", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8b7c630-44", "ovs_interfaceid": "b8b7c630-442b-4a07-8813-bf635675e452", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.288 226239 DEBUG nova.network.os_vif_util [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Converting VIF {"id": "b8b7c630-442b-4a07-8813-bf635675e452", "address": "fa:16:3e:2e:3e:ef", "network": {"id": "124757ba-89c0-4223-923e-3c1484eeaccd", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-847601166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0026cad467ff4524ae5b675192e66666", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8b7c630-44", "ovs_interfaceid": "b8b7c630-442b-4a07-8813-bf635675e452", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.289 226239 DEBUG nova.network.os_vif_util [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:3e:ef,bridge_name='br-int',has_traffic_filtering=True,id=b8b7c630-442b-4a07-8813-bf635675e452,network=Network(124757ba-89c0-4223-923e-3c1484eeaccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8b7c630-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.290 226239 DEBUG nova.objects.instance [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1f710a83-a591-405d-a7f3-d80ab66f0b94 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.524 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  <uuid>1f710a83-a591-405d-a7f3-d80ab66f0b94</uuid>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  <name>instance-00000096</name>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerAddressesTestJSON-server-221933362</nova:name>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:40:46</nova:creationTime>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <nova:user uuid="c6c60cceb0414556a99e4009200ad565">tempest-ServerAddressesTestJSON-254892063-project-member</nova:user>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <nova:project uuid="0026cad467ff4524ae5b675192e66666">tempest-ServerAddressesTestJSON-254892063</nova:project>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <nova:port uuid="b8b7c630-442b-4a07-8813-bf635675e452">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <entry name="serial">1f710a83-a591-405d-a7f3-d80ab66f0b94</entry>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <entry name="uuid">1f710a83-a591-405d-a7f3-d80ab66f0b94</entry>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/1f710a83-a591-405d-a7f3-d80ab66f0b94_disk">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/1f710a83-a591-405d-a7f3-d80ab66f0b94_disk.config">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:2e:3e:ef"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <target dev="tapb8b7c630-44"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/1f710a83-a591-405d-a7f3-d80ab66f0b94/console.log" append="off"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:40:47 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:40:47 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:40:47 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:40:47 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.525 226239 DEBUG nova.compute.manager [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Preparing to wait for external event network-vif-plugged-b8b7c630-442b-4a07-8813-bf635675e452 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.526 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Acquiring lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.526 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.526 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.527 226239 DEBUG nova.virt.libvirt.vif [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:40:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-221933362',display_name='tempest-ServerAddressesTestJSON-server-221933362',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-221933362',id=150,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0026cad467ff4524ae5b675192e66666',ramdisk_id='',reservation_id='r-0ck9pueu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-254892063',owner_user_name='tempest-ServerAddressesTestJSON-254892063-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:40:32Z,user_data=None,user_id='c6c60cceb0414556a99e4009200ad565',uuid=1f710a83-a591-405d-a7f3-d80ab66f0b94,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b8b7c630-442b-4a07-8813-bf635675e452", "address": "fa:16:3e:2e:3e:ef", "network": {"id": "124757ba-89c0-4223-923e-3c1484eeaccd", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-847601166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0026cad467ff4524ae5b675192e66666", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8b7c630-44", "ovs_interfaceid": "b8b7c630-442b-4a07-8813-bf635675e452", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.528 226239 DEBUG nova.network.os_vif_util [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Converting VIF {"id": "b8b7c630-442b-4a07-8813-bf635675e452", "address": "fa:16:3e:2e:3e:ef", "network": {"id": "124757ba-89c0-4223-923e-3c1484eeaccd", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-847601166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0026cad467ff4524ae5b675192e66666", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8b7c630-44", "ovs_interfaceid": "b8b7c630-442b-4a07-8813-bf635675e452", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.528 226239 DEBUG nova.network.os_vif_util [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:3e:ef,bridge_name='br-int',has_traffic_filtering=True,id=b8b7c630-442b-4a07-8813-bf635675e452,network=Network(124757ba-89c0-4223-923e-3c1484eeaccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8b7c630-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.529 226239 DEBUG os_vif [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:3e:ef,bridge_name='br-int',has_traffic_filtering=True,id=b8b7c630-442b-4a07-8813-bf635675e452,network=Network(124757ba-89c0-4223-923e-3c1484eeaccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8b7c630-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.530 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.530 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.531 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.534 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.534 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8b7c630-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.534 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb8b7c630-44, col_values=(('external_ids', {'iface-id': 'b8b7c630-442b-4a07-8813-bf635675e452', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:3e:ef', 'vm-uuid': '1f710a83-a591-405d-a7f3-d80ab66f0b94'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:47 np0005603623 NetworkManager[48970]: <info>  [1769848847.5368] manager: (tapb8b7c630-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/299)
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.538 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.542 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:47 np0005603623 nova_compute[226235]: 2026-01-31 08:40:47.542 226239 INFO os_vif [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:3e:ef,bridge_name='br-int',has_traffic_filtering=True,id=b8b7c630-442b-4a07-8813-bf635675e452,network=Network(124757ba-89c0-4223-923e-3c1484eeaccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8b7c630-44')#033[00m
Jan 31 03:40:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:47.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:47.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:48 np0005603623 nova_compute[226235]: 2026-01-31 08:40:48.232 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:40:48 np0005603623 nova_compute[226235]: 2026-01-31 08:40:48.233 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:40:48 np0005603623 nova_compute[226235]: 2026-01-31 08:40:48.233 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] No VIF found with MAC fa:16:3e:2e:3e:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:40:48 np0005603623 nova_compute[226235]: 2026-01-31 08:40:48.234 226239 INFO nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Using config drive#033[00m
Jan 31 03:40:48 np0005603623 nova_compute[226235]: 2026-01-31 08:40:48.257 226239 DEBUG nova.storage.rbd_utils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] rbd image 1f710a83-a591-405d-a7f3-d80ab66f0b94_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:40:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:49.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:49.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e347 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:50 np0005603623 nova_compute[226235]: 2026-01-31 08:40:50.108 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.231 226239 INFO nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Creating config drive at /var/lib/nova/instances/1f710a83-a591-405d-a7f3-d80ab66f0b94/disk.config#033[00m
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.236 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1f710a83-a591-405d-a7f3-d80ab66f0b94/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpf9jfvu6p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.361 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1f710a83-a591-405d-a7f3-d80ab66f0b94/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpf9jfvu6p" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.390 226239 DEBUG nova.storage.rbd_utils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] rbd image 1f710a83-a591-405d-a7f3-d80ab66f0b94_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.395 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1f710a83-a591-405d-a7f3-d80ab66f0b94/disk.config 1f710a83-a591-405d-a7f3-d80ab66f0b94_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.425 226239 DEBUG nova.network.neutron [req-3f8e60f8-3a86-4354-95e0-9dc99a181ed2 req-9625baf3-642c-47ca-a544-0d604a0cec83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Updated VIF entry in instance network info cache for port b8b7c630-442b-4a07-8813-bf635675e452. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.427 226239 DEBUG nova.network.neutron [req-3f8e60f8-3a86-4354-95e0-9dc99a181ed2 req-9625baf3-642c-47ca-a544-0d604a0cec83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Updating instance_info_cache with network_info: [{"id": "b8b7c630-442b-4a07-8813-bf635675e452", "address": "fa:16:3e:2e:3e:ef", "network": {"id": "124757ba-89c0-4223-923e-3c1484eeaccd", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-847601166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0026cad467ff4524ae5b675192e66666", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8b7c630-44", "ovs_interfaceid": "b8b7c630-442b-4a07-8813-bf635675e452", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.548 226239 DEBUG oslo_concurrency.lockutils [req-3f8e60f8-3a86-4354-95e0-9dc99a181ed2 req-9625baf3-642c-47ca-a544-0d604a0cec83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-1f710a83-a591-405d-a7f3-d80ab66f0b94" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.571 226239 DEBUG oslo_concurrency.processutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1f710a83-a591-405d-a7f3-d80ab66f0b94/disk.config 1f710a83-a591-405d-a7f3-d80ab66f0b94_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.176s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.572 226239 INFO nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Deleting local config drive /var/lib/nova/instances/1f710a83-a591-405d-a7f3-d80ab66f0b94/disk.config because it was imported into RBD.#033[00m
Jan 31 03:40:51 np0005603623 kernel: tapb8b7c630-44: entered promiscuous mode
Jan 31 03:40:51 np0005603623 NetworkManager[48970]: <info>  [1769848851.6031] manager: (tapb8b7c630-44): new Tun device (/org/freedesktop/NetworkManager/Devices/300)
Jan 31 03:40:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:40:51Z|00631|binding|INFO|Claiming lport b8b7c630-442b-4a07-8813-bf635675e452 for this chassis.
Jan 31 03:40:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:40:51Z|00632|binding|INFO|b8b7c630-442b-4a07-8813-bf635675e452: Claiming fa:16:3e:2e:3e:ef 10.100.0.5
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.604 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.606 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.610 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:51 np0005603623 systemd-udevd[296403]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:40:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:40:51Z|00633|binding|INFO|Setting lport b8b7c630-442b-4a07-8813-bf635675e452 ovn-installed in OVS
Jan 31 03:40:51 np0005603623 nova_compute[226235]: 2026-01-31 08:40:51.631 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:51 np0005603623 systemd-machined[194379]: New machine qemu-72-instance-00000096.
Jan 31 03:40:51 np0005603623 NetworkManager[48970]: <info>  [1769848851.6351] device (tapb8b7c630-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:40:51 np0005603623 NetworkManager[48970]: <info>  [1769848851.6357] device (tapb8b7c630-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:40:51 np0005603623 systemd[1]: Started Virtual Machine qemu-72-instance-00000096.
Jan 31 03:40:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:51.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:51.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:40:51Z|00634|binding|INFO|Setting lport b8b7c630-442b-4a07-8813-bf635675e452 up in Southbound
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.883 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:3e:ef 10.100.0.5'], port_security=['fa:16:3e:2e:3e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '1f710a83-a591-405d-a7f3-d80ab66f0b94', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-124757ba-89c0-4223-923e-3c1484eeaccd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0026cad467ff4524ae5b675192e66666', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3dfdb701-4561-4357-a441-3d1ad22e5ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6e19354-a3d1-4f6a-90e8-0e5a842c6dca, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=b8b7c630-442b-4a07-8813-bf635675e452) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.884 143258 INFO neutron.agent.ovn.metadata.agent [-] Port b8b7c630-442b-4a07-8813-bf635675e452 in datapath 124757ba-89c0-4223-923e-3c1484eeaccd bound to our chassis#033[00m
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.887 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 124757ba-89c0-4223-923e-3c1484eeaccd#033[00m
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.894 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c50c49a5-c669-4a16-97ab-3ef458816e78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.895 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap124757ba-81 in ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.897 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap124757ba-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.897 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[19f58073-35b0-4051-b4d6-9ec2cb5f1791]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.897 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6f44a002-b8aa-4b6c-b814-bf7943e7b4eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.905 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[a47051e8-4e93-470f-82d7-84b76a9c3b44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.915 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3807288f-a52c-4df8-afaf-0f06999796ef]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.936 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[10fa03cd-ebc7-49f7-9eb4-0b3daf9bba44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:51 np0005603623 systemd-udevd[296406]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:40:51 np0005603623 NetworkManager[48970]: <info>  [1769848851.9438] manager: (tap124757ba-80): new Veth device (/org/freedesktop/NetworkManager/Devices/301)
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.944 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a45d05-df72-4129-b570-04de550056a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.967 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[9132ced2-1ffb-47db-ac34-bbfaf493c840]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.970 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[73cf791e-3064-4df3-8fa7-84909132e4de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:51 np0005603623 NetworkManager[48970]: <info>  [1769848851.9856] device (tap124757ba-80): carrier: link connected
Jan 31 03:40:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.988 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e64b3495-6d1d-430b-aef5-a40217192cbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:51.999 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a0495ff8-65d6-402b-9770-0576aeff6f41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap124757ba-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:e2:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 188], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 799241, 'reachable_time': 18165, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296473, 'error': None, 'target': 'ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:52.009 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5b46e3c8-2976-4f20-a278-09abd71f8260]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe03:e2bf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 799241, 'tstamp': 799241}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 296474, 'error': None, 'target': 'ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:52.021 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2009bdd5-3cc9-4a00-9710-68b059110473]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap124757ba-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:e2:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 188], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 799241, 'reachable_time': 18165, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 296475, 'error': None, 'target': 'ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:52.039 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[52950aab-126b-433d-8d23-6f71b63931e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:52.080 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[31882e12-d128-4ce2-b544-f60e7a8db23d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:52.081 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap124757ba-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:52.081 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:52.082 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap124757ba-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:40:52 np0005603623 nova_compute[226235]: 2026-01-31 08:40:52.083 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:52 np0005603623 NetworkManager[48970]: <info>  [1769848852.0843] manager: (tap124757ba-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Jan 31 03:40:52 np0005603623 kernel: tap124757ba-80: entered promiscuous mode
Jan 31 03:40:52 np0005603623 nova_compute[226235]: 2026-01-31 08:40:52.087 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:52.089 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap124757ba-80, col_values=(('external_ids', {'iface-id': '5400fcf9-e9e0-4b1a-b5b0-56d155971152'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:40:52 np0005603623 nova_compute[226235]: 2026-01-31 08:40:52.090 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:40:52Z|00635|binding|INFO|Releasing lport 5400fcf9-e9e0-4b1a-b5b0-56d155971152 from this chassis (sb_readonly=0)
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:52.093 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/124757ba-89c0-4223-923e-3c1484eeaccd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/124757ba-89c0-4223-923e-3c1484eeaccd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:52.093 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6c6b717e-1f9c-4cc6-98bd-b7a62398c777]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:52.094 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-124757ba-89c0-4223-923e-3c1484eeaccd
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/124757ba-89c0-4223-923e-3c1484eeaccd.pid.haproxy
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 124757ba-89c0-4223-923e-3c1484eeaccd
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:40:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:40:52.095 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd', 'env', 'PROCESS_TAG=haproxy-124757ba-89c0-4223-923e-3c1484eeaccd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/124757ba-89c0-4223-923e-3c1484eeaccd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:40:52 np0005603623 nova_compute[226235]: 2026-01-31 08:40:52.095 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:52 np0005603623 nova_compute[226235]: 2026-01-31 08:40:52.196 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848852.195975, 1f710a83-a591-405d-a7f3-d80ab66f0b94 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:40:52 np0005603623 nova_compute[226235]: 2026-01-31 08:40:52.196 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] VM Started (Lifecycle Event)#033[00m
Jan 31 03:40:52 np0005603623 nova_compute[226235]: 2026-01-31 08:40:52.368 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:40:52 np0005603623 nova_compute[226235]: 2026-01-31 08:40:52.373 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848852.1960454, 1f710a83-a591-405d-a7f3-d80ab66f0b94 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:40:52 np0005603623 nova_compute[226235]: 2026-01-31 08:40:52.373 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:40:52 np0005603623 podman[296512]: 2026-01-31 08:40:52.459078305 +0000 UTC m=+0.098747079 container create 798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:40:52 np0005603623 podman[296512]: 2026-01-31 08:40:52.381550283 +0000 UTC m=+0.021219077 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:40:52 np0005603623 systemd[1]: Started libpod-conmon-798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96.scope.
Jan 31 03:40:52 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:40:52 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f75db3003074366805df65d93d98ed43e9fa51d133c14318003bc7f2668e5c1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:40:52 np0005603623 podman[296512]: 2026-01-31 08:40:52.52302371 +0000 UTC m=+0.162692504 container init 798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:40:52 np0005603623 podman[296512]: 2026-01-31 08:40:52.527169131 +0000 UTC m=+0.166837905 container start 798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:40:52 np0005603623 nova_compute[226235]: 2026-01-31 08:40:52.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:52 np0005603623 neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd[296530]: [NOTICE]   (296534) : New worker (296536) forked
Jan 31 03:40:52 np0005603623 neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd[296530]: [NOTICE]   (296534) : Loading success.
Jan 31 03:40:52 np0005603623 nova_compute[226235]: 2026-01-31 08:40:52.852 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:40:52 np0005603623 nova_compute[226235]: 2026-01-31 08:40:52.856 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:40:53 np0005603623 nova_compute[226235]: 2026-01-31 08:40:53.019 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:40:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:53.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:53.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.193 226239 INFO nova.compute.manager [None req-81b5d909-9b7c-4010-9a8a-a605504635f9 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Pausing#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.194 226239 DEBUG nova.objects.instance [None req-81b5d909-9b7c-4010-9a8a-a605504635f9 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'flavor' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.231 226239 DEBUG nova.compute.manager [req-1100b4d9-3e0b-40dd-9d64-29bd5dc57d7c req-e098a9a5-82fd-4928-a89f-a136ab1360fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Received event network-vif-plugged-b8b7c630-442b-4a07-8813-bf635675e452 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.231 226239 DEBUG oslo_concurrency.lockutils [req-1100b4d9-3e0b-40dd-9d64-29bd5dc57d7c req-e098a9a5-82fd-4928-a89f-a136ab1360fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.231 226239 DEBUG oslo_concurrency.lockutils [req-1100b4d9-3e0b-40dd-9d64-29bd5dc57d7c req-e098a9a5-82fd-4928-a89f-a136ab1360fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.232 226239 DEBUG oslo_concurrency.lockutils [req-1100b4d9-3e0b-40dd-9d64-29bd5dc57d7c req-e098a9a5-82fd-4928-a89f-a136ab1360fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.232 226239 DEBUG nova.compute.manager [req-1100b4d9-3e0b-40dd-9d64-29bd5dc57d7c req-e098a9a5-82fd-4928-a89f-a136ab1360fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Processing event network-vif-plugged-b8b7c630-442b-4a07-8813-bf635675e452 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.233 226239 DEBUG nova.compute.manager [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.236 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848854.2367072, 1f710a83-a591-405d-a7f3-d80ab66f0b94 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.237 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.238 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.241 226239 INFO nova.virt.libvirt.driver [-] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Instance spawned successfully.#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.241 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.459 226239 DEBUG nova.compute.manager [None req-81b5d909-9b7c-4010-9a8a-a605504635f9 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.532 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.537 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.541 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.541 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.542 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.542 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.543 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.543 226239 DEBUG nova.virt.libvirt.driver [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.870 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.871 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848854.4595404, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.871 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.901 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:54 np0005603623 nova_compute[226235]: 2026-01-31 08:40:54.902 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:40:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e347 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:55 np0005603623 nova_compute[226235]: 2026-01-31 08:40:55.109 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:55 np0005603623 nova_compute[226235]: 2026-01-31 08:40:55.212 226239 INFO nova.compute.manager [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Took 22.12 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:40:55 np0005603623 nova_compute[226235]: 2026-01-31 08:40:55.213 226239 DEBUG nova.compute.manager [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:40:55 np0005603623 nova_compute[226235]: 2026-01-31 08:40:55.504 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:40:55 np0005603623 nova_compute[226235]: 2026-01-31 08:40:55.508 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: paused, current task_state: None, current DB power_state: 3, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:40:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:55.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:55.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e348 e348: 3 total, 3 up, 3 in
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.326 226239 INFO nova.compute.manager [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Took 25.94 seconds to build instance.#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.450 226239 DEBUG nova.compute.manager [req-3b708d73-5094-41e2-9b88-f4d4c8e80932 req-4da04260-9b04-4131-a8b4-8ff817b516a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Received event network-vif-plugged-b8b7c630-442b-4a07-8813-bf635675e452 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.451 226239 DEBUG oslo_concurrency.lockutils [req-3b708d73-5094-41e2-9b88-f4d4c8e80932 req-4da04260-9b04-4131-a8b4-8ff817b516a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.451 226239 DEBUG oslo_concurrency.lockutils [req-3b708d73-5094-41e2-9b88-f4d4c8e80932 req-4da04260-9b04-4131-a8b4-8ff817b516a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.452 226239 DEBUG oslo_concurrency.lockutils [req-3b708d73-5094-41e2-9b88-f4d4c8e80932 req-4da04260-9b04-4131-a8b4-8ff817b516a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.452 226239 DEBUG nova.compute.manager [req-3b708d73-5094-41e2-9b88-f4d4c8e80932 req-4da04260-9b04-4131-a8b4-8ff817b516a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] No waiting events found dispatching network-vif-plugged-b8b7c630-442b-4a07-8813-bf635675e452 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.452 226239 WARNING nova.compute.manager [req-3b708d73-5094-41e2-9b88-f4d4c8e80932 req-4da04260-9b04-4131-a8b4-8ff817b516a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Received unexpected event network-vif-plugged-b8b7c630-442b-4a07-8813-bf635675e452 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.531 226239 DEBUG oslo_concurrency.lockutils [None req-cb8533d5-158d-4d77-8edc-69d9d7a6c169 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.540s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.729 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.730 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.730 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:40:56 np0005603623 nova_compute[226235]: 2026-01-31 08:40:56.731 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:40:57 np0005603623 nova_compute[226235]: 2026-01-31 08:40:57.538 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:40:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:57.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:40:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:57.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:57 np0005603623 nova_compute[226235]: 2026-01-31 08:40:57.969 226239 INFO nova.compute.manager [None req-e3f171d5-26f2-4309-b3a8-11194c4ad469 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Unpausing#033[00m
Jan 31 03:40:57 np0005603623 nova_compute[226235]: 2026-01-31 08:40:57.970 226239 DEBUG nova.objects.instance [None req-e3f171d5-26f2-4309-b3a8-11194c4ad469 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'flavor' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:40:58 np0005603623 nova_compute[226235]: 2026-01-31 08:40:58.150 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848858.1502273, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:40:58 np0005603623 nova_compute[226235]: 2026-01-31 08:40:58.150 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:40:58 np0005603623 virtqemud[225858]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:40:58 np0005603623 nova_compute[226235]: 2026-01-31 08:40:58.155 226239 DEBUG nova.virt.libvirt.guest [None req-e3f171d5-26f2-4309-b3a8-11194c4ad469 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:40:58 np0005603623 nova_compute[226235]: 2026-01-31 08:40:58.155 226239 DEBUG nova.compute.manager [None req-e3f171d5-26f2-4309-b3a8-11194c4ad469 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:40:58 np0005603623 nova_compute[226235]: 2026-01-31 08:40:58.246 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:40:58 np0005603623 nova_compute[226235]: 2026-01-31 08:40:58.250 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:40:58 np0005603623 nova_compute[226235]: 2026-01-31 08:40:58.291 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Jan 31 03:40:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:59.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:40:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:59.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e348 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.111 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.277 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.283 226239 DEBUG oslo_concurrency.lockutils [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Acquiring lock "1f710a83-a591-405d-a7f3-d80ab66f0b94" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.284 226239 DEBUG oslo_concurrency.lockutils [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.284 226239 DEBUG oslo_concurrency.lockutils [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Acquiring lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.285 226239 DEBUG oslo_concurrency.lockutils [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.285 226239 DEBUG oslo_concurrency.lockutils [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.286 226239 INFO nova.compute.manager [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Terminating instance#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.287 226239 DEBUG nova.compute.manager [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.313 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.313 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.314 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.314 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.361 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.362 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.362 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.362 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.363 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:41:00 np0005603623 kernel: tapb8b7c630-44 (unregistering): left promiscuous mode
Jan 31 03:41:00 np0005603623 NetworkManager[48970]: <info>  [1769848860.4719] device (tapb8b7c630-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.479 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:00 np0005603623 ovn_controller[133449]: 2026-01-31T08:41:00Z|00636|binding|INFO|Releasing lport b8b7c630-442b-4a07-8813-bf635675e452 from this chassis (sb_readonly=0)
Jan 31 03:41:00 np0005603623 ovn_controller[133449]: 2026-01-31T08:41:00Z|00637|binding|INFO|Setting lport b8b7c630-442b-4a07-8813-bf635675e452 down in Southbound
Jan 31 03:41:00 np0005603623 ovn_controller[133449]: 2026-01-31T08:41:00Z|00638|binding|INFO|Removing iface tapb8b7c630-44 ovn-installed in OVS
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.487 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:00 np0005603623 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000096.scope: Deactivated successfully.
Jan 31 03:41:00 np0005603623 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000096.scope: Consumed 6.562s CPU time.
Jan 31 03:41:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:00.506 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:3e:ef 10.100.0.5'], port_security=['fa:16:3e:2e:3e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '1f710a83-a591-405d-a7f3-d80ab66f0b94', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-124757ba-89c0-4223-923e-3c1484eeaccd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0026cad467ff4524ae5b675192e66666', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3dfdb701-4561-4357-a441-3d1ad22e5ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6e19354-a3d1-4f6a-90e8-0e5a842c6dca, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=b8b7c630-442b-4a07-8813-bf635675e452) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:41:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:00.507 143258 INFO neutron.agent.ovn.metadata.agent [-] Port b8b7c630-442b-4a07-8813-bf635675e452 in datapath 124757ba-89c0-4223-923e-3c1484eeaccd unbound from our chassis#033[00m
Jan 31 03:41:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:00.510 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 124757ba-89c0-4223-923e-3c1484eeaccd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:41:00 np0005603623 systemd-machined[194379]: Machine qemu-72-instance-00000096 terminated.
Jan 31 03:41:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:00.512 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[acebd7e6-1aa4-4926-9d62-6de1caa7a5db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:00.513 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd namespace which is not needed anymore#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.721 226239 INFO nova.virt.libvirt.driver [-] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Instance destroyed successfully.#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.722 226239 DEBUG nova.objects.instance [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lazy-loading 'resources' on Instance uuid 1f710a83-a591-405d-a7f3-d80ab66f0b94 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:41:00 np0005603623 neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd[296530]: [NOTICE]   (296534) : haproxy version is 2.8.14-c23fe91
Jan 31 03:41:00 np0005603623 neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd[296530]: [NOTICE]   (296534) : path to executable is /usr/sbin/haproxy
Jan 31 03:41:00 np0005603623 neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd[296530]: [WARNING]  (296534) : Exiting Master process...
Jan 31 03:41:00 np0005603623 neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd[296530]: [WARNING]  (296534) : Exiting Master process...
Jan 31 03:41:00 np0005603623 neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd[296530]: [ALERT]    (296534) : Current worker (296536) exited with code 143 (Terminated)
Jan 31 03:41:00 np0005603623 neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd[296530]: [WARNING]  (296534) : All workers exited. Exiting... (0)
Jan 31 03:41:00 np0005603623 systemd[1]: libpod-798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96.scope: Deactivated successfully.
Jan 31 03:41:00 np0005603623 podman[296643]: 2026-01-31 08:41:00.735555204 +0000 UTC m=+0.155715416 container died 798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.762 226239 DEBUG nova.virt.libvirt.vif [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:40:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-221933362',display_name='tempest-ServerAddressesTestJSON-server-221933362',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-221933362',id=150,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:40:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0026cad467ff4524ae5b675192e66666',ramdisk_id='',reservation_id='r-0ck9pueu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesTestJSON-254892063',owner_user_name='tempest-ServerAddressesTestJSON-254892063-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:40:56Z,user_data=None,user_id='c6c60cceb0414556a99e4009200ad565',uuid=1f710a83-a591-405d-a7f3-d80ab66f0b94,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b8b7c630-442b-4a07-8813-bf635675e452", "address": "fa:16:3e:2e:3e:ef", "network": {"id": "124757ba-89c0-4223-923e-3c1484eeaccd", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-847601166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0026cad467ff4524ae5b675192e66666", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8b7c630-44", "ovs_interfaceid": "b8b7c630-442b-4a07-8813-bf635675e452", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.763 226239 DEBUG nova.network.os_vif_util [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Converting VIF {"id": "b8b7c630-442b-4a07-8813-bf635675e452", "address": "fa:16:3e:2e:3e:ef", "network": {"id": "124757ba-89c0-4223-923e-3c1484eeaccd", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-847601166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0026cad467ff4524ae5b675192e66666", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8b7c630-44", "ovs_interfaceid": "b8b7c630-442b-4a07-8813-bf635675e452", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.763 226239 DEBUG nova.network.os_vif_util [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:3e:ef,bridge_name='br-int',has_traffic_filtering=True,id=b8b7c630-442b-4a07-8813-bf635675e452,network=Network(124757ba-89c0-4223-923e-3c1484eeaccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8b7c630-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.764 226239 DEBUG os_vif [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:3e:ef,bridge_name='br-int',has_traffic_filtering=True,id=b8b7c630-442b-4a07-8813-bf635675e452,network=Network(124757ba-89c0-4223-923e-3c1484eeaccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8b7c630-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.765 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.766 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8b7c630-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.767 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.769 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:00 np0005603623 nova_compute[226235]: 2026-01-31 08:41:00.772 226239 INFO os_vif [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:3e:ef,bridge_name='br-int',has_traffic_filtering=True,id=b8b7c630-442b-4a07-8813-bf635675e452,network=Network(124757ba-89c0-4223-923e-3c1484eeaccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8b7c630-44')#033[00m
Jan 31 03:41:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:41:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1762102758' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.041 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.678s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:41:01 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96-userdata-shm.mount: Deactivated successfully.
Jan 31 03:41:01 np0005603623 systemd[1]: var-lib-containers-storage-overlay-3f75db3003074366805df65d93d98ed43e9fa51d133c14318003bc7f2668e5c1-merged.mount: Deactivated successfully.
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.162 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000094 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.162 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000094 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.164 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000096 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.164 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000096 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.177 226239 DEBUG nova.compute.manager [req-a8728aac-5fa8-4c88-9662-0b6db64784f6 req-1162d193-0727-4459-b4fe-75bd6f98d438 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Received event network-vif-unplugged-b8b7c630-442b-4a07-8813-bf635675e452 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.178 226239 DEBUG oslo_concurrency.lockutils [req-a8728aac-5fa8-4c88-9662-0b6db64784f6 req-1162d193-0727-4459-b4fe-75bd6f98d438 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.178 226239 DEBUG oslo_concurrency.lockutils [req-a8728aac-5fa8-4c88-9662-0b6db64784f6 req-1162d193-0727-4459-b4fe-75bd6f98d438 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.179 226239 DEBUG oslo_concurrency.lockutils [req-a8728aac-5fa8-4c88-9662-0b6db64784f6 req-1162d193-0727-4459-b4fe-75bd6f98d438 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.179 226239 DEBUG nova.compute.manager [req-a8728aac-5fa8-4c88-9662-0b6db64784f6 req-1162d193-0727-4459-b4fe-75bd6f98d438 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] No waiting events found dispatching network-vif-unplugged-b8b7c630-442b-4a07-8813-bf635675e452 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.179 226239 DEBUG nova.compute.manager [req-a8728aac-5fa8-4c88-9662-0b6db64784f6 req-1162d193-0727-4459-b4fe-75bd6f98d438 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Received event network-vif-unplugged-b8b7c630-442b-4a07-8813-bf635675e452 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.316 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.317 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4074MB free_disk=20.785202026367188GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.317 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.317 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.404 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance bd87e542-0f7b-453e-b8d1-643ad6fb64f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.404 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 1f710a83-a591-405d-a7f3-d80ab66f0b94 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.405 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.405 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:41:01 np0005603623 podman[296643]: 2026-01-31 08:41:01.411938652 +0000 UTC m=+0.832098854 container cleanup 798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:41:01 np0005603623 systemd[1]: libpod-conmon-798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96.scope: Deactivated successfully.
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.432 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.469 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.469 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.492 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.530 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.631 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:41:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:01.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:41:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:01.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:41:01 np0005603623 podman[296706]: 2026-01-31 08:41:01.760783676 +0000 UTC m=+0.332549073 container remove 798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:41:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:01.766 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[82e90282-52a8-42c9-929d-bdde55b4020c]: (4, ('Sat Jan 31 08:41:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd (798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96)\n798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96\nSat Jan 31 08:41:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd (798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96)\n798db1aaf9cf1ec6985a06a08e7c10e1e438bfd4c3e7afefb34a888186fdbd96\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:01.768 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e87bd3-128a-4fb8-b65a-4fdbdd66d673]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:01.769 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap124757ba-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.771 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:01 np0005603623 kernel: tap124757ba-80: left promiscuous mode
Jan 31 03:41:01 np0005603623 nova_compute[226235]: 2026-01-31 08:41:01.776 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:01.778 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[289cbb20-3176-4507-ad3b-b4bccbea3d90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:01.797 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3e2a8efc-422a-49cf-9280-6237059f53a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:01.798 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[121f6459-056e-407d-b3a9-935a18c96f5c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:01.808 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce1a637-b83b-452b-955a-d4d3ef6b8d98]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 799236, 'reachable_time': 37835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296742, 'error': None, 'target': 'ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:01 np0005603623 systemd[1]: run-netns-ovnmeta\x2d124757ba\x2d89c0\x2d4223\x2d923e\x2d3c1484eeaccd.mount: Deactivated successfully.
Jan 31 03:41:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:01.813 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-124757ba-89c0-4223-923e-3c1484eeaccd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:41:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:01.813 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[0b16beb0-26a2-441e-884f-233f7f24198b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:41:02 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2128459435' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:41:02 np0005603623 nova_compute[226235]: 2026-01-31 08:41:02.062 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:41:02 np0005603623 nova_compute[226235]: 2026-01-31 08:41:02.066 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:41:02 np0005603623 nova_compute[226235]: 2026-01-31 08:41:02.116 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:41:02 np0005603623 nova_compute[226235]: 2026-01-31 08:41:02.249 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:41:02 np0005603623 nova_compute[226235]: 2026-01-31 08:41:02.250 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.933s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:03.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:03.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:03 np0005603623 nova_compute[226235]: 2026-01-31 08:41:03.998 226239 DEBUG nova.compute.manager [req-f5c23262-e3b1-4496-85f9-db0ed91a95d2 req-575a8460-af06-4c32-8f81-7a3201c255f0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Received event network-vif-plugged-b8b7c630-442b-4a07-8813-bf635675e452 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:41:03 np0005603623 nova_compute[226235]: 2026-01-31 08:41:03.998 226239 DEBUG oslo_concurrency.lockutils [req-f5c23262-e3b1-4496-85f9-db0ed91a95d2 req-575a8460-af06-4c32-8f81-7a3201c255f0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:03 np0005603623 nova_compute[226235]: 2026-01-31 08:41:03.998 226239 DEBUG oslo_concurrency.lockutils [req-f5c23262-e3b1-4496-85f9-db0ed91a95d2 req-575a8460-af06-4c32-8f81-7a3201c255f0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:03 np0005603623 nova_compute[226235]: 2026-01-31 08:41:03.998 226239 DEBUG oslo_concurrency.lockutils [req-f5c23262-e3b1-4496-85f9-db0ed91a95d2 req-575a8460-af06-4c32-8f81-7a3201c255f0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:03 np0005603623 nova_compute[226235]: 2026-01-31 08:41:03.999 226239 DEBUG nova.compute.manager [req-f5c23262-e3b1-4496-85f9-db0ed91a95d2 req-575a8460-af06-4c32-8f81-7a3201c255f0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] No waiting events found dispatching network-vif-plugged-b8b7c630-442b-4a07-8813-bf635675e452 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:41:03 np0005603623 nova_compute[226235]: 2026-01-31 08:41:03.999 226239 WARNING nova.compute.manager [req-f5c23262-e3b1-4496-85f9-db0ed91a95d2 req-575a8460-af06-4c32-8f81-7a3201c255f0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Received unexpected event network-vif-plugged-b8b7c630-442b-4a07-8813-bf635675e452 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:41:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e348 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:05 np0005603623 nova_compute[226235]: 2026-01-31 08:41:05.091 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:05 np0005603623 nova_compute[226235]: 2026-01-31 08:41:05.092 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:05 np0005603623 nova_compute[226235]: 2026-01-31 08:41:05.092 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:05 np0005603623 nova_compute[226235]: 2026-01-31 08:41:05.113 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:05 np0005603623 nova_compute[226235]: 2026-01-31 08:41:05.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:05.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:05.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:05 np0005603623 nova_compute[226235]: 2026-01-31 08:41:05.768 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:41:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:41:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:41:07 np0005603623 nova_compute[226235]: 2026-01-31 08:41:07.370 226239 INFO nova.virt.libvirt.driver [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Deleting instance files /var/lib/nova/instances/1f710a83-a591-405d-a7f3-d80ab66f0b94_del#033[00m
Jan 31 03:41:07 np0005603623 nova_compute[226235]: 2026-01-31 08:41:07.370 226239 INFO nova.virt.libvirt.driver [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Deletion of /var/lib/nova/instances/1f710a83-a591-405d-a7f3-d80ab66f0b94_del complete#033[00m
Jan 31 03:41:07 np0005603623 nova_compute[226235]: 2026-01-31 08:41:07.485 226239 INFO nova.compute.manager [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Took 7.20 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:41:07 np0005603623 nova_compute[226235]: 2026-01-31 08:41:07.486 226239 DEBUG oslo.service.loopingcall [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:41:07 np0005603623 nova_compute[226235]: 2026-01-31 08:41:07.486 226239 DEBUG nova.compute.manager [-] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:41:07 np0005603623 nova_compute[226235]: 2026-01-31 08:41:07.487 226239 DEBUG nova.network.neutron [-] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:41:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e349 e349: 3 total, 3 up, 3 in
Jan 31 03:41:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:07.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:07.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:08 np0005603623 nova_compute[226235]: 2026-01-31 08:41:08.639 226239 DEBUG nova.network.neutron [-] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:41:08 np0005603623 nova_compute[226235]: 2026-01-31 08:41:08.734 226239 INFO nova.compute.manager [-] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Took 1.25 seconds to deallocate network for instance.#033[00m
Jan 31 03:41:08 np0005603623 nova_compute[226235]: 2026-01-31 08:41:08.787 226239 DEBUG nova.compute.manager [req-3fa091b7-07f6-46c9-874e-2e7f5820364c req-a1a5d00e-1751-4da1-b068-b62af58d4d2e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Received event network-vif-deleted-b8b7c630-442b-4a07-8813-bf635675e452 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:41:08 np0005603623 nova_compute[226235]: 2026-01-31 08:41:08.900 226239 DEBUG oslo_concurrency.lockutils [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:08 np0005603623 nova_compute[226235]: 2026-01-31 08:41:08.901 226239 DEBUG oslo_concurrency.lockutils [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:09 np0005603623 nova_compute[226235]: 2026-01-31 08:41:09.006 226239 DEBUG oslo_concurrency.processutils [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:41:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:41:09 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1689424137' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:41:09 np0005603623 nova_compute[226235]: 2026-01-31 08:41:09.407 226239 DEBUG oslo_concurrency.processutils [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:41:09 np0005603623 nova_compute[226235]: 2026-01-31 08:41:09.413 226239 DEBUG nova.compute.provider_tree [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:41:09 np0005603623 nova_compute[226235]: 2026-01-31 08:41:09.452 226239 DEBUG nova.scheduler.client.report [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:41:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:09.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:09.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:10 np0005603623 nova_compute[226235]: 2026-01-31 08:41:10.041 226239 DEBUG oslo_concurrency.lockutils [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:10 np0005603623 nova_compute[226235]: 2026-01-31 08:41:10.113 226239 INFO nova.scheduler.client.report [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Deleted allocations for instance 1f710a83-a591-405d-a7f3-d80ab66f0b94#033[00m
Jan 31 03:41:10 np0005603623 nova_compute[226235]: 2026-01-31 08:41:10.115 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:10 np0005603623 nova_compute[226235]: 2026-01-31 08:41:10.378 226239 DEBUG oslo_concurrency.lockutils [None req-a0210aae-dae7-4228-9c2f-076c1b36bcb8 c6c60cceb0414556a99e4009200ad565 0026cad467ff4524ae5b675192e66666 - - default default] Lock "1f710a83-a591-405d-a7f3-d80ab66f0b94" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:10 np0005603623 nova_compute[226235]: 2026-01-31 08:41:10.771 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:11.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:11.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:12 np0005603623 nova_compute[226235]: 2026-01-31 08:41:12.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:12 np0005603623 nova_compute[226235]: 2026-01-31 08:41:12.189 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:13 np0005603623 podman[296979]: 2026-01-31 08:41:13.166345258 +0000 UTC m=+0.052338413 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_managed=true)
Jan 31 03:41:13 np0005603623 podman[296980]: 2026-01-31 08:41:13.211307199 +0000 UTC m=+0.097314535 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 31 03:41:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:41:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:41:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:13.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:13.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:41:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/626549870' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:41:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:41:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/626549870' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:41:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:15 np0005603623 nova_compute[226235]: 2026-01-31 08:41:15.117 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:15.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:15 np0005603623 nova_compute[226235]: 2026-01-31 08:41:15.717 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848860.7169557, 1f710a83-a591-405d-a7f3-d80ab66f0b94 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:41:15 np0005603623 nova_compute[226235]: 2026-01-31 08:41:15.718 226239 INFO nova.compute.manager [-] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:41:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:15.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:15 np0005603623 nova_compute[226235]: 2026-01-31 08:41:15.773 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e350 e350: 3 total, 3 up, 3 in
Jan 31 03:41:15 np0005603623 nova_compute[226235]: 2026-01-31 08:41:15.894 226239 DEBUG nova.compute.manager [None req-59e69b05-8eb6-4f3a-b611-f3ba6aa6961c - - - - - -] [instance: 1f710a83-a591-405d-a7f3-d80ab66f0b94] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #130. Immutable memtables: 0.
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:15.923427) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 81] Flushing memtable with next log file: 130
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848875923498, "job": 81, "event": "flush_started", "num_memtables": 1, "num_entries": 1581, "num_deletes": 258, "total_data_size": 3487193, "memory_usage": 3544752, "flush_reason": "Manual Compaction"}
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 81] Level-0 flush table #131: started
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848875949846, "cf_name": "default", "job": 81, "event": "table_file_creation", "file_number": 131, "file_size": 2277783, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 63837, "largest_seqno": 65412, "table_properties": {"data_size": 2271038, "index_size": 3815, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 14575, "raw_average_key_size": 20, "raw_value_size": 2257330, "raw_average_value_size": 3126, "num_data_blocks": 168, "num_entries": 722, "num_filter_entries": 722, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848753, "oldest_key_time": 1769848753, "file_creation_time": 1769848875, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 131, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 81] Flush lasted 26478 microseconds, and 4227 cpu microseconds.
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:15.949902) [db/flush_job.cc:967] [default] [JOB 81] Level-0 flush table #131: 2277783 bytes OK
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:15.949926) [db/memtable_list.cc:519] [default] Level-0 commit table #131 started
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:15.951816) [db/memtable_list.cc:722] [default] Level-0 commit table #131: memtable #1 done
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:15.951829) EVENT_LOG_v1 {"time_micros": 1769848875951825, "job": 81, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:15.951846) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 81] Try to delete WAL files size 3479816, prev total WAL file size 3479816, number of live WAL files 2.
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000127.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:15.952411) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032323633' seq:72057594037927935, type:22 .. '6C6F676D0032353134' seq:0, type:0; will stop at (end)
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 82] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 81 Base level 0, inputs: [131(2224KB)], [129(9563KB)]
Jan 31 03:41:15 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848875952569, "job": 82, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [131], "files_L6": [129], "score": -1, "input_data_size": 12070328, "oldest_snapshot_seqno": -1}
Jan 31 03:41:16 np0005603623 nova_compute[226235]: 2026-01-31 08:41:16.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:16 np0005603623 nova_compute[226235]: 2026-01-31 08:41:16.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 82] Generated table #132: 8623 keys, 11926740 bytes, temperature: kUnknown
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848876188750, "cf_name": "default", "job": 82, "event": "table_file_creation", "file_number": 132, "file_size": 11926740, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11870072, "index_size": 34027, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21573, "raw_key_size": 227226, "raw_average_key_size": 26, "raw_value_size": 11717750, "raw_average_value_size": 1358, "num_data_blocks": 1307, "num_entries": 8623, "num_filter_entries": 8623, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769848875, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 132, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:16.189050) [db/compaction/compaction_job.cc:1663] [default] [JOB 82] Compacted 1@0 + 1@6 files to L6 => 11926740 bytes
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:16.196090) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 51.1 rd, 50.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 9.3 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(10.5) write-amplify(5.2) OK, records in: 9155, records dropped: 532 output_compression: NoCompression
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:16.196137) EVENT_LOG_v1 {"time_micros": 1769848876196119, "job": 82, "event": "compaction_finished", "compaction_time_micros": 236240, "compaction_time_cpu_micros": 27069, "output_level": 6, "num_output_files": 1, "total_output_size": 11926740, "num_input_records": 9155, "num_output_records": 8623, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000131.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848876196519, "job": 82, "event": "table_file_deletion", "file_number": 131}
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000129.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848876197397, "job": 82, "event": "table_file_deletion", "file_number": 129}
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:15.952279) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:16.197465) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:16.197470) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:16.197472) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:16.197475) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:41:16 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:41:16.197480) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:41:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:17.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:17.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e351 e351: 3 total, 3 up, 3 in
Jan 31 03:41:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:19.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:19.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e351 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:20 np0005603623 nova_compute[226235]: 2026-01-31 08:41:20.119 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:41:20Z|00639|binding|INFO|Releasing lport 9f1ac82b-bf6c-400f-a03c-b15ad5392890 from this chassis (sb_readonly=0)
Jan 31 03:41:20 np0005603623 nova_compute[226235]: 2026-01-31 08:41:20.263 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:20 np0005603623 nova_compute[226235]: 2026-01-31 08:41:20.775 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:21.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:21.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:23.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:23.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e351 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:25 np0005603623 nova_compute[226235]: 2026-01-31 08:41:25.121 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:25.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:25.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:25 np0005603623 nova_compute[226235]: 2026-01-31 08:41:25.778 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e352 e352: 3 total, 3 up, 3 in
Jan 31 03:41:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:27.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:27.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:29.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:29.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e352 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:30 np0005603623 nova_compute[226235]: 2026-01-31 08:41:30.122 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:30.133 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:30.133 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:30.134 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:30 np0005603623 nova_compute[226235]: 2026-01-31 08:41:30.780 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:31.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:31.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:32 np0005603623 nova_compute[226235]: 2026-01-31 08:41:32.698 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:32 np0005603623 nova_compute[226235]: 2026-01-31 08:41:32.699 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:41:32 np0005603623 nova_compute[226235]: 2026-01-31 08:41:32.864 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:41:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:33.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:33.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e352 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:35 np0005603623 nova_compute[226235]: 2026-01-31 08:41:35.123 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:35.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:41:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:35.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:41:35 np0005603623 nova_compute[226235]: 2026-01-31 08:41:35.781 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:36 np0005603623 nova_compute[226235]: 2026-01-31 08:41:36.104 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:36 np0005603623 nova_compute[226235]: 2026-01-31 08:41:36.374 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Triggering sync for uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:41:36 np0005603623 nova_compute[226235]: 2026-01-31 08:41:36.374 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:36 np0005603623 nova_compute[226235]: 2026-01-31 08:41:36.375 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:37 np0005603623 nova_compute[226235]: 2026-01-31 08:41:37.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:37 np0005603623 nova_compute[226235]: 2026-01-31 08:41:37.157 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:37.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:37.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:39.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:39.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e352 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:40 np0005603623 nova_compute[226235]: 2026-01-31 08:41:40.126 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:40 np0005603623 nova_compute[226235]: 2026-01-31 08:41:40.782 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:41.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:41.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e353 e353: 3 total, 3 up, 3 in
Jan 31 03:41:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:43.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:43.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:43 np0005603623 podman[297116]: 2026-01-31 08:41:43.949314852 +0000 UTC m=+0.039005974 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:41:43 np0005603623 podman[297117]: 2026-01-31 08:41:43.982614717 +0000 UTC m=+0.068967105 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 03:41:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:44.894 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:41:44 np0005603623 nova_compute[226235]: 2026-01-31 08:41:44.894 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:44.896 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:41:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:45 np0005603623 nova_compute[226235]: 2026-01-31 08:41:45.128 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:45.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:45.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:45 np0005603623 nova_compute[226235]: 2026-01-31 08:41:45.784 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:47.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:47.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:49.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:49.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:50 np0005603623 nova_compute[226235]: 2026-01-31 08:41:50.129 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:50 np0005603623 nova_compute[226235]: 2026-01-31 08:41:50.787 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:50.899 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:41:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e354 e354: 3 total, 3 up, 3 in
Jan 31 03:41:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:51.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:51.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:52 np0005603623 nova_compute[226235]: 2026-01-31 08:41:52.301 226239 DEBUG oslo_concurrency.lockutils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:52 np0005603623 nova_compute[226235]: 2026-01-31 08:41:52.301 226239 DEBUG oslo_concurrency.lockutils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:52 np0005603623 nova_compute[226235]: 2026-01-31 08:41:52.301 226239 INFO nova.compute.manager [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Shelving#033[00m
Jan 31 03:41:52 np0005603623 nova_compute[226235]: 2026-01-31 08:41:52.409 226239 DEBUG nova.virt.libvirt.driver [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:41:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:41:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:53.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:41:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:53.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:54 np0005603623 nova_compute[226235]: 2026-01-31 08:41:54.321 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:54 np0005603623 nova_compute[226235]: 2026-01-31 08:41:54.322 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:41:55 np0005603623 nova_compute[226235]: 2026-01-31 08:41:55.131 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:55 np0005603623 nova_compute[226235]: 2026-01-31 08:41:55.426 226239 INFO nova.virt.libvirt.driver [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:41:55 np0005603623 kernel: tapb122a11a-5b (unregistering): left promiscuous mode
Jan 31 03:41:55 np0005603623 NetworkManager[48970]: <info>  [1769848915.6227] device (tapb122a11a-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:41:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:41:55Z|00640|binding|INFO|Releasing lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae from this chassis (sb_readonly=0)
Jan 31 03:41:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:41:55Z|00641|binding|INFO|Setting lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae down in Southbound
Jan 31 03:41:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:41:55Z|00642|binding|INFO|Removing iface tapb122a11a-5b ovn-installed in OVS
Jan 31 03:41:55 np0005603623 nova_compute[226235]: 2026-01-31 08:41:55.628 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:55 np0005603623 nova_compute[226235]: 2026-01-31 08:41:55.638 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:55 np0005603623 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000094.scope: Deactivated successfully.
Jan 31 03:41:55 np0005603623 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000094.scope: Consumed 18.961s CPU time.
Jan 31 03:41:55 np0005603623 systemd-machined[194379]: Machine qemu-69-instance-00000094 terminated.
Jan 31 03:41:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:41:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:55.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:41:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:55.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:55 np0005603623 nova_compute[226235]: 2026-01-31 08:41:55.788 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:55.837 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:83:50 10.100.0.11'], port_security=['fa:16:3e:c1:83:50 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bd87e542-0f7b-453e-b8d1-643ad6fb64f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '621c17d53cba46d386de8efb560a988e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1c8dcf47-c169-4871-843e-ae38c0fc69f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bda2ce92-ce79-4f8b-b120-fd83adc645ef, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=b122a11a-5b9d-4b27-a9c3-8327cb8162ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:41:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:55.839 143258 INFO neutron.agent.ovn.metadata.agent [-] Port b122a11a-5b9d-4b27-a9c3-8327cb8162ae in datapath 550cf3a2-62ab-424d-afc0-3148a4a687ee unbound from our chassis#033[00m
Jan 31 03:41:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:55.840 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 550cf3a2-62ab-424d-afc0-3148a4a687ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:41:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:55.841 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[22a2899c-1c72-4c32-ae1b-fec2cd557e9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:55.841 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee namespace which is not needed anymore#033[00m
Jan 31 03:41:55 np0005603623 nova_compute[226235]: 2026-01-31 08:41:55.860 226239 INFO nova.virt.libvirt.driver [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance destroyed successfully.#033[00m
Jan 31 03:41:55 np0005603623 nova_compute[226235]: 2026-01-31 08:41:55.860 226239 DEBUG nova.objects.instance [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'numa_topology' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:41:56 np0005603623 nova_compute[226235]: 2026-01-31 08:41:56.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:56 np0005603623 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[294228]: [NOTICE]   (294232) : haproxy version is 2.8.14-c23fe91
Jan 31 03:41:56 np0005603623 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[294228]: [NOTICE]   (294232) : path to executable is /usr/sbin/haproxy
Jan 31 03:41:56 np0005603623 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[294228]: [WARNING]  (294232) : Exiting Master process...
Jan 31 03:41:56 np0005603623 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[294228]: [ALERT]    (294232) : Current worker (294234) exited with code 143 (Terminated)
Jan 31 03:41:56 np0005603623 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[294228]: [WARNING]  (294232) : All workers exited. Exiting... (0)
Jan 31 03:41:56 np0005603623 systemd[1]: libpod-cad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292.scope: Deactivated successfully.
Jan 31 03:41:56 np0005603623 podman[297253]: 2026-01-31 08:41:56.23585675 +0000 UTC m=+0.313704943 container died cad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:41:56 np0005603623 nova_compute[226235]: 2026-01-31 08:41:56.338 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:56 np0005603623 nova_compute[226235]: 2026-01-31 08:41:56.339 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:56 np0005603623 nova_compute[226235]: 2026-01-31 08:41:56.340 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:56 np0005603623 nova_compute[226235]: 2026-01-31 08:41:56.340 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:41:56 np0005603623 nova_compute[226235]: 2026-01-31 08:41:56.341 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:41:56 np0005603623 nova_compute[226235]: 2026-01-31 08:41:56.581 226239 INFO nova.virt.libvirt.driver [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Beginning cold snapshot process#033[00m
Jan 31 03:41:56 np0005603623 systemd[1]: var-lib-containers-storage-overlay-f97f863101af7dab351d9b243b175beb9926797e921cc32decf95811d3debceb-merged.mount: Deactivated successfully.
Jan 31 03:41:56 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292-userdata-shm.mount: Deactivated successfully.
Jan 31 03:41:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:41:56 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4043869258' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:41:56 np0005603623 nova_compute[226235]: 2026-01-31 08:41:56.786 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:41:57 np0005603623 nova_compute[226235]: 2026-01-31 08:41:57.409 226239 DEBUG nova.compute.manager [req-28b9b9cf-d305-4158-8b85-41cfd6b71206 req-50969162-ad80-48a8-82df-021f615f8a06 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-unplugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:41:57 np0005603623 nova_compute[226235]: 2026-01-31 08:41:57.409 226239 DEBUG oslo_concurrency.lockutils [req-28b9b9cf-d305-4158-8b85-41cfd6b71206 req-50969162-ad80-48a8-82df-021f615f8a06 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:57 np0005603623 nova_compute[226235]: 2026-01-31 08:41:57.409 226239 DEBUG oslo_concurrency.lockutils [req-28b9b9cf-d305-4158-8b85-41cfd6b71206 req-50969162-ad80-48a8-82df-021f615f8a06 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:57 np0005603623 nova_compute[226235]: 2026-01-31 08:41:57.410 226239 DEBUG oslo_concurrency.lockutils [req-28b9b9cf-d305-4158-8b85-41cfd6b71206 req-50969162-ad80-48a8-82df-021f615f8a06 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:57 np0005603623 nova_compute[226235]: 2026-01-31 08:41:57.410 226239 DEBUG nova.compute.manager [req-28b9b9cf-d305-4158-8b85-41cfd6b71206 req-50969162-ad80-48a8-82df-021f615f8a06 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] No waiting events found dispatching network-vif-unplugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:41:57 np0005603623 nova_compute[226235]: 2026-01-31 08:41:57.410 226239 WARNING nova.compute.manager [req-28b9b9cf-d305-4158-8b85-41cfd6b71206 req-50969162-ad80-48a8-82df-021f615f8a06 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received unexpected event network-vif-unplugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae for instance with vm_state active and task_state shelving.#033[00m
Jan 31 03:41:57 np0005603623 podman[297253]: 2026-01-31 08:41:57.542266793 +0000 UTC m=+1.620114976 container cleanup cad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:41:57 np0005603623 systemd[1]: libpod-conmon-cad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292.scope: Deactivated successfully.
Jan 31 03:41:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:57.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:57.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:57 np0005603623 nova_compute[226235]: 2026-01-31 08:41:57.798 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000094 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:41:57 np0005603623 nova_compute[226235]: 2026-01-31 08:41:57.799 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-00000094 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:41:58 np0005603623 nova_compute[226235]: 2026-01-31 08:41:58.192 226239 DEBUG nova.virt.libvirt.imagebackend [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 31 03:41:58 np0005603623 nova_compute[226235]: 2026-01-31 08:41:58.203 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:41:58 np0005603623 nova_compute[226235]: 2026-01-31 08:41:58.204 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4345MB free_disk=20.922027587890625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:41:58 np0005603623 nova_compute[226235]: 2026-01-31 08:41:58.205 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:58 np0005603623 nova_compute[226235]: 2026-01-31 08:41:58.205 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:58 np0005603623 podman[297308]: 2026-01-31 08:41:58.484364307 +0000 UTC m=+0.927477976 container remove cad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:41:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:58.491 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[650fd9ba-d635-4de6-8987-f6e9571b993c]: (4, ('Sat Jan 31 08:41:55 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee (cad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292)\ncad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292\nSat Jan 31 08:41:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee (cad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292)\ncad769e5ee9c0f97fe418455181a51dc43b786e95552942ce0fcbcc53a004292\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:58.493 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8874d17c-1efe-44d3-b526-cbd304e9552b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:58.494 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap550cf3a2-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:41:58 np0005603623 nova_compute[226235]: 2026-01-31 08:41:58.496 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:58 np0005603623 kernel: tap550cf3a2-60: left promiscuous mode
Jan 31 03:41:58 np0005603623 nova_compute[226235]: 2026-01-31 08:41:58.506 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:58.508 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[46bd959c-3754-4937-ab6b-acd19844ca00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:58.522 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[245b30d6-b8b9-41ae-808b-8640e0af3955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:58.523 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2b65200f-7096-4900-954b-75973725d23f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:58.535 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5863947f-c477-4985-99e3-7f15efc1acf5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786118, 'reachable_time': 44854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297361, 'error': None, 'target': 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:58 np0005603623 systemd[1]: run-netns-ovnmeta\x2d550cf3a2\x2d62ab\x2d424d\x2dafc0\x2d3148a4a687ee.mount: Deactivated successfully.
Jan 31 03:41:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:58.539 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:41:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:41:58.539 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[7e44e58e-fc3e-44cc-8b4e-85391295c2bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:58 np0005603623 nova_compute[226235]: 2026-01-31 08:41:58.542 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance bd87e542-0f7b-453e-b8d1-643ad6fb64f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:41:58 np0005603623 nova_compute[226235]: 2026-01-31 08:41:58.543 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:41:58 np0005603623 nova_compute[226235]: 2026-01-31 08:41:58.543 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:41:58 np0005603623 nova_compute[226235]: 2026-01-31 08:41:58.589 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:41:58 np0005603623 nova_compute[226235]: 2026-01-31 08:41:58.743 226239 DEBUG nova.storage.rbd_utils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] creating snapshot(0c76699b09fb40a694067cba7ddd7a2e) on rbd image(bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:41:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:41:58 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1730568988' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:41:59 np0005603623 nova_compute[226235]: 2026-01-31 08:41:59.001 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:41:59 np0005603623 nova_compute[226235]: 2026-01-31 08:41:59.008 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:41:59 np0005603623 nova_compute[226235]: 2026-01-31 08:41:59.182 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:41:59 np0005603623 nova_compute[226235]: 2026-01-31 08:41:59.589 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:41:59 np0005603623 nova_compute[226235]: 2026-01-31 08:41:59.589 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:59.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:41:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:41:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:59.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:41:59 np0005603623 nova_compute[226235]: 2026-01-31 08:41:59.930 226239 DEBUG nova.compute.manager [req-f8f9fb37-b25c-4ef2-ac33-70ffee5c4735 req-7f0da6a9-ef8e-4f9e-bdb9-8026e91da303 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:41:59 np0005603623 nova_compute[226235]: 2026-01-31 08:41:59.931 226239 DEBUG oslo_concurrency.lockutils [req-f8f9fb37-b25c-4ef2-ac33-70ffee5c4735 req-7f0da6a9-ef8e-4f9e-bdb9-8026e91da303 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:59 np0005603623 nova_compute[226235]: 2026-01-31 08:41:59.931 226239 DEBUG oslo_concurrency.lockutils [req-f8f9fb37-b25c-4ef2-ac33-70ffee5c4735 req-7f0da6a9-ef8e-4f9e-bdb9-8026e91da303 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:59 np0005603623 nova_compute[226235]: 2026-01-31 08:41:59.931 226239 DEBUG oslo_concurrency.lockutils [req-f8f9fb37-b25c-4ef2-ac33-70ffee5c4735 req-7f0da6a9-ef8e-4f9e-bdb9-8026e91da303 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:59 np0005603623 nova_compute[226235]: 2026-01-31 08:41:59.932 226239 DEBUG nova.compute.manager [req-f8f9fb37-b25c-4ef2-ac33-70ffee5c4735 req-7f0da6a9-ef8e-4f9e-bdb9-8026e91da303 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] No waiting events found dispatching network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:41:59 np0005603623 nova_compute[226235]: 2026-01-31 08:41:59.932 226239 WARNING nova.compute.manager [req-f8f9fb37-b25c-4ef2-ac33-70ffee5c4735 req-7f0da6a9-ef8e-4f9e-bdb9-8026e91da303 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received unexpected event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 31 03:42:00 np0005603623 nova_compute[226235]: 2026-01-31 08:42:00.135 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:00 np0005603623 nova_compute[226235]: 2026-01-31 08:42:00.791 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e355 e355: 3 total, 3 up, 3 in
Jan 31 03:42:01 np0005603623 nova_compute[226235]: 2026-01-31 08:42:01.590 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:01 np0005603623 nova_compute[226235]: 2026-01-31 08:42:01.591 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:42:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:42:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:01.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:42:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:01.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:02 np0005603623 nova_compute[226235]: 2026-01-31 08:42:02.124 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:42:02 np0005603623 nova_compute[226235]: 2026-01-31 08:42:02.124 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:03 np0005603623 nova_compute[226235]: 2026-01-31 08:42:03.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:03.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:03.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:04 np0005603623 nova_compute[226235]: 2026-01-31 08:42:04.058 226239 DEBUG nova.storage.rbd_utils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] cloning vms/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk@0c76699b09fb40a694067cba7ddd7a2e to images/269becbf-04a2-4537-bd31-66899150ed70 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:42:04 np0005603623 nova_compute[226235]: 2026-01-31 08:42:04.530 226239 DEBUG nova.storage.rbd_utils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] flattening images/269becbf-04a2-4537-bd31-66899150ed70 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 03:42:05 np0005603623 nova_compute[226235]: 2026-01-31 08:42:05.136 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:05 np0005603623 nova_compute[226235]: 2026-01-31 08:42:05.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:05 np0005603623 nova_compute[226235]: 2026-01-31 08:42:05.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:05 np0005603623 nova_compute[226235]: 2026-01-31 08:42:05.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e355 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:42:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:05.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:42:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:05.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:05 np0005603623 nova_compute[226235]: 2026-01-31 08:42:05.792 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:07.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:07.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:08 np0005603623 nova_compute[226235]: 2026-01-31 08:42:08.510 226239 DEBUG nova.storage.rbd_utils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] removing snapshot(0c76699b09fb40a694067cba7ddd7a2e) on rbd image(bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:42:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:09.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:09.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e356 e356: 3 total, 3 up, 3 in
Jan 31 03:42:10 np0005603623 nova_compute[226235]: 2026-01-31 08:42:10.138 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:10 np0005603623 nova_compute[226235]: 2026-01-31 08:42:10.285 226239 DEBUG nova.storage.rbd_utils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] creating snapshot(snap) on rbd image(269becbf-04a2-4537-bd31-66899150ed70) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:42:10 np0005603623 nova_compute[226235]: 2026-01-31 08:42:10.795 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:10 np0005603623 nova_compute[226235]: 2026-01-31 08:42:10.858 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848915.857055, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:42:10 np0005603623 nova_compute[226235]: 2026-01-31 08:42:10.859 226239 INFO nova.compute.manager [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:42:10 np0005603623 nova_compute[226235]: 2026-01-31 08:42:10.961 226239 DEBUG nova.compute.manager [None req-1bacbd5f-06eb-4d56-9a9e-b19f3ae1afa5 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:42:10 np0005603623 nova_compute[226235]: 2026-01-31 08:42:10.964 226239 DEBUG nova.compute.manager [None req-1bacbd5f-06eb-4d56-9a9e-b19f3ae1afa5 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: shelving_image_uploading, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:42:11 np0005603623 nova_compute[226235]: 2026-01-31 08:42:11.012 226239 INFO nova.compute.manager [None req-1bacbd5f-06eb-4d56-9a9e-b19f3ae1afa5 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] During sync_power_state the instance has a pending task (shelving_image_uploading). Skip.#033[00m
Jan 31 03:42:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e357 e357: 3 total, 3 up, 3 in
Jan 31 03:42:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:11.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:11.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:13 np0005603623 nova_compute[226235]: 2026-01-31 08:42:13.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:13 np0005603623 podman[297668]: 2026-01-31 08:42:13.308244632 +0000 UTC m=+0.069419409 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 31 03:42:13 np0005603623 podman[297668]: 2026-01-31 08:42:13.421136304 +0000 UTC m=+0.182311061 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Jan 31 03:42:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:42:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:13.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:42:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:13.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:14 np0005603623 podman[297869]: 2026-01-31 08:42:14.067541382 +0000 UTC m=+0.210666141 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:42:14 np0005603623 podman[297890]: 2026-01-31 08:42:14.235712568 +0000 UTC m=+0.154770037 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:42:14 np0005603623 podman[297869]: 2026-01-31 08:42:14.546238159 +0000 UTC m=+0.689362898 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:42:14 np0005603623 podman[297905]: 2026-01-31 08:42:14.711216844 +0000 UTC m=+0.132978962 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 03:42:14 np0005603623 podman[297906]: 2026-01-31 08:42:14.73498253 +0000 UTC m=+0.156768709 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Jan 31 03:42:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:42:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/627005755' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:42:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:42:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/627005755' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:42:15 np0005603623 nova_compute[226235]: 2026-01-31 08:42:15.139 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e357 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:15 np0005603623 podman[297982]: 2026-01-31 08:42:15.439666556 +0000 UTC m=+0.647045999 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793)
Jan 31 03:42:15 np0005603623 podman[298002]: 2026-01-31 08:42:15.561587251 +0000 UTC m=+0.103568410 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, description=keepalived for Ceph, distribution-scope=public, vendor=Red Hat, Inc., version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, name=keepalived, io.openshift.expose-services=)
Jan 31 03:42:15 np0005603623 podman[297982]: 2026-01-31 08:42:15.57431188 +0000 UTC m=+0.781691323 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, release=1793, io.buildah.version=1.28.2, io.openshift.expose-services=, vendor=Red Hat, Inc., description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, vcs-type=git, version=2.2.4, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived)
Jan 31 03:42:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:15.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:15.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:15 np0005603623 nova_compute[226235]: 2026-01-31 08:42:15.797 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:16 np0005603623 nova_compute[226235]: 2026-01-31 08:42:16.247 226239 INFO nova.virt.libvirt.driver [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Snapshot image upload complete#033[00m
Jan 31 03:42:16 np0005603623 nova_compute[226235]: 2026-01-31 08:42:16.248 226239 DEBUG nova.compute.manager [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:42:16 np0005603623 nova_compute[226235]: 2026-01-31 08:42:16.493 226239 INFO nova.compute.manager [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Shelve offloading#033[00m
Jan 31 03:42:16 np0005603623 nova_compute[226235]: 2026-01-31 08:42:16.500 226239 INFO nova.virt.libvirt.driver [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance destroyed successfully.#033[00m
Jan 31 03:42:16 np0005603623 nova_compute[226235]: 2026-01-31 08:42:16.500 226239 DEBUG nova.compute.manager [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:42:16 np0005603623 nova_compute[226235]: 2026-01-31 08:42:16.502 226239 DEBUG oslo_concurrency.lockutils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:42:16 np0005603623 nova_compute[226235]: 2026-01-31 08:42:16.502 226239 DEBUG oslo_concurrency.lockutils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquired lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:42:16 np0005603623 nova_compute[226235]: 2026-01-31 08:42:16.502 226239 DEBUG nova.network.neutron [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:42:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e358 e358: 3 total, 3 up, 3 in
Jan 31 03:42:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:42:17 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:42:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:17.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:42:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:17.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:42:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:42:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:42:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:42:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:42:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:19.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:42:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:19.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:20 np0005603623 nova_compute[226235]: 2026-01-31 08:42:20.141 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:20 np0005603623 nova_compute[226235]: 2026-01-31 08:42:20.201 226239 DEBUG nova.network.neutron [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:42:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:20 np0005603623 nova_compute[226235]: 2026-01-31 08:42:20.465 226239 DEBUG oslo_concurrency.lockutils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Releasing lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:42:20 np0005603623 nova_compute[226235]: 2026-01-31 08:42:20.799 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:42:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:21.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:42:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:21.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:23.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:23.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:25 np0005603623 nova_compute[226235]: 2026-01-31 08:42:25.143 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:42:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.0 total, 600.0 interval#012Cumulative writes: 13K writes, 66K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.13 GB, 0.03 MB/s#012Cumulative WAL: 13K writes, 13K syncs, 1.00 writes per sync, written: 0.13 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1647 writes, 8413 keys, 1647 commit groups, 1.0 writes per commit group, ingest: 16.49 MB, 0.03 MB/s#012Interval WAL: 1647 writes, 1647 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     39.8      2.04              0.20        41    0.050       0      0       0.0       0.0#012  L6      1/0   11.37 MB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   4.9     57.4     48.8      8.16              0.87        40    0.204    270K    21K       0.0       0.0#012 Sum      1/0   11.37 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   5.9     45.9     47.0     10.20              1.07        81    0.126    270K    21K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.8     52.8     53.7      1.67              0.20        14    0.119     63K   3612       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   0.0     57.4     48.8      8.16              0.87        40    0.204    270K    21K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     39.9      2.04              0.20        40    0.051       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4800.0 total, 600.0 interval#012Flush(GB): cumulative 0.079, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.47 GB write, 0.10 MB/s write, 0.46 GB read, 0.10 MB/s read, 10.2 seconds#012Interval compaction: 0.09 GB write, 0.15 MB/s write, 0.09 GB read, 0.15 MB/s read, 1.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 50.63 MB table_size: 0 occupancy: 18446744073709551615 collections: 9 last_copies: 0 last_secs: 0.000332 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2882,48.61 MB,15.9911%) FilterBlock(81,761.98 KB,0.244778%) IndexBlock(81,1.27 MB,0.417263%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 03:42:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:42:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:42:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:25.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:25 np0005603623 nova_compute[226235]: 2026-01-31 08:42:25.801 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:42:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:25.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:42:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:42:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:27.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:42:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:42:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:27.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:42:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:42:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:29.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:42:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:29.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:29 np0005603623 nova_compute[226235]: 2026-01-31 08:42:29.867 226239 INFO nova.virt.libvirt.driver [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance destroyed successfully.#033[00m
Jan 31 03:42:29 np0005603623 nova_compute[226235]: 2026-01-31 08:42:29.868 226239 DEBUG nova.objects.instance [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'resources' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:42:29 np0005603623 nova_compute[226235]: 2026-01-31 08:42:29.984 226239 DEBUG nova.virt.libvirt.vif [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:38:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1679208816',display_name='tempest-ServersNegativeTestJSON-server-1679208816',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1679208816',id=148,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:38:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='621c17d53cba46d386de8efb560a988e',ramdisk_id='',reservation_id='r-kclqkoza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-183161027',owner_user_name='tempest-ServersNegativeTestJSON-183161027-project-member',shelved_at='2026-01-31T08:42:16.247982',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='269becbf-04a2-4537-bd31-66899150ed70'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:41:57Z,user_data=None,user_id='516e093a00a44667ba1308900be70d8d',uuid=bd87e542-0f7b-453e-b8d1-643ad6fb64f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:42:29 np0005603623 nova_compute[226235]: 2026-01-31 08:42:29.985 226239 DEBUG nova.network.os_vif_util [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converting VIF {"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:42:29 np0005603623 nova_compute[226235]: 2026-01-31 08:42:29.985 226239 DEBUG nova.network.os_vif_util [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:42:29 np0005603623 nova_compute[226235]: 2026-01-31 08:42:29.986 226239 DEBUG os_vif [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:42:29 np0005603623 nova_compute[226235]: 2026-01-31 08:42:29.987 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:29 np0005603623 nova_compute[226235]: 2026-01-31 08:42:29.988 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb122a11a-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:42:29 np0005603623 nova_compute[226235]: 2026-01-31 08:42:29.989 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:29 np0005603623 nova_compute[226235]: 2026-01-31 08:42:29.991 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:42:29 np0005603623 nova_compute[226235]: 2026-01-31 08:42:29.993 226239 INFO os_vif [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b')#033[00m
Jan 31 03:42:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:42:30.134 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:42:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:42:30.134 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:42:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:42:30.135 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:42:30 np0005603623 nova_compute[226235]: 2026-01-31 08:42:30.144 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:30 np0005603623 nova_compute[226235]: 2026-01-31 08:42:30.348 226239 DEBUG nova.compute.manager [req-141ae6ef-d82d-4ad9-a9ef-155e674e002f req-c8610b47-e6b2-4d84-bc20-3988e0b535e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-changed-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:42:30 np0005603623 nova_compute[226235]: 2026-01-31 08:42:30.348 226239 DEBUG nova.compute.manager [req-141ae6ef-d82d-4ad9-a9ef-155e674e002f req-c8610b47-e6b2-4d84-bc20-3988e0b535e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Refreshing instance network info cache due to event network-changed-b122a11a-5b9d-4b27-a9c3-8327cb8162ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:42:30 np0005603623 nova_compute[226235]: 2026-01-31 08:42:30.349 226239 DEBUG oslo_concurrency.lockutils [req-141ae6ef-d82d-4ad9-a9ef-155e674e002f req-c8610b47-e6b2-4d84-bc20-3988e0b535e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:42:30 np0005603623 nova_compute[226235]: 2026-01-31 08:42:30.349 226239 DEBUG oslo_concurrency.lockutils [req-141ae6ef-d82d-4ad9-a9ef-155e674e002f req-c8610b47-e6b2-4d84-bc20-3988e0b535e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:42:30 np0005603623 nova_compute[226235]: 2026-01-31 08:42:30.349 226239 DEBUG nova.network.neutron [req-141ae6ef-d82d-4ad9-a9ef-155e674e002f req-c8610b47-e6b2-4d84-bc20-3988e0b535e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Refreshing network info cache for port b122a11a-5b9d-4b27-a9c3-8327cb8162ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:42:30 np0005603623 nova_compute[226235]: 2026-01-31 08:42:30.566 226239 INFO nova.virt.libvirt.driver [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Deleting instance files /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_del#033[00m
Jan 31 03:42:30 np0005603623 nova_compute[226235]: 2026-01-31 08:42:30.567 226239 INFO nova.virt.libvirt.driver [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Deletion of /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_del complete#033[00m
Jan 31 03:42:31 np0005603623 nova_compute[226235]: 2026-01-31 08:42:31.591 226239 INFO nova.scheduler.client.report [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Deleted allocations for instance bd87e542-0f7b-453e-b8d1-643ad6fb64f0#033[00m
Jan 31 03:42:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:31.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:42:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:31.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:42:31 np0005603623 nova_compute[226235]: 2026-01-31 08:42:31.955 226239 DEBUG oslo_concurrency.lockutils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:42:31 np0005603623 nova_compute[226235]: 2026-01-31 08:42:31.956 226239 DEBUG oslo_concurrency.lockutils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:42:32 np0005603623 nova_compute[226235]: 2026-01-31 08:42:32.006 226239 DEBUG oslo_concurrency.processutils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:42:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:42:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2709231582' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:42:32 np0005603623 nova_compute[226235]: 2026-01-31 08:42:32.427 226239 DEBUG oslo_concurrency.processutils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:42:32 np0005603623 nova_compute[226235]: 2026-01-31 08:42:32.433 226239 DEBUG nova.compute.provider_tree [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:42:32 np0005603623 nova_compute[226235]: 2026-01-31 08:42:32.499 226239 DEBUG nova.scheduler.client.report [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:42:32 np0005603623 nova_compute[226235]: 2026-01-31 08:42:32.541 226239 DEBUG oslo_concurrency.lockutils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:42:32 np0005603623 nova_compute[226235]: 2026-01-31 08:42:32.623 226239 DEBUG oslo_concurrency.lockutils [None req-55aac2d3-a13e-4dab-9c9c-0f04fabc2cbb 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 40.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:42:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:42:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:33.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:42:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:33.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:35 np0005603623 nova_compute[226235]: 2026-01-31 08:42:35.046 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:35 np0005603623 nova_compute[226235]: 2026-01-31 08:42:35.147 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:42:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:35.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:42:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:35.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #133. Immutable memtables: 0.
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.691384) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 83] Flushing memtable with next log file: 133
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848956691419, "job": 83, "event": "flush_started", "num_memtables": 1, "num_entries": 1136, "num_deletes": 256, "total_data_size": 2381978, "memory_usage": 2420712, "flush_reason": "Manual Compaction"}
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 83] Level-0 flush table #134: started
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848956713352, "cf_name": "default", "job": 83, "event": "table_file_creation", "file_number": 134, "file_size": 1571194, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 65417, "largest_seqno": 66548, "table_properties": {"data_size": 1565959, "index_size": 2694, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11602, "raw_average_key_size": 20, "raw_value_size": 1555436, "raw_average_value_size": 2748, "num_data_blocks": 117, "num_entries": 566, "num_filter_entries": 566, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848876, "oldest_key_time": 1769848876, "file_creation_time": 1769848956, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 134, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 83] Flush lasted 22019 microseconds, and 3331 cpu microseconds.
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.713398) [db/flush_job.cc:967] [default] [JOB 83] Level-0 flush table #134: 1571194 bytes OK
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.713421) [db/memtable_list.cc:519] [default] Level-0 commit table #134 started
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.725724) [db/memtable_list.cc:722] [default] Level-0 commit table #134: memtable #1 done
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.725764) EVENT_LOG_v1 {"time_micros": 1769848956725755, "job": 83, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.725786) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 83] Try to delete WAL files size 2376410, prev total WAL file size 2377119, number of live WAL files 2.
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000130.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.726364) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035353232' seq:72057594037927935, type:22 .. '7061786F730035373734' seq:0, type:0; will stop at (end)
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 84] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 83 Base level 0, inputs: [134(1534KB)], [132(11MB)]
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848956726413, "job": 84, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [134], "files_L6": [132], "score": -1, "input_data_size": 13497934, "oldest_snapshot_seqno": -1}
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 84] Generated table #135: 8659 keys, 11616857 bytes, temperature: kUnknown
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848956821549, "cf_name": "default", "job": 84, "event": "table_file_creation", "file_number": 135, "file_size": 11616857, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11560154, "index_size": 33934, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21701, "raw_key_size": 228727, "raw_average_key_size": 26, "raw_value_size": 11407380, "raw_average_value_size": 1317, "num_data_blocks": 1297, "num_entries": 8659, "num_filter_entries": 8659, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769848956, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 135, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.821777) [db/compaction/compaction_job.cc:1663] [default] [JOB 84] Compacted 1@0 + 1@6 files to L6 => 11616857 bytes
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.824089) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 141.8 rd, 122.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 11.4 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(16.0) write-amplify(7.4) OK, records in: 9189, records dropped: 530 output_compression: NoCompression
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.824110) EVENT_LOG_v1 {"time_micros": 1769848956824100, "job": 84, "event": "compaction_finished", "compaction_time_micros": 95193, "compaction_time_cpu_micros": 19737, "output_level": 6, "num_output_files": 1, "total_output_size": 11616857, "num_input_records": 9189, "num_output_records": 8659, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.726273) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.824164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.824168) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.824170) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.824172) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:42:36.824173) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000134.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848956824673, "job": 0, "event": "table_file_deletion", "file_number": 134}
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000132.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:42:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848956825694, "job": 0, "event": "table_file_deletion", "file_number": 132}
Jan 31 03:42:37 np0005603623 nova_compute[226235]: 2026-01-31 08:42:37.103 226239 DEBUG nova.network.neutron [req-141ae6ef-d82d-4ad9-a9ef-155e674e002f req-c8610b47-e6b2-4d84-bc20-3988e0b535e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updated VIF entry in instance network info cache for port b122a11a-5b9d-4b27-a9c3-8327cb8162ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:42:37 np0005603623 nova_compute[226235]: 2026-01-31 08:42:37.103 226239 DEBUG nova.network.neutron [req-141ae6ef-d82d-4ad9-a9ef-155e674e002f req-c8610b47-e6b2-4d84-bc20-3988e0b535e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": null, "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapb122a11a-5b", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:42:37 np0005603623 nova_compute[226235]: 2026-01-31 08:42:37.190 226239 DEBUG oslo_concurrency.lockutils [req-141ae6ef-d82d-4ad9-a9ef-155e674e002f req-c8610b47-e6b2-4d84-bc20-3988e0b535e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:42:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:37.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:42:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:37.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:42:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:42:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:39.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:42:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:39.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:40 np0005603623 nova_compute[226235]: 2026-01-31 08:42:40.093 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:40 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 03:42:40 np0005603623 nova_compute[226235]: 2026-01-31 08:42:40.148 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:41.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:41.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:43.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:43.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:44 np0005603623 podman[298302]: 2026-01-31 08:42:44.954259122 +0000 UTC m=+0.050386242 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:42:44 np0005603623 podman[298303]: 2026-01-31 08:42:44.975161858 +0000 UTC m=+0.070745741 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:42:45 np0005603623 nova_compute[226235]: 2026-01-31 08:42:45.095 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:45 np0005603623 nova_compute[226235]: 2026-01-31 08:42:45.149 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:45.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:45.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:46 np0005603623 nova_compute[226235]: 2026-01-31 08:42:46.536 226239 DEBUG nova.virt.libvirt.driver [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Creating tmpfile /var/lib/nova/instances/tmpxoswf42f to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 31 03:42:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:42:46.636 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:42:46 np0005603623 nova_compute[226235]: 2026-01-31 08:42:46.637 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:46 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:42:46.638 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:42:46 np0005603623 nova_compute[226235]: 2026-01-31 08:42:46.967 226239 DEBUG nova.compute.manager [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxoswf42f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 31 03:42:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:42:47.639 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:42:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:47.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:47.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:49.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:49.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:50 np0005603623 nova_compute[226235]: 2026-01-31 08:42:50.098 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:50 np0005603623 nova_compute[226235]: 2026-01-31 08:42:50.151 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:50 np0005603623 nova_compute[226235]: 2026-01-31 08:42:50.348 226239 DEBUG nova.compute.manager [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxoswf42f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='e7694c5e-8d11-4f04-aec6-d1933f668d11',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 31 03:42:50 np0005603623 nova_compute[226235]: 2026-01-31 08:42:50.408 226239 DEBUG oslo_concurrency.lockutils [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquiring lock "refresh_cache-e7694c5e-8d11-4f04-aec6-d1933f668d11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:42:50 np0005603623 nova_compute[226235]: 2026-01-31 08:42:50.409 226239 DEBUG oslo_concurrency.lockutils [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquired lock "refresh_cache-e7694c5e-8d11-4f04-aec6-d1933f668d11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:42:50 np0005603623 nova_compute[226235]: 2026-01-31 08:42:50.409 226239 DEBUG nova.network.neutron [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:42:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:42:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 54K writes, 237K keys, 54K commit groups, 1.0 writes per commit group, ingest: 0.24 GB, 0.05 MB/s#012Cumulative WAL: 54K writes, 17K syncs, 3.05 writes per sync, written: 0.24 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 7760 writes, 31K keys, 7760 commit groups, 1.0 writes per commit group, ingest: 31.83 MB, 0.05 MB/s#012Interval WAL: 7761 writes, 2957 syncs, 2.62 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:42:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:51.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:42:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:51.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:42:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:53.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:53.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:54 np0005603623 nova_compute[226235]: 2026-01-31 08:42:54.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:54 np0005603623 nova_compute[226235]: 2026-01-31 08:42:54.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:42:55 np0005603623 nova_compute[226235]: 2026-01-31 08:42:55.101 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:55 np0005603623 nova_compute[226235]: 2026-01-31 08:42:55.152 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:55.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:55.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:55 np0005603623 nova_compute[226235]: 2026-01-31 08:42:55.977 226239 DEBUG nova.network.neutron [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Updating instance_info_cache with network_info: [{"id": "b57f4c41-e254-4e29-be21-1899bdb779e0", "address": "fa:16:3e:26:78:9b", "network": {"id": "b000b527-ea00-4c0c-84c4-e93c10d4bae5", "bridge": "br-int", "label": "tempest-network-smoke--1061452383", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb57f4c41-e2", "ovs_interfaceid": "b57f4c41-e254-4e29-be21-1899bdb779e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.094 226239 DEBUG oslo_concurrency.lockutils [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Releasing lock "refresh_cache-e7694c5e-8d11-4f04-aec6-d1933f668d11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.096 226239 DEBUG nova.virt.libvirt.driver [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxoswf42f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='e7694c5e-8d11-4f04-aec6-d1933f668d11',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.097 226239 DEBUG nova.virt.libvirt.driver [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Creating instance directory: /var/lib/nova/instances/e7694c5e-8d11-4f04-aec6-d1933f668d11 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.097 226239 DEBUG nova.virt.libvirt.driver [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Ensure instance console log exists: /var/lib/nova/instances/e7694c5e-8d11-4f04-aec6-d1933f668d11/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.098 226239 DEBUG nova.virt.libvirt.driver [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.100 226239 DEBUG nova.virt.libvirt.vif [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:41:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-617880352',display_name='tempest-TestNetworkAdvancedServerOps-server-617880352',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-617880352',id=151,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE9oW7hyNK/c0GmlhWHVsudW1EFOU1/778j2Zfzh7XKLIHLI+8KsqNzzhySs7L5TOC+KBq7HkFVRK05TmxJs9LQc4oVDYzV+eQ5EXf3bE6KOfId7bnJvpzjj70u8lMALWA==',key_name='tempest-TestNetworkAdvancedServerOps-530143631',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:42:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-3ynh01gy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:42:05Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=e7694c5e-8d11-4f04-aec6-d1933f668d11,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b57f4c41-e254-4e29-be21-1899bdb779e0", "address": "fa:16:3e:26:78:9b", "network": {"id": "b000b527-ea00-4c0c-84c4-e93c10d4bae5", "bridge": "br-int", "label": "tempest-network-smoke--1061452383", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb57f4c41-e2", "ovs_interfaceid": "b57f4c41-e254-4e29-be21-1899bdb779e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.100 226239 DEBUG nova.network.os_vif_util [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Converting VIF {"id": "b57f4c41-e254-4e29-be21-1899bdb779e0", "address": "fa:16:3e:26:78:9b", "network": {"id": "b000b527-ea00-4c0c-84c4-e93c10d4bae5", "bridge": "br-int", "label": "tempest-network-smoke--1061452383", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb57f4c41-e2", "ovs_interfaceid": "b57f4c41-e254-4e29-be21-1899bdb779e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.101 226239 DEBUG nova.network.os_vif_util [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:78:9b,bridge_name='br-int',has_traffic_filtering=True,id=b57f4c41-e254-4e29-be21-1899bdb779e0,network=Network(b000b527-ea00-4c0c-84c4-e93c10d4bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb57f4c41-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.101 226239 DEBUG os_vif [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:78:9b,bridge_name='br-int',has_traffic_filtering=True,id=b57f4c41-e254-4e29-be21-1899bdb779e0,network=Network(b000b527-ea00-4c0c-84c4-e93c10d4bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb57f4c41-e2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.102 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.102 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.103 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.106 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.107 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb57f4c41-e2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.107 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb57f4c41-e2, col_values=(('external_ids', {'iface-id': 'b57f4c41-e254-4e29-be21-1899bdb779e0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:78:9b', 'vm-uuid': 'e7694c5e-8d11-4f04-aec6-d1933f668d11'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.109 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.111 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:42:56 np0005603623 NetworkManager[48970]: <info>  [1769848976.1123] manager: (tapb57f4c41-e2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/303)
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.118 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.119 226239 INFO os_vif [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:78:9b,bridge_name='br-int',has_traffic_filtering=True,id=b57f4c41-e254-4e29-be21-1899bdb779e0,network=Network(b000b527-ea00-4c0c-84c4-e93c10d4bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb57f4c41-e2')#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.120 226239 DEBUG nova.virt.libvirt.driver [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 31 03:42:56 np0005603623 nova_compute[226235]: 2026-01-31 08:42:56.120 226239 DEBUG nova.compute.manager [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxoswf42f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='e7694c5e-8d11-4f04-aec6-d1933f668d11',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 31 03:42:57 np0005603623 nova_compute[226235]: 2026-01-31 08:42:57.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:57 np0005603623 nova_compute[226235]: 2026-01-31 08:42:57.627 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:42:57 np0005603623 nova_compute[226235]: 2026-01-31 08:42:57.627 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:42:57 np0005603623 nova_compute[226235]: 2026-01-31 08:42:57.628 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:42:57 np0005603623 nova_compute[226235]: 2026-01-31 08:42:57.628 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:42:57 np0005603623 nova_compute[226235]: 2026-01-31 08:42:57.628 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:42:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:57.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:42:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:57.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:42:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:42:58 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2256123524' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:42:58 np0005603623 nova_compute[226235]: 2026-01-31 08:42:58.035 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:42:58 np0005603623 nova_compute[226235]: 2026-01-31 08:42:58.170 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:42:58 np0005603623 nova_compute[226235]: 2026-01-31 08:42:58.171 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4316MB free_disk=20.942729949951172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:42:58 np0005603623 nova_compute[226235]: 2026-01-31 08:42:58.171 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:42:58 np0005603623 nova_compute[226235]: 2026-01-31 08:42:58.171 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:42:58 np0005603623 nova_compute[226235]: 2026-01-31 08:42:58.376 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration for instance e7694c5e-8d11-4f04-aec6-d1933f668d11 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 03:42:58 np0005603623 nova_compute[226235]: 2026-01-31 08:42:58.429 226239 INFO nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Updating resource usage from migration 3a883d64-0212-4213-b49a-7b9716133539#033[00m
Jan 31 03:42:58 np0005603623 nova_compute[226235]: 2026-01-31 08:42:58.429 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Starting to track incoming migration 3a883d64-0212-4213-b49a-7b9716133539 with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 31 03:42:58 np0005603623 nova_compute[226235]: 2026-01-31 08:42:58.741 226239 WARNING nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance e7694c5e-8d11-4f04-aec6-d1933f668d11 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Jan 31 03:42:58 np0005603623 nova_compute[226235]: 2026-01-31 08:42:58.741 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:42:58 np0005603623 nova_compute[226235]: 2026-01-31 08:42:58.742 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:42:58 np0005603623 nova_compute[226235]: 2026-01-31 08:42:58.904 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:42:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:42:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2352666999' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.288 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.384s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.293 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.383 226239 DEBUG nova.network.neutron [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Port b57f4c41-e254-4e29-be21-1899bdb779e0 updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.385 226239 DEBUG nova.compute.manager [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxoswf42f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='e7694c5e-8d11-4f04-aec6-d1933f668d11',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.399 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.567 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.568 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.396s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:42:59 np0005603623 systemd[1]: Starting libvirt proxy daemon...
Jan 31 03:42:59 np0005603623 systemd[1]: Started libvirt proxy daemon.
Jan 31 03:42:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:59.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:42:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:59.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:59 np0005603623 kernel: tapb57f4c41-e2: entered promiscuous mode
Jan 31 03:42:59 np0005603623 NetworkManager[48970]: <info>  [1769848979.9078] manager: (tapb57f4c41-e2): new Tun device (/org/freedesktop/NetworkManager/Devices/304)
Jan 31 03:42:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:42:59Z|00643|binding|INFO|Claiming lport b57f4c41-e254-4e29-be21-1899bdb779e0 for this additional chassis.
Jan 31 03:42:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:42:59Z|00644|binding|INFO|b57f4c41-e254-4e29-be21-1899bdb779e0: Claiming fa:16:3e:26:78:9b 10.100.0.7
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.909 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.916 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.919 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:59 np0005603623 NetworkManager[48970]: <info>  [1769848979.9292] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/305)
Jan 31 03:42:59 np0005603623 NetworkManager[48970]: <info>  [1769848979.9298] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/306)
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.928 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:59 np0005603623 systemd-udevd[298479]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:42:59 np0005603623 systemd-machined[194379]: New machine qemu-73-instance-00000097.
Jan 31 03:42:59 np0005603623 NetworkManager[48970]: <info>  [1769848979.9461] device (tapb57f4c41-e2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:42:59 np0005603623 NetworkManager[48970]: <info>  [1769848979.9467] device (tapb57f4c41-e2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:42:59 np0005603623 systemd[1]: Started Virtual Machine qemu-73-instance-00000097.
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.962 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:59 np0005603623 nova_compute[226235]: 2026-01-31 08:42:59.969 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:00 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:00Z|00645|binding|INFO|Setting lport b57f4c41-e254-4e29-be21-1899bdb779e0 ovn-installed in OVS
Jan 31 03:43:00 np0005603623 nova_compute[226235]: 2026-01-31 08:43:00.023 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:00 np0005603623 nova_compute[226235]: 2026-01-31 08:43:00.026 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:00 np0005603623 nova_compute[226235]: 2026-01-31 08:43:00.155 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:00 np0005603623 nova_compute[226235]: 2026-01-31 08:43:00.707 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848980.7070854, e7694c5e-8d11-4f04-aec6-d1933f668d11 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:43:00 np0005603623 nova_compute[226235]: 2026-01-31 08:43:00.707 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] VM Started (Lifecycle Event)#033[00m
Jan 31 03:43:00 np0005603623 nova_compute[226235]: 2026-01-31 08:43:00.800 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:01 np0005603623 nova_compute[226235]: 2026-01-31 08:43:01.092 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769848981.0922284, e7694c5e-8d11-4f04-aec6-d1933f668d11 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:43:01 np0005603623 nova_compute[226235]: 2026-01-31 08:43:01.093 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:43:01 np0005603623 nova_compute[226235]: 2026-01-31 08:43:01.110 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:01 np0005603623 nova_compute[226235]: 2026-01-31 08:43:01.255 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:01 np0005603623 nova_compute[226235]: 2026-01-31 08:43:01.258 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:43:01 np0005603623 nova_compute[226235]: 2026-01-31 08:43:01.364 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 31 03:43:01 np0005603623 nova_compute[226235]: 2026-01-31 08:43:01.568 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:01 np0005603623 nova_compute[226235]: 2026-01-31 08:43:01.569 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:43:01 np0005603623 nova_compute[226235]: 2026-01-31 08:43:01.569 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:43:01 np0005603623 nova_compute[226235]: 2026-01-31 08:43:01.628 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:43:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:01.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:01.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:03 np0005603623 nova_compute[226235]: 2026-01-31 08:43:03.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:03.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:03.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:04 np0005603623 nova_compute[226235]: 2026-01-31 08:43:04.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:04Z|00646|binding|INFO|Claiming lport b57f4c41-e254-4e29-be21-1899bdb779e0 for this chassis.
Jan 31 03:43:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:04Z|00647|binding|INFO|b57f4c41-e254-4e29-be21-1899bdb779e0: Claiming fa:16:3e:26:78:9b 10.100.0.7
Jan 31 03:43:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:04Z|00648|binding|INFO|Setting lport b57f4c41-e254-4e29-be21-1899bdb779e0 up in Southbound
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.571 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:78:9b 10.100.0.7'], port_security=['fa:16:3e:26:78:9b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e7694c5e-8d11-4f04-aec6-d1933f668d11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b000b527-ea00-4c0c-84c4-e93c10d4bae5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '10', 'neutron:security_group_ids': '15398c59-1164-4f8b-8737-a5ada60dadf3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.243'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afdea9bd-63de-451e-8b4c-572440598122, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=b57f4c41-e254-4e29-be21-1899bdb779e0) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.573 143258 INFO neutron.agent.ovn.metadata.agent [-] Port b57f4c41-e254-4e29-be21-1899bdb779e0 in datapath b000b527-ea00-4c0c-84c4-e93c10d4bae5 bound to our chassis#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.574 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b000b527-ea00-4c0c-84c4-e93c10d4bae5#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.582 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4df1e4d9-6f9f-488d-9b56-a243a53f6c81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.584 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb000b527-e1 in ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.586 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb000b527-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.586 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[51ec67b4-4ca1-426a-af2b-63c3de17e81d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.588 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[18547076-b670-49e5-b5b8-4233073ebcf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.598 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[bd9bebb8-5d18-4b40-a92a-03b8030016fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.607 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6becfe6c-7f6e-4b21-a457-c3e8537760f8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.628 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3f225a59-d9ac-4e91-b39f-13ceddb98577]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.632 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a14b9305-5169-4ad2-b77a-db4ce45046eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 NetworkManager[48970]: <info>  [1769848984.6333] manager: (tapb000b527-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/307)
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.652 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[17fefdbd-e9a2-41a4-bbab-cda652310005]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 systemd-udevd[298541]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.655 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[348367f1-3585-4415-9264-c7fe69257e5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 NetworkManager[48970]: <info>  [1769848984.6734] device (tapb000b527-e0): carrier: link connected
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.675 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[54adbc97-274a-4c90-94c9-66be54fcb5e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.688 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca19949-fadf-412b-86e9-c22b46bda3f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb000b527-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:b7:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 812510, 'reachable_time': 23963, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298560, 'error': None, 'target': 'ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.701 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a379b97a-a751-46d0-9d39-be7c022b67a1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:b75a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 812510, 'tstamp': 812510}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 298561, 'error': None, 'target': 'ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.713 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea989f3-9b59-4bfc-aee4-bab028438614]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb000b527-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:b7:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 812510, 'reachable_time': 23963, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 298562, 'error': None, 'target': 'ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.734 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7698350d-e72b-4211-8757-d6ac01eebe7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.782 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0d00df45-2f9a-40a4-a996-9a007130f51b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.783 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb000b527-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.783 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.784 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb000b527-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:04 np0005603623 kernel: tapb000b527-e0: entered promiscuous mode
Jan 31 03:43:04 np0005603623 NetworkManager[48970]: <info>  [1769848984.7863] manager: (tapb000b527-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Jan 31 03:43:04 np0005603623 nova_compute[226235]: 2026-01-31 08:43:04.786 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.788 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb000b527-e0, col_values=(('external_ids', {'iface-id': '3a41031b-7ec9-4414-97e4-222c1c56b61b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:04Z|00649|binding|INFO|Releasing lport 3a41031b-7ec9-4414-97e4-222c1c56b61b from this chassis (sb_readonly=0)
Jan 31 03:43:04 np0005603623 nova_compute[226235]: 2026-01-31 08:43:04.792 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.793 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b000b527-ea00-4c0c-84c4-e93c10d4bae5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b000b527-ea00-4c0c-84c4-e93c10d4bae5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.794 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6fcc0254-13df-402a-8438-685d713a0e6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.794 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-b000b527-ea00-4c0c-84c4-e93c10d4bae5
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/b000b527-ea00-4c0c-84c4-e93c10d4bae5.pid.haproxy
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID b000b527-ea00-4c0c-84c4-e93c10d4bae5
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:43:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:04.794 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5', 'env', 'PROCESS_TAG=haproxy-b000b527-ea00-4c0c-84c4-e93c10d4bae5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b000b527-ea00-4c0c-84c4-e93c10d4bae5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:43:05 np0005603623 nova_compute[226235]: 2026-01-31 08:43:05.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:05 np0005603623 podman[298595]: 2026-01-31 08:43:05.169136688 +0000 UTC m=+0.068913013 container create e151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:43:05 np0005603623 nova_compute[226235]: 2026-01-31 08:43:05.184 226239 INFO nova.compute.manager [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Post operation of migration started#033[00m
Jan 31 03:43:05 np0005603623 nova_compute[226235]: 2026-01-31 08:43:05.188 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:05 np0005603623 systemd[1]: Started libpod-conmon-e151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978.scope.
Jan 31 03:43:05 np0005603623 podman[298595]: 2026-01-31 08:43:05.125769727 +0000 UTC m=+0.025546072 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:43:05 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:43:05 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81318cf4c2ff6c58ae624044bb1c76772f3f5fbe97bffcf3e995711abb796f23/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:43:05 np0005603623 podman[298595]: 2026-01-31 08:43:05.254841366 +0000 UTC m=+0.154617721 container init e151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:43:05 np0005603623 podman[298595]: 2026-01-31 08:43:05.260329108 +0000 UTC m=+0.160105433 container start e151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:43:05 np0005603623 neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5[298610]: [NOTICE]   (298614) : New worker (298616) forked
Jan 31 03:43:05 np0005603623 neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5[298610]: [NOTICE]   (298614) : Loading success.
Jan 31 03:43:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:05.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:05.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:06 np0005603623 nova_compute[226235]: 2026-01-31 08:43:06.112 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:06 np0005603623 nova_compute[226235]: 2026-01-31 08:43:06.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:06 np0005603623 nova_compute[226235]: 2026-01-31 08:43:06.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:06 np0005603623 nova_compute[226235]: 2026-01-31 08:43:06.400 226239 DEBUG oslo_concurrency.lockutils [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquiring lock "refresh_cache-e7694c5e-8d11-4f04-aec6-d1933f668d11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:43:06 np0005603623 nova_compute[226235]: 2026-01-31 08:43:06.400 226239 DEBUG oslo_concurrency.lockutils [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquired lock "refresh_cache-e7694c5e-8d11-4f04-aec6-d1933f668d11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:43:06 np0005603623 nova_compute[226235]: 2026-01-31 08:43:06.401 226239 DEBUG nova.network.neutron [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:43:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:07.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:07.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:09 np0005603623 nova_compute[226235]: 2026-01-31 08:43:09.763 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:09.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:09.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:10 np0005603623 nova_compute[226235]: 2026-01-31 08:43:10.188 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:11 np0005603623 nova_compute[226235]: 2026-01-31 08:43:11.114 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:11.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:11.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:12 np0005603623 nova_compute[226235]: 2026-01-31 08:43:12.309 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Acquiring lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:12 np0005603623 nova_compute[226235]: 2026-01-31 08:43:12.310 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:12 np0005603623 nova_compute[226235]: 2026-01-31 08:43:12.392 226239 DEBUG nova.compute.manager [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:43:12 np0005603623 nova_compute[226235]: 2026-01-31 08:43:12.628 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:12 np0005603623 nova_compute[226235]: 2026-01-31 08:43:12.629 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:12 np0005603623 nova_compute[226235]: 2026-01-31 08:43:12.638 226239 DEBUG nova.virt.hardware [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:43:12 np0005603623 nova_compute[226235]: 2026-01-31 08:43:12.638 226239 INFO nova.compute.claims [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:43:13 np0005603623 nova_compute[226235]: 2026-01-31 08:43:13.246 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:43:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4084711501' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:43:13 np0005603623 nova_compute[226235]: 2026-01-31 08:43:13.659 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:13 np0005603623 nova_compute[226235]: 2026-01-31 08:43:13.665 226239 DEBUG nova.compute.provider_tree [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:43:13 np0005603623 nova_compute[226235]: 2026-01-31 08:43:13.776 226239 DEBUG nova.scheduler.client.report [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:43:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:43:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:13.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:43:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:13.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:14 np0005603623 nova_compute[226235]: 2026-01-31 08:43:14.093 226239 DEBUG nova.network.neutron [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Updating instance_info_cache with network_info: [{"id": "b57f4c41-e254-4e29-be21-1899bdb779e0", "address": "fa:16:3e:26:78:9b", "network": {"id": "b000b527-ea00-4c0c-84c4-e93c10d4bae5", "bridge": "br-int", "label": "tempest-network-smoke--1061452383", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb57f4c41-e2", "ovs_interfaceid": "b57f4c41-e254-4e29-be21-1899bdb779e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:43:14 np0005603623 nova_compute[226235]: 2026-01-31 08:43:14.166 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:14 np0005603623 nova_compute[226235]: 2026-01-31 08:43:14.167 226239 DEBUG nova.compute.manager [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:43:14 np0005603623 nova_compute[226235]: 2026-01-31 08:43:14.187 226239 DEBUG oslo_concurrency.lockutils [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Releasing lock "refresh_cache-e7694c5e-8d11-4f04-aec6-d1933f668d11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:43:14 np0005603623 nova_compute[226235]: 2026-01-31 08:43:14.373 226239 DEBUG oslo_concurrency.lockutils [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:14 np0005603623 nova_compute[226235]: 2026-01-31 08:43:14.374 226239 DEBUG oslo_concurrency.lockutils [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:14 np0005603623 nova_compute[226235]: 2026-01-31 08:43:14.374 226239 DEBUG oslo_concurrency.lockutils [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:14 np0005603623 nova_compute[226235]: 2026-01-31 08:43:14.379 226239 INFO nova.virt.libvirt.driver [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 31 03:43:14 np0005603623 virtqemud[225858]: Domain id=73 name='instance-00000097' uuid=e7694c5e-8d11-4f04-aec6-d1933f668d11 is tainted: custom-monitor
Jan 31 03:43:14 np0005603623 nova_compute[226235]: 2026-01-31 08:43:14.500 226239 DEBUG nova.compute.manager [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:43:14 np0005603623 nova_compute[226235]: 2026-01-31 08:43:14.500 226239 DEBUG nova.network.neutron [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:43:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e359 e359: 3 total, 3 up, 3 in
Jan 31 03:43:15 np0005603623 nova_compute[226235]: 2026-01-31 08:43:15.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:15 np0005603623 nova_compute[226235]: 2026-01-31 08:43:15.168 226239 INFO nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:43:15 np0005603623 nova_compute[226235]: 2026-01-31 08:43:15.190 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:15 np0005603623 nova_compute[226235]: 2026-01-31 08:43:15.387 226239 INFO nova.virt.libvirt.driver [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 31 03:43:15 np0005603623 nova_compute[226235]: 2026-01-31 08:43:15.397 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:15 np0005603623 nova_compute[226235]: 2026-01-31 08:43:15.590 226239 DEBUG nova.compute.manager [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:43:15 np0005603623 nova_compute[226235]: 2026-01-31 08:43:15.761 226239 DEBUG nova.policy [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '152becc96c854e3ca68b1b377cae845d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0a5099d95b08434397e0bf691596b2cf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:43:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:43:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:15.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:43:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:15.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:15 np0005603623 podman[298702]: 2026-01-31 08:43:15.994840998 +0000 UTC m=+0.082353195 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 31 03:43:16 np0005603623 podman[298703]: 2026-01-31 08:43:16.040778409 +0000 UTC m=+0.128674758 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.116 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.152 226239 DEBUG nova.compute.manager [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.153 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.153 226239 INFO nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Creating image(s)#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.178 226239 DEBUG nova.storage.rbd_utils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] rbd image 87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.204 226239 DEBUG nova.storage.rbd_utils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] rbd image 87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.229 226239 DEBUG nova.storage.rbd_utils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] rbd image 87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.233 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.291 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.291 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.292 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.292 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.316 226239 DEBUG nova.storage.rbd_utils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] rbd image 87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.319 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.393 226239 INFO nova.virt.libvirt.driver [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.397 226239 DEBUG nova.compute.manager [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.556 226239 DEBUG nova.objects.instance [None req-648dcc49-c1b4-4494-b3c2-3f1aac78e4e2 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.674 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.355s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.741 226239 DEBUG nova.storage.rbd_utils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] resizing rbd image 87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:43:16 np0005603623 nova_compute[226235]: 2026-01-31 08:43:16.939 226239 DEBUG nova.objects.instance [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lazy-loading 'migration_context' on Instance uuid 87263c34-4ecf-4f8a-b2b5-159e41f58aed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:17 np0005603623 nova_compute[226235]: 2026-01-31 08:43:17.016 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:43:17 np0005603623 nova_compute[226235]: 2026-01-31 08:43:17.017 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Ensure instance console log exists: /var/lib/nova/instances/87263c34-4ecf-4f8a-b2b5-159e41f58aed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:43:17 np0005603623 nova_compute[226235]: 2026-01-31 08:43:17.017 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:17 np0005603623 nova_compute[226235]: 2026-01-31 08:43:17.018 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:17 np0005603623 nova_compute[226235]: 2026-01-31 08:43:17.018 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:17 np0005603623 nova_compute[226235]: 2026-01-31 08:43:17.634 226239 DEBUG nova.network.neutron [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Successfully created port: 6b860d9b-53bc-4fbb-ae3a-554edc649838 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:43:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:17.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:17.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:19.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:19.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:20 np0005603623 nova_compute[226235]: 2026-01-31 08:43:20.192 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:21 np0005603623 nova_compute[226235]: 2026-01-31 08:43:21.117 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:21 np0005603623 nova_compute[226235]: 2026-01-31 08:43:21.227 226239 DEBUG nova.network.neutron [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Successfully updated port: 6b860d9b-53bc-4fbb-ae3a-554edc649838 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:43:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 e360: 3 total, 3 up, 3 in
Jan 31 03:43:21 np0005603623 nova_compute[226235]: 2026-01-31 08:43:21.375 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Acquiring lock "refresh_cache-87263c34-4ecf-4f8a-b2b5-159e41f58aed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:43:21 np0005603623 nova_compute[226235]: 2026-01-31 08:43:21.376 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Acquired lock "refresh_cache-87263c34-4ecf-4f8a-b2b5-159e41f58aed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:43:21 np0005603623 nova_compute[226235]: 2026-01-31 08:43:21.376 226239 DEBUG nova.network.neutron [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:43:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:21.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:21.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:22 np0005603623 nova_compute[226235]: 2026-01-31 08:43:22.019 226239 DEBUG nova.network.neutron [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:43:22 np0005603623 nova_compute[226235]: 2026-01-31 08:43:22.371 226239 DEBUG nova.compute.manager [req-869fabd9-f659-4b50-8814-73069334648c req-f569ab80-c9d2-4936-aa1b-7ef0bf711782 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Received event network-changed-6b860d9b-53bc-4fbb-ae3a-554edc649838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:22 np0005603623 nova_compute[226235]: 2026-01-31 08:43:22.371 226239 DEBUG nova.compute.manager [req-869fabd9-f659-4b50-8814-73069334648c req-f569ab80-c9d2-4936-aa1b-7ef0bf711782 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Refreshing instance network info cache due to event network-changed-6b860d9b-53bc-4fbb-ae3a-554edc649838. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:43:22 np0005603623 nova_compute[226235]: 2026-01-31 08:43:22.371 226239 DEBUG oslo_concurrency.lockutils [req-869fabd9-f659-4b50-8814-73069334648c req-f569ab80-c9d2-4936-aa1b-7ef0bf711782 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-87263c34-4ecf-4f8a-b2b5-159e41f58aed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:43:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:23.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:23.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:24 np0005603623 nova_compute[226235]: 2026-01-31 08:43:24.817 226239 INFO nova.compute.manager [None req-28686b42-2357-4835-bc72-6c5bc9f8680a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Get console output#033[00m
Jan 31 03:43:24 np0005603623 nova_compute[226235]: 2026-01-31 08:43:24.822 270602 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.195 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.615 226239 DEBUG nova.network.neutron [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Updating instance_info_cache with network_info: [{"id": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "address": "fa:16:3e:86:4b:b5", "network": {"id": "d2217bdc-492e-4a75-b218-e1312785e754", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-756188436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a5099d95b08434397e0bf691596b2cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b860d9b-53", "ovs_interfaceid": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:43:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:25.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.874 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Releasing lock "refresh_cache-87263c34-4ecf-4f8a-b2b5-159e41f58aed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.874 226239 DEBUG nova.compute.manager [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Instance network_info: |[{"id": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "address": "fa:16:3e:86:4b:b5", "network": {"id": "d2217bdc-492e-4a75-b218-e1312785e754", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-756188436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a5099d95b08434397e0bf691596b2cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b860d9b-53", "ovs_interfaceid": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.875 226239 DEBUG oslo_concurrency.lockutils [req-869fabd9-f659-4b50-8814-73069334648c req-f569ab80-c9d2-4936-aa1b-7ef0bf711782 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-87263c34-4ecf-4f8a-b2b5-159e41f58aed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.875 226239 DEBUG nova.network.neutron [req-869fabd9-f659-4b50-8814-73069334648c req-f569ab80-c9d2-4936-aa1b-7ef0bf711782 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Refreshing network info cache for port 6b860d9b-53bc-4fbb-ae3a-554edc649838 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.880 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Start _get_guest_xml network_info=[{"id": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "address": "fa:16:3e:86:4b:b5", "network": {"id": "d2217bdc-492e-4a75-b218-e1312785e754", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-756188436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a5099d95b08434397e0bf691596b2cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b860d9b-53", "ovs_interfaceid": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:43:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.884 226239 WARNING nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:43:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:25.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:43:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:43:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.891 226239 DEBUG nova.virt.libvirt.host [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.892 226239 DEBUG nova.virt.libvirt.host [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.895 226239 DEBUG nova.virt.libvirt.host [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.897 226239 DEBUG nova.virt.libvirt.host [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.900 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.900 226239 DEBUG nova.virt.hardware [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.901 226239 DEBUG nova.virt.hardware [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.902 226239 DEBUG nova.virt.hardware [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.902 226239 DEBUG nova.virt.hardware [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.903 226239 DEBUG nova.virt.hardware [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.903 226239 DEBUG nova.virt.hardware [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.903 226239 DEBUG nova.virt.hardware [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.904 226239 DEBUG nova.virt.hardware [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.904 226239 DEBUG nova.virt.hardware [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.905 226239 DEBUG nova.virt.hardware [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.905 226239 DEBUG nova.virt.hardware [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:43:25 np0005603623 nova_compute[226235]: 2026-01-31 08:43:25.910 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:26 np0005603623 nova_compute[226235]: 2026-01-31 08:43:26.119 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:43:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1436310098' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:43:26 np0005603623 nova_compute[226235]: 2026-01-31 08:43:26.354 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:26 np0005603623 nova_compute[226235]: 2026-01-31 08:43:26.397 226239 DEBUG nova.storage.rbd_utils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] rbd image 87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:26 np0005603623 nova_compute[226235]: 2026-01-31 08:43:26.401 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:43:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/988874058' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:43:26 np0005603623 nova_compute[226235]: 2026-01-31 08:43:26.821 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:26 np0005603623 nova_compute[226235]: 2026-01-31 08:43:26.823 226239 DEBUG nova.virt.libvirt.vif [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:43:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-152812282',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servertagstestjson-server-152812282',id=152,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a5099d95b08434397e0bf691596b2cf',ramdisk_id='',reservation_id='r-0aev7bsy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-638914080',owner_user_name='tempest-ServerTagsTestJSON-638914080-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:43:15Z,user_data=None,user_id='152becc96c854e3ca68b1b377cae845d',uuid=87263c34-4ecf-4f8a-b2b5-159e41f58aed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "address": "fa:16:3e:86:4b:b5", "network": {"id": "d2217bdc-492e-4a75-b218-e1312785e754", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-756188436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a5099d95b08434397e0bf691596b2cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b860d9b-53", "ovs_interfaceid": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:43:26 np0005603623 nova_compute[226235]: 2026-01-31 08:43:26.823 226239 DEBUG nova.network.os_vif_util [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Converting VIF {"id": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "address": "fa:16:3e:86:4b:b5", "network": {"id": "d2217bdc-492e-4a75-b218-e1312785e754", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-756188436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a5099d95b08434397e0bf691596b2cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b860d9b-53", "ovs_interfaceid": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:43:26 np0005603623 nova_compute[226235]: 2026-01-31 08:43:26.824 226239 DEBUG nova.network.os_vif_util [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:4b:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b860d9b-53bc-4fbb-ae3a-554edc649838,network=Network(d2217bdc-492e-4a75-b218-e1312785e754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b860d9b-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:43:26 np0005603623 nova_compute[226235]: 2026-01-31 08:43:26.825 226239 DEBUG nova.objects.instance [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lazy-loading 'pci_devices' on Instance uuid 87263c34-4ecf-4f8a-b2b5-159e41f58aed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.008 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:27 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:27.010 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:43:27 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:27.012 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.025 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  <uuid>87263c34-4ecf-4f8a-b2b5-159e41f58aed</uuid>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  <name>instance-00000098</name>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerTagsTestJSON-server-152812282</nova:name>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:43:25</nova:creationTime>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <nova:user uuid="152becc96c854e3ca68b1b377cae845d">tempest-ServerTagsTestJSON-638914080-project-member</nova:user>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <nova:project uuid="0a5099d95b08434397e0bf691596b2cf">tempest-ServerTagsTestJSON-638914080</nova:project>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <nova:port uuid="6b860d9b-53bc-4fbb-ae3a-554edc649838">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <entry name="serial">87263c34-4ecf-4f8a-b2b5-159e41f58aed</entry>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <entry name="uuid">87263c34-4ecf-4f8a-b2b5-159e41f58aed</entry>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk.config">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:86:4b:b5"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <target dev="tap6b860d9b-53"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/87263c34-4ecf-4f8a-b2b5-159e41f58aed/console.log" append="off"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:43:27 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:43:27 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:43:27 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:43:27 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.027 226239 DEBUG nova.compute.manager [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Preparing to wait for external event network-vif-plugged-6b860d9b-53bc-4fbb-ae3a-554edc649838 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.028 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Acquiring lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.028 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.028 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.029 226239 DEBUG nova.virt.libvirt.vif [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:43:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-152812282',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servertagstestjson-server-152812282',id=152,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a5099d95b08434397e0bf691596b2cf',ramdisk_id='',reservation_id='r-0aev7bsy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-638914080',owner_user_name='tempest-ServerTagsTestJSON-638914080-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:43:15Z,user_data=None,user_id='152becc96c854e3ca68b1b377cae845d',uuid=87263c34-4ecf-4f8a-b2b5-159e41f58aed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "address": "fa:16:3e:86:4b:b5", "network": {"id": "d2217bdc-492e-4a75-b218-e1312785e754", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-756188436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a5099d95b08434397e0bf691596b2cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b860d9b-53", "ovs_interfaceid": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.029 226239 DEBUG nova.network.os_vif_util [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Converting VIF {"id": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "address": "fa:16:3e:86:4b:b5", "network": {"id": "d2217bdc-492e-4a75-b218-e1312785e754", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-756188436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a5099d95b08434397e0bf691596b2cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b860d9b-53", "ovs_interfaceid": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.029 226239 DEBUG nova.network.os_vif_util [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:4b:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b860d9b-53bc-4fbb-ae3a-554edc649838,network=Network(d2217bdc-492e-4a75-b218-e1312785e754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b860d9b-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.030 226239 DEBUG os_vif [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:4b:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b860d9b-53bc-4fbb-ae3a-554edc649838,network=Network(d2217bdc-492e-4a75-b218-e1312785e754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b860d9b-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.030 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.031 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.031 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.034 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.034 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b860d9b-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.034 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6b860d9b-53, col_values=(('external_ids', {'iface-id': '6b860d9b-53bc-4fbb-ae3a-554edc649838', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:4b:b5', 'vm-uuid': '87263c34-4ecf-4f8a-b2b5-159e41f58aed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.035 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:27 np0005603623 NetworkManager[48970]: <info>  [1769849007.0372] manager: (tap6b860d9b-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.038 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.041 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.042 226239 INFO os_vif [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:4b:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b860d9b-53bc-4fbb-ae3a-554edc649838,network=Network(d2217bdc-492e-4a75-b218-e1312785e754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b860d9b-53')#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.809 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.810 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.810 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] No VIF found with MAC fa:16:3e:86:4b:b5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.811 226239 INFO nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Using config drive#033[00m
Jan 31 03:43:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:27.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:27 np0005603623 nova_compute[226235]: 2026-01-31 08:43:27.846 226239 DEBUG nova.storage.rbd_utils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] rbd image 87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:27.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:28 np0005603623 nova_compute[226235]: 2026-01-31 08:43:28.874 226239 INFO nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Creating config drive at /var/lib/nova/instances/87263c34-4ecf-4f8a-b2b5-159e41f58aed/disk.config#033[00m
Jan 31 03:43:28 np0005603623 nova_compute[226235]: 2026-01-31 08:43:28.882 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/87263c34-4ecf-4f8a-b2b5-159e41f58aed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmppgzc1ais execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.012 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/87263c34-4ecf-4f8a-b2b5-159e41f58aed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmppgzc1ais" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.046 226239 DEBUG nova.storage.rbd_utils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] rbd image 87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.050 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/87263c34-4ecf-4f8a-b2b5-159e41f58aed/disk.config 87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.186 226239 DEBUG nova.compute.manager [req-73073c3e-6263-4ebc-8319-0b1bc249f317 req-0b9f8426-d49d-4174-849b-c3888f4f6a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Received event network-changed-b57f4c41-e254-4e29-be21-1899bdb779e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.186 226239 DEBUG nova.compute.manager [req-73073c3e-6263-4ebc-8319-0b1bc249f317 req-0b9f8426-d49d-4174-849b-c3888f4f6a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Refreshing instance network info cache due to event network-changed-b57f4c41-e254-4e29-be21-1899bdb779e0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.188 226239 DEBUG oslo_concurrency.lockutils [req-73073c3e-6263-4ebc-8319-0b1bc249f317 req-0b9f8426-d49d-4174-849b-c3888f4f6a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e7694c5e-8d11-4f04-aec6-d1933f668d11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.188 226239 DEBUG oslo_concurrency.lockutils [req-73073c3e-6263-4ebc-8319-0b1bc249f317 req-0b9f8426-d49d-4174-849b-c3888f4f6a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e7694c5e-8d11-4f04-aec6-d1933f668d11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.189 226239 DEBUG nova.network.neutron [req-73073c3e-6263-4ebc-8319-0b1bc249f317 req-0b9f8426-d49d-4174-849b-c3888f4f6a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Refreshing network info cache for port b57f4c41-e254-4e29-be21-1899bdb779e0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.280 226239 DEBUG nova.network.neutron [req-869fabd9-f659-4b50-8814-73069334648c req-f569ab80-c9d2-4936-aa1b-7ef0bf711782 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Updated VIF entry in instance network info cache for port 6b860d9b-53bc-4fbb-ae3a-554edc649838. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.281 226239 DEBUG nova.network.neutron [req-869fabd9-f659-4b50-8814-73069334648c req-f569ab80-c9d2-4936-aa1b-7ef0bf711782 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Updating instance_info_cache with network_info: [{"id": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "address": "fa:16:3e:86:4b:b5", "network": {"id": "d2217bdc-492e-4a75-b218-e1312785e754", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-756188436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a5099d95b08434397e0bf691596b2cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b860d9b-53", "ovs_interfaceid": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.285 226239 DEBUG oslo_concurrency.lockutils [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "e7694c5e-8d11-4f04-aec6-d1933f668d11" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.286 226239 DEBUG oslo_concurrency.lockutils [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "e7694c5e-8d11-4f04-aec6-d1933f668d11" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.286 226239 DEBUG oslo_concurrency.lockutils [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "e7694c5e-8d11-4f04-aec6-d1933f668d11-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.286 226239 DEBUG oslo_concurrency.lockutils [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "e7694c5e-8d11-4f04-aec6-d1933f668d11-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.287 226239 DEBUG oslo_concurrency.lockutils [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "e7694c5e-8d11-4f04-aec6-d1933f668d11-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.288 226239 INFO nova.compute.manager [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Terminating instance#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.289 226239 DEBUG nova.compute.manager [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.575 226239 DEBUG oslo_concurrency.lockutils [req-869fabd9-f659-4b50-8814-73069334648c req-f569ab80-c9d2-4936-aa1b-7ef0bf711782 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-87263c34-4ecf-4f8a-b2b5-159e41f58aed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:43:29 np0005603623 kernel: tapb57f4c41-e2 (unregistering): left promiscuous mode
Jan 31 03:43:29 np0005603623 NetworkManager[48970]: <info>  [1769849009.5898] device (tapb57f4c41-e2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.589 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:29 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:29Z|00650|binding|INFO|Releasing lport b57f4c41-e254-4e29-be21-1899bdb779e0 from this chassis (sb_readonly=0)
Jan 31 03:43:29 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:29Z|00651|binding|INFO|Setting lport b57f4c41-e254-4e29-be21-1899bdb779e0 down in Southbound
Jan 31 03:43:29 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:29Z|00652|binding|INFO|Removing iface tapb57f4c41-e2 ovn-installed in OVS
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.599 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.601 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.607 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:29 np0005603623 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000097.scope: Deactivated successfully.
Jan 31 03:43:29 np0005603623 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000097.scope: Consumed 2.134s CPU time.
Jan 31 03:43:29 np0005603623 systemd-machined[194379]: Machine qemu-73-instance-00000097 terminated.
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.732 226239 INFO nova.virt.libvirt.driver [-] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Instance destroyed successfully.#033[00m
Jan 31 03:43:29 np0005603623 nova_compute[226235]: 2026-01-31 08:43:29.733 226239 DEBUG nova.objects.instance [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'resources' on Instance uuid e7694c5e-8d11-4f04-aec6-d1933f668d11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:29.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:29.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:30.135 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:30.135 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:30.136 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:30.165 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:78:9b 10.100.0.7'], port_security=['fa:16:3e:26:78:9b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e7694c5e-8d11-4f04-aec6-d1933f668d11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b000b527-ea00-4c0c-84c4-e93c10d4bae5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '12', 'neutron:security_group_ids': '15398c59-1164-4f8b-8737-a5ada60dadf3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afdea9bd-63de-451e-8b4c-572440598122, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=b57f4c41-e254-4e29-be21-1899bdb779e0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:43:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:30.166 143258 INFO neutron.agent.ovn.metadata.agent [-] Port b57f4c41-e254-4e29-be21-1899bdb779e0 in datapath b000b527-ea00-4c0c-84c4-e93c10d4bae5 unbound from our chassis#033[00m
Jan 31 03:43:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:30.167 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b000b527-ea00-4c0c-84c4-e93c10d4bae5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:43:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:30.168 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ea8f06d7-f522-4f58-b5f3-a8147124af34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:30.168 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5 namespace which is not needed anymore#033[00m
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.233 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:30 np0005603623 neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5[298610]: [NOTICE]   (298614) : haproxy version is 2.8.14-c23fe91
Jan 31 03:43:30 np0005603623 neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5[298610]: [NOTICE]   (298614) : path to executable is /usr/sbin/haproxy
Jan 31 03:43:30 np0005603623 neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5[298610]: [WARNING]  (298614) : Exiting Master process...
Jan 31 03:43:30 np0005603623 neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5[298610]: [WARNING]  (298614) : Exiting Master process...
Jan 31 03:43:30 np0005603623 neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5[298610]: [ALERT]    (298614) : Current worker (298616) exited with code 143 (Terminated)
Jan 31 03:43:30 np0005603623 neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5[298610]: [WARNING]  (298614) : All workers exited. Exiting... (0)
Jan 31 03:43:30 np0005603623 systemd[1]: libpod-e151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978.scope: Deactivated successfully.
Jan 31 03:43:30 np0005603623 podman[299210]: 2026-01-31 08:43:30.424988571 +0000 UTC m=+0.179339547 container died e151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.458 226239 DEBUG oslo_concurrency.processutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/87263c34-4ecf-4f8a-b2b5-159e41f58aed/disk.config 87263c34-4ecf-4f8a-b2b5-159e41f58aed_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.459 226239 INFO nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Deleting local config drive /var/lib/nova/instances/87263c34-4ecf-4f8a-b2b5-159e41f58aed/disk.config because it was imported into RBD.#033[00m
Jan 31 03:43:30 np0005603623 kernel: tap6b860d9b-53: entered promiscuous mode
Jan 31 03:43:30 np0005603623 NetworkManager[48970]: <info>  [1769849010.5106] manager: (tap6b860d9b-53): new Tun device (/org/freedesktop/NetworkManager/Devices/310)
Jan 31 03:43:30 np0005603623 systemd-udevd[299177]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:43:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:30Z|00653|binding|INFO|Claiming lport 6b860d9b-53bc-4fbb-ae3a-554edc649838 for this chassis.
Jan 31 03:43:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:30Z|00654|binding|INFO|6b860d9b-53bc-4fbb-ae3a-554edc649838: Claiming fa:16:3e:86:4b:b5 10.100.0.6
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.514 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:30Z|00655|binding|INFO|Setting lport 6b860d9b-53bc-4fbb-ae3a-554edc649838 ovn-installed in OVS
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.526 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.530 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:30 np0005603623 NetworkManager[48970]: <info>  [1769849010.5312] device (tap6b860d9b-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:43:30 np0005603623 NetworkManager[48970]: <info>  [1769849010.5331] device (tap6b860d9b-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.539 226239 DEBUG nova.virt.libvirt.vif [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:41:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-617880352',display_name='tempest-TestNetworkAdvancedServerOps-server-617880352',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-617880352',id=151,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE9oW7hyNK/c0GmlhWHVsudW1EFOU1/778j2Zfzh7XKLIHLI+8KsqNzzhySs7L5TOC+KBq7HkFVRK05TmxJs9LQc4oVDYzV+eQ5EXf3bE6KOfId7bnJvpzjj70u8lMALWA==',key_name='tempest-TestNetworkAdvancedServerOps-530143631',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:42:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-3ynh01gy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:43:16Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=e7694c5e-8d11-4f04-aec6-d1933f668d11,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b57f4c41-e254-4e29-be21-1899bdb779e0", "address": "fa:16:3e:26:78:9b", "network": {"id": "b000b527-ea00-4c0c-84c4-e93c10d4bae5", "bridge": "br-int", "label": "tempest-network-smoke--1061452383", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb57f4c41-e2", "ovs_interfaceid": "b57f4c41-e254-4e29-be21-1899bdb779e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.539 226239 DEBUG nova.network.os_vif_util [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "b57f4c41-e254-4e29-be21-1899bdb779e0", "address": "fa:16:3e:26:78:9b", "network": {"id": "b000b527-ea00-4c0c-84c4-e93c10d4bae5", "bridge": "br-int", "label": "tempest-network-smoke--1061452383", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb57f4c41-e2", "ovs_interfaceid": "b57f4c41-e254-4e29-be21-1899bdb779e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.540 226239 DEBUG nova.network.os_vif_util [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:78:9b,bridge_name='br-int',has_traffic_filtering=True,id=b57f4c41-e254-4e29-be21-1899bdb779e0,network=Network(b000b527-ea00-4c0c-84c4-e93c10d4bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb57f4c41-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.541 226239 DEBUG os_vif [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:78:9b,bridge_name='br-int',has_traffic_filtering=True,id=b57f4c41-e254-4e29-be21-1899bdb779e0,network=Network(b000b527-ea00-4c0c-84c4-e93c10d4bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb57f4c41-e2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.542 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.543 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb57f4c41-e2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.544 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.546 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:30 np0005603623 nova_compute[226235]: 2026-01-31 08:43:30.549 226239 INFO os_vif [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:78:9b,bridge_name='br-int',has_traffic_filtering=True,id=b57f4c41-e254-4e29-be21-1899bdb779e0,network=Network(b000b527-ea00-4c0c-84c4-e93c10d4bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb57f4c41-e2')#033[00m
Jan 31 03:43:30 np0005603623 systemd-machined[194379]: New machine qemu-74-instance-00000098.
Jan 31 03:43:30 np0005603623 systemd[1]: Started Virtual Machine qemu-74-instance-00000098.
Jan 31 03:43:30 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978-userdata-shm.mount: Deactivated successfully.
Jan 31 03:43:30 np0005603623 systemd[1]: var-lib-containers-storage-overlay-81318cf4c2ff6c58ae624044bb1c76772f3f5fbe97bffcf3e995711abb796f23-merged.mount: Deactivated successfully.
Jan 31 03:43:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:30.987 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:4b:b5 10.100.0.6'], port_security=['fa:16:3e:86:4b:b5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87263c34-4ecf-4f8a-b2b5-159e41f58aed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2217bdc-492e-4a75-b218-e1312785e754', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a5099d95b08434397e0bf691596b2cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '58379112-028d-4319-85ca-398661444998', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bef6b07b-a0fb-46f3-a966-1dcef9527ad0, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=6b860d9b-53bc-4fbb-ae3a-554edc649838) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:43:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:30Z|00656|binding|INFO|Setting lport 6b860d9b-53bc-4fbb-ae3a-554edc649838 up in Southbound
Jan 31 03:43:31 np0005603623 podman[299210]: 2026-01-31 08:43:31.058088622 +0000 UTC m=+0.812439598 container cleanup e151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:43:31 np0005603623 systemd[1]: libpod-conmon-e151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978.scope: Deactivated successfully.
Jan 31 03:43:31 np0005603623 podman[299282]: 2026-01-31 08:43:31.128504041 +0000 UTC m=+0.049643098 container remove e151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.133 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c2a0b74c-c877-4934-8169-32706a230bff]: (4, ('Sat Jan 31 08:43:30 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5 (e151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978)\ne151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978\nSat Jan 31 08:43:31 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5 (e151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978)\ne151085cd96dfd59f788f865a8c80bc5a90f4dafdb82d929915a02db27304978\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.134 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e378fe71-0a56-4297-8522-9055fcde9655]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.135 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb000b527-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:31 np0005603623 nova_compute[226235]: 2026-01-31 08:43:31.136 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:31 np0005603623 kernel: tapb000b527-e0: left promiscuous mode
Jan 31 03:43:31 np0005603623 nova_compute[226235]: 2026-01-31 08:43:31.144 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.148 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[decb7a0b-0d27-43b7-bb4c-2602c7ed92e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.164 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d02de9b8-516f-493b-ad47-9df35a33da7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.165 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6a49185b-1933-4714-8d75-ac6e3dd53cc2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.180 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[57887458-03ad-4026-82d8-be6d2d1963e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 812505, 'reachable_time': 20076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299297, 'error': None, 'target': 'ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 systemd[1]: run-netns-ovnmeta\x2db000b527\x2dea00\x2d4c0c\x2d84c4\x2de93c10d4bae5.mount: Deactivated successfully.
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.182 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b000b527-ea00-4c0c-84c4-e93c10d4bae5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.182 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[a76df5d9-3e1c-4916-aa6c-9da941f7371d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.184 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 6b860d9b-53bc-4fbb-ae3a-554edc649838 in datapath d2217bdc-492e-4a75-b218-e1312785e754 unbound from our chassis#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.186 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d2217bdc-492e-4a75-b218-e1312785e754#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.193 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fff65b6d-bcf1-4af7-be99-fe8568468271]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.194 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd2217bdc-41 in ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.196 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd2217bdc-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.196 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cdbf9d55-c03b-49b3-bf03-2e64b1c56ecb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.198 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6feb44d7-3573-4166-a66b-3f346b0ea15b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.212 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ab0660-a598-42ab-ae9d-341158176ed6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.222 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[75be2f0b-f927-46a1-b58d-7a36caf1cad1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.242 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[01b704d6-50df-4a42-acb9-054b5237672b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.247 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef2d33e-30be-42e5-9e78-5207241655f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 NetworkManager[48970]: <info>  [1769849011.2509] manager: (tapd2217bdc-40): new Veth device (/org/freedesktop/NetworkManager/Devices/311)
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.274 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1eb8651c-58b0-4c7e-97b2-b91f3088cb7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.278 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f0ec605d-2a90-4edb-8e1d-95975184ef23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 NetworkManager[48970]: <info>  [1769849011.2946] device (tapd2217bdc-40): carrier: link connected
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.296 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d34d2134-1570-40c7-8171-3c951484c629]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.310 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb41c86-961a-4fe2-bc94-4168b3c96ec2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2217bdc-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:c0:83'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 195], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815172, 'reachable_time': 30904, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299321, 'error': None, 'target': 'ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.323 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b315526c-81e6-48f4-a9ed-c76162f09fd8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6f:c083'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815172, 'tstamp': 815172}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 299322, 'error': None, 'target': 'ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.335 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bc7329ba-0fc3-42f7-b78f-4c1eb1525ef5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2217bdc-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:c0:83'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 195], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815172, 'reachable_time': 30904, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 299323, 'error': None, 'target': 'ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.358 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[19d962fe-c1ac-489f-a0c8-7a8c7263f78a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.395 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c4831438-4956-416b-a437-0806c57524cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.396 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2217bdc-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.396 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.397 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2217bdc-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:31 np0005603623 nova_compute[226235]: 2026-01-31 08:43:31.398 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:31 np0005603623 NetworkManager[48970]: <info>  [1769849011.3993] manager: (tapd2217bdc-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Jan 31 03:43:31 np0005603623 kernel: tapd2217bdc-40: entered promiscuous mode
Jan 31 03:43:31 np0005603623 nova_compute[226235]: 2026-01-31 08:43:31.400 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.401 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd2217bdc-40, col_values=(('external_ids', {'iface-id': 'cdd2a999-fc1d-4caa-bc97-b2f364cd3cb5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:31 np0005603623 nova_compute[226235]: 2026-01-31 08:43:31.402 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:31 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:31Z|00657|binding|INFO|Releasing lport cdd2a999-fc1d-4caa-bc97-b2f364cd3cb5 from this chassis (sb_readonly=0)
Jan 31 03:43:31 np0005603623 nova_compute[226235]: 2026-01-31 08:43:31.407 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.408 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d2217bdc-492e-4a75-b218-e1312785e754.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d2217bdc-492e-4a75-b218-e1312785e754.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.408 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b7b66fcf-aeb9-4698-91ad-9606a2be1874]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.409 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-d2217bdc-492e-4a75-b218-e1312785e754
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/d2217bdc-492e-4a75-b218-e1312785e754.pid.haproxy
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID d2217bdc-492e-4a75-b218-e1312785e754
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:43:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:31.409 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754', 'env', 'PROCESS_TAG=haproxy-d2217bdc-492e-4a75-b218-e1312785e754', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d2217bdc-492e-4a75-b218-e1312785e754.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:43:31 np0005603623 nova_compute[226235]: 2026-01-31 08:43:31.722 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849011.7220006, 87263c34-4ecf-4f8a-b2b5-159e41f58aed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:43:31 np0005603623 nova_compute[226235]: 2026-01-31 08:43:31.724 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] VM Started (Lifecycle Event)#033[00m
Jan 31 03:43:31 np0005603623 podman[299397]: 2026-01-31 08:43:31.781347512 +0000 UTC m=+0.044720854 container create 4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:43:31 np0005603623 systemd[1]: Started libpod-conmon-4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87.scope.
Jan 31 03:43:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:31.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:31 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:43:31 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cd4a7607cd659e416399f280302de06d72e9023beb6b1a99f601b3e82a6ae26/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:43:31 np0005603623 podman[299397]: 2026-01-31 08:43:31.759902808 +0000 UTC m=+0.023276180 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:43:31 np0005603623 podman[299397]: 2026-01-31 08:43:31.867742142 +0000 UTC m=+0.131115504 container init 4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 03:43:31 np0005603623 podman[299397]: 2026-01-31 08:43:31.872603544 +0000 UTC m=+0.135976886 container start 4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:43:31 np0005603623 neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754[299414]: [NOTICE]   (299418) : New worker (299420) forked
Jan 31 03:43:31 np0005603623 neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754[299414]: [NOTICE]   (299418) : Loading success.
Jan 31 03:43:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:31.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.039 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.049 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849011.7233891, 87263c34-4ecf-4f8a-b2b5-159e41f58aed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.051 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.256 226239 DEBUG nova.compute.manager [req-1061080a-a7e7-4887-adb9-72a676a62949 req-1eac5cc2-4475-4d36-855a-4c1d40305b38 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Received event network-vif-unplugged-b57f4c41-e254-4e29-be21-1899bdb779e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.256 226239 DEBUG oslo_concurrency.lockutils [req-1061080a-a7e7-4887-adb9-72a676a62949 req-1eac5cc2-4475-4d36-855a-4c1d40305b38 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e7694c5e-8d11-4f04-aec6-d1933f668d11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.257 226239 DEBUG oslo_concurrency.lockutils [req-1061080a-a7e7-4887-adb9-72a676a62949 req-1eac5cc2-4475-4d36-855a-4c1d40305b38 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e7694c5e-8d11-4f04-aec6-d1933f668d11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.257 226239 DEBUG oslo_concurrency.lockutils [req-1061080a-a7e7-4887-adb9-72a676a62949 req-1eac5cc2-4475-4d36-855a-4c1d40305b38 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e7694c5e-8d11-4f04-aec6-d1933f668d11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.257 226239 DEBUG nova.compute.manager [req-1061080a-a7e7-4887-adb9-72a676a62949 req-1eac5cc2-4475-4d36-855a-4c1d40305b38 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] No waiting events found dispatching network-vif-unplugged-b57f4c41-e254-4e29-be21-1899bdb779e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.257 226239 DEBUG nova.compute.manager [req-1061080a-a7e7-4887-adb9-72a676a62949 req-1eac5cc2-4475-4d36-855a-4c1d40305b38 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Received event network-vif-unplugged-b57f4c41-e254-4e29-be21-1899bdb779e0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.274 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.278 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.873 226239 INFO nova.virt.libvirt.driver [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Deleting instance files /var/lib/nova/instances/e7694c5e-8d11-4f04-aec6-d1933f668d11_del#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.874 226239 INFO nova.virt.libvirt.driver [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Deletion of /var/lib/nova/instances/e7694c5e-8d11-4f04-aec6-d1933f668d11_del complete#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.945 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.987 226239 DEBUG nova.network.neutron [req-73073c3e-6263-4ebc-8319-0b1bc249f317 req-0b9f8426-d49d-4174-849b-c3888f4f6a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Updated VIF entry in instance network info cache for port b57f4c41-e254-4e29-be21-1899bdb779e0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:43:32 np0005603623 nova_compute[226235]: 2026-01-31 08:43:32.988 226239 DEBUG nova.network.neutron [req-73073c3e-6263-4ebc-8319-0b1bc249f317 req-0b9f8426-d49d-4174-849b-c3888f4f6a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Updating instance_info_cache with network_info: [{"id": "b57f4c41-e254-4e29-be21-1899bdb779e0", "address": "fa:16:3e:26:78:9b", "network": {"id": "b000b527-ea00-4c0c-84c4-e93c10d4bae5", "bridge": "br-int", "label": "tempest-network-smoke--1061452383", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb57f4c41-e2", "ovs_interfaceid": "b57f4c41-e254-4e29-be21-1899bdb779e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.467 226239 INFO nova.compute.manager [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Took 4.18 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.467 226239 DEBUG oslo.service.loopingcall [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.468 226239 DEBUG nova.compute.manager [-] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.468 226239 DEBUG nova.network.neutron [-] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.508 226239 DEBUG nova.compute.manager [req-6fb6886d-8117-43bd-9c52-723c17545c09 req-e0f7ad87-b356-482d-9f96-ca56d553f8cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Received event network-vif-plugged-6b860d9b-53bc-4fbb-ae3a-554edc649838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.509 226239 DEBUG oslo_concurrency.lockutils [req-6fb6886d-8117-43bd-9c52-723c17545c09 req-e0f7ad87-b356-482d-9f96-ca56d553f8cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.509 226239 DEBUG oslo_concurrency.lockutils [req-6fb6886d-8117-43bd-9c52-723c17545c09 req-e0f7ad87-b356-482d-9f96-ca56d553f8cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.509 226239 DEBUG oslo_concurrency.lockutils [req-6fb6886d-8117-43bd-9c52-723c17545c09 req-e0f7ad87-b356-482d-9f96-ca56d553f8cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.510 226239 DEBUG nova.compute.manager [req-6fb6886d-8117-43bd-9c52-723c17545c09 req-e0f7ad87-b356-482d-9f96-ca56d553f8cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Processing event network-vif-plugged-6b860d9b-53bc-4fbb-ae3a-554edc649838 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.510 226239 DEBUG nova.compute.manager [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.514 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849013.5140352, 87263c34-4ecf-4f8a-b2b5-159e41f58aed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.514 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.516 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.520 226239 INFO nova.virt.libvirt.driver [-] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Instance spawned successfully.#033[00m
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.521 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:43:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:33.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:33.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:33 np0005603623 nova_compute[226235]: 2026-01-31 08:43:33.925 226239 DEBUG oslo_concurrency.lockutils [req-73073c3e-6263-4ebc-8319-0b1bc249f317 req-0b9f8426-d49d-4174-849b-c3888f4f6a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e7694c5e-8d11-4f04-aec6-d1933f668d11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:43:34 np0005603623 nova_compute[226235]: 2026-01-31 08:43:34.080 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:34 np0005603623 nova_compute[226235]: 2026-01-31 08:43:34.086 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:43:34 np0005603623 nova_compute[226235]: 2026-01-31 08:43:34.089 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:43:34 np0005603623 nova_compute[226235]: 2026-01-31 08:43:34.089 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:43:34 np0005603623 nova_compute[226235]: 2026-01-31 08:43:34.090 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:43:34 np0005603623 nova_compute[226235]: 2026-01-31 08:43:34.090 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:43:34 np0005603623 nova_compute[226235]: 2026-01-31 08:43:34.091 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:43:34 np0005603623 nova_compute[226235]: 2026-01-31 08:43:34.091 226239 DEBUG nova.virt.libvirt.driver [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:43:34 np0005603623 nova_compute[226235]: 2026-01-31 08:43:34.505 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:43:34 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:43:34 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:43:35 np0005603623 nova_compute[226235]: 2026-01-31 08:43:35.022 226239 INFO nova.compute.manager [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Took 18.87 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:43:35 np0005603623 nova_compute[226235]: 2026-01-31 08:43:35.022 226239 DEBUG nova.compute.manager [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:35 np0005603623 nova_compute[226235]: 2026-01-31 08:43:35.235 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:35 np0005603623 nova_compute[226235]: 2026-01-31 08:43:35.453 226239 INFO nova.compute.manager [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Took 22.89 seconds to build instance.#033[00m
Jan 31 03:43:35 np0005603623 nova_compute[226235]: 2026-01-31 08:43:35.544 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:35 np0005603623 nova_compute[226235]: 2026-01-31 08:43:35.550 226239 DEBUG oslo_concurrency.lockutils [None req-adae89eb-9cbf-42f0-a098-41d062d40213 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:35.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:35.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:36.015 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:36 np0005603623 nova_compute[226235]: 2026-01-31 08:43:36.732 226239 DEBUG nova.compute.manager [req-f8f87c6e-f1ee-4b15-9a39-9e44e429376e req-55ce5352-41be-4e34-92ca-5ec30547d269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Received event network-vif-plugged-b57f4c41-e254-4e29-be21-1899bdb779e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:36 np0005603623 nova_compute[226235]: 2026-01-31 08:43:36.732 226239 DEBUG oslo_concurrency.lockutils [req-f8f87c6e-f1ee-4b15-9a39-9e44e429376e req-55ce5352-41be-4e34-92ca-5ec30547d269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e7694c5e-8d11-4f04-aec6-d1933f668d11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:36 np0005603623 nova_compute[226235]: 2026-01-31 08:43:36.732 226239 DEBUG oslo_concurrency.lockutils [req-f8f87c6e-f1ee-4b15-9a39-9e44e429376e req-55ce5352-41be-4e34-92ca-5ec30547d269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e7694c5e-8d11-4f04-aec6-d1933f668d11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:36 np0005603623 nova_compute[226235]: 2026-01-31 08:43:36.733 226239 DEBUG oslo_concurrency.lockutils [req-f8f87c6e-f1ee-4b15-9a39-9e44e429376e req-55ce5352-41be-4e34-92ca-5ec30547d269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e7694c5e-8d11-4f04-aec6-d1933f668d11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:36 np0005603623 nova_compute[226235]: 2026-01-31 08:43:36.733 226239 DEBUG nova.compute.manager [req-f8f87c6e-f1ee-4b15-9a39-9e44e429376e req-55ce5352-41be-4e34-92ca-5ec30547d269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] No waiting events found dispatching network-vif-plugged-b57f4c41-e254-4e29-be21-1899bdb779e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:43:36 np0005603623 nova_compute[226235]: 2026-01-31 08:43:36.733 226239 WARNING nova.compute.manager [req-f8f87c6e-f1ee-4b15-9a39-9e44e429376e req-55ce5352-41be-4e34-92ca-5ec30547d269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Received unexpected event network-vif-plugged-b57f4c41-e254-4e29-be21-1899bdb779e0 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:43:37 np0005603623 nova_compute[226235]: 2026-01-31 08:43:37.112 226239 DEBUG nova.compute.manager [req-d7b4e547-ad73-4d1a-adbf-a009830fecf4 req-d1411b6b-fc44-425b-a380-53320b0aae8c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Received event network-vif-plugged-6b860d9b-53bc-4fbb-ae3a-554edc649838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:37 np0005603623 nova_compute[226235]: 2026-01-31 08:43:37.113 226239 DEBUG oslo_concurrency.lockutils [req-d7b4e547-ad73-4d1a-adbf-a009830fecf4 req-d1411b6b-fc44-425b-a380-53320b0aae8c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:37 np0005603623 nova_compute[226235]: 2026-01-31 08:43:37.113 226239 DEBUG oslo_concurrency.lockutils [req-d7b4e547-ad73-4d1a-adbf-a009830fecf4 req-d1411b6b-fc44-425b-a380-53320b0aae8c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:37 np0005603623 nova_compute[226235]: 2026-01-31 08:43:37.113 226239 DEBUG oslo_concurrency.lockutils [req-d7b4e547-ad73-4d1a-adbf-a009830fecf4 req-d1411b6b-fc44-425b-a380-53320b0aae8c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:37 np0005603623 nova_compute[226235]: 2026-01-31 08:43:37.113 226239 DEBUG nova.compute.manager [req-d7b4e547-ad73-4d1a-adbf-a009830fecf4 req-d1411b6b-fc44-425b-a380-53320b0aae8c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] No waiting events found dispatching network-vif-plugged-6b860d9b-53bc-4fbb-ae3a-554edc649838 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:43:37 np0005603623 nova_compute[226235]: 2026-01-31 08:43:37.114 226239 WARNING nova.compute.manager [req-d7b4e547-ad73-4d1a-adbf-a009830fecf4 req-d1411b6b-fc44-425b-a380-53320b0aae8c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Received unexpected event network-vif-plugged-6b860d9b-53bc-4fbb-ae3a-554edc649838 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:43:37 np0005603623 nova_compute[226235]: 2026-01-31 08:43:37.353 226239 DEBUG nova.network.neutron [-] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:43:37 np0005603623 nova_compute[226235]: 2026-01-31 08:43:37.606 226239 INFO nova.compute.manager [-] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Took 4.14 seconds to deallocate network for instance.#033[00m
Jan 31 03:43:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:43:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:37.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:43:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:37.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:38 np0005603623 nova_compute[226235]: 2026-01-31 08:43:38.296 226239 DEBUG oslo_concurrency.lockutils [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:38 np0005603623 nova_compute[226235]: 2026-01-31 08:43:38.298 226239 DEBUG oslo_concurrency.lockutils [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:38 np0005603623 nova_compute[226235]: 2026-01-31 08:43:38.304 226239 DEBUG oslo_concurrency.lockutils [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:38 np0005603623 nova_compute[226235]: 2026-01-31 08:43:38.917 226239 INFO nova.scheduler.client.report [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Deleted allocations for instance e7694c5e-8d11-4f04-aec6-d1933f668d11#033[00m
Jan 31 03:43:39 np0005603623 nova_compute[226235]: 2026-01-31 08:43:39.551 226239 DEBUG nova.compute.manager [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Received event network-vif-deleted-b57f4c41-e254-4e29-be21-1899bdb779e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:39.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:39.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:40 np0005603623 nova_compute[226235]: 2026-01-31 08:43:40.003 226239 DEBUG oslo_concurrency.lockutils [None req-0ef4cc65-e462-4caf-a980-f5212c21806a f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "e7694c5e-8d11-4f04-aec6-d1933f668d11" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:40 np0005603623 nova_compute[226235]: 2026-01-31 08:43:40.238 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:40 np0005603623 nova_compute[226235]: 2026-01-31 08:43:40.546 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:41.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:41.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:43.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:43.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:44 np0005603623 nova_compute[226235]: 2026-01-31 08:43:44.730 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849009.7289104, e7694c5e-8d11-4f04-aec6-d1933f668d11 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:43:44 np0005603623 nova_compute[226235]: 2026-01-31 08:43:44.731 226239 INFO nova.compute.manager [-] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:43:45 np0005603623 nova_compute[226235]: 2026-01-31 08:43:45.249 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:45 np0005603623 nova_compute[226235]: 2026-01-31 08:43:45.548 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:45Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:86:4b:b5 10.100.0.6
Jan 31 03:43:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:45Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:86:4b:b5 10.100.0.6
Jan 31 03:43:45 np0005603623 nova_compute[226235]: 2026-01-31 08:43:45.618 226239 DEBUG nova.compute.manager [None req-49578e6a-d86e-4531-a49b-0c92950f792e - - - - - -] [instance: e7694c5e-8d11-4f04-aec6-d1933f668d11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:45.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:45.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:46 np0005603623 podman[299537]: 2026-01-31 08:43:46.987445498 +0000 UTC m=+0.059102225 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:43:47 np0005603623 podman[299538]: 2026-01-31 08:43:47.011222184 +0000 UTC m=+0.083363397 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:43:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:47.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:47.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.118 226239 DEBUG oslo_concurrency.lockutils [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Acquiring lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.119 226239 DEBUG oslo_concurrency.lockutils [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.119 226239 DEBUG oslo_concurrency.lockutils [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Acquiring lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.120 226239 DEBUG oslo_concurrency.lockutils [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.120 226239 DEBUG oslo_concurrency.lockutils [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.121 226239 INFO nova.compute.manager [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Terminating instance#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.122 226239 DEBUG nova.compute.manager [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:43:48 np0005603623 kernel: tap6b860d9b-53 (unregistering): left promiscuous mode
Jan 31 03:43:48 np0005603623 NetworkManager[48970]: <info>  [1769849028.1880] device (tap6b860d9b-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.187 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.192 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:48Z|00658|binding|INFO|Releasing lport 6b860d9b-53bc-4fbb-ae3a-554edc649838 from this chassis (sb_readonly=0)
Jan 31 03:43:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:48Z|00659|binding|INFO|Setting lport 6b860d9b-53bc-4fbb-ae3a-554edc649838 down in Southbound
Jan 31 03:43:48 np0005603623 ovn_controller[133449]: 2026-01-31T08:43:48Z|00660|binding|INFO|Removing iface tap6b860d9b-53 ovn-installed in OVS
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.195 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.200 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:48 np0005603623 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000098.scope: Deactivated successfully.
Jan 31 03:43:48 np0005603623 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000098.scope: Consumed 12.882s CPU time.
Jan 31 03:43:48 np0005603623 systemd-machined[194379]: Machine qemu-74-instance-00000098 terminated.
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.270 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:4b:b5 10.100.0.6'], port_security=['fa:16:3e:86:4b:b5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87263c34-4ecf-4f8a-b2b5-159e41f58aed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2217bdc-492e-4a75-b218-e1312785e754', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a5099d95b08434397e0bf691596b2cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '58379112-028d-4319-85ca-398661444998', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bef6b07b-a0fb-46f3-a966-1dcef9527ad0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=6b860d9b-53bc-4fbb-ae3a-554edc649838) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.271 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 6b860d9b-53bc-4fbb-ae3a-554edc649838 in datapath d2217bdc-492e-4a75-b218-e1312785e754 unbound from our chassis#033[00m
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.273 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d2217bdc-492e-4a75-b218-e1312785e754, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.274 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dfd7b4eb-7b62-453c-bfd4-91e7386fc12b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.274 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754 namespace which is not needed anymore#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.339 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.343 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.354 226239 INFO nova.virt.libvirt.driver [-] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Instance destroyed successfully.#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.355 226239 DEBUG nova.objects.instance [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lazy-loading 'resources' on Instance uuid 87263c34-4ecf-4f8a-b2b5-159e41f58aed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:48 np0005603623 neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754[299414]: [NOTICE]   (299418) : haproxy version is 2.8.14-c23fe91
Jan 31 03:43:48 np0005603623 neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754[299414]: [NOTICE]   (299418) : path to executable is /usr/sbin/haproxy
Jan 31 03:43:48 np0005603623 neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754[299414]: [WARNING]  (299418) : Exiting Master process...
Jan 31 03:43:48 np0005603623 neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754[299414]: [ALERT]    (299418) : Current worker (299420) exited with code 143 (Terminated)
Jan 31 03:43:48 np0005603623 neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754[299414]: [WARNING]  (299418) : All workers exited. Exiting... (0)
Jan 31 03:43:48 np0005603623 systemd[1]: libpod-4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87.scope: Deactivated successfully.
Jan 31 03:43:48 np0005603623 podman[299608]: 2026-01-31 08:43:48.392071262 +0000 UTC m=+0.042867585 container died 4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 03:43:48 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87-userdata-shm.mount: Deactivated successfully.
Jan 31 03:43:48 np0005603623 systemd[1]: var-lib-containers-storage-overlay-6cd4a7607cd659e416399f280302de06d72e9023beb6b1a99f601b3e82a6ae26-merged.mount: Deactivated successfully.
Jan 31 03:43:48 np0005603623 podman[299608]: 2026-01-31 08:43:48.42837086 +0000 UTC m=+0.079167173 container cleanup 4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:43:48 np0005603623 systemd[1]: libpod-conmon-4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87.scope: Deactivated successfully.
Jan 31 03:43:48 np0005603623 podman[299645]: 2026-01-31 08:43:48.481501058 +0000 UTC m=+0.037564230 container remove 4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.485 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0b850267-d61d-4b30-b16d-59fc657cb961]: (4, ('Sat Jan 31 08:43:48 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754 (4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87)\n4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87\nSat Jan 31 08:43:48 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754 (4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87)\n4942bfcacba5dfc34d106b492b6195dfbcf33ba64f9e49a97662f16257a25e87\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.487 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d432536e-3e38-436f-b109-74c23f60782b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.488 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2217bdc-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.528 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:48 np0005603623 kernel: tapd2217bdc-40: left promiscuous mode
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.539 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2dc64274-33ba-4f31-9cdf-e978b20e1c11]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.544 226239 DEBUG nova.virt.libvirt.vif [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:43:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-152812282',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servertagstestjson-server-152812282',id=152,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:43:35Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a5099d95b08434397e0bf691596b2cf',ramdisk_id='',reservation_id='r-0aev7bsy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerTagsTestJSON-638914080',owner_user_name='tempest-ServerTagsTestJSON-638914080-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:43:35Z,user_data=None,user_id='152becc96c854e3ca68b1b377cae845d',uuid=87263c34-4ecf-4f8a-b2b5-159e41f58aed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "address": "fa:16:3e:86:4b:b5", "network": {"id": "d2217bdc-492e-4a75-b218-e1312785e754", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-756188436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a5099d95b08434397e0bf691596b2cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b860d9b-53", "ovs_interfaceid": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.545 226239 DEBUG nova.network.os_vif_util [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Converting VIF {"id": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "address": "fa:16:3e:86:4b:b5", "network": {"id": "d2217bdc-492e-4a75-b218-e1312785e754", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-756188436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a5099d95b08434397e0bf691596b2cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b860d9b-53", "ovs_interfaceid": "6b860d9b-53bc-4fbb-ae3a-554edc649838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.546 226239 DEBUG nova.network.os_vif_util [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:4b:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b860d9b-53bc-4fbb-ae3a-554edc649838,network=Network(d2217bdc-492e-4a75-b218-e1312785e754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b860d9b-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.546 226239 DEBUG os_vif [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:4b:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b860d9b-53bc-4fbb-ae3a-554edc649838,network=Network(d2217bdc-492e-4a75-b218-e1312785e754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b860d9b-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.548 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.548 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b860d9b-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.549 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.550 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c08b6fa2-ec3a-4060-b431-070bc24140f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.552 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.552 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a4615493-51c1-48ba-b23f-a70921afea06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.554 226239 INFO os_vif [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:4b:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b860d9b-53bc-4fbb-ae3a-554edc649838,network=Network(d2217bdc-492e-4a75-b218-e1312785e754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b860d9b-53')#033[00m
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.564 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0a397db9-ac7e-4e5d-a6ef-b97b6dd84d86]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815166, 'reachable_time': 41999, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299665, 'error': None, 'target': 'ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.567 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d2217bdc-492e-4a75-b218-e1312785e754 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:43:48 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:43:48.567 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[9ed273ee-8263-40a2-bee5-e685f89a76e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:48 np0005603623 systemd[1]: run-netns-ovnmeta\x2dd2217bdc\x2d492e\x2d4a75\x2db218\x2de1312785e754.mount: Deactivated successfully.
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.930 226239 INFO nova.virt.libvirt.driver [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Deleting instance files /var/lib/nova/instances/87263c34-4ecf-4f8a-b2b5-159e41f58aed_del#033[00m
Jan 31 03:43:48 np0005603623 nova_compute[226235]: 2026-01-31 08:43:48.931 226239 INFO nova.virt.libvirt.driver [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Deletion of /var/lib/nova/instances/87263c34-4ecf-4f8a-b2b5-159e41f58aed_del complete#033[00m
Jan 31 03:43:49 np0005603623 nova_compute[226235]: 2026-01-31 08:43:49.385 226239 DEBUG nova.compute.manager [req-059d7d38-835b-455f-b581-d8810eb83174 req-4b71a3d8-4d53-46ec-b83c-c05d4a3c0013 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Received event network-vif-unplugged-6b860d9b-53bc-4fbb-ae3a-554edc649838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:49 np0005603623 nova_compute[226235]: 2026-01-31 08:43:49.385 226239 DEBUG oslo_concurrency.lockutils [req-059d7d38-835b-455f-b581-d8810eb83174 req-4b71a3d8-4d53-46ec-b83c-c05d4a3c0013 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:49 np0005603623 nova_compute[226235]: 2026-01-31 08:43:49.386 226239 DEBUG oslo_concurrency.lockutils [req-059d7d38-835b-455f-b581-d8810eb83174 req-4b71a3d8-4d53-46ec-b83c-c05d4a3c0013 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:49 np0005603623 nova_compute[226235]: 2026-01-31 08:43:49.386 226239 DEBUG oslo_concurrency.lockutils [req-059d7d38-835b-455f-b581-d8810eb83174 req-4b71a3d8-4d53-46ec-b83c-c05d4a3c0013 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:49 np0005603623 nova_compute[226235]: 2026-01-31 08:43:49.386 226239 DEBUG nova.compute.manager [req-059d7d38-835b-455f-b581-d8810eb83174 req-4b71a3d8-4d53-46ec-b83c-c05d4a3c0013 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] No waiting events found dispatching network-vif-unplugged-6b860d9b-53bc-4fbb-ae3a-554edc649838 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:43:49 np0005603623 nova_compute[226235]: 2026-01-31 08:43:49.387 226239 DEBUG nova.compute.manager [req-059d7d38-835b-455f-b581-d8810eb83174 req-4b71a3d8-4d53-46ec-b83c-c05d4a3c0013 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Received event network-vif-unplugged-6b860d9b-53bc-4fbb-ae3a-554edc649838 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:43:49 np0005603623 nova_compute[226235]: 2026-01-31 08:43:49.465 226239 INFO nova.compute.manager [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Took 1.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:43:49 np0005603623 nova_compute[226235]: 2026-01-31 08:43:49.465 226239 DEBUG oslo.service.loopingcall [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:43:49 np0005603623 nova_compute[226235]: 2026-01-31 08:43:49.465 226239 DEBUG nova.compute.manager [-] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:43:49 np0005603623 nova_compute[226235]: 2026-01-31 08:43:49.466 226239 DEBUG nova.network.neutron [-] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:43:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:49.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:49.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:50 np0005603623 nova_compute[226235]: 2026-01-31 08:43:50.250 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:51 np0005603623 nova_compute[226235]: 2026-01-31 08:43:51.153 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:51 np0005603623 nova_compute[226235]: 2026-01-31 08:43:51.209 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:51 np0005603623 nova_compute[226235]: 2026-01-31 08:43:51.594 226239 DEBUG nova.network.neutron [-] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:43:51 np0005603623 nova_compute[226235]: 2026-01-31 08:43:51.687 226239 DEBUG nova.compute.manager [req-14106436-d204-4eca-ada8-62f8bb3ad65c req-2d82d2b1-5245-48b4-8389-bdae4ae426e8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Received event network-vif-plugged-6b860d9b-53bc-4fbb-ae3a-554edc649838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:51 np0005603623 nova_compute[226235]: 2026-01-31 08:43:51.687 226239 DEBUG oslo_concurrency.lockutils [req-14106436-d204-4eca-ada8-62f8bb3ad65c req-2d82d2b1-5245-48b4-8389-bdae4ae426e8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:51 np0005603623 nova_compute[226235]: 2026-01-31 08:43:51.688 226239 DEBUG oslo_concurrency.lockutils [req-14106436-d204-4eca-ada8-62f8bb3ad65c req-2d82d2b1-5245-48b4-8389-bdae4ae426e8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:51 np0005603623 nova_compute[226235]: 2026-01-31 08:43:51.688 226239 DEBUG oslo_concurrency.lockutils [req-14106436-d204-4eca-ada8-62f8bb3ad65c req-2d82d2b1-5245-48b4-8389-bdae4ae426e8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:51 np0005603623 nova_compute[226235]: 2026-01-31 08:43:51.689 226239 DEBUG nova.compute.manager [req-14106436-d204-4eca-ada8-62f8bb3ad65c req-2d82d2b1-5245-48b4-8389-bdae4ae426e8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] No waiting events found dispatching network-vif-plugged-6b860d9b-53bc-4fbb-ae3a-554edc649838 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:43:51 np0005603623 nova_compute[226235]: 2026-01-31 08:43:51.689 226239 WARNING nova.compute.manager [req-14106436-d204-4eca-ada8-62f8bb3ad65c req-2d82d2b1-5245-48b4-8389-bdae4ae426e8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Received unexpected event network-vif-plugged-6b860d9b-53bc-4fbb-ae3a-554edc649838 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:43:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:51.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:43:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:51.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:43:52 np0005603623 nova_compute[226235]: 2026-01-31 08:43:52.070 226239 INFO nova.compute.manager [-] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Took 2.60 seconds to deallocate network for instance.#033[00m
Jan 31 03:43:52 np0005603623 nova_compute[226235]: 2026-01-31 08:43:52.260 226239 DEBUG oslo_concurrency.lockutils [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:52 np0005603623 nova_compute[226235]: 2026-01-31 08:43:52.260 226239 DEBUG oslo_concurrency.lockutils [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:52 np0005603623 nova_compute[226235]: 2026-01-31 08:43:52.585 226239 DEBUG oslo_concurrency.processutils [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:43:53 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4182248260' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:43:53 np0005603623 nova_compute[226235]: 2026-01-31 08:43:53.045 226239 DEBUG oslo_concurrency.processutils [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:53 np0005603623 nova_compute[226235]: 2026-01-31 08:43:53.052 226239 DEBUG nova.compute.provider_tree [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:43:53 np0005603623 nova_compute[226235]: 2026-01-31 08:43:53.549 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:53 np0005603623 nova_compute[226235]: 2026-01-31 08:43:53.685 226239 DEBUG nova.scheduler.client.report [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:43:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:53.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:53.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:54 np0005603623 nova_compute[226235]: 2026-01-31 08:43:54.130 226239 DEBUG nova.compute.manager [req-5eb96fb2-2e83-4be0-a03a-1a758846ef2d req-7ff776c0-100c-471a-acde-90dac9fae5ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Received event network-vif-deleted-6b860d9b-53bc-4fbb-ae3a-554edc649838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:54 np0005603623 nova_compute[226235]: 2026-01-31 08:43:54.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:54 np0005603623 nova_compute[226235]: 2026-01-31 08:43:54.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:43:54 np0005603623 nova_compute[226235]: 2026-01-31 08:43:54.277 226239 DEBUG oslo_concurrency.lockutils [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:54 np0005603623 nova_compute[226235]: 2026-01-31 08:43:54.771 226239 INFO nova.scheduler.client.report [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Deleted allocations for instance 87263c34-4ecf-4f8a-b2b5-159e41f58aed#033[00m
Jan 31 03:43:55 np0005603623 nova_compute[226235]: 2026-01-31 08:43:55.286 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:55.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:55.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:56 np0005603623 nova_compute[226235]: 2026-01-31 08:43:56.006 226239 DEBUG oslo_concurrency.lockutils [None req-a1b67fb7-1ba7-40b2-97ff-d4218b5078ec 152becc96c854e3ca68b1b377cae845d 0a5099d95b08434397e0bf691596b2cf - - default default] Lock "87263c34-4ecf-4f8a-b2b5-159e41f58aed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.887s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:57.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:57.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:58 np0005603623 nova_compute[226235]: 2026-01-31 08:43:58.561 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:59 np0005603623 nova_compute[226235]: 2026-01-31 08:43:59.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:59 np0005603623 nova_compute[226235]: 2026-01-31 08:43:59.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:43:59 np0005603623 nova_compute[226235]: 2026-01-31 08:43:59.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:43:59 np0005603623 nova_compute[226235]: 2026-01-31 08:43:59.315 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:43:59 np0005603623 nova_compute[226235]: 2026-01-31 08:43:59.315 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:59 np0005603623 nova_compute[226235]: 2026-01-31 08:43:59.521 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:59 np0005603623 nova_compute[226235]: 2026-01-31 08:43:59.522 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:59 np0005603623 nova_compute[226235]: 2026-01-31 08:43:59.522 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:59 np0005603623 nova_compute[226235]: 2026-01-31 08:43:59.522 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:43:59 np0005603623 nova_compute[226235]: 2026-01-31 08:43:59.522 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:43:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:59.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:43:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:43:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:59.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:43:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/762076791' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:43:59 np0005603623 nova_compute[226235]: 2026-01-31 08:43:59.976 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:44:00 np0005603623 nova_compute[226235]: 2026-01-31 08:44:00.131 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:44:00 np0005603623 nova_compute[226235]: 2026-01-31 08:44:00.132 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4294MB free_disk=20.92180633544922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:44:00 np0005603623 nova_compute[226235]: 2026-01-31 08:44:00.132 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:44:00 np0005603623 nova_compute[226235]: 2026-01-31 08:44:00.132 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:44:00 np0005603623 nova_compute[226235]: 2026-01-31 08:44:00.288 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:01 np0005603623 nova_compute[226235]: 2026-01-31 08:44:01.756 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:44:01 np0005603623 nova_compute[226235]: 2026-01-31 08:44:01.756 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:44:01 np0005603623 nova_compute[226235]: 2026-01-31 08:44:01.821 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:44:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:01.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:01.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:44:02 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2492018723' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:44:02 np0005603623 nova_compute[226235]: 2026-01-31 08:44:02.238 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:44:02 np0005603623 nova_compute[226235]: 2026-01-31 08:44:02.243 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:44:02 np0005603623 nova_compute[226235]: 2026-01-31 08:44:02.461 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:44:03 np0005603623 nova_compute[226235]: 2026-01-31 08:44:03.002 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:44:03 np0005603623 nova_compute[226235]: 2026-01-31 08:44:03.003 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.870s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:44:03 np0005603623 nova_compute[226235]: 2026-01-31 08:44:03.353 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849028.3514056, 87263c34-4ecf-4f8a-b2b5-159e41f58aed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:44:03 np0005603623 nova_compute[226235]: 2026-01-31 08:44:03.353 226239 INFO nova.compute.manager [-] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:44:03 np0005603623 nova_compute[226235]: 2026-01-31 08:44:03.564 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:03 np0005603623 nova_compute[226235]: 2026-01-31 08:44:03.841 226239 DEBUG nova.compute.manager [None req-d7dd5bb9-a2e4-48a7-a88a-11a8a7064de5 - - - - - -] [instance: 87263c34-4ecf-4f8a-b2b5-159e41f58aed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:44:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:44:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:03.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:44:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:03.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:44:04.648 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:44:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:44:04.649 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:44:04 np0005603623 nova_compute[226235]: 2026-01-31 08:44:04.650 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:05 np0005603623 nova_compute[226235]: 2026-01-31 08:44:05.291 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:05.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:05.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:07 np0005603623 nova_compute[226235]: 2026-01-31 08:44:07.841 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:07 np0005603623 nova_compute[226235]: 2026-01-31 08:44:07.842 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:07 np0005603623 nova_compute[226235]: 2026-01-31 08:44:07.842 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:44:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:07.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:44:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:44:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:07.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:44:08 np0005603623 nova_compute[226235]: 2026-01-31 08:44:08.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:08 np0005603623 nova_compute[226235]: 2026-01-31 08:44:08.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:08 np0005603623 nova_compute[226235]: 2026-01-31 08:44:08.606 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:44:09.651 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:44:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:44:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:09.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:44:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:09.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:10 np0005603623 nova_compute[226235]: 2026-01-31 08:44:10.292 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:11.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:11.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:13 np0005603623 nova_compute[226235]: 2026-01-31 08:44:13.608 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:44:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:13.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:44:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:13.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:15 np0005603623 nova_compute[226235]: 2026-01-31 08:44:15.293 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:44:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2171301321' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:44:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:44:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2171301321' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:44:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:15.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:15.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:16 np0005603623 nova_compute[226235]: 2026-01-31 08:44:16.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:17.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:17.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:17 np0005603623 podman[299867]: 2026-01-31 08:44:17.95840546 +0000 UTC m=+0.050264018 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:44:17 np0005603623 podman[299868]: 2026-01-31 08:44:17.976171068 +0000 UTC m=+0.068037556 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 31 03:44:18 np0005603623 nova_compute[226235]: 2026-01-31 08:44:18.623 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:19.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:19.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:20 np0005603623 nova_compute[226235]: 2026-01-31 08:44:20.295 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:21.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:21.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:23 np0005603623 nova_compute[226235]: 2026-01-31 08:44:23.625 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:23.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:44:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:23.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:44:25 np0005603623 nova_compute[226235]: 2026-01-31 08:44:25.301 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:25.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:25.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:44:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1845717769' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:44:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:44:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1845717769' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:44:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:27.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:27.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:28 np0005603623 nova_compute[226235]: 2026-01-31 08:44:28.668 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:44:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:29.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:44:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:44:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:29.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:44:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:44:30.135 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:44:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:44:30.136 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:44:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:44:30.136 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:44:30 np0005603623 nova_compute[226235]: 2026-01-31 08:44:30.303 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:44:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:31.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:44:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:31.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:33 np0005603623 nova_compute[226235]: 2026-01-31 08:44:33.671 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:33.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:44:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:33.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:44:35 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:44:35 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:44:35 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:44:35 np0005603623 nova_compute[226235]: 2026-01-31 08:44:35.305 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:44:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:35.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:44:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:35.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:44:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:37.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:44:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:37.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:38 np0005603623 nova_compute[226235]: 2026-01-31 08:44:38.709 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:39.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:39.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:40 np0005603623 nova_compute[226235]: 2026-01-31 08:44:40.306 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:44:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:41.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:44:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:41.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:43 np0005603623 nova_compute[226235]: 2026-01-31 08:44:43.712 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:44:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:43.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:44:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:44:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:43.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:44:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:44:44.578 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:44:44 np0005603623 nova_compute[226235]: 2026-01-31 08:44:44.578 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:44:44.579 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:44:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:44:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:44:45 np0005603623 nova_compute[226235]: 2026-01-31 08:44:45.308 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:44:45.580 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:44:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:44:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:45.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:44:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:44:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:45.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:44:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:44:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:47.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:44:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:47.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:48 np0005603623 nova_compute[226235]: 2026-01-31 08:44:48.762 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:48 np0005603623 podman[300155]: 2026-01-31 08:44:48.956197889 +0000 UTC m=+0.047021186 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 31 03:44:48 np0005603623 podman[300156]: 2026-01-31 08:44:48.97664962 +0000 UTC m=+0.066114115 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:44:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:49.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:49.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:50 np0005603623 nova_compute[226235]: 2026-01-31 08:44:50.310 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:51.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:51.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:53 np0005603623 nova_compute[226235]: 2026-01-31 08:44:53.765 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:44:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:53.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:44:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:53.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:55 np0005603623 nova_compute[226235]: 2026-01-31 08:44:55.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:55 np0005603623 nova_compute[226235]: 2026-01-31 08:44:55.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:44:55 np0005603623 nova_compute[226235]: 2026-01-31 08:44:55.352 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:55.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:55.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:57.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:57.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:58 np0005603623 nova_compute[226235]: 2026-01-31 08:44:58.769 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:59 np0005603623 nova_compute[226235]: 2026-01-31 08:44:59.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:59 np0005603623 nova_compute[226235]: 2026-01-31 08:44:59.313 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:44:59 np0005603623 nova_compute[226235]: 2026-01-31 08:44:59.313 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:44:59 np0005603623 nova_compute[226235]: 2026-01-31 08:44:59.313 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:44:59 np0005603623 nova_compute[226235]: 2026-01-31 08:44:59.314 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:44:59 np0005603623 nova_compute[226235]: 2026-01-31 08:44:59.314 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:44:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:44:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:59.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:44:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:44:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/999339012' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:44:59 np0005603623 nova_compute[226235]: 2026-01-31 08:44:59.948 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.635s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:44:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:44:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:59.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:00 np0005603623 nova_compute[226235]: 2026-01-31 08:45:00.078 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:45:00 np0005603623 nova_compute[226235]: 2026-01-31 08:45:00.079 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4325MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:45:00 np0005603623 nova_compute[226235]: 2026-01-31 08:45:00.079 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:45:00 np0005603623 nova_compute[226235]: 2026-01-31 08:45:00.079 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:45:00 np0005603623 nova_compute[226235]: 2026-01-31 08:45:00.354 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:01 np0005603623 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 31 03:45:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:01.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:45:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:01.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:45:03 np0005603623 nova_compute[226235]: 2026-01-31 08:45:03.102 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:45:03 np0005603623 nova_compute[226235]: 2026-01-31 08:45:03.102 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:45:03 np0005603623 nova_compute[226235]: 2026-01-31 08:45:03.772 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:45:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:03.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:45:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:04.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:04 np0005603623 nova_compute[226235]: 2026-01-31 08:45:04.683 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:45:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:45:05 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/365553074' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:45:05 np0005603623 nova_compute[226235]: 2026-01-31 08:45:05.157 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:45:05 np0005603623 nova_compute[226235]: 2026-01-31 08:45:05.165 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:45:05 np0005603623 nova_compute[226235]: 2026-01-31 08:45:05.355 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:05 np0005603623 nova_compute[226235]: 2026-01-31 08:45:05.472 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:45:05 np0005603623 nova_compute[226235]: 2026-01-31 08:45:05.473 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:45:05 np0005603623 nova_compute[226235]: 2026-01-31 08:45:05.474 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.394s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:45:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:05.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:06.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:07 np0005603623 nova_compute[226235]: 2026-01-31 08:45:07.474 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:07 np0005603623 nova_compute[226235]: 2026-01-31 08:45:07.475 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:07 np0005603623 nova_compute[226235]: 2026-01-31 08:45:07.475 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:45:07 np0005603623 nova_compute[226235]: 2026-01-31 08:45:07.475 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:45:07 np0005603623 nova_compute[226235]: 2026-01-31 08:45:07.531 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:45:07 np0005603623 nova_compute[226235]: 2026-01-31 08:45:07.532 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:07 np0005603623 nova_compute[226235]: 2026-01-31 08:45:07.532 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:07.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:08.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:08 np0005603623 nova_compute[226235]: 2026-01-31 08:45:08.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:08 np0005603623 nova_compute[226235]: 2026-01-31 08:45:08.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:08 np0005603623 nova_compute[226235]: 2026-01-31 08:45:08.812 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:09.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:10.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:10 np0005603623 nova_compute[226235]: 2026-01-31 08:45:10.356 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:11.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:12.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:13 np0005603623 nova_compute[226235]: 2026-01-31 08:45:13.815 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:13.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:14.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:15 np0005603623 nova_compute[226235]: 2026-01-31 08:45:15.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:15 np0005603623 nova_compute[226235]: 2026-01-31 08:45:15.357 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:15.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:16.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:16 np0005603623 nova_compute[226235]: 2026-01-31 08:45:16.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:17.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:18.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:18 np0005603623 nova_compute[226235]: 2026-01-31 08:45:18.825 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:19.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:19 np0005603623 podman[300358]: 2026-01-31 08:45:19.94244242 +0000 UTC m=+0.042376950 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:45:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:45:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:20.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:45:20 np0005603623 podman[300359]: 2026-01-31 08:45:20.019202857 +0000 UTC m=+0.115921826 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 31 03:45:20 np0005603623 nova_compute[226235]: 2026-01-31 08:45:20.360 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:21.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:22.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:23 np0005603623 nova_compute[226235]: 2026-01-31 08:45:23.828 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:23.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:24.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:25 np0005603623 nova_compute[226235]: 2026-01-31 08:45:25.395 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:25.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:26.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:45:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:27.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:45:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:28.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:28 np0005603623 nova_compute[226235]: 2026-01-31 08:45:28.831 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:29.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:30.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:45:30.137 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:45:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:45:30.137 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:45:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:45:30.137 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:45:30 np0005603623 nova_compute[226235]: 2026-01-31 08:45:30.398 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:31.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:32.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:33 np0005603623 nova_compute[226235]: 2026-01-31 08:45:33.834 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:33.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:34.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:35 np0005603623 nova_compute[226235]: 2026-01-31 08:45:35.399 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:35.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:36.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:45:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:37.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:45:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:38.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:38 np0005603623 nova_compute[226235]: 2026-01-31 08:45:38.837 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:39.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:40.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:40 np0005603623 nova_compute[226235]: 2026-01-31 08:45:40.401 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:41.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:42.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:45:43Z|00661|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:45:43 np0005603623 nova_compute[226235]: 2026-01-31 08:45:43.841 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:43.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:44.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:45 np0005603623 nova_compute[226235]: 2026-01-31 08:45:45.402 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:45:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:45.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:46.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:47.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:48.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:48 np0005603623 nova_compute[226235]: 2026-01-31 08:45:48.844 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:49.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:50.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:50 np0005603623 nova_compute[226235]: 2026-01-31 08:45:50.405 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:50 np0005603623 podman[300746]: 2026-01-31 08:45:50.967035764 +0000 UTC m=+0.064643589 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 03:45:50 np0005603623 podman[300735]: 2026-01-31 08:45:50.967185459 +0000 UTC m=+0.066407984 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 31 03:45:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:51.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:52.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:53 np0005603623 nova_compute[226235]: 2026-01-31 08:45:53.847 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:53.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:45:54.055 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:45:54 np0005603623 nova_compute[226235]: 2026-01-31 08:45:54.056 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:45:54.056 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:45:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:54.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:55 np0005603623 nova_compute[226235]: 2026-01-31 08:45:55.407 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:55.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:56.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:56 np0005603623 nova_compute[226235]: 2026-01-31 08:45:56.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:56 np0005603623 nova_compute[226235]: 2026-01-31 08:45:56.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:45:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:45:57.058 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:45:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:57.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:45:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:58.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:45:58 np0005603623 nova_compute[226235]: 2026-01-31 08:45:58.068 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:45:58 np0005603623 nova_compute[226235]: 2026-01-31 08:45:58.069 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:45:58 np0005603623 nova_compute[226235]: 2026-01-31 08:45:58.683 226239 DEBUG nova.compute.manager [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:45:58 np0005603623 nova_compute[226235]: 2026-01-31 08:45:58.850 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:58 np0005603623 nova_compute[226235]: 2026-01-31 08:45:58.885 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:45:58 np0005603623 nova_compute[226235]: 2026-01-31 08:45:58.886 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:45:58 np0005603623 nova_compute[226235]: 2026-01-31 08:45:58.897 226239 DEBUG nova.virt.hardware [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:45:58 np0005603623 nova_compute[226235]: 2026-01-31 08:45:58.897 226239 INFO nova.compute.claims [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:45:59 np0005603623 nova_compute[226235]: 2026-01-31 08:45:59.092 226239 DEBUG oslo_concurrency.processutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:45:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:45:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1089867997' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:45:59 np0005603623 nova_compute[226235]: 2026-01-31 08:45:59.492 226239 DEBUG oslo_concurrency.processutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:45:59 np0005603623 nova_compute[226235]: 2026-01-31 08:45:59.497 226239 DEBUG nova.compute.provider_tree [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:45:59 np0005603623 nova_compute[226235]: 2026-01-31 08:45:59.663 226239 DEBUG nova.scheduler.client.report [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:45:59 np0005603623 nova_compute[226235]: 2026-01-31 08:45:59.876 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.990s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:45:59 np0005603623 nova_compute[226235]: 2026-01-31 08:45:59.877 226239 DEBUG nova.compute.manager [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:45:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:45:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:59.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:00.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.181 226239 DEBUG nova.compute.manager [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.181 226239 DEBUG nova.network.neutron [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.371 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.372 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.372 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.372 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.372 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.408 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.531 226239 INFO nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.583 226239 DEBUG nova.policy [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a498364761ef428b99cac3f92e603385', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8397e0fed04b4dabb57148d0924de2dc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:46:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:46:00 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4210979433' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.810 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.844 226239 DEBUG nova.compute.manager [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.935 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.936 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4295MB free_disk=20.92184066772461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.937 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:00 np0005603623 nova_compute[226235]: 2026-01-31 08:46:00.937 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.242 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0edbf2b9-b76f-446b-85fa-09a4dcb37976 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.242 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.243 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.307 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.342 226239 INFO nova.virt.block_device [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Booting with volume bf35e6ca-068a-4538-b11a-fe35ddc37a44 at /dev/vda#033[00m
Jan 31 03:46:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:46:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1695806019' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.703 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.395s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.708 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.720 226239 DEBUG os_brick.utils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.722 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.732 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.732 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[146a0f6e-78b7-4b30-be96-624e3745fd81]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.734 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.740 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.740 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc38817-d3ec-487b-8352-ae34545b117d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.742 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.749 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.749 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[55926396-1983-4fd7-84c0-1b6aefd1d1d6]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.750 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[3b2e3dbd-df1f-44e1-b1f5-b8d06daf81b0]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.751 226239 DEBUG oslo_concurrency.processutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.771 226239 DEBUG oslo_concurrency.processutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "nvme version" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.773 226239 DEBUG os_brick.initiator.connectors.lightos [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.774 226239 DEBUG os_brick.initiator.connectors.lightos [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.774 226239 DEBUG os_brick.initiator.connectors.lightos [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.774 226239 DEBUG os_brick.utils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] <== get_connector_properties: return (52ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.774 226239 DEBUG nova.virt.block_device [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Updating existing volume attachment record: f7eacdf7-1634-4b3b-b8a3-3de979ab84e4 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:46:01 np0005603623 nova_compute[226235]: 2026-01-31 08:46:01.811 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:46:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:46:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:01.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:46:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:46:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:02.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:46:02 np0005603623 nova_compute[226235]: 2026-01-31 08:46:02.149 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:46:02 np0005603623 nova_compute[226235]: 2026-01-31 08:46:02.149 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:03 np0005603623 nova_compute[226235]: 2026-01-31 08:46:03.133 226239 DEBUG nova.network.neutron [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Successfully created port: e6486275-22a6-4ee0-854f-fde4ef96bd8f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:46:03 np0005603623 nova_compute[226235]: 2026-01-31 08:46:03.853 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:03.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:04.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.150 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.151 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.407 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.408 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.410 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.831 226239 DEBUG nova.compute.manager [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.832 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.833 226239 INFO nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Creating image(s)#033[00m
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.833 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.834 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Ensure instance console log exists: /var/lib/nova/instances/0edbf2b9-b76f-446b-85fa-09a4dcb37976/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.834 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.834 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:05 np0005603623 nova_compute[226235]: 2026-01-31 08:46:05.835 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:05.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:06.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:06 np0005603623 nova_compute[226235]: 2026-01-31 08:46:06.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #136. Immutable memtables: 0.
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.687024) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 85] Flushing memtable with next log file: 136
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849166687084, "job": 85, "event": "flush_started", "num_memtables": 1, "num_entries": 2370, "num_deletes": 252, "total_data_size": 5697481, "memory_usage": 5761072, "flush_reason": "Manual Compaction"}
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 85] Level-0 flush table #137: started
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849166744233, "cf_name": "default", "job": 85, "event": "table_file_creation", "file_number": 137, "file_size": 3726365, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 66553, "largest_seqno": 68918, "table_properties": {"data_size": 3716782, "index_size": 6012, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19999, "raw_average_key_size": 20, "raw_value_size": 3697597, "raw_average_value_size": 3792, "num_data_blocks": 262, "num_entries": 975, "num_filter_entries": 975, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848956, "oldest_key_time": 1769848956, "file_creation_time": 1769849166, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 137, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 85] Flush lasted 57299 microseconds, and 5503 cpu microseconds.
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.744317) [db/flush_job.cc:967] [default] [JOB 85] Level-0 flush table #137: 3726365 bytes OK
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.744340) [db/memtable_list.cc:519] [default] Level-0 commit table #137 started
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.746043) [db/memtable_list.cc:722] [default] Level-0 commit table #137: memtable #1 done
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.746057) EVENT_LOG_v1 {"time_micros": 1769849166746052, "job": 85, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.746073) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 85] Try to delete WAL files size 5687059, prev total WAL file size 5687059, number of live WAL files 2.
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000133.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.746967) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035373733' seq:72057594037927935, type:22 .. '7061786F730036303235' seq:0, type:0; will stop at (end)
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 86] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 85 Base level 0, inputs: [137(3639KB)], [135(11MB)]
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849166747049, "job": 86, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [137], "files_L6": [135], "score": -1, "input_data_size": 15343222, "oldest_snapshot_seqno": -1}
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 86] Generated table #138: 9109 keys, 13361090 bytes, temperature: kUnknown
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849166906011, "cf_name": "default", "job": 86, "event": "table_file_creation", "file_number": 138, "file_size": 13361090, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13300049, "index_size": 37184, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22789, "raw_key_size": 238846, "raw_average_key_size": 26, "raw_value_size": 13138141, "raw_average_value_size": 1442, "num_data_blocks": 1432, "num_entries": 9109, "num_filter_entries": 9109, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769849166, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 138, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.906429) [db/compaction/compaction_job.cc:1663] [default] [JOB 86] Compacted 1@0 + 1@6 files to L6 => 13361090 bytes
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.908654) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 96.4 rd, 84.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 11.1 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(7.7) write-amplify(3.6) OK, records in: 9634, records dropped: 525 output_compression: NoCompression
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.908732) EVENT_LOG_v1 {"time_micros": 1769849166908678, "job": 86, "event": "compaction_finished", "compaction_time_micros": 159135, "compaction_time_cpu_micros": 23803, "output_level": 6, "num_output_files": 1, "total_output_size": 13361090, "num_input_records": 9634, "num_output_records": 9109, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000137.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849166910037, "job": 86, "event": "table_file_deletion", "file_number": 137}
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000135.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849166912050, "job": 86, "event": "table_file_deletion", "file_number": 135}
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.746888) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.912159) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.912164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.912166) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.912168) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:06 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:06.912170) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:07 np0005603623 nova_compute[226235]: 2026-01-31 08:46:07.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:46:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:07.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:46:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:08.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:08 np0005603623 nova_compute[226235]: 2026-01-31 08:46:08.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:08 np0005603623 nova_compute[226235]: 2026-01-31 08:46:08.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:08 np0005603623 nova_compute[226235]: 2026-01-31 08:46:08.667 226239 DEBUG nova.network.neutron [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Successfully updated port: e6486275-22a6-4ee0-854f-fde4ef96bd8f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:46:08 np0005603623 nova_compute[226235]: 2026-01-31 08:46:08.857 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:08 np0005603623 nova_compute[226235]: 2026-01-31 08:46:08.895 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:46:08 np0005603623 nova_compute[226235]: 2026-01-31 08:46:08.896 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquired lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:46:08 np0005603623 nova_compute[226235]: 2026-01-31 08:46:08.896 226239 DEBUG nova.network.neutron [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:46:09 np0005603623 nova_compute[226235]: 2026-01-31 08:46:09.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:09 np0005603623 nova_compute[226235]: 2026-01-31 08:46:09.298 226239 DEBUG nova.compute.manager [req-c69472c7-47b6-419d-b928-5aa84bac3e5e req-f4936a76-2c08-4519-ae44-2270305a337a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Received event network-changed-e6486275-22a6-4ee0-854f-fde4ef96bd8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:46:09 np0005603623 nova_compute[226235]: 2026-01-31 08:46:09.298 226239 DEBUG nova.compute.manager [req-c69472c7-47b6-419d-b928-5aa84bac3e5e req-f4936a76-2c08-4519-ae44-2270305a337a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Refreshing instance network info cache due to event network-changed-e6486275-22a6-4ee0-854f-fde4ef96bd8f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:46:09 np0005603623 nova_compute[226235]: 2026-01-31 08:46:09.298 226239 DEBUG oslo_concurrency.lockutils [req-c69472c7-47b6-419d-b928-5aa84bac3e5e req-f4936a76-2c08-4519-ae44-2270305a337a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:46:09 np0005603623 nova_compute[226235]: 2026-01-31 08:46:09.353 226239 DEBUG nova.network.neutron [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:46:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:46:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:09.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:46:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:10.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:10 np0005603623 nova_compute[226235]: 2026-01-31 08:46:10.412 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.417 226239 DEBUG nova.network.neutron [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Updating instance_info_cache with network_info: [{"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.485 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Releasing lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.485 226239 DEBUG nova.compute.manager [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Instance network_info: |[{"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.485 226239 DEBUG oslo_concurrency.lockutils [req-c69472c7-47b6-419d-b928-5aa84bac3e5e req-f4936a76-2c08-4519-ae44-2270305a337a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.486 226239 DEBUG nova.network.neutron [req-c69472c7-47b6-419d-b928-5aa84bac3e5e req-f4936a76-2c08-4519-ae44-2270305a337a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Refreshing network info cache for port e6486275-22a6-4ee0-854f-fde4ef96bd8f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.489 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Start _get_guest_xml network_info=[{"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': 'f7eacdf7-1634-4b3b-b8a3-3de979ab84e4', 'delete_on_termination': False, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-bf35e6ca-068a-4538-b11a-fe35ddc37a44', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'bf35e6ca-068a-4538-b11a-fe35ddc37a44', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '0edbf2b9-b76f-446b-85fa-09a4dcb37976', 'attached_at': '', 'detached_at': '', 'volume_id': 'bf35e6ca-068a-4538-b11a-fe35ddc37a44', 'serial': 'bf35e6ca-068a-4538-b11a-fe35ddc37a44', 'multiattach': True}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.493 226239 WARNING nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.499 226239 DEBUG nova.virt.libvirt.host [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.500 226239 DEBUG nova.virt.libvirt.host [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.505 226239 DEBUG nova.virt.libvirt.host [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.506 226239 DEBUG nova.virt.libvirt.host [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.507 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.507 226239 DEBUG nova.virt.hardware [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.508 226239 DEBUG nova.virt.hardware [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.508 226239 DEBUG nova.virt.hardware [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.508 226239 DEBUG nova.virt.hardware [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.509 226239 DEBUG nova.virt.hardware [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.509 226239 DEBUG nova.virt.hardware [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.509 226239 DEBUG nova.virt.hardware [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.509 226239 DEBUG nova.virt.hardware [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.510 226239 DEBUG nova.virt.hardware [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.510 226239 DEBUG nova.virt.hardware [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.510 226239 DEBUG nova.virt.hardware [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.531 226239 DEBUG nova.storage.rbd_utils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 0edbf2b9-b76f-446b-85fa-09a4dcb37976_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.535 226239 DEBUG oslo_concurrency.processutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:46:11 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4122943622' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.943 226239 DEBUG oslo_concurrency.processutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.992 226239 DEBUG nova.virt.libvirt.vif [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:45:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeMultiAttachTest-server-356846984',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumemultiattachtest-server-356846984',id=156,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-e0b5r0ay',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:46:01Z,user_data=None,user_id='a498364761ef428b99cac3f92e603385',uuid=0edbf2b9-b76f-446b-85fa-09a4dcb37976,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.992 226239 DEBUG nova.network.os_vif_util [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.993 226239 DEBUG nova.network.os_vif_util [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:6e:68,bridge_name='br-int',has_traffic_filtering=True,id=e6486275-22a6-4ee0-854f-fde4ef96bd8f,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6486275-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:46:11 np0005603623 nova_compute[226235]: 2026-01-31 08:46:11.994 226239 DEBUG nova.objects.instance [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'pci_devices' on Instance uuid 0edbf2b9-b76f-446b-85fa-09a4dcb37976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:46:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:46:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:11.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.056 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  <uuid>0edbf2b9-b76f-446b-85fa-09a4dcb37976</uuid>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  <name>instance-0000009c</name>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <nova:name>tempest-AttachVolumeMultiAttachTest-server-356846984</nova:name>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:46:11</nova:creationTime>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <nova:user uuid="a498364761ef428b99cac3f92e603385">tempest-AttachVolumeMultiAttachTest-1931311941-project-member</nova:user>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <nova:project uuid="8397e0fed04b4dabb57148d0924de2dc">tempest-AttachVolumeMultiAttachTest-1931311941</nova:project>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <nova:port uuid="e6486275-22a6-4ee0-854f-fde4ef96bd8f">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <entry name="serial">0edbf2b9-b76f-446b-85fa-09a4dcb37976</entry>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <entry name="uuid">0edbf2b9-b76f-446b-85fa-09a4dcb37976</entry>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/0edbf2b9-b76f-446b-85fa-09a4dcb37976_disk.config">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-bf35e6ca-068a-4538-b11a-fe35ddc37a44">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <serial>bf35e6ca-068a-4538-b11a-fe35ddc37a44</serial>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <shareable/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:84:6e:68"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <target dev="tape6486275-22"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/0edbf2b9-b76f-446b-85fa-09a4dcb37976/console.log" append="off"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:46:12 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:46:12 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:46:12 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:46:12 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.057 226239 DEBUG nova.compute.manager [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Preparing to wait for external event network-vif-plugged-e6486275-22a6-4ee0-854f-fde4ef96bd8f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.057 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.058 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.058 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.059 226239 DEBUG nova.virt.libvirt.vif [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:45:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeMultiAttachTest-server-356846984',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumemultiattachtest-server-356846984',id=156,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-e0b5r0ay',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:46:01Z,user_data=None,user_id='a498364761ef428b99cac3f92e603385',uuid=0edbf2b9-b76f-446b-85fa-09a4dcb37976,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.060 226239 DEBUG nova.network.os_vif_util [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.061 226239 DEBUG nova.network.os_vif_util [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:6e:68,bridge_name='br-int',has_traffic_filtering=True,id=e6486275-22a6-4ee0-854f-fde4ef96bd8f,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6486275-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.061 226239 DEBUG os_vif [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:6e:68,bridge_name='br-int',has_traffic_filtering=True,id=e6486275-22a6-4ee0-854f-fde4ef96bd8f,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6486275-22') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.062 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.062 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.063 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.066 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.066 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape6486275-22, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.067 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape6486275-22, col_values=(('external_ids', {'iface-id': 'e6486275-22a6-4ee0-854f-fde4ef96bd8f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:6e:68', 'vm-uuid': '0edbf2b9-b76f-446b-85fa-09a4dcb37976'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.068 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:12 np0005603623 NetworkManager[48970]: <info>  [1769849172.0696] manager: (tape6486275-22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.071 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.074 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.075 226239 INFO os_vif [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:6e:68,bridge_name='br-int',has_traffic_filtering=True,id=e6486275-22a6-4ee0-854f-fde4ef96bd8f,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6486275-22')#033[00m
Jan 31 03:46:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:12.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.260 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.260 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.261 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No VIF found with MAC fa:16:3e:84:6e:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.261 226239 INFO nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Using config drive#033[00m
Jan 31 03:46:12 np0005603623 nova_compute[226235]: 2026-01-31 08:46:12.284 226239 DEBUG nova.storage.rbd_utils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 0edbf2b9-b76f-446b-85fa-09a4dcb37976_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:46:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:14.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:14.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:46:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2344771145' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:46:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:46:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2344771145' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:46:15 np0005603623 nova_compute[226235]: 2026-01-31 08:46:15.366 226239 INFO nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Creating config drive at /var/lib/nova/instances/0edbf2b9-b76f-446b-85fa-09a4dcb37976/disk.config#033[00m
Jan 31 03:46:15 np0005603623 nova_compute[226235]: 2026-01-31 08:46:15.371 226239 DEBUG oslo_concurrency.processutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0edbf2b9-b76f-446b-85fa-09a4dcb37976/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpp_gelva5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:15 np0005603623 nova_compute[226235]: 2026-01-31 08:46:15.413 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:15 np0005603623 nova_compute[226235]: 2026-01-31 08:46:15.491 226239 DEBUG oslo_concurrency.processutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0edbf2b9-b76f-446b-85fa-09a4dcb37976/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpp_gelva5" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:15 np0005603623 nova_compute[226235]: 2026-01-31 08:46:15.517 226239 DEBUG nova.storage.rbd_utils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 0edbf2b9-b76f-446b-85fa-09a4dcb37976_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:46:15 np0005603623 nova_compute[226235]: 2026-01-31 08:46:15.520 226239 DEBUG oslo_concurrency.processutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0edbf2b9-b76f-446b-85fa-09a4dcb37976/disk.config 0edbf2b9-b76f-446b-85fa-09a4dcb37976_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:15 np0005603623 nova_compute[226235]: 2026-01-31 08:46:15.726 226239 DEBUG oslo_concurrency.processutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0edbf2b9-b76f-446b-85fa-09a4dcb37976/disk.config 0edbf2b9-b76f-446b-85fa-09a4dcb37976_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:15 np0005603623 nova_compute[226235]: 2026-01-31 08:46:15.727 226239 INFO nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Deleting local config drive /var/lib/nova/instances/0edbf2b9-b76f-446b-85fa-09a4dcb37976/disk.config because it was imported into RBD.#033[00m
Jan 31 03:46:15 np0005603623 kernel: tape6486275-22: entered promiscuous mode
Jan 31 03:46:15 np0005603623 NetworkManager[48970]: <info>  [1769849175.7648] manager: (tape6486275-22): new Tun device (/org/freedesktop/NetworkManager/Devices/314)
Jan 31 03:46:15 np0005603623 nova_compute[226235]: 2026-01-31 08:46:15.765 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:15 np0005603623 ovn_controller[133449]: 2026-01-31T08:46:15Z|00662|binding|INFO|Claiming lport e6486275-22a6-4ee0-854f-fde4ef96bd8f for this chassis.
Jan 31 03:46:15 np0005603623 ovn_controller[133449]: 2026-01-31T08:46:15Z|00663|binding|INFO|e6486275-22a6-4ee0-854f-fde4ef96bd8f: Claiming fa:16:3e:84:6e:68 10.100.0.4
Jan 31 03:46:15 np0005603623 nova_compute[226235]: 2026-01-31 08:46:15.772 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:15 np0005603623 nova_compute[226235]: 2026-01-31 08:46:15.789 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:15 np0005603623 ovn_controller[133449]: 2026-01-31T08:46:15Z|00664|binding|INFO|Setting lport e6486275-22a6-4ee0-854f-fde4ef96bd8f ovn-installed in OVS
Jan 31 03:46:15 np0005603623 systemd-udevd[301113]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:46:15 np0005603623 nova_compute[226235]: 2026-01-31 08:46:15.793 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:15 np0005603623 systemd-machined[194379]: New machine qemu-75-instance-0000009c.
Jan 31 03:46:15 np0005603623 NetworkManager[48970]: <info>  [1769849175.8017] device (tape6486275-22): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:46:15 np0005603623 NetworkManager[48970]: <info>  [1769849175.8022] device (tape6486275-22): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:46:15 np0005603623 systemd[1]: Started Virtual Machine qemu-75-instance-0000009c.
Jan 31 03:46:15 np0005603623 ovn_controller[133449]: 2026-01-31T08:46:15Z|00665|binding|INFO|Setting lport e6486275-22a6-4ee0-854f-fde4ef96bd8f up in Southbound
Jan 31 03:46:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:15.951 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:6e:68 10.100.0.4'], port_security=['fa:16:3e:84:6e:68 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0edbf2b9-b76f-446b-85fa-09a4dcb37976', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8397e0fed04b4dabb57148d0924de2dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5a5f5fc8-9ea2-499a-9817-9f89f2dea440', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbd2578f-ff6e-4dc3-bc49-93cbf023edc5, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=e6486275-22a6-4ee0-854f-fde4ef96bd8f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:46:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:15.953 143258 INFO neutron.agent.ovn.metadata.agent [-] Port e6486275-22a6-4ee0-854f-fde4ef96bd8f in datapath 3afaf607-43a1-4d65-95fc-0a22b5c901d0 bound to our chassis#033[00m
Jan 31 03:46:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:15.954 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3afaf607-43a1-4d65-95fc-0a22b5c901d0#033[00m
Jan 31 03:46:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:15.961 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0afaede4-b74d-4b35-af78-8d2d9875be82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:15.962 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3afaf607-41 in ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:46:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:15.963 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3afaf607-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:46:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:15.963 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dbfa254e-d9a6-4ab0-8cde-07f3bff3f2a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:15.964 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[69fd28fb-28ab-4973-b7b6-03b3efab5f8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:15.971 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[fbf1688e-5887-42f8-b306-699820bf7aed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:15.979 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7f93fbc4-4723-4384-9c7e-5eb63fca3ff1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.002 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c54e2f97-b5ad-4187-bd8b-37958bdb9c3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:16.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:16 np0005603623 NetworkManager[48970]: <info>  [1769849176.0076] manager: (tap3afaf607-40): new Veth device (/org/freedesktop/NetworkManager/Devices/315)
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.008 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2c743d2b-0e2d-4857-9627-3d2266396603]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.027 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[12a866a3-2b6e-4a5a-8070-9903e727d05e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.030 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[77191b28-0668-44b2-8e05-967cb470eeee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:16 np0005603623 NetworkManager[48970]: <info>  [1769849176.0431] device (tap3afaf607-40): carrier: link connected
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.047 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[bf7b8645-150f-4387-ab42-2a149150d936]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.060 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b035c592-787b-4ac0-94d2-981339a6465b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3afaf607-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:84:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 831647, 'reachable_time': 26185, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 301146, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.070 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a3a3da-0e32-4307-bedd-37fc4f885b79]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee0:8444'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 831647, 'tstamp': 831647}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 301147, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.083 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fbf1698a-d689-4314-a4d2-b2657f146b0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3afaf607-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:84:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 831647, 'reachable_time': 26185, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 301155, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:16.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.111 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[218953f7-c8b9-49d9-a3a6-7f54118e85d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.171 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0b049851-1c47-4194-8a9c-529fb35fabcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.173 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3afaf607-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.173 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.173 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3afaf607-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.175 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:16 np0005603623 kernel: tap3afaf607-40: entered promiscuous mode
Jan 31 03:46:16 np0005603623 NetworkManager[48970]: <info>  [1769849176.1763] manager: (tap3afaf607-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.178 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3afaf607-40, col_values=(('external_ids', {'iface-id': '0ed76a0a-650c-4ec7-a4d4-0e745236b047'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.179 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:16 np0005603623 ovn_controller[133449]: 2026-01-31T08:46:16Z|00666|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.180 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.180 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3afaf607-43a1-4d65-95fc-0a22b5c901d0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3afaf607-43a1-4d65-95fc-0a22b5c901d0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.181 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c910a624-dc29-4441-8989-cb1b87338e60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.182 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-3afaf607-43a1-4d65-95fc-0a22b5c901d0
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/3afaf607-43a1-4d65-95fc-0a22b5c901d0.pid.haproxy
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 3afaf607-43a1-4d65-95fc-0a22b5c901d0
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:46:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:16.182 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'env', 'PROCESS_TAG=haproxy-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3afaf607-43a1-4d65-95fc-0a22b5c901d0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.188 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.245 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849176.2450154, 0edbf2b9-b76f-446b-85fa-09a4dcb37976 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.245 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] VM Started (Lifecycle Event)#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.349 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.354 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849176.2451181, 0edbf2b9-b76f-446b-85fa-09a4dcb37976 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.354 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.397 226239 DEBUG nova.network.neutron [req-c69472c7-47b6-419d-b928-5aa84bac3e5e req-f4936a76-2c08-4519-ae44-2270305a337a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Updated VIF entry in instance network info cache for port e6486275-22a6-4ee0-854f-fde4ef96bd8f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.398 226239 DEBUG nova.network.neutron [req-c69472c7-47b6-419d-b928-5aa84bac3e5e req-f4936a76-2c08-4519-ae44-2270305a337a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Updating instance_info_cache with network_info: [{"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:46:16 np0005603623 podman[301222]: 2026-01-31 08:46:16.473098468 +0000 UTC m=+0.037461616 container create 832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 03:46:16 np0005603623 systemd[1]: Started libpod-conmon-832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8.scope.
Jan 31 03:46:16 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:46:16 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57b24ebc73cd1c3e235e82dd9f5c7c4d672133ec160820974673c76e0e5d7e62/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:46:16 np0005603623 podman[301222]: 2026-01-31 08:46:16.542836235 +0000 UTC m=+0.107199413 container init 832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:46:16 np0005603623 podman[301222]: 2026-01-31 08:46:16.54742547 +0000 UTC m=+0.111788618 container start 832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:46:16 np0005603623 podman[301222]: 2026-01-31 08:46:16.452213123 +0000 UTC m=+0.016576271 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:46:16 np0005603623 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[301238]: [NOTICE]   (301242) : New worker (301244) forked
Jan 31 03:46:16 np0005603623 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[301238]: [NOTICE]   (301242) : Loading success.
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.662 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.667 226239 DEBUG oslo_concurrency.lockutils [req-c69472c7-47b6-419d-b928-5aa84bac3e5e req-f4936a76-2c08-4519-ae44-2270305a337a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:46:16 np0005603623 nova_compute[226235]: 2026-01-31 08:46:16.669 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:46:17 np0005603623 nova_compute[226235]: 2026-01-31 08:46:17.069 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:17 np0005603623 nova_compute[226235]: 2026-01-31 08:46:17.690 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:46:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:46:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:18.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:46:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:18.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.278 226239 DEBUG nova.compute.manager [req-e284d3af-c6c3-4b27-9f4e-2b0a293c37c7 req-bff206a5-ffce-4af0-b0fa-e396b0ff63cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Received event network-vif-plugged-e6486275-22a6-4ee0-854f-fde4ef96bd8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.278 226239 DEBUG oslo_concurrency.lockutils [req-e284d3af-c6c3-4b27-9f4e-2b0a293c37c7 req-bff206a5-ffce-4af0-b0fa-e396b0ff63cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.279 226239 DEBUG oslo_concurrency.lockutils [req-e284d3af-c6c3-4b27-9f4e-2b0a293c37c7 req-bff206a5-ffce-4af0-b0fa-e396b0ff63cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.279 226239 DEBUG oslo_concurrency.lockutils [req-e284d3af-c6c3-4b27-9f4e-2b0a293c37c7 req-bff206a5-ffce-4af0-b0fa-e396b0ff63cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.279 226239 DEBUG nova.compute.manager [req-e284d3af-c6c3-4b27-9f4e-2b0a293c37c7 req-bff206a5-ffce-4af0-b0fa-e396b0ff63cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Processing event network-vif-plugged-e6486275-22a6-4ee0-854f-fde4ef96bd8f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.280 226239 DEBUG nova.compute.manager [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.284 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849178.283971, 0edbf2b9-b76f-446b-85fa-09a4dcb37976 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.284 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.287 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.290 226239 INFO nova.virt.libvirt.driver [-] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Instance spawned successfully.#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.291 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.361 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.364 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.371 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.372 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.372 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.373 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.373 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.373 226239 DEBUG nova.virt.libvirt.driver [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.431 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.588 226239 INFO nova.compute.manager [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Took 12.76 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.588 226239 DEBUG nova.compute.manager [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:46:18 np0005603623 nova_compute[226235]: 2026-01-31 08:46:18.780 226239 INFO nova.compute.manager [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Took 20.00 seconds to build instance.#033[00m
Jan 31 03:46:19 np0005603623 nova_compute[226235]: 2026-01-31 08:46:19.182 226239 DEBUG oslo_concurrency.lockutils [None req-b367e1e1-d6ef-49bf-804e-eaf751873798 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:20.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:20.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:20 np0005603623 nova_compute[226235]: 2026-01-31 08:46:20.416 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:20 np0005603623 nova_compute[226235]: 2026-01-31 08:46:20.622 226239 DEBUG nova.compute.manager [req-1c0986be-14da-46de-908e-3c129a50ae07 req-c1650ba6-1c25-46c6-8136-80f3ae7769c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Received event network-vif-plugged-e6486275-22a6-4ee0-854f-fde4ef96bd8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:46:20 np0005603623 nova_compute[226235]: 2026-01-31 08:46:20.623 226239 DEBUG oslo_concurrency.lockutils [req-1c0986be-14da-46de-908e-3c129a50ae07 req-c1650ba6-1c25-46c6-8136-80f3ae7769c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:20 np0005603623 nova_compute[226235]: 2026-01-31 08:46:20.623 226239 DEBUG oslo_concurrency.lockutils [req-1c0986be-14da-46de-908e-3c129a50ae07 req-c1650ba6-1c25-46c6-8136-80f3ae7769c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:20 np0005603623 nova_compute[226235]: 2026-01-31 08:46:20.624 226239 DEBUG oslo_concurrency.lockutils [req-1c0986be-14da-46de-908e-3c129a50ae07 req-c1650ba6-1c25-46c6-8136-80f3ae7769c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:20 np0005603623 nova_compute[226235]: 2026-01-31 08:46:20.624 226239 DEBUG nova.compute.manager [req-1c0986be-14da-46de-908e-3c129a50ae07 req-c1650ba6-1c25-46c6-8136-80f3ae7769c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] No waiting events found dispatching network-vif-plugged-e6486275-22a6-4ee0-854f-fde4ef96bd8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:46:20 np0005603623 nova_compute[226235]: 2026-01-31 08:46:20.625 226239 WARNING nova.compute.manager [req-1c0986be-14da-46de-908e-3c129a50ae07 req-c1650ba6-1c25-46c6-8136-80f3ae7769c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Received unexpected event network-vif-plugged-e6486275-22a6-4ee0-854f-fde4ef96bd8f for instance with vm_state active and task_state None.#033[00m
Jan 31 03:46:21 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #139. Immutable memtables: 0.
Jan 31 03:46:21 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:21.872719) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:46:21 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 87] Flushing memtable with next log file: 139
Jan 31 03:46:21 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849181872805, "job": 87, "event": "flush_started", "num_memtables": 1, "num_entries": 401, "num_deletes": 251, "total_data_size": 375834, "memory_usage": 383520, "flush_reason": "Manual Compaction"}
Jan 31 03:46:21 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 87] Level-0 flush table #140: started
Jan 31 03:46:21 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849181889000, "cf_name": "default", "job": 87, "event": "table_file_creation", "file_number": 140, "file_size": 231490, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 68923, "largest_seqno": 69319, "table_properties": {"data_size": 229197, "index_size": 392, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6275, "raw_average_key_size": 20, "raw_value_size": 224646, "raw_average_value_size": 729, "num_data_blocks": 18, "num_entries": 308, "num_filter_entries": 308, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849167, "oldest_key_time": 1769849167, "file_creation_time": 1769849181, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 140, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:46:21 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 87] Flush lasted 16335 microseconds, and 1739 cpu microseconds.
Jan 31 03:46:21 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:46:21 np0005603623 podman[301255]: 2026-01-31 08:46:21.949276289 +0000 UTC m=+0.047259594 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 31 03:46:21 np0005603623 podman[301256]: 2026-01-31 08:46:21.972694814 +0000 UTC m=+0.068924944 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 03:46:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:22.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:21.889068) [db/flush_job.cc:967] [default] [JOB 87] Level-0 flush table #140: 231490 bytes OK
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:21.889087) [db/memtable_list.cc:519] [default] Level-0 commit table #140 started
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.062594) [db/memtable_list.cc:722] [default] Level-0 commit table #140: memtable #1 done
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.062642) EVENT_LOG_v1 {"time_micros": 1769849182062631, "job": 87, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.062667) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 87] Try to delete WAL files size 373244, prev total WAL file size 373244, number of live WAL files 2.
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000136.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.063239) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032323533' seq:72057594037927935, type:22 .. '6D6772737461740032353035' seq:0, type:0; will stop at (end)
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 88] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 87 Base level 0, inputs: [140(226KB)], [138(12MB)]
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849182063332, "job": 88, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [140], "files_L6": [138], "score": -1, "input_data_size": 13592580, "oldest_snapshot_seqno": -1}
Jan 31 03:46:22 np0005603623 nova_compute[226235]: 2026-01-31 08:46:22.070 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:22.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 88] Generated table #141: 8908 keys, 9756082 bytes, temperature: kUnknown
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849182613348, "cf_name": "default", "job": 88, "event": "table_file_creation", "file_number": 141, "file_size": 9756082, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9701156, "index_size": 31581, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22277, "raw_key_size": 234881, "raw_average_key_size": 26, "raw_value_size": 9547491, "raw_average_value_size": 1071, "num_data_blocks": 1198, "num_entries": 8908, "num_filter_entries": 8908, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769849182, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 141, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.613698) [db/compaction/compaction_job.cc:1663] [default] [JOB 88] Compacted 1@0 + 1@6 files to L6 => 9756082 bytes
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.618534) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 24.7 rd, 17.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 12.7 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(100.9) write-amplify(42.1) OK, records in: 9417, records dropped: 509 output_compression: NoCompression
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.618585) EVENT_LOG_v1 {"time_micros": 1769849182618560, "job": 88, "event": "compaction_finished", "compaction_time_micros": 550165, "compaction_time_cpu_micros": 22674, "output_level": 6, "num_output_files": 1, "total_output_size": 9756082, "num_input_records": 9417, "num_output_records": 8908, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000140.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849182618843, "job": 88, "event": "table_file_deletion", "file_number": 140}
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000138.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849182620444, "job": 88, "event": "table_file_deletion", "file_number": 138}
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.063112) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.620537) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.620542) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.620544) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.620547) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:46:22.620548) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:46:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:24.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:46:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:46:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:24.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:46:25 np0005603623 nova_compute[226235]: 2026-01-31 08:46:25.418 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:26.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:46:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:26.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:46:27 np0005603623 nova_compute[226235]: 2026-01-31 08:46:27.071 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:28.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:28.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:28 np0005603623 nova_compute[226235]: 2026-01-31 08:46:28.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:28 np0005603623 nova_compute[226235]: 2026-01-31 08:46:28.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:46:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:30.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:30.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:30.137 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:30.138 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:30.138 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:30 np0005603623 nova_compute[226235]: 2026-01-31 08:46:30.461 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:46:30Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:84:6e:68 10.100.0.4
Jan 31 03:46:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:46:30Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:84:6e:68 10.100.0.4
Jan 31 03:46:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:32.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:32 np0005603623 nova_compute[226235]: 2026-01-31 08:46:32.073 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:32.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:34.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:34.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:35 np0005603623 nova_compute[226235]: 2026-01-31 08:46:35.463 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:36.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:36.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:37 np0005603623 nova_compute[226235]: 2026-01-31 08:46:37.074 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:46:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:38.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:46:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:38.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:38 np0005603623 nova_compute[226235]: 2026-01-31 08:46:38.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:40.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:40.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:40 np0005603623 nova_compute[226235]: 2026-01-31 08:46:40.466 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:42.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:42 np0005603623 nova_compute[226235]: 2026-01-31 08:46:42.076 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:42.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:42 np0005603623 nova_compute[226235]: 2026-01-31 08:46:42.224 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:42 np0005603623 nova_compute[226235]: 2026-01-31 08:46:42.224 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:46:42 np0005603623 nova_compute[226235]: 2026-01-31 08:46:42.333 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:46:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:46:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:44.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:46:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:46:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:44.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:46:45 np0005603623 nova_compute[226235]: 2026-01-31 08:46:45.467 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:46.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:46.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:47 np0005603623 nova_compute[226235]: 2026-01-31 08:46:47.078 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:48.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:48.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:50.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:50.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:50 np0005603623 nova_compute[226235]: 2026-01-31 08:46:50.468 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:46:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:46:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:46:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:46:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:52.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:46:52 np0005603623 nova_compute[226235]: 2026-01-31 08:46:52.079 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:46:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:52.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:46:52 np0005603623 podman[301499]: 2026-01-31 08:46:52.960338416 +0000 UTC m=+0.047797050 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:46:52 np0005603623 podman[301500]: 2026-01-31 08:46:52.990472281 +0000 UTC m=+0.077930985 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:46:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:54.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:54.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:55 np0005603623 nova_compute[226235]: 2026-01-31 08:46:55.469 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:55.735 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:46:55 np0005603623 nova_compute[226235]: 2026-01-31 08:46:55.736 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:55.736 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:46:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:56.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:56.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:57 np0005603623 nova_compute[226235]: 2026-01-31 08:46:57.083 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:57 np0005603623 nova_compute[226235]: 2026-01-31 08:46:57.264 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:57 np0005603623 nova_compute[226235]: 2026-01-31 08:46:57.264 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:46:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:58.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:46:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:58.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:46:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:46:58 np0005603623 nova_compute[226235]: 2026-01-31 08:46:58.462 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:58 np0005603623 NetworkManager[48970]: <info>  [1769849218.4638] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Jan 31 03:46:58 np0005603623 NetworkManager[48970]: <info>  [1769849218.4647] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/318)
Jan 31 03:46:58 np0005603623 nova_compute[226235]: 2026-01-31 08:46:58.504 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:58 np0005603623 ovn_controller[133449]: 2026-01-31T08:46:58Z|00667|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:46:58 np0005603623 nova_compute[226235]: 2026-01-31 08:46:58.521 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:46:59.740 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:00.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:00 np0005603623 nova_compute[226235]: 2026-01-31 08:47:00.088 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:00.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:00 np0005603623 nova_compute[226235]: 2026-01-31 08:47:00.471 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:01 np0005603623 nova_compute[226235]: 2026-01-31 08:47:01.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:01 np0005603623 nova_compute[226235]: 2026-01-31 08:47:01.213 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:01 np0005603623 nova_compute[226235]: 2026-01-31 08:47:01.214 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:01 np0005603623 nova_compute[226235]: 2026-01-31 08:47:01.214 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:01 np0005603623 nova_compute[226235]: 2026-01-31 08:47:01.214 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:47:01 np0005603623 nova_compute[226235]: 2026-01-31 08:47:01.214 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:47:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3577242807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:47:01 np0005603623 nova_compute[226235]: 2026-01-31 08:47:01.648 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:01 np0005603623 nova_compute[226235]: 2026-01-31 08:47:01.987 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:47:01 np0005603623 nova_compute[226235]: 2026-01-31 08:47:01.988 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:47:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:02.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:02 np0005603623 nova_compute[226235]: 2026-01-31 08:47:02.084 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:02 np0005603623 nova_compute[226235]: 2026-01-31 08:47:02.108 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:47:02 np0005603623 nova_compute[226235]: 2026-01-31 08:47:02.109 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4137MB free_disk=20.94631576538086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:47:02 np0005603623 nova_compute[226235]: 2026-01-31 08:47:02.109 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:02 np0005603623 nova_compute[226235]: 2026-01-31 08:47:02.110 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:02.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:02 np0005603623 nova_compute[226235]: 2026-01-31 08:47:02.930 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0edbf2b9-b76f-446b-85fa-09a4dcb37976 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:47:02 np0005603623 nova_compute[226235]: 2026-01-31 08:47:02.931 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:47:02 np0005603623 nova_compute[226235]: 2026-01-31 08:47:02.931 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:47:02 np0005603623 nova_compute[226235]: 2026-01-31 08:47:02.952 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:47:02 np0005603623 nova_compute[226235]: 2026-01-31 08:47:02.978 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:47:02 np0005603623 nova_compute[226235]: 2026-01-31 08:47:02.979 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:47:03 np0005603623 nova_compute[226235]: 2026-01-31 08:47:02.999 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:47:03 np0005603623 nova_compute[226235]: 2026-01-31 08:47:03.030 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:47:03 np0005603623 nova_compute[226235]: 2026-01-31 08:47:03.077 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:47:03 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/376560586' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:47:03 np0005603623 nova_compute[226235]: 2026-01-31 08:47:03.489 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:03 np0005603623 nova_compute[226235]: 2026-01-31 08:47:03.496 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:47:03 np0005603623 nova_compute[226235]: 2026-01-31 08:47:03.740 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:47:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:04.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:04.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:04 np0005603623 nova_compute[226235]: 2026-01-31 08:47:04.215 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:47:04 np0005603623 nova_compute[226235]: 2026-01-31 08:47:04.215 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:05 np0005603623 nova_compute[226235]: 2026-01-31 08:47:05.513 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:06.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:06.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:07 np0005603623 nova_compute[226235]: 2026-01-31 08:47:07.087 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:07 np0005603623 nova_compute[226235]: 2026-01-31 08:47:07.216 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:07 np0005603623 nova_compute[226235]: 2026-01-31 08:47:07.217 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:47:07 np0005603623 nova_compute[226235]: 2026-01-31 08:47:07.217 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:47:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:08.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:08.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:08 np0005603623 nova_compute[226235]: 2026-01-31 08:47:08.450 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:47:08 np0005603623 nova_compute[226235]: 2026-01-31 08:47:08.450 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:47:08 np0005603623 nova_compute[226235]: 2026-01-31 08:47:08.451 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:47:08 np0005603623 nova_compute[226235]: 2026-01-31 08:47:08.451 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0edbf2b9-b76f-446b-85fa-09a4dcb37976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:09 np0005603623 nova_compute[226235]: 2026-01-31 08:47:09.439 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:09 np0005603623 nova_compute[226235]: 2026-01-31 08:47:09.439 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:09 np0005603623 nova_compute[226235]: 2026-01-31 08:47:09.472 226239 DEBUG nova.compute.manager [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:47:09 np0005603623 nova_compute[226235]: 2026-01-31 08:47:09.648 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:09 np0005603623 nova_compute[226235]: 2026-01-31 08:47:09.648 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:09 np0005603623 nova_compute[226235]: 2026-01-31 08:47:09.668 226239 DEBUG nova.virt.hardware [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:47:09 np0005603623 nova_compute[226235]: 2026-01-31 08:47:09.669 226239 INFO nova.compute.claims [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:47:09 np0005603623 nova_compute[226235]: 2026-01-31 08:47:09.888 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:10.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:10.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:47:10 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2833045658' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.313 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.318 226239 DEBUG nova.compute.provider_tree [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.346 226239 DEBUG nova.scheduler.client.report [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.433 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.434 226239 DEBUG nova.compute.manager [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:47:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.515 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.582 226239 DEBUG nova.compute.manager [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.583 226239 DEBUG nova.network.neutron [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.625 226239 INFO nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.671 226239 DEBUG nova.compute.manager [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.857 226239 DEBUG nova.compute.manager [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.860 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.861 226239 INFO nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Creating image(s)#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.886 226239 DEBUG nova.storage.rbd_utils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.909 226239 DEBUG nova.storage.rbd_utils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.938 226239 DEBUG nova.storage.rbd_utils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:10 np0005603623 nova_compute[226235]: 2026-01-31 08:47:10.943 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:11 np0005603623 nova_compute[226235]: 2026-01-31 08:47:11.002 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:11 np0005603623 nova_compute[226235]: 2026-01-31 08:47:11.003 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:11 np0005603623 nova_compute[226235]: 2026-01-31 08:47:11.004 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:11 np0005603623 nova_compute[226235]: 2026-01-31 08:47:11.004 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:11 np0005603623 nova_compute[226235]: 2026-01-31 08:47:11.029 226239 DEBUG nova.storage.rbd_utils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:11 np0005603623 nova_compute[226235]: 2026-01-31 08:47:11.033 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:11 np0005603623 nova_compute[226235]: 2026-01-31 08:47:11.404 226239 DEBUG nova.policy [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aa7f893021af4a84b03d85b476dadfe0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:47:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:12.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.089 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:12.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.220 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.187s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.282 226239 DEBUG nova.storage.rbd_utils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] resizing rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.675 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Updating instance_info_cache with network_info: [{"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.736 226239 DEBUG nova.objects.instance [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'migration_context' on Instance uuid da4e355d-c6c2-446e-8eb1-d2ca8279e549 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.858 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.859 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Ensure instance console log exists: /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.859 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.860 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.860 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.894 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.894 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.894 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.895 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.895 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:12 np0005603623 nova_compute[226235]: 2026-01-31 08:47:12.895 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:13 np0005603623 nova_compute[226235]: 2026-01-31 08:47:13.828 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:14 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:14Z|00668|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:47:14 np0005603623 nova_compute[226235]: 2026-01-31 08:47:14.044 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:14.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:14.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:15 np0005603623 nova_compute[226235]: 2026-01-31 08:47:15.518 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:15 np0005603623 nova_compute[226235]: 2026-01-31 08:47:15.553 226239 DEBUG nova.network.neutron [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Successfully created port: 5224a5be-3c99-4dab-acc8-a3d0488d9a42 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:47:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:16.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:16.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:17 np0005603623 nova_compute[226235]: 2026-01-31 08:47:17.090 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:47:17 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3691770370' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:47:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:18.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:18 np0005603623 nova_compute[226235]: 2026-01-31 08:47:18.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:18.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:18 np0005603623 nova_compute[226235]: 2026-01-31 08:47:18.340 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:18 np0005603623 nova_compute[226235]: 2026-01-31 08:47:18.944 226239 DEBUG nova.network.neutron [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Successfully updated port: 5224a5be-3c99-4dab-acc8-a3d0488d9a42 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:47:19 np0005603623 nova_compute[226235]: 2026-01-31 08:47:19.133 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:47:19 np0005603623 nova_compute[226235]: 2026-01-31 08:47:19.133 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquired lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:47:19 np0005603623 nova_compute[226235]: 2026-01-31 08:47:19.133 226239 DEBUG nova.network.neutron [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:47:19 np0005603623 nova_compute[226235]: 2026-01-31 08:47:19.436 226239 DEBUG nova.network.neutron [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:47:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:20.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:20.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:20 np0005603623 nova_compute[226235]: 2026-01-31 08:47:20.290 226239 DEBUG nova.compute.manager [req-9949fb60-20c8-4302-b789-e4725a1740f6 req-38bea645-686b-4da2-97b1-e4097c36ce1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received event network-changed-5224a5be-3c99-4dab-acc8-a3d0488d9a42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:47:20 np0005603623 nova_compute[226235]: 2026-01-31 08:47:20.291 226239 DEBUG nova.compute.manager [req-9949fb60-20c8-4302-b789-e4725a1740f6 req-38bea645-686b-4da2-97b1-e4097c36ce1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Refreshing instance network info cache due to event network-changed-5224a5be-3c99-4dab-acc8-a3d0488d9a42. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:47:20 np0005603623 nova_compute[226235]: 2026-01-31 08:47:20.291 226239 DEBUG oslo_concurrency.lockutils [req-9949fb60-20c8-4302-b789-e4725a1740f6 req-38bea645-686b-4da2-97b1-e4097c36ce1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:47:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:20 np0005603623 nova_compute[226235]: 2026-01-31 08:47:20.519 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.900 226239 DEBUG nova.network.neutron [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Updating instance_info_cache with network_info: [{"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.974 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Releasing lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.975 226239 DEBUG nova.compute.manager [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Instance network_info: |[{"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.975 226239 DEBUG oslo_concurrency.lockutils [req-9949fb60-20c8-4302-b789-e4725a1740f6 req-38bea645-686b-4da2-97b1-e4097c36ce1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.975 226239 DEBUG nova.network.neutron [req-9949fb60-20c8-4302-b789-e4725a1740f6 req-38bea645-686b-4da2-97b1-e4097c36ce1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Refreshing network info cache for port 5224a5be-3c99-4dab-acc8-a3d0488d9a42 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.977 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Start _get_guest_xml network_info=[{"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.981 226239 WARNING nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.989 226239 DEBUG nova.virt.libvirt.host [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.990 226239 DEBUG nova.virt.libvirt.host [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.992 226239 DEBUG nova.virt.libvirt.host [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.993 226239 DEBUG nova.virt.libvirt.host [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.994 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.994 226239 DEBUG nova.virt.hardware [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.994 226239 DEBUG nova.virt.hardware [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.995 226239 DEBUG nova.virt.hardware [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.995 226239 DEBUG nova.virt.hardware [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.995 226239 DEBUG nova.virt.hardware [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.995 226239 DEBUG nova.virt.hardware [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.995 226239 DEBUG nova.virt.hardware [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.996 226239 DEBUG nova.virt.hardware [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.996 226239 DEBUG nova.virt.hardware [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.996 226239 DEBUG nova.virt.hardware [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.996 226239 DEBUG nova.virt.hardware [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:47:21 np0005603623 nova_compute[226235]: 2026-01-31 08:47:21.998 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:22.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.091 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:22.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:47:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4135245342' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.409 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.433 226239 DEBUG nova.storage.rbd_utils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.438 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:47:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2439826371' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.869 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.871 226239 DEBUG nova.virt.libvirt.vif [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1869656790',display_name='tempest-ServerRescueNegativeTestJSON-server-1869656790',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1869656790',id=162,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bf1c3d387dbe4191b4d05bdfca5959da',ramdisk_id='',reservation_id='r-l0z9mou0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-81297706',owner_user_name='
tempest-ServerRescueNegativeTestJSON-81297706-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:47:10Z,user_data=None,user_id='aa7f893021af4a84b03d85b476dadfe0',uuid=da4e355d-c6c2-446e-8eb1-d2ca8279e549,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.871 226239 DEBUG nova.network.os_vif_util [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converting VIF {"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.872 226239 DEBUG nova.network.os_vif_util [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:96:10,bridge_name='br-int',has_traffic_filtering=True,id=5224a5be-3c99-4dab-acc8-a3d0488d9a42,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5224a5be-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.873 226239 DEBUG nova.objects.instance [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'pci_devices' on Instance uuid da4e355d-c6c2-446e-8eb1-d2ca8279e549 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.908 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  <uuid>da4e355d-c6c2-446e-8eb1-d2ca8279e549</uuid>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  <name>instance-000000a2</name>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1869656790</nova:name>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:47:21</nova:creationTime>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <nova:user uuid="aa7f893021af4a84b03d85b476dadfe0">tempest-ServerRescueNegativeTestJSON-81297706-project-member</nova:user>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <nova:project uuid="bf1c3d387dbe4191b4d05bdfca5959da">tempest-ServerRescueNegativeTestJSON-81297706</nova:project>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <nova:port uuid="5224a5be-3c99-4dab-acc8-a3d0488d9a42">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <entry name="serial">da4e355d-c6c2-446e-8eb1-d2ca8279e549</entry>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <entry name="uuid">da4e355d-c6c2-446e-8eb1-d2ca8279e549</entry>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.config">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:15:96:10"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <target dev="tap5224a5be-3c"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/console.log" append="off"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:47:22 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:47:22 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:47:22 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:47:22 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.910 226239 DEBUG nova.compute.manager [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Preparing to wait for external event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.911 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.911 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.912 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.912 226239 DEBUG nova.virt.libvirt.vif [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1869656790',display_name='tempest-ServerRescueNegativeTestJSON-server-1869656790',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1869656790',id=162,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bf1c3d387dbe4191b4d05bdfca5959da',ramdisk_id='',reservation_id='r-l0z9mou0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-81297706',owner_u
ser_name='tempest-ServerRescueNegativeTestJSON-81297706-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:47:10Z,user_data=None,user_id='aa7f893021af4a84b03d85b476dadfe0',uuid=da4e355d-c6c2-446e-8eb1-d2ca8279e549,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.913 226239 DEBUG nova.network.os_vif_util [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converting VIF {"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.914 226239 DEBUG nova.network.os_vif_util [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:96:10,bridge_name='br-int',has_traffic_filtering=True,id=5224a5be-3c99-4dab-acc8-a3d0488d9a42,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5224a5be-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.914 226239 DEBUG os_vif [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:96:10,bridge_name='br-int',has_traffic_filtering=True,id=5224a5be-3c99-4dab-acc8-a3d0488d9a42,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5224a5be-3c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.915 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.916 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.916 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.920 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.921 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5224a5be-3c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.922 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5224a5be-3c, col_values=(('external_ids', {'iface-id': '5224a5be-3c99-4dab-acc8-a3d0488d9a42', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:96:10', 'vm-uuid': 'da4e355d-c6c2-446e-8eb1-d2ca8279e549'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.963 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603623 NetworkManager[48970]: <info>  [1769849242.9637] manager: (tap5224a5be-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.967 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.968 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603623 nova_compute[226235]: 2026-01-31 08:47:22.970 226239 INFO os_vif [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:96:10,bridge_name='br-int',has_traffic_filtering=True,id=5224a5be-3c99-4dab-acc8-a3d0488d9a42,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5224a5be-3c')#033[00m
Jan 31 03:47:23 np0005603623 nova_compute[226235]: 2026-01-31 08:47:23.079 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:47:23 np0005603623 nova_compute[226235]: 2026-01-31 08:47:23.080 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:47:23 np0005603623 nova_compute[226235]: 2026-01-31 08:47:23.080 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No VIF found with MAC fa:16:3e:15:96:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:47:23 np0005603623 nova_compute[226235]: 2026-01-31 08:47:23.081 226239 INFO nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Using config drive#033[00m
Jan 31 03:47:23 np0005603623 nova_compute[226235]: 2026-01-31 08:47:23.102 226239 DEBUG nova.storage.rbd_utils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:23 np0005603623 podman[302027]: 2026-01-31 08:47:23.970363671 +0000 UTC m=+0.064139633 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 31 03:47:23 np0005603623 podman[302026]: 2026-01-31 08:47:23.97543096 +0000 UTC m=+0.069818921 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:47:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:24.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:24.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.313 226239 INFO nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Creating config drive at /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/disk.config#033[00m
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.316 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpf4o7ivpp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.438 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpf4o7ivpp" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.462 226239 DEBUG nova.storage.rbd_utils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.465 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/disk.config da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:24 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:24Z|00669|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.546 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.681 226239 DEBUG oslo_concurrency.processutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/disk.config da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.216s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.682 226239 INFO nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Deleting local config drive /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/disk.config because it was imported into RBD.#033[00m
Jan 31 03:47:24 np0005603623 kernel: tap5224a5be-3c: entered promiscuous mode
Jan 31 03:47:24 np0005603623 NetworkManager[48970]: <info>  [1769849244.7191] manager: (tap5224a5be-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/320)
Jan 31 03:47:24 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:24Z|00670|binding|INFO|Claiming lport 5224a5be-3c99-4dab-acc8-a3d0488d9a42 for this chassis.
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.720 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:24 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:24Z|00671|binding|INFO|5224a5be-3c99-4dab-acc8-a3d0488d9a42: Claiming fa:16:3e:15:96:10 10.100.0.4
Jan 31 03:47:24 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:24Z|00672|binding|INFO|Setting lport 5224a5be-3c99-4dab-acc8-a3d0488d9a42 ovn-installed in OVS
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.728 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.730 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.737 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:96:10 10.100.0.4'], port_security=['fa:16:3e:15:96:10 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'da4e355d-c6c2-446e-8eb1-d2ca8279e549', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4937dacf-809a-410a-970f-8b358db49b15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=5224a5be-3c99-4dab-acc8-a3d0488d9a42) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:47:24 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:24Z|00673|binding|INFO|Setting lport 5224a5be-3c99-4dab-acc8-a3d0488d9a42 up in Southbound
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.738 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 5224a5be-3c99-4dab-acc8-a3d0488d9a42 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc bound to our chassis#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.740 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2128154c-0218-4f66-9509-e0db66eba3fc#033[00m
Jan 31 03:47:24 np0005603623 systemd-udevd[302122]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:47:24 np0005603623 systemd-machined[194379]: New machine qemu-76-instance-000000a2.
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.749 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a5ff55-5ad4-468f-8acd-c2bc18a6075d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.749 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2128154c-01 in ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.751 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2128154c-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.751 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a1f8e3-815b-409f-b44f-61814e8a2fc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.752 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2313152e-1e13-4ed9-a9a9-14a231bb0c69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 NetworkManager[48970]: <info>  [1769849244.7548] device (tap5224a5be-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:47:24 np0005603623 NetworkManager[48970]: <info>  [1769849244.7558] device (tap5224a5be-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:47:24 np0005603623 systemd[1]: Started Virtual Machine qemu-76-instance-000000a2.
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.770 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[f9f480b0-1e2d-4fc0-a8c9-2022e51ddf8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.788 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[29f98b03-e9c7-4f63-8105-25bbacb0d9fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.809 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[bac5669c-061d-4416-b6fa-34789cd26965]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.813 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4166d58d-653c-4c9e-bbcf-6ed9268eafe7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 NetworkManager[48970]: <info>  [1769849244.8149] manager: (tap2128154c-00): new Veth device (/org/freedesktop/NetworkManager/Devices/321)
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.840 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e623d3e4-7a0d-42d4-bdee-b73d99c2b9d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.843 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[eb716239-88ab-4f47-a2fc-5b18ecaefe85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 NetworkManager[48970]: <info>  [1769849244.8677] device (tap2128154c-00): carrier: link connected
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.872 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b9b0a0-a344-4f4e-8bdc-cc81767e7449]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.883 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[42735460-a9c5-4412-a029-1320318cb946]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 838529, 'reachable_time': 17084, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302155, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.894 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[aec794c2-eb9b-4406-b4e6-8b4bb091ab54]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:3208'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 838529, 'tstamp': 838529}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 302156, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.907 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[53b770b4-6a6a-45f8-a90e-9c9c98089abb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 838529, 'reachable_time': 17084, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 302157, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.928 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6d506a86-23f3-41a3-98e8-07e6a0a29840]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.974 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5309d481-d447-4399-914b-7345c9b2c7d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.978 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.978 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.979 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2128154c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:24 np0005603623 NetworkManager[48970]: <info>  [1769849244.9818] manager: (tap2128154c-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Jan 31 03:47:24 np0005603623 kernel: tap2128154c-00: entered promiscuous mode
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.984 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2128154c-00, col_values=(('external_ids', {'iface-id': '5976b74a-78ce-46e1-bd2c-76a2a502c8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:24 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:24Z|00674|binding|INFO|Releasing lport 5976b74a-78ce-46e1-bd2c-76a2a502c8f5 from this chassis (sb_readonly=0)
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.985 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.986 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.987 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[365b16b7-2a02-4da9-a347-242778f68671]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.988 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-2128154c-0218-4f66-9509-e0db66eba3fc
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 2128154c-0218-4f66-9509-e0db66eba3fc
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:47:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:24.990 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'env', 'PROCESS_TAG=haproxy-2128154c-0218-4f66-9509-e0db66eba3fc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2128154c-0218-4f66-9509-e0db66eba3fc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:47:24 np0005603623 nova_compute[226235]: 2026-01-31 08:47:24.990 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.155 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849245.154616, da4e355d-c6c2-446e-8eb1-d2ca8279e549 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.155 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] VM Started (Lifecycle Event)#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.205 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.210 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849245.157627, da4e355d-c6c2-446e-8eb1-d2ca8279e549 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.210 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.254 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.260 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:47:25 np0005603623 podman[302231]: 2026-01-31 08:47:25.299329942 +0000 UTC m=+0.042590257 container create 4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.321 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:47:25 np0005603623 systemd[1]: Started libpod-conmon-4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11.scope.
Jan 31 03:47:25 np0005603623 podman[302231]: 2026-01-31 08:47:25.27505297 +0000 UTC m=+0.018313325 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:47:25 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:47:25 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b5ecbd6f1ac18f1acbd306f6554cfd9deae10cd11d24038d02b1bc60da3f314/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:47:25 np0005603623 podman[302231]: 2026-01-31 08:47:25.390179571 +0000 UTC m=+0.133439896 container init 4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:47:25 np0005603623 podman[302231]: 2026-01-31 08:47:25.394026552 +0000 UTC m=+0.137286877 container start 4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:47:25 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302246]: [NOTICE]   (302250) : New worker (302252) forked
Jan 31 03:47:25 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302246]: [NOTICE]   (302250) : Loading success.
Jan 31 03:47:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.520 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.755 226239 DEBUG nova.compute.manager [req-f5be37f7-d86f-4fb3-a280-a1f9669e6efd req-113c7e8c-5bfe-4b22-9acd-56b4b482584d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.756 226239 DEBUG oslo_concurrency.lockutils [req-f5be37f7-d86f-4fb3-a280-a1f9669e6efd req-113c7e8c-5bfe-4b22-9acd-56b4b482584d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.756 226239 DEBUG oslo_concurrency.lockutils [req-f5be37f7-d86f-4fb3-a280-a1f9669e6efd req-113c7e8c-5bfe-4b22-9acd-56b4b482584d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.757 226239 DEBUG oslo_concurrency.lockutils [req-f5be37f7-d86f-4fb3-a280-a1f9669e6efd req-113c7e8c-5bfe-4b22-9acd-56b4b482584d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.758 226239 DEBUG nova.compute.manager [req-f5be37f7-d86f-4fb3-a280-a1f9669e6efd req-113c7e8c-5bfe-4b22-9acd-56b4b482584d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Processing event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.759 226239 DEBUG nova.compute.manager [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.765 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849245.765398, da4e355d-c6c2-446e-8eb1-d2ca8279e549 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.766 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.770 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.775 226239 INFO nova.virt.libvirt.driver [-] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Instance spawned successfully.#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.775 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.819 226239 DEBUG nova.network.neutron [req-9949fb60-20c8-4302-b789-e4725a1740f6 req-38bea645-686b-4da2-97b1-e4097c36ce1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Updated VIF entry in instance network info cache for port 5224a5be-3c99-4dab-acc8-a3d0488d9a42. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.819 226239 DEBUG nova.network.neutron [req-9949fb60-20c8-4302-b789-e4725a1740f6 req-38bea645-686b-4da2-97b1-e4097c36ce1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Updating instance_info_cache with network_info: [{"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.830 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.838 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.838 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.839 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.839 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.840 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.840 226239 DEBUG nova.virt.libvirt.driver [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.844 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.863 226239 DEBUG oslo_concurrency.lockutils [req-9949fb60-20c8-4302-b789-e4725a1740f6 req-38bea645-686b-4da2-97b1-e4097c36ce1c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:47:25 np0005603623 nova_compute[226235]: 2026-01-31 08:47:25.906 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:47:26 np0005603623 nova_compute[226235]: 2026-01-31 08:47:26.022 226239 INFO nova.compute.manager [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Took 15.16 seconds to spawn the instance on the hypervisor.
Jan 31 03:47:26 np0005603623 nova_compute[226235]: 2026-01-31 08:47:26.023 226239 DEBUG nova.compute.manager [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:47:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:26.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:26 np0005603623 nova_compute[226235]: 2026-01-31 08:47:26.132 226239 INFO nova.compute.manager [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Took 16.53 seconds to build instance.
Jan 31 03:47:26 np0005603623 nova_compute[226235]: 2026-01-31 08:47:26.158 226239 DEBUG oslo_concurrency.lockutils [None req-c3a9a980-85b8-4c71-aa11-57d43be8cffb aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:47:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:26.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:27 np0005603623 nova_compute[226235]: 2026-01-31 08:47:27.964 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:28.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:28.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:29 np0005603623 nova_compute[226235]: 2026-01-31 08:47:29.773 226239 DEBUG nova.compute.manager [req-0c06e0ee-cd6e-4a25-8622-ff9f41615a6f req-2493a98c-c7c0-465d-a9e1-3ac172833b25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:47:29 np0005603623 nova_compute[226235]: 2026-01-31 08:47:29.774 226239 DEBUG oslo_concurrency.lockutils [req-0c06e0ee-cd6e-4a25-8622-ff9f41615a6f req-2493a98c-c7c0-465d-a9e1-3ac172833b25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:47:29 np0005603623 nova_compute[226235]: 2026-01-31 08:47:29.774 226239 DEBUG oslo_concurrency.lockutils [req-0c06e0ee-cd6e-4a25-8622-ff9f41615a6f req-2493a98c-c7c0-465d-a9e1-3ac172833b25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:47:29 np0005603623 nova_compute[226235]: 2026-01-31 08:47:29.775 226239 DEBUG oslo_concurrency.lockutils [req-0c06e0ee-cd6e-4a25-8622-ff9f41615a6f req-2493a98c-c7c0-465d-a9e1-3ac172833b25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:47:29 np0005603623 nova_compute[226235]: 2026-01-31 08:47:29.775 226239 DEBUG nova.compute.manager [req-0c06e0ee-cd6e-4a25-8622-ff9f41615a6f req-2493a98c-c7c0-465d-a9e1-3ac172833b25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] No waiting events found dispatching network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:47:29 np0005603623 nova_compute[226235]: 2026-01-31 08:47:29.775 226239 WARNING nova.compute.manager [req-0c06e0ee-cd6e-4a25-8622-ff9f41615a6f req-2493a98c-c7c0-465d-a9e1-3ac172833b25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received unexpected event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 for instance with vm_state active and task_state None.
Jan 31 03:47:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:30.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:30.139 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:47:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:30.139 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:47:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:30.140 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:47:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:30.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:30 np0005603623 nova_compute[226235]: 2026-01-31 08:47:30.301 226239 INFO nova.compute.manager [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Rescuing
Jan 31 03:47:30 np0005603623 nova_compute[226235]: 2026-01-31 08:47:30.302 226239 DEBUG oslo_concurrency.lockutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:47:30 np0005603623 nova_compute[226235]: 2026-01-31 08:47:30.302 226239 DEBUG oslo_concurrency.lockutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquired lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:47:30 np0005603623 nova_compute[226235]: 2026-01-31 08:47:30.302 226239 DEBUG nova.network.neutron [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:47:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:30 np0005603623 nova_compute[226235]: 2026-01-31 08:47:30.523 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:47:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:32.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:47:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:32.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:32 np0005603623 nova_compute[226235]: 2026-01-31 08:47:32.993 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:34 np0005603623 nova_compute[226235]: 2026-01-31 08:47:34.066 226239 DEBUG nova.network.neutron [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Updating instance_info_cache with network_info: [{"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:47:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:34.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:34.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:34 np0005603623 nova_compute[226235]: 2026-01-31 08:47:34.176 226239 DEBUG oslo_concurrency.lockutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Releasing lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:47:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:35 np0005603623 nova_compute[226235]: 2026-01-31 08:47:35.524 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:36.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:36 np0005603623 nova_compute[226235]: 2026-01-31 08:47:36.147 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:36.147 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:47:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:36.149 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 03:47:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:36.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:36 np0005603623 nova_compute[226235]: 2026-01-31 08:47:36.285 226239 DEBUG nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 31 03:47:36 np0005603623 ceph-osd[79732]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 31 03:47:37 np0005603623 nova_compute[226235]: 2026-01-31 08:47:37.996 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:38 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #56. Immutable memtables: 12.
Jan 31 03:47:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:38.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:38.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:47:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/253624523' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:47:39 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:39Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:96:10 10.100.0.4
Jan 31 03:47:39 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:39Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:96:10 10.100.0.4
Jan 31 03:47:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:40.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:40.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:40 np0005603623 nova_compute[226235]: 2026-01-31 08:47:40.526 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:41.151 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:47:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:42.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:42.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:43 np0005603623 nova_compute[226235]: 2026-01-31 08:47:42.999 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:47:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/319647191' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:47:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:44.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:44.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:44 np0005603623 nova_compute[226235]: 2026-01-31 08:47:44.951 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:45 np0005603623 nova_compute[226235]: 2026-01-31 08:47:45.529 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:46.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:46.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:46 np0005603623 nova_compute[226235]: 2026-01-31 08:47:46.323 226239 DEBUG nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 31 03:47:48 np0005603623 nova_compute[226235]: 2026-01-31 08:47:48.001 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:48.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:48.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:49 np0005603623 kernel: tap5224a5be-3c (unregistering): left promiscuous mode
Jan 31 03:47:49 np0005603623 NetworkManager[48970]: <info>  [1769849269.9873] device (tap5224a5be-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:47:49 np0005603623 nova_compute[226235]: 2026-01-31 08:47:49.995 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:49 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:49Z|00675|binding|INFO|Releasing lport 5224a5be-3c99-4dab-acc8-a3d0488d9a42 from this chassis (sb_readonly=0)
Jan 31 03:47:49 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:49Z|00676|binding|INFO|Setting lport 5224a5be-3c99-4dab-acc8-a3d0488d9a42 down in Southbound
Jan 31 03:47:49 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:49Z|00677|binding|INFO|Removing iface tap5224a5be-3c ovn-installed in OVS
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.003 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:50 np0005603623 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Jan 31 03:47:50 np0005603623 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a2.scope: Consumed 12.973s CPU time.
Jan 31 03:47:50 np0005603623 systemd-machined[194379]: Machine qemu-76-instance-000000a2 terminated.
Jan 31 03:47:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:50.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:50.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.342 226239 INFO nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Instance shutdown successfully after 14 seconds.
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.349 226239 INFO nova.virt.libvirt.driver [-] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Instance destroyed successfully.
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.349 226239 DEBUG nova.objects.instance [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'numa_topology' on Instance uuid da4e355d-c6c2-446e-8eb1-d2ca8279e549 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.513 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:96:10 10.100.0.4'], port_security=['fa:16:3e:15:96:10 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'da4e355d-c6c2-446e-8eb1-d2ca8279e549', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4937dacf-809a-410a-970f-8b358db49b15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=5224a5be-3c99-4dab-acc8-a3d0488d9a42) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.514 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 5224a5be-3c99-4dab-acc8-a3d0488d9a42 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc unbound from our chassis
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.516 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2128154c-0218-4f66-9509-e0db66eba3fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 31 03:47:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.517 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8cffbd43-d098-4ab5-b20c-bc5c9f9ca8a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.517 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc namespace which is not needed anymore
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.530 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:50 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302246]: [NOTICE]   (302250) : haproxy version is 2.8.14-c23fe91
Jan 31 03:47:50 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302246]: [NOTICE]   (302250) : path to executable is /usr/sbin/haproxy
Jan 31 03:47:50 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302246]: [WARNING]  (302250) : Exiting Master process...
Jan 31 03:47:50 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302246]: [ALERT]    (302250) : Current worker (302252) exited with code 143 (Terminated)
Jan 31 03:47:50 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302246]: [WARNING]  (302250) : All workers exited. Exiting... (0)
Jan 31 03:47:50 np0005603623 systemd[1]: libpod-4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11.scope: Deactivated successfully.
Jan 31 03:47:50 np0005603623 podman[302358]: 2026-01-31 08:47:50.641275848 +0000 UTC m=+0.043207027 container died 4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:47:50 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11-userdata-shm.mount: Deactivated successfully.
Jan 31 03:47:50 np0005603623 systemd[1]: var-lib-containers-storage-overlay-9b5ecbd6f1ac18f1acbd306f6554cfd9deae10cd11d24038d02b1bc60da3f314-merged.mount: Deactivated successfully.
Jan 31 03:47:50 np0005603623 podman[302358]: 2026-01-31 08:47:50.674253092 +0000 UTC m=+0.076184281 container cleanup 4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:47:50 np0005603623 systemd[1]: libpod-conmon-4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11.scope: Deactivated successfully.
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.698 226239 INFO nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Attempting rescue#033[00m
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.699 226239 DEBUG nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.702 226239 DEBUG nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.702 226239 INFO nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Creating image(s)#033[00m
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.728 226239 DEBUG nova.storage.rbd_utils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:50 np0005603623 podman[302389]: 2026-01-31 08:47:50.731270621 +0000 UTC m=+0.043426753 container remove 4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.735 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[343697fe-e6c6-4eef-a699-304ae118e19b]: (4, ('Sat Jan 31 08:47:50 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc (4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11)\n4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11\nSat Jan 31 08:47:50 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc (4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11)\n4981b5deb383659879cb55a50f8cf0f6d98befb3695c3924cec06e31e9047a11\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.735 226239 DEBUG nova.objects.instance [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'trusted_certs' on Instance uuid da4e355d-c6c2-446e-8eb1-d2ca8279e549 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.737 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dddb7a48-44a6-4e87-b605-b6cd181792f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.737 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.739 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:50 np0005603623 kernel: tap2128154c-00: left promiscuous mode
Jan 31 03:47:50 np0005603623 nova_compute[226235]: 2026-01-31 08:47:50.747 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.751 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6835538b-7ab2-4470-9c42-2ed2213fa29f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.774 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[22c8e33f-b88d-4bd4-9ec1-080d3a3ac6c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.775 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[61099a67-2218-4c9e-a173-1d2cc49b94f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.785 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[84185514-4115-4e2a-acd6-7aeefa7a2b84]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 838523, 'reachable_time': 21708, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302425, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.787 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:47:50 np0005603623 systemd[1]: run-netns-ovnmeta\x2d2128154c\x2d0218\x2d4f66\x2d9509\x2de0db66eba3fc.mount: Deactivated successfully.
Jan 31 03:47:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:50.787 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee04f26-558f-477a-80c1-f64aa5c8f6b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.051 226239 DEBUG nova.storage.rbd_utils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.076 226239 DEBUG nova.storage.rbd_utils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.079 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.135 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.136 226239 DEBUG oslo_concurrency.lockutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.136 226239 DEBUG oslo_concurrency.lockutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.137 226239 DEBUG oslo_concurrency.lockutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.158 226239 DEBUG nova.storage.rbd_utils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.161 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.435 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.274s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.436 226239 DEBUG nova.objects.instance [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'migration_context' on Instance uuid da4e355d-c6c2-446e-8eb1-d2ca8279e549 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.608 226239 DEBUG nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.609 226239 DEBUG nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Start _get_guest_xml network_info=[{"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "vif_mac": "fa:16:3e:15:96:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.610 226239 DEBUG nova.objects.instance [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'resources' on Instance uuid da4e355d-c6c2-446e-8eb1-d2ca8279e549 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.783 226239 WARNING nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.790 226239 DEBUG nova.virt.libvirt.host [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.791 226239 DEBUG nova.virt.libvirt.host [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.795 226239 DEBUG nova.virt.libvirt.host [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.795 226239 DEBUG nova.virt.libvirt.host [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.796 226239 DEBUG nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.796 226239 DEBUG nova.virt.hardware [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.796 226239 DEBUG nova.virt.hardware [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.797 226239 DEBUG nova.virt.hardware [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.797 226239 DEBUG nova.virt.hardware [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.797 226239 DEBUG nova.virt.hardware [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.797 226239 DEBUG nova.virt.hardware [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.797 226239 DEBUG nova.virt.hardware [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.798 226239 DEBUG nova.virt.hardware [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.798 226239 DEBUG nova.virt.hardware [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.798 226239 DEBUG nova.virt.hardware [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.798 226239 DEBUG nova.virt.hardware [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:47:51 np0005603623 nova_compute[226235]: 2026-01-31 08:47:51.798 226239 DEBUG nova.objects.instance [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'vcpu_model' on Instance uuid da4e355d-c6c2-446e-8eb1-d2ca8279e549 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:52 np0005603623 nova_compute[226235]: 2026-01-31 08:47:52.024 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:52.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:52.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:47:52 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3553480562' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:47:52 np0005603623 nova_compute[226235]: 2026-01-31 08:47:52.444 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:52 np0005603623 nova_compute[226235]: 2026-01-31 08:47:52.445 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:47:52 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2639290693' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:47:52 np0005603623 nova_compute[226235]: 2026-01-31 08:47:52.859 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:52 np0005603623 nova_compute[226235]: 2026-01-31 08:47:52.860 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.004 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:47:53 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2718387945' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.261 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.262 226239 DEBUG nova.virt.libvirt.vif [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1869656790',display_name='tempest-ServerRescueNegativeTestJSON-server-1869656790',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1869656790',id=162,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:47:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bf1c3d387dbe4191b4d05bdfca5959da',ramdisk_id='',reservation_id='r-l0z9mou0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-81297706',owner_user_name='tempest-ServerRescueNegativeTestJSON-81297706-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:47:26Z,user_data=None,user_id='aa7f893021af4a84b03d85b476dadfe0',uuid=da4e355d-c6c2-446e-8eb1-d2ca8279e549,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "vif_mac": "fa:16:3e:15:96:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.263 226239 DEBUG nova.network.os_vif_util [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converting VIF {"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "vif_mac": "fa:16:3e:15:96:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.264 226239 DEBUG nova.network.os_vif_util [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:96:10,bridge_name='br-int',has_traffic_filtering=True,id=5224a5be-3c99-4dab-acc8-a3d0488d9a42,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5224a5be-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.265 226239 DEBUG nova.objects.instance [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'pci_devices' on Instance uuid da4e355d-c6c2-446e-8eb1-d2ca8279e549 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.435 226239 DEBUG nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  <uuid>da4e355d-c6c2-446e-8eb1-d2ca8279e549</uuid>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  <name>instance-000000a2</name>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1869656790</nova:name>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:47:51</nova:creationTime>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <nova:user uuid="aa7f893021af4a84b03d85b476dadfe0">tempest-ServerRescueNegativeTestJSON-81297706-project-member</nova:user>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <nova:project uuid="bf1c3d387dbe4191b4d05bdfca5959da">tempest-ServerRescueNegativeTestJSON-81297706</nova:project>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <nova:port uuid="5224a5be-3c99-4dab-acc8-a3d0488d9a42">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <entry name="serial">da4e355d-c6c2-446e-8eb1-d2ca8279e549</entry>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <entry name="uuid">da4e355d-c6c2-446e-8eb1-d2ca8279e549</entry>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.rescue">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <target dev="vdb" bus="virtio"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.config.rescue">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:15:96:10"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <target dev="tap5224a5be-3c"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/console.log" append="off"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:47:53 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:47:53 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:47:53 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:47:53 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.442 226239 INFO nova.virt.libvirt.driver [-] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Instance destroyed successfully.#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.821 226239 DEBUG nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.822 226239 DEBUG nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.822 226239 DEBUG nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.822 226239 DEBUG nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No VIF found with MAC fa:16:3e:15:96:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.822 226239 INFO nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Using config drive#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.846 226239 DEBUG nova.storage.rbd_utils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.906 226239 DEBUG nova.objects.instance [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'ec2_ids' on Instance uuid da4e355d-c6c2-446e-8eb1-d2ca8279e549 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:53 np0005603623 nova_compute[226235]: 2026-01-31 08:47:53.986 226239 DEBUG nova.objects.instance [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'keypairs' on Instance uuid da4e355d-c6c2-446e-8eb1-d2ca8279e549 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:54.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:54.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.427 226239 DEBUG nova.compute.manager [req-5e3951a7-722c-491a-b3c4-2aa182ac9e59 req-6a89459b-d5c4-4270-8d49-684ef1accee0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received event network-vif-unplugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.427 226239 DEBUG oslo_concurrency.lockutils [req-5e3951a7-722c-491a-b3c4-2aa182ac9e59 req-6a89459b-d5c4-4270-8d49-684ef1accee0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.428 226239 DEBUG oslo_concurrency.lockutils [req-5e3951a7-722c-491a-b3c4-2aa182ac9e59 req-6a89459b-d5c4-4270-8d49-684ef1accee0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.428 226239 DEBUG oslo_concurrency.lockutils [req-5e3951a7-722c-491a-b3c4-2aa182ac9e59 req-6a89459b-d5c4-4270-8d49-684ef1accee0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.428 226239 DEBUG nova.compute.manager [req-5e3951a7-722c-491a-b3c4-2aa182ac9e59 req-6a89459b-d5c4-4270-8d49-684ef1accee0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] No waiting events found dispatching network-vif-unplugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.428 226239 WARNING nova.compute.manager [req-5e3951a7-722c-491a-b3c4-2aa182ac9e59 req-6a89459b-d5c4-4270-8d49-684ef1accee0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received unexpected event network-vif-unplugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.429 226239 DEBUG nova.compute.manager [req-5e3951a7-722c-491a-b3c4-2aa182ac9e59 req-6a89459b-d5c4-4270-8d49-684ef1accee0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.429 226239 DEBUG oslo_concurrency.lockutils [req-5e3951a7-722c-491a-b3c4-2aa182ac9e59 req-6a89459b-d5c4-4270-8d49-684ef1accee0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.429 226239 DEBUG oslo_concurrency.lockutils [req-5e3951a7-722c-491a-b3c4-2aa182ac9e59 req-6a89459b-d5c4-4270-8d49-684ef1accee0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.429 226239 DEBUG oslo_concurrency.lockutils [req-5e3951a7-722c-491a-b3c4-2aa182ac9e59 req-6a89459b-d5c4-4270-8d49-684ef1accee0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.430 226239 DEBUG nova.compute.manager [req-5e3951a7-722c-491a-b3c4-2aa182ac9e59 req-6a89459b-d5c4-4270-8d49-684ef1accee0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] No waiting events found dispatching network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.430 226239 WARNING nova.compute.manager [req-5e3951a7-722c-491a-b3c4-2aa182ac9e59 req-6a89459b-d5c4-4270-8d49-684ef1accee0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received unexpected event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.439 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.766 226239 INFO nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Creating config drive at /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/disk.config.rescue#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.771 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp2ewd6oeh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.897 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp2ewd6oeh" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.923 226239 DEBUG nova.storage.rbd_utils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:54 np0005603623 nova_compute[226235]: 2026-01-31 08:47:54.927 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/disk.config.rescue da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:54 np0005603623 podman[302593]: 2026-01-31 08:47:54.958190012 +0000 UTC m=+0.055277715 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:47:54 np0005603623 podman[302594]: 2026-01-31 08:47:54.980941516 +0000 UTC m=+0.076386557 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 31 03:47:55 np0005603623 nova_compute[226235]: 2026-01-31 08:47:55.158 226239 DEBUG oslo_concurrency.processutils [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/disk.config.rescue da4e355d-c6c2-446e-8eb1-d2ca8279e549_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.231s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:55 np0005603623 nova_compute[226235]: 2026-01-31 08:47:55.159 226239 INFO nova.virt.libvirt.driver [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Deleting local config drive /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549/disk.config.rescue because it was imported into RBD.#033[00m
Jan 31 03:47:55 np0005603623 kernel: tap5224a5be-3c: entered promiscuous mode
Jan 31 03:47:55 np0005603623 NetworkManager[48970]: <info>  [1769849275.1993] manager: (tap5224a5be-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/323)
Jan 31 03:47:55 np0005603623 nova_compute[226235]: 2026-01-31 08:47:55.199 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:55Z|00678|binding|INFO|Claiming lport 5224a5be-3c99-4dab-acc8-a3d0488d9a42 for this chassis.
Jan 31 03:47:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:55Z|00679|binding|INFO|5224a5be-3c99-4dab-acc8-a3d0488d9a42: Claiming fa:16:3e:15:96:10 10.100.0.4
Jan 31 03:47:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:55Z|00680|binding|INFO|Setting lport 5224a5be-3c99-4dab-acc8-a3d0488d9a42 ovn-installed in OVS
Jan 31 03:47:55 np0005603623 nova_compute[226235]: 2026-01-31 08:47:55.206 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:55 np0005603623 nova_compute[226235]: 2026-01-31 08:47:55.209 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:55Z|00681|binding|INFO|Setting lport 5224a5be-3c99-4dab-acc8-a3d0488d9a42 up in Southbound
Jan 31 03:47:55 np0005603623 systemd-udevd[302683]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.221 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:96:10 10.100.0.4'], port_security=['fa:16:3e:15:96:10 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'da4e355d-c6c2-446e-8eb1-d2ca8279e549', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '5', 'neutron:security_group_ids': '4937dacf-809a-410a-970f-8b358db49b15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=5224a5be-3c99-4dab-acc8-a3d0488d9a42) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.222 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 5224a5be-3c99-4dab-acc8-a3d0488d9a42 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc bound to our chassis#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.223 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2128154c-0218-4f66-9509-e0db66eba3fc#033[00m
Jan 31 03:47:55 np0005603623 NetworkManager[48970]: <info>  [1769849275.2328] device (tap5224a5be-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.231 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8d968621-ad2e-4bdc-8fe6-7ac0af78c295]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.232 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2128154c-01 in ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:47:55 np0005603623 NetworkManager[48970]: <info>  [1769849275.2333] device (tap5224a5be-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:47:55 np0005603623 systemd-machined[194379]: New machine qemu-77-instance-000000a2.
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.236 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2128154c-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.236 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[029e9826-525b-4811-830a-56d99e6eefe9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.238 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[03ba392b-3702-4784-b261-7daa2a3b0b42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 systemd[1]: Started Virtual Machine qemu-77-instance-000000a2.
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.246 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[7b680f5d-0473-4dcd-bc55-b27134abe197]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.268 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[af6a8a6c-72cb-475d-af35-6bd454eedb61]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.287 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[af37fa14-245f-429f-b202-2fcbb69a70ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.292 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5995708c-e891-4d48-a07c-7498e3319bb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 NetworkManager[48970]: <info>  [1769849275.2931] manager: (tap2128154c-00): new Veth device (/org/freedesktop/NetworkManager/Devices/324)
Jan 31 03:47:55 np0005603623 systemd-udevd[302697]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.313 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[495fdfb5-ccaa-4a8c-8093-b543e79074c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.316 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b389d2c3-f135-401b-a1ae-466d88b1eefc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 NetworkManager[48970]: <info>  [1769849275.3338] device (tap2128154c-00): carrier: link connected
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.337 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[2fcbd39e-9ee3-423f-904d-4369f855ac64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.350 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[85d70b37-60b9-41db-a58a-b7a70f491c41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 841576, 'reachable_time': 16153, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302769, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.361 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7ccb336e-d55d-43f1-a892-16c5e6d34335]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:3208'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 841576, 'tstamp': 841576}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 302770, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.372 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1e7fc68c-38d8-42cd-b776-2e927ede4f58]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 841576, 'reachable_time': 16153, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 302771, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.389 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[453acb73-0b83-4793-9caf-1ba3cc068f3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.419 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6c3c2efb-bbe3-41e4-9a53-1e38ff2fabf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.421 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.421 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.422 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2128154c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:55 np0005603623 kernel: tap2128154c-00: entered promiscuous mode
Jan 31 03:47:55 np0005603623 NetworkManager[48970]: <info>  [1769849275.4250] manager: (tap2128154c-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Jan 31 03:47:55 np0005603623 nova_compute[226235]: 2026-01-31 08:47:55.427 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.428 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2128154c-00, col_values=(('external_ids', {'iface-id': '5976b74a-78ce-46e1-bd2c-76a2a502c8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:47:55Z|00682|binding|INFO|Releasing lport 5976b74a-78ce-46e1-bd2c-76a2a502c8f5 from this chassis (sb_readonly=0)
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.430 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.431 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6c1299c8-c6d1-405e-84e2-ddc6f3763590]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.432 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-2128154c-0218-4f66-9509-e0db66eba3fc
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 2128154c-0218-4f66-9509-e0db66eba3fc
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 31 03:47:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:47:55.432 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'env', 'PROCESS_TAG=haproxy-2128154c-0218-4f66-9509-e0db66eba3fc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2128154c-0218-4f66-9509-e0db66eba3fc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 31 03:47:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:55 np0005603623 podman[302859]: 2026-01-31 08:47:55.768546543 +0000 UTC m=+0.046125667 container create 48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:47:55 np0005603623 systemd[1]: Started libpod-conmon-48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7.scope.
Jan 31 03:47:55 np0005603623 nova_compute[226235]: 2026-01-31 08:47:55.807 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Removed pending event for da4e355d-c6c2-446e-8eb1-d2ca8279e549 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 31 03:47:55 np0005603623 nova_compute[226235]: 2026-01-31 08:47:55.807 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849275.806648, da4e355d-c6c2-446e-8eb1-d2ca8279e549 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:47:55 np0005603623 nova_compute[226235]: 2026-01-31 08:47:55.808 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] VM Resumed (Lifecycle Event)
Jan 31 03:47:55 np0005603623 nova_compute[226235]: 2026-01-31 08:47:55.812 226239 DEBUG nova.compute.manager [None req-76dbaefe-dbf5-4319-b94c-1b71c6923d02 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:47:55 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:47:55 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af6961a16a3922407915a4aeb0886c8d731b95c44887313e6d243b222e185681/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:47:55 np0005603623 podman[302859]: 2026-01-31 08:47:55.831013843 +0000 UTC m=+0.108592987 container init 48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:47:55 np0005603623 podman[302859]: 2026-01-31 08:47:55.837519697 +0000 UTC m=+0.115098821 container start 48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:47:55 np0005603623 podman[302859]: 2026-01-31 08:47:55.743194998 +0000 UTC m=+0.020774152 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:47:55 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302876]: [NOTICE]   (302880) : New worker (302882) forked
Jan 31 03:47:55 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302876]: [NOTICE]   (302880) : Loading success.
Jan 31 03:47:55 np0005603623 nova_compute[226235]: 2026-01-31 08:47:55.976 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:47:55 np0005603623 nova_compute[226235]: 2026-01-31 08:47:55.980 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:47:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:56.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:56.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:56 np0005603623 nova_compute[226235]: 2026-01-31 08:47:56.192 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849275.80784, da4e355d-c6c2-446e-8eb1-d2ca8279e549 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:47:56 np0005603623 nova_compute[226235]: 2026-01-31 08:47:56.192 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] VM Started (Lifecycle Event)
Jan 31 03:47:56 np0005603623 nova_compute[226235]: 2026-01-31 08:47:56.643 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:47:56 np0005603623 nova_compute[226235]: 2026-01-31 08:47:56.646 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #142. Immutable memtables: 0.
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.687257) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 89] Flushing memtable with next log file: 142
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849276687360, "job": 89, "event": "flush_started", "num_memtables": 1, "num_entries": 1205, "num_deletes": 255, "total_data_size": 2437618, "memory_usage": 2473632, "flush_reason": "Manual Compaction"}
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 89] Level-0 flush table #143: started
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849276700008, "cf_name": "default", "job": 89, "event": "table_file_creation", "file_number": 143, "file_size": 1607580, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 69324, "largest_seqno": 70524, "table_properties": {"data_size": 1602497, "index_size": 2542, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11355, "raw_average_key_size": 19, "raw_value_size": 1592067, "raw_average_value_size": 2730, "num_data_blocks": 112, "num_entries": 583, "num_filter_entries": 583, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849182, "oldest_key_time": 1769849182, "file_creation_time": 1769849276, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 143, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 89] Flush lasted 12801 microseconds, and 3907 cpu microseconds.
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.700050) [db/flush_job.cc:967] [default] [JOB 89] Level-0 flush table #143: 1607580 bytes OK
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.700080) [db/memtable_list.cc:519] [default] Level-0 commit table #143 started
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.703977) [db/memtable_list.cc:722] [default] Level-0 commit table #143: memtable #1 done
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.703995) EVENT_LOG_v1 {"time_micros": 1769849276703989, "job": 89, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.704013) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 89] Try to delete WAL files size 2431832, prev total WAL file size 2431832, number of live WAL files 2.
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000139.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.704716) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032353133' seq:72057594037927935, type:22 .. '6C6F676D0032373634' seq:0, type:0; will stop at (end)
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 90] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 89 Base level 0, inputs: [143(1569KB)], [141(9527KB)]
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849276704806, "job": 90, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [143], "files_L6": [141], "score": -1, "input_data_size": 11363662, "oldest_snapshot_seqno": -1}
Jan 31 03:47:56 np0005603623 nova_compute[226235]: 2026-01-31 08:47:56.733 226239 DEBUG nova.compute.manager [req-48b75ade-7689-4f9a-9929-c21f70533350 req-48c3c8b6-c895-416a-b879-c59b6b946620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:47:56 np0005603623 nova_compute[226235]: 2026-01-31 08:47:56.733 226239 DEBUG oslo_concurrency.lockutils [req-48b75ade-7689-4f9a-9929-c21f70533350 req-48c3c8b6-c895-416a-b879-c59b6b946620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:47:56 np0005603623 nova_compute[226235]: 2026-01-31 08:47:56.733 226239 DEBUG oslo_concurrency.lockutils [req-48b75ade-7689-4f9a-9929-c21f70533350 req-48c3c8b6-c895-416a-b879-c59b6b946620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:47:56 np0005603623 nova_compute[226235]: 2026-01-31 08:47:56.734 226239 DEBUG oslo_concurrency.lockutils [req-48b75ade-7689-4f9a-9929-c21f70533350 req-48c3c8b6-c895-416a-b879-c59b6b946620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:47:56 np0005603623 nova_compute[226235]: 2026-01-31 08:47:56.734 226239 DEBUG nova.compute.manager [req-48b75ade-7689-4f9a-9929-c21f70533350 req-48c3c8b6-c895-416a-b879-c59b6b946620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] No waiting events found dispatching network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:47:56 np0005603623 nova_compute[226235]: 2026-01-31 08:47:56.734 226239 WARNING nova.compute.manager [req-48b75ade-7689-4f9a-9929-c21f70533350 req-48c3c8b6-c895-416a-b879-c59b6b946620 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received unexpected event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 for instance with vm_state rescued and task_state None.
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 90] Generated table #144: 8968 keys, 11230700 bytes, temperature: kUnknown
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849276835167, "cf_name": "default", "job": 90, "event": "table_file_creation", "file_number": 144, "file_size": 11230700, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11173590, "index_size": 33573, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22469, "raw_key_size": 237091, "raw_average_key_size": 26, "raw_value_size": 11017254, "raw_average_value_size": 1228, "num_data_blocks": 1280, "num_entries": 8968, "num_filter_entries": 8968, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769849276, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 144, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.835494) [db/compaction/compaction_job.cc:1663] [default] [JOB 90] Compacted 1@0 + 1@6 files to L6 => 11230700 bytes
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.836635) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 87.1 rd, 86.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 9.3 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(14.1) write-amplify(7.0) OK, records in: 9491, records dropped: 523 output_compression: NoCompression
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.836662) EVENT_LOG_v1 {"time_micros": 1769849276836649, "job": 90, "event": "compaction_finished", "compaction_time_micros": 130454, "compaction_time_cpu_micros": 25862, "output_level": 6, "num_output_files": 1, "total_output_size": 11230700, "num_input_records": 9491, "num_output_records": 8968, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000143.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849276837031, "job": 90, "event": "table_file_deletion", "file_number": 143}
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000141.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849276838489, "job": 90, "event": "table_file_deletion", "file_number": 141}
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.704604) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.838539) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.838544) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.838546) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.838548) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:47:56 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:47:56.838550) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:47:57 np0005603623 nova_compute[226235]: 2026-01-31 08:47:57.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:47:57 np0005603623 nova_compute[226235]: 2026-01-31 08:47:57.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 03:47:58 np0005603623 nova_compute[226235]: 2026-01-31 08:47:58.007 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:58.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:47:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:47:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:58.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:47:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:47:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:47:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:47:59 np0005603623 nova_compute[226235]: 2026-01-31 08:47:59.049 226239 DEBUG nova.compute.manager [req-8bb816e6-7f7c-4d15-8cba-5b94e74eba34 req-1cd02311-9ce8-4424-9e7a-967ff5b0cb33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:47:59 np0005603623 nova_compute[226235]: 2026-01-31 08:47:59.049 226239 DEBUG oslo_concurrency.lockutils [req-8bb816e6-7f7c-4d15-8cba-5b94e74eba34 req-1cd02311-9ce8-4424-9e7a-967ff5b0cb33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:47:59 np0005603623 nova_compute[226235]: 2026-01-31 08:47:59.049 226239 DEBUG oslo_concurrency.lockutils [req-8bb816e6-7f7c-4d15-8cba-5b94e74eba34 req-1cd02311-9ce8-4424-9e7a-967ff5b0cb33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:47:59 np0005603623 nova_compute[226235]: 2026-01-31 08:47:59.050 226239 DEBUG oslo_concurrency.lockutils [req-8bb816e6-7f7c-4d15-8cba-5b94e74eba34 req-1cd02311-9ce8-4424-9e7a-967ff5b0cb33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:47:59 np0005603623 nova_compute[226235]: 2026-01-31 08:47:59.050 226239 DEBUG nova.compute.manager [req-8bb816e6-7f7c-4d15-8cba-5b94e74eba34 req-1cd02311-9ce8-4424-9e7a-967ff5b0cb33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] No waiting events found dispatching network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:47:59 np0005603623 nova_compute[226235]: 2026-01-31 08:47:59.050 226239 WARNING nova.compute.manager [req-8bb816e6-7f7c-4d15-8cba-5b94e74eba34 req-1cd02311-9ce8-4424-9e7a-967ff5b0cb33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received unexpected event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 for instance with vm_state rescued and task_state None.
Jan 31 03:48:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:00.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:00 np0005603623 nova_compute[226235]: 2026-01-31 08:48:00.129 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "c215327f-37ad-41a7-a883-3dbb23334df6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:48:00 np0005603623 nova_compute[226235]: 2026-01-31 08:48:00.129 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:48:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:00.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:00 np0005603623 nova_compute[226235]: 2026-01-31 08:48:00.207 226239 DEBUG nova.compute.manager [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 03:48:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:00 np0005603623 nova_compute[226235]: 2026-01-31 08:48:00.535 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:48:00 np0005603623 nova_compute[226235]: 2026-01-31 08:48:00.536 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:48:00 np0005603623 nova_compute[226235]: 2026-01-31 08:48:00.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:00 np0005603623 nova_compute[226235]: 2026-01-31 08:48:00.551 226239 DEBUG nova.virt.hardware [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:48:00 np0005603623 nova_compute[226235]: 2026-01-31 08:48:00.552 226239 INFO nova.compute.claims [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Claim successful on node compute-2.ctlplane.example.com
Jan 31 03:48:00 np0005603623 nova_compute[226235]: 2026-01-31 08:48:00.855 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:48:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:48:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3040883363' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:48:01 np0005603623 nova_compute[226235]: 2026-01-31 08:48:01.292 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:01 np0005603623 nova_compute[226235]: 2026-01-31 08:48:01.297 226239 DEBUG nova.compute.provider_tree [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:48:01 np0005603623 nova_compute[226235]: 2026-01-31 08:48:01.371 226239 DEBUG nova.scheduler.client.report [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:48:01 np0005603623 nova_compute[226235]: 2026-01-31 08:48:01.490 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.955s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:01 np0005603623 nova_compute[226235]: 2026-01-31 08:48:01.491 226239 DEBUG nova.compute.manager [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:48:01 np0005603623 nova_compute[226235]: 2026-01-31 08:48:01.730 226239 DEBUG nova.compute.manager [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:48:01 np0005603623 nova_compute[226235]: 2026-01-31 08:48:01.730 226239 DEBUG nova.network.neutron [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:48:01 np0005603623 nova_compute[226235]: 2026-01-31 08:48:01.923 226239 INFO nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.030 226239 DEBUG nova.compute.manager [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:48:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:02.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.132 226239 DEBUG nova.policy [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a498364761ef428b99cac3f92e603385', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8397e0fed04b4dabb57148d0924de2dc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:48:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:02.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.373 226239 DEBUG nova.compute.manager [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.375 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.375 226239 INFO nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Creating image(s)#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.402 226239 DEBUG nova.storage.rbd_utils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image c215327f-37ad-41a7-a883-3dbb23334df6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.427 226239 DEBUG nova.storage.rbd_utils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image c215327f-37ad-41a7-a883-3dbb23334df6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.453 226239 DEBUG nova.storage.rbd_utils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image c215327f-37ad-41a7-a883-3dbb23334df6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.458 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.508 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.509 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.509 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.509 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.534 226239 DEBUG nova.storage.rbd_utils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image c215327f-37ad-41a7-a883-3dbb23334df6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.538 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 c215327f-37ad-41a7-a883-3dbb23334df6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:02 np0005603623 nova_compute[226235]: 2026-01-31 08:48:02.969 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 c215327f-37ad-41a7-a883-3dbb23334df6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.058 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.065 226239 DEBUG nova.storage.rbd_utils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] resizing rbd image c215327f-37ad-41a7-a883-3dbb23334df6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.176 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.181 226239 DEBUG nova.objects.instance [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'migration_context' on Instance uuid c215327f-37ad-41a7-a883-3dbb23334df6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.256 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.256 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Ensure instance console log exists: /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.256 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.257 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.257 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.271 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.271 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.271 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.271 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.272 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:48:03 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3016005742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.693 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.841 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.842 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.846 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.846 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.846 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.988 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.989 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3810MB free_disk=20.739471435546875GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.989 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:03 np0005603623 nova_compute[226235]: 2026-01-31 08:48:03.990 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:04.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:04.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:04 np0005603623 nova_compute[226235]: 2026-01-31 08:48:04.224 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0edbf2b9-b76f-446b-85fa-09a4dcb37976 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:48:04 np0005603623 nova_compute[226235]: 2026-01-31 08:48:04.224 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance da4e355d-c6c2-446e-8eb1-d2ca8279e549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:48:04 np0005603623 nova_compute[226235]: 2026-01-31 08:48:04.224 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance c215327f-37ad-41a7-a883-3dbb23334df6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:48:04 np0005603623 nova_compute[226235]: 2026-01-31 08:48:04.224 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:48:04 np0005603623 nova_compute[226235]: 2026-01-31 08:48:04.225 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:48:04 np0005603623 nova_compute[226235]: 2026-01-31 08:48:04.302 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:48:04 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/923701690' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:48:04 np0005603623 nova_compute[226235]: 2026-01-31 08:48:04.726 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:04 np0005603623 nova_compute[226235]: 2026-01-31 08:48:04.730 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:48:04 np0005603623 nova_compute[226235]: 2026-01-31 08:48:04.841 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:48:05 np0005603623 nova_compute[226235]: 2026-01-31 08:48:05.062 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:48:05 np0005603623 nova_compute[226235]: 2026-01-31 08:48:05.063 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:05 np0005603623 nova_compute[226235]: 2026-01-31 08:48:05.537 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:48:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:48:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:06.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:06 np0005603623 nova_compute[226235]: 2026-01-31 08:48:06.166 226239 DEBUG nova.network.neutron [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Successfully created port: fbe66833-82a6-4f72-9b11-a4732140845a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:48:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:06.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:08 np0005603623 nova_compute[226235]: 2026-01-31 08:48:08.042 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:08 np0005603623 nova_compute[226235]: 2026-01-31 08:48:08.042 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:48:08 np0005603623 nova_compute[226235]: 2026-01-31 08:48:08.042 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:48:08 np0005603623 nova_compute[226235]: 2026-01-31 08:48:08.061 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:08.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:08.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:08 np0005603623 nova_compute[226235]: 2026-01-31 08:48:08.938 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:48:08 np0005603623 ovn_controller[133449]: 2026-01-31T08:48:08Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:96:10 10.100.0.4
Jan 31 03:48:09 np0005603623 nova_compute[226235]: 2026-01-31 08:48:09.348 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:48:09 np0005603623 nova_compute[226235]: 2026-01-31 08:48:09.348 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:48:09 np0005603623 nova_compute[226235]: 2026-01-31 08:48:09.349 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:48:09 np0005603623 nova_compute[226235]: 2026-01-31 08:48:09.349 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0edbf2b9-b76f-446b-85fa-09a4dcb37976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:10.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:10.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:10 np0005603623 nova_compute[226235]: 2026-01-31 08:48:10.539 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:11 np0005603623 nova_compute[226235]: 2026-01-31 08:48:11.214 226239 DEBUG nova.network.neutron [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Successfully updated port: fbe66833-82a6-4f72-9b11-a4732140845a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:48:11 np0005603623 nova_compute[226235]: 2026-01-31 08:48:11.250 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:48:11 np0005603623 nova_compute[226235]: 2026-01-31 08:48:11.250 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquired lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:48:11 np0005603623 nova_compute[226235]: 2026-01-31 08:48:11.250 226239 DEBUG nova.network.neutron [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:48:11 np0005603623 nova_compute[226235]: 2026-01-31 08:48:11.978 226239 DEBUG nova.network.neutron [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:48:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:12.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:12.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:12 np0005603623 nova_compute[226235]: 2026-01-31 08:48:12.938 226239 DEBUG nova.compute.manager [req-e02af598-cee2-4110-afaf-e9533c11840e req-020b6861-4c13-424f-9530-a54cf5d5ea37 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received event network-changed-fbe66833-82a6-4f72-9b11-a4732140845a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:48:12 np0005603623 nova_compute[226235]: 2026-01-31 08:48:12.939 226239 DEBUG nova.compute.manager [req-e02af598-cee2-4110-afaf-e9533c11840e req-020b6861-4c13-424f-9530-a54cf5d5ea37 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Refreshing instance network info cache due to event network-changed-fbe66833-82a6-4f72-9b11-a4732140845a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:48:12 np0005603623 nova_compute[226235]: 2026-01-31 08:48:12.939 226239 DEBUG oslo_concurrency.lockutils [req-e02af598-cee2-4110-afaf-e9533c11840e req-020b6861-4c13-424f-9530-a54cf5d5ea37 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:48:13 np0005603623 nova_compute[226235]: 2026-01-31 08:48:13.063 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.128 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Updating instance_info_cache with network_info: [{"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:48:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:14.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.150 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.150 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.152 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:14.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.602 226239 DEBUG nova.network.neutron [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updating instance_info_cache with network_info: [{"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.720 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Releasing lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.721 226239 DEBUG nova.compute.manager [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Instance network_info: |[{"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.722 226239 DEBUG oslo_concurrency.lockutils [req-e02af598-cee2-4110-afaf-e9533c11840e req-020b6861-4c13-424f-9530-a54cf5d5ea37 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.722 226239 DEBUG nova.network.neutron [req-e02af598-cee2-4110-afaf-e9533c11840e req-020b6861-4c13-424f-9530-a54cf5d5ea37 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Refreshing network info cache for port fbe66833-82a6-4f72-9b11-a4732140845a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.727 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Start _get_guest_xml network_info=[{"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.733 226239 WARNING nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.747 226239 DEBUG nova.virt.libvirt.host [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.748 226239 DEBUG nova.virt.libvirt.host [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.756 226239 DEBUG nova.virt.libvirt.host [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.757 226239 DEBUG nova.virt.libvirt.host [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.759 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.759 226239 DEBUG nova.virt.hardware [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.760 226239 DEBUG nova.virt.hardware [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.760 226239 DEBUG nova.virt.hardware [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.760 226239 DEBUG nova.virt.hardware [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.760 226239 DEBUG nova.virt.hardware [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.761 226239 DEBUG nova.virt.hardware [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.761 226239 DEBUG nova.virt.hardware [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.761 226239 DEBUG nova.virt.hardware [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.761 226239 DEBUG nova.virt.hardware [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.762 226239 DEBUG nova.virt.hardware [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.762 226239 DEBUG nova.virt.hardware [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:48:14 np0005603623 nova_compute[226235]: 2026-01-31 08:48:14.765 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:48:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3803277030' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.189 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.214 226239 DEBUG nova.storage.rbd_utils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image c215327f-37ad-41a7-a883-3dbb23334df6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.218 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.259 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.541 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:48:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2881961970' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.626 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.628 226239 DEBUG nova.virt.libvirt.vif [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:47:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-0',id=163,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-libm6dxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',netw
ork_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:48:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=c215327f-37ad-41a7-a883-3dbb23334df6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.628 226239 DEBUG nova.network.os_vif_util [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.630 226239 DEBUG nova.network.os_vif_util [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:4d:37,bridge_name='br-int',has_traffic_filtering=True,id=fbe66833-82a6-4f72-9b11-a4732140845a,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbe66833-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.631 226239 DEBUG nova.objects.instance [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'pci_devices' on Instance uuid c215327f-37ad-41a7-a883-3dbb23334df6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.690 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  <uuid>c215327f-37ad-41a7-a883-3dbb23334df6</uuid>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  <name>instance-000000a3</name>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <nova:name>multiattach-server-0</nova:name>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:48:14</nova:creationTime>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <nova:user uuid="a498364761ef428b99cac3f92e603385">tempest-AttachVolumeMultiAttachTest-1931311941-project-member</nova:user>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <nova:project uuid="8397e0fed04b4dabb57148d0924de2dc">tempest-AttachVolumeMultiAttachTest-1931311941</nova:project>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <nova:port uuid="fbe66833-82a6-4f72-9b11-a4732140845a">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <entry name="serial">c215327f-37ad-41a7-a883-3dbb23334df6</entry>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <entry name="uuid">c215327f-37ad-41a7-a883-3dbb23334df6</entry>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/c215327f-37ad-41a7-a883-3dbb23334df6_disk">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/c215327f-37ad-41a7-a883-3dbb23334df6_disk.config">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:d6:4d:37"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <target dev="tapfbe66833-82"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6/console.log" append="off"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:48:15 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:48:15 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:48:15 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:48:15 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.691 226239 DEBUG nova.compute.manager [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Preparing to wait for external event network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.691 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.692 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.692 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.693 226239 DEBUG nova.virt.libvirt.vif [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:47:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-0',id=163,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-libm6dxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ra
m='0',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:48:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=c215327f-37ad-41a7-a883-3dbb23334df6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.693 226239 DEBUG nova.network.os_vif_util [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.694 226239 DEBUG nova.network.os_vif_util [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:4d:37,bridge_name='br-int',has_traffic_filtering=True,id=fbe66833-82a6-4f72-9b11-a4732140845a,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbe66833-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.694 226239 DEBUG os_vif [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:4d:37,bridge_name='br-int',has_traffic_filtering=True,id=fbe66833-82a6-4f72-9b11-a4732140845a,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbe66833-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.695 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.695 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.696 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.700 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.700 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbe66833-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.701 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfbe66833-82, col_values=(('external_ids', {'iface-id': 'fbe66833-82a6-4f72-9b11-a4732140845a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:4d:37', 'vm-uuid': 'c215327f-37ad-41a7-a883-3dbb23334df6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.702 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:15 np0005603623 NetworkManager[48970]: <info>  [1769849295.7036] manager: (tapfbe66833-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.704 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.709 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:15 np0005603623 nova_compute[226235]: 2026-01-31 08:48:15.710 226239 INFO os_vif [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:4d:37,bridge_name='br-int',has_traffic_filtering=True,id=fbe66833-82a6-4f72-9b11-a4732140845a,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbe66833-82')#033[00m
Jan 31 03:48:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:16.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:16.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:16 np0005603623 nova_compute[226235]: 2026-01-31 08:48:16.978 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:48:16 np0005603623 nova_compute[226235]: 2026-01-31 08:48:16.978 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:48:16 np0005603623 nova_compute[226235]: 2026-01-31 08:48:16.979 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No VIF found with MAC fa:16:3e:d6:4d:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:48:16 np0005603623 nova_compute[226235]: 2026-01-31 08:48:16.979 226239 INFO nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Using config drive#033[00m
Jan 31 03:48:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:16.986 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:48:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:16.987 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:48:17 np0005603623 nova_compute[226235]: 2026-01-31 08:48:17.006 226239 DEBUG nova.storage.rbd_utils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image c215327f-37ad-41a7-a883-3dbb23334df6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:17 np0005603623 nova_compute[226235]: 2026-01-31 08:48:17.011 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:18.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:18.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:18 np0005603623 nova_compute[226235]: 2026-01-31 08:48:18.350 226239 INFO nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Creating config drive at /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6/disk.config#033[00m
Jan 31 03:48:18 np0005603623 nova_compute[226235]: 2026-01-31 08:48:18.355 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmprey97j5_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:18 np0005603623 nova_compute[226235]: 2026-01-31 08:48:18.480 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmprey97j5_" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:18 np0005603623 nova_compute[226235]: 2026-01-31 08:48:18.508 226239 DEBUG nova.storage.rbd_utils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image c215327f-37ad-41a7-a883-3dbb23334df6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:18 np0005603623 nova_compute[226235]: 2026-01-31 08:48:18.511 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6/disk.config c215327f-37ad-41a7-a883-3dbb23334df6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:18 np0005603623 nova_compute[226235]: 2026-01-31 08:48:18.788 226239 DEBUG nova.network.neutron [req-e02af598-cee2-4110-afaf-e9533c11840e req-020b6861-4c13-424f-9530-a54cf5d5ea37 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updated VIF entry in instance network info cache for port fbe66833-82a6-4f72-9b11-a4732140845a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:48:18 np0005603623 nova_compute[226235]: 2026-01-31 08:48:18.788 226239 DEBUG nova.network.neutron [req-e02af598-cee2-4110-afaf-e9533c11840e req-020b6861-4c13-424f-9530-a54cf5d5ea37 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updating instance_info_cache with network_info: [{"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:48:18 np0005603623 nova_compute[226235]: 2026-01-31 08:48:18.868 226239 DEBUG oslo_concurrency.lockutils [req-e02af598-cee2-4110-afaf-e9533c11840e req-020b6861-4c13-424f-9530-a54cf5d5ea37 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:48:19 np0005603623 nova_compute[226235]: 2026-01-31 08:48:19.574 226239 DEBUG oslo_concurrency.processutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6/disk.config c215327f-37ad-41a7-a883-3dbb23334df6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:19 np0005603623 nova_compute[226235]: 2026-01-31 08:48:19.574 226239 INFO nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Deleting local config drive /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6/disk.config because it was imported into RBD.#033[00m
Jan 31 03:48:19 np0005603623 NetworkManager[48970]: <info>  [1769849299.6152] manager: (tapfbe66833-82): new Tun device (/org/freedesktop/NetworkManager/Devices/327)
Jan 31 03:48:19 np0005603623 kernel: tapfbe66833-82: entered promiscuous mode
Jan 31 03:48:19 np0005603623 nova_compute[226235]: 2026-01-31 08:48:19.618 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:48:19Z|00683|binding|INFO|Claiming lport fbe66833-82a6-4f72-9b11-a4732140845a for this chassis.
Jan 31 03:48:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:48:19Z|00684|binding|INFO|fbe66833-82a6-4f72-9b11-a4732140845a: Claiming fa:16:3e:d6:4d:37 10.100.0.6
Jan 31 03:48:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:48:19Z|00685|binding|INFO|Setting lport fbe66833-82a6-4f72-9b11-a4732140845a ovn-installed in OVS
Jan 31 03:48:19 np0005603623 nova_compute[226235]: 2026-01-31 08:48:19.629 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:19 np0005603623 systemd-udevd[303501]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:48:19 np0005603623 systemd-machined[194379]: New machine qemu-78-instance-000000a3.
Jan 31 03:48:19 np0005603623 NetworkManager[48970]: <info>  [1769849299.6512] device (tapfbe66833-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:48:19 np0005603623 NetworkManager[48970]: <info>  [1769849299.6521] device (tapfbe66833-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:48:19 np0005603623 systemd[1]: Started Virtual Machine qemu-78-instance-000000a3.
Jan 31 03:48:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:20.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:20 np0005603623 nova_compute[226235]: 2026-01-31 08:48:20.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:48:20Z|00686|binding|INFO|Setting lport fbe66833-82a6-4f72-9b11-a4732140845a up in Southbound
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.159 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:4d:37 10.100.0.6'], port_security=['fa:16:3e:d6:4d:37 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c215327f-37ad-41a7-a883-3dbb23334df6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8397e0fed04b4dabb57148d0924de2dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd636f3a4-efef-465a-ac59-8182d61336f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbd2578f-ff6e-4dc3-bc49-93cbf023edc5, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=fbe66833-82a6-4f72-9b11-a4732140845a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.160 143258 INFO neutron.agent.ovn.metadata.agent [-] Port fbe66833-82a6-4f72-9b11-a4732140845a in datapath 3afaf607-43a1-4d65-95fc-0a22b5c901d0 bound to our chassis#033[00m
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.162 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3afaf607-43a1-4d65-95fc-0a22b5c901d0#033[00m
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.172 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a088da58-65a2-43df-9f88-49924bfd99c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.196 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[358720b1-cb5e-4ac8-b79e-011078407427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.198 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[95c217e1-0156-48db-9df0-290f113cefc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:20.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.215 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4fdd02-bb5a-4a21-8acb-776b9e9d1ef7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.229 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[057ad832-c984-419e-ba0c-5c1e08d3f800]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3afaf607-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:84:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 831647, 'reachable_time': 39542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 303558, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.244 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[73bc813a-32d1-4a87-82a3-e412c8cdca0d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3afaf607-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 831657, 'tstamp': 831657}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 303559, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3afaf607-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 831659, 'tstamp': 831659}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 303559, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.246 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3afaf607-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:20 np0005603623 nova_compute[226235]: 2026-01-31 08:48:20.247 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.249 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3afaf607-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.249 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.249 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3afaf607-40, col_values=(('external_ids', {'iface-id': '0ed76a0a-650c-4ec7-a4d4-0e745236b047'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:20.249 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:48:20 np0005603623 nova_compute[226235]: 2026-01-31 08:48:20.254 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849300.25336, c215327f-37ad-41a7-a883-3dbb23334df6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:48:20 np0005603623 nova_compute[226235]: 2026-01-31 08:48:20.254 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] VM Started (Lifecycle Event)#033[00m
Jan 31 03:48:20 np0005603623 nova_compute[226235]: 2026-01-31 08:48:20.299 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:48:20 np0005603623 nova_compute[226235]: 2026-01-31 08:48:20.303 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849300.256087, c215327f-37ad-41a7-a883-3dbb23334df6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:48:20 np0005603623 nova_compute[226235]: 2026-01-31 08:48:20.303 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:48:20 np0005603623 nova_compute[226235]: 2026-01-31 08:48:20.380 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:48:20 np0005603623 nova_compute[226235]: 2026-01-31 08:48:20.383 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:48:20 np0005603623 nova_compute[226235]: 2026-01-31 08:48:20.431 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:48:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:20 np0005603623 nova_compute[226235]: 2026-01-31 08:48:20.543 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:20 np0005603623 nova_compute[226235]: 2026-01-31 08:48:20.703 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.459 226239 DEBUG nova.compute.manager [req-a41fefd3-33a2-41cc-8f55-ae7e65a0c3f8 req-35d07a88-c677-43c1-8d0a-5b4d674b4cef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received event network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.459 226239 DEBUG oslo_concurrency.lockutils [req-a41fefd3-33a2-41cc-8f55-ae7e65a0c3f8 req-35d07a88-c677-43c1-8d0a-5b4d674b4cef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.460 226239 DEBUG oslo_concurrency.lockutils [req-a41fefd3-33a2-41cc-8f55-ae7e65a0c3f8 req-35d07a88-c677-43c1-8d0a-5b4d674b4cef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.460 226239 DEBUG oslo_concurrency.lockutils [req-a41fefd3-33a2-41cc-8f55-ae7e65a0c3f8 req-35d07a88-c677-43c1-8d0a-5b4d674b4cef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.460 226239 DEBUG nova.compute.manager [req-a41fefd3-33a2-41cc-8f55-ae7e65a0c3f8 req-35d07a88-c677-43c1-8d0a-5b4d674b4cef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Processing event network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.461 226239 DEBUG nova.compute.manager [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.463 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849301.463728, c215327f-37ad-41a7-a883-3dbb23334df6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.464 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] VM Resumed (Lifecycle Event)
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.465 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.468 226239 INFO nova.virt.libvirt.driver [-] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Instance spawned successfully.
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.468 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.568 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.575 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.579 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.579 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.580 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.580 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.580 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.581 226239 DEBUG nova.virt.libvirt.driver [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.643 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.741 226239 INFO nova.compute.manager [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Took 19.37 seconds to spawn the instance on the hypervisor.
Jan 31 03:48:21 np0005603623 nova_compute[226235]: 2026-01-31 08:48:21.741 226239 DEBUG nova.compute.manager [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:48:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:22.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:22.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:22 np0005603623 nova_compute[226235]: 2026-01-31 08:48:22.428 226239 INFO nova.compute.manager [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Took 21.97 seconds to build instance.
Jan 31 03:48:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:22.989 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:48:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:24.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:24 np0005603623 nova_compute[226235]: 2026-01-31 08:48:24.160 226239 DEBUG oslo_concurrency.lockutils [None req-aaf2c778-dbed-401c-9d91-cbd0e71277a6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:48:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:24.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:24 np0005603623 nova_compute[226235]: 2026-01-31 08:48:24.407 226239 DEBUG nova.compute.manager [req-27812b62-7502-4ab2-b105-ad0613338cd5 req-36c5d877-3876-426f-89ea-b6e396b34e48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received event network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:48:24 np0005603623 nova_compute[226235]: 2026-01-31 08:48:24.408 226239 DEBUG oslo_concurrency.lockutils [req-27812b62-7502-4ab2-b105-ad0613338cd5 req-36c5d877-3876-426f-89ea-b6e396b34e48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:48:24 np0005603623 nova_compute[226235]: 2026-01-31 08:48:24.408 226239 DEBUG oslo_concurrency.lockutils [req-27812b62-7502-4ab2-b105-ad0613338cd5 req-36c5d877-3876-426f-89ea-b6e396b34e48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:48:24 np0005603623 nova_compute[226235]: 2026-01-31 08:48:24.408 226239 DEBUG oslo_concurrency.lockutils [req-27812b62-7502-4ab2-b105-ad0613338cd5 req-36c5d877-3876-426f-89ea-b6e396b34e48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:48:24 np0005603623 nova_compute[226235]: 2026-01-31 08:48:24.409 226239 DEBUG nova.compute.manager [req-27812b62-7502-4ab2-b105-ad0613338cd5 req-36c5d877-3876-426f-89ea-b6e396b34e48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] No waiting events found dispatching network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:48:24 np0005603623 nova_compute[226235]: 2026-01-31 08:48:24.409 226239 WARNING nova.compute.manager [req-27812b62-7502-4ab2-b105-ad0613338cd5 req-36c5d877-3876-426f-89ea-b6e396b34e48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received unexpected event network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a for instance with vm_state active and task_state None.
Jan 31 03:48:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:25 np0005603623 nova_compute[226235]: 2026-01-31 08:48:25.544 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:25 np0005603623 nova_compute[226235]: 2026-01-31 08:48:25.705 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:25 np0005603623 podman[303564]: 2026-01-31 08:48:25.975141445 +0000 UTC m=+0.053318862 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:48:26 np0005603623 podman[303565]: 2026-01-31 08:48:26.024129603 +0000 UTC m=+0.100780593 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:48:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:26.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:26.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:28.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:28.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:30.139 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:48:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:30.140 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:48:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:30.140 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:48:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:30.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:30.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:30 np0005603623 nova_compute[226235]: 2026-01-31 08:48:30.546 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:30 np0005603623 nova_compute[226235]: 2026-01-31 08:48:30.706 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:48:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:32.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:48:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:32.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:34.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:34.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:35 np0005603623 nova_compute[226235]: 2026-01-31 08:48:35.568 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:35 np0005603623 nova_compute[226235]: 2026-01-31 08:48:35.707 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:36 np0005603623 ovn_controller[133449]: 2026-01-31T08:48:36Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d6:4d:37 10.100.0.6
Jan 31 03:48:36 np0005603623 ovn_controller[133449]: 2026-01-31T08:48:36Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d6:4d:37 10.100.0.6
Jan 31 03:48:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:36.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:36.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:48:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:38.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:48:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:48:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:38.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:48:38 np0005603623 nova_compute[226235]: 2026-01-31 08:48:38.486 226239 DEBUG nova.compute.manager [req-aa40d134-0c27-4271-9553-e40f9fca3690 req-ef3adc77-4e19-4f0e-9cfd-7dc1715b1be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received event network-changed-fbe66833-82a6-4f72-9b11-a4732140845a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:48:38 np0005603623 nova_compute[226235]: 2026-01-31 08:48:38.486 226239 DEBUG nova.compute.manager [req-aa40d134-0c27-4271-9553-e40f9fca3690 req-ef3adc77-4e19-4f0e-9cfd-7dc1715b1be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Refreshing instance network info cache due to event network-changed-fbe66833-82a6-4f72-9b11-a4732140845a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:48:38 np0005603623 nova_compute[226235]: 2026-01-31 08:48:38.487 226239 DEBUG oslo_concurrency.lockutils [req-aa40d134-0c27-4271-9553-e40f9fca3690 req-ef3adc77-4e19-4f0e-9cfd-7dc1715b1be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:48:38 np0005603623 nova_compute[226235]: 2026-01-31 08:48:38.487 226239 DEBUG oslo_concurrency.lockutils [req-aa40d134-0c27-4271-9553-e40f9fca3690 req-ef3adc77-4e19-4f0e-9cfd-7dc1715b1be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:48:38 np0005603623 nova_compute[226235]: 2026-01-31 08:48:38.487 226239 DEBUG nova.network.neutron [req-aa40d134-0c27-4271-9553-e40f9fca3690 req-ef3adc77-4e19-4f0e-9cfd-7dc1715b1be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Refreshing network info cache for port fbe66833-82a6-4f72-9b11-a4732140845a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:48:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:40.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:40.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:40 np0005603623 nova_compute[226235]: 2026-01-31 08:48:40.569 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:40 np0005603623 nova_compute[226235]: 2026-01-31 08:48:40.709 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:42.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:42.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:48:42Z|00687|binding|INFO|Releasing lport 5976b74a-78ce-46e1-bd2c-76a2a502c8f5 from this chassis (sb_readonly=0)
Jan 31 03:48:42 np0005603623 ovn_controller[133449]: 2026-01-31T08:48:42Z|00688|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:48:42 np0005603623 nova_compute[226235]: 2026-01-31 08:48:42.668 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:43 np0005603623 nova_compute[226235]: 2026-01-31 08:48:43.383 226239 DEBUG nova.network.neutron [req-aa40d134-0c27-4271-9553-e40f9fca3690 req-ef3adc77-4e19-4f0e-9cfd-7dc1715b1be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updated VIF entry in instance network info cache for port fbe66833-82a6-4f72-9b11-a4732140845a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 03:48:43 np0005603623 nova_compute[226235]: 2026-01-31 08:48:43.384 226239 DEBUG nova.network.neutron [req-aa40d134-0c27-4271-9553-e40f9fca3690 req-ef3adc77-4e19-4f0e-9cfd-7dc1715b1be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updating instance_info_cache with network_info: [{"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:48:43 np0005603623 nova_compute[226235]: 2026-01-31 08:48:43.803 226239 DEBUG oslo_concurrency.lockutils [req-aa40d134-0c27-4271-9553-e40f9fca3690 req-ef3adc77-4e19-4f0e-9cfd-7dc1715b1be3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:48:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:44.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:44.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:45 np0005603623 nova_compute[226235]: 2026-01-31 08:48:45.572 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:45 np0005603623 nova_compute[226235]: 2026-01-31 08:48:45.712 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:46.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:46.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:48.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:48.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:50.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:50.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:50 np0005603623 nova_compute[226235]: 2026-01-31 08:48:50.574 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:50 np0005603623 nova_compute[226235]: 2026-01-31 08:48:50.714 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:52.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:52.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:54.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:54.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:54.259 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:48:54 np0005603623 nova_compute[226235]: 2026-01-31 08:48:54.260 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:54.260 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:48:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:48:54.261 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:55 np0005603623 nova_compute[226235]: 2026-01-31 08:48:55.577 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:55 np0005603623 nova_compute[226235]: 2026-01-31 08:48:55.715 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:56.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:48:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:56.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:48:56 np0005603623 podman[303724]: 2026-01-31 08:48:56.948286835 +0000 UTC m=+0.045433246 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127)
Jan 31 03:48:56 np0005603623 podman[303725]: 2026-01-31 08:48:56.969211311 +0000 UTC m=+0.066168226 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Jan 31 03:48:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:48:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:58.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:48:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:48:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:58.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:59 np0005603623 nova_compute[226235]: 2026-01-31 08:48:59.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:59 np0005603623 nova_compute[226235]: 2026-01-31 08:48:59.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:49:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:00.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:00.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:00 np0005603623 nova_compute[226235]: 2026-01-31 08:49:00.579 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:00 np0005603623 nova_compute[226235]: 2026-01-31 08:49:00.717 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:02.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:02.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.196 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.197 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.197 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.197 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.198 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:49:03 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1986605316' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.614 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.729 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.729 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.732 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.732 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.735 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.735 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.736 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.866 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.867 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3649MB free_disk=20.673076629638672GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.868 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:03 np0005603623 nova_compute[226235]: 2026-01-31 08:49:03.868 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:04 np0005603623 nova_compute[226235]: 2026-01-31 08:49:04.155 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0edbf2b9-b76f-446b-85fa-09a4dcb37976 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:49:04 np0005603623 nova_compute[226235]: 2026-01-31 08:49:04.156 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance da4e355d-c6c2-446e-8eb1-d2ca8279e549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:49:04 np0005603623 nova_compute[226235]: 2026-01-31 08:49:04.156 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance c215327f-37ad-41a7-a883-3dbb23334df6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:49:04 np0005603623 nova_compute[226235]: 2026-01-31 08:49:04.156 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:49:04 np0005603623 nova_compute[226235]: 2026-01-31 08:49:04.156 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:49:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:04.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:04.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:04 np0005603623 nova_compute[226235]: 2026-01-31 08:49:04.449 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:49:04 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2739606521' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:49:04 np0005603623 nova_compute[226235]: 2026-01-31 08:49:04.866 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:04 np0005603623 nova_compute[226235]: 2026-01-31 08:49:04.870 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:49:04 np0005603623 nova_compute[226235]: 2026-01-31 08:49:04.905 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:49:04 np0005603623 nova_compute[226235]: 2026-01-31 08:49:04.946 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:49:04 np0005603623 nova_compute[226235]: 2026-01-31 08:49:04.946 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:05 np0005603623 nova_compute[226235]: 2026-01-31 08:49:05.582 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:05 np0005603623 nova_compute[226235]: 2026-01-31 08:49:05.718 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:06.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:06.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:06 np0005603623 nova_compute[226235]: 2026-01-31 08:49:06.317 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:49:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:49:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:49:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:49:07 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:49:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:49:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:08.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:49:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:08.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:09 np0005603623 nova_compute[226235]: 2026-01-31 08:49:09.946 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:09 np0005603623 nova_compute[226235]: 2026-01-31 08:49:09.946 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:09 np0005603623 nova_compute[226235]: 2026-01-31 08:49:09.946 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:49:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:10.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:10.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:10 np0005603623 nova_compute[226235]: 2026-01-31 08:49:10.583 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:10 np0005603623 nova_compute[226235]: 2026-01-31 08:49:10.720 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:11 np0005603623 nova_compute[226235]: 2026-01-31 08:49:11.254 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:49:11 np0005603623 nova_compute[226235]: 2026-01-31 08:49:11.254 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:49:11 np0005603623 nova_compute[226235]: 2026-01-31 08:49:11.255 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:49:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:12.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:12.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:14.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:14.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:15 np0005603623 nova_compute[226235]: 2026-01-31 08:49:15.586 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:15 np0005603623 nova_compute[226235]: 2026-01-31 08:49:15.721 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:16.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:16.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:16 np0005603623 nova_compute[226235]: 2026-01-31 08:49:16.694 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Updating instance_info_cache with network_info: [{"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:49:16 np0005603623 nova_compute[226235]: 2026-01-31 08:49:16.727 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:49:16 np0005603623 nova_compute[226235]: 2026-01-31 08:49:16.728 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:49:16 np0005603623 nova_compute[226235]: 2026-01-31 08:49:16.728 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:16 np0005603623 nova_compute[226235]: 2026-01-31 08:49:16.728 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:16 np0005603623 nova_compute[226235]: 2026-01-31 08:49:16.728 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:16 np0005603623 nova_compute[226235]: 2026-01-31 08:49:16.728 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:16 np0005603623 nova_compute[226235]: 2026-01-31 08:49:16.985 226239 DEBUG nova.compute.manager [req-af6b7497-d228-4783-bd08-50b05a3e4129 req-03ae055a-9781-4e39-aed9-c52fc46ba100 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received event network-changed-fbe66833-82a6-4f72-9b11-a4732140845a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:16 np0005603623 nova_compute[226235]: 2026-01-31 08:49:16.985 226239 DEBUG nova.compute.manager [req-af6b7497-d228-4783-bd08-50b05a3e4129 req-03ae055a-9781-4e39-aed9-c52fc46ba100 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Refreshing instance network info cache due to event network-changed-fbe66833-82a6-4f72-9b11-a4732140845a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:49:16 np0005603623 nova_compute[226235]: 2026-01-31 08:49:16.985 226239 DEBUG oslo_concurrency.lockutils [req-af6b7497-d228-4783-bd08-50b05a3e4129 req-03ae055a-9781-4e39-aed9-c52fc46ba100 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:49:16 np0005603623 nova_compute[226235]: 2026-01-31 08:49:16.986 226239 DEBUG oslo_concurrency.lockutils [req-af6b7497-d228-4783-bd08-50b05a3e4129 req-03ae055a-9781-4e39-aed9-c52fc46ba100 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:49:16 np0005603623 nova_compute[226235]: 2026-01-31 08:49:16.986 226239 DEBUG nova.network.neutron [req-af6b7497-d228-4783-bd08-50b05a3e4129 req-03ae055a-9781-4e39-aed9-c52fc46ba100 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Refreshing network info cache for port fbe66833-82a6-4f72-9b11-a4732140845a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:49:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:18.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:18.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:19 np0005603623 nova_compute[226235]: 2026-01-31 08:49:19.188 226239 DEBUG nova.network.neutron [req-af6b7497-d228-4783-bd08-50b05a3e4129 req-03ae055a-9781-4e39-aed9-c52fc46ba100 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updated VIF entry in instance network info cache for port fbe66833-82a6-4f72-9b11-a4732140845a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:49:19 np0005603623 nova_compute[226235]: 2026-01-31 08:49:19.189 226239 DEBUG nova.network.neutron [req-af6b7497-d228-4783-bd08-50b05a3e4129 req-03ae055a-9781-4e39-aed9-c52fc46ba100 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updating instance_info_cache with network_info: [{"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:49:19 np0005603623 nova_compute[226235]: 2026-01-31 08:49:19.531 226239 DEBUG oslo_concurrency.lockutils [req-af6b7497-d228-4783-bd08-50b05a3e4129 req-03ae055a-9781-4e39-aed9-c52fc46ba100 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:49:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:20.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:20.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:20 np0005603623 nova_compute[226235]: 2026-01-31 08:49:20.587 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:20 np0005603623 nova_compute[226235]: 2026-01-31 08:49:20.724 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:20 np0005603623 nova_compute[226235]: 2026-01-31 08:49:20.848 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:22 np0005603623 nova_compute[226235]: 2026-01-31 08:49:22.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:22 np0005603623 nova_compute[226235]: 2026-01-31 08:49:22.200 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:22.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:22.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:22 np0005603623 nova_compute[226235]: 2026-01-31 08:49:22.424 226239 DEBUG oslo_concurrency.lockutils [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "c215327f-37ad-41a7-a883-3dbb23334df6" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:22 np0005603623 nova_compute[226235]: 2026-01-31 08:49:22.425 226239 DEBUG oslo_concurrency.lockutils [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:22 np0005603623 nova_compute[226235]: 2026-01-31 08:49:22.497 226239 DEBUG nova.objects.instance [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'flavor' on Instance uuid c215327f-37ad-41a7-a883-3dbb23334df6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:49:22 np0005603623 nova_compute[226235]: 2026-01-31 08:49:22.603 226239 DEBUG oslo_concurrency.lockutils [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.122 226239 DEBUG oslo_concurrency.lockutils [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "c215327f-37ad-41a7-a883-3dbb23334df6" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.122 226239 DEBUG oslo_concurrency.lockutils [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.122 226239 INFO nova.compute.manager [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Attaching volume 12e9d9b2-8ec9-4b16-b334-60c0f639cb59 to /dev/vdb#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.430 226239 DEBUG os_brick.utils [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.432 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.439 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.440 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[260d2a59-3653-4911-91ae-3770d1500210]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.441 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.445 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.004s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.445 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[62dc33aa-f348-4b5e-af05-432de0c0803c]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.447 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.455 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.455 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[01125045-4843-42e1-b13a-d26c2f804209]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.456 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[1dcc69f3-592f-46fa-b5f4-09e59c116c08]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.457 226239 DEBUG oslo_concurrency.processutils [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.473 226239 DEBUG oslo_concurrency.processutils [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "nvme version" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.475 226239 DEBUG os_brick.initiator.connectors.lightos [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.476 226239 DEBUG os_brick.initiator.connectors.lightos [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.476 226239 DEBUG os_brick.initiator.connectors.lightos [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.476 226239 DEBUG os_brick.utils [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] <== get_connector_properties: return (45ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:49:23 np0005603623 nova_compute[226235]: 2026-01-31 08:49:23.477 226239 DEBUG nova.virt.block_device [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updating existing volume attachment record: 94195552-eaf2-4abd-8796-9ad3f1747392 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:49:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:49:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:49:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:24.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:49:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:24.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:49:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/951411946' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:49:25 np0005603623 nova_compute[226235]: 2026-01-31 08:49:25.457 226239 DEBUG nova.objects.instance [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'flavor' on Instance uuid c215327f-37ad-41a7-a883-3dbb23334df6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:49:25 np0005603623 nova_compute[226235]: 2026-01-31 08:49:25.555 226239 DEBUG nova.virt.libvirt.driver [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Attempting to attach volume 12e9d9b2-8ec9-4b16-b334-60c0f639cb59 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:49:25 np0005603623 nova_compute[226235]: 2026-01-31 08:49:25.558 226239 DEBUG nova.virt.libvirt.guest [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:49:25 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:49:25 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-12e9d9b2-8ec9-4b16-b334-60c0f639cb59">
Jan 31 03:49:25 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:49:25 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:49:25 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:49:25 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:49:25 np0005603623 nova_compute[226235]:  <auth username="openstack">
Jan 31 03:49:25 np0005603623 nova_compute[226235]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:49:25 np0005603623 nova_compute[226235]:  </auth>
Jan 31 03:49:25 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:49:25 np0005603623 nova_compute[226235]:  <serial>12e9d9b2-8ec9-4b16-b334-60c0f639cb59</serial>
Jan 31 03:49:25 np0005603623 nova_compute[226235]:  <shareable/>
Jan 31 03:49:25 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:49:25 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:49:25 np0005603623 nova_compute[226235]: 2026-01-31 08:49:25.589 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:25 np0005603623 nova_compute[226235]: 2026-01-31 08:49:25.726 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:25 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:49:25 np0005603623 nova_compute[226235]: 2026-01-31 08:49:25.935 226239 DEBUG nova.virt.libvirt.driver [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:49:25 np0005603623 nova_compute[226235]: 2026-01-31 08:49:25.935 226239 DEBUG nova.virt.libvirt.driver [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:49:25 np0005603623 nova_compute[226235]: 2026-01-31 08:49:25.935 226239 DEBUG nova.virt.libvirt.driver [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:49:25 np0005603623 nova_compute[226235]: 2026-01-31 08:49:25.936 226239 DEBUG nova.virt.libvirt.driver [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No VIF found with MAC fa:16:3e:d6:4d:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:49:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:26.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:26.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:26 np0005603623 nova_compute[226235]: 2026-01-31 08:49:26.415 226239 DEBUG oslo_concurrency.lockutils [None req-5a9f3376-1f20-4237-9516-1129a76457f2 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:27 np0005603623 podman[304092]: 2026-01-31 08:49:27.952298163 +0000 UTC m=+0.043615519 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:49:28 np0005603623 podman[304093]: 2026-01-31 08:49:28.005269745 +0000 UTC m=+0.093657549 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 31 03:49:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:28.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:28.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:30.140 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:30.141 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:30.141 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:30.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:30.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:30 np0005603623 nova_compute[226235]: 2026-01-31 08:49:30.590 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:30 np0005603623 nova_compute[226235]: 2026-01-31 08:49:30.728 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:32.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:32.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:33.398 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:49:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:33.399 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:49:33 np0005603623 nova_compute[226235]: 2026-01-31 08:49:33.406 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:34.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:34.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:34.402 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:35 np0005603623 nova_compute[226235]: 2026-01-31 08:49:35.633 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:35 np0005603623 nova_compute[226235]: 2026-01-31 08:49:35.729 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:36.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:36.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:36 np0005603623 nova_compute[226235]: 2026-01-31 08:49:36.579 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "07ab810a-aca3-4084-abb7-b092f658255b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:36 np0005603623 nova_compute[226235]: 2026-01-31 08:49:36.580 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:36 np0005603623 nova_compute[226235]: 2026-01-31 08:49:36.625 226239 DEBUG nova.compute.manager [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:49:36 np0005603623 nova_compute[226235]: 2026-01-31 08:49:36.860 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:36 np0005603623 nova_compute[226235]: 2026-01-31 08:49:36.861 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:36 np0005603623 nova_compute[226235]: 2026-01-31 08:49:36.868 226239 DEBUG nova.virt.hardware [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:49:36 np0005603623 nova_compute[226235]: 2026-01-31 08:49:36.868 226239 INFO nova.compute.claims [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:49:37 np0005603623 nova_compute[226235]: 2026-01-31 08:49:37.226 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:49:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2067848571' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:49:37 np0005603623 nova_compute[226235]: 2026-01-31 08:49:37.677 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:37 np0005603623 nova_compute[226235]: 2026-01-31 08:49:37.682 226239 DEBUG nova.compute.provider_tree [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:49:37 np0005603623 nova_compute[226235]: 2026-01-31 08:49:37.711 226239 DEBUG nova.scheduler.client.report [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:49:37 np0005603623 nova_compute[226235]: 2026-01-31 08:49:37.782 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.922s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:37 np0005603623 nova_compute[226235]: 2026-01-31 08:49:37.783 226239 DEBUG nova.compute.manager [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:49:37 np0005603623 nova_compute[226235]: 2026-01-31 08:49:37.921 226239 DEBUG nova.compute.manager [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:49:37 np0005603623 nova_compute[226235]: 2026-01-31 08:49:37.921 226239 DEBUG nova.network.neutron [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:49:37 np0005603623 nova_compute[226235]: 2026-01-31 08:49:37.990 226239 INFO nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.052 226239 DEBUG nova.compute.manager [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.220 226239 DEBUG nova.compute.manager [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.221 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.221 226239 INFO nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Creating image(s)#033[00m
Jan 31 03:49:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:38.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.241 226239 DEBUG nova.storage.rbd_utils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 07ab810a-aca3-4084-abb7-b092f658255b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.262 226239 DEBUG nova.storage.rbd_utils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 07ab810a-aca3-4084-abb7-b092f658255b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.284 226239 DEBUG nova.storage.rbd_utils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 07ab810a-aca3-4084-abb7-b092f658255b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.289 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:38.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.343 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.344 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.344 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.345 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #145. Immutable memtables: 0.
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.358197) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 91] Flushing memtable with next log file: 145
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849378358369, "job": 91, "event": "flush_started", "num_memtables": 1, "num_entries": 1270, "num_deletes": 251, "total_data_size": 2736284, "memory_usage": 2776704, "flush_reason": "Manual Compaction"}
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 91] Level-0 flush table #146: started
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.380 226239 DEBUG nova.storage.rbd_utils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 07ab810a-aca3-4084-abb7-b092f658255b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849378382917, "cf_name": "default", "job": 91, "event": "table_file_creation", "file_number": 146, "file_size": 1794342, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 70529, "largest_seqno": 71794, "table_properties": {"data_size": 1788844, "index_size": 2893, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12368, "raw_average_key_size": 20, "raw_value_size": 1777646, "raw_average_value_size": 2899, "num_data_blocks": 127, "num_entries": 613, "num_filter_entries": 613, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849276, "oldest_key_time": 1769849276, "file_creation_time": 1769849378, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 146, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 91] Flush lasted 24793 microseconds, and 3717 cpu microseconds.
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.385 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 07ab810a-aca3-4084-abb7-b092f658255b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.382992) [db/flush_job.cc:967] [default] [JOB 91] Level-0 flush table #146: 1794342 bytes OK
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.383022) [db/memtable_list.cc:519] [default] Level-0 commit table #146 started
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.387368) [db/memtable_list.cc:722] [default] Level-0 commit table #146: memtable #1 done
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.387420) EVENT_LOG_v1 {"time_micros": 1769849378387409, "job": 91, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.387471) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 91] Try to delete WAL files size 2730179, prev total WAL file size 2730179, number of live WAL files 2.
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000142.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.388191) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036303234' seq:72057594037927935, type:22 .. '7061786F730036323736' seq:0, type:0; will stop at (end)
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 92] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 91 Base level 0, inputs: [146(1752KB)], [144(10MB)]
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849378388244, "job": 92, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [146], "files_L6": [144], "score": -1, "input_data_size": 13025042, "oldest_snapshot_seqno": -1}
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 92] Generated table #147: 9064 keys, 11040790 bytes, temperature: kUnknown
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849378531911, "cf_name": "default", "job": 92, "event": "table_file_creation", "file_number": 147, "file_size": 11040790, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10983384, "index_size": 33613, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22725, "raw_key_size": 239889, "raw_average_key_size": 26, "raw_value_size": 10825646, "raw_average_value_size": 1194, "num_data_blocks": 1276, "num_entries": 9064, "num_filter_entries": 9064, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769849378, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 147, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.532196) [db/compaction/compaction_job.cc:1663] [default] [JOB 92] Compacted 1@0 + 1@6 files to L6 => 11040790 bytes
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.567578) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 90.6 rd, 76.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 10.7 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(13.4) write-amplify(6.2) OK, records in: 9581, records dropped: 517 output_compression: NoCompression
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.567621) EVENT_LOG_v1 {"time_micros": 1769849378567604, "job": 92, "event": "compaction_finished", "compaction_time_micros": 143765, "compaction_time_cpu_micros": 24252, "output_level": 6, "num_output_files": 1, "total_output_size": 11040790, "num_input_records": 9581, "num_output_records": 9064, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000146.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849378568055, "job": 92, "event": "table_file_deletion", "file_number": 146}
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000144.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849378569326, "job": 92, "event": "table_file_deletion", "file_number": 144}
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.388101) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.569408) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.569413) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.569415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.569418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:49:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:49:38.569420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.928 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 07ab810a-aca3-4084-abb7-b092f658255b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.543s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:49:38 np0005603623 nova_compute[226235]: 2026-01-31 08:49:38.983 226239 DEBUG nova.storage.rbd_utils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] resizing rbd image 07ab810a-aca3-4084-abb7-b092f658255b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:49:39 np0005603623 nova_compute[226235]: 2026-01-31 08:49:39.060 226239 DEBUG nova.policy [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3bd4ce8a916a4bdbbc988eb4fe32991e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76fb5cb7abcd4d74abfc471a96bbd12c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 03:49:39 np0005603623 nova_compute[226235]: 2026-01-31 08:49:39.111 226239 DEBUG nova.objects.instance [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lazy-loading 'migration_context' on Instance uuid 07ab810a-aca3-4084-abb7-b092f658255b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:49:39 np0005603623 nova_compute[226235]: 2026-01-31 08:49:39.160 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:49:39 np0005603623 nova_compute[226235]: 2026-01-31 08:49:39.161 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Ensure instance console log exists: /var/lib/nova/instances/07ab810a-aca3-4084-abb7-b092f658255b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:49:39 np0005603623 nova_compute[226235]: 2026-01-31 08:49:39.162 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:49:39 np0005603623 nova_compute[226235]: 2026-01-31 08:49:39.162 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:49:39 np0005603623 nova_compute[226235]: 2026-01-31 08:49:39.162 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:49:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:40.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:40.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:40 np0005603623 nova_compute[226235]: 2026-01-31 08:49:40.636 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:49:40 np0005603623 nova_compute[226235]: 2026-01-31 08:49:40.731 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:49:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:42 np0005603623 nova_compute[226235]: 2026-01-31 08:49:42.224 226239 DEBUG nova.network.neutron [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Successfully created port: 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:49:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:42.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:42.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:43 np0005603623 nova_compute[226235]: 2026-01-31 08:49:43.404 226239 DEBUG oslo_concurrency.lockutils [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:49:43 np0005603623 nova_compute[226235]: 2026-01-31 08:49:43.404 226239 DEBUG oslo_concurrency.lockutils [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquired lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:49:43 np0005603623 nova_compute[226235]: 2026-01-31 08:49:43.405 226239 DEBUG nova.network.neutron [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:49:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:44.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:49:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:44.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:49:45 np0005603623 nova_compute[226235]: 2026-01-31 08:49:45.638 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:49:45 np0005603623 nova_compute[226235]: 2026-01-31 08:49:45.732 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:49:45 np0005603623 nova_compute[226235]: 2026-01-31 08:49:45.941 226239 DEBUG nova.network.neutron [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Successfully updated port: 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:49:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:46.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:46.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:47 np0005603623 nova_compute[226235]: 2026-01-31 08:49:47.249 226239 DEBUG nova.network.neutron [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updating instance_info_cache with network_info: [{"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:49:47 np0005603623 nova_compute[226235]: 2026-01-31 08:49:47.643 226239 DEBUG nova.compute.manager [req-17794f51-9aa0-4764-8d7c-bce14557501a req-67546d6f-19b2-4d59-a101-6631a2f58b0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Received event network-changed-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:49:47 np0005603623 nova_compute[226235]: 2026-01-31 08:49:47.643 226239 DEBUG nova.compute.manager [req-17794f51-9aa0-4764-8d7c-bce14557501a req-67546d6f-19b2-4d59-a101-6631a2f58b0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Refreshing instance network info cache due to event network-changed-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:49:47 np0005603623 nova_compute[226235]: 2026-01-31 08:49:47.643 226239 DEBUG oslo_concurrency.lockutils [req-17794f51-9aa0-4764-8d7c-bce14557501a req-67546d6f-19b2-4d59-a101-6631a2f58b0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-07ab810a-aca3-4084-abb7-b092f658255b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:49:47 np0005603623 nova_compute[226235]: 2026-01-31 08:49:47.644 226239 DEBUG oslo_concurrency.lockutils [req-17794f51-9aa0-4764-8d7c-bce14557501a req-67546d6f-19b2-4d59-a101-6631a2f58b0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-07ab810a-aca3-4084-abb7-b092f658255b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:49:47 np0005603623 nova_compute[226235]: 2026-01-31 08:49:47.644 226239 DEBUG nova.network.neutron [req-17794f51-9aa0-4764-8d7c-bce14557501a req-67546d6f-19b2-4d59-a101-6631a2f58b0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Refreshing network info cache for port 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:49:47 np0005603623 nova_compute[226235]: 2026-01-31 08:49:47.645 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "refresh_cache-07ab810a-aca3-4084-abb7-b092f658255b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:49:47 np0005603623 nova_compute[226235]: 2026-01-31 08:49:47.669 226239 DEBUG oslo_concurrency.lockutils [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Releasing lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:49:47 np0005603623 nova_compute[226235]: 2026-01-31 08:49:47.943 226239 DEBUG nova.virt.libvirt.driver [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 31 03:49:47 np0005603623 nova_compute[226235]: 2026-01-31 08:49:47.944 226239 DEBUG nova.virt.libvirt.volume.remotefs [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Creating file /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6/fa58486748c04cf5a326feb984bb30f1.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 31 03:49:47 np0005603623 nova_compute[226235]: 2026-01-31 08:49:47.944 226239 DEBUG oslo_concurrency.processutils [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6/fa58486748c04cf5a326feb984bb30f1.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:49:47 np0005603623 nova_compute[226235]: 2026-01-31 08:49:47.975 226239 DEBUG nova.network.neutron [req-17794f51-9aa0-4764-8d7c-bce14557501a req-67546d6f-19b2-4d59-a101-6631a2f58b0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:49:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:48.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:48 np0005603623 nova_compute[226235]: 2026-01-31 08:49:48.295 226239 DEBUG oslo_concurrency.processutils [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6/fa58486748c04cf5a326feb984bb30f1.tmp" returned: 1 in 0.351s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:49:48 np0005603623 nova_compute[226235]: 2026-01-31 08:49:48.296 226239 DEBUG oslo_concurrency.processutils [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6/fa58486748c04cf5a326feb984bb30f1.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 31 03:49:48 np0005603623 nova_compute[226235]: 2026-01-31 08:49:48.296 226239 DEBUG nova.virt.libvirt.volume.remotefs [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Creating directory /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 31 03:49:48 np0005603623 nova_compute[226235]: 2026-01-31 08:49:48.296 226239 DEBUG oslo_concurrency.processutils [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:49:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:48.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:48 np0005603623 nova_compute[226235]: 2026-01-31 08:49:48.478 226239 DEBUG oslo_concurrency.processutils [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/c215327f-37ad-41a7-a883-3dbb23334df6" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:48 np0005603623 nova_compute[226235]: 2026-01-31 08:49:48.484 226239 DEBUG nova.virt.libvirt.driver [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:49:49 np0005603623 nova_compute[226235]: 2026-01-31 08:49:49.126 226239 DEBUG nova.network.neutron [req-17794f51-9aa0-4764-8d7c-bce14557501a req-67546d6f-19b2-4d59-a101-6631a2f58b0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:49:49 np0005603623 nova_compute[226235]: 2026-01-31 08:49:49.165 226239 DEBUG oslo_concurrency.lockutils [req-17794f51-9aa0-4764-8d7c-bce14557501a req-67546d6f-19b2-4d59-a101-6631a2f58b0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-07ab810a-aca3-4084-abb7-b092f658255b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:49:49 np0005603623 nova_compute[226235]: 2026-01-31 08:49:49.165 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquired lock "refresh_cache-07ab810a-aca3-4084-abb7-b092f658255b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:49:49 np0005603623 nova_compute[226235]: 2026-01-31 08:49:49.166 226239 DEBUG nova.network.neutron [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:49:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:50.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:50.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:50 np0005603623 nova_compute[226235]: 2026-01-31 08:49:50.672 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:50 np0005603623 nova_compute[226235]: 2026-01-31 08:49:50.734 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:50 np0005603623 nova_compute[226235]: 2026-01-31 08:49:50.964 226239 DEBUG nova.network.neutron [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:49:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:52.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:52.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:52 np0005603623 kernel: tapfbe66833-82 (unregistering): left promiscuous mode
Jan 31 03:49:52 np0005603623 NetworkManager[48970]: <info>  [1769849392.4665] device (tapfbe66833-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:49:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:49:52Z|00689|binding|INFO|Releasing lport fbe66833-82a6-4f72-9b11-a4732140845a from this chassis (sb_readonly=0)
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.473 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:49:52Z|00690|binding|INFO|Setting lport fbe66833-82a6-4f72-9b11-a4732140845a down in Southbound
Jan 31 03:49:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:49:52Z|00691|binding|INFO|Removing iface tapfbe66833-82 ovn-installed in OVS
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.477 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.482 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.503 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:4d:37 10.100.0.6'], port_security=['fa:16:3e:d6:4d:37 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c215327f-37ad-41a7-a883-3dbb23334df6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8397e0fed04b4dabb57148d0924de2dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd636f3a4-efef-465a-ac59-8182d61336f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbd2578f-ff6e-4dc3-bc49-93cbf023edc5, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=fbe66833-82a6-4f72-9b11-a4732140845a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.505 143258 INFO neutron.agent.ovn.metadata.agent [-] Port fbe66833-82a6-4f72-9b11-a4732140845a in datapath 3afaf607-43a1-4d65-95fc-0a22b5c901d0 unbound from our chassis#033[00m
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.507 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3afaf607-43a1-4d65-95fc-0a22b5c901d0#033[00m
Jan 31 03:49:52 np0005603623 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a3.scope: Deactivated successfully.
Jan 31 03:49:52 np0005603623 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a3.scope: Consumed 16.732s CPU time.
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.521 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca5bf9d-f877-4f1b-b1df-7a9176399f1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:52 np0005603623 systemd-machined[194379]: Machine qemu-78-instance-000000a3 terminated.
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.541 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e4e0ccda-8dcf-48f6-951e-f02b95c09dfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.546 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ff495050-1684-4850-9040-f542c3c148c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.564 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[0e3a625c-0062-4c43-99b3-c33f62dd33ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.582 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[80c14755-085b-41ad-b8bf-410e97ba9f9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3afaf607-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:84:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 831647, 'reachable_time': 39542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304403, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.595 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[78448630-2fdf-44b9-a2e7-83e03464fc73]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3afaf607-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 831657, 'tstamp': 831657}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304404, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3afaf607-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 831659, 'tstamp': 831659}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304404, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.597 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3afaf607-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.598 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.602 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3afaf607-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.602 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.602 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.602 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3afaf607-40, col_values=(('external_ids', {'iface-id': '0ed76a0a-650c-4ec7-a4d4-0e745236b047'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:52.603 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.698 226239 INFO nova.virt.libvirt.driver [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Instance shutdown successfully after 4 seconds.#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.702 226239 INFO nova.virt.libvirt.driver [-] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Instance destroyed successfully.#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.703 226239 DEBUG nova.virt.libvirt.vif [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:47:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-0',id=163,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:48:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-libm6dxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',i
mage_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:49:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=c215327f-37ad-41a7-a883-3dbb23334df6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "vif_mac": "fa:16:3e:d6:4d:37"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.703 226239 DEBUG nova.network.os_vif_util [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "vif_mac": "fa:16:3e:d6:4d:37"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.704 226239 DEBUG nova.network.os_vif_util [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d6:4d:37,bridge_name='br-int',has_traffic_filtering=True,id=fbe66833-82a6-4f72-9b11-a4732140845a,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbe66833-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.704 226239 DEBUG os_vif [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:4d:37,bridge_name='br-int',has_traffic_filtering=True,id=fbe66833-82a6-4f72-9b11-a4732140845a,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbe66833-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.706 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.706 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbe66833-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.707 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.708 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.709 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.710 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.713 226239 INFO os_vif [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:4d:37,bridge_name='br-int',has_traffic_filtering=True,id=fbe66833-82a6-4f72-9b11-a4732140845a,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbe66833-82')#033[00m
Jan 31 03:49:52 np0005603623 nova_compute[226235]: 2026-01-31 08:49:52.989 226239 DEBUG nova.network.neutron [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Updating instance_info_cache with network_info: [{"id": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "address": "fa:16:3e:5e:64:02", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d2ef565-4b", "ovs_interfaceid": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.047 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Releasing lock "refresh_cache-07ab810a-aca3-4084-abb7-b092f658255b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.048 226239 DEBUG nova.compute.manager [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Instance network_info: |[{"id": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "address": "fa:16:3e:5e:64:02", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d2ef565-4b", "ovs_interfaceid": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.050 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Start _get_guest_xml network_info=[{"id": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "address": "fa:16:3e:5e:64:02", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d2ef565-4b", "ovs_interfaceid": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.054 226239 WARNING nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.062 226239 DEBUG nova.virt.libvirt.host [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.063 226239 DEBUG nova.virt.libvirt.host [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.067 226239 DEBUG nova.virt.libvirt.host [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.068 226239 DEBUG nova.virt.libvirt.host [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.069 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.070 226239 DEBUG nova.virt.hardware [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.070 226239 DEBUG nova.virt.hardware [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.070 226239 DEBUG nova.virt.hardware [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.071 226239 DEBUG nova.virt.hardware [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.071 226239 DEBUG nova.virt.hardware [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.071 226239 DEBUG nova.virt.hardware [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.071 226239 DEBUG nova.virt.hardware [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.072 226239 DEBUG nova.virt.hardware [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.072 226239 DEBUG nova.virt.hardware [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.072 226239 DEBUG nova.virt.hardware [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.072 226239 DEBUG nova.virt.hardware [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.076 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.122 226239 DEBUG nova.virt.libvirt.driver [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.122 226239 DEBUG nova.virt.libvirt.driver [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.123 226239 DEBUG nova.virt.libvirt.driver [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:49:53 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1645012824' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.495 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.524 226239 DEBUG nova.storage.rbd_utils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 07ab810a-aca3-4084-abb7-b092f658255b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:53 np0005603623 nova_compute[226235]: 2026-01-31 08:49:53.528 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:49:54 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1217814750' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.017 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.019 226239 DEBUG nova.virt.libvirt.vif [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:49:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-494286398',display_name='tempest-AttachVolumeNegativeTest-server-494286398',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-494286398',id=168,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFaWuk8mgnEQe6j9EgiK1wmF6jqLzpW2mI2jCE7KPbM8a8Truz16MX5T0/cz0IqO/RG2Eb/afZul8hye4itT39RkNKl3UlAnb9FyZBHowBQj4vN9SLaA7npVJpQN7XcH9w==',key_name='tempest-keypair-1801134062',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76fb5cb7abcd4d74abfc471a96bbd12c',ramdisk_id='',reservation_id='r-qhhau9sk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-457307401',owner_user_name='tempest-AttachVolumeNegativeTest-457307401-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:49:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3bd4ce8a916a4bdbbc988eb4fe32991e',uuid=07ab810a-aca3-4084-abb7-b092f658255b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "address": "fa:16:3e:5e:64:02", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d2ef565-4b", "ovs_interfaceid": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.019 226239 DEBUG nova.network.os_vif_util [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Converting VIF {"id": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "address": "fa:16:3e:5e:64:02", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d2ef565-4b", "ovs_interfaceid": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.020 226239 DEBUG nova.network.os_vif_util [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:64:02,bridge_name='br-int',has_traffic_filtering=True,id=0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d2ef565-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.021 226239 DEBUG nova.objects.instance [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lazy-loading 'pci_devices' on Instance uuid 07ab810a-aca3-4084-abb7-b092f658255b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.024 226239 DEBUG nova.compute.manager [req-4b08318c-227b-4f2a-a29e-6cc24b521bcc req-865d8949-76cd-4dbe-82cf-cff4566550ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received event network-vif-unplugged-fbe66833-82a6-4f72-9b11-a4732140845a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.024 226239 DEBUG oslo_concurrency.lockutils [req-4b08318c-227b-4f2a-a29e-6cc24b521bcc req-865d8949-76cd-4dbe-82cf-cff4566550ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.024 226239 DEBUG oslo_concurrency.lockutils [req-4b08318c-227b-4f2a-a29e-6cc24b521bcc req-865d8949-76cd-4dbe-82cf-cff4566550ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.024 226239 DEBUG oslo_concurrency.lockutils [req-4b08318c-227b-4f2a-a29e-6cc24b521bcc req-865d8949-76cd-4dbe-82cf-cff4566550ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.024 226239 DEBUG nova.compute.manager [req-4b08318c-227b-4f2a-a29e-6cc24b521bcc req-865d8949-76cd-4dbe-82cf-cff4566550ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] No waiting events found dispatching network-vif-unplugged-fbe66833-82a6-4f72-9b11-a4732140845a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.025 226239 WARNING nova.compute.manager [req-4b08318c-227b-4f2a-a29e-6cc24b521bcc req-865d8949-76cd-4dbe-82cf-cff4566550ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received unexpected event network-vif-unplugged-fbe66833-82a6-4f72-9b11-a4732140845a for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.047 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  <uuid>07ab810a-aca3-4084-abb7-b092f658255b</uuid>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  <name>instance-000000a8</name>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <nova:name>tempest-AttachVolumeNegativeTest-server-494286398</nova:name>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:49:53</nova:creationTime>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <nova:user uuid="3bd4ce8a916a4bdbbc988eb4fe32991e">tempest-AttachVolumeNegativeTest-457307401-project-member</nova:user>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <nova:project uuid="76fb5cb7abcd4d74abfc471a96bbd12c">tempest-AttachVolumeNegativeTest-457307401</nova:project>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <nova:port uuid="0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <entry name="serial">07ab810a-aca3-4084-abb7-b092f658255b</entry>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <entry name="uuid">07ab810a-aca3-4084-abb7-b092f658255b</entry>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/07ab810a-aca3-4084-abb7-b092f658255b_disk">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/07ab810a-aca3-4084-abb7-b092f658255b_disk.config">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:5e:64:02"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <target dev="tap0d2ef565-4b"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/07ab810a-aca3-4084-abb7-b092f658255b/console.log" append="off"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:49:54 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:49:54 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:49:54 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:49:54 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.048 226239 DEBUG nova.compute.manager [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Preparing to wait for external event network-vif-plugged-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.048 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "07ab810a-aca3-4084-abb7-b092f658255b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.048 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.048 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.049 226239 DEBUG nova.virt.libvirt.vif [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:49:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-494286398',display_name='tempest-AttachVolumeNegativeTest-server-494286398',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-494286398',id=168,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFaWuk8mgnEQe6j9EgiK1wmF6jqLzpW2mI2jCE7KPbM8a8Truz16MX5T0/cz0IqO/RG2Eb/afZul8hye4itT39RkNKl3UlAnb9FyZBHowBQj4vN9SLaA7npVJpQN7XcH9w==',key_name='tempest-keypair-1801134062',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76fb5cb7abcd4d74abfc471a96bbd12c',ramdisk_id='',reservation_id='r-qhhau9sk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-457307401',owner_user_name='tempest-AttachVolumeNegativeTest-457307401-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:49:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3bd4ce8a916a4bdbbc988eb4fe32991e',uuid=07ab810a-aca3-4084-abb7-b092f658255b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "address": "fa:16:3e:5e:64:02", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d2ef565-4b", "ovs_interfaceid": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.049 226239 DEBUG nova.network.os_vif_util [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Converting VIF {"id": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "address": "fa:16:3e:5e:64:02", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d2ef565-4b", "ovs_interfaceid": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.050 226239 DEBUG nova.network.os_vif_util [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:64:02,bridge_name='br-int',has_traffic_filtering=True,id=0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d2ef565-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.050 226239 DEBUG os_vif [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:64:02,bridge_name='br-int',has_traffic_filtering=True,id=0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d2ef565-4b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.051 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.051 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.051 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.053 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.054 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d2ef565-4b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.054 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0d2ef565-4b, col_values=(('external_ids', {'iface-id': '0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:64:02', 'vm-uuid': '07ab810a-aca3-4084-abb7-b092f658255b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.056 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:54 np0005603623 NetworkManager[48970]: <info>  [1769849394.0567] manager: (tap0d2ef565-4b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.057 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.061 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.062 226239 INFO os_vif [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:64:02,bridge_name='br-int',has_traffic_filtering=True,id=0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d2ef565-4b')#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.121 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.122 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.122 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] No VIF found with MAC fa:16:3e:5e:64:02, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.122 226239 INFO nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Using config drive#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.159 226239 DEBUG nova.storage.rbd_utils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 07ab810a-aca3-4084-abb7-b092f658255b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.165 226239 DEBUG neutronclient.v2_0.client [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port fbe66833-82a6-4f72-9b11-a4732140845a for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:49:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:49:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:54.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:49:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:54.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.449 226239 DEBUG oslo_concurrency.lockutils [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.450 226239 DEBUG oslo_concurrency.lockutils [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.450 226239 DEBUG oslo_concurrency.lockutils [None req-9653eed8-8ffb-4c89-81d0-4cb37883b857 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.823 226239 INFO nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Creating config drive at /var/lib/nova/instances/07ab810a-aca3-4084-abb7-b092f658255b/disk.config#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.829 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/07ab810a-aca3-4084-abb7-b092f658255b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpii807f54 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.950 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/07ab810a-aca3-4084-abb7-b092f658255b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpii807f54" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.974 226239 DEBUG nova.storage.rbd_utils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 07ab810a-aca3-4084-abb7-b092f658255b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:54 np0005603623 nova_compute[226235]: 2026-01-31 08:49:54.977 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/07ab810a-aca3-4084-abb7-b092f658255b/disk.config 07ab810a-aca3-4084-abb7-b092f658255b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.175 226239 DEBUG oslo_concurrency.processutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/07ab810a-aca3-4084-abb7-b092f658255b/disk.config 07ab810a-aca3-4084-abb7-b092f658255b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.197s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.176 226239 INFO nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Deleting local config drive /var/lib/nova/instances/07ab810a-aca3-4084-abb7-b092f658255b/disk.config because it was imported into RBD.#033[00m
Jan 31 03:49:55 np0005603623 kernel: tap0d2ef565-4b: entered promiscuous mode
Jan 31 03:49:55 np0005603623 NetworkManager[48970]: <info>  [1769849395.2204] manager: (tap0d2ef565-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/329)
Jan 31 03:49:55 np0005603623 systemd-udevd[304393]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:49:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:49:55Z|00692|binding|INFO|Claiming lport 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 for this chassis.
Jan 31 03:49:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:49:55Z|00693|binding|INFO|0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8: Claiming fa:16:3e:5e:64:02 10.100.0.7
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.220 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:49:55Z|00694|binding|INFO|Setting lport 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 ovn-installed in OVS
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.228 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.230 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:55 np0005603623 NetworkManager[48970]: <info>  [1769849395.2323] device (tap0d2ef565-4b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:49:55 np0005603623 NetworkManager[48970]: <info>  [1769849395.2335] device (tap0d2ef565-4b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:49:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:49:55Z|00695|binding|INFO|Setting lport 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 up in Southbound
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.248 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:64:02 10.100.0.7'], port_security=['fa:16:3e:5e:64:02 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '07ab810a-aca3-4084-abb7-b092f658255b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a02f269a-650e-4227-8352-05abf2566c17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76fb5cb7abcd4d74abfc471a96bbd12c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b8be5d87-2c36-4b5c-850e-b217f1c7a95b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97876448-21a4-4b64-9452-bd401dfcc8ac, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.249 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 in datapath a02f269a-650e-4227-8352-05abf2566c17 bound to our chassis#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.251 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a02f269a-650e-4227-8352-05abf2566c17#033[00m
Jan 31 03:49:55 np0005603623 systemd-machined[194379]: New machine qemu-79-instance-000000a8.
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.259 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bc824993-eb53-402b-b3f1-faa6e1f5c5c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.260 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa02f269a-61 in ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.262 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa02f269a-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.262 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ca7d9727-6bda-411f-be55-2541462a2179]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.262 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9d2c1c69-b96e-4c76-8371-639bb801b2d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 systemd[1]: Started Virtual Machine qemu-79-instance-000000a8.
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.272 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[da369b64-1450-4c9c-a0a5-f6c9d8f19eb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.280 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9912df39-3d49-4d3c-ab5d-6a38327dfc63]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.302 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4f013e3e-fcc2-4d84-9b39-0ab144493ca5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.307 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e11818ca-79e6-4431-b7da-0dc168d74ffd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 NetworkManager[48970]: <info>  [1769849395.3081] manager: (tapa02f269a-60): new Veth device (/org/freedesktop/NetworkManager/Devices/330)
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.331 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[27a8380e-c61d-4ed9-8552-acd03a6a545b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.334 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[0bcdee50-6f4f-497c-80f6-cd198a25a294]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 NetworkManager[48970]: <info>  [1769849395.3495] device (tapa02f269a-60): carrier: link connected
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.353 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[41c49ada-add0-4922-bdf1-ce1db9a42438]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.363 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[46838524-867f-424e-9ad7-23806c4dcec3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa02f269a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:e9:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 207], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 853577, 'reachable_time': 38460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304584, 'error': None, 'target': 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.374 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b2b81cd7-e214-4fbe-83a3-178f1b179e1b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:e973'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 853577, 'tstamp': 853577}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304585, 'error': None, 'target': 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.386 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f5bca367-7318-4523-93ec-85c16ffba1f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa02f269a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:e9:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 207], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 853577, 'reachable_time': 38460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 304586, 'error': None, 'target': 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.406 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ad230dad-d1c6-4933-809e-d21e96945090]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.447 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fa972f22-cb0e-4b78-841d-9555d79df59f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.448 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa02f269a-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.448 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.449 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa02f269a-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.450 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:55 np0005603623 NetworkManager[48970]: <info>  [1769849395.4511] manager: (tapa02f269a-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Jan 31 03:49:55 np0005603623 kernel: tapa02f269a-60: entered promiscuous mode
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.454 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.457 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa02f269a-60, col_values=(('external_ids', {'iface-id': '2c775482-0f82-4695-be62-4a95328fbf79'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.458 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:55 np0005603623 ovn_controller[133449]: 2026-01-31T08:49:55Z|00696|binding|INFO|Releasing lport 2c775482-0f82-4695-be62-4a95328fbf79 from this chassis (sb_readonly=0)
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.459 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.461 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a02f269a-650e-4227-8352-05abf2566c17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a02f269a-650e-4227-8352-05abf2566c17.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.462 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f91d125b-f4be-442d-a678-b67cc2e4bcdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.463 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-a02f269a-650e-4227-8352-05abf2566c17
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/a02f269a-650e-4227-8352-05abf2566c17.pid.haproxy
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID a02f269a-650e-4227-8352-05abf2566c17
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.464 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:49:55.465 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'env', 'PROCESS_TAG=haproxy-a02f269a-650e-4227-8352-05abf2566c17', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a02f269a-650e-4227-8352-05abf2566c17.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.674 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.688 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849395.688007, 07ab810a-aca3-4084-abb7-b092f658255b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.689 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] VM Started (Lifecycle Event)#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.729 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.735 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849395.6889577, 07ab810a-aca3-4084-abb7-b092f658255b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.735 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:49:55 np0005603623 podman[304661]: 2026-01-31 08:49:55.801981446 +0000 UTC m=+0.059376434 container create dd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.818 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.823 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:49:55 np0005603623 systemd[1]: Started libpod-conmon-dd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89.scope.
Jan 31 03:49:55 np0005603623 podman[304661]: 2026-01-31 08:49:55.763209459 +0000 UTC m=+0.020604467 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:49:55 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:49:55 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10f449439fc98b3b2ff16255c73135d888c2d6d1654a160d6dc7691b4c5344f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:49:55 np0005603623 podman[304661]: 2026-01-31 08:49:55.880830539 +0000 UTC m=+0.138225557 container init dd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:49:55 np0005603623 podman[304661]: 2026-01-31 08:49:55.884891237 +0000 UTC m=+0.142286235 container start dd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:49:55 np0005603623 nova_compute[226235]: 2026-01-31 08:49:55.895 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:49:55 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[304676]: [NOTICE]   (304680) : New worker (304682) forked
Jan 31 03:49:55 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[304676]: [NOTICE]   (304680) : Loading success.
Jan 31 03:49:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.255 226239 DEBUG nova.compute.manager [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received event network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.255 226239 DEBUG oslo_concurrency.lockutils [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.255 226239 DEBUG oslo_concurrency.lockutils [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.256 226239 DEBUG oslo_concurrency.lockutils [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.256 226239 DEBUG nova.compute.manager [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] No waiting events found dispatching network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:49:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:56.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.256 226239 WARNING nova.compute.manager [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received unexpected event network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.257 226239 DEBUG nova.compute.manager [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Received event network-vif-plugged-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.257 226239 DEBUG oslo_concurrency.lockutils [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "07ab810a-aca3-4084-abb7-b092f658255b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.257 226239 DEBUG oslo_concurrency.lockutils [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.257 226239 DEBUG oslo_concurrency.lockutils [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.257 226239 DEBUG nova.compute.manager [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Processing event network-vif-plugged-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.258 226239 DEBUG nova.compute.manager [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Received event network-vif-plugged-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.258 226239 DEBUG oslo_concurrency.lockutils [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "07ab810a-aca3-4084-abb7-b092f658255b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.258 226239 DEBUG oslo_concurrency.lockutils [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.258 226239 DEBUG oslo_concurrency.lockutils [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.259 226239 DEBUG nova.compute.manager [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] No waiting events found dispatching network-vif-plugged-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.259 226239 WARNING nova.compute.manager [req-4c3a4484-8482-402d-b13e-9c89ddab0e5d req-3e5e711b-4676-46b9-88b1-48f6da27aa3b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Received unexpected event network-vif-plugged-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.260 226239 DEBUG nova.compute.manager [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.265 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849396.2641525, 07ab810a-aca3-4084-abb7-b092f658255b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.265 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.267 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.270 226239 INFO nova.virt.libvirt.driver [-] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Instance spawned successfully.#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.270 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.300 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.305 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:49:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:56.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.307 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.308 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.308 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.308 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.309 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.309 226239 DEBUG nova.virt.libvirt.driver [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.413 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.461 226239 INFO nova.compute.manager [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Took 18.24 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.462 226239 DEBUG nova.compute.manager [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.621 226239 INFO nova.compute.manager [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Took 19.82 seconds to build instance.#033[00m
Jan 31 03:49:56 np0005603623 nova_compute[226235]: 2026-01-31 08:49:56.694 226239 DEBUG oslo_concurrency.lockutils [None req-6351fb14-a409-45ff-8ce1-ec7c59350c50 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:57 np0005603623 nova_compute[226235]: 2026-01-31 08:49:57.556 226239 DEBUG nova.compute.manager [req-6592911b-e3c1-427f-9c2a-b63b8c21cbcd req-b305f0b1-9a53-49bf-926c-e302f34be861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received event network-changed-fbe66833-82a6-4f72-9b11-a4732140845a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:57 np0005603623 nova_compute[226235]: 2026-01-31 08:49:57.557 226239 DEBUG nova.compute.manager [req-6592911b-e3c1-427f-9c2a-b63b8c21cbcd req-b305f0b1-9a53-49bf-926c-e302f34be861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Refreshing instance network info cache due to event network-changed-fbe66833-82a6-4f72-9b11-a4732140845a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:49:57 np0005603623 nova_compute[226235]: 2026-01-31 08:49:57.557 226239 DEBUG oslo_concurrency.lockutils [req-6592911b-e3c1-427f-9c2a-b63b8c21cbcd req-b305f0b1-9a53-49bf-926c-e302f34be861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:49:57 np0005603623 nova_compute[226235]: 2026-01-31 08:49:57.557 226239 DEBUG oslo_concurrency.lockutils [req-6592911b-e3c1-427f-9c2a-b63b8c21cbcd req-b305f0b1-9a53-49bf-926c-e302f34be861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:49:57 np0005603623 nova_compute[226235]: 2026-01-31 08:49:57.558 226239 DEBUG nova.network.neutron [req-6592911b-e3c1-427f-9c2a-b63b8c21cbcd req-b305f0b1-9a53-49bf-926c-e302f34be861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Refreshing network info cache for port fbe66833-82a6-4f72-9b11-a4732140845a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:49:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:58.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:49:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:58.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:58 np0005603623 podman[304744]: 2026-01-31 08:49:58.969777712 +0000 UTC m=+0.065138925 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:49:58 np0005603623 podman[304743]: 2026-01-31 08:49:58.975820952 +0000 UTC m=+0.071182075 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 31 03:49:59 np0005603623 nova_compute[226235]: 2026-01-31 08:49:59.056 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:00 np0005603623 nova_compute[226235]: 2026-01-31 08:50:00.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:00 np0005603623 nova_compute[226235]: 2026-01-31 08:50:00.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:50:00 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 03:50:00 np0005603623 nova_compute[226235]: 2026-01-31 08:50:00.256 226239 DEBUG nova.network.neutron [req-6592911b-e3c1-427f-9c2a-b63b8c21cbcd req-b305f0b1-9a53-49bf-926c-e302f34be861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updated VIF entry in instance network info cache for port fbe66833-82a6-4f72-9b11-a4732140845a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:50:00 np0005603623 nova_compute[226235]: 2026-01-31 08:50:00.256 226239 DEBUG nova.network.neutron [req-6592911b-e3c1-427f-9c2a-b63b8c21cbcd req-b305f0b1-9a53-49bf-926c-e302f34be861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updating instance_info_cache with network_info: [{"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:50:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:00.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:00.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:00 np0005603623 nova_compute[226235]: 2026-01-31 08:50:00.318 226239 DEBUG oslo_concurrency.lockutils [req-6592911b-e3c1-427f-9c2a-b63b8c21cbcd req-b305f0b1-9a53-49bf-926c-e302f34be861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:50:00 np0005603623 nova_compute[226235]: 2026-01-31 08:50:00.676 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:50:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1063363873' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:50:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:50:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:02.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:50:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:02.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:02 np0005603623 nova_compute[226235]: 2026-01-31 08:50:02.394 226239 DEBUG nova.compute.manager [req-f27a829a-6c3c-4b52-82c8-8eae618483cc req-605b8b05-6023-420f-868e-23439d0e3ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Received event network-changed-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:02 np0005603623 nova_compute[226235]: 2026-01-31 08:50:02.395 226239 DEBUG nova.compute.manager [req-f27a829a-6c3c-4b52-82c8-8eae618483cc req-605b8b05-6023-420f-868e-23439d0e3ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Refreshing instance network info cache due to event network-changed-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:50:02 np0005603623 nova_compute[226235]: 2026-01-31 08:50:02.395 226239 DEBUG oslo_concurrency.lockutils [req-f27a829a-6c3c-4b52-82c8-8eae618483cc req-605b8b05-6023-420f-868e-23439d0e3ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-07ab810a-aca3-4084-abb7-b092f658255b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:50:02 np0005603623 nova_compute[226235]: 2026-01-31 08:50:02.395 226239 DEBUG oslo_concurrency.lockutils [req-f27a829a-6c3c-4b52-82c8-8eae618483cc req-605b8b05-6023-420f-868e-23439d0e3ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-07ab810a-aca3-4084-abb7-b092f658255b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:50:02 np0005603623 nova_compute[226235]: 2026-01-31 08:50:02.395 226239 DEBUG nova.network.neutron [req-f27a829a-6c3c-4b52-82c8-8eae618483cc req-605b8b05-6023-420f-868e-23439d0e3ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Refreshing network info cache for port 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:50:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e361 e361: 3 total, 3 up, 3 in
Jan 31 03:50:03 np0005603623 nova_compute[226235]: 2026-01-31 08:50:03.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:03 np0005603623 nova_compute[226235]: 2026-01-31 08:50:03.262 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:03 np0005603623 nova_compute[226235]: 2026-01-31 08:50:03.262 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:03 np0005603623 nova_compute[226235]: 2026-01-31 08:50:03.262 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:03 np0005603623 nova_compute[226235]: 2026-01-31 08:50:03.263 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:50:03 np0005603623 nova_compute[226235]: 2026-01-31 08:50:03.263 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:50:03 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4074317525' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:50:03 np0005603623 nova_compute[226235]: 2026-01-31 08:50:03.659 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.059 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:04.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:04.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.663 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.663 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.666 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.667 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.667 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.670 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.670 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.673 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.673 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.674 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.831 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.832 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3618MB free_disk=20.560894012451172GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.832 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:04 np0005603623 nova_compute[226235]: 2026-01-31 08:50:04.832 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.050 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration for instance c215327f-37ad-41a7-a883-3dbb23334df6 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.117 226239 INFO nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updating resource usage from migration 9ab432c0-3f96-470c-ac27-8c3e3291f927#033[00m
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.117 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Starting to track outgoing migration 9ab432c0-3f96-470c-ac27-8c3e3291f927 with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.165 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0edbf2b9-b76f-446b-85fa-09a4dcb37976 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.166 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance da4e355d-c6c2-446e-8eb1-d2ca8279e549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.166 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 07ab810a-aca3-4084-abb7-b092f658255b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.166 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Migration 9ab432c0-3f96-470c-ac27-8c3e3291f927 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.167 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.167 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.491 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.677 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:50:05 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2697222297' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.903 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:05 np0005603623 nova_compute[226235]: 2026-01-31 08:50:05.916 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:50:06 np0005603623 nova_compute[226235]: 2026-01-31 08:50:06.002 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:50:06 np0005603623 nova_compute[226235]: 2026-01-31 08:50:06.089 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:50:06 np0005603623 nova_compute[226235]: 2026-01-31 08:50:06.090 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e361 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:06.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:50:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:06.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.240 226239 DEBUG nova.compute.manager [req-5d9533f4-d9a1-4ac6-9a9b-f8857db41f94 req-619c54ff-f5c1-46e6-8257-461c913eb4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received event network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.241 226239 DEBUG oslo_concurrency.lockutils [req-5d9533f4-d9a1-4ac6-9a9b-f8857db41f94 req-619c54ff-f5c1-46e6-8257-461c913eb4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.242 226239 DEBUG oslo_concurrency.lockutils [req-5d9533f4-d9a1-4ac6-9a9b-f8857db41f94 req-619c54ff-f5c1-46e6-8257-461c913eb4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.242 226239 DEBUG oslo_concurrency.lockutils [req-5d9533f4-d9a1-4ac6-9a9b-f8857db41f94 req-619c54ff-f5c1-46e6-8257-461c913eb4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.242 226239 DEBUG nova.compute.manager [req-5d9533f4-d9a1-4ac6-9a9b-f8857db41f94 req-619c54ff-f5c1-46e6-8257-461c913eb4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] No waiting events found dispatching network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.242 226239 WARNING nova.compute.manager [req-5d9533f4-d9a1-4ac6-9a9b-f8857db41f94 req-619c54ff-f5c1-46e6-8257-461c913eb4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received unexpected event network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.699 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849392.698126, c215327f-37ad-41a7-a883-3dbb23334df6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.700 226239 INFO nova.compute.manager [-] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.860 226239 DEBUG nova.network.neutron [req-f27a829a-6c3c-4b52-82c8-8eae618483cc req-605b8b05-6023-420f-868e-23439d0e3ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Updated VIF entry in instance network info cache for port 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.861 226239 DEBUG nova.network.neutron [req-f27a829a-6c3c-4b52-82c8-8eae618483cc req-605b8b05-6023-420f-868e-23439d0e3ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Updating instance_info_cache with network_info: [{"id": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "address": "fa:16:3e:5e:64:02", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d2ef565-4b", "ovs_interfaceid": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.947 226239 DEBUG nova.compute.manager [None req-0d88a7b7-24e9-4c99-baae-0f03968b57ec - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.951 226239 DEBUG nova.compute.manager [None req-0d88a7b7-24e9-4c99-baae-0f03968b57ec - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.958 226239 DEBUG oslo_concurrency.lockutils [req-f27a829a-6c3c-4b52-82c8-8eae618483cc req-605b8b05-6023-420f-868e-23439d0e3ad5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-07ab810a-aca3-4084-abb7-b092f658255b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:50:07 np0005603623 nova_compute[226235]: 2026-01-31 08:50:07.992 226239 INFO nova.compute.manager [None req-0d88a7b7-24e9-4c99-baae-0f03968b57ec - - - - - -] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 31 03:50:08 np0005603623 nova_compute[226235]: 2026-01-31 08:50:08.268 226239 DEBUG oslo_concurrency.lockutils [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "c215327f-37ad-41a7-a883-3dbb23334df6" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:08 np0005603623 nova_compute[226235]: 2026-01-31 08:50:08.268 226239 DEBUG oslo_concurrency.lockutils [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:08 np0005603623 nova_compute[226235]: 2026-01-31 08:50:08.268 226239 DEBUG nova.compute.manager [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Going to confirm migration 22 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 31 03:50:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:08.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:08.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:09 np0005603623 nova_compute[226235]: 2026-01-31 08:50:09.063 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:10 np0005603623 nova_compute[226235]: 2026-01-31 08:50:10.024 226239 DEBUG neutronclient.v2_0.client [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port fbe66833-82a6-4f72-9b11-a4732140845a for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:50:10 np0005603623 nova_compute[226235]: 2026-01-31 08:50:10.024 226239 DEBUG oslo_concurrency.lockutils [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:50:10 np0005603623 nova_compute[226235]: 2026-01-31 08:50:10.024 226239 DEBUG oslo_concurrency.lockutils [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquired lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:50:10 np0005603623 nova_compute[226235]: 2026-01-31 08:50:10.025 226239 DEBUG nova.network.neutron [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:50:10 np0005603623 nova_compute[226235]: 2026-01-31 08:50:10.025 226239 DEBUG nova.objects.instance [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'info_cache' on Instance uuid c215327f-37ad-41a7-a883-3dbb23334df6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:10 np0005603623 nova_compute[226235]: 2026-01-31 08:50:10.142 226239 DEBUG nova.compute.manager [req-d84c389a-34de-4b88-81f8-45f9b56474db req-b8c36ce3-c7d3-46fd-92e5-a999da72efe0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received event network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:10 np0005603623 nova_compute[226235]: 2026-01-31 08:50:10.142 226239 DEBUG oslo_concurrency.lockutils [req-d84c389a-34de-4b88-81f8-45f9b56474db req-b8c36ce3-c7d3-46fd-92e5-a999da72efe0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:10 np0005603623 nova_compute[226235]: 2026-01-31 08:50:10.142 226239 DEBUG oslo_concurrency.lockutils [req-d84c389a-34de-4b88-81f8-45f9b56474db req-b8c36ce3-c7d3-46fd-92e5-a999da72efe0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:10 np0005603623 nova_compute[226235]: 2026-01-31 08:50:10.143 226239 DEBUG oslo_concurrency.lockutils [req-d84c389a-34de-4b88-81f8-45f9b56474db req-b8c36ce3-c7d3-46fd-92e5-a999da72efe0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:10 np0005603623 nova_compute[226235]: 2026-01-31 08:50:10.143 226239 DEBUG nova.compute.manager [req-d84c389a-34de-4b88-81f8-45f9b56474db req-b8c36ce3-c7d3-46fd-92e5-a999da72efe0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] No waiting events found dispatching network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:50:10 np0005603623 nova_compute[226235]: 2026-01-31 08:50:10.143 226239 WARNING nova.compute.manager [req-d84c389a-34de-4b88-81f8-45f9b56474db req-b8c36ce3-c7d3-46fd-92e5-a999da72efe0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Received unexpected event network-vif-plugged-fbe66833-82a6-4f72-9b11-a4732140845a for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:50:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:10.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:10.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:10 np0005603623 nova_compute[226235]: 2026-01-31 08:50:10.679 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:11 np0005603623 nova_compute[226235]: 2026-01-31 08:50:11.090 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:11 np0005603623 nova_compute[226235]: 2026-01-31 08:50:11.090 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:11 np0005603623 nova_compute[226235]: 2026-01-31 08:50:11.091 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:50:11 np0005603623 nova_compute[226235]: 2026-01-31 08:50:11.091 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:50:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e361 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:50:11Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5e:64:02 10.100.0.7
Jan 31 03:50:11 np0005603623 ovn_controller[133449]: 2026-01-31T08:50:11Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5e:64:02 10.100.0.7
Jan 31 03:50:11 np0005603623 nova_compute[226235]: 2026-01-31 08:50:11.962 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:50:11 np0005603623 nova_compute[226235]: 2026-01-31 08:50:11.963 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:50:11 np0005603623 nova_compute[226235]: 2026-01-31 08:50:11.963 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:50:11 np0005603623 nova_compute[226235]: 2026-01-31 08:50:11.963 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0edbf2b9-b76f-446b-85fa-09a4dcb37976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:50:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:12.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:50:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:50:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:12.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:50:14 np0005603623 nova_compute[226235]: 2026-01-31 08:50:14.067 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:14.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:14.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:14 np0005603623 nova_compute[226235]: 2026-01-31 08:50:14.967 226239 DEBUG nova.network.neutron [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: c215327f-37ad-41a7-a883-3dbb23334df6] Updating instance_info_cache with network_info: [{"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:50:15 np0005603623 nova_compute[226235]: 2026-01-31 08:50:15.681 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e361 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.234 226239 DEBUG oslo_concurrency.lockutils [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Releasing lock "refresh_cache-c215327f-37ad-41a7-a883-3dbb23334df6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.235 226239 DEBUG nova.objects.instance [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'migration_context' on Instance uuid c215327f-37ad-41a7-a883-3dbb23334df6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:16.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:16.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.471 226239 DEBUG nova.storage.rbd_utils [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] removing snapshot(nova-resize) on rbd image(c215327f-37ad-41a7-a883-3dbb23334df6_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:50:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e362 e362: 3 total, 3 up, 3 in
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.923 226239 DEBUG nova.virt.libvirt.vif [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:47:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='multiattach-server-0',id=163,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:50:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-libm6dxn',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type
='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:50:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=c215327f-37ad-41a7-a883-3dbb23334df6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.924 226239 DEBUG nova.network.os_vif_util [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "fbe66833-82a6-4f72-9b11-a4732140845a", "address": "fa:16:3e:d6:4d:37", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbe66833-82", "ovs_interfaceid": "fbe66833-82a6-4f72-9b11-a4732140845a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.924 226239 DEBUG nova.network.os_vif_util [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d6:4d:37,bridge_name='br-int',has_traffic_filtering=True,id=fbe66833-82a6-4f72-9b11-a4732140845a,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbe66833-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.925 226239 DEBUG os_vif [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:4d:37,bridge_name='br-int',has_traffic_filtering=True,id=fbe66833-82a6-4f72-9b11-a4732140845a,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbe66833-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.926 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.926 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbe66833-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.927 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.932 226239 INFO os_vif [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:4d:37,bridge_name='br-int',has_traffic_filtering=True,id=fbe66833-82a6-4f72-9b11-a4732140845a,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbe66833-82')#033[00m
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.933 226239 DEBUG oslo_concurrency.lockutils [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:16 np0005603623 nova_compute[226235]: 2026-01-31 08:50:16.933 226239 DEBUG oslo_concurrency.lockutils [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.146 226239 DEBUG oslo_concurrency.processutils [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:50:17 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1706663832' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.585 226239 DEBUG oslo_concurrency.processutils [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.590 226239 DEBUG nova.compute.provider_tree [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.621 226239 DEBUG nova.scheduler.client.report [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.638 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Updating instance_info_cache with network_info: [{"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.669 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-0edbf2b9-b76f-446b-85fa-09a4dcb37976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.670 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.670 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.670 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.671 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.671 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.719 226239 DEBUG oslo_concurrency.lockutils [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:17 np0005603623 nova_compute[226235]: 2026-01-31 08:50:17.957 226239 INFO nova.scheduler.client.report [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Deleted allocation for migration 9ab432c0-3f96-470c-ac27-8c3e3291f927#033[00m
Jan 31 03:50:18 np0005603623 nova_compute[226235]: 2026-01-31 08:50:18.098 226239 DEBUG oslo_concurrency.lockutils [None req-058c21fa-17e0-4522-b8d2-b40e925b4c5c a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "c215327f-37ad-41a7-a883-3dbb23334df6" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 9.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:18.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:18.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:18 np0005603623 nova_compute[226235]: 2026-01-31 08:50:18.591 226239 DEBUG oslo_concurrency.lockutils [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "07ab810a-aca3-4084-abb7-b092f658255b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:18 np0005603623 nova_compute[226235]: 2026-01-31 08:50:18.591 226239 DEBUG oslo_concurrency.lockutils [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:18 np0005603623 nova_compute[226235]: 2026-01-31 08:50:18.592 226239 DEBUG oslo_concurrency.lockutils [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "07ab810a-aca3-4084-abb7-b092f658255b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:18 np0005603623 nova_compute[226235]: 2026-01-31 08:50:18.592 226239 DEBUG oslo_concurrency.lockutils [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:18 np0005603623 nova_compute[226235]: 2026-01-31 08:50:18.592 226239 DEBUG oslo_concurrency.lockutils [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:18 np0005603623 nova_compute[226235]: 2026-01-31 08:50:18.593 226239 INFO nova.compute.manager [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Terminating instance#033[00m
Jan 31 03:50:18 np0005603623 nova_compute[226235]: 2026-01-31 08:50:18.594 226239 DEBUG nova.compute.manager [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:50:18 np0005603623 kernel: tap0d2ef565-4b (unregistering): left promiscuous mode
Jan 31 03:50:18 np0005603623 NetworkManager[48970]: <info>  [1769849418.8305] device (tap0d2ef565-4b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:50:18 np0005603623 nova_compute[226235]: 2026-01-31 08:50:18.837 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:18 np0005603623 ovn_controller[133449]: 2026-01-31T08:50:18Z|00697|binding|INFO|Releasing lport 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 from this chassis (sb_readonly=0)
Jan 31 03:50:18 np0005603623 ovn_controller[133449]: 2026-01-31T08:50:18Z|00698|binding|INFO|Setting lport 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 down in Southbound
Jan 31 03:50:18 np0005603623 ovn_controller[133449]: 2026-01-31T08:50:18Z|00699|binding|INFO|Removing iface tap0d2ef565-4b ovn-installed in OVS
Jan 31 03:50:18 np0005603623 nova_compute[226235]: 2026-01-31 08:50:18.839 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:18 np0005603623 nova_compute[226235]: 2026-01-31 08:50:18.847 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:18.857 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:64:02 10.100.0.7'], port_security=['fa:16:3e:5e:64:02 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '07ab810a-aca3-4084-abb7-b092f658255b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a02f269a-650e-4227-8352-05abf2566c17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76fb5cb7abcd4d74abfc471a96bbd12c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b8be5d87-2c36-4b5c-850e-b217f1c7a95b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.243'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97876448-21a4-4b64-9452-bd401dfcc8ac, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:50:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:18.859 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 in datapath a02f269a-650e-4227-8352-05abf2566c17 unbound from our chassis#033[00m
Jan 31 03:50:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:18.861 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a02f269a-650e-4227-8352-05abf2566c17, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:50:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:18.862 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[72844183-61b3-45f1-b454-d77e651fe7c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:18 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:18.863 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 namespace which is not needed anymore#033[00m
Jan 31 03:50:18 np0005603623 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a8.scope: Deactivated successfully.
Jan 31 03:50:18 np0005603623 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a8.scope: Consumed 12.859s CPU time.
Jan 31 03:50:18 np0005603623 systemd-machined[194379]: Machine qemu-79-instance-000000a8 terminated.
Jan 31 03:50:18 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[304676]: [NOTICE]   (304680) : haproxy version is 2.8.14-c23fe91
Jan 31 03:50:18 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[304676]: [NOTICE]   (304680) : path to executable is /usr/sbin/haproxy
Jan 31 03:50:18 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[304676]: [WARNING]  (304680) : Exiting Master process...
Jan 31 03:50:18 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[304676]: [WARNING]  (304680) : Exiting Master process...
Jan 31 03:50:18 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[304676]: [ALERT]    (304680) : Current worker (304682) exited with code 143 (Terminated)
Jan 31 03:50:18 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[304676]: [WARNING]  (304680) : All workers exited. Exiting... (0)
Jan 31 03:50:18 np0005603623 systemd[1]: libpod-dd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89.scope: Deactivated successfully.
Jan 31 03:50:18 np0005603623 podman[304975]: 2026-01-31 08:50:18.971070059 +0000 UTC m=+0.038192319 container died dd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:50:18 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89-userdata-shm.mount: Deactivated successfully.
Jan 31 03:50:18 np0005603623 systemd[1]: var-lib-containers-storage-overlay-10f449439fc98b3b2ff16255c73135d888c2d6d1654a160d6dc7691b4c5344f5-merged.mount: Deactivated successfully.
Jan 31 03:50:19 np0005603623 podman[304975]: 2026-01-31 08:50:19.007587675 +0000 UTC m=+0.074709935 container cleanup dd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:50:19 np0005603623 kernel: tap0d2ef565-4b: entered promiscuous mode
Jan 31 03:50:19 np0005603623 NetworkManager[48970]: <info>  [1769849419.0097] manager: (tap0d2ef565-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/332)
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.010 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 kernel: tap0d2ef565-4b (unregistering): left promiscuous mode
Jan 31 03:50:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:50:19Z|00700|binding|INFO|Claiming lport 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 for this chassis.
Jan 31 03:50:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:50:19Z|00701|binding|INFO|0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8: Claiming fa:16:3e:5e:64:02 10.100.0.7
Jan 31 03:50:19 np0005603623 systemd[1]: libpod-conmon-dd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89.scope: Deactivated successfully.
Jan 31 03:50:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:50:19Z|00702|binding|INFO|Setting lport 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 ovn-installed in OVS
Jan 31 03:50:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:50:19Z|00703|if_status|INFO|Dropped 1 log messages in last 1069 seconds (most recently, 1069 seconds ago) due to excessive rate
Jan 31 03:50:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:50:19Z|00704|if_status|INFO|Not setting lport 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 down as sb is readonly
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.024 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.027 226239 INFO nova.virt.libvirt.driver [-] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Instance destroyed successfully.#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.028 226239 DEBUG nova.objects.instance [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lazy-loading 'resources' on Instance uuid 07ab810a-aca3-4084-abb7-b092f658255b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:19 np0005603623 podman[305008]: 2026-01-31 08:50:19.064634404 +0000 UTC m=+0.038986524 container remove dd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.068 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[99779202-64e6-4eff-8f02-598c5ac7ae6d]: (4, ('Sat Jan 31 08:50:18 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 (dd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89)\ndd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89\nSat Jan 31 08:50:19 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 (dd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89)\ndd2acf0b4d6082ad9bd33d42ec192f7b282b6553ccfe3e263280c8c61578cc89\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.069 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.070 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9eb04697-3a48-4414-85af-106918178eac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.071 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa02f269a-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.072 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 kernel: tapa02f269a-60: left promiscuous mode
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.080 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.081 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.083 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec2eb28-d958-4e89-aa51-e75a3264f4ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.096 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ad9eb5be-a07f-40f4-b279-541f179a1540]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.097 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0310c2b6-1399-405b-abb1-776b851bb30c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.107 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[90ed4717-e209-44d5-a992-13625dc0f4fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 853572, 'reachable_time': 26929, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305028, 'error': None, 'target': 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.110 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.110 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[82cc0cef-3a4e-4f7b-88c6-f8ef01be3294]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 systemd[1]: run-netns-ovnmeta\x2da02f269a\x2d650e\x2d4227\x2d8352\x2d05abf2566c17.mount: Deactivated successfully.
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.381 226239 DEBUG nova.virt.libvirt.vif [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:49:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-494286398',display_name='tempest-AttachVolumeNegativeTest-server-494286398',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-494286398',id=168,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFaWuk8mgnEQe6j9EgiK1wmF6jqLzpW2mI2jCE7KPbM8a8Truz16MX5T0/cz0IqO/RG2Eb/afZul8hye4itT39RkNKl3UlAnb9FyZBHowBQj4vN9SLaA7npVJpQN7XcH9w==',key_name='tempest-keypair-1801134062',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:49:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76fb5cb7abcd4d74abfc471a96bbd12c',ramdisk_id='',reservation_id='r-qhhau9sk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeNegativeTest-457307401',owner_user_name='tempest-AttachVolumeNegativeTest-457307401-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:49:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3bd4ce8a916a4bdbbc988eb4fe32991e',uuid=07ab810a-aca3-4084-abb7-b092f658255b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "address": "fa:16:3e:5e:64:02", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d2ef565-4b", "ovs_interfaceid": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.381 226239 DEBUG nova.network.os_vif_util [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Converting VIF {"id": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "address": "fa:16:3e:5e:64:02", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d2ef565-4b", "ovs_interfaceid": "0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.382 226239 DEBUG nova.network.os_vif_util [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5e:64:02,bridge_name='br-int',has_traffic_filtering=True,id=0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d2ef565-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.382 226239 DEBUG os_vif [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:64:02,bridge_name='br-int',has_traffic_filtering=True,id=0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d2ef565-4b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.383 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.384 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d2ef565-4b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.385 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.387 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.389 226239 INFO os_vif [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:64:02,bridge_name='br-int',has_traffic_filtering=True,id=0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d2ef565-4b')#033[00m
Jan 31 03:50:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:50:19Z|00705|binding|INFO|Releasing lport 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 from this chassis (sb_readonly=0)
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.546 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:64:02 10.100.0.7'], port_security=['fa:16:3e:5e:64:02 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '07ab810a-aca3-4084-abb7-b092f658255b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a02f269a-650e-4227-8352-05abf2566c17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76fb5cb7abcd4d74abfc471a96bbd12c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b8be5d87-2c36-4b5c-850e-b217f1c7a95b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.243'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97876448-21a4-4b64-9452-bd401dfcc8ac, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.548 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 in datapath a02f269a-650e-4227-8352-05abf2566c17 bound to our chassis#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.550 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a02f269a-650e-4227-8352-05abf2566c17#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.551 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.557 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fc1f5040-b159-471b-a184-947b6bfa89b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.558 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa02f269a-61 in ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.559 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa02f269a-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.559 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c80ac120-4fa3-4320-a13b-805e84f955ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.560 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[35ae58c6-1760-4810-81ae-be66a0809870]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.566 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[074b2836-c216-4280-aa63-33fddde0da02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.574 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0d27e01c-297f-4abc-afcb-b810b77c4848]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.593 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[87b8f56e-7e38-4745-9cab-3928ed889b0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 systemd-udevd[304954]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:50:19 np0005603623 NetworkManager[48970]: <info>  [1769849419.5995] manager: (tapa02f269a-60): new Veth device (/org/freedesktop/NetworkManager/Devices/333)
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.599 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[15cd870a-c6d7-41f5-b3b2-950328478e37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.622 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b966693a-4175-43c5-b497-5ddee9cf559c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.625 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b67ee99c-ae43-4e2c-9284-74e8e412262a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 NetworkManager[48970]: <info>  [1769849419.6432] device (tapa02f269a-60): carrier: link connected
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.647 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[339ea825-e4e5-41ef-b6a7-c42016850ab3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.659 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[12240d3e-6583-44a6-b3b4-f54a1853fdbe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa02f269a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:e9:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 209], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 856007, 'reachable_time': 29816, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305072, 'error': None, 'target': 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.670 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[97aa1d91-639a-4256-9b95-a2abe2ad5e7d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:e973'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 856007, 'tstamp': 856007}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 305073, 'error': None, 'target': 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.683 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[042d49be-ab2a-4e95-97d1-76eb53fa26dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa02f269a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:e9:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 209], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 856007, 'reachable_time': 29816, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 305074, 'error': None, 'target': 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.705 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[88ca4c80-fe10-499b-b1f4-3ad874d372a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.743 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1acf02e4-a414-4794-a120-03b786c07b6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.744 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa02f269a-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.744 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.745 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa02f269a-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.747 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 NetworkManager[48970]: <info>  [1769849419.7477] manager: (tapa02f269a-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Jan 31 03:50:19 np0005603623 kernel: tapa02f269a-60: entered promiscuous mode
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.749 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.749 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa02f269a-60, col_values=(('external_ids', {'iface-id': '2c775482-0f82-4695-be62-4a95328fbf79'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.750 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:50:19Z|00706|binding|INFO|Releasing lport 2c775482-0f82-4695-be62-4a95328fbf79 from this chassis (sb_readonly=1)
Jan 31 03:50:19 np0005603623 nova_compute[226235]: 2026-01-31 08:50:19.755 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.756 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a02f269a-650e-4227-8352-05abf2566c17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a02f269a-650e-4227-8352-05abf2566c17.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.758 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[645193ff-fde3-460d-85c1-f18f8fec7ae0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.758 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-a02f269a-650e-4227-8352-05abf2566c17
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/a02f269a-650e-4227-8352-05abf2566c17.pid.haproxy
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID a02f269a-650e-4227-8352-05abf2566c17
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:50:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:19.759 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'env', 'PROCESS_TAG=haproxy-a02f269a-650e-4227-8352-05abf2566c17', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a02f269a-650e-4227-8352-05abf2566c17.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:50:20 np0005603623 podman[305106]: 2026-01-31 08:50:20.087926326 +0000 UTC m=+0.067775727 container create 3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:50:20 np0005603623 systemd[1]: Started libpod-conmon-3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c.scope.
Jan 31 03:50:20 np0005603623 podman[305106]: 2026-01-31 08:50:20.039472986 +0000 UTC m=+0.019322427 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:50:20 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:50:20 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe000c9c256c3bc72162f1066b2e75359cce01e80ef11164d1ea39be907ffdfa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:50:20 np0005603623 podman[305106]: 2026-01-31 08:50:20.154476654 +0000 UTC m=+0.134326045 container init 3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Jan 31 03:50:20 np0005603623 podman[305106]: 2026-01-31 08:50:20.15853129 +0000 UTC m=+0.138380691 container start 3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:50:20 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[305121]: [NOTICE]   (305125) : New worker (305127) forked
Jan 31 03:50:20 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[305121]: [NOTICE]   (305125) : Loading success.
Jan 31 03:50:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:50:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:20.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:50:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:50:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:20.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.494 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:64:02 10.100.0.7'], port_security=['fa:16:3e:5e:64:02 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '07ab810a-aca3-4084-abb7-b092f658255b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a02f269a-650e-4227-8352-05abf2566c17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76fb5cb7abcd4d74abfc471a96bbd12c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b8be5d87-2c36-4b5c-850e-b217f1c7a95b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.243'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97876448-21a4-4b64-9452-bd401dfcc8ac, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.496 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 in datapath a02f269a-650e-4227-8352-05abf2566c17 unbound from our chassis#033[00m
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.498 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a02f269a-650e-4227-8352-05abf2566c17, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.499 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[99a0ca45-14e2-47cf-9987-cb0dd3b42d41]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.500 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 namespace which is not needed anymore#033[00m
Jan 31 03:50:20 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[305121]: [NOTICE]   (305125) : haproxy version is 2.8.14-c23fe91
Jan 31 03:50:20 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[305121]: [NOTICE]   (305125) : path to executable is /usr/sbin/haproxy
Jan 31 03:50:20 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[305121]: [WARNING]  (305125) : Exiting Master process...
Jan 31 03:50:20 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[305121]: [ALERT]    (305125) : Current worker (305127) exited with code 143 (Terminated)
Jan 31 03:50:20 np0005603623 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[305121]: [WARNING]  (305125) : All workers exited. Exiting... (0)
Jan 31 03:50:20 np0005603623 systemd[1]: libpod-3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c.scope: Deactivated successfully.
Jan 31 03:50:20 np0005603623 podman[305155]: 2026-01-31 08:50:20.610356855 +0000 UTC m=+0.045574731 container died 3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:50:20 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c-userdata-shm.mount: Deactivated successfully.
Jan 31 03:50:20 np0005603623 systemd[1]: var-lib-containers-storage-overlay-fe000c9c256c3bc72162f1066b2e75359cce01e80ef11164d1ea39be907ffdfa-merged.mount: Deactivated successfully.
Jan 31 03:50:20 np0005603623 podman[305155]: 2026-01-31 08:50:20.647445819 +0000 UTC m=+0.082663685 container cleanup 3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:50:20 np0005603623 systemd[1]: libpod-conmon-3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c.scope: Deactivated successfully.
Jan 31 03:50:20 np0005603623 nova_compute[226235]: 2026-01-31 08:50:20.683 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:20 np0005603623 podman[305186]: 2026-01-31 08:50:20.707588615 +0000 UTC m=+0.040142050 container remove 3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.711 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[488a8e1c-0fdc-456f-94a5-25ba8cdf53db]: (4, ('Sat Jan 31 08:50:20 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 (3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c)\n3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c\nSat Jan 31 08:50:20 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 (3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c)\n3dfc0496b446aea4732d22b24cf19e49145a842f5397258dbf2e8207af83e81c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.713 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d647a40c-53d8-4fed-acbc-aabf3407cce3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.714 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa02f269a-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:20 np0005603623 nova_compute[226235]: 2026-01-31 08:50:20.716 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:20 np0005603623 kernel: tapa02f269a-60: left promiscuous mode
Jan 31 03:50:20 np0005603623 nova_compute[226235]: 2026-01-31 08:50:20.722 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.725 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2707f433-23b3-42b9-8bf8-db812d9c32ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.737 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4351c266-185e-41d8-8a44-d5c964daa4ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.738 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ba5bea89-e135-4bc2-b58d-aa8c5d606bd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.749 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[59873053-0dec-4695-990c-9ccc5b5cb1a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 856002, 'reachable_time': 20362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305201, 'error': None, 'target': 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.750 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:50:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:20.751 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[b18ace9a-c14b-463f-90a8-ba457231e8da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:20 np0005603623 systemd[1]: run-netns-ovnmeta\x2da02f269a\x2d650e\x2d4227\x2d8352\x2d05abf2566c17.mount: Deactivated successfully.
Jan 31 03:50:21 np0005603623 nova_compute[226235]: 2026-01-31 08:50:21.138 226239 INFO nova.virt.libvirt.driver [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Deleting instance files /var/lib/nova/instances/07ab810a-aca3-4084-abb7-b092f658255b_del#033[00m
Jan 31 03:50:21 np0005603623 nova_compute[226235]: 2026-01-31 08:50:21.139 226239 INFO nova.virt.libvirt.driver [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Deletion of /var/lib/nova/instances/07ab810a-aca3-4084-abb7-b092f658255b_del complete#033[00m
Jan 31 03:50:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e362 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:21 np0005603623 nova_compute[226235]: 2026-01-31 08:50:21.291 226239 INFO nova.compute.manager [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Took 2.70 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:50:21 np0005603623 nova_compute[226235]: 2026-01-31 08:50:21.292 226239 DEBUG oslo.service.loopingcall [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:50:21 np0005603623 nova_compute[226235]: 2026-01-31 08:50:21.293 226239 DEBUG nova.compute.manager [-] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:50:21 np0005603623 nova_compute[226235]: 2026-01-31 08:50:21.293 226239 DEBUG nova.network.neutron [-] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:50:21 np0005603623 nova_compute[226235]: 2026-01-31 08:50:21.905 226239 DEBUG nova.compute.manager [req-30d81eb7-3fd1-4bdf-8ac6-31ded27786a0 req-46aacab0-3ec1-434c-b06a-a2e38d0e3901 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Received event network-vif-unplugged-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:21 np0005603623 nova_compute[226235]: 2026-01-31 08:50:21.905 226239 DEBUG oslo_concurrency.lockutils [req-30d81eb7-3fd1-4bdf-8ac6-31ded27786a0 req-46aacab0-3ec1-434c-b06a-a2e38d0e3901 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "07ab810a-aca3-4084-abb7-b092f658255b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:21 np0005603623 nova_compute[226235]: 2026-01-31 08:50:21.906 226239 DEBUG oslo_concurrency.lockutils [req-30d81eb7-3fd1-4bdf-8ac6-31ded27786a0 req-46aacab0-3ec1-434c-b06a-a2e38d0e3901 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:21 np0005603623 nova_compute[226235]: 2026-01-31 08:50:21.906 226239 DEBUG oslo_concurrency.lockutils [req-30d81eb7-3fd1-4bdf-8ac6-31ded27786a0 req-46aacab0-3ec1-434c-b06a-a2e38d0e3901 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:21 np0005603623 nova_compute[226235]: 2026-01-31 08:50:21.906 226239 DEBUG nova.compute.manager [req-30d81eb7-3fd1-4bdf-8ac6-31ded27786a0 req-46aacab0-3ec1-434c-b06a-a2e38d0e3901 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] No waiting events found dispatching network-vif-unplugged-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:50:21 np0005603623 nova_compute[226235]: 2026-01-31 08:50:21.906 226239 DEBUG nova.compute.manager [req-30d81eb7-3fd1-4bdf-8ac6-31ded27786a0 req-46aacab0-3ec1-434c-b06a-a2e38d0e3901 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Received event network-vif-unplugged-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:50:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:22.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:22.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:23 np0005603623 nova_compute[226235]: 2026-01-31 08:50:23.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:24.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:50:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:24.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:50:24 np0005603623 nova_compute[226235]: 2026-01-31 08:50:24.387 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.392 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:25.391 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:50:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:25.393 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.561 226239 DEBUG nova.compute.manager [req-9e2d2644-ab7f-4644-b68c-65352c03f6e6 req-0bf6eae4-0f86-4126-8bc0-4eb6523ebaba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Received event network-vif-plugged-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.561 226239 DEBUG oslo_concurrency.lockutils [req-9e2d2644-ab7f-4644-b68c-65352c03f6e6 req-0bf6eae4-0f86-4126-8bc0-4eb6523ebaba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "07ab810a-aca3-4084-abb7-b092f658255b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.561 226239 DEBUG oslo_concurrency.lockutils [req-9e2d2644-ab7f-4644-b68c-65352c03f6e6 req-0bf6eae4-0f86-4126-8bc0-4eb6523ebaba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.562 226239 DEBUG oslo_concurrency.lockutils [req-9e2d2644-ab7f-4644-b68c-65352c03f6e6 req-0bf6eae4-0f86-4126-8bc0-4eb6523ebaba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.562 226239 DEBUG nova.compute.manager [req-9e2d2644-ab7f-4644-b68c-65352c03f6e6 req-0bf6eae4-0f86-4126-8bc0-4eb6523ebaba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] No waiting events found dispatching network-vif-plugged-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.562 226239 WARNING nova.compute.manager [req-9e2d2644-ab7f-4644-b68c-65352c03f6e6 req-0bf6eae4-0f86-4126-8bc0-4eb6523ebaba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Received unexpected event network-vif-plugged-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.569 226239 DEBUG nova.network.neutron [-] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.647 226239 DEBUG nova.compute.manager [req-200cd1fb-7d13-4872-9662-137d26c8dcd7 req-3b4d8a26-859f-4b81-bdcd-3fcb7414f03a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Received event network-vif-deleted-0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.647 226239 INFO nova.compute.manager [req-200cd1fb-7d13-4872-9662-137d26c8dcd7 req-3b4d8a26-859f-4b81-bdcd-3fcb7414f03a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Neutron deleted interface 0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.647 226239 DEBUG nova.network.neutron [req-200cd1fb-7d13-4872-9662-137d26c8dcd7 req-3b4d8a26-859f-4b81-bdcd-3fcb7414f03a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.660 226239 INFO nova.compute.manager [-] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Took 4.37 seconds to deallocate network for instance.#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.685 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.738 226239 DEBUG nova.compute.manager [req-200cd1fb-7d13-4872-9662-137d26c8dcd7 req-3b4d8a26-859f-4b81-bdcd-3fcb7414f03a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Detach interface failed, port_id=0d2ef565-4b7f-48bb-ae2b-a5ec60a409b8, reason: Instance 07ab810a-aca3-4084-abb7-b092f658255b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.792 226239 DEBUG oslo_concurrency.lockutils [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:25 np0005603623 nova_compute[226235]: 2026-01-31 08:50:25.793 226239 DEBUG oslo_concurrency.lockutils [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e362 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:26 np0005603623 nova_compute[226235]: 2026-01-31 08:50:26.288 226239 DEBUG oslo_concurrency.processutils [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:26.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:26.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:50:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1299003526' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:50:26 np0005603623 nova_compute[226235]: 2026-01-31 08:50:26.721 226239 DEBUG oslo_concurrency.processutils [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:26 np0005603623 nova_compute[226235]: 2026-01-31 08:50:26.726 226239 DEBUG nova.compute.provider_tree [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:50:26 np0005603623 nova_compute[226235]: 2026-01-31 08:50:26.826 226239 DEBUG nova.scheduler.client.report [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:50:27 np0005603623 nova_compute[226235]: 2026-01-31 08:50:27.067 226239 DEBUG oslo_concurrency.lockutils [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:27 np0005603623 nova_compute[226235]: 2026-01-31 08:50:27.154 226239 INFO nova.scheduler.client.report [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Deleted allocations for instance 07ab810a-aca3-4084-abb7-b092f658255b#033[00m
Jan 31 03:50:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e363 e363: 3 total, 3 up, 3 in
Jan 31 03:50:27 np0005603623 nova_compute[226235]: 2026-01-31 08:50:27.365 226239 DEBUG oslo_concurrency.lockutils [None req-aa3e4c5b-b550-4a90-b82d-77e8af01d504 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "07ab810a-aca3-4084-abb7-b092f658255b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:50:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:50:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:50:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:50:27 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:50:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:28.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:50:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:28.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:50:29 np0005603623 nova_compute[226235]: 2026-01-31 08:50:29.389 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:29 np0005603623 podman[305361]: 2026-01-31 08:50:29.973910357 +0000 UTC m=+0.064600558 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:50:29 np0005603623 podman[305360]: 2026-01-31 08:50:29.984344445 +0000 UTC m=+0.075251532 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 03:50:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:30.142 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:30.142 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:30.142 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:30.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:50:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:30.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:50:30 np0005603623 nova_compute[226235]: 2026-01-31 08:50:30.686 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:50:31.395 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:50:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:32.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:50:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:50:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:32.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:50:33 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:50:33 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:50:34 np0005603623 nova_compute[226235]: 2026-01-31 08:50:34.026 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849419.0240421, 07ab810a-aca3-4084-abb7-b092f658255b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:50:34 np0005603623 nova_compute[226235]: 2026-01-31 08:50:34.027 226239 INFO nova.compute.manager [-] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:50:34 np0005603623 nova_compute[226235]: 2026-01-31 08:50:34.190 226239 DEBUG nova.compute.manager [None req-fa1ed3cc-46e4-4f34-9df0-399d99c7120c - - - - - -] [instance: 07ab810a-aca3-4084-abb7-b092f658255b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:50:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:34.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:34.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:34 np0005603623 nova_compute[226235]: 2026-01-31 08:50:34.392 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:35 np0005603623 nova_compute[226235]: 2026-01-31 08:50:35.688 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:36.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:50:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:36.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:50:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:38.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:38.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:39 np0005603623 nova_compute[226235]: 2026-01-31 08:50:39.395 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:40 np0005603623 nova_compute[226235]: 2026-01-31 08:50:40.289 226239 DEBUG nova.compute.manager [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 31 03:50:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:40.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:40.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:40 np0005603623 nova_compute[226235]: 2026-01-31 08:50:40.469 226239 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:40 np0005603623 nova_compute[226235]: 2026-01-31 08:50:40.469 226239 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:50:40 np0005603623 nova_compute[226235]: 2026-01-31 08:50:40.502 226239 DEBUG nova.objects.instance [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'pci_requests' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:50:40 np0005603623 nova_compute[226235]: 2026-01-31 08:50:40.526 226239 DEBUG nova.virt.hardware [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:50:40 np0005603623 nova_compute[226235]: 2026-01-31 08:50:40.526 226239 INFO nova.compute.claims [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Claim successful on node compute-2.ctlplane.example.com
Jan 31 03:50:40 np0005603623 nova_compute[226235]: 2026-01-31 08:50:40.527 226239 DEBUG nova.objects.instance [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'resources' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:50:40 np0005603623 nova_compute[226235]: 2026-01-31 08:50:40.548 226239 DEBUG nova.objects.instance [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'pci_devices' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:50:40 np0005603623 nova_compute[226235]: 2026-01-31 08:50:40.633 226239 INFO nova.compute.resource_tracker [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating resource usage from migration cccb9a8e-31b9-485b-8565-37667620f588
Jan 31 03:50:40 np0005603623 nova_compute[226235]: 2026-01-31 08:50:40.634 226239 DEBUG nova.compute.resource_tracker [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Starting to track incoming migration cccb9a8e-31b9-485b-8565-37667620f588 with flavor f75c4aee-d826-4343-a7e3-f06a4b21de52 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 31 03:50:40 np0005603623 nova_compute[226235]: 2026-01-31 08:50:40.689 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:40 np0005603623 nova_compute[226235]: 2026-01-31 08:50:40.755 226239 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:50:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:50:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3779543151' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:50:41 np0005603623 nova_compute[226235]: 2026-01-31 08:50:41.248 226239 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:50:41 np0005603623 nova_compute[226235]: 2026-01-31 08:50:41.253 226239 DEBUG nova.compute.provider_tree [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:50:41 np0005603623 nova_compute[226235]: 2026-01-31 08:50:41.308 226239 DEBUG nova.scheduler.client.report [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:50:41 np0005603623 nova_compute[226235]: 2026-01-31 08:50:41.344 226239 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:50:41 np0005603623 nova_compute[226235]: 2026-01-31 08:50:41.345 226239 INFO nova.compute.manager [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Migrating
Jan 31 03:50:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:42.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:50:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:42.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:50:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:44.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:44.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:44 np0005603623 nova_compute[226235]: 2026-01-31 08:50:44.398 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:45 np0005603623 nova_compute[226235]: 2026-01-31 08:50:45.691 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:46.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:50:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:46.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:50:46 np0005603623 systemd-logind[795]: New session 66 of user nova.
Jan 31 03:50:46 np0005603623 systemd[1]: Created slice User Slice of UID 42436.
Jan 31 03:50:46 np0005603623 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 31 03:50:46 np0005603623 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 31 03:50:46 np0005603623 systemd[1]: Starting User Manager for UID 42436...
Jan 31 03:50:46 np0005603623 systemd[305537]: Queued start job for default target Main User Target.
Jan 31 03:50:46 np0005603623 systemd[305537]: Created slice User Application Slice.
Jan 31 03:50:46 np0005603623 systemd[305537]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:50:46 np0005603623 systemd[305537]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 03:50:46 np0005603623 systemd[305537]: Reached target Paths.
Jan 31 03:50:46 np0005603623 systemd[305537]: Reached target Timers.
Jan 31 03:50:46 np0005603623 systemd[305537]: Starting D-Bus User Message Bus Socket...
Jan 31 03:50:46 np0005603623 systemd[305537]: Starting Create User's Volatile Files and Directories...
Jan 31 03:50:46 np0005603623 systemd[305537]: Finished Create User's Volatile Files and Directories.
Jan 31 03:50:46 np0005603623 systemd[305537]: Listening on D-Bus User Message Bus Socket.
Jan 31 03:50:46 np0005603623 systemd[305537]: Reached target Sockets.
Jan 31 03:50:46 np0005603623 systemd[305537]: Reached target Basic System.
Jan 31 03:50:46 np0005603623 systemd[305537]: Reached target Main User Target.
Jan 31 03:50:46 np0005603623 systemd[305537]: Startup finished in 116ms.
Jan 31 03:50:46 np0005603623 systemd[1]: Started User Manager for UID 42436.
Jan 31 03:50:46 np0005603623 systemd[1]: Started Session 66 of User nova.
Jan 31 03:50:47 np0005603623 systemd[1]: session-66.scope: Deactivated successfully.
Jan 31 03:50:47 np0005603623 systemd-logind[795]: Session 66 logged out. Waiting for processes to exit.
Jan 31 03:50:47 np0005603623 systemd-logind[795]: Removed session 66.
Jan 31 03:50:47 np0005603623 systemd-logind[795]: New session 68 of user nova.
Jan 31 03:50:47 np0005603623 systemd[1]: Started Session 68 of User nova.
Jan 31 03:50:47 np0005603623 systemd[1]: session-68.scope: Deactivated successfully.
Jan 31 03:50:47 np0005603623 systemd-logind[795]: Session 68 logged out. Waiting for processes to exit.
Jan 31 03:50:47 np0005603623 systemd-logind[795]: Removed session 68.
Jan 31 03:50:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:50:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:48.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:50:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:48.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:49 np0005603623 nova_compute[226235]: 2026-01-31 08:50:49.402 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:50.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:50:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:50.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:50:50 np0005603623 nova_compute[226235]: 2026-01-31 08:50:50.692 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:52.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:52.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:50:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:54.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:50:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:54.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.405 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.408 226239 INFO nova.network.neutron [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating port 3aae5c0f-f2ed-4352-a4e2-017466399641 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.431 226239 DEBUG nova.compute.manager [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-unplugged-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.431 226239 DEBUG oslo_concurrency.lockutils [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.431 226239 DEBUG oslo_concurrency.lockutils [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.431 226239 DEBUG oslo_concurrency.lockutils [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.432 226239 DEBUG nova.compute.manager [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] No waiting events found dispatching network-vif-unplugged-3aae5c0f-f2ed-4352-a4e2-017466399641 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.432 226239 WARNING nova.compute.manager [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received unexpected event network-vif-unplugged-3aae5c0f-f2ed-4352-a4e2-017466399641 for instance with vm_state active and task_state resize_migrated.
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.432 226239 DEBUG nova.compute.manager [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.432 226239 DEBUG oslo_concurrency.lockutils [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.432 226239 DEBUG oslo_concurrency.lockutils [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.432 226239 DEBUG oslo_concurrency.lockutils [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.433 226239 DEBUG nova.compute.manager [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] No waiting events found dispatching network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:50:54 np0005603623 nova_compute[226235]: 2026-01-31 08:50:54.433 226239 WARNING nova.compute.manager [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received unexpected event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 for instance with vm_state active and task_state resize_migrated.
Jan 31 03:50:55 np0005603623 nova_compute[226235]: 2026-01-31 08:50:55.704 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:56 np0005603623 nova_compute[226235]: 2026-01-31 08:50:56.289 226239 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:50:56 np0005603623 nova_compute[226235]: 2026-01-31 08:50:56.289 226239 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquired lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:50:56 np0005603623 nova_compute[226235]: 2026-01-31 08:50:56.290 226239 DEBUG nova.network.neutron [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:50:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:56.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:56.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:57 np0005603623 systemd[1]: Stopping User Manager for UID 42436...
Jan 31 03:50:57 np0005603623 systemd[305537]: Activating special unit Exit the Session...
Jan 31 03:50:57 np0005603623 systemd[305537]: Stopped target Main User Target.
Jan 31 03:50:57 np0005603623 systemd[305537]: Stopped target Basic System.
Jan 31 03:50:57 np0005603623 systemd[305537]: Stopped target Paths.
Jan 31 03:50:57 np0005603623 systemd[305537]: Stopped target Sockets.
Jan 31 03:50:57 np0005603623 systemd[305537]: Stopped target Timers.
Jan 31 03:50:57 np0005603623 systemd[305537]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:50:57 np0005603623 systemd[305537]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 31 03:50:57 np0005603623 systemd[305537]: Closed D-Bus User Message Bus Socket.
Jan 31 03:50:57 np0005603623 systemd[305537]: Stopped Create User's Volatile Files and Directories.
Jan 31 03:50:57 np0005603623 systemd[305537]: Removed slice User Application Slice.
Jan 31 03:50:57 np0005603623 systemd[305537]: Reached target Shutdown.
Jan 31 03:50:57 np0005603623 systemd[305537]: Finished Exit the Session.
Jan 31 03:50:57 np0005603623 systemd[305537]: Reached target Exit the Session.
Jan 31 03:50:57 np0005603623 systemd[1]: user@42436.service: Deactivated successfully.
Jan 31 03:50:57 np0005603623 systemd[1]: Stopped User Manager for UID 42436.
Jan 31 03:50:57 np0005603623 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 31 03:50:57 np0005603623 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 31 03:50:57 np0005603623 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 31 03:50:57 np0005603623 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 31 03:50:57 np0005603623 systemd[1]: Removed slice User Slice of UID 42436.
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.061 226239 DEBUG nova.network.neutron [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating instance_info_cache with network_info: [{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.228 226239 DEBUG nova.compute.manager [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-changed-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.229 226239 DEBUG nova.compute.manager [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Refreshing instance network info cache due to event network-changed-3aae5c0f-f2ed-4352-a4e2-017466399641. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.230 226239 DEBUG oslo_concurrency.lockutils [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.304 226239 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Releasing lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.310 226239 DEBUG oslo_concurrency.lockutils [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.311 226239 DEBUG nova.network.neutron [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Refreshing network info cache for port 3aae5c0f-f2ed-4352-a4e2-017466399641 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:50:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:58.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:50:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:58.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.699 226239 DEBUG os_brick.utils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.701 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.709 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.710 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[97f2ea59-c13a-4034-8f0e-e7c5342a8c2d]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.711 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.718 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.718 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[adf76c2f-84b8-41c9-94a2-69954f570f70]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.720 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.728 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.729 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[4b2bc375-7d14-4dd1-9fa5-96a48bb07bb9]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.730 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[03b72927-92ac-416a-bb35-d7b20cc1a56e]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.731 226239 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.750 226239 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "nvme version" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.752 226239 DEBUG os_brick.initiator.connectors.lightos [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.752 226239 DEBUG os_brick.initiator.connectors.lightos [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.752 226239 DEBUG os_brick.initiator.connectors.lightos [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:50:58 np0005603623 nova_compute[226235]: 2026-01-31 08:50:58.752 226239 DEBUG os_brick.utils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] <== get_connector_properties: return (52ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:50:59 np0005603623 nova_compute[226235]: 2026-01-31 08:50:59.408 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:00 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:51:00 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/357517992' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:51:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:00.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:00.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:00 np0005603623 nova_compute[226235]: 2026-01-31 08:51:00.669 226239 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 31 03:51:00 np0005603623 nova_compute[226235]: 2026-01-31 08:51:00.670 226239 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:51:00 np0005603623 nova_compute[226235]: 2026-01-31 08:51:00.671 226239 INFO nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Creating image(s)#033[00m
Jan 31 03:51:00 np0005603623 nova_compute[226235]: 2026-01-31 08:51:00.704 226239 DEBUG nova.storage.rbd_utils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] creating snapshot(nova-resize) on rbd image(4af4043c-8199-4d0f-acf9-38d029560167_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:51:00 np0005603623 nova_compute[226235]: 2026-01-31 08:51:00.749 226239 DEBUG nova.network.neutron [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updated VIF entry in instance network info cache for port 3aae5c0f-f2ed-4352-a4e2-017466399641. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:51:00 np0005603623 nova_compute[226235]: 2026-01-31 08:51:00.749 226239 DEBUG nova.network.neutron [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating instance_info_cache with network_info: [{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:51:00 np0005603623 nova_compute[226235]: 2026-01-31 08:51:00.750 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:00 np0005603623 nova_compute[226235]: 2026-01-31 08:51:00.781 226239 DEBUG oslo_concurrency.lockutils [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:51:00 np0005603623 podman[305660]: 2026-01-31 08:51:00.960208064 +0000 UTC m=+0.045779856 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 03:51:00 np0005603623 podman[305661]: 2026-01-31 08:51:00.984155256 +0000 UTC m=+0.069707898 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 31 03:51:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e364 e364: 3 total, 3 up, 3 in
Jan 31 03:51:01 np0005603623 nova_compute[226235]: 2026-01-31 08:51:01.935 226239 DEBUG nova.objects.instance [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:02 np0005603623 ovn_controller[133449]: 2026-01-31T08:51:02Z|00707|binding|INFO|Releasing lport 5976b74a-78ce-46e1-bd2c-76a2a502c8f5 from this chassis (sb_readonly=0)
Jan 31 03:51:02 np0005603623 ovn_controller[133449]: 2026-01-31T08:51:02Z|00708|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.066 226239 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.066 226239 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Ensure instance console log exists: /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.066 226239 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.067 226239 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.067 226239 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.069 226239 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Start _get_guest_xml network_info=[{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "vif_mac": "fa:16:3e:69:73:3d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': '8c5fdaa4-59ab-4375-b3a7-84593a1767e7', 'delete_on_termination': False, 'guest_format': None, 'device_type': 'disk', 'boot_index': None, 'disk_bus': 'virtio', 'mount_device': '/dev/vdb', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-12e9d9b2-8ec9-4b16-b334-60c0f639cb59', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '12e9d9b2-8ec9-4b16-b334-60c0f639cb59', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'attaching', 'instance': '4af4043c-8199-4d0f-acf9-38d029560167', 'attached_at': '2026-01-31T08:51:00.000000', 'detached_at': '', 'volume_id': '12e9d9b2-8ec9-4b16-b334-60c0f639cb59', 'multiattach': True, 'serial': '12e9d9b2-8ec9-4b16-b334-60c0f639cb59'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.073 226239 WARNING nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.074 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.079 226239 DEBUG nova.virt.libvirt.host [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.080 226239 DEBUG nova.virt.libvirt.host [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.083 226239 DEBUG nova.virt.libvirt.host [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.084 226239 DEBUG nova.virt.libvirt.host [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.084 226239 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.085 226239 DEBUG nova.virt.hardware [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f75c4aee-d826-4343-a7e3-f06a4b21de52',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.085 226239 DEBUG nova.virt.hardware [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.085 226239 DEBUG nova.virt.hardware [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.086 226239 DEBUG nova.virt.hardware [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.086 226239 DEBUG nova.virt.hardware [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.086 226239 DEBUG nova.virt.hardware [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.086 226239 DEBUG nova.virt.hardware [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.086 226239 DEBUG nova.virt.hardware [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.087 226239 DEBUG nova.virt.hardware [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.087 226239 DEBUG nova.virt.hardware [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.087 226239 DEBUG nova.virt.hardware [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.087 226239 DEBUG nova.objects.instance [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.104 226239 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:51:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:02.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:51:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:02.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:51:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:51:02 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/227161612' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.516 226239 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:02 np0005603623 nova_compute[226235]: 2026-01-31 08:51:02.551 226239 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:51:02 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/252367404' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.007 226239 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.064 226239 DEBUG nova.virt.libvirt.vif [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:48:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=165,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:49:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-bd0dktab',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:50:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=4af4043c-8199-4d0f-acf9-38d029560167,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "vif_mac": "fa:16:3e:69:73:3d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.065 226239 DEBUG nova.network.os_vif_util [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "vif_mac": "fa:16:3e:69:73:3d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.066 226239 DEBUG nova.network.os_vif_util [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.069 226239 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  <uuid>4af4043c-8199-4d0f-acf9-38d029560167</uuid>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  <name>instance-000000a5</name>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  <memory>196608</memory>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <nova:name>multiattach-server-1</nova:name>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:51:02</nova:creationTime>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.micro">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <nova:memory>192</nova:memory>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <nova:user uuid="a498364761ef428b99cac3f92e603385">tempest-AttachVolumeMultiAttachTest-1931311941-project-member</nova:user>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <nova:project uuid="8397e0fed04b4dabb57148d0924de2dc">tempest-AttachVolumeMultiAttachTest-1931311941</nova:project>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <nova:port uuid="3aae5c0f-f2ed-4352-a4e2-017466399641">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <entry name="serial">4af4043c-8199-4d0f-acf9-38d029560167</entry>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <entry name="uuid">4af4043c-8199-4d0f-acf9-38d029560167</entry>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4af4043c-8199-4d0f-acf9-38d029560167_disk">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4af4043c-8199-4d0f-acf9-38d029560167_disk.config">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-12e9d9b2-8ec9-4b16-b334-60c0f639cb59">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <target dev="vdb" bus="virtio"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <serial>12e9d9b2-8ec9-4b16-b334-60c0f639cb59</serial>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <shareable/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:69:73:3d"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <target dev="tap3aae5c0f-f2"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/console.log" append="off"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:51:03 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:51:03 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:51:03 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:51:03 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.070 226239 DEBUG nova.virt.libvirt.vif [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:48:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=165,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:49:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-bd0dktab',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:50:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=4af4043c-8199-4d0f-acf9-38d029560167,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "vif_mac": "fa:16:3e:69:73:3d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.070 226239 DEBUG nova.network.os_vif_util [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "vif_mac": "fa:16:3e:69:73:3d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.071 226239 DEBUG nova.network.os_vif_util [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.071 226239 DEBUG os_vif [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.072 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.072 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.072 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.076 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.077 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3aae5c0f-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.077 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3aae5c0f-f2, col_values=(('external_ids', {'iface-id': '3aae5c0f-f2ed-4352-a4e2-017466399641', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:69:73:3d', 'vm-uuid': '4af4043c-8199-4d0f-acf9-38d029560167'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.079 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:03 np0005603623 NetworkManager[48970]: <info>  [1769849463.0796] manager: (tap3aae5c0f-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.081 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.084 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.085 226239 INFO os_vif [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2')
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.250 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.250 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.251 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.251 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.251 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.274 226239 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.275 226239 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.275 226239 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.275 226239 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No VIF found with MAC fa:16:3e:69:73:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.275 226239 INFO nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Using config drive
Jan 31 03:51:03 np0005603623 kernel: tap3aae5c0f-f2: entered promiscuous mode
Jan 31 03:51:03 np0005603623 NetworkManager[48970]: <info>  [1769849463.3519] manager: (tap3aae5c0f-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/336)
Jan 31 03:51:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:51:03Z|00709|binding|INFO|Claiming lport 3aae5c0f-f2ed-4352-a4e2-017466399641 for this chassis.
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.353 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:51:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:51:03Z|00710|binding|INFO|3aae5c0f-f2ed-4352-a4e2-017466399641: Claiming fa:16:3e:69:73:3d 10.100.0.9
Jan 31 03:51:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:51:03Z|00711|binding|INFO|Setting lport 3aae5c0f-f2ed-4352-a4e2-017466399641 ovn-installed in OVS
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.360 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:51:03 np0005603623 systemd-udevd[305856]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:51:03 np0005603623 systemd-machined[194379]: New machine qemu-80-instance-000000a5.
Jan 31 03:51:03 np0005603623 NetworkManager[48970]: <info>  [1769849463.3883] device (tap3aae5c0f-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:51:03 np0005603623 NetworkManager[48970]: <info>  [1769849463.3887] device (tap3aae5c0f-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:51:03 np0005603623 systemd[1]: Started Virtual Machine qemu-80-instance-000000a5.
Jan 31 03:51:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:51:03Z|00712|binding|INFO|Setting lport 3aae5c0f-f2ed-4352-a4e2-017466399641 up in Southbound
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.425 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:73:3d 10.100.0.9'], port_security=['fa:16:3e:69:73:3d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4af4043c-8199-4d0f-acf9-38d029560167', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8397e0fed04b4dabb57148d0924de2dc', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'd636f3a4-efef-465a-ac59-8182d61336f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbd2578f-ff6e-4dc3-bc49-93cbf023edc5, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=3aae5c0f-f2ed-4352-a4e2-017466399641) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.426 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 3aae5c0f-f2ed-4352-a4e2-017466399641 in datapath 3afaf607-43a1-4d65-95fc-0a22b5c901d0 bound to our chassis
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.428 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3afaf607-43a1-4d65-95fc-0a22b5c901d0
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.440 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab04f30-0fd5-4cfc-b8e7-c51d3c7da5a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.455 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[883c2243-bb50-4e3b-8c13-116d7e4f03c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.459 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d3c96a43-94ee-4cd2-a26d-5bc1f9d920ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.473 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1eda3534-0b1c-482a-bfc1-06f7b98ac0f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.484 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[93a34404-5ff5-42dc-bc84-c32bc8a62b9f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3afaf607-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:84:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 831647, 'reachable_time': 39542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305871, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.494 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[61e8d76f-823e-48db-a396-0b083a8d1ef8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3afaf607-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 831657, 'tstamp': 831657}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 305872, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3afaf607-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 831659, 'tstamp': 831659}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 305872, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.495 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3afaf607-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.527 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.528 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.528 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3afaf607-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.529 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.529 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3afaf607-40, col_values=(('external_ids', {'iface-id': '0ed76a0a-650c-4ec7-a4d4-0e745236b047'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:51:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:03.529 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 03:51:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:51:03 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2994482711' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.675 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.891 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849463.891093, 4af4043c-8199-4d0f-acf9-38d029560167 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.892 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] VM Resumed (Lifecycle Event)
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.896 226239 DEBUG nova.compute.manager [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.899 226239 INFO nova.virt.libvirt.driver [-] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Instance running successfully.
Jan 31 03:51:03 np0005603623 virtqemud[225858]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.903 226239 DEBUG nova.virt.libvirt.guest [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.904 226239 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.939 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.939 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.943 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.944 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.944 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.948 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.949 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.949 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.956 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:51:03 np0005603623 nova_compute[226235]: 2026-01-31 08:51:03.960 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.093 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.095 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3777MB free_disk=20.63943099975586GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.096 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.096 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.141 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.142 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849463.891623, 4af4043c-8199-4d0f-acf9-38d029560167 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.142 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] VM Started (Lifecycle Event)
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.183 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.187 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.235 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Applying migration context for instance 4af4043c-8199-4d0f-acf9-38d029560167 as it has an incoming, in-progress migration cccb9a8e-31b9-485b-8565-37667620f588. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.236 226239 INFO nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating resource usage from migration cccb9a8e-31b9-485b-8565-37667620f588
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.239 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.257 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0edbf2b9-b76f-446b-85fa-09a4dcb37976 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.258 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance da4e355d-c6c2-446e-8eb1-d2ca8279e549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.258 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 4af4043c-8199-4d0f-acf9-38d029560167 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.259 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.259 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 31 03:51:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:04.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:04.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.440 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:51:04 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1677914974' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.888 226239 DEBUG nova.compute.manager [req-a102e8b4-8922-477b-b1c8-287a75107e6c req-21752168-8051-4065-afd7-3ea42ed47d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.889 226239 DEBUG oslo_concurrency.lockutils [req-a102e8b4-8922-477b-b1c8-287a75107e6c req-21752168-8051-4065-afd7-3ea42ed47d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.890 226239 DEBUG oslo_concurrency.lockutils [req-a102e8b4-8922-477b-b1c8-287a75107e6c req-21752168-8051-4065-afd7-3ea42ed47d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.890 226239 DEBUG oslo_concurrency.lockutils [req-a102e8b4-8922-477b-b1c8-287a75107e6c req-21752168-8051-4065-afd7-3ea42ed47d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.891 226239 DEBUG nova.compute.manager [req-a102e8b4-8922-477b-b1c8-287a75107e6c req-21752168-8051-4065-afd7-3ea42ed47d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] No waiting events found dispatching network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.891 226239 WARNING nova.compute.manager [req-a102e8b4-8922-477b-b1c8-287a75107e6c req-21752168-8051-4065-afd7-3ea42ed47d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received unexpected event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.899 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.905 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.936 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.969 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:51:04 np0005603623 nova_compute[226235]: 2026-01-31 08:51:04.971 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:05 np0005603623 nova_compute[226235]: 2026-01-31 08:51:05.709 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e364 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:06.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:51:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:06.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:51:07 np0005603623 nova_compute[226235]: 2026-01-31 08:51:07.254 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:07.253 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:51:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:07.254 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:51:07 np0005603623 nova_compute[226235]: 2026-01-31 08:51:07.783 226239 DEBUG nova.compute.manager [req-32cd674b-a893-4d86-9fb1-16de2cdeedae req-a7e9dd9e-4a59-4a93-8d1f-4b8815424367 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:07 np0005603623 nova_compute[226235]: 2026-01-31 08:51:07.783 226239 DEBUG oslo_concurrency.lockutils [req-32cd674b-a893-4d86-9fb1-16de2cdeedae req-a7e9dd9e-4a59-4a93-8d1f-4b8815424367 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:07 np0005603623 nova_compute[226235]: 2026-01-31 08:51:07.784 226239 DEBUG oslo_concurrency.lockutils [req-32cd674b-a893-4d86-9fb1-16de2cdeedae req-a7e9dd9e-4a59-4a93-8d1f-4b8815424367 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:07 np0005603623 nova_compute[226235]: 2026-01-31 08:51:07.784 226239 DEBUG oslo_concurrency.lockutils [req-32cd674b-a893-4d86-9fb1-16de2cdeedae req-a7e9dd9e-4a59-4a93-8d1f-4b8815424367 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:07 np0005603623 nova_compute[226235]: 2026-01-31 08:51:07.784 226239 DEBUG nova.compute.manager [req-32cd674b-a893-4d86-9fb1-16de2cdeedae req-a7e9dd9e-4a59-4a93-8d1f-4b8815424367 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] No waiting events found dispatching network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:51:07 np0005603623 nova_compute[226235]: 2026-01-31 08:51:07.785 226239 WARNING nova.compute.manager [req-32cd674b-a893-4d86-9fb1-16de2cdeedae req-a7e9dd9e-4a59-4a93-8d1f-4b8815424367 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received unexpected event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:51:08 np0005603623 nova_compute[226235]: 2026-01-31 08:51:08.095 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:08.256 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:08.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:08.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:10.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:10.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:10 np0005603623 nova_compute[226235]: 2026-01-31 08:51:10.710 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:10 np0005603623 nova_compute[226235]: 2026-01-31 08:51:10.971 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:10 np0005603623 nova_compute[226235]: 2026-01-31 08:51:10.972 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:51:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e364 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:12 np0005603623 nova_compute[226235]: 2026-01-31 08:51:12.073 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:51:12 np0005603623 nova_compute[226235]: 2026-01-31 08:51:12.074 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:51:12 np0005603623 nova_compute[226235]: 2026-01-31 08:51:12.074 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:51:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:51:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:12.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:51:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:12.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:13 np0005603623 nova_compute[226235]: 2026-01-31 08:51:13.096 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:14.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:14.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:51:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/824177169' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:51:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:51:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/824177169' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:51:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e365 e365: 3 total, 3 up, 3 in
Jan 31 03:51:15 np0005603623 nova_compute[226235]: 2026-01-31 08:51:15.351 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Updating instance_info_cache with network_info: [{"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:51:15 np0005603623 nova_compute[226235]: 2026-01-31 08:51:15.666 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-da4e355d-c6c2-446e-8eb1-d2ca8279e549" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:51:15 np0005603623 nova_compute[226235]: 2026-01-31 08:51:15.667 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:51:15 np0005603623 nova_compute[226235]: 2026-01-31 08:51:15.667 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:15 np0005603623 nova_compute[226235]: 2026-01-31 08:51:15.668 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:15 np0005603623 nova_compute[226235]: 2026-01-31 08:51:15.668 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:15 np0005603623 nova_compute[226235]: 2026-01-31 08:51:15.668 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:15 np0005603623 nova_compute[226235]: 2026-01-31 08:51:15.711 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:15 np0005603623 nova_compute[226235]: 2026-01-31 08:51:15.848 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e365 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:16.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:16.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:16 np0005603623 ovn_controller[133449]: 2026-01-31T08:51:16Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:69:73:3d 10.100.0.9
Jan 31 03:51:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:51:16 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2730741540' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:51:17 np0005603623 nova_compute[226235]: 2026-01-31 08:51:17.728 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:18 np0005603623 nova_compute[226235]: 2026-01-31 08:51:18.099 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:18.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:18.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:20.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:20.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:20 np0005603623 nova_compute[226235]: 2026-01-31 08:51:20.713 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:20 np0005603623 nova_compute[226235]: 2026-01-31 08:51:20.803 226239 DEBUG oslo_concurrency.lockutils [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:20 np0005603623 nova_compute[226235]: 2026-01-31 08:51:20.804 226239 DEBUG oslo_concurrency.lockutils [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:20 np0005603623 nova_compute[226235]: 2026-01-31 08:51:20.804 226239 DEBUG oslo_concurrency.lockutils [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:20 np0005603623 nova_compute[226235]: 2026-01-31 08:51:20.804 226239 DEBUG oslo_concurrency.lockutils [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:20 np0005603623 nova_compute[226235]: 2026-01-31 08:51:20.805 226239 DEBUG oslo_concurrency.lockutils [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:20 np0005603623 nova_compute[226235]: 2026-01-31 08:51:20.806 226239 INFO nova.compute.manager [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Terminating instance#033[00m
Jan 31 03:51:20 np0005603623 nova_compute[226235]: 2026-01-31 08:51:20.806 226239 DEBUG nova.compute.manager [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:51:20 np0005603623 kernel: tap5224a5be-3c (unregistering): left promiscuous mode
Jan 31 03:51:20 np0005603623 NetworkManager[48970]: <info>  [1769849480.8837] device (tap5224a5be-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:51:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:51:20Z|00713|binding|INFO|Releasing lport 5224a5be-3c99-4dab-acc8-a3d0488d9a42 from this chassis (sb_readonly=0)
Jan 31 03:51:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:51:20Z|00714|binding|INFO|Setting lport 5224a5be-3c99-4dab-acc8-a3d0488d9a42 down in Southbound
Jan 31 03:51:20 np0005603623 ovn_controller[133449]: 2026-01-31T08:51:20Z|00715|binding|INFO|Removing iface tap5224a5be-3c ovn-installed in OVS
Jan 31 03:51:20 np0005603623 nova_compute[226235]: 2026-01-31 08:51:20.889 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:20 np0005603623 nova_compute[226235]: 2026-01-31 08:51:20.890 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:20 np0005603623 nova_compute[226235]: 2026-01-31 08:51:20.898 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:20.899 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:96:10 10.100.0.4'], port_security=['fa:16:3e:15:96:10 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'da4e355d-c6c2-446e-8eb1-d2ca8279e549', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '6', 'neutron:security_group_ids': '4937dacf-809a-410a-970f-8b358db49b15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=5224a5be-3c99-4dab-acc8-a3d0488d9a42) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:51:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:20.903 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 5224a5be-3c99-4dab-acc8-a3d0488d9a42 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc unbound from our chassis#033[00m
Jan 31 03:51:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:20.911 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2128154c-0218-4f66-9509-e0db66eba3fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:51:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:20.950 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d771ed80-ca73-4feb-8f42-92d8b648afa3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:20.952 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc namespace which is not needed anymore#033[00m
Jan 31 03:51:20 np0005603623 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Jan 31 03:51:20 np0005603623 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a2.scope: Consumed 19.389s CPU time.
Jan 31 03:51:20 np0005603623 systemd-machined[194379]: Machine qemu-77-instance-000000a2 terminated.
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.025 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.028 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.038 226239 INFO nova.virt.libvirt.driver [-] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Instance destroyed successfully.#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.038 226239 DEBUG nova.objects.instance [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'resources' on Instance uuid da4e355d-c6c2-446e-8eb1-d2ca8279e549 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:21 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302876]: [NOTICE]   (302880) : haproxy version is 2.8.14-c23fe91
Jan 31 03:51:21 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302876]: [NOTICE]   (302880) : path to executable is /usr/sbin/haproxy
Jan 31 03:51:21 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302876]: [WARNING]  (302880) : Exiting Master process...
Jan 31 03:51:21 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302876]: [ALERT]    (302880) : Current worker (302882) exited with code 143 (Terminated)
Jan 31 03:51:21 np0005603623 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[302876]: [WARNING]  (302880) : All workers exited. Exiting... (0)
Jan 31 03:51:21 np0005603623 systemd[1]: libpod-48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7.scope: Deactivated successfully.
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.065 226239 DEBUG nova.virt.libvirt.vif [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1869656790',display_name='tempest-ServerRescueNegativeTestJSON-server-1869656790',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1869656790',id=162,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:47:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bf1c3d387dbe4191b4d05bdfca5959da',ramdisk_id='',reservation_id='r-l0z9mou0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-81297706',owner_user_name='tempest-ServerRescueNegativeTestJSON-81297706-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:47:56Z,user_data=None,user_id='aa7f893021af4a84b03d85b476dadfe0',uuid=da4e355d-c6c2-446e-8eb1-d2ca8279e549,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.066 226239 DEBUG nova.network.os_vif_util [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converting VIF {"id": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "address": "fa:16:3e:15:96:10", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5224a5be-3c", "ovs_interfaceid": "5224a5be-3c99-4dab-acc8-a3d0488d9a42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.067 226239 DEBUG nova.network.os_vif_util [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:96:10,bridge_name='br-int',has_traffic_filtering=True,id=5224a5be-3c99-4dab-acc8-a3d0488d9a42,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5224a5be-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.067 226239 DEBUG os_vif [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:96:10,bridge_name='br-int',has_traffic_filtering=True,id=5224a5be-3c99-4dab-acc8-a3d0488d9a42,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5224a5be-3c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.069 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.069 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5224a5be-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.071 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.072 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:21 np0005603623 podman[306041]: 2026-01-31 08:51:21.073959619 +0000 UTC m=+0.043050732 container died 48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.074 226239 INFO os_vif [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:96:10,bridge_name='br-int',has_traffic_filtering=True,id=5224a5be-3c99-4dab-acc8-a3d0488d9a42,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5224a5be-3c')#033[00m
Jan 31 03:51:21 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7-userdata-shm.mount: Deactivated successfully.
Jan 31 03:51:21 np0005603623 systemd[1]: var-lib-containers-storage-overlay-af6961a16a3922407915a4aeb0886c8d731b95c44887313e6d243b222e185681-merged.mount: Deactivated successfully.
Jan 31 03:51:21 np0005603623 podman[306041]: 2026-01-31 08:51:21.110820745 +0000 UTC m=+0.079911858 container cleanup 48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:51:21 np0005603623 systemd[1]: libpod-conmon-48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7.scope: Deactivated successfully.
Jan 31 03:51:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e365 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:21 np0005603623 podman[306099]: 2026-01-31 08:51:21.160232686 +0000 UTC m=+0.036387043 container remove 48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:51:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:21.164 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b3bdc5b8-0db6-4d0b-8443-21b9381da799]: (4, ('Sat Jan 31 08:51:21 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc (48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7)\n48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7\nSat Jan 31 08:51:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc (48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7)\n48fa582d356ec680dc52136e1e18162b6970df1b5a6e57304c8d510d5cd5b9c7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:21.166 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[82b0daa2-2319-44e7-9932-ff863553e6ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:21.167 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.168 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:21 np0005603623 kernel: tap2128154c-00: left promiscuous mode
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.173 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:21.176 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3109b491-fed9-47f6-9994-1dcd2b682ea1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:21.196 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[69ad3b6d-8713-4ab2-832f-c1a9813ff9a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:21.197 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[91fe98d9-c6af-462a-aab9-b14840e60f4c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:21.209 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a85e047e-fbc0-4ed0-bd32-bb75fc0e2f19]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 841571, 'reachable_time': 26360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 306114, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:21 np0005603623 systemd[1]: run-netns-ovnmeta\x2d2128154c\x2d0218\x2d4f66\x2d9509\x2de0db66eba3fc.mount: Deactivated successfully.
Jan 31 03:51:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:21.211 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:51:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:21.211 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[9656fb05-189c-463a-86f4-45622ff93f9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.456 226239 DEBUG nova.compute.manager [req-c635f7db-3217-4ccd-9d52-763fadd4c449 req-d2edcafb-e9ed-4a7f-8bd1-297db5334a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received event network-vif-unplugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.457 226239 DEBUG oslo_concurrency.lockutils [req-c635f7db-3217-4ccd-9d52-763fadd4c449 req-d2edcafb-e9ed-4a7f-8bd1-297db5334a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.457 226239 DEBUG oslo_concurrency.lockutils [req-c635f7db-3217-4ccd-9d52-763fadd4c449 req-d2edcafb-e9ed-4a7f-8bd1-297db5334a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.457 226239 DEBUG oslo_concurrency.lockutils [req-c635f7db-3217-4ccd-9d52-763fadd4c449 req-d2edcafb-e9ed-4a7f-8bd1-297db5334a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.457 226239 DEBUG nova.compute.manager [req-c635f7db-3217-4ccd-9d52-763fadd4c449 req-d2edcafb-e9ed-4a7f-8bd1-297db5334a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] No waiting events found dispatching network-vif-unplugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:51:21 np0005603623 nova_compute[226235]: 2026-01-31 08:51:21.458 226239 DEBUG nova.compute.manager [req-c635f7db-3217-4ccd-9d52-763fadd4c449 req-d2edcafb-e9ed-4a7f-8bd1-297db5334a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received event network-vif-unplugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:51:22 np0005603623 nova_compute[226235]: 2026-01-31 08:51:22.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e366 e366: 3 total, 3 up, 3 in
Jan 31 03:51:22 np0005603623 nova_compute[226235]: 2026-01-31 08:51:22.200 226239 INFO nova.virt.libvirt.driver [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Deleting instance files /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549_del#033[00m
Jan 31 03:51:22 np0005603623 nova_compute[226235]: 2026-01-31 08:51:22.201 226239 INFO nova.virt.libvirt.driver [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Deletion of /var/lib/nova/instances/da4e355d-c6c2-446e-8eb1-d2ca8279e549_del complete#033[00m
Jan 31 03:51:22 np0005603623 nova_compute[226235]: 2026-01-31 08:51:22.289 226239 INFO nova.compute.manager [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Took 1.48 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:51:22 np0005603623 nova_compute[226235]: 2026-01-31 08:51:22.289 226239 DEBUG oslo.service.loopingcall [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:51:22 np0005603623 nova_compute[226235]: 2026-01-31 08:51:22.290 226239 DEBUG nova.compute.manager [-] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:51:22 np0005603623 nova_compute[226235]: 2026-01-31 08:51:22.290 226239 DEBUG nova.network.neutron [-] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:51:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:51:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:22.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:51:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:22.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:23 np0005603623 nova_compute[226235]: 2026-01-31 08:51:23.814 226239 DEBUG nova.compute.manager [req-db2acf70-e49c-4a84-b9a7-57fe73fdfe71 req-2f033003-a409-4ee7-af8d-3686210e3f76 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:23 np0005603623 nova_compute[226235]: 2026-01-31 08:51:23.814 226239 DEBUG oslo_concurrency.lockutils [req-db2acf70-e49c-4a84-b9a7-57fe73fdfe71 req-2f033003-a409-4ee7-af8d-3686210e3f76 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:23 np0005603623 nova_compute[226235]: 2026-01-31 08:51:23.814 226239 DEBUG oslo_concurrency.lockutils [req-db2acf70-e49c-4a84-b9a7-57fe73fdfe71 req-2f033003-a409-4ee7-af8d-3686210e3f76 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:23 np0005603623 nova_compute[226235]: 2026-01-31 08:51:23.814 226239 DEBUG oslo_concurrency.lockutils [req-db2acf70-e49c-4a84-b9a7-57fe73fdfe71 req-2f033003-a409-4ee7-af8d-3686210e3f76 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:23 np0005603623 nova_compute[226235]: 2026-01-31 08:51:23.814 226239 DEBUG nova.compute.manager [req-db2acf70-e49c-4a84-b9a7-57fe73fdfe71 req-2f033003-a409-4ee7-af8d-3686210e3f76 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] No waiting events found dispatching network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:51:23 np0005603623 nova_compute[226235]: 2026-01-31 08:51:23.815 226239 WARNING nova.compute.manager [req-db2acf70-e49c-4a84-b9a7-57fe73fdfe71 req-2f033003-a409-4ee7-af8d-3686210e3f76 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received unexpected event network-vif-plugged-5224a5be-3c99-4dab-acc8-a3d0488d9a42 for instance with vm_state rescued and task_state deleting.#033[00m
Jan 31 03:51:23 np0005603623 nova_compute[226235]: 2026-01-31 08:51:23.815 226239 DEBUG nova.compute.manager [req-78101796-5fd3-42f1-a429-7386125259ac req-ff3b8ceb-81e4-4aa4-8621-c99b9fe36d11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-changed-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:23 np0005603623 nova_compute[226235]: 2026-01-31 08:51:23.815 226239 DEBUG nova.compute.manager [req-78101796-5fd3-42f1-a429-7386125259ac req-ff3b8ceb-81e4-4aa4-8621-c99b9fe36d11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Refreshing instance network info cache due to event network-changed-3aae5c0f-f2ed-4352-a4e2-017466399641. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:51:23 np0005603623 nova_compute[226235]: 2026-01-31 08:51:23.816 226239 DEBUG oslo_concurrency.lockutils [req-78101796-5fd3-42f1-a429-7386125259ac req-ff3b8ceb-81e4-4aa4-8621-c99b9fe36d11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:51:23 np0005603623 nova_compute[226235]: 2026-01-31 08:51:23.816 226239 DEBUG oslo_concurrency.lockutils [req-78101796-5fd3-42f1-a429-7386125259ac req-ff3b8ceb-81e4-4aa4-8621-c99b9fe36d11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:51:23 np0005603623 nova_compute[226235]: 2026-01-31 08:51:23.816 226239 DEBUG nova.network.neutron [req-78101796-5fd3-42f1-a429-7386125259ac req-ff3b8ceb-81e4-4aa4-8621-c99b9fe36d11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Refreshing network info cache for port 3aae5c0f-f2ed-4352-a4e2-017466399641 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:51:23 np0005603623 nova_compute[226235]: 2026-01-31 08:51:23.876 226239 DEBUG nova.network.neutron [-] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:51:24 np0005603623 nova_compute[226235]: 2026-01-31 08:51:24.213 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:24.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:24.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:24 np0005603623 nova_compute[226235]: 2026-01-31 08:51:24.423 226239 INFO nova.compute.manager [-] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Took 2.13 seconds to deallocate network for instance.#033[00m
Jan 31 03:51:24 np0005603623 nova_compute[226235]: 2026-01-31 08:51:24.947 226239 DEBUG oslo_concurrency.lockutils [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:24 np0005603623 nova_compute[226235]: 2026-01-31 08:51:24.948 226239 DEBUG oslo_concurrency.lockutils [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:25 np0005603623 nova_compute[226235]: 2026-01-31 08:51:25.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:25 np0005603623 nova_compute[226235]: 2026-01-31 08:51:25.716 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:26 np0005603623 nova_compute[226235]: 2026-01-31 08:51:26.071 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:26.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:26 np0005603623 nova_compute[226235]: 2026-01-31 08:51:26.373 226239 DEBUG nova.compute.manager [req-87bb1287-43db-41cc-bdda-6b1aa3c55fca req-927bfe9c-ebb9-4fec-9f8c-7450cbeb33a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Received event network-vif-deleted-5224a5be-3c99-4dab-acc8-a3d0488d9a42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:26 np0005603623 nova_compute[226235]: 2026-01-31 08:51:26.383 226239 DEBUG oslo_concurrency.processutils [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:26.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:51:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3441348664' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:51:26 np0005603623 nova_compute[226235]: 2026-01-31 08:51:26.867 226239 DEBUG oslo_concurrency.processutils [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:26 np0005603623 nova_compute[226235]: 2026-01-31 08:51:26.874 226239 DEBUG nova.compute.provider_tree [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:51:26 np0005603623 nova_compute[226235]: 2026-01-31 08:51:26.895 226239 DEBUG nova.scheduler.client.report [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:51:26 np0005603623 nova_compute[226235]: 2026-01-31 08:51:26.918 226239 DEBUG oslo_concurrency.lockutils [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.970s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:26 np0005603623 nova_compute[226235]: 2026-01-31 08:51:26.952 226239 INFO nova.scheduler.client.report [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Deleted allocations for instance da4e355d-c6c2-446e-8eb1-d2ca8279e549#033[00m
Jan 31 03:51:27 np0005603623 nova_compute[226235]: 2026-01-31 08:51:27.096 226239 DEBUG oslo_concurrency.lockutils [None req-59b0625e-97ce-47c1-9111-ac7b84e7472a aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "da4e355d-c6c2-446e-8eb1-d2ca8279e549" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:27 np0005603623 nova_compute[226235]: 2026-01-31 08:51:27.766 226239 DEBUG nova.network.neutron [req-78101796-5fd3-42f1-a429-7386125259ac req-ff3b8ceb-81e4-4aa4-8621-c99b9fe36d11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updated VIF entry in instance network info cache for port 3aae5c0f-f2ed-4352-a4e2-017466399641. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:51:27 np0005603623 nova_compute[226235]: 2026-01-31 08:51:27.766 226239 DEBUG nova.network.neutron [req-78101796-5fd3-42f1-a429-7386125259ac req-ff3b8ceb-81e4-4aa4-8621-c99b9fe36d11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating instance_info_cache with network_info: [{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:51:27 np0005603623 nova_compute[226235]: 2026-01-31 08:51:27.822 226239 DEBUG oslo_concurrency.lockutils [req-78101796-5fd3-42f1-a429-7386125259ac req-ff3b8ceb-81e4-4aa4-8621-c99b9fe36d11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.156 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.257 226239 DEBUG oslo_concurrency.lockutils [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.258 226239 DEBUG oslo_concurrency.lockutils [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.278 226239 INFO nova.compute.manager [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Detaching volume 12e9d9b2-8ec9-4b16-b334-60c0f639cb59#033[00m
Jan 31 03:51:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:28.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:28.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.477 226239 INFO nova.virt.block_device [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Attempting to driver detach volume 12e9d9b2-8ec9-4b16-b334-60c0f639cb59 from mountpoint /dev/vdb#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.486 226239 DEBUG nova.virt.libvirt.driver [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Attempting to detach device vdb from instance 4af4043c-8199-4d0f-acf9-38d029560167 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.487 226239 DEBUG nova.virt.libvirt.guest [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-12e9d9b2-8ec9-4b16-b334-60c0f639cb59">
Jan 31 03:51:28 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  <serial>12e9d9b2-8ec9-4b16-b334-60c0f639cb59</serial>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  <shareable/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:51:28 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.497 226239 INFO nova.virt.libvirt.driver [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully detached device vdb from instance 4af4043c-8199-4d0f-acf9-38d029560167 from the persistent domain config.#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.498 226239 DEBUG nova.virt.libvirt.driver [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 4af4043c-8199-4d0f-acf9-38d029560167 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.498 226239 DEBUG nova.virt.libvirt.guest [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-12e9d9b2-8ec9-4b16-b334-60c0f639cb59">
Jan 31 03:51:28 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  <serial>12e9d9b2-8ec9-4b16-b334-60c0f639cb59</serial>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  <shareable/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 31 03:51:28 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:51:28 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.713 226239 DEBUG nova.virt.libvirt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Received event <DeviceRemovedEvent: 1769849488.7135756, 4af4043c-8199-4d0f-acf9-38d029560167 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.715 226239 DEBUG nova.virt.libvirt.driver [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 4af4043c-8199-4d0f-acf9-38d029560167 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:51:28 np0005603623 nova_compute[226235]: 2026-01-31 08:51:28.717 226239 INFO nova.virt.libvirt.driver [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully detached device vdb from instance 4af4043c-8199-4d0f-acf9-38d029560167 from the live domain config.#033[00m
Jan 31 03:51:29 np0005603623 nova_compute[226235]: 2026-01-31 08:51:29.203 226239 DEBUG nova.objects.instance [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'flavor' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:29 np0005603623 nova_compute[226235]: 2026-01-31 08:51:29.264 226239 DEBUG oslo_concurrency.lockutils [None req-ce0a4889-4d8a-49b1-9989-97d5d75c7058 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:30.143 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:30.143 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:51:30.144 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:30.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:30.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:30 np0005603623 nova_compute[226235]: 2026-01-31 08:51:30.718 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:31 np0005603623 nova_compute[226235]: 2026-01-31 08:51:31.072 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:31 np0005603623 podman[306146]: 2026-01-31 08:51:31.967773827 +0000 UTC m=+0.056342129 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:51:31 np0005603623 podman[306147]: 2026-01-31 08:51:31.989336374 +0000 UTC m=+0.077731981 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 31 03:51:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:32.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:32.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:33 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:51:33 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:51:33 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:51:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:34.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:34.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:35 np0005603623 nova_compute[226235]: 2026-01-31 08:51:35.720 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:51:36 np0005603623 nova_compute[226235]: 2026-01-31 08:51:36.037 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849481.0365686, da4e355d-c6c2-446e-8eb1-d2ca8279e549 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:51:36 np0005603623 nova_compute[226235]: 2026-01-31 08:51:36.037 226239 INFO nova.compute.manager [-] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] VM Stopped (Lifecycle Event)
Jan 31 03:51:36 np0005603623 nova_compute[226235]: 2026-01-31 08:51:36.073 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:51:36 np0005603623 nova_compute[226235]: 2026-01-31 08:51:36.078 226239 DEBUG nova.compute.manager [None req-4cadc0ff-602d-4d91-a1e8-8298b50a1bba - - - - - -] [instance: da4e355d-c6c2-446e-8eb1-d2ca8279e549] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:51:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:36.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:36.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:36 np0005603623 ovn_controller[133449]: 2026-01-31T08:51:36Z|00716|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:51:36 np0005603623 nova_compute[226235]: 2026-01-31 08:51:36.901 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:51:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:38.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:38.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:51:39 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:51:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:51:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:40.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:51:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:40.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:40 np0005603623 nova_compute[226235]: 2026-01-31 08:51:40.722 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:51:41 np0005603623 nova_compute[226235]: 2026-01-31 08:51:41.074 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:51:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:42.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:42.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:43 np0005603623 ceph-osd[79732]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 31 03:51:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:44.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:51:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:44.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:51:45 np0005603623 nova_compute[226235]: 2026-01-31 08:51:45.724 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:51:45 np0005603623 nova_compute[226235]: 2026-01-31 08:51:45.961 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.004 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Triggering sync for uuid 0edbf2b9-b76f-446b-85fa-09a4dcb37976 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.005 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Triggering sync for uuid 4af4043c-8199-4d0f-acf9-38d029560167 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.005 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.005 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.006 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.006 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "4af4043c-8199-4d0f-acf9-38d029560167" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.066 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "4af4043c-8199-4d0f-acf9-38d029560167" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.068 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.076 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:51:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:46.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.385 226239 DEBUG oslo_concurrency.lockutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Acquiring lock "5a226745-c1af-4252-846b-95c02008d4db" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.385 226239 DEBUG oslo_concurrency.lockutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "5a226745-c1af-4252-846b-95c02008d4db" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.420 226239 DEBUG nova.compute.manager [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 03:51:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:46.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.557 226239 DEBUG oslo_concurrency.lockutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.558 226239 DEBUG oslo_concurrency.lockutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.567 226239 DEBUG nova.virt.hardware [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.567 226239 INFO nova.compute.claims [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Claim successful on node compute-2.ctlplane.example.com
Jan 31 03:51:46 np0005603623 nova_compute[226235]: 2026-01-31 08:51:46.810 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.189 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.190 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 31 03:51:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:51:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3516743228' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.212 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.218 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.222 226239 DEBUG nova.compute.provider_tree [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.238 226239 DEBUG nova.scheduler.client.report [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.268 226239 DEBUG oslo_concurrency.lockutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.269 226239 DEBUG nova.compute.manager [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.350 226239 DEBUG nova.compute.manager [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.395 226239 INFO nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.426 226239 DEBUG nova.compute.manager [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.601 226239 DEBUG nova.compute.manager [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.603 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.603 226239 INFO nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Creating image(s)
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.627 226239 DEBUG nova.storage.rbd_utils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.650 226239 DEBUG nova.storage.rbd_utils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.673 226239 DEBUG nova.storage.rbd_utils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.676 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.729 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.730 226239 DEBUG oslo_concurrency.lockutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.731 226239 DEBUG oslo_concurrency.lockutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.731 226239 DEBUG oslo_concurrency.lockutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.755 226239 DEBUG nova.storage.rbd_utils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:51:47 np0005603623 nova_compute[226235]: 2026-01-31 08:51:47.758 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 5a226745-c1af-4252-846b-95c02008d4db_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.046 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 5a226745-c1af-4252-846b-95c02008d4db_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.287s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.122 226239 DEBUG nova.storage.rbd_utils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] resizing rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.233 226239 DEBUG nova.objects.instance [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lazy-loading 'migration_context' on Instance uuid 5a226745-c1af-4252-846b-95c02008d4db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.287 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.287 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Ensure instance console log exists: /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.288 226239 DEBUG oslo_concurrency.lockutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.288 226239 DEBUG oslo_concurrency.lockutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.288 226239 DEBUG oslo_concurrency.lockutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.290 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.296 226239 WARNING nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.300 226239 DEBUG nova.virt.libvirt.host [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.301 226239 DEBUG nova.virt.libvirt.host [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.303 226239 DEBUG nova.virt.libvirt.host [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.304 226239 DEBUG nova.virt.libvirt.host [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.305 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.306 226239 DEBUG nova.virt.hardware [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.306 226239 DEBUG nova.virt.hardware [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.306 226239 DEBUG nova.virt.hardware [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.307 226239 DEBUG nova.virt.hardware [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.307 226239 DEBUG nova.virt.hardware [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.307 226239 DEBUG nova.virt.hardware [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.308 226239 DEBUG nova.virt.hardware [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.308 226239 DEBUG nova.virt.hardware [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.308 226239 DEBUG nova.virt.hardware [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.309 226239 DEBUG nova.virt.hardware [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.309 226239 DEBUG nova.virt.hardware [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.312 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:48.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:48.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:51:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/488145696' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.798 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
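The `ceph mon dump --format=json` call above is how nova's `rbd_utils` discovers the monitor addresses that later appear as `<host name=... port=.../>` elements in the guest disk XML. A minimal sketch of that parsing step — the JSON payload below is a hypothetical sample (only the fields read by the code, populated with the monitor IPs seen elsewhere in this log), not the actual command output:

```python
import json

# Hypothetical `ceph mon dump --format=json` payload: illustrative only,
# using the three monitor IPs that appear in this log's disk XML.
MON_DUMP = json.dumps({
    "mons": [
        {"name": "a", "public_addr": "192.168.122.100:6789/0"},
        {"name": "b", "public_addr": "192.168.122.102:6789/0"},
        {"name": "c", "public_addr": "192.168.122.101:6789/0"},
    ]
})

def mon_addrs(dump_json):
    """Split each mon's public_addr into (hosts, ports) lists, roughly as
    nova does before rendering the rbd <source> hosts in the domain XML."""
    hosts, ports = [], []
    for mon in json.loads(dump_json)["mons"]:
        addr = mon["public_addr"].split("/")[0]   # drop the trailing /nonce
        host, _, port = addr.rpartition(":")      # split on the last colon
        hosts.append(host.strip("[]"))            # unwrap IPv6 brackets if any
        ports.append(port)
    return hosts, ports
```

The field layout of the real `mon dump` output varies across Ceph releases; treat this as a sketch of the shape of the transformation, not of the exact schema.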
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.824 226239 DEBUG nova.storage.rbd_utils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:51:48 np0005603623 nova_compute[226235]: 2026-01-31 08:51:48.828 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:51:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3653452823' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:51:49 np0005603623 nova_compute[226235]: 2026-01-31 08:51:49.229 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:49 np0005603623 nova_compute[226235]: 2026-01-31 08:51:49.231 226239 DEBUG nova.objects.instance [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5a226745-c1af-4252-846b-95c02008d4db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:49 np0005603623 nova_compute[226235]: 2026-01-31 08:51:49.355 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  <uuid>5a226745-c1af-4252-846b-95c02008d4db</uuid>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  <name>instance-000000ab</name>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerShowV254Test-server-1343394802</nova:name>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:51:48</nova:creationTime>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <nova:user uuid="1b85a92acbd44357b341ea45817e0d54">tempest-ServerShowV254Test-498366842-project-member</nova:user>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <nova:project uuid="d25e38c8ee7e4c819da74af820219c54">tempest-ServerShowV254Test-498366842</nova:project>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <entry name="serial">5a226745-c1af-4252-846b-95c02008d4db</entry>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <entry name="uuid">5a226745-c1af-4252-846b-95c02008d4db</entry>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/5a226745-c1af-4252-846b-95c02008d4db_disk">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/5a226745-c1af-4252-846b-95c02008d4db_disk.config">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/console.log" append="off"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:51:49 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:51:49 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:51:49 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:51:49 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
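The domain XML dumped above can be inspected programmatically, which is often easier than reading it out of the journal. A minimal sketch using only the standard library; `DOMAIN_XML` is a trimmed reconstruction of the logged document (values copied from the log, most elements omitted), and `rbd_disks` is an illustrative helper, not a nova API:

```python
import xml.etree.ElementTree as ET

# Trimmed reconstruction of the <domain> XML logged by _get_guest_xml above:
# only the elements inspected below, with values taken from the log.
DOMAIN_XML = """
<domain type="kvm">
  <uuid>5a226745-c1af-4252-846b-95c02008d4db</uuid>
  <name>instance-000000ab</name>
  <devices>
    <disk type="network" device="disk">
      <source protocol="rbd" name="vms/5a226745-c1af-4252-846b-95c02008d4db_disk">
        <host name="192.168.122.100" port="6789"/>
        <host name="192.168.122.102" port="6789"/>
        <host name="192.168.122.101" port="6789"/>
      </source>
      <target dev="vda" bus="virtio"/>
    </disk>
  </devices>
</domain>
"""

def rbd_disks(xml_text):
    """Return (target_dev, rbd_image, [mon 'host:port', ...]) for each
    network-backed disk in a libvirt domain document."""
    root = ET.fromstring(xml_text)
    out = []
    for disk in root.findall("./devices/disk[@type='network']"):
        src = disk.find("source")
        tgt = disk.find("target")
        mons = ["%s:%s" % (h.get("name"), h.get("port"))
                for h in src.findall("host")]
        out.append((tgt.get("dev"), src.get("name"), mons))
    return out
```

Pointing the same helper at `virsh dumpxml instance-000000ab` output would show the config-drive cdrom (`sda` on the sata bus) as a second network disk alongside `vda`.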
Jan 31 03:51:49 np0005603623 nova_compute[226235]: 2026-01-31 08:51:49.553 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:51:49 np0005603623 nova_compute[226235]: 2026-01-31 08:51:49.554 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:51:49 np0005603623 nova_compute[226235]: 2026-01-31 08:51:49.554 226239 INFO nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Using config drive#033[00m
Jan 31 03:51:49 np0005603623 nova_compute[226235]: 2026-01-31 08:51:49.579 226239 DEBUG nova.storage.rbd_utils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:51:49 np0005603623 nova_compute[226235]: 2026-01-31 08:51:49.984 226239 INFO nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Creating config drive at /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/disk.config#033[00m
Jan 31 03:51:49 np0005603623 nova_compute[226235]: 2026-01-31 08:51:49.987 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpp27o5fa9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.114 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpp27o5fa9" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.136 226239 DEBUG nova.storage.rbd_utils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.138 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/disk.config 5a226745-c1af-4252-846b-95c02008d4db_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.290 226239 DEBUG oslo_concurrency.processutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/disk.config 5a226745-c1af-4252-846b-95c02008d4db_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.291 226239 INFO nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Deleting local config drive /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/disk.config because it was imported into RBD.#033[00m
Jan 31 03:51:50 np0005603623 systemd-machined[194379]: New machine qemu-81-instance-000000ab.
Jan 31 03:51:50 np0005603623 systemd[1]: Started Virtual Machine qemu-81-instance-000000ab.
Jan 31 03:51:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:50.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:50.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.726 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.852 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849510.8520315, 5a226745-c1af-4252-846b-95c02008d4db => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.853 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.857 226239 DEBUG nova.compute.manager [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.857 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.861 226239 INFO nova.virt.libvirt.driver [-] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Instance spawned successfully.#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.861 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.893 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.897 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.900 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.901 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.901 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.902 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.902 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.903 226239 DEBUG nova.virt.libvirt.driver [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.936 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.936 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849510.8532805, 5a226745-c1af-4252-846b-95c02008d4db => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:51:50 np0005603623 nova_compute[226235]: 2026-01-31 08:51:50.936 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] VM Started (Lifecycle Event)#033[00m
Jan 31 03:51:51 np0005603623 nova_compute[226235]: 2026-01-31 08:51:51.005 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:51:51 np0005603623 nova_compute[226235]: 2026-01-31 08:51:51.008 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:51:51 np0005603623 nova_compute[226235]: 2026-01-31 08:51:51.015 226239 INFO nova.compute.manager [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Took 3.41 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:51:51 np0005603623 nova_compute[226235]: 2026-01-31 08:51:51.016 226239 DEBUG nova.compute.manager [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:51:51 np0005603623 nova_compute[226235]: 2026-01-31 08:51:51.063 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:51:51 np0005603623 nova_compute[226235]: 2026-01-31 08:51:51.078 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:51 np0005603623 nova_compute[226235]: 2026-01-31 08:51:51.133 226239 INFO nova.compute.manager [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Took 4.64 seconds to build instance.#033[00m
Jan 31 03:51:51 np0005603623 nova_compute[226235]: 2026-01-31 08:51:51.159 226239 DEBUG oslo_concurrency.lockutils [None req-58269860-f2ac-4b47-bae1-10debe60aa13 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "5a226745-c1af-4252-846b-95c02008d4db" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:52.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:52.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:53 np0005603623 nova_compute[226235]: 2026-01-31 08:51:53.301 226239 INFO nova.compute.manager [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Rebuilding instance#033[00m
Jan 31 03:51:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:54.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:51:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:54.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:54 np0005603623 nova_compute[226235]: 2026-01-31 08:51:54.874 226239 DEBUG nova.objects.instance [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5a226745-c1af-4252-846b-95c02008d4db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:54 np0005603623 nova_compute[226235]: 2026-01-31 08:51:54.946 226239 DEBUG nova.compute.manager [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:51:55 np0005603623 nova_compute[226235]: 2026-01-31 08:51:55.288 226239 DEBUG nova.objects.instance [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lazy-loading 'pci_requests' on Instance uuid 5a226745-c1af-4252-846b-95c02008d4db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:55 np0005603623 nova_compute[226235]: 2026-01-31 08:51:55.330 226239 DEBUG nova.objects.instance [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5a226745-c1af-4252-846b-95c02008d4db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:55 np0005603623 nova_compute[226235]: 2026-01-31 08:51:55.364 226239 DEBUG nova.objects.instance [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lazy-loading 'resources' on Instance uuid 5a226745-c1af-4252-846b-95c02008d4db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:55 np0005603623 nova_compute[226235]: 2026-01-31 08:51:55.382 226239 DEBUG nova.objects.instance [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lazy-loading 'migration_context' on Instance uuid 5a226745-c1af-4252-846b-95c02008d4db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:55 np0005603623 nova_compute[226235]: 2026-01-31 08:51:55.425 226239 DEBUG nova.objects.instance [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 03:51:55 np0005603623 nova_compute[226235]: 2026-01-31 08:51:55.428 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:51:55 np0005603623 nova_compute[226235]: 2026-01-31 08:51:55.761 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:56 np0005603623 nova_compute[226235]: 2026-01-31 08:51:56.080 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:56.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:56.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:58.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:51:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:51:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:58.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:00.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:00.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:00 np0005603623 nova_compute[226235]: 2026-01-31 08:52:00.763 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:01 np0005603623 nova_compute[226235]: 2026-01-31 08:52:01.082 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:52:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:02.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:52:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:02.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:02 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #57. Immutable memtables: 13.
Jan 31 03:52:02 np0005603623 podman[306855]: 2026-01-31 08:52:02.976397932 +0000 UTC m=+0.062354498 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:52:03 np0005603623 podman[306856]: 2026-01-31 08:52:03.003266845 +0000 UTC m=+0.089194350 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:52:04 np0005603623 nova_compute[226235]: 2026-01-31 08:52:04.177 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:04 np0005603623 nova_compute[226235]: 2026-01-31 08:52:04.177 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:52:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:04.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:04.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e367 e367: 3 total, 3 up, 3 in
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.207 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.207 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.208 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.208 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.208 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.468 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 03:52:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:52:05 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/241227837' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.670 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.809 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.981 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.981 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.984 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000ab as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.985 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000ab as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.987 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:05 np0005603623 nova_compute[226235]: 2026-01-31 08:52:05.987 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.084 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.119 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.120 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3600MB free_disk=20.722122192382812GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.121 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.121 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e367 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.266 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0edbf2b9-b76f-446b-85fa-09a4dcb37976 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.266 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 4af4043c-8199-4d0f-acf9-38d029560167 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.266 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 5a226745-c1af-4252-846b-95c02008d4db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.266 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.266 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.282 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.299 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.299 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.322 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.357 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:52:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:06.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.469 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:06.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:52:06 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3208260840' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:52:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e368 e368: 3 total, 3 up, 3 in
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.890 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.895 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.926 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.967 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:52:06 np0005603623 nova_compute[226235]: 2026-01-31 08:52:06.968 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:08 np0005603623 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000ab.scope: Deactivated successfully.
Jan 31 03:52:08 np0005603623 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000ab.scope: Consumed 12.837s CPU time.
Jan 31 03:52:08 np0005603623 systemd-machined[194379]: Machine qemu-81-instance-000000ab terminated.
Jan 31 03:52:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:08.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:08.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:08 np0005603623 nova_compute[226235]: 2026-01-31 08:52:08.534 226239 INFO nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Instance shutdown successfully after 13 seconds.#033[00m
Jan 31 03:52:08 np0005603623 nova_compute[226235]: 2026-01-31 08:52:08.539 226239 INFO nova.virt.libvirt.driver [-] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Instance destroyed successfully.#033[00m
Jan 31 03:52:08 np0005603623 nova_compute[226235]: 2026-01-31 08:52:08.543 226239 INFO nova.virt.libvirt.driver [-] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Instance destroyed successfully.#033[00m
Jan 31 03:52:08 np0005603623 nova_compute[226235]: 2026-01-31 08:52:08.934 226239 INFO nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Deleting instance files /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db_del#033[00m
Jan 31 03:52:08 np0005603623 nova_compute[226235]: 2026-01-31 08:52:08.936 226239 INFO nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Deletion of /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db_del complete#033[00m
Jan 31 03:52:09 np0005603623 nova_compute[226235]: 2026-01-31 08:52:09.222 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:52:09 np0005603623 nova_compute[226235]: 2026-01-31 08:52:09.222 226239 INFO nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Creating image(s)#033[00m
Jan 31 03:52:09 np0005603623 nova_compute[226235]: 2026-01-31 08:52:09.246 226239 DEBUG nova.storage.rbd_utils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:09 np0005603623 nova_compute[226235]: 2026-01-31 08:52:09.273 226239 DEBUG nova.storage.rbd_utils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:09 np0005603623 nova_compute[226235]: 2026-01-31 08:52:09.299 226239 DEBUG nova.storage.rbd_utils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:09 np0005603623 nova_compute[226235]: 2026-01-31 08:52:09.303 226239 DEBUG oslo_concurrency.processutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:09 np0005603623 nova_compute[226235]: 2026-01-31 08:52:09.381 226239 DEBUG oslo_concurrency.processutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:09 np0005603623 nova_compute[226235]: 2026-01-31 08:52:09.382 226239 DEBUG oslo_concurrency.lockutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Acquiring lock "365f9823d2619ef09948bdeed685488da63755b5" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:09 np0005603623 nova_compute[226235]: 2026-01-31 08:52:09.383 226239 DEBUG oslo_concurrency.lockutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:09 np0005603623 nova_compute[226235]: 2026-01-31 08:52:09.383 226239 DEBUG oslo_concurrency.lockutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:09 np0005603623 nova_compute[226235]: 2026-01-31 08:52:09.407 226239 DEBUG nova.storage.rbd_utils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:09 np0005603623 nova_compute[226235]: 2026-01-31 08:52:09.410 226239 DEBUG oslo_concurrency.processutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 5a226745-c1af-4252-846b-95c02008d4db_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:10.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:10.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:10 np0005603623 nova_compute[226235]: 2026-01-31 08:52:10.811 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:10 np0005603623 nova_compute[226235]: 2026-01-31 08:52:10.840 226239 DEBUG oslo_concurrency.processutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 5a226745-c1af-4252-846b-95c02008d4db_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:10 np0005603623 nova_compute[226235]: 2026-01-31 08:52:10.917 226239 DEBUG nova.storage.rbd_utils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] resizing rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.018 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.019 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Ensure instance console log exists: /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.019 226239 DEBUG oslo_concurrency.lockutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.019 226239 DEBUG oslo_concurrency.lockutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.020 226239 DEBUG oslo_concurrency.lockutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.021 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.024 226239 WARNING nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.032 226239 DEBUG nova.virt.libvirt.host [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.032 226239 DEBUG nova.virt.libvirt.host [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.035 226239 DEBUG nova.virt.libvirt.host [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.036 226239 DEBUG nova.virt.libvirt.host [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.037 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.037 226239 DEBUG nova.virt.hardware [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.038 226239 DEBUG nova.virt.hardware [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.038 226239 DEBUG nova.virt.hardware [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.038 226239 DEBUG nova.virt.hardware [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.038 226239 DEBUG nova.virt.hardware [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.038 226239 DEBUG nova.virt.hardware [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.038 226239 DEBUG nova.virt.hardware [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.039 226239 DEBUG nova.virt.hardware [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.039 226239 DEBUG nova.virt.hardware [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.039 226239 DEBUG nova.virt.hardware [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.039 226239 DEBUG nova.virt.hardware [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.039 226239 DEBUG nova.objects.instance [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5a226745-c1af-4252-846b-95c02008d4db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.087 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.108 226239 DEBUG oslo_concurrency.processutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e368 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:52:11 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3852254797' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.529 226239 DEBUG oslo_concurrency.processutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.557 226239 DEBUG nova.storage.rbd_utils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.563 226239 DEBUG oslo_concurrency.processutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.969 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:11 np0005603623 nova_compute[226235]: 2026-01-31 08:52:11.969 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:52:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:52:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1263756215' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.015 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.026 226239 DEBUG oslo_concurrency.processutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.029 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  <uuid>5a226745-c1af-4252-846b-95c02008d4db</uuid>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  <name>instance-000000ab</name>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <nova:name>tempest-ServerShowV254Test-server-1343394802</nova:name>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:52:11</nova:creationTime>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <nova:user uuid="1b85a92acbd44357b341ea45817e0d54">tempest-ServerShowV254Test-498366842-project-member</nova:user>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <nova:project uuid="d25e38c8ee7e4c819da74af820219c54">tempest-ServerShowV254Test-498366842</nova:project>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="0864ca59-9877-4e6d-adfc-f0a3204ed8f8"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <nova:ports/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <entry name="serial">5a226745-c1af-4252-846b-95c02008d4db</entry>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <entry name="uuid">5a226745-c1af-4252-846b-95c02008d4db</entry>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/5a226745-c1af-4252-846b-95c02008d4db_disk">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/5a226745-c1af-4252-846b-95c02008d4db_disk.config">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/console.log" append="off"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:52:12 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:52:12 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:52:12 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:52:12 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.087 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.088 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.088 226239 INFO nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Using config drive#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.112 226239 DEBUG nova.storage.rbd_utils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.134 226239 DEBUG nova.objects.instance [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 5a226745-c1af-4252-846b-95c02008d4db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.372 226239 INFO nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Creating config drive at /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/disk.config#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.375 226239 DEBUG oslo_concurrency.processutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpgojweluq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:52:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:12.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:52:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:12.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.497 226239 DEBUG oslo_concurrency.processutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpgojweluq" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.521 226239 DEBUG nova.storage.rbd_utils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] rbd image 5a226745-c1af-4252-846b-95c02008d4db_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.524 226239 DEBUG oslo_concurrency.processutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/disk.config 5a226745-c1af-4252-846b-95c02008d4db_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.649 226239 DEBUG oslo_concurrency.processutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/disk.config 5a226745-c1af-4252-846b-95c02008d4db_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:12 np0005603623 nova_compute[226235]: 2026-01-31 08:52:12.650 226239 INFO nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Deleting local config drive /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db/disk.config because it was imported into RBD.#033[00m
Jan 31 03:52:12 np0005603623 systemd-machined[194379]: New machine qemu-82-instance-000000ab.
Jan 31 03:52:12 np0005603623 systemd[1]: Started Virtual Machine qemu-82-instance-000000ab.
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.423 226239 DEBUG nova.virt.libvirt.host [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Removed pending event for 5a226745-c1af-4252-846b-95c02008d4db due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.423 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849533.4228714, 5a226745-c1af-4252-846b-95c02008d4db => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.424 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.426 226239 DEBUG nova.compute.manager [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.426 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.429 226239 INFO nova.virt.libvirt.driver [-] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Instance spawned successfully.#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.429 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.466 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.470 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.470 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.470 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.471 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.471 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.471 226239 DEBUG nova.virt.libvirt.driver [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.474 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.608 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.608 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849533.4256775, 5a226745-c1af-4252-846b-95c02008d4db => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.609 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] VM Started (Lifecycle Event)#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.669 226239 DEBUG nova.compute.manager [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.684 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.688 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.788 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.831 226239 DEBUG oslo_concurrency.lockutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.832 226239 DEBUG oslo_concurrency.lockutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.832 226239 DEBUG nova.objects.instance [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 03:52:13 np0005603623 nova_compute[226235]: 2026-01-31 08:52:13.943 226239 DEBUG oslo_concurrency.lockutils [None req-51f68a08-93d8-4a85-93cf-57787736b9ab 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:14 np0005603623 nova_compute[226235]: 2026-01-31 08:52:14.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:14.291 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:52:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:14.291 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:52:14 np0005603623 nova_compute[226235]: 2026-01-31 08:52:14.292 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:14.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:14.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:14 np0005603623 nova_compute[226235]: 2026-01-31 08:52:14.728 226239 DEBUG oslo_concurrency.lockutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Acquiring lock "5a226745-c1af-4252-846b-95c02008d4db" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:14 np0005603623 nova_compute[226235]: 2026-01-31 08:52:14.729 226239 DEBUG oslo_concurrency.lockutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "5a226745-c1af-4252-846b-95c02008d4db" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:14 np0005603623 nova_compute[226235]: 2026-01-31 08:52:14.729 226239 DEBUG oslo_concurrency.lockutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Acquiring lock "5a226745-c1af-4252-846b-95c02008d4db-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:14 np0005603623 nova_compute[226235]: 2026-01-31 08:52:14.730 226239 DEBUG oslo_concurrency.lockutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "5a226745-c1af-4252-846b-95c02008d4db-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:14 np0005603623 nova_compute[226235]: 2026-01-31 08:52:14.730 226239 DEBUG oslo_concurrency.lockutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "5a226745-c1af-4252-846b-95c02008d4db-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:14 np0005603623 nova_compute[226235]: 2026-01-31 08:52:14.731 226239 INFO nova.compute.manager [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Terminating instance#033[00m
Jan 31 03:52:14 np0005603623 nova_compute[226235]: 2026-01-31 08:52:14.732 226239 DEBUG oslo_concurrency.lockutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Acquiring lock "refresh_cache-5a226745-c1af-4252-846b-95c02008d4db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:52:14 np0005603623 nova_compute[226235]: 2026-01-31 08:52:14.732 226239 DEBUG oslo_concurrency.lockutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Acquired lock "refresh_cache-5a226745-c1af-4252-846b-95c02008d4db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:52:14 np0005603623 nova_compute[226235]: 2026-01-31 08:52:14.732 226239 DEBUG nova.network.neutron [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:52:15 np0005603623 nova_compute[226235]: 2026-01-31 08:52:15.198 226239 DEBUG nova.network.neutron [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:52:15 np0005603623 nova_compute[226235]: 2026-01-31 08:52:15.713 226239 DEBUG nova.network.neutron [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:52:15 np0005603623 nova_compute[226235]: 2026-01-31 08:52:15.763 226239 DEBUG oslo_concurrency.lockutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Releasing lock "refresh_cache-5a226745-c1af-4252-846b-95c02008d4db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:52:15 np0005603623 nova_compute[226235]: 2026-01-31 08:52:15.764 226239 DEBUG nova.compute.manager [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:52:15 np0005603623 nova_compute[226235]: 2026-01-31 08:52:15.849 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:15 np0005603623 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000ab.scope: Deactivated successfully.
Jan 31 03:52:15 np0005603623 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000ab.scope: Consumed 3.057s CPU time.
Jan 31 03:52:15 np0005603623 systemd-machined[194379]: Machine qemu-82-instance-000000ab terminated.
Jan 31 03:52:15 np0005603623 nova_compute[226235]: 2026-01-31 08:52:15.978 226239 INFO nova.virt.libvirt.driver [-] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Instance destroyed successfully.#033[00m
Jan 31 03:52:15 np0005603623 nova_compute[226235]: 2026-01-31 08:52:15.978 226239 DEBUG nova.objects.instance [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lazy-loading 'resources' on Instance uuid 5a226745-c1af-4252-846b-95c02008d4db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.088 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e368 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.357 226239 INFO nova.virt.libvirt.driver [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Deleting instance files /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db_del#033[00m
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.358 226239 INFO nova.virt.libvirt.driver [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Deletion of /var/lib/nova/instances/5a226745-c1af-4252-846b-95c02008d4db_del complete#033[00m
Jan 31 03:52:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:16.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.491 226239 INFO nova.compute.manager [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Took 0.73 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.492 226239 DEBUG oslo.service.loopingcall [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:52:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.492 226239 DEBUG nova.compute.manager [-] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.493 226239 DEBUG nova.network.neutron [-] [instance: 5a226745-c1af-4252-846b-95c02008d4db] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:52:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:16.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.750 226239 DEBUG nova.network.neutron [-] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:52:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e369 e369: 3 total, 3 up, 3 in
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.782 226239 DEBUG nova.network.neutron [-] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.812 226239 INFO nova.compute.manager [-] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Took 0.32 seconds to deallocate network for instance.#033[00m
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.881 226239 DEBUG oslo_concurrency.lockutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.881 226239 DEBUG oslo_concurrency.lockutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:16 np0005603623 nova_compute[226235]: 2026-01-31 08:52:16.999 226239 DEBUG oslo_concurrency.processutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:52:17 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1249331657' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:52:17 np0005603623 nova_compute[226235]: 2026-01-31 08:52:17.432 226239 DEBUG oslo_concurrency.processutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:17 np0005603623 nova_compute[226235]: 2026-01-31 08:52:17.438 226239 DEBUG nova.compute.provider_tree [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:52:17 np0005603623 nova_compute[226235]: 2026-01-31 08:52:17.463 226239 DEBUG nova.scheduler.client.report [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:52:17 np0005603623 nova_compute[226235]: 2026-01-31 08:52:17.510 226239 DEBUG oslo_concurrency.lockutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:17 np0005603623 nova_compute[226235]: 2026-01-31 08:52:17.548 226239 INFO nova.scheduler.client.report [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Deleted allocations for instance 5a226745-c1af-4252-846b-95c02008d4db#033[00m
Jan 31 03:52:17 np0005603623 nova_compute[226235]: 2026-01-31 08:52:17.643 226239 DEBUG oslo_concurrency.lockutils [None req-9065c4f0-28a1-4a32-a520-b34ecf517a76 1b85a92acbd44357b341ea45817e0d54 d25e38c8ee7e4c819da74af820219c54 - - default default] Lock "5a226745-c1af-4252-846b-95c02008d4db" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.914s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:52:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/429166792' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:52:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:52:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/429166792' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:52:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:18.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:18.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e370 e370: 3 total, 3 up, 3 in
Jan 31 03:52:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:20.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:20.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:20 np0005603623 nova_compute[226235]: 2026-01-31 08:52:20.852 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:21 np0005603623 nova_compute[226235]: 2026-01-31 08:52:21.091 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:22.294 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:22.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:52:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:22.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:52:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:24.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:24.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:25 np0005603623 nova_compute[226235]: 2026-01-31 08:52:25.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:52:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 5400.0 total, 600.0 interval
Cumulative writes: 14K writes, 73K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.15 GB, 0.03 MB/s
Cumulative WAL: 14K writes, 14K syncs, 1.00 writes per sync, written: 0.15 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1562 writes, 7481 keys, 1562 commit groups, 1.0 writes per commit group, ingest: 15.48 MB, 0.03 MB/s
Interval WAL: 1562 writes, 1562 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     41.3      2.18              0.21        46    0.047       0      0       0.0       0.0
  L6      1/0   10.53 MB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   5.0     57.6     49.0      9.24              0.99        45    0.205    318K    24K       0.0       0.0
 Sum      1/0   10.53 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   6.0     46.6     47.5     11.41              1.20        91    0.125    318K    24K       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.4     52.6     51.9      1.21              0.13        10    0.121     47K   2604       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   0.0     57.6     49.0      9.24              0.99        45    0.205    318K    24K       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     41.3      2.17              0.21        45    0.048       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 5400.0 total, 600.0 interval
Flush(GB): cumulative 0.088, interval 0.008
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.53 GB write, 0.10 MB/s write, 0.52 GB read, 0.10 MB/s read, 11.4 seconds
Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.11 MB/s read, 1.2 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 58.46 MB table_size: 0 occupancy: 18446744073709551615 collections: 10 last_copies: 0 last_secs: 0.000367 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3335,56.12 MB,18.4602%) FilterBlock(91,890.42 KB,0.286037%) IndexBlock(91,1.47 MB,0.483899%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Jan 31 03:52:25 np0005603623 nova_compute[226235]: 2026-01-31 08:52:25.896 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:26 np0005603623 nova_compute[226235]: 2026-01-31 08:52:26.092 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:26.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:26.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e371 e371: 3 total, 3 up, 3 in
Jan 31 03:52:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:28.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:28.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e372 e372: 3 total, 3 up, 3 in
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.110 226239 DEBUG oslo_concurrency.lockutils [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.110 226239 DEBUG oslo_concurrency.lockutils [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.111 226239 DEBUG oslo_concurrency.lockutils [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.111 226239 DEBUG oslo_concurrency.lockutils [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.111 226239 DEBUG oslo_concurrency.lockutils [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.112 226239 INFO nova.compute.manager [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Terminating instance#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.113 226239 DEBUG nova.compute.manager [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.143 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.143 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.144 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:30 np0005603623 kernel: tap3aae5c0f-f2 (unregistering): left promiscuous mode
Jan 31 03:52:30 np0005603623 NetworkManager[48970]: <info>  [1769849550.1782] device (tap3aae5c0f-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:52:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:52:30Z|00717|binding|INFO|Releasing lport 3aae5c0f-f2ed-4352-a4e2-017466399641 from this chassis (sb_readonly=0)
Jan 31 03:52:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:52:30Z|00718|binding|INFO|Setting lport 3aae5c0f-f2ed-4352-a4e2-017466399641 down in Southbound
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.184 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:30 np0005603623 ovn_controller[133449]: 2026-01-31T08:52:30Z|00719|binding|INFO|Removing iface tap3aae5c0f-f2 ovn-installed in OVS
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.186 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.191 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.218 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:73:3d 10.100.0.9'], port_security=['fa:16:3e:69:73:3d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4af4043c-8199-4d0f-acf9-38d029560167', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8397e0fed04b4dabb57148d0924de2dc', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'd636f3a4-efef-465a-ac59-8182d61336f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.250'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbd2578f-ff6e-4dc3-bc49-93cbf023edc5, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=3aae5c0f-f2ed-4352-a4e2-017466399641) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.220 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 3aae5c0f-f2ed-4352-a4e2-017466399641 in datapath 3afaf607-43a1-4d65-95fc-0a22b5c901d0 unbound from our chassis#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.221 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3afaf607-43a1-4d65-95fc-0a22b5c901d0#033[00m
Jan 31 03:52:30 np0005603623 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Jan 31 03:52:30 np0005603623 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a5.scope: Consumed 14.895s CPU time.
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.231 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6a0ffb67-ab4f-4993-833f-b67620bc00d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:30 np0005603623 systemd-machined[194379]: Machine qemu-80-instance-000000a5 terminated.
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.252 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[2a7e28bf-eed6-48c0-a4b5-348f7a8f4b05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.255 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[7947e195-fd2d-4e30-9731-25dfac45861e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.274 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[cf65b84e-5591-4c85-ba69-cc8f56355184]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.292 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9dfcfeb3-6e23-4bac-9a86-7decc36bf203]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3afaf607-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:84:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 831647, 'reachable_time': 39542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 307430, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.305 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e4d53fdf-9481-40a4-a2a5-480159776473]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3afaf607-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 831657, 'tstamp': 831657}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 307431, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3afaf607-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 831659, 'tstamp': 831659}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 307431, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.306 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3afaf607-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.308 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.311 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.311 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3afaf607-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.312 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.312 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3afaf607-40, col_values=(('external_ids', {'iface-id': '0ed76a0a-650c-4ec7-a4d4-0e745236b047'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:52:30.313 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.329 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.333 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.349 226239 INFO nova.virt.libvirt.driver [-] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Instance destroyed successfully.#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.350 226239 DEBUG nova.objects.instance [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'resources' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.417 226239 DEBUG nova.virt.libvirt.vif [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:48:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=165,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:51:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-bd0dktab',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',
image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:51:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=4af4043c-8199-4d0f-acf9-38d029560167,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.417 226239 DEBUG nova.network.os_vif_util [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.418 226239 DEBUG nova.network.os_vif_util [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.418 226239 DEBUG os_vif [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.419 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.420 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3aae5c0f-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.421 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.422 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.424 226239 INFO os_vif [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2')#033[00m
Jan 31 03:52:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:30.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:30.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.840 226239 DEBUG nova.compute.manager [req-bdca9cd1-e4bf-4736-9f26-e03893ddd950 req-9eba7e79-4126-4b18-baf3-c2d933671861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-unplugged-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.841 226239 DEBUG oslo_concurrency.lockutils [req-bdca9cd1-e4bf-4736-9f26-e03893ddd950 req-9eba7e79-4126-4b18-baf3-c2d933671861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.841 226239 DEBUG oslo_concurrency.lockutils [req-bdca9cd1-e4bf-4736-9f26-e03893ddd950 req-9eba7e79-4126-4b18-baf3-c2d933671861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.841 226239 DEBUG oslo_concurrency.lockutils [req-bdca9cd1-e4bf-4736-9f26-e03893ddd950 req-9eba7e79-4126-4b18-baf3-c2d933671861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.842 226239 DEBUG nova.compute.manager [req-bdca9cd1-e4bf-4736-9f26-e03893ddd950 req-9eba7e79-4126-4b18-baf3-c2d933671861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] No waiting events found dispatching network-vif-unplugged-3aae5c0f-f2ed-4352-a4e2-017466399641 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.842 226239 DEBUG nova.compute.manager [req-bdca9cd1-e4bf-4736-9f26-e03893ddd950 req-9eba7e79-4126-4b18-baf3-c2d933671861 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-unplugged-3aae5c0f-f2ed-4352-a4e2-017466399641 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.899 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.976 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849535.9758463, 5a226745-c1af-4252-846b-95c02008d4db => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:52:30 np0005603623 nova_compute[226235]: 2026-01-31 08:52:30.977 226239 INFO nova.compute.manager [-] [instance: 5a226745-c1af-4252-846b-95c02008d4db] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:52:31 np0005603623 nova_compute[226235]: 2026-01-31 08:52:31.073 226239 DEBUG nova.compute.manager [None req-c1f47e23-ac0b-4179-a52d-604d0da44b8f - - - - - -] [instance: 5a226745-c1af-4252-846b-95c02008d4db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:52:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e372 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:31 np0005603623 nova_compute[226235]: 2026-01-31 08:52:31.869 226239 INFO nova.virt.libvirt.driver [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Deleting instance files /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167_del#033[00m
Jan 31 03:52:31 np0005603623 nova_compute[226235]: 2026-01-31 08:52:31.871 226239 INFO nova.virt.libvirt.driver [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Deletion of /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167_del complete#033[00m
Jan 31 03:52:32 np0005603623 nova_compute[226235]: 2026-01-31 08:52:32.092 226239 INFO nova.compute.manager [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Took 1.98 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:52:32 np0005603623 nova_compute[226235]: 2026-01-31 08:52:32.094 226239 DEBUG oslo.service.loopingcall [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:52:32 np0005603623 nova_compute[226235]: 2026-01-31 08:52:32.094 226239 DEBUG nova.compute.manager [-] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:52:32 np0005603623 nova_compute[226235]: 2026-01-31 08:52:32.094 226239 DEBUG nova.network.neutron [-] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:52:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:32.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:32.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:33 np0005603623 nova_compute[226235]: 2026-01-31 08:52:33.074 226239 DEBUG nova.compute.manager [req-741a4880-6efe-4381-8c0a-81269d2250dd req-7d5e1b6e-776a-404f-8272-c32b5d8fb207 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:52:33 np0005603623 nova_compute[226235]: 2026-01-31 08:52:33.075 226239 DEBUG oslo_concurrency.lockutils [req-741a4880-6efe-4381-8c0a-81269d2250dd req-7d5e1b6e-776a-404f-8272-c32b5d8fb207 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:52:33 np0005603623 nova_compute[226235]: 2026-01-31 08:52:33.075 226239 DEBUG oslo_concurrency.lockutils [req-741a4880-6efe-4381-8c0a-81269d2250dd req-7d5e1b6e-776a-404f-8272-c32b5d8fb207 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:52:33 np0005603623 nova_compute[226235]: 2026-01-31 08:52:33.075 226239 DEBUG oslo_concurrency.lockutils [req-741a4880-6efe-4381-8c0a-81269d2250dd req-7d5e1b6e-776a-404f-8272-c32b5d8fb207 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:52:33 np0005603623 nova_compute[226235]: 2026-01-31 08:52:33.076 226239 DEBUG nova.compute.manager [req-741a4880-6efe-4381-8c0a-81269d2250dd req-7d5e1b6e-776a-404f-8272-c32b5d8fb207 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] No waiting events found dispatching network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:52:33 np0005603623 nova_compute[226235]: 2026-01-31 08:52:33.076 226239 WARNING nova.compute.manager [req-741a4880-6efe-4381-8c0a-81269d2250dd req-7d5e1b6e-776a-404f-8272-c32b5d8fb207 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received unexpected event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 for instance with vm_state active and task_state deleting.
Jan 31 03:52:33 np0005603623 podman[307464]: 2026-01-31 08:52:33.962092335 +0000 UTC m=+0.053681385 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:52:34 np0005603623 podman[307465]: 2026-01-31 08:52:34.023687408 +0000 UTC m=+0.116827756 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Jan 31 03:52:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:34.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:34.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:34 np0005603623 nova_compute[226235]: 2026-01-31 08:52:34.737 226239 DEBUG nova.network.neutron [-] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:52:34 np0005603623 nova_compute[226235]: 2026-01-31 08:52:34.763 226239 INFO nova.compute.manager [-] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Took 2.67 seconds to deallocate network for instance.
Jan 31 03:52:34 np0005603623 nova_compute[226235]: 2026-01-31 08:52:34.817 226239 DEBUG oslo_concurrency.lockutils [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:52:34 np0005603623 nova_compute[226235]: 2026-01-31 08:52:34.818 226239 DEBUG oslo_concurrency.lockutils [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:52:34 np0005603623 nova_compute[226235]: 2026-01-31 08:52:34.935 226239 DEBUG oslo_concurrency.processutils [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:52:34 np0005603623 nova_compute[226235]: 2026-01-31 08:52:34.995 226239 DEBUG nova.compute.manager [req-9af1b5ab-5d1e-43b0-9d92-a7c6c22c6213 req-57495325-a253-471b-ba41-b530f0d21609 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-deleted-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:52:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:52:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1352985111' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:52:35 np0005603623 nova_compute[226235]: 2026-01-31 08:52:35.386 226239 DEBUG oslo_concurrency.processutils [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:52:35 np0005603623 nova_compute[226235]: 2026-01-31 08:52:35.392 226239 DEBUG nova.compute.provider_tree [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:52:35 np0005603623 nova_compute[226235]: 2026-01-31 08:52:35.418 226239 DEBUG nova.scheduler.client.report [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:52:35 np0005603623 nova_compute[226235]: 2026-01-31 08:52:35.422 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:52:35 np0005603623 nova_compute[226235]: 2026-01-31 08:52:35.462 226239 DEBUG oslo_concurrency.lockutils [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:52:35 np0005603623 nova_compute[226235]: 2026-01-31 08:52:35.536 226239 INFO nova.scheduler.client.report [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Deleted allocations for instance 4af4043c-8199-4d0f-acf9-38d029560167
Jan 31 03:52:35 np0005603623 nova_compute[226235]: 2026-01-31 08:52:35.679 226239 DEBUG oslo_concurrency.lockutils [None req-0ed9e7b5-0876-4e6a-b795-66bf0e4b7c4d a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:52:35 np0005603623 nova_compute[226235]: 2026-01-31 08:52:35.903 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:52:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e372 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:36.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:36.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 e373: 3 total, 3 up, 3 in
Jan 31 03:52:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:38.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:38.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:39 np0005603623 podman[307759]: 2026-01-31 08:52:39.882517483 +0000 UTC m=+0.098795820 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 03:52:39 np0005603623 podman[307759]: 2026-01-31 08:52:39.984584765 +0000 UTC m=+0.200863092 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 03:52:40 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 03:52:40 np0005603623 nova_compute[226235]: 2026-01-31 08:52:40.423 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:52:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:40.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:52:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:40.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:52:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:40 np0005603623 podman[307910]: 2026-01-31 08:52:40.649277646 +0000 UTC m=+0.220774056 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:52:40 np0005603623 podman[307932]: 2026-01-31 08:52:40.729679709 +0000 UTC m=+0.063307077 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:52:40 np0005603623 podman[307910]: 2026-01-31 08:52:40.799626864 +0000 UTC m=+0.371123254 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 03:52:40 np0005603623 nova_compute[226235]: 2026-01-31 08:52:40.904 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:52:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:41 np0005603623 podman[307976]: 2026-01-31 08:52:41.365530007 +0000 UTC m=+0.162269112 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.openshift.tags=Ceph keepalived, vcs-type=git, com.redhat.component=keepalived-container, io.openshift.expose-services=, io.buildah.version=1.28.2, name=keepalived, release=1793, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 31 03:52:41 np0005603623 podman[307996]: 2026-01-31 08:52:41.459882847 +0000 UTC m=+0.078496324 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, release=1793, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived)
Jan 31 03:52:41 np0005603623 podman[307976]: 2026-01-31 08:52:41.499620794 +0000 UTC m=+0.296359879 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, vcs-type=git, version=2.2.4, description=keepalived for Ceph, name=keepalived, release=1793, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Jan 31 03:52:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:42.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:42 np0005603623 nova_compute[226235]: 2026-01-31 08:52:42.456 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:52:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:42.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:52:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:52:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:44.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:44.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:45 np0005603623 nova_compute[226235]: 2026-01-31 08:52:45.348 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849550.3479354, 4af4043c-8199-4d0f-acf9-38d029560167 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:52:45 np0005603623 nova_compute[226235]: 2026-01-31 08:52:45.349 226239 INFO nova.compute.manager [-] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] VM Stopped (Lifecycle Event)
Jan 31 03:52:45 np0005603623 nova_compute[226235]: 2026-01-31 08:52:45.427 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:52:45 np0005603623 nova_compute[226235]: 2026-01-31 08:52:45.631 226239 DEBUG nova.compute.manager [None req-6afe6188-7bf1-4e07-8b3d-87cd9bf48c20 - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:52:45 np0005603623 nova_compute[226235]: 2026-01-31 08:52:45.948 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:52:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:46.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:46.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:48.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:48.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #148. Immutable memtables: 0.
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.568865) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 93] Flushing memtable with next log file: 148
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849568568936, "job": 93, "event": "flush_started", "num_memtables": 1, "num_entries": 2465, "num_deletes": 256, "total_data_size": 5627899, "memory_usage": 5710064, "flush_reason": "Manual Compaction"}
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 93] Level-0 flush table #149: started
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849568614080, "cf_name": "default", "job": 93, "event": "table_file_creation", "file_number": 149, "file_size": 3666358, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 71799, "largest_seqno": 74259, "table_properties": {"data_size": 3656276, "index_size": 6383, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21711, "raw_average_key_size": 20, "raw_value_size": 3635875, "raw_average_value_size": 3506, "num_data_blocks": 276, "num_entries": 1037, "num_filter_entries": 1037, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849379, "oldest_key_time": 1769849379, "file_creation_time": 1769849568, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 149, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 93] Flush lasted 45344 microseconds, and 6299 cpu microseconds.
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.614199) [db/flush_job.cc:967] [default] [JOB 93] Level-0 flush table #149: 3666358 bytes OK
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.614230) [db/memtable_list.cc:519] [default] Level-0 commit table #149 started
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.618092) [db/memtable_list.cc:722] [default] Level-0 commit table #149: memtable #1 done
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.618105) EVENT_LOG_v1 {"time_micros": 1769849568618101, "job": 93, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.618119) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 93] Try to delete WAL files size 5616904, prev total WAL file size 5616904, number of live WAL files 2.
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000145.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.618924) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036323735' seq:72057594037927935, type:22 .. '7061786F730036353237' seq:0, type:0; will stop at (end)
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 94] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 93 Base level 0, inputs: [149(3580KB)], [147(10MB)]
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849568619003, "job": 94, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [149], "files_L6": [147], "score": -1, "input_data_size": 14707148, "oldest_snapshot_seqno": -1}
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 94] Generated table #150: 9570 keys, 12769795 bytes, temperature: kUnknown
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849568764622, "cf_name": "default", "job": 94, "event": "table_file_creation", "file_number": 150, "file_size": 12769795, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12707477, "index_size": 37309, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23941, "raw_key_size": 251438, "raw_average_key_size": 26, "raw_value_size": 12539461, "raw_average_value_size": 1310, "num_data_blocks": 1427, "num_entries": 9570, "num_filter_entries": 9570, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769849568, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 150, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.764986) [db/compaction/compaction_job.cc:1663] [default] [JOB 94] Compacted 1@0 + 1@6 files to L6 => 12769795 bytes
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.782145) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 100.8 rd, 87.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 10.5 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(7.5) write-amplify(3.5) OK, records in: 10101, records dropped: 531 output_compression: NoCompression
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.782187) EVENT_LOG_v1 {"time_micros": 1769849568782170, "job": 94, "event": "compaction_finished", "compaction_time_micros": 145845, "compaction_time_cpu_micros": 23108, "output_level": 6, "num_output_files": 1, "total_output_size": 12769795, "num_input_records": 10101, "num_output_records": 9570, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000149.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849568782630, "job": 94, "event": "table_file_deletion", "file_number": 149}
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000147.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849568783679, "job": 94, "event": "table_file_deletion", "file_number": 147}
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.618776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.783706) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.783710) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.783712) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.783714) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:52:48 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:52:48.783716) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:52:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:50.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:50 np0005603623 nova_compute[226235]: 2026-01-31 08:52:50.468 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:50.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:50 np0005603623 nova_compute[226235]: 2026-01-31 08:52:50.950 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:52:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 62K writes, 270K keys, 62K commit groups, 1.0 writes per commit group, ingest: 0.28 GB, 0.05 MB/s#012Cumulative WAL: 62K writes, 20K syncs, 3.01 writes per sync, written: 0.28 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8167 writes, 32K keys, 8167 commit groups, 1.0 writes per commit group, ingest: 37.09 MB, 0.06 MB/s#012Interval WAL: 8167 writes, 2954 syncs, 2.76 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:52:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:52 np0005603623 nova_compute[226235]: 2026-01-31 08:52:52.380 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:52.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:52.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:52:53Z|00720|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:52:53 np0005603623 nova_compute[226235]: 2026-01-31 08:52:53.185 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:52:53Z|00721|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:52:53 np0005603623 nova_compute[226235]: 2026-01-31 08:52:53.238 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:54.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:54.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:55 np0005603623 nova_compute[226235]: 2026-01-31 08:52:55.469 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:55 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:52:55 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:52:55 np0005603623 nova_compute[226235]: 2026-01-31 08:52:55.953 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:56.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:56.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:57 np0005603623 ceph-mgr[77391]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 03:52:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:58.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:52:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:52:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:52:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:58.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:53:00 np0005603623 nova_compute[226235]: 2026-01-31 08:53:00.470 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:00.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:00.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:00 np0005603623 nova_compute[226235]: 2026-01-31 08:53:00.955 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:02.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:02.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.159 226239 DEBUG oslo_concurrency.lockutils [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.160 226239 DEBUG oslo_concurrency.lockutils [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.160 226239 DEBUG oslo_concurrency.lockutils [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.160 226239 DEBUG oslo_concurrency.lockutils [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.160 226239 DEBUG oslo_concurrency.lockutils [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.162 226239 INFO nova.compute.manager [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Terminating instance#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.163 226239 DEBUG nova.compute.manager [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:53:04 np0005603623 kernel: tape6486275-22 (unregistering): left promiscuous mode
Jan 31 03:53:04 np0005603623 NetworkManager[48970]: <info>  [1769849584.2106] device (tape6486275-22): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:53:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:53:04Z|00722|binding|INFO|Releasing lport e6486275-22a6-4ee0-854f-fde4ef96bd8f from this chassis (sb_readonly=0)
Jan 31 03:53:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:53:04Z|00723|binding|INFO|Setting lport e6486275-22a6-4ee0-854f-fde4ef96bd8f down in Southbound
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.217 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:04 np0005603623 ovn_controller[133449]: 2026-01-31T08:53:04Z|00724|binding|INFO|Removing iface tape6486275-22 ovn-installed in OVS
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.224 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:04 np0005603623 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Jan 31 03:53:04 np0005603623 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d0000009c.scope: Consumed 26.069s CPU time.
Jan 31 03:53:04 np0005603623 systemd-machined[194379]: Machine qemu-75-instance-0000009c terminated.
Jan 31 03:53:04 np0005603623 podman[308256]: 2026-01-31 08:53:04.308095824 +0000 UTC m=+0.070813703 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127)
Jan 31 03:53:04 np0005603623 podman[308258]: 2026-01-31 08:53:04.330244179 +0000 UTC m=+0.092855384 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.337 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:6e:68 10.100.0.4'], port_security=['fa:16:3e:84:6e:68 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0edbf2b9-b76f-446b-85fa-09a4dcb37976', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8397e0fed04b4dabb57148d0924de2dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5a5f5fc8-9ea2-499a-9817-9f89f2dea440', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbd2578f-ff6e-4dc3-bc49-93cbf023edc5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=e6486275-22a6-4ee0-854f-fde4ef96bd8f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.338 143258 INFO neutron.agent.ovn.metadata.agent [-] Port e6486275-22a6-4ee0-854f-fde4ef96bd8f in datapath 3afaf607-43a1-4d65-95fc-0a22b5c901d0 unbound from our chassis#033[00m
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.339 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3afaf607-43a1-4d65-95fc-0a22b5c901d0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.340 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0c7f351c-3788-4a25-b6a0-61e14be1c749]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.340 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0 namespace which is not needed anymore#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.380 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.395 226239 INFO nova.virt.libvirt.driver [-] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Instance destroyed successfully.#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.395 226239 DEBUG nova.objects.instance [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'resources' on Instance uuid 0edbf2b9-b76f-446b-85fa-09a4dcb37976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:53:04 np0005603623 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[301238]: [NOTICE]   (301242) : haproxy version is 2.8.14-c23fe91
Jan 31 03:53:04 np0005603623 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[301238]: [NOTICE]   (301242) : path to executable is /usr/sbin/haproxy
Jan 31 03:53:04 np0005603623 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[301238]: [WARNING]  (301242) : Exiting Master process...
Jan 31 03:53:04 np0005603623 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[301238]: [ALERT]    (301242) : Current worker (301244) exited with code 143 (Terminated)
Jan 31 03:53:04 np0005603623 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[301238]: [WARNING]  (301242) : All workers exited. Exiting... (0)
Jan 31 03:53:04 np0005603623 systemd[1]: libpod-832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8.scope: Deactivated successfully.
Jan 31 03:53:04 np0005603623 podman[308334]: 2026-01-31 08:53:04.458141281 +0000 UTC m=+0.048176513 container died 832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:53:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:53:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:04.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:53:04 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8-userdata-shm.mount: Deactivated successfully.
Jan 31 03:53:04 np0005603623 systemd[1]: var-lib-containers-storage-overlay-57b24ebc73cd1c3e235e82dd9f5c7c4d672133ec160820974673c76e0e5d7e62-merged.mount: Deactivated successfully.
Jan 31 03:53:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:04.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:04 np0005603623 podman[308334]: 2026-01-31 08:53:04.546670567 +0000 UTC m=+0.136705779 container cleanup 832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:53:04 np0005603623 systemd[1]: libpod-conmon-832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8.scope: Deactivated successfully.
Jan 31 03:53:04 np0005603623 podman[308366]: 2026-01-31 08:53:04.616327553 +0000 UTC m=+0.055887985 container remove 832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.620 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ad37eea1-7d36-4071-b85b-591869a97f21]: (4, ('Sat Jan 31 08:53:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0 (832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8)\n832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8\nSat Jan 31 08:53:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0 (832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8)\n832c21cfde38e68c26070a267000959452b48a65c850ae543df2c2c21340bda8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.622 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5d268e1c-1b6e-461d-9e71-4decaf6e1240]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.623 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3afaf607-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.624 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:04 np0005603623 kernel: tap3afaf607-40: left promiscuous mode
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.632 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.635 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae0e6a0-280e-4ed1-bb05-5bd90dc5d855]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.652 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d5bb88cd-e21d-4b28-b0ee-a76c9a708455]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.653 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[686ad458-b460-482c-93ba-5233c3ea3153]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.664 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f4835610-b598-4650-bd80-05bdcc983744]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 831642, 'reachable_time': 41930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308385, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:04 np0005603623 systemd[1]: run-netns-ovnmeta\x2d3afaf607\x2d43a1\x2d4d65\x2d95fc\x2d0a22b5c901d0.mount: Deactivated successfully.
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.667 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:53:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:04.667 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[88f2ec6e-02ab-475b-9751-535967a87273]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.774 226239 DEBUG nova.virt.libvirt.vif [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:45:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeMultiAttachTest-server-356846984',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumemultiattachtest-server-356846984',id=156,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:46:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-e0b5r0ay',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-Attac
hVolumeMultiAttachTest-1931311941-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:46:18Z,user_data=None,user_id='a498364761ef428b99cac3f92e603385',uuid=0edbf2b9-b76f-446b-85fa-09a4dcb37976,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.774 226239 DEBUG nova.network.os_vif_util [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "address": "fa:16:3e:84:6e:68", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6486275-22", "ovs_interfaceid": "e6486275-22a6-4ee0-854f-fde4ef96bd8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.775 226239 DEBUG nova.network.os_vif_util [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:84:6e:68,bridge_name='br-int',has_traffic_filtering=True,id=e6486275-22a6-4ee0-854f-fde4ef96bd8f,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6486275-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.776 226239 DEBUG os_vif [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:6e:68,bridge_name='br-int',has_traffic_filtering=True,id=e6486275-22a6-4ee0-854f-fde4ef96bd8f,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6486275-22') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.777 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.777 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6486275-22, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.778 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.779 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:04 np0005603623 nova_compute[226235]: 2026-01-31 08:53:04.782 226239 INFO os_vif [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:6e:68,bridge_name='br-int',has_traffic_filtering=True,id=e6486275-22a6-4ee0-854f-fde4ef96bd8f,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6486275-22')#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.306 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.307 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.307 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.307 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.307 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.511 226239 DEBUG nova.compute.manager [req-c84fc32f-b69e-4a04-9398-fc7f4ed6ac5f req-77aa7896-05a5-4679-b3c1-01cfd5284d92 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Received event network-vif-unplugged-e6486275-22a6-4ee0-854f-fde4ef96bd8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.511 226239 DEBUG oslo_concurrency.lockutils [req-c84fc32f-b69e-4a04-9398-fc7f4ed6ac5f req-77aa7896-05a5-4679-b3c1-01cfd5284d92 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.512 226239 DEBUG oslo_concurrency.lockutils [req-c84fc32f-b69e-4a04-9398-fc7f4ed6ac5f req-77aa7896-05a5-4679-b3c1-01cfd5284d92 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.512 226239 DEBUG oslo_concurrency.lockutils [req-c84fc32f-b69e-4a04-9398-fc7f4ed6ac5f req-77aa7896-05a5-4679-b3c1-01cfd5284d92 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.512 226239 DEBUG nova.compute.manager [req-c84fc32f-b69e-4a04-9398-fc7f4ed6ac5f req-77aa7896-05a5-4679-b3c1-01cfd5284d92 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] No waiting events found dispatching network-vif-unplugged-e6486275-22a6-4ee0-854f-fde4ef96bd8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.512 226239 DEBUG nova.compute.manager [req-c84fc32f-b69e-4a04-9398-fc7f4ed6ac5f req-77aa7896-05a5-4679-b3c1-01cfd5284d92 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Received event network-vif-unplugged-e6486275-22a6-4ee0-854f-fde4ef96bd8f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.613 226239 INFO nova.virt.libvirt.driver [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Deleting instance files /var/lib/nova/instances/0edbf2b9-b76f-446b-85fa-09a4dcb37976_del#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.613 226239 INFO nova.virt.libvirt.driver [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Deletion of /var/lib/nova/instances/0edbf2b9-b76f-446b-85fa-09a4dcb37976_del complete#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.718 226239 INFO nova.compute.manager [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Took 1.55 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.718 226239 DEBUG oslo.service.loopingcall [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.719 226239 DEBUG nova.compute.manager [-] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.719 226239 DEBUG nova.network.neutron [-] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:53:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:53:05 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/167658510' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.746 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.900 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.902 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4270MB free_disk=20.98813247680664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.902 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.902 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:05 np0005603623 nova_compute[226235]: 2026-01-31 08:53:05.958 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:06 np0005603623 nova_compute[226235]: 2026-01-31 08:53:06.079 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0edbf2b9-b76f-446b-85fa-09a4dcb37976 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:53:06 np0005603623 nova_compute[226235]: 2026-01-31 08:53:06.080 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:53:06 np0005603623 nova_compute[226235]: 2026-01-31 08:53:06.080 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:53:06 np0005603623 nova_compute[226235]: 2026-01-31 08:53:06.135 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:53:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:53:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:06.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:53:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:06.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:53:06 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3279004028' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:53:06 np0005603623 nova_compute[226235]: 2026-01-31 08:53:06.602 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:53:06 np0005603623 nova_compute[226235]: 2026-01-31 08:53:06.609 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:53:06 np0005603623 nova_compute[226235]: 2026-01-31 08:53:06.760 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:53:06 np0005603623 nova_compute[226235]: 2026-01-31 08:53:06.955 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:53:06 np0005603623 nova_compute[226235]: 2026-01-31 08:53:06.956 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:07 np0005603623 nova_compute[226235]: 2026-01-31 08:53:07.335 226239 DEBUG nova.network.neutron [-] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:53:07 np0005603623 nova_compute[226235]: 2026-01-31 08:53:07.419 226239 INFO nova.compute.manager [-] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Took 1.70 seconds to deallocate network for instance.#033[00m
Jan 31 03:53:07 np0005603623 nova_compute[226235]: 2026-01-31 08:53:07.806 226239 DEBUG nova.compute.manager [req-2566a343-006d-4475-956b-87e1d91fba38 req-69e0ed0a-0134-40de-a3e5-de8fd8ae1018 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Received event network-vif-deleted-e6486275-22a6-4ee0-854f-fde4ef96bd8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:53:07 np0005603623 nova_compute[226235]: 2026-01-31 08:53:07.857 226239 DEBUG nova.compute.manager [req-0c2f0029-8eaf-4cc7-ba0a-609f786fe1c1 req-f134b70e-49e9-4dce-93fb-f72964275f64 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Received event network-vif-plugged-e6486275-22a6-4ee0-854f-fde4ef96bd8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:53:07 np0005603623 nova_compute[226235]: 2026-01-31 08:53:07.858 226239 DEBUG oslo_concurrency.lockutils [req-0c2f0029-8eaf-4cc7-ba0a-609f786fe1c1 req-f134b70e-49e9-4dce-93fb-f72964275f64 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:07 np0005603623 nova_compute[226235]: 2026-01-31 08:53:07.858 226239 DEBUG oslo_concurrency.lockutils [req-0c2f0029-8eaf-4cc7-ba0a-609f786fe1c1 req-f134b70e-49e9-4dce-93fb-f72964275f64 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:07 np0005603623 nova_compute[226235]: 2026-01-31 08:53:07.858 226239 DEBUG oslo_concurrency.lockutils [req-0c2f0029-8eaf-4cc7-ba0a-609f786fe1c1 req-f134b70e-49e9-4dce-93fb-f72964275f64 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:07 np0005603623 nova_compute[226235]: 2026-01-31 08:53:07.859 226239 DEBUG nova.compute.manager [req-0c2f0029-8eaf-4cc7-ba0a-609f786fe1c1 req-f134b70e-49e9-4dce-93fb-f72964275f64 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] No waiting events found dispatching network-vif-plugged-e6486275-22a6-4ee0-854f-fde4ef96bd8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:53:07 np0005603623 nova_compute[226235]: 2026-01-31 08:53:07.860 226239 WARNING nova.compute.manager [req-0c2f0029-8eaf-4cc7-ba0a-609f786fe1c1 req-f134b70e-49e9-4dce-93fb-f72964275f64 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Received unexpected event network-vif-plugged-e6486275-22a6-4ee0-854f-fde4ef96bd8f for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:53:08 np0005603623 nova_compute[226235]: 2026-01-31 08:53:08.144 226239 INFO nova.compute.manager [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Took 0.72 seconds to detach 1 volumes for instance.#033[00m
Jan 31 03:53:08 np0005603623 nova_compute[226235]: 2026-01-31 08:53:08.233 226239 DEBUG oslo_concurrency.lockutils [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:08 np0005603623 nova_compute[226235]: 2026-01-31 08:53:08.233 226239 DEBUG oslo_concurrency.lockutils [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:08 np0005603623 nova_compute[226235]: 2026-01-31 08:53:08.299 226239 DEBUG oslo_concurrency.processutils [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:53:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:08.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:08.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:53:08 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3028657484' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:53:08 np0005603623 nova_compute[226235]: 2026-01-31 08:53:08.694 226239 DEBUG oslo_concurrency.processutils [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.395s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:53:08 np0005603623 nova_compute[226235]: 2026-01-31 08:53:08.699 226239 DEBUG nova.compute.provider_tree [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:53:08 np0005603623 nova_compute[226235]: 2026-01-31 08:53:08.892 226239 DEBUG nova.scheduler.client.report [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:53:08 np0005603623 nova_compute[226235]: 2026-01-31 08:53:08.920 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:08 np0005603623 nova_compute[226235]: 2026-01-31 08:53:08.920 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:08 np0005603623 nova_compute[226235]: 2026-01-31 08:53:08.928 226239 DEBUG oslo_concurrency.lockutils [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:08 np0005603623 nova_compute[226235]: 2026-01-31 08:53:08.967 226239 INFO nova.scheduler.client.report [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Deleted allocations for instance 0edbf2b9-b76f-446b-85fa-09a4dcb37976#033[00m
Jan 31 03:53:08 np0005603623 nova_compute[226235]: 2026-01-31 08:53:08.968 226239 DEBUG nova.compute.manager [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:53:09 np0005603623 nova_compute[226235]: 2026-01-31 08:53:09.097 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:09 np0005603623 nova_compute[226235]: 2026-01-31 08:53:09.097 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:09 np0005603623 nova_compute[226235]: 2026-01-31 08:53:09.103 226239 DEBUG nova.virt.hardware [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:53:09 np0005603623 nova_compute[226235]: 2026-01-31 08:53:09.103 226239 INFO nova.compute.claims [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:53:09 np0005603623 nova_compute[226235]: 2026-01-31 08:53:09.123 226239 DEBUG oslo_concurrency.lockutils [None req-593d27a7-288b-493c-9782-42c5704a1475 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "0edbf2b9-b76f-446b-85fa-09a4dcb37976" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:09 np0005603623 nova_compute[226235]: 2026-01-31 08:53:09.261 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:53:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:53:09 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/53542812' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:53:09 np0005603623 nova_compute[226235]: 2026-01-31 08:53:09.759 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:53:09 np0005603623 nova_compute[226235]: 2026-01-31 08:53:09.765 226239 DEBUG nova.compute.provider_tree [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:53:09 np0005603623 nova_compute[226235]: 2026-01-31 08:53:09.779 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:09 np0005603623 nova_compute[226235]: 2026-01-31 08:53:09.881 226239 DEBUG nova.scheduler.client.report [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:53:09 np0005603623 nova_compute[226235]: 2026-01-31 08:53:09.936 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.839s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:09 np0005603623 nova_compute[226235]: 2026-01-31 08:53:09.937 226239 DEBUG nova.compute.manager [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.030 226239 DEBUG nova.compute.manager [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.030 226239 DEBUG nova.network.neutron [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.084 226239 INFO nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.126 226239 DEBUG nova.compute.manager [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.308 226239 DEBUG nova.policy [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd442c7ba12ed444ca6d4dcc5cfd36150', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.411 226239 DEBUG nova.compute.manager [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.412 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.413 226239 INFO nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Creating image(s)#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.469 226239 DEBUG nova.storage.rbd_utils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:53:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:10.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.496 226239 DEBUG nova.storage.rbd_utils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:53:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:10.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.692 226239 DEBUG nova.storage.rbd_utils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.695 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.745 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.746 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.746 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.747 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.774 226239 DEBUG nova.storage.rbd_utils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.777 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:53:10 np0005603623 nova_compute[226235]: 2026-01-31 08:53:10.959 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:11 np0005603623 nova_compute[226235]: 2026-01-31 08:53:11.956 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:11 np0005603623 nova_compute[226235]: 2026-01-31 08:53:11.956 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:53:11 np0005603623 nova_compute[226235]: 2026-01-31 08:53:11.956 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:53:11 np0005603623 nova_compute[226235]: 2026-01-31 08:53:11.988 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:53:11 np0005603623 nova_compute[226235]: 2026-01-31 08:53:11.988 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:53:12 np0005603623 nova_compute[226235]: 2026-01-31 08:53:12.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:12.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:12.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:13 np0005603623 nova_compute[226235]: 2026-01-31 08:53:13.001 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:53:13 np0005603623 nova_compute[226235]: 2026-01-31 08:53:13.413 226239 DEBUG nova.storage.rbd_utils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] resizing rbd image f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:53:13 np0005603623 nova_compute[226235]: 2026-01-31 08:53:13.804 226239 DEBUG nova.objects.instance [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'migration_context' on Instance uuid f2cd77e9-4c34-4b40-b91b-2f9896d859ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:53:13 np0005603623 nova_compute[226235]: 2026-01-31 08:53:13.967 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:53:13 np0005603623 nova_compute[226235]: 2026-01-31 08:53:13.967 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Ensure instance console log exists: /var/lib/nova/instances/f2cd77e9-4c34-4b40-b91b-2f9896d859ce/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:53:13 np0005603623 nova_compute[226235]: 2026-01-31 08:53:13.968 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:13 np0005603623 nova_compute[226235]: 2026-01-31 08:53:13.969 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:13 np0005603623 nova_compute[226235]: 2026-01-31 08:53:13.969 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:14 np0005603623 nova_compute[226235]: 2026-01-31 08:53:14.087 226239 DEBUG nova.network.neutron [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Successfully created port: be6fd5ac-96d0-495c-827a-80641ec1d590 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:53:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:14.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:14.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:14 np0005603623 nova_compute[226235]: 2026-01-31 08:53:14.826 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:15 np0005603623 nova_compute[226235]: 2026-01-31 08:53:15.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:15 np0005603623 nova_compute[226235]: 2026-01-31 08:53:15.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:15 np0005603623 nova_compute[226235]: 2026-01-31 08:53:15.960 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:16 np0005603623 nova_compute[226235]: 2026-01-31 08:53:16.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:16 np0005603623 nova_compute[226235]: 2026-01-31 08:53:16.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:53:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:16.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:53:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:16.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:16 np0005603623 nova_compute[226235]: 2026-01-31 08:53:16.815 226239 DEBUG nova.network.neutron [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Successfully updated port: be6fd5ac-96d0-495c-827a-80641ec1d590 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:53:16 np0005603623 nova_compute[226235]: 2026-01-31 08:53:16.849 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:53:16 np0005603623 nova_compute[226235]: 2026-01-31 08:53:16.849 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquired lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:53:16 np0005603623 nova_compute[226235]: 2026-01-31 08:53:16.850 226239 DEBUG nova.network.neutron [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:53:16 np0005603623 nova_compute[226235]: 2026-01-31 08:53:16.976 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:16.977 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:53:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:16.978 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:53:17 np0005603623 nova_compute[226235]: 2026-01-31 08:53:17.025 226239 DEBUG nova.compute.manager [req-5e9e17ec-381c-4194-8a4e-f56b84e67ff3 req-a9a220ea-94da-441c-ad31-107dace4f95e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Received event network-changed-be6fd5ac-96d0-495c-827a-80641ec1d590 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:53:17 np0005603623 nova_compute[226235]: 2026-01-31 08:53:17.025 226239 DEBUG nova.compute.manager [req-5e9e17ec-381c-4194-8a4e-f56b84e67ff3 req-a9a220ea-94da-441c-ad31-107dace4f95e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Refreshing instance network info cache due to event network-changed-be6fd5ac-96d0-495c-827a-80641ec1d590. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:53:17 np0005603623 nova_compute[226235]: 2026-01-31 08:53:17.026 226239 DEBUG oslo_concurrency.lockutils [req-5e9e17ec-381c-4194-8a4e-f56b84e67ff3 req-a9a220ea-94da-441c-ad31-107dace4f95e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:53:17 np0005603623 nova_compute[226235]: 2026-01-31 08:53:17.219 226239 DEBUG nova.network.neutron [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:53:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:18.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:53:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:18.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:53:19 np0005603623 nova_compute[226235]: 2026-01-31 08:53:19.394 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849584.3925822, 0edbf2b9-b76f-446b-85fa-09a4dcb37976 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:53:19 np0005603623 nova_compute[226235]: 2026-01-31 08:53:19.394 226239 INFO nova.compute.manager [-] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:53:19 np0005603623 nova_compute[226235]: 2026-01-31 08:53:19.532 226239 DEBUG nova.compute.manager [None req-53317112-4468-4528-9200-cc6abceac8e1 - - - - - -] [instance: 0edbf2b9-b76f-446b-85fa-09a4dcb37976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:53:19 np0005603623 nova_compute[226235]: 2026-01-31 08:53:19.706 226239 DEBUG nova.network.neutron [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Updating instance_info_cache with network_info: [{"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:53:19 np0005603623 nova_compute[226235]: 2026-01-31 08:53:19.827 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.080 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Releasing lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.080 226239 DEBUG nova.compute.manager [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Instance network_info: |[{"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.081 226239 DEBUG oslo_concurrency.lockutils [req-5e9e17ec-381c-4194-8a4e-f56b84e67ff3 req-a9a220ea-94da-441c-ad31-107dace4f95e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.081 226239 DEBUG nova.network.neutron [req-5e9e17ec-381c-4194-8a4e-f56b84e67ff3 req-a9a220ea-94da-441c-ad31-107dace4f95e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Refreshing network info cache for port be6fd5ac-96d0-495c-827a-80641ec1d590 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.084 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Start _get_guest_xml network_info=[{"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.090 226239 WARNING nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.100 226239 DEBUG nova.virt.libvirt.host [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.101 226239 DEBUG nova.virt.libvirt.host [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.105 226239 DEBUG nova.virt.libvirt.host [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.106 226239 DEBUG nova.virt.libvirt.host [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.107 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.108 226239 DEBUG nova.virt.hardware [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.108 226239 DEBUG nova.virt.hardware [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.108 226239 DEBUG nova.virt.hardware [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.109 226239 DEBUG nova.virt.hardware [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.109 226239 DEBUG nova.virt.hardware [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.109 226239 DEBUG nova.virt.hardware [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.109 226239 DEBUG nova.virt.hardware [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.110 226239 DEBUG nova.virt.hardware [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.110 226239 DEBUG nova.virt.hardware [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.110 226239 DEBUG nova.virt.hardware [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.110 226239 DEBUG nova.virt.hardware [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.114 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:53:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:20.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:53:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1353774752' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:53:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:20.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.561 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.588 226239 DEBUG nova.storage.rbd_utils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.594 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:53:20 np0005603623 nova_compute[226235]: 2026-01-31 08:53:20.965 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:53:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/969970246' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.040 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.042 226239 DEBUG nova.virt.libvirt.vif [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:53:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1311640661',display_name='tempest-TestNetworkBasicOps-server-1311640661',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1311640661',id=174,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDRKUWAS5a80ZjWPsd7hHgRkEp7Su45jVZtMGsGyUgsPHaNtUJrtTz9Xp8VPEvhhKx46Nib3p9WEVi71SNK8k+aau+j0uJLzRlNzrStVLXZtfDZ/222CrsWljOTfuHQP7w==',key_name='tempest-TestNetworkBasicOps-807155673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-d8ph01yi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:53:10Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=f2cd77e9-4c34-4b40-b91b-2f9896d859ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.042 226239 DEBUG nova.network.os_vif_util [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.043 226239 DEBUG nova.network.os_vif_util [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:32:4a,bridge_name='br-int',has_traffic_filtering=True,id=be6fd5ac-96d0-495c-827a-80641ec1d590,network=Network(88788faa-de32-416c-9495-11b3f71610ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6fd5ac-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.044 226239 DEBUG nova.objects.instance [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'pci_devices' on Instance uuid f2cd77e9-4c34-4b40-b91b-2f9896d859ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.076 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  <uuid>f2cd77e9-4c34-4b40-b91b-2f9896d859ce</uuid>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  <name>instance-000000ae</name>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestNetworkBasicOps-server-1311640661</nova:name>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:53:20</nova:creationTime>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <nova:user uuid="d442c7ba12ed444ca6d4dcc5cfd36150">tempest-TestNetworkBasicOps-104417095-project-member</nova:user>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <nova:project uuid="abf9393aa2b646feb00a3d887a9dee14">tempest-TestNetworkBasicOps-104417095</nova:project>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <nova:port uuid="be6fd5ac-96d0-495c-827a-80641ec1d590">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <entry name="serial">f2cd77e9-4c34-4b40-b91b-2f9896d859ce</entry>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <entry name="uuid">f2cd77e9-4c34-4b40-b91b-2f9896d859ce</entry>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk.config">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:47:32:4a"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <target dev="tapbe6fd5ac-96"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/f2cd77e9-4c34-4b40-b91b-2f9896d859ce/console.log" append="off"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:53:21 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:53:21 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:53:21 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:53:21 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.077 226239 DEBUG nova.compute.manager [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Preparing to wait for external event network-vif-plugged-be6fd5ac-96d0-495c-827a-80641ec1d590 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.077 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.078 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.078 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.079 226239 DEBUG nova.virt.libvirt.vif [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:53:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1311640661',display_name='tempest-TestNetworkBasicOps-server-1311640661',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1311640661',id=174,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDRKUWAS5a80ZjWPsd7hHgRkEp7Su45jVZtMGsGyUgsPHaNtUJrtTz9Xp8VPEvhhKx46Nib3p9WEVi71SNK8k+aau+j0uJLzRlNzrStVLXZtfDZ/222CrsWljOTfuHQP7w==',key_name='tempest-TestNetworkBasicOps-807155673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-d8ph01yi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:53:10Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=f2cd77e9-4c34-4b40-b91b-2f9896d859ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.079 226239 DEBUG nova.network.os_vif_util [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.080 226239 DEBUG nova.network.os_vif_util [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:32:4a,bridge_name='br-int',has_traffic_filtering=True,id=be6fd5ac-96d0-495c-827a-80641ec1d590,network=Network(88788faa-de32-416c-9495-11b3f71610ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6fd5ac-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.080 226239 DEBUG os_vif [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:32:4a,bridge_name='br-int',has_traffic_filtering=True,id=be6fd5ac-96d0-495c-827a-80641ec1d590,network=Network(88788faa-de32-416c-9495-11b3f71610ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6fd5ac-96') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.080 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.081 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.081 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.083 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.084 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe6fd5ac-96, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.084 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbe6fd5ac-96, col_values=(('external_ids', {'iface-id': 'be6fd5ac-96d0-495c-827a-80641ec1d590', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:32:4a', 'vm-uuid': 'f2cd77e9-4c34-4b40-b91b-2f9896d859ce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.085 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:21 np0005603623 NetworkManager[48970]: <info>  [1769849601.0868] manager: (tapbe6fd5ac-96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/337)
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.089 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.092 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.093 226239 INFO os_vif [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:32:4a,bridge_name='br-int',has_traffic_filtering=True,id=be6fd5ac-96d0-495c-827a-80641ec1d590,network=Network(88788faa-de32-416c-9495-11b3f71610ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6fd5ac-96')#033[00m
Jan 31 03:53:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.254 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.255 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.255 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No VIF found with MAC fa:16:3e:47:32:4a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.255 226239 INFO nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Using config drive#033[00m
Jan 31 03:53:21 np0005603623 nova_compute[226235]: 2026-01-31 08:53:21.285 226239 DEBUG nova.storage.rbd_utils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:53:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:53:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:22.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:53:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:22.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:24 np0005603623 nova_compute[226235]: 2026-01-31 08:53:24.187 226239 INFO nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Creating config drive at /var/lib/nova/instances/f2cd77e9-4c34-4b40-b91b-2f9896d859ce/disk.config#033[00m
Jan 31 03:53:24 np0005603623 nova_compute[226235]: 2026-01-31 08:53:24.190 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f2cd77e9-4c34-4b40-b91b-2f9896d859ce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpv1jbtfug execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:53:24 np0005603623 nova_compute[226235]: 2026-01-31 08:53:24.319 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f2cd77e9-4c34-4b40-b91b-2f9896d859ce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpv1jbtfug" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:53:24 np0005603623 nova_compute[226235]: 2026-01-31 08:53:24.411 226239 DEBUG nova.storage.rbd_utils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:53:24 np0005603623 nova_compute[226235]: 2026-01-31 08:53:24.415 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f2cd77e9-4c34-4b40-b91b-2f9896d859ce/disk.config f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:53:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:53:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:24.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:53:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:24.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:24 np0005603623 nova_compute[226235]: 2026-01-31 08:53:24.933 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.127 226239 DEBUG oslo_concurrency.processutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f2cd77e9-4c34-4b40-b91b-2f9896d859ce/disk.config f2cd77e9-4c34-4b40-b91b-2f9896d859ce_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.712s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.128 226239 INFO nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Deleting local config drive /var/lib/nova/instances/f2cd77e9-4c34-4b40-b91b-2f9896d859ce/disk.config because it was imported into RBD.#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.144 226239 DEBUG nova.network.neutron [req-5e9e17ec-381c-4194-8a4e-f56b84e67ff3 req-a9a220ea-94da-441c-ad31-107dace4f95e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Updated VIF entry in instance network info cache for port be6fd5ac-96d0-495c-827a-80641ec1d590. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.145 226239 DEBUG nova.network.neutron [req-5e9e17ec-381c-4194-8a4e-f56b84e67ff3 req-a9a220ea-94da-441c-ad31-107dace4f95e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Updating instance_info_cache with network_info: [{"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:25 np0005603623 kernel: tapbe6fd5ac-96: entered promiscuous mode
Jan 31 03:53:25 np0005603623 NetworkManager[48970]: <info>  [1769849605.1610] manager: (tapbe6fd5ac-96): new Tun device (/org/freedesktop/NetworkManager/Devices/338)
Jan 31 03:53:25 np0005603623 ovn_controller[133449]: 2026-01-31T08:53:25Z|00725|binding|INFO|Claiming lport be6fd5ac-96d0-495c-827a-80641ec1d590 for this chassis.
Jan 31 03:53:25 np0005603623 ovn_controller[133449]: 2026-01-31T08:53:25Z|00726|binding|INFO|be6fd5ac-96d0-495c-827a-80641ec1d590: Claiming fa:16:3e:47:32:4a 10.100.0.13
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.163 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.178 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:32:4a 10.100.0.13'], port_security=['fa:16:3e:47:32:4a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f2cd77e9-4c34-4b40-b91b-2f9896d859ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88788faa-de32-416c-9495-11b3f71610ce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8d113f51-a0f6-4ac5-a4a7-3be652fd2b6a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b8add34-1673-4dc8-94e4-1618dd3551f8, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=be6fd5ac-96d0-495c-827a-80641ec1d590) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.179 143258 INFO neutron.agent.ovn.metadata.agent [-] Port be6fd5ac-96d0-495c-827a-80641ec1d590 in datapath 88788faa-de32-416c-9495-11b3f71610ce bound to our chassis#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.180 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88788faa-de32-416c-9495-11b3f71610ce#033[00m
Jan 31 03:53:25 np0005603623 systemd-udevd[308854]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.187 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e834d3dc-f70b-4d68-83b4-609b15a6ba03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.188 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88788faa-d1 in ovnmeta-88788faa-de32-416c-9495-11b3f71610ce namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.189 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88788faa-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.189 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[58687da8-ce80-4bfa-80c7-d72416a0f669]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.190 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4b3d6ca1-40fe-4444-84fd-5fe4806c796d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 systemd-machined[194379]: New machine qemu-83-instance-000000ae.
Jan 31 03:53:25 np0005603623 NetworkManager[48970]: <info>  [1769849605.1960] device (tapbe6fd5ac-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:53:25 np0005603623 NetworkManager[48970]: <info>  [1769849605.1968] device (tapbe6fd5ac-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.198 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[81f3b91b-e0c7-45f7-b60a-6e9db63504e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.202 226239 DEBUG oslo_concurrency.lockutils [req-5e9e17ec-381c-4194-8a4e-f56b84e67ff3 req-a9a220ea-94da-441c-ad31-107dace4f95e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:53:25 np0005603623 ovn_controller[133449]: 2026-01-31T08:53:25Z|00727|binding|INFO|Setting lport be6fd5ac-96d0-495c-827a-80641ec1d590 ovn-installed in OVS
Jan 31 03:53:25 np0005603623 ovn_controller[133449]: 2026-01-31T08:53:25Z|00728|binding|INFO|Setting lport be6fd5ac-96d0-495c-827a-80641ec1d590 up in Southbound
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.219 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e7bf5ccc-d224-485a-9c52-a3a6abbf4599]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.243 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b9564b2d-b7d3-43ba-bef8-df2853154cc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 systemd-udevd[308859]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.260 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[16e54beb-bea2-49e5-87b4-639aeece06ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 systemd[1]: Started Virtual Machine qemu-83-instance-000000ae.
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.261 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:25 np0005603623 NetworkManager[48970]: <info>  [1769849605.2624] manager: (tap88788faa-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/339)
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.281 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d108a55c-2420-44a6-a294-47d2101ffa25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.283 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[8391d1fb-6cff-4399-8212-ae0fcb290885]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 NetworkManager[48970]: <info>  [1769849605.2993] device (tap88788faa-d0): carrier: link connected
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.304 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e554092d-3097-4b1a-89dc-303c7d434b84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.316 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[135c7f31-48e2-468a-99cc-b4dc7d30fd44]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88788faa-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:fa:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 215], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 874572, 'reachable_time': 16634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308887, 'error': None, 'target': 'ovnmeta-88788faa-de32-416c-9495-11b3f71610ce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.324 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9276fe0a-afa2-474e-80c0-bb7040fbc3ef]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:fad7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 874572, 'tstamp': 874572}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308889, 'error': None, 'target': 'ovnmeta-88788faa-de32-416c-9495-11b3f71610ce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.334 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0d50cd80-b1a8-4230-bc00-d3197aa0d896]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88788faa-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:fa:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 215], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 874572, 'reachable_time': 16634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308890, 'error': None, 'target': 'ovnmeta-88788faa-de32-416c-9495-11b3f71610ce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.350 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3e7d27e5-2aa4-45ad-a050-8f8c5e45c874]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.381 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[40989e52-02c9-4e3b-94f8-f9864dd06433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.382 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88788faa-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.382 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.382 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88788faa-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.384 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:25 np0005603623 NetworkManager[48970]: <info>  [1769849605.3847] manager: (tap88788faa-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Jan 31 03:53:25 np0005603623 kernel: tap88788faa-d0: entered promiscuous mode
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.386 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88788faa-d0, col_values=(('external_ids', {'iface-id': '1da501db-771f-4fea-baad-4da77c8c5424'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:53:25 np0005603623 ovn_controller[133449]: 2026-01-31T08:53:25Z|00729|binding|INFO|Releasing lport 1da501db-771f-4fea-baad-4da77c8c5424 from this chassis (sb_readonly=0)
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.387 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.388 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88788faa-de32-416c-9495-11b3f71610ce.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88788faa-de32-416c-9495-11b3f71610ce.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.389 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f80c0d8d-dfc7-4b4e-81dd-0fb4cef610b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.389 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-88788faa-de32-416c-9495-11b3f71610ce
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/88788faa-de32-416c-9495-11b3f71610ce.pid.haproxy
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 88788faa-de32-416c-9495-11b3f71610ce
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:53:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:25.390 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88788faa-de32-416c-9495-11b3f71610ce', 'env', 'PROCESS_TAG=haproxy-88788faa-de32-416c-9495-11b3f71610ce', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88788faa-de32-416c-9495-11b3f71610ce.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.393 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:25 np0005603623 podman[308922]: 2026-01-31 08:53:25.676858557 +0000 UTC m=+0.022614331 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.891 226239 DEBUG nova.compute.manager [req-8fa5d071-49d2-432c-a05f-92678951caa0 req-e7f12c0b-de28-44d2-a348-1229b21795c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Received event network-vif-plugged-be6fd5ac-96d0-495c-827a-80641ec1d590 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.892 226239 DEBUG oslo_concurrency.lockutils [req-8fa5d071-49d2-432c-a05f-92678951caa0 req-e7f12c0b-de28-44d2-a348-1229b21795c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.892 226239 DEBUG oslo_concurrency.lockutils [req-8fa5d071-49d2-432c-a05f-92678951caa0 req-e7f12c0b-de28-44d2-a348-1229b21795c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.893 226239 DEBUG oslo_concurrency.lockutils [req-8fa5d071-49d2-432c-a05f-92678951caa0 req-e7f12c0b-de28-44d2-a348-1229b21795c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.893 226239 DEBUG nova.compute.manager [req-8fa5d071-49d2-432c-a05f-92678951caa0 req-e7f12c0b-de28-44d2-a348-1229b21795c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Processing event network-vif-plugged-be6fd5ac-96d0-495c-827a-80641ec1d590 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:53:25 np0005603623 nova_compute[226235]: 2026-01-31 08:53:25.965 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:25 np0005603623 podman[308922]: 2026-01-31 08:53:25.99733331 +0000 UTC m=+0.343089064 container create b9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:53:26 np0005603623 systemd[1]: Started libpod-conmon-b9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6.scope.
Jan 31 03:53:26 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:53:26 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/367c699a40790e0bc9bad1735e59119da88595d9b9dd73fb31915223c941bc90/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.086 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:26 np0005603623 podman[308922]: 2026-01-31 08:53:26.099871907 +0000 UTC m=+0.445627691 container init b9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:53:26 np0005603623 podman[308922]: 2026-01-31 08:53:26.104107159 +0000 UTC m=+0.449862913 container start b9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.120 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849606.1199646, f2cd77e9-4c34-4b40-b91b-2f9896d859ce => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.121 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] VM Started (Lifecycle Event)#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.123 226239 DEBUG nova.compute.manager [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.126 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.130 226239 INFO nova.virt.libvirt.driver [-] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Instance spawned successfully.#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.131 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:53:26 np0005603623 neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce[308979]: [NOTICE]   (308984) : New worker (308986) forked
Jan 31 03:53:26 np0005603623 neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce[308979]: [NOTICE]   (308984) : Loading success.
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.150 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.153 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.192 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.192 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.192 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.193 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.193 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.193 226239 DEBUG nova.virt.libvirt.driver [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.216 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.216 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849606.1205997, f2cd77e9-4c34-4b40-b91b-2f9896d859ce => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.216 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:53:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.248 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.251 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849606.126479, f2cd77e9-4c34-4b40-b91b-2f9896d859ce => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.251 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.323 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.325 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.371 226239 INFO nova.compute.manager [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Took 15.96 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.371 226239 DEBUG nova.compute.manager [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.378 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.477 226239 INFO nova.compute.manager [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Took 17.40 seconds to build instance.#033[00m
Jan 31 03:53:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:26.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:26.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:26 np0005603623 nova_compute[226235]: 2026-01-31 08:53:26.583 226239 DEBUG oslo_concurrency.lockutils [None req-42172269-a827-449d-b66c-4fb107e0b982 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:26.980 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:53:27 np0005603623 nova_compute[226235]: 2026-01-31 08:53:27.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:28 np0005603623 nova_compute[226235]: 2026-01-31 08:53:28.161 226239 DEBUG nova.compute.manager [req-1eae5e68-e094-440c-a9f5-f7f59a3ea65c req-816f1b1a-5d43-4ed6-8ce0-eb92b2423003 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Received event network-vif-plugged-be6fd5ac-96d0-495c-827a-80641ec1d590 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:53:28 np0005603623 nova_compute[226235]: 2026-01-31 08:53:28.161 226239 DEBUG oslo_concurrency.lockutils [req-1eae5e68-e094-440c-a9f5-f7f59a3ea65c req-816f1b1a-5d43-4ed6-8ce0-eb92b2423003 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:28 np0005603623 nova_compute[226235]: 2026-01-31 08:53:28.161 226239 DEBUG oslo_concurrency.lockutils [req-1eae5e68-e094-440c-a9f5-f7f59a3ea65c req-816f1b1a-5d43-4ed6-8ce0-eb92b2423003 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:28 np0005603623 nova_compute[226235]: 2026-01-31 08:53:28.161 226239 DEBUG oslo_concurrency.lockutils [req-1eae5e68-e094-440c-a9f5-f7f59a3ea65c req-816f1b1a-5d43-4ed6-8ce0-eb92b2423003 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:28 np0005603623 nova_compute[226235]: 2026-01-31 08:53:28.161 226239 DEBUG nova.compute.manager [req-1eae5e68-e094-440c-a9f5-f7f59a3ea65c req-816f1b1a-5d43-4ed6-8ce0-eb92b2423003 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] No waiting events found dispatching network-vif-plugged-be6fd5ac-96d0-495c-827a-80641ec1d590 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:53:28 np0005603623 nova_compute[226235]: 2026-01-31 08:53:28.162 226239 WARNING nova.compute.manager [req-1eae5e68-e094-440c-a9f5-f7f59a3ea65c req-816f1b1a-5d43-4ed6-8ce0-eb92b2423003 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Received unexpected event network-vif-plugged-be6fd5ac-96d0-495c-827a-80641ec1d590 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:53:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:53:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:28.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:53:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:28.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:30.144 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:30.144 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:53:30.145 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:30.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:30.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:30 np0005603623 nova_compute[226235]: 2026-01-31 08:53:30.968 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:31 np0005603623 nova_compute[226235]: 2026-01-31 08:53:31.088 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:32.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:32.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000064s ======
Jan 31 03:53:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:34.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 31 03:53:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:34.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:34 np0005603623 podman[309000]: 2026-01-31 08:53:34.96619791 +0000 UTC m=+0.053385956 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:53:35 np0005603623 podman[309001]: 2026-01-31 08:53:35.033373517 +0000 UTC m=+0.118267481 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 31 03:53:35 np0005603623 nova_compute[226235]: 2026-01-31 08:53:35.834 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:35 np0005603623 NetworkManager[48970]: <info>  [1769849615.8352] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/341)
Jan 31 03:53:35 np0005603623 NetworkManager[48970]: <info>  [1769849615.8365] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/342)
Jan 31 03:53:35 np0005603623 nova_compute[226235]: 2026-01-31 08:53:35.856 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:53:35Z|00730|binding|INFO|Releasing lport 1da501db-771f-4fea-baad-4da77c8c5424 from this chassis (sb_readonly=0)
Jan 31 03:53:35 np0005603623 nova_compute[226235]: 2026-01-31 08:53:35.865 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:35 np0005603623 nova_compute[226235]: 2026-01-31 08:53:35.970 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:36 np0005603623 nova_compute[226235]: 2026-01-31 08:53:36.090 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:36.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:36.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:36 np0005603623 nova_compute[226235]: 2026-01-31 08:53:36.633 226239 DEBUG nova.compute.manager [req-bfce97ba-bd4b-4e13-ad4e-7c3330e51006 req-4cfc3fb0-0df5-4c28-938a-68adaa8f99da fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Received event network-changed-be6fd5ac-96d0-495c-827a-80641ec1d590 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:53:36 np0005603623 nova_compute[226235]: 2026-01-31 08:53:36.633 226239 DEBUG nova.compute.manager [req-bfce97ba-bd4b-4e13-ad4e-7c3330e51006 req-4cfc3fb0-0df5-4c28-938a-68adaa8f99da fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Refreshing instance network info cache due to event network-changed-be6fd5ac-96d0-495c-827a-80641ec1d590. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:53:36 np0005603623 nova_compute[226235]: 2026-01-31 08:53:36.634 226239 DEBUG oslo_concurrency.lockutils [req-bfce97ba-bd4b-4e13-ad4e-7c3330e51006 req-4cfc3fb0-0df5-4c28-938a-68adaa8f99da fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:53:36 np0005603623 nova_compute[226235]: 2026-01-31 08:53:36.634 226239 DEBUG oslo_concurrency.lockutils [req-bfce97ba-bd4b-4e13-ad4e-7c3330e51006 req-4cfc3fb0-0df5-4c28-938a-68adaa8f99da fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:53:36 np0005603623 nova_compute[226235]: 2026-01-31 08:53:36.634 226239 DEBUG nova.network.neutron [req-bfce97ba-bd4b-4e13-ad4e-7c3330e51006 req-4cfc3fb0-0df5-4c28-938a-68adaa8f99da fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Refreshing network info cache for port be6fd5ac-96d0-495c-827a-80641ec1d590 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:53:38 np0005603623 ovn_controller[133449]: 2026-01-31T08:53:38Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:47:32:4a 10.100.0.13
Jan 31 03:53:38 np0005603623 ovn_controller[133449]: 2026-01-31T08:53:38Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:47:32:4a 10.100.0.13
Jan 31 03:53:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:53:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:38.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:53:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:38.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:39 np0005603623 nova_compute[226235]: 2026-01-31 08:53:39.132 226239 DEBUG nova.network.neutron [req-bfce97ba-bd4b-4e13-ad4e-7c3330e51006 req-4cfc3fb0-0df5-4c28-938a-68adaa8f99da fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Updated VIF entry in instance network info cache for port be6fd5ac-96d0-495c-827a-80641ec1d590. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:53:39 np0005603623 nova_compute[226235]: 2026-01-31 08:53:39.133 226239 DEBUG nova.network.neutron [req-bfce97ba-bd4b-4e13-ad4e-7c3330e51006 req-4cfc3fb0-0df5-4c28-938a-68adaa8f99da fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Updating instance_info_cache with network_info: [{"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:53:39 np0005603623 nova_compute[226235]: 2026-01-31 08:53:39.457 226239 DEBUG oslo_concurrency.lockutils [req-bfce97ba-bd4b-4e13-ad4e-7c3330e51006 req-4cfc3fb0-0df5-4c28-938a-68adaa8f99da fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:53:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:40.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:40.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:53:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2845082275' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:53:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:53:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2845082275' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:53:40 np0005603623 nova_compute[226235]: 2026-01-31 08:53:40.971 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:41 np0005603623 nova_compute[226235]: 2026-01-31 08:53:41.092 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:53:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:42.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:53:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:42.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:42 np0005603623 nova_compute[226235]: 2026-01-31 08:53:42.846 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:43 np0005603623 nova_compute[226235]: 2026-01-31 08:53:43.843 226239 INFO nova.compute.manager [None req-be30aa0b-cf30-4096-b3cb-9a83829fe46a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Get console output#033[00m
Jan 31 03:53:43 np0005603623 nova_compute[226235]: 2026-01-31 08:53:43.848 270602 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 03:53:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:44.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:44.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:45 np0005603623 nova_compute[226235]: 2026-01-31 08:53:45.973 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:46 np0005603623 nova_compute[226235]: 2026-01-31 08:53:46.094 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:46.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:53:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:46.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:53:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:48.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:48.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:50 np0005603623 podman[309493]: 2026-01-31 08:53:50.381505417 +0000 UTC m=+0.056958818 container create 2bb6e4439967d1f377b7ff7a9a3c2838b1b4516ab4c746aa834d5d6a3ec26aa4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_swartz, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:53:50 np0005603623 podman[309493]: 2026-01-31 08:53:50.34240763 +0000 UTC m=+0.017861051 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 03:53:50 np0005603623 systemd[1]: Started libpod-conmon-2bb6e4439967d1f377b7ff7a9a3c2838b1b4516ab4c746aa834d5d6a3ec26aa4.scope.
Jan 31 03:53:50 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:53:50 np0005603623 podman[309493]: 2026-01-31 08:53:50.502693309 +0000 UTC m=+0.178146740 container init 2bb6e4439967d1f377b7ff7a9a3c2838b1b4516ab4c746aa834d5d6a3ec26aa4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_swartz, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 03:53:50 np0005603623 podman[309493]: 2026-01-31 08:53:50.508106828 +0000 UTC m=+0.183560229 container start 2bb6e4439967d1f377b7ff7a9a3c2838b1b4516ab4c746aa834d5d6a3ec26aa4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_swartz, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 03:53:50 np0005603623 ecstatic_swartz[309509]: 167 167
Jan 31 03:53:50 np0005603623 systemd[1]: libpod-2bb6e4439967d1f377b7ff7a9a3c2838b1b4516ab4c746aa834d5d6a3ec26aa4.scope: Deactivated successfully.
Jan 31 03:53:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:50.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:50 np0005603623 podman[309493]: 2026-01-31 08:53:50.572806048 +0000 UTC m=+0.248259469 container attach 2bb6e4439967d1f377b7ff7a9a3c2838b1b4516ab4c746aa834d5d6a3ec26aa4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_swartz, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 03:53:50 np0005603623 podman[309493]: 2026-01-31 08:53:50.573756688 +0000 UTC m=+0.249210099 container died 2bb6e4439967d1f377b7ff7a9a3c2838b1b4516ab4c746aa834d5d6a3ec26aa4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_swartz, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Jan 31 03:53:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:50.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:50 np0005603623 systemd[1]: var-lib-containers-storage-overlay-164ee259fb054854b17d950fc529695b03383d46653fed3ba71b629a175636bd-merged.mount: Deactivated successfully.
Jan 31 03:53:50 np0005603623 podman[309493]: 2026-01-31 08:53:50.678973769 +0000 UTC m=+0.354427170 container remove 2bb6e4439967d1f377b7ff7a9a3c2838b1b4516ab4c746aa834d5d6a3ec26aa4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_swartz, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Jan 31 03:53:50 np0005603623 systemd[1]: libpod-conmon-2bb6e4439967d1f377b7ff7a9a3c2838b1b4516ab4c746aa834d5d6a3ec26aa4.scope: Deactivated successfully.
Jan 31 03:53:50 np0005603623 podman[309534]: 2026-01-31 08:53:50.82501705 +0000 UTC m=+0.060737426 container create c3074be4d4478feb9583efcadea5c51898266e906b565887bf3f2b36edc1e358 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_euler, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 03:53:50 np0005603623 podman[309534]: 2026-01-31 08:53:50.784381095 +0000 UTC m=+0.020101481 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 03:53:50 np0005603623 systemd[1]: Started libpod-conmon-c3074be4d4478feb9583efcadea5c51898266e906b565887bf3f2b36edc1e358.scope.
Jan 31 03:53:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:50 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:53:50 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/222805782d5a03e4434b8f61e8f542b9e103e561d22d1743e4b74ac04eacfd2a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 03:53:50 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/222805782d5a03e4434b8f61e8f542b9e103e561d22d1743e4b74ac04eacfd2a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 03:53:50 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/222805782d5a03e4434b8f61e8f542b9e103e561d22d1743e4b74ac04eacfd2a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 03:53:50 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/222805782d5a03e4434b8f61e8f542b9e103e561d22d1743e4b74ac04eacfd2a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 03:53:50 np0005603623 nova_compute[226235]: 2026-01-31 08:53:50.974 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:51 np0005603623 podman[309534]: 2026-01-31 08:53:51.001311311 +0000 UTC m=+0.237031697 container init c3074be4d4478feb9583efcadea5c51898266e906b565887bf3f2b36edc1e358 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_euler, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 03:53:51 np0005603623 podman[309534]: 2026-01-31 08:53:51.006905156 +0000 UTC m=+0.242625532 container start c3074be4d4478feb9583efcadea5c51898266e906b565887bf3f2b36edc1e358 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_euler, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 03:53:51 np0005603623 podman[309534]: 2026-01-31 08:53:51.032779788 +0000 UTC m=+0.268500174 container attach c3074be4d4478feb9583efcadea5c51898266e906b565887bf3f2b36edc1e358 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_euler, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 03:53:51 np0005603623 nova_compute[226235]: 2026-01-31 08:53:51.095 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:52 np0005603623 condescending_euler[309551]: [
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:    {
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:        "available": false,
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:        "ceph_device": false,
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:        "lsm_data": {},
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:        "lvs": [],
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:        "path": "/dev/sr0",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:        "rejected_reasons": [
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "Insufficient space (<5GB)",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "Has a FileSystem"
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:        ],
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:        "sys_api": {
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "actuators": null,
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "device_nodes": "sr0",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "devname": "sr0",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "human_readable_size": "482.00 KB",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "id_bus": "ata",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "model": "QEMU DVD-ROM",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "nr_requests": "2",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "parent": "/dev/sr0",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "partitions": {},
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "path": "/dev/sr0",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "removable": "1",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "rev": "2.5+",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "ro": "0",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "rotational": "1",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "sas_address": "",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "sas_device_handle": "",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "scheduler_mode": "mq-deadline",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "sectors": 0,
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "sectorsize": "2048",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "size": 493568.0,
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "support_discard": "2048",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "type": "disk",
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:            "vendor": "QEMU"
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:        }
Jan 31 03:53:52 np0005603623 condescending_euler[309551]:    }
Jan 31 03:53:52 np0005603623 condescending_euler[309551]: ]
Jan 31 03:53:52 np0005603623 systemd[1]: libpod-c3074be4d4478feb9583efcadea5c51898266e906b565887bf3f2b36edc1e358.scope: Deactivated successfully.
Jan 31 03:53:52 np0005603623 systemd[1]: libpod-c3074be4d4478feb9583efcadea5c51898266e906b565887bf3f2b36edc1e358.scope: Consumed 1.119s CPU time.
Jan 31 03:53:52 np0005603623 podman[310881]: 2026-01-31 08:53:52.193526771 +0000 UTC m=+0.021268148 container died c3074be4d4478feb9583efcadea5c51898266e906b565887bf3f2b36edc1e358 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_euler, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 03:53:52 np0005603623 systemd[1]: var-lib-containers-storage-overlay-222805782d5a03e4434b8f61e8f542b9e103e561d22d1743e4b74ac04eacfd2a-merged.mount: Deactivated successfully.
Jan 31 03:53:52 np0005603623 podman[310881]: 2026-01-31 08:53:52.245492471 +0000 UTC m=+0.073233828 container remove c3074be4d4478feb9583efcadea5c51898266e906b565887bf3f2b36edc1e358 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_euler, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:53:52 np0005603623 systemd[1]: libpod-conmon-c3074be4d4478feb9583efcadea5c51898266e906b565887bf3f2b36edc1e358.scope: Deactivated successfully.
Jan 31 03:53:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:53:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:52.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:53:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:52.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:53:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:53:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:53:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:54.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:53:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:54.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:54 np0005603623 nova_compute[226235]: 2026-01-31 08:53:54.727 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:55 np0005603623 nova_compute[226235]: 2026-01-31 08:53:55.977 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:56 np0005603623 nova_compute[226235]: 2026-01-31 08:53:56.096 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:56.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:56.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:58.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:53:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:58.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:00.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:00.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:00 np0005603623 nova_compute[226235]: 2026-01-31 08:54:00.979 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:01 np0005603623 nova_compute[226235]: 2026-01-31 08:54:01.097 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:02.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:02.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:54:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:04.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:54:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:04.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:05 np0005603623 nova_compute[226235]: 2026-01-31 08:54:05.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:05 np0005603623 nova_compute[226235]: 2026-01-31 08:54:05.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:54:05 np0005603623 podman[311003]: 2026-01-31 08:54:05.969262585 +0000 UTC m=+0.055165431 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:54:05 np0005603623 nova_compute[226235]: 2026-01-31 08:54:05.980 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:06 np0005603623 podman[311004]: 2026-01-31 08:54:06.013167183 +0000 UTC m=+0.098848762 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:54:06 np0005603623 nova_compute[226235]: 2026-01-31 08:54:06.114 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:54:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:06.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:54:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:06.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:07 np0005603623 nova_compute[226235]: 2026-01-31 08:54:07.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:07 np0005603623 nova_compute[226235]: 2026-01-31 08:54:07.329 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:07 np0005603623 nova_compute[226235]: 2026-01-31 08:54:07.330 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:07 np0005603623 nova_compute[226235]: 2026-01-31 08:54:07.330 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:07 np0005603623 nova_compute[226235]: 2026-01-31 08:54:07.330 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:54:07 np0005603623 nova_compute[226235]: 2026-01-31 08:54:07.330 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:54:07 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1230630749' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:54:07 np0005603623 nova_compute[226235]: 2026-01-31 08:54:07.791 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:08 np0005603623 nova_compute[226235]: 2026-01-31 08:54:08.163 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000ae as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:54:08 np0005603623 nova_compute[226235]: 2026-01-31 08:54:08.164 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000ae as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:54:08 np0005603623 nova_compute[226235]: 2026-01-31 08:54:08.313 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:54:08 np0005603623 nova_compute[226235]: 2026-01-31 08:54:08.315 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4025MB free_disk=20.91445541381836GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:54:08 np0005603623 nova_compute[226235]: 2026-01-31 08:54:08.315 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:08 np0005603623 nova_compute[226235]: 2026-01-31 08:54:08.315 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:08.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:08.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:09 np0005603623 nova_compute[226235]: 2026-01-31 08:54:09.111 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance f2cd77e9-4c34-4b40-b91b-2f9896d859ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:54:09 np0005603623 nova_compute[226235]: 2026-01-31 08:54:09.111 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:54:09 np0005603623 nova_compute[226235]: 2026-01-31 08:54:09.111 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:54:09 np0005603623 nova_compute[226235]: 2026-01-31 08:54:09.335 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:54:09 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3453587488' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:54:09 np0005603623 nova_compute[226235]: 2026-01-31 08:54:09.768 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:09 np0005603623 nova_compute[226235]: 2026-01-31 08:54:09.772 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:54:10 np0005603623 nova_compute[226235]: 2026-01-31 08:54:10.021 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:54:10 np0005603623 nova_compute[226235]: 2026-01-31 08:54:10.358 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:54:10 np0005603623 nova_compute[226235]: 2026-01-31 08:54:10.358 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:10.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:10 np0005603623 nova_compute[226235]: 2026-01-31 08:54:10.595 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:10.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:10 np0005603623 nova_compute[226235]: 2026-01-31 08:54:10.982 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:11 np0005603623 nova_compute[226235]: 2026-01-31 08:54:11.116 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #151. Immutable memtables: 0.
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.318176) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 95] Flushing memtable with next log file: 151
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849652318202, "job": 95, "event": "flush_started", "num_memtables": 1, "num_entries": 1200, "num_deletes": 256, "total_data_size": 2521189, "memory_usage": 2566816, "flush_reason": "Manual Compaction"}
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 95] Level-0 flush table #152: started
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849652326382, "cf_name": "default", "job": 95, "event": "table_file_creation", "file_number": 152, "file_size": 1641451, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 74264, "largest_seqno": 75459, "table_properties": {"data_size": 1636203, "index_size": 2643, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11631, "raw_average_key_size": 19, "raw_value_size": 1625574, "raw_average_value_size": 2755, "num_data_blocks": 114, "num_entries": 590, "num_filter_entries": 590, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849569, "oldest_key_time": 1769849569, "file_creation_time": 1769849652, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 152, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 95] Flush lasted 8262 microseconds, and 3401 cpu microseconds.
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.326433) [db/flush_job.cc:967] [default] [JOB 95] Level-0 flush table #152: 1641451 bytes OK
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.326474) [db/memtable_list.cc:519] [default] Level-0 commit table #152 started
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.328563) [db/memtable_list.cc:722] [default] Level-0 commit table #152: memtable #1 done
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.328580) EVENT_LOG_v1 {"time_micros": 1769849652328574, "job": 95, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.328598) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 95] Try to delete WAL files size 2515377, prev total WAL file size 2515377, number of live WAL files 2.
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000148.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.329043) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032373633' seq:72057594037927935, type:22 .. '6C6F676D0033303135' seq:0, type:0; will stop at (end)
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 96] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 95 Base level 0, inputs: [152(1602KB)], [150(12MB)]
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849652329085, "job": 96, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [152], "files_L6": [150], "score": -1, "input_data_size": 14411246, "oldest_snapshot_seqno": -1}
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 96] Generated table #153: 9631 keys, 14255457 bytes, temperature: kUnknown
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849652501785, "cf_name": "default", "job": 96, "event": "table_file_creation", "file_number": 153, "file_size": 14255457, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14190930, "index_size": 39326, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24133, "raw_key_size": 253674, "raw_average_key_size": 26, "raw_value_size": 14020174, "raw_average_value_size": 1455, "num_data_blocks": 1510, "num_entries": 9631, "num_filter_entries": 9631, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769849652, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 153, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.502023) [db/compaction/compaction_job.cc:1663] [default] [JOB 96] Compacted 1@0 + 1@6 files to L6 => 14255457 bytes
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.503711) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 83.4 rd, 82.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 12.2 +0.0 blob) out(13.6 +0.0 blob), read-write-amplify(17.5) write-amplify(8.7) OK, records in: 10160, records dropped: 529 output_compression: NoCompression
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.503730) EVENT_LOG_v1 {"time_micros": 1769849652503721, "job": 96, "event": "compaction_finished", "compaction_time_micros": 172754, "compaction_time_cpu_micros": 27028, "output_level": 6, "num_output_files": 1, "total_output_size": 14255457, "num_input_records": 10160, "num_output_records": 9631, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000152.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849652503978, "job": 96, "event": "table_file_deletion", "file_number": 152}
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000150.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849652505156, "job": 96, "event": "table_file_deletion", "file_number": 150}
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.328992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.505207) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.505212) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.505213) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.505215) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:12.505216) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:12.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:54:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:12.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:54:13 np0005603623 nova_compute[226235]: 2026-01-31 08:54:13.358 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:13 np0005603623 nova_compute[226235]: 2026-01-31 08:54:13.359 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:54:13 np0005603623 nova_compute[226235]: 2026-01-31 08:54:13.359 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:54:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:14.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:14.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:54:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/902819234' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:54:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:54:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/902819234' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:54:15 np0005603623 nova_compute[226235]: 2026-01-31 08:54:15.984 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:16 np0005603623 nova_compute[226235]: 2026-01-31 08:54:16.117 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:54:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:16.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:54:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:54:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:16.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:54:18 np0005603623 nova_compute[226235]: 2026-01-31 08:54:18.402 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:54:18 np0005603623 nova_compute[226235]: 2026-01-31 08:54:18.402 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:54:18 np0005603623 nova_compute[226235]: 2026-01-31 08:54:18.402 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:54:18 np0005603623 nova_compute[226235]: 2026-01-31 08:54:18.402 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f2cd77e9-4c34-4b40-b91b-2f9896d859ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:54:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:18.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:18.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:54:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:20.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:54:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:20.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:20 np0005603623 nova_compute[226235]: 2026-01-31 08:54:20.986 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:21 np0005603623 nova_compute[226235]: 2026-01-31 08:54:21.119 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:21 np0005603623 nova_compute[226235]: 2026-01-31 08:54:21.364 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:21.364 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:54:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:21.366 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:54:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:22.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:54:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:22.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:54:24 np0005603623 nova_compute[226235]: 2026-01-31 08:54:24.113 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Updating instance_info_cache with network_info: [{"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:54:24 np0005603623 nova_compute[226235]: 2026-01-31 08:54:24.284 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:54:24 np0005603623 nova_compute[226235]: 2026-01-31 08:54:24.284 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:54:24 np0005603623 nova_compute[226235]: 2026-01-31 08:54:24.284 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:24 np0005603623 nova_compute[226235]: 2026-01-31 08:54:24.285 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:24 np0005603623 nova_compute[226235]: 2026-01-31 08:54:24.285 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:24 np0005603623 nova_compute[226235]: 2026-01-31 08:54:24.285 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:24.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:54:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:24.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:54:25 np0005603623 nova_compute[226235]: 2026-01-31 08:54:25.988 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:26 np0005603623 nova_compute[226235]: 2026-01-31 08:54:26.120 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:26 np0005603623 nova_compute[226235]: 2026-01-31 08:54:26.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:26 np0005603623 nova_compute[226235]: 2026-01-31 08:54:26.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:26.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:26.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:27 np0005603623 nova_compute[226235]: 2026-01-31 08:54:27.294 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:27 np0005603623 nova_compute[226235]: 2026-01-31 08:54:27.295 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:27 np0005603623 nova_compute[226235]: 2026-01-31 08:54:27.596 226239 DEBUG nova.compute.manager [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:54:28 np0005603623 nova_compute[226235]: 2026-01-31 08:54:28.006 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:28 np0005603623 nova_compute[226235]: 2026-01-31 08:54:28.007 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:28 np0005603623 nova_compute[226235]: 2026-01-31 08:54:28.014 226239 DEBUG nova.virt.hardware [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:54:28 np0005603623 nova_compute[226235]: 2026-01-31 08:54:28.014 226239 INFO nova.compute.claims [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:54:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:28.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:28.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:28 np0005603623 nova_compute[226235]: 2026-01-31 08:54:28.817 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:29 np0005603623 nova_compute[226235]: 2026-01-31 08:54:29.681 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:54:30 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2569195746' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:54:30 np0005603623 nova_compute[226235]: 2026-01-31 08:54:30.116 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:30 np0005603623 nova_compute[226235]: 2026-01-31 08:54:30.121 226239 DEBUG nova.compute.provider_tree [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:54:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:30.145 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:30.145 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:30.146 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:30 np0005603623 nova_compute[226235]: 2026-01-31 08:54:30.208 226239 DEBUG nova.scheduler.client.report [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:54:30 np0005603623 nova_compute[226235]: 2026-01-31 08:54:30.323 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:30 np0005603623 nova_compute[226235]: 2026-01-31 08:54:30.325 226239 DEBUG nova.compute.manager [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:54:30 np0005603623 nova_compute[226235]: 2026-01-31 08:54:30.499 226239 DEBUG nova.compute.manager [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:54:30 np0005603623 nova_compute[226235]: 2026-01-31 08:54:30.500 226239 DEBUG nova.network.neutron [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:54:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:30.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:30.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:30 np0005603623 nova_compute[226235]: 2026-01-31 08:54:30.675 226239 INFO nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:54:30 np0005603623 nova_compute[226235]: 2026-01-31 08:54:30.858 226239 DEBUG nova.compute.manager [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:54:30 np0005603623 nova_compute[226235]: 2026-01-31 08:54:30.989 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.123 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:31.369 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.424 226239 DEBUG nova.compute.manager [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.425 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.426 226239 INFO nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Creating image(s)#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.456 226239 DEBUG nova.storage.rbd_utils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.488 226239 DEBUG nova.storage.rbd_utils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.524 226239 DEBUG nova.storage.rbd_utils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.530 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.558 226239 DEBUG nova.policy [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd442c7ba12ed444ca6d4dcc5cfd36150', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.587 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.590 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.591 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.591 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.615 226239 DEBUG nova.storage.rbd_utils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.619 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:31 np0005603623 nova_compute[226235]: 2026-01-31 08:54:31.976 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.357s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:32 np0005603623 nova_compute[226235]: 2026-01-31 08:54:32.041 226239 DEBUG nova.storage.rbd_utils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] resizing rbd image f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:54:32 np0005603623 nova_compute[226235]: 2026-01-31 08:54:32.139 226239 DEBUG nova.objects.instance [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'migration_context' on Instance uuid f29e9f11-fa0f-4fe5-a566-82d8becccb7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:54:32 np0005603623 nova_compute[226235]: 2026-01-31 08:54:32.229 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:54:32 np0005603623 nova_compute[226235]: 2026-01-31 08:54:32.230 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Ensure instance console log exists: /var/lib/nova/instances/f29e9f11-fa0f-4fe5-a566-82d8becccb7f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:54:32 np0005603623 nova_compute[226235]: 2026-01-31 08:54:32.231 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:32 np0005603623 nova_compute[226235]: 2026-01-31 08:54:32.231 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:32 np0005603623 nova_compute[226235]: 2026-01-31 08:54:32.231 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:32.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:32.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:54:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:34.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:54:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:54:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:34.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:54:35 np0005603623 nova_compute[226235]: 2026-01-31 08:54:35.991 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:36 np0005603623 nova_compute[226235]: 2026-01-31 08:54:36.124 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:36 np0005603623 nova_compute[226235]: 2026-01-31 08:54:36.394 226239 DEBUG nova.network.neutron [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Successfully created port: 877a2409-bb4a-41f9-9fd4-20ce910f0fd5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:54:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:54:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:36.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:54:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:36.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:36 np0005603623 podman[311345]: 2026-01-31 08:54:36.959234615 +0000 UTC m=+0.051406744 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:54:36 np0005603623 podman[311346]: 2026-01-31 08:54:36.978657864 +0000 UTC m=+0.068482320 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:54:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:54:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:38.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:54:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:38.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:39 np0005603623 nova_compute[226235]: 2026-01-31 08:54:39.390 226239 DEBUG nova.network.neutron [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Successfully updated port: 877a2409-bb4a-41f9-9fd4-20ce910f0fd5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:54:39 np0005603623 nova_compute[226235]: 2026-01-31 08:54:39.431 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "refresh_cache-f29e9f11-fa0f-4fe5-a566-82d8becccb7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:54:39 np0005603623 nova_compute[226235]: 2026-01-31 08:54:39.432 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquired lock "refresh_cache-f29e9f11-fa0f-4fe5-a566-82d8becccb7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:54:39 np0005603623 nova_compute[226235]: 2026-01-31 08:54:39.432 226239 DEBUG nova.network.neutron [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:54:39 np0005603623 nova_compute[226235]: 2026-01-31 08:54:39.704 226239 DEBUG nova.compute.manager [req-0f525f81-6e60-4e25-b616-4af3b108eb5d req-3ef2263d-406e-40dd-b4af-5f070321857a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Received event network-changed-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:54:39 np0005603623 nova_compute[226235]: 2026-01-31 08:54:39.705 226239 DEBUG nova.compute.manager [req-0f525f81-6e60-4e25-b616-4af3b108eb5d req-3ef2263d-406e-40dd-b4af-5f070321857a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Refreshing instance network info cache due to event network-changed-877a2409-bb4a-41f9-9fd4-20ce910f0fd5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:54:39 np0005603623 nova_compute[226235]: 2026-01-31 08:54:39.705 226239 DEBUG oslo_concurrency.lockutils [req-0f525f81-6e60-4e25-b616-4af3b108eb5d req-3ef2263d-406e-40dd-b4af-5f070321857a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f29e9f11-fa0f-4fe5-a566-82d8becccb7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:54:40 np0005603623 nova_compute[226235]: 2026-01-31 08:54:40.011 226239 DEBUG nova.network.neutron [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:54:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:40.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000064s ======
Jan 31 03:54:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:40.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 31 03:54:40 np0005603623 nova_compute[226235]: 2026-01-31 08:54:40.994 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:41 np0005603623 nova_compute[226235]: 2026-01-31 08:54:41.126 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:42.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:42.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.800 226239 DEBUG nova.network.neutron [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Updating instance_info_cache with network_info: [{"id": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "address": "fa:16:3e:5e:89:08", "network": {"id": "f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f", "bridge": "br-int", "label": "tempest-network-smoke--178981888", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877a2409-bb", "ovs_interfaceid": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.837 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Releasing lock "refresh_cache-f29e9f11-fa0f-4fe5-a566-82d8becccb7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.838 226239 DEBUG nova.compute.manager [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Instance network_info: |[{"id": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "address": "fa:16:3e:5e:89:08", "network": {"id": "f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f", "bridge": "br-int", "label": "tempest-network-smoke--178981888", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877a2409-bb", "ovs_interfaceid": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.839 226239 DEBUG oslo_concurrency.lockutils [req-0f525f81-6e60-4e25-b616-4af3b108eb5d req-3ef2263d-406e-40dd-b4af-5f070321857a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f29e9f11-fa0f-4fe5-a566-82d8becccb7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.839 226239 DEBUG nova.network.neutron [req-0f525f81-6e60-4e25-b616-4af3b108eb5d req-3ef2263d-406e-40dd-b4af-5f070321857a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Refreshing network info cache for port 877a2409-bb4a-41f9-9fd4-20ce910f0fd5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.842 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Start _get_guest_xml network_info=[{"id": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "address": "fa:16:3e:5e:89:08", "network": {"id": "f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f", "bridge": "br-int", "label": "tempest-network-smoke--178981888", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877a2409-bb", "ovs_interfaceid": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.846 226239 WARNING nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.853 226239 DEBUG nova.virt.libvirt.host [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.853 226239 DEBUG nova.virt.libvirt.host [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.858 226239 DEBUG nova.virt.libvirt.host [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.859 226239 DEBUG nova.virt.libvirt.host [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.860 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.860 226239 DEBUG nova.virt.hardware [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.861 226239 DEBUG nova.virt.hardware [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.861 226239 DEBUG nova.virt.hardware [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.861 226239 DEBUG nova.virt.hardware [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.861 226239 DEBUG nova.virt.hardware [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.862 226239 DEBUG nova.virt.hardware [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.862 226239 DEBUG nova.virt.hardware [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.862 226239 DEBUG nova.virt.hardware [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.862 226239 DEBUG nova.virt.hardware [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.862 226239 DEBUG nova.virt.hardware [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.863 226239 DEBUG nova.virt.hardware [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:54:42 np0005603623 nova_compute[226235]: 2026-01-31 08:54:42.867 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:54:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3247287874' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.326 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.349 226239 DEBUG nova.storage.rbd_utils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.353 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:54:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4116382506' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.793 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.794 226239 DEBUG nova.virt.libvirt.vif [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:54:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-612660049',display_name='tempest-TestNetworkBasicOps-server-612660049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-612660049',id=177,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCNzwGCpb0oBUzRcNo+awbxNxuGMJBl5Zfp2GKEtKokcpPNzsdsm/xnexhK/rRBrUpvaOYbWNNFVeXug28+rVoRUqgocA3Ra+UPxvdrhuNzOIZaQWx1FNDHOPDX0Z6GC6w==',key_name='tempest-TestNetworkBasicOps-1061129892',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-af0wzd8d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:54:31Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=f29e9f11-fa0f-4fe5-a566-82d8becccb7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "address": "fa:16:3e:5e:89:08", "network": {"id": "f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f", "bridge": "br-int", "label": "tempest-network-smoke--178981888", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877a2409-bb", "ovs_interfaceid": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.794 226239 DEBUG nova.network.os_vif_util [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "address": "fa:16:3e:5e:89:08", "network": {"id": "f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f", "bridge": "br-int", "label": "tempest-network-smoke--178981888", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877a2409-bb", "ovs_interfaceid": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.795 226239 DEBUG nova.network.os_vif_util [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:89:08,bridge_name='br-int',has_traffic_filtering=True,id=877a2409-bb4a-41f9-9fd4-20ce910f0fd5,network=Network(f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877a2409-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.796 226239 DEBUG nova.objects.instance [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'pci_devices' on Instance uuid f29e9f11-fa0f-4fe5-a566-82d8becccb7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.871 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  <uuid>f29e9f11-fa0f-4fe5-a566-82d8becccb7f</uuid>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  <name>instance-000000b1</name>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestNetworkBasicOps-server-612660049</nova:name>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:54:42</nova:creationTime>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <nova:user uuid="d442c7ba12ed444ca6d4dcc5cfd36150">tempest-TestNetworkBasicOps-104417095-project-member</nova:user>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <nova:project uuid="abf9393aa2b646feb00a3d887a9dee14">tempest-TestNetworkBasicOps-104417095</nova:project>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <nova:port uuid="877a2409-bb4a-41f9-9fd4-20ce910f0fd5">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <entry name="serial">f29e9f11-fa0f-4fe5-a566-82d8becccb7f</entry>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <entry name="uuid">f29e9f11-fa0f-4fe5-a566-82d8becccb7f</entry>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk.config">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:5e:89:08"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <target dev="tap877a2409-bb"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/f29e9f11-fa0f-4fe5-a566-82d8becccb7f/console.log" append="off"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:54:43 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:54:43 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:54:43 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:54:43 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.872 226239 DEBUG nova.compute.manager [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Preparing to wait for external event network-vif-plugged-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.872 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.873 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.873 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.874 226239 DEBUG nova.virt.libvirt.vif [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:54:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-612660049',display_name='tempest-TestNetworkBasicOps-server-612660049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-612660049',id=177,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCNzwGCpb0oBUzRcNo+awbxNxuGMJBl5Zfp2GKEtKokcpPNzsdsm/xnexhK/rRBrUpvaOYbWNNFVeXug28+rVoRUqgocA3Ra+UPxvdrhuNzOIZaQWx1FNDHOPDX0Z6GC6w==',key_name='tempest-TestNetworkBasicOps-1061129892',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-af0wzd8d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:54:31Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=f29e9f11-fa0f-4fe5-a566-82d8becccb7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "address": "fa:16:3e:5e:89:08", "network": {"id": "f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f", "bridge": "br-int", "label": "tempest-network-smoke--178981888", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877a2409-bb", "ovs_interfaceid": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.874 226239 DEBUG nova.network.os_vif_util [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "address": "fa:16:3e:5e:89:08", "network": {"id": "f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f", "bridge": "br-int", "label": "tempest-network-smoke--178981888", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877a2409-bb", "ovs_interfaceid": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.875 226239 DEBUG nova.network.os_vif_util [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:89:08,bridge_name='br-int',has_traffic_filtering=True,id=877a2409-bb4a-41f9-9fd4-20ce910f0fd5,network=Network(f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877a2409-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.876 226239 DEBUG os_vif [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:89:08,bridge_name='br-int',has_traffic_filtering=True,id=877a2409-bb4a-41f9-9fd4-20ce910f0fd5,network=Network(f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877a2409-bb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.876 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.877 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.877 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.880 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.880 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap877a2409-bb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.881 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap877a2409-bb, col_values=(('external_ids', {'iface-id': '877a2409-bb4a-41f9-9fd4-20ce910f0fd5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:89:08', 'vm-uuid': 'f29e9f11-fa0f-4fe5-a566-82d8becccb7f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.882 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:43 np0005603623 NetworkManager[48970]: <info>  [1769849683.8833] manager: (tap877a2409-bb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/343)
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.885 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.887 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:43 np0005603623 nova_compute[226235]: 2026-01-31 08:54:43.888 226239 INFO os_vif [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:89:08,bridge_name='br-int',has_traffic_filtering=True,id=877a2409-bb4a-41f9-9fd4-20ce910f0fd5,network=Network(f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877a2409-bb')#033[00m
Jan 31 03:54:44 np0005603623 nova_compute[226235]: 2026-01-31 08:54:44.002 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:54:44 np0005603623 nova_compute[226235]: 2026-01-31 08:54:44.003 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:54:44 np0005603623 nova_compute[226235]: 2026-01-31 08:54:44.003 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No VIF found with MAC fa:16:3e:5e:89:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:54:44 np0005603623 nova_compute[226235]: 2026-01-31 08:54:44.004 226239 INFO nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Using config drive#033[00m
Jan 31 03:54:44 np0005603623 nova_compute[226235]: 2026-01-31 08:54:44.029 226239 DEBUG nova.storage.rbd_utils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:54:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:44.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:44.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:44 np0005603623 nova_compute[226235]: 2026-01-31 08:54:44.873 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.110 226239 INFO nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Creating config drive at /var/lib/nova/instances/f29e9f11-fa0f-4fe5-a566-82d8becccb7f/disk.config#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.114 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f29e9f11-fa0f-4fe5-a566-82d8becccb7f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpump7u432 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.241 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f29e9f11-fa0f-4fe5-a566-82d8becccb7f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpump7u432" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.269 226239 DEBUG nova.storage.rbd_utils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.273 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f29e9f11-fa0f-4fe5-a566-82d8becccb7f/disk.config f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.433 226239 DEBUG oslo_concurrency.processutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f29e9f11-fa0f-4fe5-a566-82d8becccb7f/disk.config f29e9f11-fa0f-4fe5-a566-82d8becccb7f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.434 226239 INFO nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Deleting local config drive /var/lib/nova/instances/f29e9f11-fa0f-4fe5-a566-82d8becccb7f/disk.config because it was imported into RBD.#033[00m
Jan 31 03:54:45 np0005603623 kernel: tap877a2409-bb: entered promiscuous mode
Jan 31 03:54:45 np0005603623 NetworkManager[48970]: <info>  [1769849685.4729] manager: (tap877a2409-bb): new Tun device (/org/freedesktop/NetworkManager/Devices/344)
Jan 31 03:54:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:54:45Z|00731|binding|INFO|Claiming lport 877a2409-bb4a-41f9-9fd4-20ce910f0fd5 for this chassis.
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.473 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:54:45Z|00732|binding|INFO|877a2409-bb4a-41f9-9fd4-20ce910f0fd5: Claiming fa:16:3e:5e:89:08 10.100.0.29
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.477 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.488 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:54:45Z|00733|binding|INFO|Setting lport 877a2409-bb4a-41f9-9fd4-20ce910f0fd5 ovn-installed in OVS
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.490 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:45 np0005603623 systemd-udevd[311574]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:54:45 np0005603623 systemd-machined[194379]: New machine qemu-84-instance-000000b1.
Jan 31 03:54:45 np0005603623 NetworkManager[48970]: <info>  [1769849685.5064] device (tap877a2409-bb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:54:45 np0005603623 NetworkManager[48970]: <info>  [1769849685.5071] device (tap877a2409-bb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:54:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:54:45Z|00734|binding|INFO|Setting lport 877a2409-bb4a-41f9-9fd4-20ce910f0fd5 up in Southbound
Jan 31 03:54:45 np0005603623 systemd[1]: Started Virtual Machine qemu-84-instance-000000b1.
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.513 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:89:08 10.100.0.29'], port_security=['fa:16:3e:5e:89:08 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': 'f29e9f11-fa0f-4fe5-a566-82d8becccb7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0d9c16a7-acba-4417-880b-0294fb98bb1c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b658669c-361a-4865-baa4-bcdfdd5eec17, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=877a2409-bb4a-41f9-9fd4-20ce910f0fd5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.514 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 877a2409-bb4a-41f9-9fd4-20ce910f0fd5 in datapath f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f bound to our chassis#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.516 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.522 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8a37da93-9e07-4a54-8079-b395568b2e29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.523 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf8e02b4b-21 in ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.525 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf8e02b4b-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.525 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[57a6f2de-f592-4aed-adb7-33a73666aa39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.525 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[644821fd-751a-4da9-81dd-aa4e5e863d4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.533 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[f7aa5037-2f3e-411e-a150-0d2893649d4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.553 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4bcf3acf-69cd-4293-ae72-fc6ef8ca2573]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.576 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e0b6c3-dbb7-4976-928a-6c9467937532]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.579 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6361dc3a-576a-479f-b68d-da918db69f76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 NetworkManager[48970]: <info>  [1769849685.5809] manager: (tapf8e02b4b-20): new Veth device (/org/freedesktop/NetworkManager/Devices/345)
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.604 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d5639b-f44d-4d84-ac42-ea1668754c74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.606 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[2b930894-6a4a-468e-9774-0d5dd0a8e60e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 NetworkManager[48970]: <info>  [1769849685.6199] device (tapf8e02b4b-20): carrier: link connected
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.623 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[712f010a-f4b2-4b49-904d-e1677ae3681c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.635 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9853b035-ba59-4eef-9912-42c2a490e0e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8e02b4b-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:6f:b4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 882604, 'reachable_time': 38363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 311608, 'error': None, 'target': 'ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.644 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7a57c293-22e3-48d3-bd0a-9891e2ca9b5b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe48:6fb4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 882604, 'tstamp': 882604}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 311609, 'error': None, 'target': 'ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.655 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[19f7f686-bae1-4235-85da-3e1950bbb2b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8e02b4b-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:6f:b4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 882604, 'reachable_time': 38363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 311610, 'error': None, 'target': 'ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.672 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f62f0c83-2e45-4350-b4cb-df215b850180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.715 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab09e81-3094-4f24-a993-e0bf9b3fb01d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.717 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8e02b4b-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.717 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.717 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8e02b4b-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.719 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:45 np0005603623 NetworkManager[48970]: <info>  [1769849685.7197] manager: (tapf8e02b4b-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Jan 31 03:54:45 np0005603623 kernel: tapf8e02b4b-20: entered promiscuous mode
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.721 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf8e02b4b-20, col_values=(('external_ids', {'iface-id': 'e3eb9ef9-cdd2-4606-8d60-802acc3c8dd3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.722 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.723 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.723 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:54:45 np0005603623 ovn_controller[133449]: 2026-01-31T08:54:45Z|00735|binding|INFO|Releasing lport e3eb9ef9-cdd2-4606-8d60-802acc3c8dd3 from this chassis (sb_readonly=0)
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.724 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5d4b6ce2-8eac-4305-a468-6e48461991d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.725 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f.pid.haproxy
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:54:45 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:54:45.725 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f', 'env', 'PROCESS_TAG=haproxy-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.729 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.891 226239 DEBUG nova.compute.manager [req-b6d6ddcc-2d09-42c3-8d9a-d85309511d78 req-81031ab1-3173-4681-a710-3552b49f3ac0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Received event network-vif-plugged-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.892 226239 DEBUG oslo_concurrency.lockutils [req-b6d6ddcc-2d09-42c3-8d9a-d85309511d78 req-81031ab1-3173-4681-a710-3552b49f3ac0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.892 226239 DEBUG oslo_concurrency.lockutils [req-b6d6ddcc-2d09-42c3-8d9a-d85309511d78 req-81031ab1-3173-4681-a710-3552b49f3ac0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.892 226239 DEBUG oslo_concurrency.lockutils [req-b6d6ddcc-2d09-42c3-8d9a-d85309511d78 req-81031ab1-3173-4681-a710-3552b49f3ac0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:45 np0005603623 nova_compute[226235]: 2026-01-31 08:54:45.892 226239 DEBUG nova.compute.manager [req-b6d6ddcc-2d09-42c3-8d9a-d85309511d78 req-81031ab1-3173-4681-a710-3552b49f3ac0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Processing event network-vif-plugged-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:54:46 np0005603623 nova_compute[226235]: 2026-01-31 08:54:46.009 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:46 np0005603623 podman[311642]: 2026-01-31 08:54:46.064313277 +0000 UTC m=+0.074198929 container create bb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:54:46 np0005603623 systemd[1]: Started libpod-conmon-bb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74.scope.
Jan 31 03:54:46 np0005603623 podman[311642]: 2026-01-31 08:54:46.024696424 +0000 UTC m=+0.034582076 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:54:46 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:54:46 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b6a11ad609c1488b39886aea30ae06ddd83bb7c393f23ea58cbbb9aaf7fdd01/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:54:46 np0005603623 podman[311642]: 2026-01-31 08:54:46.150848422 +0000 UTC m=+0.160734094 container init bb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:54:46 np0005603623 podman[311642]: 2026-01-31 08:54:46.155122725 +0000 UTC m=+0.165008377 container start bb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:54:46 np0005603623 neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f[311657]: [NOTICE]   (311662) : New worker (311664) forked
Jan 31 03:54:46 np0005603623 neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f[311657]: [NOTICE]   (311662) : Loading success.
Jan 31 03:54:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:46 np0005603623 nova_compute[226235]: 2026-01-31 08:54:46.501 226239 DEBUG nova.network.neutron [req-0f525f81-6e60-4e25-b616-4af3b108eb5d req-3ef2263d-406e-40dd-b4af-5f070321857a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Updated VIF entry in instance network info cache for port 877a2409-bb4a-41f9-9fd4-20ce910f0fd5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:54:46 np0005603623 nova_compute[226235]: 2026-01-31 08:54:46.502 226239 DEBUG nova.network.neutron [req-0f525f81-6e60-4e25-b616-4af3b108eb5d req-3ef2263d-406e-40dd-b4af-5f070321857a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Updating instance_info_cache with network_info: [{"id": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "address": "fa:16:3e:5e:89:08", "network": {"id": "f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f", "bridge": "br-int", "label": "tempest-network-smoke--178981888", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877a2409-bb", "ovs_interfaceid": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:54:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:46.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:46.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:46 np0005603623 nova_compute[226235]: 2026-01-31 08:54:46.845 226239 DEBUG oslo_concurrency.lockutils [req-0f525f81-6e60-4e25-b616-4af3b108eb5d req-3ef2263d-406e-40dd-b4af-5f070321857a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f29e9f11-fa0f-4fe5-a566-82d8becccb7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.022 226239 DEBUG nova.compute.manager [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.023 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849687.021913, f29e9f11-fa0f-4fe5-a566-82d8becccb7f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.023 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] VM Started (Lifecycle Event)#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.025 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.028 226239 INFO nova.virt.libvirt.driver [-] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Instance spawned successfully.#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.028 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.075 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.082 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.086 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.086 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.087 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.087 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.088 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.088 226239 DEBUG nova.virt.libvirt.driver [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.153 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.154 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849687.0221326, f29e9f11-fa0f-4fe5-a566-82d8becccb7f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.154 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.191 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.195 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849687.0249808, f29e9f11-fa0f-4fe5-a566-82d8becccb7f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.195 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.211 226239 INFO nova.compute.manager [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Took 15.79 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.212 226239 DEBUG nova.compute.manager [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.246 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.249 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.287 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.309 226239 INFO nova.compute.manager [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Took 19.41 seconds to build instance.#033[00m
Jan 31 03:54:47 np0005603623 nova_compute[226235]: 2026-01-31 08:54:47.340 226239 DEBUG oslo_concurrency.lockutils [None req-2ad90a58-c4ed-4cde-84fd-da4c3b356feb d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #154. Immutable memtables: 0.
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.343023) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 97] Flushing memtable with next log file: 154
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849687343098, "job": 97, "event": "flush_started", "num_memtables": 1, "num_entries": 623, "num_deletes": 252, "total_data_size": 930810, "memory_usage": 942288, "flush_reason": "Manual Compaction"}
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 97] Level-0 flush table #155: started
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849687347372, "cf_name": "default", "job": 97, "event": "table_file_creation", "file_number": 155, "file_size": 427060, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 75464, "largest_seqno": 76082, "table_properties": {"data_size": 424340, "index_size": 691, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7578, "raw_average_key_size": 20, "raw_value_size": 418653, "raw_average_value_size": 1128, "num_data_blocks": 31, "num_entries": 371, "num_filter_entries": 371, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849652, "oldest_key_time": 1769849652, "file_creation_time": 1769849687, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 155, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 97] Flush lasted 4384 microseconds, and 1742 cpu microseconds.
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.347416) [db/flush_job.cc:967] [default] [JOB 97] Level-0 flush table #155: 427060 bytes OK
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.347436) [db/memtable_list.cc:519] [default] Level-0 commit table #155 started
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.350624) [db/memtable_list.cc:722] [default] Level-0 commit table #155: memtable #1 done
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.350638) EVENT_LOG_v1 {"time_micros": 1769849687350633, "job": 97, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.350655) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 97] Try to delete WAL files size 927337, prev total WAL file size 927337, number of live WAL files 2.
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000151.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.351119) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032353034' seq:72057594037927935, type:22 .. '6D6772737461740032373537' seq:0, type:0; will stop at (end)
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 98] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 97 Base level 0, inputs: [155(417KB)], [153(13MB)]
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849687351174, "job": 98, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [155], "files_L6": [153], "score": -1, "input_data_size": 14682517, "oldest_snapshot_seqno": -1}
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 98] Generated table #156: 9501 keys, 10991724 bytes, temperature: kUnknown
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849687485695, "cf_name": "default", "job": 98, "event": "table_file_creation", "file_number": 156, "file_size": 10991724, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10932581, "index_size": 34262, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23813, "raw_key_size": 251168, "raw_average_key_size": 26, "raw_value_size": 10768598, "raw_average_value_size": 1133, "num_data_blocks": 1299, "num_entries": 9501, "num_filter_entries": 9501, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769849687, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 156, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.485959) [db/compaction/compaction_job.cc:1663] [default] [JOB 98] Compacted 1@0 + 1@6 files to L6 => 10991724 bytes
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.490876) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 109.1 rd, 81.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 13.6 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(60.1) write-amplify(25.7) OK, records in: 10002, records dropped: 501 output_compression: NoCompression
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.490906) EVENT_LOG_v1 {"time_micros": 1769849687490894, "job": 98, "event": "compaction_finished", "compaction_time_micros": 134601, "compaction_time_cpu_micros": 21532, "output_level": 6, "num_output_files": 1, "total_output_size": 10991724, "num_input_records": 10002, "num_output_records": 9501, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000155.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849687491120, "job": 98, "event": "table_file_deletion", "file_number": 155}
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000153.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849687492375, "job": 98, "event": "table_file_deletion", "file_number": 153}
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.351038) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.492420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.492425) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.492426) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.492428) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:47 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:54:47.492429) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:48 np0005603623 nova_compute[226235]: 2026-01-31 08:54:48.063 226239 DEBUG nova.compute.manager [req-37ce43f1-5d07-457f-8fde-beb5e114192d req-cae07786-51cc-4dbe-96ff-5069e3c8e5ff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Received event network-vif-plugged-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:54:48 np0005603623 nova_compute[226235]: 2026-01-31 08:54:48.064 226239 DEBUG oslo_concurrency.lockutils [req-37ce43f1-5d07-457f-8fde-beb5e114192d req-cae07786-51cc-4dbe-96ff-5069e3c8e5ff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:48 np0005603623 nova_compute[226235]: 2026-01-31 08:54:48.064 226239 DEBUG oslo_concurrency.lockutils [req-37ce43f1-5d07-457f-8fde-beb5e114192d req-cae07786-51cc-4dbe-96ff-5069e3c8e5ff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:48 np0005603623 nova_compute[226235]: 2026-01-31 08:54:48.064 226239 DEBUG oslo_concurrency.lockutils [req-37ce43f1-5d07-457f-8fde-beb5e114192d req-cae07786-51cc-4dbe-96ff-5069e3c8e5ff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:48 np0005603623 nova_compute[226235]: 2026-01-31 08:54:48.064 226239 DEBUG nova.compute.manager [req-37ce43f1-5d07-457f-8fde-beb5e114192d req-cae07786-51cc-4dbe-96ff-5069e3c8e5ff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] No waiting events found dispatching network-vif-plugged-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:54:48 np0005603623 nova_compute[226235]: 2026-01-31 08:54:48.065 226239 WARNING nova.compute.manager [req-37ce43f1-5d07-457f-8fde-beb5e114192d req-cae07786-51cc-4dbe-96ff-5069e3c8e5ff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Received unexpected event network-vif-plugged-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:54:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:54:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:48.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:54:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:48.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:48 np0005603623 nova_compute[226235]: 2026-01-31 08:54:48.883 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:50.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:50.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:51 np0005603623 nova_compute[226235]: 2026-01-31 08:54:51.010 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:51 np0005603623 nova_compute[226235]: 2026-01-31 08:54:51.620 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:52.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:52.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:53 np0005603623 nova_compute[226235]: 2026-01-31 08:54:53.885 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:54:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:54.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:54:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:54.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 e374: 3 total, 3 up, 3 in
Jan 31 03:54:56 np0005603623 nova_compute[226235]: 2026-01-31 08:54:56.013 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:56.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:56.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:58.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:54:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:58.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:58 np0005603623 nova_compute[226235]: 2026-01-31 08:54:58.887 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:54:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:54:59 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:54:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:54:59Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5e:89:08 10.100.0.29
Jan 31 03:54:59 np0005603623 ovn_controller[133449]: 2026-01-31T08:54:59Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5e:89:08 10.100.0.29
Jan 31 03:55:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:55:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:00.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:55:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:00.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:01 np0005603623 nova_compute[226235]: 2026-01-31 08:55:01.014 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:02.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:02.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:03 np0005603623 nova_compute[226235]: 2026-01-31 08:55:03.890 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:55:04 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:55:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:04.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:55:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:04.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:55:05 np0005603623 nova_compute[226235]: 2026-01-31 08:55:05.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:05 np0005603623 nova_compute[226235]: 2026-01-31 08:55:05.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:55:06 np0005603623 nova_compute[226235]: 2026-01-31 08:55:06.016 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:06Z|00736|binding|INFO|Releasing lport e3eb9ef9-cdd2-4606-8d60-802acc3c8dd3 from this chassis (sb_readonly=0)
Jan 31 03:55:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:06Z|00737|binding|INFO|Releasing lport 1da501db-771f-4fea-baad-4da77c8c5424 from this chassis (sb_readonly=0)
Jan 31 03:55:06 np0005603623 nova_compute[226235]: 2026-01-31 08:55:06.047 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:06.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:06.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:07 np0005603623 podman[311957]: 2026-01-31 08:55:07.95520349 +0000 UTC m=+0.047072048 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:55:08 np0005603623 podman[311958]: 2026-01-31 08:55:08.010279107 +0000 UTC m=+0.102012400 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:55:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:08.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:08.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:08 np0005603623 nova_compute[226235]: 2026-01-31 08:55:08.891 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.199 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.199 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.200 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.200 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.201 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:55:09 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2674126547' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.598 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.398s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.741 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000b1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.742 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000b1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.747 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000ae as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.747 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000ae as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.891 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.893 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3843MB free_disk=20.851715087890625GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.893 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:09 np0005603623 nova_compute[226235]: 2026-01-31 08:55:09.893 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.049 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance f2cd77e9-4c34-4b40-b91b-2f9896d859ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.050 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance f29e9f11-fa0f-4fe5-a566-82d8becccb7f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.050 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.050 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.157 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:55:10 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1715758590' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.577 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.579 226239 DEBUG oslo_concurrency.lockutils [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.579 226239 DEBUG oslo_concurrency.lockutils [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.580 226239 DEBUG oslo_concurrency.lockutils [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.580 226239 DEBUG oslo_concurrency.lockutils [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.580 226239 DEBUG oslo_concurrency.lockutils [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.582 226239 INFO nova.compute.manager [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Terminating instance#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.583 226239 DEBUG nova.compute.manager [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.587 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:55:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:10.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.632 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:55:10 np0005603623 kernel: tap877a2409-bb (unregistering): left promiscuous mode
Jan 31 03:55:10 np0005603623 NetworkManager[48970]: <info>  [1769849710.6382] device (tap877a2409-bb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.643 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:10Z|00738|binding|INFO|Releasing lport 877a2409-bb4a-41f9-9fd4-20ce910f0fd5 from this chassis (sb_readonly=0)
Jan 31 03:55:10 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:10Z|00739|binding|INFO|Setting lport 877a2409-bb4a-41f9-9fd4-20ce910f0fd5 down in Southbound
Jan 31 03:55:10 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:10Z|00740|binding|INFO|Removing iface tap877a2409-bb ovn-installed in OVS
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.646 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.654 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:89:08 10.100.0.29'], port_security=['fa:16:3e:5e:89:08 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': 'f29e9f11-fa0f-4fe5-a566-82d8becccb7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0d9c16a7-acba-4417-880b-0294fb98bb1c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b658669c-361a-4865-baa4-bcdfdd5eec17, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=877a2409-bb4a-41f9-9fd4-20ce910f0fd5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.655 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.657 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 877a2409-bb4a-41f9-9fd4-20ce910f0fd5 in datapath f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f unbound from our chassis#033[00m
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.662 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.664 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b011ea0e-d53b-4cc9-8ed0-48bb52f49823]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.665 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f namespace which is not needed anymore#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.675 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.676 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:10.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:10 np0005603623 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000b1.scope: Deactivated successfully.
Jan 31 03:55:10 np0005603623 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000b1.scope: Consumed 13.909s CPU time.
Jan 31 03:55:10 np0005603623 systemd-machined[194379]: Machine qemu-84-instance-000000b1 terminated.
Jan 31 03:55:10 np0005603623 neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f[311657]: [NOTICE]   (311662) : haproxy version is 2.8.14-c23fe91
Jan 31 03:55:10 np0005603623 neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f[311657]: [NOTICE]   (311662) : path to executable is /usr/sbin/haproxy
Jan 31 03:55:10 np0005603623 neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f[311657]: [WARNING]  (311662) : Exiting Master process...
Jan 31 03:55:10 np0005603623 neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f[311657]: [ALERT]    (311662) : Current worker (311664) exited with code 143 (Terminated)
Jan 31 03:55:10 np0005603623 neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f[311657]: [WARNING]  (311662) : All workers exited. Exiting... (0)
Jan 31 03:55:10 np0005603623 systemd[1]: libpod-bb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74.scope: Deactivated successfully.
Jan 31 03:55:10 np0005603623 podman[312070]: 2026-01-31 08:55:10.768809634 +0000 UTC m=+0.040401428 container died bb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:55:10 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74-userdata-shm.mount: Deactivated successfully.
Jan 31 03:55:10 np0005603623 systemd[1]: var-lib-containers-storage-overlay-8b6a11ad609c1488b39886aea30ae06ddd83bb7c393f23ea58cbbb9aaf7fdd01-merged.mount: Deactivated successfully.
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.801 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603623 podman[312070]: 2026-01-31 08:55:10.804283627 +0000 UTC m=+0.075875401 container cleanup bb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.806 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603623 systemd[1]: libpod-conmon-bb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74.scope: Deactivated successfully.
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.813 226239 INFO nova.virt.libvirt.driver [-] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Instance destroyed successfully.#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.813 226239 DEBUG nova.objects.instance [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'resources' on Instance uuid f29e9f11-fa0f-4fe5-a566-82d8becccb7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.845 226239 DEBUG nova.virt.libvirt.vif [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:54:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-612660049',display_name='tempest-TestNetworkBasicOps-server-612660049',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-612660049',id=177,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCNzwGCpb0oBUzRcNo+awbxNxuGMJBl5Zfp2GKEtKokcpPNzsdsm/xnexhK/rRBrUpvaOYbWNNFVeXug28+rVoRUqgocA3Ra+UPxvdrhuNzOIZaQWx1FNDHOPDX0Z6GC6w==',key_name='tempest-TestNetworkBasicOps-1061129892',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:54:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-af0wzd8d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:54:47Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=f29e9f11-fa0f-4fe5-a566-82d8becccb7f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "address": "fa:16:3e:5e:89:08", "network": {"id": "f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f", "bridge": "br-int", "label": "tempest-network-smoke--178981888", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877a2409-bb", "ovs_interfaceid": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.845 226239 DEBUG nova.network.os_vif_util [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "address": "fa:16:3e:5e:89:08", "network": {"id": "f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f", "bridge": "br-int", "label": "tempest-network-smoke--178981888", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877a2409-bb", "ovs_interfaceid": "877a2409-bb4a-41f9-9fd4-20ce910f0fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.846 226239 DEBUG nova.network.os_vif_util [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:89:08,bridge_name='br-int',has_traffic_filtering=True,id=877a2409-bb4a-41f9-9fd4-20ce910f0fd5,network=Network(f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877a2409-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.847 226239 DEBUG os_vif [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:89:08,bridge_name='br-int',has_traffic_filtering=True,id=877a2409-bb4a-41f9-9fd4-20ce910f0fd5,network=Network(f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877a2409-bb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.848 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.849 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap877a2409-bb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.850 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.853 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.855 226239 INFO os_vif [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:89:08,bridge_name='br-int',has_traffic_filtering=True,id=877a2409-bb4a-41f9-9fd4-20ce910f0fd5,network=Network(f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877a2409-bb')#033[00m
Jan 31 03:55:10 np0005603623 podman[312107]: 2026-01-31 08:55:10.874742147 +0000 UTC m=+0.049627187 container remove bb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.878 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3b92ccd0-0ae0-4198-8c54-f43c11ee86cc]: (4, ('Sat Jan 31 08:55:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f (bb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74)\nbb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74\nSat Jan 31 08:55:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f (bb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74)\nbb1af8bc3d4ce0cc195f15e71a447bf90f2825bc984f31ff94f7c26d27e06c74\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.880 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[036f19ed-1796-4ad1-96ab-4957c4026842]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.880 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8e02b4b-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:10 np0005603623 kernel: tapf8e02b4b-20: left promiscuous mode
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.882 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603623 nova_compute[226235]: 2026-01-31 08:55:10.889 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.892 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4544e981-9c56-4a3b-bbf4-75d663c81a7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.905 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[27995f9a-43f6-4dcd-9bf8-29dc03569c8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.907 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[84e97bbc-512f-4974-b085-1e156f1e6c2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.918 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f63dfb57-6602-4ec5-9a2b-7538908b1a98]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 882600, 'reachable_time': 25721, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312142, 'error': None, 'target': 'ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.921 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f8e02b4b-2e10-441b-b8b0-aafa8f5b6a9f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:55:10 np0005603623 systemd[1]: run-netns-ovnmeta\x2df8e02b4b\x2d2e10\x2d441b\x2db8b0\x2daafa8f5b6a9f.mount: Deactivated successfully.
Jan 31 03:55:10 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:10.922 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[7eb9bbf9-309a-4c3a-86f6-7d09c387ba7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.017 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.259 226239 INFO nova.virt.libvirt.driver [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Deleting instance files /var/lib/nova/instances/f29e9f11-fa0f-4fe5-a566-82d8becccb7f_del#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.259 226239 INFO nova.virt.libvirt.driver [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Deletion of /var/lib/nova/instances/f29e9f11-fa0f-4fe5-a566-82d8becccb7f_del complete#033[00m
Jan 31 03:55:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.370 226239 INFO nova.compute.manager [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Took 0.79 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.370 226239 DEBUG oslo.service.loopingcall [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.370 226239 DEBUG nova.compute.manager [-] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.371 226239 DEBUG nova.network.neutron [-] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.676 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.677 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.677 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.761 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.900 226239 DEBUG nova.compute.manager [req-aec3666f-87c9-4727-bd52-d73d9f399147 req-306ace51-0074-4354-a371-f7c89bc0bb59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Received event network-vif-unplugged-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.901 226239 DEBUG oslo_concurrency.lockutils [req-aec3666f-87c9-4727-bd52-d73d9f399147 req-306ace51-0074-4354-a371-f7c89bc0bb59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.901 226239 DEBUG oslo_concurrency.lockutils [req-aec3666f-87c9-4727-bd52-d73d9f399147 req-306ace51-0074-4354-a371-f7c89bc0bb59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.901 226239 DEBUG oslo_concurrency.lockutils [req-aec3666f-87c9-4727-bd52-d73d9f399147 req-306ace51-0074-4354-a371-f7c89bc0bb59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.901 226239 DEBUG nova.compute.manager [req-aec3666f-87c9-4727-bd52-d73d9f399147 req-306ace51-0074-4354-a371-f7c89bc0bb59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] No waiting events found dispatching network-vif-unplugged-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:11 np0005603623 nova_compute[226235]: 2026-01-31 08:55:11.902 226239 DEBUG nova.compute.manager [req-aec3666f-87c9-4727-bd52-d73d9f399147 req-306ace51-0074-4354-a371-f7c89bc0bb59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Received event network-vif-unplugged-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:55:12 np0005603623 nova_compute[226235]: 2026-01-31 08:55:12.235 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:55:12 np0005603623 nova_compute[226235]: 2026-01-31 08:55:12.235 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:55:12 np0005603623 nova_compute[226235]: 2026-01-31 08:55:12.235 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:55:12 np0005603623 nova_compute[226235]: 2026-01-31 08:55:12.235 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f2cd77e9-4c34-4b40-b91b-2f9896d859ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:55:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:12.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:55:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:12.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:13 np0005603623 nova_compute[226235]: 2026-01-31 08:55:13.335 226239 DEBUG nova.network.neutron [-] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:13 np0005603623 nova_compute[226235]: 2026-01-31 08:55:13.381 226239 INFO nova.compute.manager [-] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Took 2.01 seconds to deallocate network for instance.#033[00m
Jan 31 03:55:13 np0005603623 nova_compute[226235]: 2026-01-31 08:55:13.456 226239 DEBUG oslo_concurrency.lockutils [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:13 np0005603623 nova_compute[226235]: 2026-01-31 08:55:13.456 226239 DEBUG oslo_concurrency.lockutils [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:13 np0005603623 nova_compute[226235]: 2026-01-31 08:55:13.667 226239 DEBUG oslo_concurrency.processutils [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:13 np0005603623 nova_compute[226235]: 2026-01-31 08:55:13.704 226239 DEBUG nova.compute.manager [req-5bcf1a94-5cea-48c7-ac75-21c25a998950 req-73285a8d-b93f-49b8-94c4-8a4f7ca14ab7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Received event network-vif-deleted-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:55:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1282902010' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:55:14 np0005603623 nova_compute[226235]: 2026-01-31 08:55:14.082 226239 DEBUG oslo_concurrency.processutils [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:14 np0005603623 nova_compute[226235]: 2026-01-31 08:55:14.087 226239 DEBUG nova.compute.provider_tree [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:55:14 np0005603623 nova_compute[226235]: 2026-01-31 08:55:14.131 226239 DEBUG nova.scheduler.client.report [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:55:14 np0005603623 nova_compute[226235]: 2026-01-31 08:55:14.174 226239 DEBUG oslo_concurrency.lockutils [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:14 np0005603623 nova_compute[226235]: 2026-01-31 08:55:14.304 226239 INFO nova.scheduler.client.report [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Deleted allocations for instance f29e9f11-fa0f-4fe5-a566-82d8becccb7f#033[00m
Jan 31 03:55:14 np0005603623 nova_compute[226235]: 2026-01-31 08:55:14.308 226239 DEBUG nova.compute.manager [req-de9a8f9e-22a8-4055-9a40-04d2f059fd8e req-031f12d0-0f3d-492e-9f48-dd6e0925e61d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Received event network-vif-plugged-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:14 np0005603623 nova_compute[226235]: 2026-01-31 08:55:14.308 226239 DEBUG oslo_concurrency.lockutils [req-de9a8f9e-22a8-4055-9a40-04d2f059fd8e req-031f12d0-0f3d-492e-9f48-dd6e0925e61d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:14 np0005603623 nova_compute[226235]: 2026-01-31 08:55:14.309 226239 DEBUG oslo_concurrency.lockutils [req-de9a8f9e-22a8-4055-9a40-04d2f059fd8e req-031f12d0-0f3d-492e-9f48-dd6e0925e61d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:14 np0005603623 nova_compute[226235]: 2026-01-31 08:55:14.309 226239 DEBUG oslo_concurrency.lockutils [req-de9a8f9e-22a8-4055-9a40-04d2f059fd8e req-031f12d0-0f3d-492e-9f48-dd6e0925e61d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:14 np0005603623 nova_compute[226235]: 2026-01-31 08:55:14.309 226239 DEBUG nova.compute.manager [req-de9a8f9e-22a8-4055-9a40-04d2f059fd8e req-031f12d0-0f3d-492e-9f48-dd6e0925e61d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] No waiting events found dispatching network-vif-plugged-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:14 np0005603623 nova_compute[226235]: 2026-01-31 08:55:14.309 226239 WARNING nova.compute.manager [req-de9a8f9e-22a8-4055-9a40-04d2f059fd8e req-031f12d0-0f3d-492e-9f48-dd6e0925e61d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Received unexpected event network-vif-plugged-877a2409-bb4a-41f9-9fd4-20ce910f0fd5 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:55:14 np0005603623 nova_compute[226235]: 2026-01-31 08:55:14.612 226239 DEBUG oslo_concurrency.lockutils [None req-2218a0f4-97d8-4bba-9d64-51890f650e0b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f29e9f11-fa0f-4fe5-a566-82d8becccb7f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:55:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:14.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:55:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:14.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:55:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4194297485' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:55:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:55:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4194297485' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:55:15 np0005603623 nova_compute[226235]: 2026-01-31 08:55:15.850 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:15 np0005603623 nova_compute[226235]: 2026-01-31 08:55:15.853 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Updating instance_info_cache with network_info: [{"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:15 np0005603623 nova_compute[226235]: 2026-01-31 08:55:15.891 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:55:15 np0005603623 nova_compute[226235]: 2026-01-31 08:55:15.891 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:55:15 np0005603623 nova_compute[226235]: 2026-01-31 08:55:15.892 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:16 np0005603623 nova_compute[226235]: 2026-01-31 08:55:16.019 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:16.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:16.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:17 np0005603623 nova_compute[226235]: 2026-01-31 08:55:17.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:17 np0005603623 nova_compute[226235]: 2026-01-31 08:55:17.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:18.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:18.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:18 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:18Z|00741|binding|INFO|Releasing lport 1da501db-771f-4fea-baad-4da77c8c5424 from this chassis (sb_readonly=0)
Jan 31 03:55:19 np0005603623 nova_compute[226235]: 2026-01-31 08:55:19.002 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:19 np0005603623 nova_compute[226235]: 2026-01-31 08:55:19.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:20 np0005603623 nova_compute[226235]: 2026-01-31 08:55:20.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:20.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:20.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:20 np0005603623 nova_compute[226235]: 2026-01-31 08:55:20.852 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.020 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.110 226239 DEBUG nova.compute.manager [req-20962da6-c6bb-4152-9d32-6c0748ea430f req-280f1300-7170-4fa6-880d-eeb6ca000770 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Received event network-changed-be6fd5ac-96d0-495c-827a-80641ec1d590 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.111 226239 DEBUG nova.compute.manager [req-20962da6-c6bb-4152-9d32-6c0748ea430f req-280f1300-7170-4fa6-880d-eeb6ca000770 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Refreshing instance network info cache due to event network-changed-be6fd5ac-96d0-495c-827a-80641ec1d590. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.111 226239 DEBUG oslo_concurrency.lockutils [req-20962da6-c6bb-4152-9d32-6c0748ea430f req-280f1300-7170-4fa6-880d-eeb6ca000770 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.111 226239 DEBUG oslo_concurrency.lockutils [req-20962da6-c6bb-4152-9d32-6c0748ea430f req-280f1300-7170-4fa6-880d-eeb6ca000770 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.111 226239 DEBUG nova.network.neutron [req-20962da6-c6bb-4152-9d32-6c0748ea430f req-280f1300-7170-4fa6-880d-eeb6ca000770 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Refreshing network info cache for port be6fd5ac-96d0-495c-827a-80641ec1d590 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.198 226239 DEBUG oslo_concurrency.lockutils [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.198 226239 DEBUG oslo_concurrency.lockutils [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.199 226239 DEBUG oslo_concurrency.lockutils [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.199 226239 DEBUG oslo_concurrency.lockutils [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.199 226239 DEBUG oslo_concurrency.lockutils [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.200 226239 INFO nova.compute.manager [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Terminating instance#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.201 226239 DEBUG nova.compute.manager [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:55:21 np0005603623 kernel: tapbe6fd5ac-96 (unregistering): left promiscuous mode
Jan 31 03:55:21 np0005603623 NetworkManager[48970]: <info>  [1769849721.2568] device (tapbe6fd5ac-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.261 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:21Z|00742|binding|INFO|Releasing lport be6fd5ac-96d0-495c-827a-80641ec1d590 from this chassis (sb_readonly=0)
Jan 31 03:55:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:21Z|00743|binding|INFO|Setting lport be6fd5ac-96d0-495c-827a-80641ec1d590 down in Southbound
Jan 31 03:55:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:21Z|00744|binding|INFO|Removing iface tapbe6fd5ac-96 ovn-installed in OVS
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.263 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.268 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:32:4a 10.100.0.13'], port_security=['fa:16:3e:47:32:4a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f2cd77e9-4c34-4b40-b91b-2f9896d859ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88788faa-de32-416c-9495-11b3f71610ce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8d113f51-a0f6-4ac5-a4a7-3be652fd2b6a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b8add34-1673-4dc8-94e4-1618dd3551f8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=be6fd5ac-96d0-495c-827a-80641ec1d590) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.270 143258 INFO neutron.agent.ovn.metadata.agent [-] Port be6fd5ac-96d0-495c-827a-80641ec1d590 in datapath 88788faa-de32-416c-9495-11b3f71610ce unbound from our chassis#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.271 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.271 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88788faa-de32-416c-9495-11b3f71610ce, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.272 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[922d33fe-f076-492f-ada6-a2cc69c2b0ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.273 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88788faa-de32-416c-9495-11b3f71610ce namespace which is not needed anymore#033[00m
Jan 31 03:55:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:21 np0005603623 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000ae.scope: Deactivated successfully.
Jan 31 03:55:21 np0005603623 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000ae.scope: Consumed 16.483s CPU time.
Jan 31 03:55:21 np0005603623 systemd-machined[194379]: Machine qemu-83-instance-000000ae terminated.
Jan 31 03:55:21 np0005603623 neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce[308979]: [NOTICE]   (308984) : haproxy version is 2.8.14-c23fe91
Jan 31 03:55:21 np0005603623 neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce[308979]: [NOTICE]   (308984) : path to executable is /usr/sbin/haproxy
Jan 31 03:55:21 np0005603623 neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce[308979]: [WARNING]  (308984) : Exiting Master process...
Jan 31 03:55:21 np0005603623 neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce[308979]: [ALERT]    (308984) : Current worker (308986) exited with code 143 (Terminated)
Jan 31 03:55:21 np0005603623 neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce[308979]: [WARNING]  (308984) : All workers exited. Exiting... (0)
Jan 31 03:55:21 np0005603623 systemd[1]: libpod-b9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6.scope: Deactivated successfully.
Jan 31 03:55:21 np0005603623 podman[312246]: 2026-01-31 08:55:21.375199164 +0000 UTC m=+0.037575600 container died b9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:55:21 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6-userdata-shm.mount: Deactivated successfully.
Jan 31 03:55:21 np0005603623 systemd[1]: var-lib-containers-storage-overlay-367c699a40790e0bc9bad1735e59119da88595d9b9dd73fb31915223c941bc90-merged.mount: Deactivated successfully.
Jan 31 03:55:21 np0005603623 podman[312246]: 2026-01-31 08:55:21.405146223 +0000 UTC m=+0.067522659 container cleanup b9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:55:21 np0005603623 systemd[1]: libpod-conmon-b9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6.scope: Deactivated successfully.
Jan 31 03:55:21 np0005603623 kernel: tapbe6fd5ac-96: entered promiscuous mode
Jan 31 03:55:21 np0005603623 NetworkManager[48970]: <info>  [1769849721.4175] manager: (tapbe6fd5ac-96): new Tun device (/org/freedesktop/NetworkManager/Devices/347)
Jan 31 03:55:21 np0005603623 kernel: tapbe6fd5ac-96 (unregistering): left promiscuous mode
Jan 31 03:55:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:21Z|00745|binding|INFO|Claiming lport be6fd5ac-96d0-495c-827a-80641ec1d590 for this chassis.
Jan 31 03:55:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:21Z|00746|binding|INFO|be6fd5ac-96d0-495c-827a-80641ec1d590: Claiming fa:16:3e:47:32:4a 10.100.0.13
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.420 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:21Z|00747|binding|INFO|Releasing lport be6fd5ac-96d0-495c-827a-80641ec1d590 from this chassis (sb_readonly=0)
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.431 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.433 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:32:4a 10.100.0.13'], port_security=['fa:16:3e:47:32:4a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f2cd77e9-4c34-4b40-b91b-2f9896d859ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88788faa-de32-416c-9495-11b3f71610ce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8d113f51-a0f6-4ac5-a4a7-3be652fd2b6a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b8add34-1673-4dc8-94e4-1618dd3551f8, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=be6fd5ac-96d0-495c-827a-80641ec1d590) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.439 226239 INFO nova.virt.libvirt.driver [-] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Instance destroyed successfully.#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.439 226239 DEBUG nova.objects.instance [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'resources' on Instance uuid f2cd77e9-4c34-4b40-b91b-2f9896d859ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:21 np0005603623 podman[312280]: 2026-01-31 08:55:21.460375415 +0000 UTC m=+0.035980809 container remove b9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.464 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[97d49969-e984-4f6d-89fe-d0d172fc008b]: (4, ('Sat Jan 31 08:55:21 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce (b9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6)\nb9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6\nSat Jan 31 08:55:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88788faa-de32-416c-9495-11b3f71610ce (b9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6)\nb9e2179b1489f73f7724ca585f636c2d20e6bb66141cce5562d24b632e552ef6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.466 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8e65310b-dd2f-4cf7-92d2-c544c00b0261]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.467 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88788faa-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:21 np0005603623 kernel: tap88788faa-d0: left promiscuous mode
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.469 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.472 226239 DEBUG nova.virt.libvirt.vif [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:53:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1311640661',display_name='tempest-TestNetworkBasicOps-server-1311640661',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1311640661',id=174,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDRKUWAS5a80ZjWPsd7hHgRkEp7Su45jVZtMGsGyUgsPHaNtUJrtTz9Xp8VPEvhhKx46Nib3p9WEVi71SNK8k+aau+j0uJLzRlNzrStVLXZtfDZ/222CrsWljOTfuHQP7w==',key_name='tempest-TestNetworkBasicOps-807155673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:53:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-d8ph01yi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:53:26Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=f2cd77e9-4c34-4b40-b91b-2f9896d859ce,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.473 226239 DEBUG nova.network.os_vif_util [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.473 226239 DEBUG nova.network.os_vif_util [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:47:32:4a,bridge_name='br-int',has_traffic_filtering=True,id=be6fd5ac-96d0-495c-827a-80641ec1d590,network=Network(88788faa-de32-416c-9495-11b3f71610ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6fd5ac-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.473 226239 DEBUG os_vif [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:32:4a,bridge_name='br-int',has_traffic_filtering=True,id=be6fd5ac-96d0-495c-827a-80641ec1d590,network=Network(88788faa-de32-416c-9495-11b3f71610ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6fd5ac-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.474 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:32:4a 10.100.0.13'], port_security=['fa:16:3e:47:32:4a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f2cd77e9-4c34-4b40-b91b-2f9896d859ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88788faa-de32-416c-9495-11b3f71610ce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8d113f51-a0f6-4ac5-a4a7-3be652fd2b6a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b8add34-1673-4dc8-94e4-1618dd3551f8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=be6fd5ac-96d0-495c-827a-80641ec1d590) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.475 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.475 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe6fd5ac-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.476 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.476 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.478 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.479 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6f8608a4-e80e-409b-a698-ae580e38af7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603623 nova_compute[226235]: 2026-01-31 08:55:21.481 226239 INFO os_vif [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:32:4a,bridge_name='br-int',has_traffic_filtering=True,id=be6fd5ac-96d0-495c-827a-80641ec1d590,network=Network(88788faa-de32-416c-9495-11b3f71610ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6fd5ac-96')#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.493 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b91f58-a779-4b22-9bd7-9a3def2f5651]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.495 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4db7599e-3232-4a7f-8f73-61b3f8fc4dc1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.506 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5636526a-8150-4e1f-8f65-200f43e495ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 874566, 'reachable_time': 42593, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312313, 'error': None, 'target': 'ovnmeta-88788faa-de32-416c-9495-11b3f71610ce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603623 systemd[1]: run-netns-ovnmeta\x2d88788faa\x2dde32\x2d416c\x2d9495\x2d11b3f71610ce.mount: Deactivated successfully.
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.509 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88788faa-de32-416c-9495-11b3f71610ce deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.509 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3ad320-95b0-46c5-be93-d7ed75bb473e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.510 143258 INFO neutron.agent.ovn.metadata.agent [-] Port be6fd5ac-96d0-495c-827a-80641ec1d590 in datapath 88788faa-de32-416c-9495-11b3f71610ce unbound from our chassis#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.512 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88788faa-de32-416c-9495-11b3f71610ce, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.512 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5f19d2ef-1613-4540-a8cb-1774ec4ee8d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.513 143258 INFO neutron.agent.ovn.metadata.agent [-] Port be6fd5ac-96d0-495c-827a-80641ec1d590 in datapath 88788faa-de32-416c-9495-11b3f71610ce unbound from our chassis#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.514 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88788faa-de32-416c-9495-11b3f71610ce, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:55:21 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:21.514 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2737ddb2-d5e0-44cb-9275-764a7981e925]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:22 np0005603623 nova_compute[226235]: 2026-01-31 08:55:22.037 226239 INFO nova.virt.libvirt.driver [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Deleting instance files /var/lib/nova/instances/f2cd77e9-4c34-4b40-b91b-2f9896d859ce_del#033[00m
Jan 31 03:55:22 np0005603623 nova_compute[226235]: 2026-01-31 08:55:22.038 226239 INFO nova.virt.libvirt.driver [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Deletion of /var/lib/nova/instances/f2cd77e9-4c34-4b40-b91b-2f9896d859ce_del complete#033[00m
Jan 31 03:55:22 np0005603623 nova_compute[226235]: 2026-01-31 08:55:22.191 226239 INFO nova.compute.manager [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Took 0.99 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:55:22 np0005603623 nova_compute[226235]: 2026-01-31 08:55:22.192 226239 DEBUG oslo.service.loopingcall [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:55:22 np0005603623 nova_compute[226235]: 2026-01-31 08:55:22.193 226239 DEBUG nova.compute.manager [-] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:55:22 np0005603623 nova_compute[226235]: 2026-01-31 08:55:22.193 226239 DEBUG nova.network.neutron [-] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:55:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:22.416 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:22 np0005603623 nova_compute[226235]: 2026-01-31 08:55:22.416 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:22.417 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:55:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:22.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:22.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:23 np0005603623 nova_compute[226235]: 2026-01-31 08:55:23.491 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.016 226239 DEBUG nova.network.neutron [req-20962da6-c6bb-4152-9d32-6c0748ea430f req-280f1300-7170-4fa6-880d-eeb6ca000770 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Updated VIF entry in instance network info cache for port be6fd5ac-96d0-495c-827a-80641ec1d590. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.016 226239 DEBUG nova.network.neutron [req-20962da6-c6bb-4152-9d32-6c0748ea430f req-280f1300-7170-4fa6-880d-eeb6ca000770 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Updating instance_info_cache with network_info: [{"id": "be6fd5ac-96d0-495c-827a-80641ec1d590", "address": "fa:16:3e:47:32:4a", "network": {"id": "88788faa-de32-416c-9495-11b3f71610ce", "bridge": "br-int", "label": "tempest-network-smoke--472936268", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6fd5ac-96", "ovs_interfaceid": "be6fd5ac-96d0-495c-827a-80641ec1d590", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.068 226239 DEBUG oslo_concurrency.lockutils [req-20962da6-c6bb-4152-9d32-6c0748ea430f req-280f1300-7170-4fa6-880d-eeb6ca000770 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f2cd77e9-4c34-4b40-b91b-2f9896d859ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.142 226239 DEBUG nova.compute.manager [req-9e8da41d-5e29-4a9e-bf40-aeb9c4cbe573 req-133bc2d7-dda9-4600-91fa-13103d769905 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Received event network-vif-unplugged-be6fd5ac-96d0-495c-827a-80641ec1d590 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.142 226239 DEBUG oslo_concurrency.lockutils [req-9e8da41d-5e29-4a9e-bf40-aeb9c4cbe573 req-133bc2d7-dda9-4600-91fa-13103d769905 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.143 226239 DEBUG oslo_concurrency.lockutils [req-9e8da41d-5e29-4a9e-bf40-aeb9c4cbe573 req-133bc2d7-dda9-4600-91fa-13103d769905 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.143 226239 DEBUG oslo_concurrency.lockutils [req-9e8da41d-5e29-4a9e-bf40-aeb9c4cbe573 req-133bc2d7-dda9-4600-91fa-13103d769905 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.143 226239 DEBUG nova.compute.manager [req-9e8da41d-5e29-4a9e-bf40-aeb9c4cbe573 req-133bc2d7-dda9-4600-91fa-13103d769905 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] No waiting events found dispatching network-vif-unplugged-be6fd5ac-96d0-495c-827a-80641ec1d590 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.143 226239 DEBUG nova.compute.manager [req-9e8da41d-5e29-4a9e-bf40-aeb9c4cbe573 req-133bc2d7-dda9-4600-91fa-13103d769905 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Received event network-vif-unplugged-be6fd5ac-96d0-495c-827a-80641ec1d590 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.352 226239 DEBUG nova.network.neutron [-] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.397 226239 INFO nova.compute.manager [-] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Took 2.20 seconds to deallocate network for instance.#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.533 226239 DEBUG oslo_concurrency.lockutils [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.534 226239 DEBUG oslo_concurrency.lockutils [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.573 226239 DEBUG nova.compute.manager [req-3024460f-b63f-448d-88b3-25f0643f2060 req-90455fed-87b8-4310-9881-39fde377f71b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Received event network-vif-deleted-be6fd5ac-96d0-495c-827a-80641ec1d590 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:24.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:24 np0005603623 nova_compute[226235]: 2026-01-31 08:55:24.661 226239 DEBUG oslo_concurrency.processutils [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:24.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:55:25 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1273298356' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:55:25 np0005603623 nova_compute[226235]: 2026-01-31 08:55:25.086 226239 DEBUG oslo_concurrency.processutils [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:25 np0005603623 nova_compute[226235]: 2026-01-31 08:55:25.094 226239 DEBUG nova.compute.provider_tree [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:55:25 np0005603623 nova_compute[226235]: 2026-01-31 08:55:25.241 226239 DEBUG nova.scheduler.client.report [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:55:25 np0005603623 nova_compute[226235]: 2026-01-31 08:55:25.302 226239 DEBUG oslo_concurrency.lockutils [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:25 np0005603623 nova_compute[226235]: 2026-01-31 08:55:25.406 226239 INFO nova.scheduler.client.report [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Deleted allocations for instance f2cd77e9-4c34-4b40-b91b-2f9896d859ce#033[00m
Jan 31 03:55:25 np0005603623 nova_compute[226235]: 2026-01-31 08:55:25.653 226239 DEBUG oslo_concurrency.lockutils [None req-1b56f353-e958-4409-9ce9-378b075b2e2a d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:25 np0005603623 nova_compute[226235]: 2026-01-31 08:55:25.812 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849710.8107445, f29e9f11-fa0f-4fe5-a566-82d8becccb7f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:25 np0005603623 nova_compute[226235]: 2026-01-31 08:55:25.812 226239 INFO nova.compute.manager [-] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:55:25 np0005603623 nova_compute[226235]: 2026-01-31 08:55:25.858 226239 DEBUG nova.compute.manager [None req-6718956c-0ae0-4939-a2a7-4e4b047694cf - - - - - -] [instance: f29e9f11-fa0f-4fe5-a566-82d8becccb7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:26 np0005603623 nova_compute[226235]: 2026-01-31 08:55:26.023 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:26 np0005603623 nova_compute[226235]: 2026-01-31 08:55:26.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:26 np0005603623 nova_compute[226235]: 2026-01-31 08:55:26.374 226239 DEBUG nova.compute.manager [req-e01875ff-efe8-48d3-9ec5-a8410b570519 req-73045009-fffb-4a92-9120-d94804173464 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Received event network-vif-plugged-be6fd5ac-96d0-495c-827a-80641ec1d590 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:26 np0005603623 nova_compute[226235]: 2026-01-31 08:55:26.374 226239 DEBUG oslo_concurrency.lockutils [req-e01875ff-efe8-48d3-9ec5-a8410b570519 req-73045009-fffb-4a92-9120-d94804173464 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:26 np0005603623 nova_compute[226235]: 2026-01-31 08:55:26.375 226239 DEBUG oslo_concurrency.lockutils [req-e01875ff-efe8-48d3-9ec5-a8410b570519 req-73045009-fffb-4a92-9120-d94804173464 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:26 np0005603623 nova_compute[226235]: 2026-01-31 08:55:26.375 226239 DEBUG oslo_concurrency.lockutils [req-e01875ff-efe8-48d3-9ec5-a8410b570519 req-73045009-fffb-4a92-9120-d94804173464 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f2cd77e9-4c34-4b40-b91b-2f9896d859ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:26 np0005603623 nova_compute[226235]: 2026-01-31 08:55:26.376 226239 DEBUG nova.compute.manager [req-e01875ff-efe8-48d3-9ec5-a8410b570519 req-73045009-fffb-4a92-9120-d94804173464 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] No waiting events found dispatching network-vif-plugged-be6fd5ac-96d0-495c-827a-80641ec1d590 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:26 np0005603623 nova_compute[226235]: 2026-01-31 08:55:26.376 226239 WARNING nova.compute.manager [req-e01875ff-efe8-48d3-9ec5-a8410b570519 req-73045009-fffb-4a92-9120-d94804173464 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Received unexpected event network-vif-plugged-be6fd5ac-96d0-495c-827a-80641ec1d590 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:55:26 np0005603623 nova_compute[226235]: 2026-01-31 08:55:26.476 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:26.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:26.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:28.419 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:28.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:28.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:30.146 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:30.147 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:30.147 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:30.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:55:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:30.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:55:31 np0005603623 nova_compute[226235]: 2026-01-31 08:55:31.025 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:31 np0005603623 nova_compute[226235]: 2026-01-31 08:55:31.478 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:32 np0005603623 nova_compute[226235]: 2026-01-31 08:55:32.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:32 np0005603623 nova_compute[226235]: 2026-01-31 08:55:32.157 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:32 np0005603623 nova_compute[226235]: 2026-01-31 08:55:32.157 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:32 np0005603623 nova_compute[226235]: 2026-01-31 08:55:32.210 226239 DEBUG nova.compute.manager [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:55:32 np0005603623 nova_compute[226235]: 2026-01-31 08:55:32.364 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:32 np0005603623 nova_compute[226235]: 2026-01-31 08:55:32.365 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:32 np0005603623 nova_compute[226235]: 2026-01-31 08:55:32.371 226239 DEBUG nova.virt.hardware [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:55:32 np0005603623 nova_compute[226235]: 2026-01-31 08:55:32.371 226239 INFO nova.compute.claims [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:55:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:32.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:32 np0005603623 nova_compute[226235]: 2026-01-31 08:55:32.669 226239 DEBUG oslo_concurrency.processutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:32.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:32 np0005603623 nova_compute[226235]: 2026-01-31 08:55:32.727 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:55:33 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/545711537' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.071 226239 DEBUG oslo_concurrency.processutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.077 226239 DEBUG nova.compute.provider_tree [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.101 226239 DEBUG nova.scheduler.client.report [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.174 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.175 226239 DEBUG nova.compute.manager [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.273 226239 DEBUG nova.compute.manager [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.273 226239 DEBUG nova.network.neutron [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.300 226239 INFO nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.334 226239 DEBUG nova.compute.manager [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.397 226239 INFO nova.virt.block_device [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Booting with volume 49519e92-9c10-4ba7-87c3-e1349e72980c at /dev/vda#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.765 226239 DEBUG os_brick.utils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.767 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.777 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.777 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[4bdf770a-bcf0-47cf-aa61-fd4992f3325f]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.779 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.787 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.787 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[ccdec8d4-d978-484b-a80f-8e31fa15d634]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.789 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.794 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.005s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.794 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[9acda1e8-3459-4f0d-8f48-baa74f60b100]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.796 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[88c272e4-4e94-4b52-a643-f8d75e604e83]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.797 226239 DEBUG oslo_concurrency.processutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.820 226239 DEBUG oslo_concurrency.processutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "nvme version" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.823 226239 DEBUG os_brick.initiator.connectors.lightos [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.823 226239 DEBUG os_brick.initiator.connectors.lightos [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.823 226239 DEBUG os_brick.initiator.connectors.lightos [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.823 226239 DEBUG os_brick.utils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] <== get_connector_properties: return (57ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.824 226239 DEBUG nova.virt.block_device [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating existing volume attachment record: 7b2664a9-418c-44ac-a39c-1552e9d00485 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:55:33 np0005603623 nova_compute[226235]: 2026-01-31 08:55:33.885 226239 DEBUG nova.policy [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cfaebb011a374541b083e772a6c83f25', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '06b5fc9cfd4c49abb2d8b9f2f8a82c1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:55:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:34.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:55:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:34.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:55:35 np0005603623 nova_compute[226235]: 2026-01-31 08:55:35.537 226239 DEBUG nova.compute.manager [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:55:35 np0005603623 nova_compute[226235]: 2026-01-31 08:55:35.539 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:55:35 np0005603623 nova_compute[226235]: 2026-01-31 08:55:35.539 226239 INFO nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Creating image(s)#033[00m
Jan 31 03:55:35 np0005603623 nova_compute[226235]: 2026-01-31 08:55:35.540 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:55:35 np0005603623 nova_compute[226235]: 2026-01-31 08:55:35.540 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Ensure instance console log exists: /var/lib/nova/instances/e9903ecf-c775-4e84-8997-361061869fc6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:55:35 np0005603623 nova_compute[226235]: 2026-01-31 08:55:35.540 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:35 np0005603623 nova_compute[226235]: 2026-01-31 08:55:35.541 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:35 np0005603623 nova_compute[226235]: 2026-01-31 08:55:35.541 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:35 np0005603623 nova_compute[226235]: 2026-01-31 08:55:35.699 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:35 np0005603623 nova_compute[226235]: 2026-01-31 08:55:35.775 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:35 np0005603623 nova_compute[226235]: 2026-01-31 08:55:35.842 226239 DEBUG nova.network.neutron [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Successfully created port: f859815f-0923-45c1-a84d-2a128fb7fd57 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:55:36 np0005603623 nova_compute[226235]: 2026-01-31 08:55:36.026 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:36 np0005603623 nova_compute[226235]: 2026-01-31 08:55:36.438 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849721.4371462, f2cd77e9-4c34-4b40-b91b-2f9896d859ce => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:36 np0005603623 nova_compute[226235]: 2026-01-31 08:55:36.439 226239 INFO nova.compute.manager [-] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:55:36 np0005603623 nova_compute[226235]: 2026-01-31 08:55:36.479 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:36 np0005603623 nova_compute[226235]: 2026-01-31 08:55:36.550 226239 DEBUG nova.compute.manager [None req-7c2ba705-ed3b-4c2e-8c37-fdbb72567350 - - - - - -] [instance: f2cd77e9-4c34-4b40-b91b-2f9896d859ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:55:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:36.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:55:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:36.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:37 np0005603623 nova_compute[226235]: 2026-01-31 08:55:37.784 226239 DEBUG nova.network.neutron [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Successfully updated port: f859815f-0923-45c1-a84d-2a128fb7fd57 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:55:37 np0005603623 nova_compute[226235]: 2026-01-31 08:55:37.846 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:55:37 np0005603623 nova_compute[226235]: 2026-01-31 08:55:37.847 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquired lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:55:37 np0005603623 nova_compute[226235]: 2026-01-31 08:55:37.847 226239 DEBUG nova.network.neutron [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:55:38 np0005603623 nova_compute[226235]: 2026-01-31 08:55:38.108 226239 DEBUG nova.compute.manager [req-24054c92-7dfd-47ba-aa3a-4bb0625b9f73 req-9f0e75b7-2295-4d39-a439-b65e9d274469 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:38 np0005603623 nova_compute[226235]: 2026-01-31 08:55:38.109 226239 DEBUG nova.compute.manager [req-24054c92-7dfd-47ba-aa3a-4bb0625b9f73 req-9f0e75b7-2295-4d39-a439-b65e9d274469 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing instance network info cache due to event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:55:38 np0005603623 nova_compute[226235]: 2026-01-31 08:55:38.109 226239 DEBUG oslo_concurrency.lockutils [req-24054c92-7dfd-47ba-aa3a-4bb0625b9f73 req-9f0e75b7-2295-4d39-a439-b65e9d274469 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:55:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:38.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:38.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:38 np0005603623 podman[312431]: 2026-01-31 08:55:38.970414668 +0000 UTC m=+0.067103825 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:55:38 np0005603623 podman[312430]: 2026-01-31 08:55:38.980300888 +0000 UTC m=+0.077056798 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:55:39 np0005603623 nova_compute[226235]: 2026-01-31 08:55:39.586 226239 DEBUG nova.network.neutron [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:55:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:55:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:40.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:55:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:40.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.029 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.328 226239 DEBUG nova.network.neutron [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating instance_info_cache with network_info: [{"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.376 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Releasing lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.377 226239 DEBUG nova.compute.manager [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Instance network_info: |[{"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.377 226239 DEBUG oslo_concurrency.lockutils [req-24054c92-7dfd-47ba-aa3a-4bb0625b9f73 req-9f0e75b7-2295-4d39-a439-b65e9d274469 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.377 226239 DEBUG nova.network.neutron [req-24054c92-7dfd-47ba-aa3a-4bb0625b9f73 req-9f0e75b7-2295-4d39-a439-b65e9d274469 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.380 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Start _get_guest_xml network_info=[{"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': '7b2664a9-418c-44ac-a39c-1552e9d00485', 'delete_on_termination': False, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-49519e92-9c10-4ba7-87c3-e1349e72980c', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '49519e92-9c10-4ba7-87c3-e1349e72980c', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'e9903ecf-c775-4e84-8997-361061869fc6', 'attached_at': '', 'detached_at': '', 'volume_id': '49519e92-9c10-4ba7-87c3-e1349e72980c', 'serial': '49519e92-9c10-4ba7-87c3-e1349e72980c'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.385 226239 WARNING nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.391 226239 DEBUG nova.virt.libvirt.host [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.391 226239 DEBUG nova.virt.libvirt.host [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.395 226239 DEBUG nova.virt.libvirt.host [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.396 226239 DEBUG nova.virt.libvirt.host [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.397 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.397 226239 DEBUG nova.virt.hardware [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.398 226239 DEBUG nova.virt.hardware [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.398 226239 DEBUG nova.virt.hardware [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.398 226239 DEBUG nova.virt.hardware [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.398 226239 DEBUG nova.virt.hardware [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.399 226239 DEBUG nova.virt.hardware [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.399 226239 DEBUG nova.virt.hardware [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.399 226239 DEBUG nova.virt.hardware [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.399 226239 DEBUG nova.virt.hardware [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.400 226239 DEBUG nova.virt.hardware [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.400 226239 DEBUG nova.virt.hardware [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.431 226239 DEBUG nova.storage.rbd_utils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] rbd image e9903ecf-c775-4e84-8997-361061869fc6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.435 226239 DEBUG oslo_concurrency.processutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.481 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:55:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3508586329' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.897 226239 DEBUG oslo_concurrency.processutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.941 226239 DEBUG nova.virt.libvirt.vif [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:55:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1802059624',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1802059624',id=179,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEXzK6zUN8P2oqgqYwcegkodZ7bCeyyyhmYXIteBKXOhNEu+drS3qyKalg8BzkpjD3Rc/+FviAhlBApTbimNmOyPmM7IztIR2VGri6qDWFeRA0jXOdg2vS/Kgt0ALKH9cg==',key_name='tempest-TestInstancesWithCinderVolumes-176277168',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='06b5fc9cfd4c49abb2d8b9f2f8a82c1f',ramdisk_id='',reservation_id='r-1s00kmp1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestInstancesWithCinderVolumes-2132464628',owner_user_name='tempest-TestInstancesWithCinderVolumes-2132464628-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:55:33Z,user_data=None,user_id='cfaebb011a374541b083e772a6c83f25',uuid=e9903ecf-c775-4e84-8997-361061869fc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.941 226239 DEBUG nova.network.os_vif_util [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Converting VIF {"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.942 226239 DEBUG nova.network.os_vif_util [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:28:18,bridge_name='br-int',has_traffic_filtering=True,id=f859815f-0923-45c1-a84d-2a128fb7fd57,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf859815f-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.943 226239 DEBUG nova.objects.instance [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'pci_devices' on Instance uuid e9903ecf-c775-4e84-8997-361061869fc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.974 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  <uuid>e9903ecf-c775-4e84-8997-361061869fc6</uuid>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  <name>instance-000000b3</name>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestInstancesWithCinderVolumes-server-1802059624</nova:name>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:55:41</nova:creationTime>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <nova:user uuid="cfaebb011a374541b083e772a6c83f25">tempest-TestInstancesWithCinderVolumes-2132464628-project-member</nova:user>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <nova:project uuid="06b5fc9cfd4c49abb2d8b9f2f8a82c1f">tempest-TestInstancesWithCinderVolumes-2132464628</nova:project>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <nova:port uuid="f859815f-0923-45c1-a84d-2a128fb7fd57">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <entry name="serial">e9903ecf-c775-4e84-8997-361061869fc6</entry>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <entry name="uuid">e9903ecf-c775-4e84-8997-361061869fc6</entry>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/e9903ecf-c775-4e84-8997-361061869fc6_disk.config">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-49519e92-9c10-4ba7-87c3-e1349e72980c">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <serial>49519e92-9c10-4ba7-87c3-e1349e72980c</serial>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:ee:28:18"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <target dev="tapf859815f-09"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/e9903ecf-c775-4e84-8997-361061869fc6/console.log" append="off"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:55:41 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:55:41 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:55:41 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:55:41 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.975 226239 DEBUG nova.compute.manager [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Preparing to wait for external event network-vif-plugged-f859815f-0923-45c1-a84d-2a128fb7fd57 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.977 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.977 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.977 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.978 226239 DEBUG nova.virt.libvirt.vif [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:55:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1802059624',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1802059624',id=179,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEXzK6zUN8P2oqgqYwcegkodZ7bCeyyyhmYXIteBKXOhNEu+drS3qyKalg8BzkpjD3Rc/+FviAhlBApTbimNmOyPmM7IztIR2VGri6qDWFeRA0jXOdg2vS/Kgt0ALKH9cg==',key_name='tempest-TestInstancesWithCinderVolumes-176277168',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='06b5fc9cfd4c49abb2d8b9f2f8a82c1f',ramdisk_id='',reservation_id='r-1s00kmp1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True
',owner_project_name='tempest-TestInstancesWithCinderVolumes-2132464628',owner_user_name='tempest-TestInstancesWithCinderVolumes-2132464628-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:55:33Z,user_data=None,user_id='cfaebb011a374541b083e772a6c83f25',uuid=e9903ecf-c775-4e84-8997-361061869fc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.978 226239 DEBUG nova.network.os_vif_util [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Converting VIF {"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.979 226239 DEBUG nova.network.os_vif_util [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:28:18,bridge_name='br-int',has_traffic_filtering=True,id=f859815f-0923-45c1-a84d-2a128fb7fd57,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf859815f-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.979 226239 DEBUG os_vif [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:28:18,bridge_name='br-int',has_traffic_filtering=True,id=f859815f-0923-45c1-a84d-2a128fb7fd57,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf859815f-09') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.980 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.980 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.980 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.983 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.983 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf859815f-09, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.984 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf859815f-09, col_values=(('external_ids', {'iface-id': 'f859815f-0923-45c1-a84d-2a128fb7fd57', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:28:18', 'vm-uuid': 'e9903ecf-c775-4e84-8997-361061869fc6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.985 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:41 np0005603623 NetworkManager[48970]: <info>  [1769849741.9862] manager: (tapf859815f-09): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/348)
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.987 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.990 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:41 np0005603623 nova_compute[226235]: 2026-01-31 08:55:41.991 226239 INFO os_vif [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:28:18,bridge_name='br-int',has_traffic_filtering=True,id=f859815f-0923-45c1-a84d-2a128fb7fd57,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf859815f-09')#033[00m
Jan 31 03:55:42 np0005603623 nova_compute[226235]: 2026-01-31 08:55:42.084 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:55:42 np0005603623 nova_compute[226235]: 2026-01-31 08:55:42.085 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:55:42 np0005603623 nova_compute[226235]: 2026-01-31 08:55:42.085 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No VIF found with MAC fa:16:3e:ee:28:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:55:42 np0005603623 nova_compute[226235]: 2026-01-31 08:55:42.085 226239 INFO nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Using config drive#033[00m
Jan 31 03:55:42 np0005603623 nova_compute[226235]: 2026-01-31 08:55:42.111 226239 DEBUG nova.storage.rbd_utils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] rbd image e9903ecf-c775-4e84-8997-361061869fc6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:55:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:42.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:42.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:42 np0005603623 nova_compute[226235]: 2026-01-31 08:55:42.784 226239 INFO nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Creating config drive at /var/lib/nova/instances/e9903ecf-c775-4e84-8997-361061869fc6/disk.config#033[00m
Jan 31 03:55:42 np0005603623 nova_compute[226235]: 2026-01-31 08:55:42.787 226239 DEBUG oslo_concurrency.processutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e9903ecf-c775-4e84-8997-361061869fc6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp7rvdxjcv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:42 np0005603623 nova_compute[226235]: 2026-01-31 08:55:42.913 226239 DEBUG oslo_concurrency.processutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e9903ecf-c775-4e84-8997-361061869fc6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp7rvdxjcv" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:42 np0005603623 nova_compute[226235]: 2026-01-31 08:55:42.938 226239 DEBUG nova.storage.rbd_utils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] rbd image e9903ecf-c775-4e84-8997-361061869fc6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:55:42 np0005603623 nova_compute[226235]: 2026-01-31 08:55:42.941 226239 DEBUG oslo_concurrency.processutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e9903ecf-c775-4e84-8997-361061869fc6/disk.config e9903ecf-c775-4e84-8997-361061869fc6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.076 226239 DEBUG oslo_concurrency.processutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e9903ecf-c775-4e84-8997-361061869fc6/disk.config e9903ecf-c775-4e84-8997-361061869fc6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.076 226239 INFO nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Deleting local config drive /var/lib/nova/instances/e9903ecf-c775-4e84-8997-361061869fc6/disk.config because it was imported into RBD.#033[00m
Jan 31 03:55:43 np0005603623 kernel: tapf859815f-09: entered promiscuous mode
Jan 31 03:55:43 np0005603623 NetworkManager[48970]: <info>  [1769849743.1282] manager: (tapf859815f-09): new Tun device (/org/freedesktop/NetworkManager/Devices/349)
Jan 31 03:55:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:43Z|00748|binding|INFO|Claiming lport f859815f-0923-45c1-a84d-2a128fb7fd57 for this chassis.
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.151 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:43Z|00749|binding|INFO|f859815f-0923-45c1-a84d-2a128fb7fd57: Claiming fa:16:3e:ee:28:18 10.100.0.3
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.155 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:43 np0005603623 systemd-udevd[312587]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.167 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:28:18 10.100.0.3'], port_security=['fa:16:3e:ee:28:18 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e9903ecf-c775-4e84-8997-361061869fc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '06b5fc9cfd4c49abb2d8b9f2f8a82c1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9d7b4c6b-30ca-4a01-b275-d4aa9d87b845', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe6e8b31-5a27-4e0f-b157-3b33899fa37b, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=f859815f-0923-45c1-a84d-2a128fb7fd57) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.168 143258 INFO neutron.agent.ovn.metadata.agent [-] Port f859815f-0923-45c1-a84d-2a128fb7fd57 in datapath 405bd95c-1bad-49fb-83bf-a97a0c66786e bound to our chassis#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.169 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405bd95c-1bad-49fb-83bf-a97a0c66786e#033[00m
Jan 31 03:55:43 np0005603623 systemd-machined[194379]: New machine qemu-85-instance-000000b3.
Jan 31 03:55:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:43Z|00750|binding|INFO|Setting lport f859815f-0923-45c1-a84d-2a128fb7fd57 ovn-installed in OVS
Jan 31 03:55:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:43Z|00751|binding|INFO|Setting lport f859815f-0923-45c1-a84d-2a128fb7fd57 up in Southbound
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.174 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.177 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6cdce31b-9191-48d3-b400-4183f5dec84e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.177 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap405bd95c-11 in ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.179 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap405bd95c-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.179 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[55de32c1-f99d-4640-b39b-8adb53cc90e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.180 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[44c06ecd-81ea-43dd-925d-14ec204fccdd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 NetworkManager[48970]: <info>  [1769849743.1820] device (tapf859815f-09): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:55:43 np0005603623 NetworkManager[48970]: <info>  [1769849743.1832] device (tapf859815f-09): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:55:43 np0005603623 systemd[1]: Started Virtual Machine qemu-85-instance-000000b3.
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.190 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[4300fcde-e010-4080-af89-3980baaeabdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.200 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4b7cf652-818b-461d-af1e-4d9c01589239]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.225 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[65b51e66-15e3-4fcb-96c0-2721bf6da41c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 systemd-udevd[312591]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.230 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[de5fd011-ac1e-4592-82b4-d7ca2b38d7c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 NetworkManager[48970]: <info>  [1769849743.2325] manager: (tap405bd95c-10): new Veth device (/org/freedesktop/NetworkManager/Devices/350)
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.252 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6248e3db-c1cc-40d8-9a96-0c8673ef2e9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.255 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[694eeccf-235c-4a18-bbd3-cbadb850c174]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 NetworkManager[48970]: <info>  [1769849743.2722] device (tap405bd95c-10): carrier: link connected
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.274 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1f54bf7f-d5a5-4d39-b2d9-436f9313f8be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.286 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[54dfc276-c723-435e-a083-f7b01b3c1258]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405bd95c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:d8:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 888370, 'reachable_time': 36878, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312620, 'error': None, 'target': 'ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.297 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[02233584-7466-4ebe-b4d5-d372511a8da4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:d880'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 888370, 'tstamp': 888370}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312621, 'error': None, 'target': 'ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.308 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff620ec-38ca-4d6d-a098-d373e37bb449]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405bd95c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:d8:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 888370, 'reachable_time': 36878, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 312622, 'error': None, 'target': 'ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.335 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[963fe396-57d3-425a-8acd-1f28437d3ae9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.373 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d500ce-2e1d-4a45-90b6-4449dba742f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.375 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405bd95c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.375 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.375 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405bd95c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:43 np0005603623 kernel: tap405bd95c-10: entered promiscuous mode
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.377 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:43 np0005603623 NetworkManager[48970]: <info>  [1769849743.3782] manager: (tap405bd95c-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/351)
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.380 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.381 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405bd95c-10, col_values=(('external_ids', {'iface-id': '5a0136e3-84ab-4495-80ff-8006a0a74934'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.383 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:43 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:43Z|00752|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.393 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.395 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/405bd95c-1bad-49fb-83bf-a97a0c66786e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/405bd95c-1bad-49fb-83bf-a97a0c66786e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.395 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[84abd04c-11d6-4922-9328-a072bb90d2de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.396 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-405bd95c-1bad-49fb-83bf-a97a0c66786e
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/405bd95c-1bad-49fb-83bf-a97a0c66786e.pid.haproxy
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 405bd95c-1bad-49fb-83bf-a97a0c66786e
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:55:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:55:43.397 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'env', 'PROCESS_TAG=haproxy-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/405bd95c-1bad-49fb-83bf-a97a0c66786e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.555 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849743.554755, e9903ecf-c775-4e84-8997-361061869fc6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.555 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] VM Started (Lifecycle Event)#033[00m
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.595 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.599 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849743.554905, e9903ecf-c775-4e84-8997-361061869fc6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.599 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.632 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.635 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.659 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:55:43 np0005603623 podman[312696]: 2026-01-31 08:55:43.741691647 +0000 UTC m=+0.054164080 container create af3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 03:55:43 np0005603623 systemd[1]: Started libpod-conmon-af3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc.scope.
Jan 31 03:55:43 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:55:43 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/328bab48dbfd0b70cf017cf0c6b714945e66cb16082db3ac0bc893afa6b5bd30/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:55:43 np0005603623 podman[312696]: 2026-01-31 08:55:43.706481413 +0000 UTC m=+0.018953886 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:55:43 np0005603623 podman[312696]: 2026-01-31 08:55:43.822484011 +0000 UTC m=+0.134956464 container init af3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:55:43 np0005603623 podman[312696]: 2026-01-31 08:55:43.826914851 +0000 UTC m=+0.139387284 container start af3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 03:55:43 np0005603623 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[312711]: [NOTICE]   (312715) : New worker (312717) forked
Jan 31 03:55:43 np0005603623 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[312711]: [NOTICE]   (312715) : Loading success.
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.980 226239 DEBUG nova.network.neutron [req-24054c92-7dfd-47ba-aa3a-4bb0625b9f73 req-9f0e75b7-2295-4d39-a439-b65e9d274469 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updated VIF entry in instance network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:55:43 np0005603623 nova_compute[226235]: 2026-01-31 08:55:43.980 226239 DEBUG nova.network.neutron [req-24054c92-7dfd-47ba-aa3a-4bb0625b9f73 req-9f0e75b7-2295-4d39-a439-b65e9d274469 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating instance_info_cache with network_info: [{"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.008 226239 DEBUG oslo_concurrency.lockutils [req-24054c92-7dfd-47ba-aa3a-4bb0625b9f73 req-9f0e75b7-2295-4d39-a439-b65e9d274469 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.258 226239 DEBUG nova.compute.manager [req-0c1fbf0e-854c-4fb8-bacc-422d378648f2 req-49b17648-2128-4d26-afb9-1cda269319ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-vif-plugged-f859815f-0923-45c1-a84d-2a128fb7fd57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.259 226239 DEBUG oslo_concurrency.lockutils [req-0c1fbf0e-854c-4fb8-bacc-422d378648f2 req-49b17648-2128-4d26-afb9-1cda269319ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.259 226239 DEBUG oslo_concurrency.lockutils [req-0c1fbf0e-854c-4fb8-bacc-422d378648f2 req-49b17648-2128-4d26-afb9-1cda269319ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.259 226239 DEBUG oslo_concurrency.lockutils [req-0c1fbf0e-854c-4fb8-bacc-422d378648f2 req-49b17648-2128-4d26-afb9-1cda269319ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.260 226239 DEBUG nova.compute.manager [req-0c1fbf0e-854c-4fb8-bacc-422d378648f2 req-49b17648-2128-4d26-afb9-1cda269319ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Processing event network-vif-plugged-f859815f-0923-45c1-a84d-2a128fb7fd57 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.260 226239 DEBUG nova.compute.manager [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.265 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849744.2650633, e9903ecf-c775-4e84-8997-361061869fc6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.265 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.267 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.269 226239 INFO nova.virt.libvirt.driver [-] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Instance spawned successfully.#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.270 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.289 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.292 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.302 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.302 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.303 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.303 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.304 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.304 226239 DEBUG nova.virt.libvirt.driver [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.359 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.418 226239 INFO nova.compute.manager [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Took 8.88 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.418 226239 DEBUG nova.compute.manager [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.561 226239 INFO nova.compute.manager [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Took 12.27 seconds to build instance.#033[00m
Jan 31 03:55:44 np0005603623 nova_compute[226235]: 2026-01-31 08:55:44.623 226239 DEBUG oslo_concurrency.lockutils [None req-519c3c18-1d15-45df-b695-aad5787ffd22 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.466s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:44.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:55:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:44.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:55:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:55:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4088982563' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:55:46 np0005603623 nova_compute[226235]: 2026-01-31 08:55:46.030 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:46 np0005603623 nova_compute[226235]: 2026-01-31 08:55:46.403 226239 DEBUG nova.compute.manager [req-43d09744-b752-4df6-9a13-d881b79565d7 req-0b5f91b4-bf87-496b-817f-0c2095c7e88b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-vif-plugged-f859815f-0923-45c1-a84d-2a128fb7fd57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:46 np0005603623 nova_compute[226235]: 2026-01-31 08:55:46.404 226239 DEBUG oslo_concurrency.lockutils [req-43d09744-b752-4df6-9a13-d881b79565d7 req-0b5f91b4-bf87-496b-817f-0c2095c7e88b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:46 np0005603623 nova_compute[226235]: 2026-01-31 08:55:46.404 226239 DEBUG oslo_concurrency.lockutils [req-43d09744-b752-4df6-9a13-d881b79565d7 req-0b5f91b4-bf87-496b-817f-0c2095c7e88b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:46 np0005603623 nova_compute[226235]: 2026-01-31 08:55:46.405 226239 DEBUG oslo_concurrency.lockutils [req-43d09744-b752-4df6-9a13-d881b79565d7 req-0b5f91b4-bf87-496b-817f-0c2095c7e88b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:46 np0005603623 nova_compute[226235]: 2026-01-31 08:55:46.405 226239 DEBUG nova.compute.manager [req-43d09744-b752-4df6-9a13-d881b79565d7 req-0b5f91b4-bf87-496b-817f-0c2095c7e88b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] No waiting events found dispatching network-vif-plugged-f859815f-0923-45c1-a84d-2a128fb7fd57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:46 np0005603623 nova_compute[226235]: 2026-01-31 08:55:46.405 226239 WARNING nova.compute.manager [req-43d09744-b752-4df6-9a13-d881b79565d7 req-0b5f91b4-bf87-496b-817f-0c2095c7e88b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received unexpected event network-vif-plugged-f859815f-0923-45c1-a84d-2a128fb7fd57 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:55:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:46.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:55:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:46.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:55:46 np0005603623 nova_compute[226235]: 2026-01-31 08:55:46.986 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:48.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:48.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:50.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:50.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:51 np0005603623 nova_compute[226235]: 2026-01-31 08:55:51.072 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:51 np0005603623 nova_compute[226235]: 2026-01-31 08:55:51.988 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:55:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:52.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:55:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:52.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:55:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:54.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:55:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:54.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:54 np0005603623 NetworkManager[48970]: <info>  [1769849754.8398] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/352)
Jan 31 03:55:54 np0005603623 nova_compute[226235]: 2026-01-31 08:55:54.839 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:54 np0005603623 NetworkManager[48970]: <info>  [1769849754.8407] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Jan 31 03:55:54 np0005603623 nova_compute[226235]: 2026-01-31 08:55:54.889 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:54 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:54Z|00753|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:55:54 np0005603623 nova_compute[226235]: 2026-01-31 08:55:54.911 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:56 np0005603623 nova_compute[226235]: 2026-01-31 08:55:56.074 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:56 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:56Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ee:28:18 10.100.0.3
Jan 31 03:55:56 np0005603623 ovn_controller[133449]: 2026-01-31T08:55:56Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ee:28:18 10.100.0.3
Jan 31 03:55:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:55:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:56.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:55:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:56.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:56 np0005603623 nova_compute[226235]: 2026-01-31 08:55:56.991 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:58.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:55:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:55:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:58.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:56:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:56:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:00.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:56:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:00.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:01 np0005603623 nova_compute[226235]: 2026-01-31 08:56:01.076 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #157. Immutable memtables: 0.
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.544784) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 99] Flushing memtable with next log file: 157
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849761544860, "job": 99, "event": "flush_started", "num_memtables": 1, "num_entries": 1040, "num_deletes": 251, "total_data_size": 1973355, "memory_usage": 2003696, "flush_reason": "Manual Compaction"}
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 99] Level-0 flush table #158: started
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849761563800, "cf_name": "default", "job": 99, "event": "table_file_creation", "file_number": 158, "file_size": 1301000, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 76087, "largest_seqno": 77122, "table_properties": {"data_size": 1296511, "index_size": 2076, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10572, "raw_average_key_size": 19, "raw_value_size": 1287175, "raw_average_value_size": 2419, "num_data_blocks": 92, "num_entries": 532, "num_filter_entries": 532, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849688, "oldest_key_time": 1769849688, "file_creation_time": 1769849761, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 158, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 99] Flush lasted 19059 microseconds, and 2481 cpu microseconds.
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.563842) [db/flush_job.cc:967] [default] [JOB 99] Level-0 flush table #158: 1301000 bytes OK
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.563863) [db/memtable_list.cc:519] [default] Level-0 commit table #158 started
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.566381) [db/memtable_list.cc:722] [default] Level-0 commit table #158: memtable #1 done
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.566397) EVENT_LOG_v1 {"time_micros": 1769849761566392, "job": 99, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.566414) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 99] Try to delete WAL files size 1968224, prev total WAL file size 1968224, number of live WAL files 2.
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000154.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.566913) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036353236' seq:72057594037927935, type:22 .. '7061786F730036373738' seq:0, type:0; will stop at (end)
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 100] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 99 Base level 0, inputs: [158(1270KB)], [156(10MB)]
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849761566964, "job": 100, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [158], "files_L6": [156], "score": -1, "input_data_size": 12292724, "oldest_snapshot_seqno": -1}
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 100] Generated table #159: 9514 keys, 10423584 bytes, temperature: kUnknown
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849761703128, "cf_name": "default", "job": 100, "event": "table_file_creation", "file_number": 159, "file_size": 10423584, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10364907, "index_size": 33796, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23813, "raw_key_size": 252170, "raw_average_key_size": 26, "raw_value_size": 10201126, "raw_average_value_size": 1072, "num_data_blocks": 1274, "num_entries": 9514, "num_filter_entries": 9514, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769849761, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 159, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.703449) [db/compaction/compaction_job.cc:1663] [default] [JOB 100] Compacted 1@0 + 1@6 files to L6 => 10423584 bytes
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.708539) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 90.2 rd, 76.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 10.5 +0.0 blob) out(9.9 +0.0 blob), read-write-amplify(17.5) write-amplify(8.0) OK, records in: 10033, records dropped: 519 output_compression: NoCompression
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.708581) EVENT_LOG_v1 {"time_micros": 1769849761708561, "job": 100, "event": "compaction_finished", "compaction_time_micros": 136271, "compaction_time_cpu_micros": 19352, "output_level": 6, "num_output_files": 1, "total_output_size": 10423584, "num_input_records": 10033, "num_output_records": 9514, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000158.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849761708958, "job": 100, "event": "table_file_deletion", "file_number": 158}
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000156.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849761711032, "job": 100, "event": "table_file_deletion", "file_number": 156}
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.566828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.711209) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.711228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.711233) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.711238) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:56:01 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:56:01.711242) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:56:01 np0005603623 nova_compute[226235]: 2026-01-31 08:56:01.993 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:02.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:02.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:56:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:04.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:56:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:04.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:05 np0005603623 nova_compute[226235]: 2026-01-31 08:56:05.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:05 np0005603623 nova_compute[226235]: 2026-01-31 08:56:05.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:56:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:05 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:06 np0005603623 nova_compute[226235]: 2026-01-31 08:56:06.077 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:56:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:06.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:56:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:56:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:06.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:56:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 03:56:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 03:56:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 03:56:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:56:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:56:06 np0005603623 nova_compute[226235]: 2026-01-31 08:56:06.996 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:56:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:08.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:56:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:56:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:08.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:56:09 np0005603623 podman[313044]: 2026-01-31 08:56:09.970239669 +0000 UTC m=+0.066562400 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:56:09 np0005603623 podman[313043]: 2026-01-31 08:56:09.975777933 +0000 UTC m=+0.072631469 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:56:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:10.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:10.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:11 np0005603623 nova_compute[226235]: 2026-01-31 08:56:11.079 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:11 np0005603623 nova_compute[226235]: 2026-01-31 08:56:11.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:11 np0005603623 nova_compute[226235]: 2026-01-31 08:56:11.183 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:11 np0005603623 nova_compute[226235]: 2026-01-31 08:56:11.183 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:11 np0005603623 nova_compute[226235]: 2026-01-31 08:56:11.183 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:11 np0005603623 nova_compute[226235]: 2026-01-31 08:56:11.183 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:56:11 np0005603623 nova_compute[226235]: 2026-01-31 08:56:11.184 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:56:11 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3053976949' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:56:11 np0005603623 nova_compute[226235]: 2026-01-31 08:56:11.647 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:11 np0005603623 nova_compute[226235]: 2026-01-31 08:56:11.964 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:56:11 np0005603623 nova_compute[226235]: 2026-01-31 08:56:11.964 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:56:11 np0005603623 nova_compute[226235]: 2026-01-31 08:56:11.998 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:12 np0005603623 nova_compute[226235]: 2026-01-31 08:56:12.109 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:56:12 np0005603623 nova_compute[226235]: 2026-01-31 08:56:12.111 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4005MB free_disk=20.921260833740234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:56:12 np0005603623 nova_compute[226235]: 2026-01-31 08:56:12.111 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:12 np0005603623 nova_compute[226235]: 2026-01-31 08:56:12.111 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:12 np0005603623 nova_compute[226235]: 2026-01-31 08:56:12.290 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance e9903ecf-c775-4e84-8997-361061869fc6 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:56:12 np0005603623 nova_compute[226235]: 2026-01-31 08:56:12.291 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:56:12 np0005603623 nova_compute[226235]: 2026-01-31 08:56:12.291 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:56:12 np0005603623 nova_compute[226235]: 2026-01-31 08:56:12.365 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:12.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:12.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:56:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4082203985' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:56:12 np0005603623 nova_compute[226235]: 2026-01-31 08:56:12.801 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:12 np0005603623 nova_compute[226235]: 2026-01-31 08:56:12.806 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:56:12 np0005603623 nova_compute[226235]: 2026-01-31 08:56:12.861 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:56:13 np0005603623 nova_compute[226235]: 2026-01-31 08:56:13.050 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:56:13 np0005603623 nova_compute[226235]: 2026-01-31 08:56:13.051 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:14 np0005603623 nova_compute[226235]: 2026-01-31 08:56:14.051 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:14 np0005603623 nova_compute[226235]: 2026-01-31 08:56:14.052 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:56:14 np0005603623 nova_compute[226235]: 2026-01-31 08:56:14.052 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:56:14 np0005603623 nova_compute[226235]: 2026-01-31 08:56:14.442 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:56:14 np0005603623 nova_compute[226235]: 2026-01-31 08:56:14.442 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:56:14 np0005603623 nova_compute[226235]: 2026-01-31 08:56:14.442 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:56:14 np0005603623 nova_compute[226235]: 2026-01-31 08:56:14.443 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e9903ecf-c775-4e84-8997-361061869fc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:56:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:14.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:56:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:14.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:16 np0005603623 nova_compute[226235]: 2026-01-31 08:56:16.081 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:56:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:16.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:56:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:16.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:17 np0005603623 nova_compute[226235]: 2026-01-31 08:56:17.002 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:18 np0005603623 nova_compute[226235]: 2026-01-31 08:56:18.649 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating instance_info_cache with network_info: [{"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:56:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:18.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:18.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:18 np0005603623 nova_compute[226235]: 2026-01-31 08:56:18.833 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:56:18 np0005603623 nova_compute[226235]: 2026-01-31 08:56:18.833 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:56:18 np0005603623 nova_compute[226235]: 2026-01-31 08:56:18.833 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:19 np0005603623 nova_compute[226235]: 2026-01-31 08:56:19.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:19 np0005603623 nova_compute[226235]: 2026-01-31 08:56:19.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:19 np0005603623 nova_compute[226235]: 2026-01-31 08:56:19.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:20 np0005603623 nova_compute[226235]: 2026-01-31 08:56:20.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:56:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:20.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:56:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:20.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:21 np0005603623 nova_compute[226235]: 2026-01-31 08:56:21.083 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:22 np0005603623 nova_compute[226235]: 2026-01-31 08:56:22.005 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:22.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:22.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:23.855 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:56:23 np0005603623 nova_compute[226235]: 2026-01-31 08:56:23.855 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:23.856 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:56:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:24.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:24.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:24.858 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:26 np0005603623 nova_compute[226235]: 2026-01-31 08:56:26.084 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:26 np0005603623 nova_compute[226235]: 2026-01-31 08:56:26.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:56:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:26.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:56:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:26.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:27 np0005603623 nova_compute[226235]: 2026-01-31 08:56:27.006 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:28.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:28.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:30.147 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:30.148 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:30.148 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:30.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:30.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:31 np0005603623 nova_compute[226235]: 2026-01-31 08:56:31.086 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:32 np0005603623 nova_compute[226235]: 2026-01-31 08:56:32.007 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:32 np0005603623 nova_compute[226235]: 2026-01-31 08:56:32.379 226239 DEBUG oslo_concurrency.lockutils [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:32 np0005603623 nova_compute[226235]: 2026-01-31 08:56:32.380 226239 DEBUG oslo_concurrency.lockutils [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:32 np0005603623 nova_compute[226235]: 2026-01-31 08:56:32.414 226239 DEBUG nova.objects.instance [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'flavor' on Instance uuid e9903ecf-c775-4e84-8997-361061869fc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:32 np0005603623 nova_compute[226235]: 2026-01-31 08:56:32.590 226239 DEBUG oslo_concurrency.lockutils [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:32.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:32.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:33 np0005603623 nova_compute[226235]: 2026-01-31 08:56:33.898 226239 DEBUG oslo_concurrency.lockutils [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:33 np0005603623 nova_compute[226235]: 2026-01-31 08:56:33.898 226239 DEBUG oslo_concurrency.lockutils [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:33 np0005603623 nova_compute[226235]: 2026-01-31 08:56:33.899 226239 INFO nova.compute.manager [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Attaching volume 985c0898-7bd0-457c-b3bb-abe45d65168a to /dev/vdb#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.133 226239 DEBUG os_brick.utils [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.134 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.142 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.142 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[43e75f48-f743-4390-865f-26acb3e0167f]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.143 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.148 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.004s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.148 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[5b928594-8d54-4346-9377-c9af56ba0830]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.149 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.155 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.005s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.155 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5b3b40-a1e8-4518-81f7-a808b72c32d7]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.157 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[04acc05f-185f-4b2e-87a1-e09495cf274c]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.158 226239 DEBUG oslo_concurrency.processutils [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.186 226239 DEBUG oslo_concurrency.processutils [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "nvme version" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.188 226239 DEBUG os_brick.initiator.connectors.lightos [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.188 226239 DEBUG os_brick.initiator.connectors.lightos [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.188 226239 DEBUG os_brick.initiator.connectors.lightos [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.189 226239 DEBUG os_brick.utils [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] <== get_connector_properties: return (55ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:56:34 np0005603623 nova_compute[226235]: 2026-01-31 08:56:34.189 226239 DEBUG nova.virt.block_device [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating existing volume attachment record: 396f2e87-48dc-48be-8d15-2b3c3ca6fce3 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:56:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:34.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:34.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:35 np0005603623 nova_compute[226235]: 2026-01-31 08:56:35.617 226239 DEBUG nova.objects.instance [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'flavor' on Instance uuid e9903ecf-c775-4e84-8997-361061869fc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:35 np0005603623 nova_compute[226235]: 2026-01-31 08:56:35.684 226239 DEBUG nova.virt.libvirt.driver [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Attempting to attach volume 985c0898-7bd0-457c-b3bb-abe45d65168a with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:56:35 np0005603623 nova_compute[226235]: 2026-01-31 08:56:35.687 226239 DEBUG nova.virt.libvirt.guest [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:56:35 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:56:35 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-985c0898-7bd0-457c-b3bb-abe45d65168a">
Jan 31 03:56:35 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:56:35 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:56:35 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:56:35 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:56:35 np0005603623 nova_compute[226235]:  <auth username="openstack">
Jan 31 03:56:35 np0005603623 nova_compute[226235]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:56:35 np0005603623 nova_compute[226235]:  </auth>
Jan 31 03:56:35 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:56:35 np0005603623 nova_compute[226235]:  <serial>985c0898-7bd0-457c-b3bb-abe45d65168a</serial>
Jan 31 03:56:35 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:56:35 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:56:35 np0005603623 nova_compute[226235]: 2026-01-31 08:56:35.911 226239 DEBUG nova.virt.libvirt.driver [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:56:35 np0005603623 nova_compute[226235]: 2026-01-31 08:56:35.912 226239 DEBUG nova.virt.libvirt.driver [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:56:35 np0005603623 nova_compute[226235]: 2026-01-31 08:56:35.912 226239 DEBUG nova.virt.libvirt.driver [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:56:35 np0005603623 nova_compute[226235]: 2026-01-31 08:56:35.912 226239 DEBUG nova.virt.libvirt.driver [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No VIF found with MAC fa:16:3e:ee:28:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:56:36 np0005603623 nova_compute[226235]: 2026-01-31 08:56:36.087 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:36 np0005603623 nova_compute[226235]: 2026-01-31 08:56:36.347 226239 DEBUG oslo_concurrency.lockutils [None req-3e6d2c07-7d02-4261-8741-1e40b3aea220 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.449s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:36.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:36.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:37 np0005603623 nova_compute[226235]: 2026-01-31 08:56:37.010 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:37 np0005603623 nova_compute[226235]: 2026-01-31 08:56:37.522 226239 DEBUG oslo_concurrency.lockutils [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:37 np0005603623 nova_compute[226235]: 2026-01-31 08:56:37.522 226239 DEBUG oslo_concurrency.lockutils [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:37 np0005603623 nova_compute[226235]: 2026-01-31 08:56:37.607 226239 DEBUG nova.objects.instance [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'flavor' on Instance uuid e9903ecf-c775-4e84-8997-361061869fc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:37 np0005603623 nova_compute[226235]: 2026-01-31 08:56:37.951 226239 DEBUG oslo_concurrency.lockutils [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.429s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:38 np0005603623 nova_compute[226235]: 2026-01-31 08:56:38.684 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:38 np0005603623 nova_compute[226235]: 2026-01-31 08:56:38.684 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:38.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:38.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.395 226239 DEBUG nova.compute.manager [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.685 226239 DEBUG oslo_concurrency.lockutils [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.686 226239 DEBUG oslo_concurrency.lockutils [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.686 226239 INFO nova.compute.manager [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Attaching volume 75a603c9-1c6b-4103-ac69-0db6813e7404 to /dev/vdc#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.754 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.755 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.764 226239 DEBUG nova.virt.hardware [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.765 226239 INFO nova.compute.claims [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.924 226239 DEBUG os_brick.utils [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.925 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.932 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.932 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[8102a5f2-27c9-490d-b242-ec17061b6927]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.934 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.938 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.004s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.938 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[e2587834-437c-4a80-9d57-15ed1ba89bb4]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.940 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.946 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.946 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[6c395174-9593-48cf-a63a-ec0f989d5b59]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.947 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[109abba4-cefa-4a22-a256-3cf25c0a9d56]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.948 226239 DEBUG oslo_concurrency.processutils [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.964 226239 DEBUG oslo_concurrency.processutils [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "nvme version" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.966 226239 DEBUG os_brick.initiator.connectors.lightos [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.966 226239 DEBUG os_brick.initiator.connectors.lightos [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.967 226239 DEBUG os_brick.initiator.connectors.lightos [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.967 226239 DEBUG os_brick.utils [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] <== get_connector_properties: return (42ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:56:39 np0005603623 nova_compute[226235]: 2026-01-31 08:56:39.968 226239 DEBUG nova.virt.block_device [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating existing volume attachment record: 2f60e240-67fa-428c-9862-a5b1a48923bc _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:56:40 np0005603623 nova_compute[226235]: 2026-01-31 08:56:40.117 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:40 np0005603623 nova_compute[226235]: 2026-01-31 08:56:40.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:40 np0005603623 nova_compute[226235]: 2026-01-31 08:56:40.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:56:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:56:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2095850844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:56:40 np0005603623 nova_compute[226235]: 2026-01-31 08:56:40.518 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:40 np0005603623 nova_compute[226235]: 2026-01-31 08:56:40.523 226239 DEBUG nova.compute.provider_tree [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:56:40 np0005603623 nova_compute[226235]: 2026-01-31 08:56:40.612 226239 DEBUG nova.scheduler.client.report [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:56:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:40.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:40.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:40 np0005603623 nova_compute[226235]: 2026-01-31 08:56:40.799 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:40 np0005603623 nova_compute[226235]: 2026-01-31 08:56:40.801 226239 DEBUG nova.compute.manager [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:56:40 np0005603623 podman[313351]: 2026-01-31 08:56:40.962028384 +0000 UTC m=+0.051924190 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:56:40 np0005603623 podman[313352]: 2026-01-31 08:56:40.973090641 +0000 UTC m=+0.063699259 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.088 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.250 226239 DEBUG nova.compute.manager [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.250 226239 DEBUG nova.network.neutron [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.372 226239 DEBUG nova.objects.instance [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'flavor' on Instance uuid e9903ecf-c775-4e84-8997-361061869fc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.546 226239 INFO nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.575 226239 DEBUG nova.virt.libvirt.driver [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Attempting to attach volume 75a603c9-1c6b-4103-ac69-0db6813e7404 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.579 226239 DEBUG nova.virt.libvirt.guest [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:56:41 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:56:41 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-75a603c9-1c6b-4103-ac69-0db6813e7404">
Jan 31 03:56:41 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:56:41 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:56:41 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:56:41 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:56:41 np0005603623 nova_compute[226235]:  <auth username="openstack">
Jan 31 03:56:41 np0005603623 nova_compute[226235]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:56:41 np0005603623 nova_compute[226235]:  </auth>
Jan 31 03:56:41 np0005603623 nova_compute[226235]:  <target dev="vdc" bus="virtio"/>
Jan 31 03:56:41 np0005603623 nova_compute[226235]:  <serial>75a603c9-1c6b-4103-ac69-0db6813e7404</serial>
Jan 31 03:56:41 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:56:41 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.614 226239 DEBUG nova.policy [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53804fd0f3a14f95a4955e3bc6dcc8cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab2d642eb03c4bda84a9a23e86f1fa4d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.636 226239 DEBUG nova.compute.manager [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.905 226239 DEBUG nova.virt.libvirt.driver [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.905 226239 DEBUG nova.virt.libvirt.driver [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.906 226239 DEBUG nova.virt.libvirt.driver [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.906 226239 DEBUG nova.virt.libvirt.driver [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:56:41 np0005603623 nova_compute[226235]: 2026-01-31 08:56:41.906 226239 DEBUG nova.virt.libvirt.driver [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No VIF found with MAC fa:16:3e:ee:28:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.062 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.236 226239 DEBUG nova.compute.manager [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.237 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.237 226239 INFO nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Creating image(s)#033[00m
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.262 226239 DEBUG nova.storage.rbd_utils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image b23e0349-42a8-41d0-9eea-0407b7ffa806_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.288 226239 DEBUG nova.storage.rbd_utils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image b23e0349-42a8-41d0-9eea-0407b7ffa806_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.314 226239 DEBUG nova.storage.rbd_utils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image b23e0349-42a8-41d0-9eea-0407b7ffa806_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.318 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.374 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.376 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.376 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.376 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.395 226239 DEBUG nova.storage.rbd_utils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image b23e0349-42a8-41d0-9eea-0407b7ffa806_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.398 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 b23e0349-42a8-41d0-9eea-0407b7ffa806_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.665 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 b23e0349-42a8-41d0-9eea-0407b7ffa806_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.267s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:56:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:56:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:42.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.724 226239 DEBUG nova.storage.rbd_utils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] resizing rbd image b23e0349-42a8-41d0-9eea-0407b7ffa806_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:56:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:42.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.807 226239 DEBUG nova.objects.instance [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'migration_context' on Instance uuid b23e0349-42a8-41d0-9eea-0407b7ffa806 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.830 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.830 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Ensure instance console log exists: /var/lib/nova/instances/b23e0349-42a8-41d0-9eea-0407b7ffa806/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.831 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.831 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:56:42 np0005603623 nova_compute[226235]: 2026-01-31 08:56:42.831 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:56:43 np0005603623 nova_compute[226235]: 2026-01-31 08:56:43.502 226239 DEBUG oslo_concurrency.lockutils [None req-ec538719-aaac-4a33-8f7e-184ce3547cdc cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.817s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:56:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:44.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:56:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:44.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:56:44 np0005603623 nova_compute[226235]: 2026-01-31 08:56:44.994 226239 DEBUG nova.network.neutron [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Successfully created port: bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:56:46 np0005603623 nova_compute[226235]: 2026-01-31 08:56:46.092 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:46 np0005603623 ovn_controller[133449]: 2026-01-31T08:56:46Z|00754|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:56:46 np0005603623 nova_compute[226235]: 2026-01-31 08:56:46.424 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:46 np0005603623 ovn_controller[133449]: 2026-01-31T08:56:46Z|00755|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:56:46 np0005603623 nova_compute[226235]: 2026-01-31 08:56:46.511 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:56:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:46.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:56:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:56:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:46.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:56:46 np0005603623 nova_compute[226235]: 2026-01-31 08:56:46.784 226239 DEBUG nova.network.neutron [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Successfully updated port: bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:56:46 np0005603623 nova_compute[226235]: 2026-01-31 08:56:46.850 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "refresh_cache-b23e0349-42a8-41d0-9eea-0407b7ffa806" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:56:46 np0005603623 nova_compute[226235]: 2026-01-31 08:56:46.851 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquired lock "refresh_cache-b23e0349-42a8-41d0-9eea-0407b7ffa806" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:56:46 np0005603623 nova_compute[226235]: 2026-01-31 08:56:46.852 226239 DEBUG nova.network.neutron [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:56:47 np0005603623 nova_compute[226235]: 2026-01-31 08:56:47.060 226239 DEBUG nova.compute.manager [req-ede35aee-db9f-44f3-ab12-8acde8e9f6a9 req-018f9871-9cdf-44e2-bad2-4eecf6126d10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Received event network-changed-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:56:47 np0005603623 nova_compute[226235]: 2026-01-31 08:56:47.060 226239 DEBUG nova.compute.manager [req-ede35aee-db9f-44f3-ab12-8acde8e9f6a9 req-018f9871-9cdf-44e2-bad2-4eecf6126d10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Refreshing instance network info cache due to event network-changed-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:56:47 np0005603623 nova_compute[226235]: 2026-01-31 08:56:47.060 226239 DEBUG oslo_concurrency.lockutils [req-ede35aee-db9f-44f3-ab12-8acde8e9f6a9 req-018f9871-9cdf-44e2-bad2-4eecf6126d10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b23e0349-42a8-41d0-9eea-0407b7ffa806" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:56:47 np0005603623 nova_compute[226235]: 2026-01-31 08:56:47.063 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:47 np0005603623 nova_compute[226235]: 2026-01-31 08:56:47.105 226239 DEBUG nova.network.neutron [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:56:47 np0005603623 nova_compute[226235]: 2026-01-31 08:56:47.124 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:47 np0005603623 NetworkManager[48970]: <info>  [1769849807.1255] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Jan 31 03:56:47 np0005603623 NetworkManager[48970]: <info>  [1769849807.1266] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/355)
Jan 31 03:56:47 np0005603623 nova_compute[226235]: 2026-01-31 08:56:47.167 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:47 np0005603623 ovn_controller[133449]: 2026-01-31T08:56:47Z|00756|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:56:47 np0005603623 nova_compute[226235]: 2026-01-31 08:56:47.187 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.122 226239 DEBUG nova.compute.manager [req-140c60a4-8738-4807-9bbd-c1cfa3672b9a req-87a70f73-95c3-46c3-9a5f-79d56d1bae39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.123 226239 DEBUG nova.compute.manager [req-140c60a4-8738-4807-9bbd-c1cfa3672b9a req-87a70f73-95c3-46c3-9a5f-79d56d1bae39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing instance network info cache due to event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.123 226239 DEBUG oslo_concurrency.lockutils [req-140c60a4-8738-4807-9bbd-c1cfa3672b9a req-87a70f73-95c3-46c3-9a5f-79d56d1bae39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.123 226239 DEBUG oslo_concurrency.lockutils [req-140c60a4-8738-4807-9bbd-c1cfa3672b9a req-87a70f73-95c3-46c3-9a5f-79d56d1bae39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.123 226239 DEBUG nova.network.neutron [req-140c60a4-8738-4807-9bbd-c1cfa3672b9a req-87a70f73-95c3-46c3-9a5f-79d56d1bae39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.544 226239 DEBUG nova.network.neutron [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Updating instance_info_cache with network_info: [{"id": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "address": "fa:16:3e:95:38:a3", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcb7b61b-e5", "ovs_interfaceid": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.702 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Releasing lock "refresh_cache-b23e0349-42a8-41d0-9eea-0407b7ffa806" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.703 226239 DEBUG nova.compute.manager [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Instance network_info: |[{"id": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "address": "fa:16:3e:95:38:a3", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcb7b61b-e5", "ovs_interfaceid": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.704 226239 DEBUG oslo_concurrency.lockutils [req-ede35aee-db9f-44f3-ab12-8acde8e9f6a9 req-018f9871-9cdf-44e2-bad2-4eecf6126d10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b23e0349-42a8-41d0-9eea-0407b7ffa806" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.704 226239 DEBUG nova.network.neutron [req-ede35aee-db9f-44f3-ab12-8acde8e9f6a9 req-018f9871-9cdf-44e2-bad2-4eecf6126d10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Refreshing network info cache for port bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.707 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Start _get_guest_xml network_info=[{"id": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "address": "fa:16:3e:95:38:a3", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcb7b61b-e5", "ovs_interfaceid": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}}
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 03:56:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.714 226239 WARNING nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:56:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:48.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.723 226239 DEBUG nova.virt.libvirt.host [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.724 226239 DEBUG nova.virt.libvirt.host [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.729 226239 DEBUG nova.virt.libvirt.host [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.729 226239 DEBUG nova.virt.libvirt.host [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.731 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.731 226239 DEBUG nova.virt.hardware [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.731 226239 DEBUG nova.virt.hardware [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.732 226239 DEBUG nova.virt.hardware [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.732 226239 DEBUG nova.virt.hardware [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.732 226239 DEBUG nova.virt.hardware [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.732 226239 DEBUG nova.virt.hardware [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.733 226239 DEBUG nova.virt.hardware [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.733 226239 DEBUG nova.virt.hardware [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.733 226239 DEBUG nova.virt.hardware [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.733 226239 DEBUG nova.virt.hardware [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.734 226239 DEBUG nova.virt.hardware [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 03:56:48 np0005603623 nova_compute[226235]: 2026-01-31 08:56:48.736 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:56:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:48.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:56:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2797667370' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.148 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.174 226239 DEBUG nova.storage.rbd_utils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image b23e0349-42a8-41d0-9eea-0407b7ffa806_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.178 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:56:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2117009616' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.584 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.586 226239 DEBUG nova.virt.libvirt.vif [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:56:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-2065541187',display_name='tempest-AttachVolumeTestJSON-server-2065541187',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-2065541187',id=183,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUL5cEcJVRlCiMbz2iSvyLHRRbwondnCB0J5PuEgcXNy+3njwhMoe7I/EWgOT4I7wc9pdVLAB8zhu2jDR6J0fzH2TLS8K30f9hMVWgCCgCcK+C+JzBwM0cKlSwqRiExQA==',key_name='tempest-keypair-717536484',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab2d642eb03c4bda84a9a23e86f1fa4d',ramdisk_id='',reservation_id='r-wlcyw370',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1437067745',owner_user_name='tempest-AttachVolumeTestJSON-1437067745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:56:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53804fd0f3a14f95a4955e3bc6dcc8cb',uuid=b23e0349-42a8-41d0-9eea-0407b7ffa806,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "address": "fa:16:3e:95:38:a3", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcb7b61b-e5", "ovs_interfaceid": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.586 226239 DEBUG nova.network.os_vif_util [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converting VIF {"id": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "address": "fa:16:3e:95:38:a3", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcb7b61b-e5", "ovs_interfaceid": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.587 226239 DEBUG nova.network.os_vif_util [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:38:a3,bridge_name='br-int',has_traffic_filtering=True,id=bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcb7b61b-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.588 226239 DEBUG nova.objects.instance [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'pci_devices' on Instance uuid b23e0349-42a8-41d0-9eea-0407b7ffa806 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.711 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  <uuid>b23e0349-42a8-41d0-9eea-0407b7ffa806</uuid>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  <name>instance-000000b7</name>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <nova:name>tempest-AttachVolumeTestJSON-server-2065541187</nova:name>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:56:48</nova:creationTime>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <nova:user uuid="53804fd0f3a14f95a4955e3bc6dcc8cb">tempest-AttachVolumeTestJSON-1437067745-project-member</nova:user>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <nova:project uuid="ab2d642eb03c4bda84a9a23e86f1fa4d">tempest-AttachVolumeTestJSON-1437067745</nova:project>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <nova:port uuid="bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <entry name="serial">b23e0349-42a8-41d0-9eea-0407b7ffa806</entry>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <entry name="uuid">b23e0349-42a8-41d0-9eea-0407b7ffa806</entry>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/b23e0349-42a8-41d0-9eea-0407b7ffa806_disk">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/b23e0349-42a8-41d0-9eea-0407b7ffa806_disk.config">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:95:38:a3"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <target dev="tapbcb7b61b-e5"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/b23e0349-42a8-41d0-9eea-0407b7ffa806/console.log" append="off"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:56:49 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:56:49 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:56:49 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:56:49 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.712 226239 DEBUG nova.compute.manager [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Preparing to wait for external event network-vif-plugged-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.713 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.713 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.713 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.714 226239 DEBUG nova.virt.libvirt.vif [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:56:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-2065541187',display_name='tempest-AttachVolumeTestJSON-server-2065541187',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-2065541187',id=183,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUL5cEcJVRlCiMbz2iSvyLHRRbwondnCB0J5PuEgcXNy+3njwhMoe7I/EWgOT4I7wc9pdVLAB8zhu2jDR6J0fzH2TLS8K30f9hMVWgCCgCcK+C+JzBwM0cKlSwqRiExQA==',key_name='tempest-keypair-717536484',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab2d642eb03c4bda84a9a23e86f1fa4d',ramdisk_id='',reservation_id='r-wlcyw370',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1437067745',owner_user_name='tempest-AttachVolumeTestJSON-1437067745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:56:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53804fd0f3a14f95a4955e3bc6dcc8cb',uuid=b23e0349-42a8-41d0-9eea-0407b7ffa806,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "address": "fa:16:3e:95:38:a3", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcb7b61b-e5", "ovs_interfaceid": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.715 226239 DEBUG nova.network.os_vif_util [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converting VIF {"id": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "address": "fa:16:3e:95:38:a3", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcb7b61b-e5", "ovs_interfaceid": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.715 226239 DEBUG nova.network.os_vif_util [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:38:a3,bridge_name='br-int',has_traffic_filtering=True,id=bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcb7b61b-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.716 226239 DEBUG os_vif [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:38:a3,bridge_name='br-int',has_traffic_filtering=True,id=bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcb7b61b-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.717 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.718 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.718 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.723 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.724 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbcb7b61b-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.725 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbcb7b61b-e5, col_values=(('external_ids', {'iface-id': 'bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:38:a3', 'vm-uuid': 'b23e0349-42a8-41d0-9eea-0407b7ffa806'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:49 np0005603623 NetworkManager[48970]: <info>  [1769849809.7283] manager: (tapbcb7b61b-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/356)
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.728 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.732 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.735 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.736 226239 INFO os_vif [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:38:a3,bridge_name='br-int',has_traffic_filtering=True,id=bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcb7b61b-e5')#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.881 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.881 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.881 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No VIF found with MAC fa:16:3e:95:38:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.882 226239 INFO nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Using config drive#033[00m
Jan 31 03:56:49 np0005603623 nova_compute[226235]: 2026-01-31 08:56:49.903 226239 DEBUG nova.storage.rbd_utils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image b23e0349-42a8-41d0-9eea-0407b7ffa806_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.430 226239 DEBUG nova.compute.manager [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.430 226239 DEBUG nova.compute.manager [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing instance network info cache due to event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.430 226239 DEBUG oslo_concurrency.lockutils [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.496 226239 INFO nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Creating config drive at /var/lib/nova/instances/b23e0349-42a8-41d0-9eea-0407b7ffa806/disk.config#033[00m
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.500 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b23e0349-42a8-41d0-9eea-0407b7ffa806/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp9rhneduv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.620 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b23e0349-42a8-41d0-9eea-0407b7ffa806/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp9rhneduv" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.645 226239 DEBUG nova.storage.rbd_utils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image b23e0349-42a8-41d0-9eea-0407b7ffa806_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.649 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b23e0349-42a8-41d0-9eea-0407b7ffa806/disk.config b23e0349-42a8-41d0-9eea-0407b7ffa806_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.668 226239 DEBUG nova.network.neutron [req-140c60a4-8738-4807-9bbd-c1cfa3672b9a req-87a70f73-95c3-46c3-9a5f-79d56d1bae39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updated VIF entry in instance network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.670 226239 DEBUG nova.network.neutron [req-140c60a4-8738-4807-9bbd-c1cfa3672b9a req-87a70f73-95c3-46c3-9a5f-79d56d1bae39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating instance_info_cache with network_info: [{"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:56:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:50.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:50.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.927 226239 DEBUG oslo_concurrency.lockutils [req-140c60a4-8738-4807-9bbd-c1cfa3672b9a req-87a70f73-95c3-46c3-9a5f-79d56d1bae39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.928 226239 DEBUG oslo_concurrency.lockutils [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:56:50 np0005603623 nova_compute[226235]: 2026-01-31 08:56:50.928 226239 DEBUG nova.network.neutron [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:56:51 np0005603623 nova_compute[226235]: 2026-01-31 08:56:51.068 226239 DEBUG nova.network.neutron [req-ede35aee-db9f-44f3-ab12-8acde8e9f6a9 req-018f9871-9cdf-44e2-bad2-4eecf6126d10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Updated VIF entry in instance network info cache for port bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:56:51 np0005603623 nova_compute[226235]: 2026-01-31 08:56:51.068 226239 DEBUG nova.network.neutron [req-ede35aee-db9f-44f3-ab12-8acde8e9f6a9 req-018f9871-9cdf-44e2-bad2-4eecf6126d10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Updating instance_info_cache with network_info: [{"id": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "address": "fa:16:3e:95:38:a3", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcb7b61b-e5", "ovs_interfaceid": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:56:51 np0005603623 nova_compute[226235]: 2026-01-31 08:56:51.138 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:51 np0005603623 nova_compute[226235]: 2026-01-31 08:56:51.243 226239 DEBUG oslo_concurrency.lockutils [req-ede35aee-db9f-44f3-ab12-8acde8e9f6a9 req-018f9871-9cdf-44e2-bad2-4eecf6126d10 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b23e0349-42a8-41d0-9eea-0407b7ffa806" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:56:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:51 np0005603623 nova_compute[226235]: 2026-01-31 08:56:51.836 226239 DEBUG oslo_concurrency.processutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b23e0349-42a8-41d0-9eea-0407b7ffa806/disk.config b23e0349-42a8-41d0-9eea-0407b7ffa806_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.187s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:51 np0005603623 nova_compute[226235]: 2026-01-31 08:56:51.836 226239 INFO nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Deleting local config drive /var/lib/nova/instances/b23e0349-42a8-41d0-9eea-0407b7ffa806/disk.config because it was imported into RBD.#033[00m
Jan 31 03:56:51 np0005603623 kernel: tapbcb7b61b-e5: entered promiscuous mode
Jan 31 03:56:51 np0005603623 NetworkManager[48970]: <info>  [1769849811.8778] manager: (tapbcb7b61b-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/357)
Jan 31 03:56:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:56:51Z|00757|binding|INFO|Claiming lport bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d for this chassis.
Jan 31 03:56:51 np0005603623 nova_compute[226235]: 2026-01-31 08:56:51.878 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:56:51Z|00758|binding|INFO|bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d: Claiming fa:16:3e:95:38:a3 10.100.0.5
Jan 31 03:56:51 np0005603623 ovn_controller[133449]: 2026-01-31T08:56:51Z|00759|binding|INFO|Setting lport bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d ovn-installed in OVS
Jan 31 03:56:51 np0005603623 nova_compute[226235]: 2026-01-31 08:56:51.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:51 np0005603623 systemd-machined[194379]: New machine qemu-86-instance-000000b7.
Jan 31 03:56:51 np0005603623 systemd[1]: Started Virtual Machine qemu-86-instance-000000b7.
Jan 31 03:56:51 np0005603623 systemd-udevd[313723]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:56:51 np0005603623 NetworkManager[48970]: <info>  [1769849811.9261] device (tapbcb7b61b-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:56:51 np0005603623 NetworkManager[48970]: <info>  [1769849811.9268] device (tapbcb7b61b-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:56:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:56:52Z|00760|binding|INFO|Setting lport bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d up in Southbound
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.158 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:38:a3 10.100.0.5'], port_security=['fa:16:3e:95:38:a3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b23e0349-42a8-41d0-9eea-0407b7ffa806', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1b5345e-f6dc-4309-b059-80678428d42d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab2d642eb03c4bda84a9a23e86f1fa4d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3374df0b-0f82-408b-a043-afddd7a50c2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a496b319-a305-495c-a6a2-a324cd91f494, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.159 143258 INFO neutron.agent.ovn.metadata.agent [-] Port bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d in datapath a1b5345e-f6dc-4309-b059-80678428d42d bound to our chassis#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.160 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a1b5345e-f6dc-4309-b059-80678428d42d#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.168 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a2b88deb-cce6-4bce-b110-e7bd2cf73e11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.169 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa1b5345e-f1 in ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.171 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa1b5345e-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.171 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b4363c79-07e5-4243-b2c8-2d7379ddebdb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.172 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[45e8d068-f15a-4b42-b4e8-484640a68c04]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.184 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[2b951101-3a64-4bd7-abd4-b4900ed01634]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.193 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[471a89b1-b877-4f7b-af42-fd374946cb91]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.215 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b685be48-8396-442b-bd3d-f2cb7e96197e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.220 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f9823211-c897-4bae-811d-e713c74d7565]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 systemd-udevd[313727]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:56:52 np0005603623 NetworkManager[48970]: <info>  [1769849812.2209] manager: (tapa1b5345e-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/358)
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.242 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[2f8abb62-6364-42bc-8846-624fce683de1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.244 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[349a2610-00f8-4e31-acdd-072cd6fb40fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 NetworkManager[48970]: <info>  [1769849812.2583] device (tapa1b5345e-f0): carrier: link connected
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.262 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3056ab77-dfd4-45fb-a2f9-21ab7a575500]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.276 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ebb235e4-2358-4ba7-87bc-dcd8d81a1e1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1b5345e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:43:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 223], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 895268, 'reachable_time': 28194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 313756, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.287 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e48a25-5517-48a4-90ea-630ab30e2998]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee7:43a3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 895268, 'tstamp': 895268}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 313757, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.298 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e06a1308-e943-49de-af08-7dd337ba7de7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1b5345e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:43:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 223], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 895268, 'reachable_time': 28194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 313758, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.316 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb874d0-85dd-4c77-b2b7-bba82b7cc840]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.353 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[841295b8-1f2c-496b-9c70-e2760e70eb7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.355 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1b5345e-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.355 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.356 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1b5345e-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:52 np0005603623 nova_compute[226235]: 2026-01-31 08:56:52.357 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:52 np0005603623 NetworkManager[48970]: <info>  [1769849812.3582] manager: (tapa1b5345e-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/359)
Jan 31 03:56:52 np0005603623 kernel: tapa1b5345e-f0: entered promiscuous mode
Jan 31 03:56:52 np0005603623 nova_compute[226235]: 2026-01-31 08:56:52.360 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.362 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa1b5345e-f0, col_values=(('external_ids', {'iface-id': '7a59c286-57bd-4dc4-87e7-a6bfcee69c68'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:52 np0005603623 nova_compute[226235]: 2026-01-31 08:56:52.363 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:52 np0005603623 ovn_controller[133449]: 2026-01-31T08:56:52Z|00761|binding|INFO|Releasing lport 7a59c286-57bd-4dc4-87e7-a6bfcee69c68 from this chassis (sb_readonly=0)
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.365 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a1b5345e-f6dc-4309-b059-80678428d42d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a1b5345e-f6dc-4309-b059-80678428d42d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.366 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f1050ed2-3a92-470d-96e1-97da8e1e14a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.367 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-a1b5345e-f6dc-4309-b059-80678428d42d
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/a1b5345e-f6dc-4309-b059-80678428d42d.pid.haproxy
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID a1b5345e-f6dc-4309-b059-80678428d42d
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:56:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:56:52.368 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'env', 'PROCESS_TAG=haproxy-a1b5345e-f6dc-4309-b059-80678428d42d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a1b5345e-f6dc-4309-b059-80678428d42d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:56:52 np0005603623 nova_compute[226235]: 2026-01-31 08:56:52.368 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:52.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:52 np0005603623 podman[313827]: 2026-01-31 08:56:52.650871293 +0000 UTC m=+0.023824338 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:56:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:52.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:52 np0005603623 nova_compute[226235]: 2026-01-31 08:56:52.821 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849812.8202624, b23e0349-42a8-41d0-9eea-0407b7ffa806 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:56:52 np0005603623 nova_compute[226235]: 2026-01-31 08:56:52.821 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] VM Started (Lifecycle Event)#033[00m
Jan 31 03:56:52 np0005603623 nova_compute[226235]: 2026-01-31 08:56:52.930 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:56:52 np0005603623 nova_compute[226235]: 2026-01-31 08:56:52.935 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849812.8207943, b23e0349-42a8-41d0-9eea-0407b7ffa806 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:56:52 np0005603623 nova_compute[226235]: 2026-01-31 08:56:52.935 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:56:53 np0005603623 nova_compute[226235]: 2026-01-31 08:56:53.031 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:56:53 np0005603623 nova_compute[226235]: 2026-01-31 08:56:53.034 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:56:53 np0005603623 podman[313827]: 2026-01-31 08:56:53.105737552 +0000 UTC m=+0.478690567 container create a3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:56:53 np0005603623 systemd[1]: Started libpod-conmon-a3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90.scope.
Jan 31 03:56:53 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:56:53 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df7b33469585a7da0bee20dbc2be208246ee5410d131bdadb76c3a7b450f221d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:56:53 np0005603623 podman[313827]: 2026-01-31 08:56:53.207128342 +0000 UTC m=+0.580081377 container init a3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:56:53 np0005603623 podman[313827]: 2026-01-31 08:56:53.211558112 +0000 UTC m=+0.584511127 container start a3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:56:53 np0005603623 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[313848]: [NOTICE]   (313852) : New worker (313854) forked
Jan 31 03:56:53 np0005603623 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[313848]: [NOTICE]   (313852) : Loading success.
Jan 31 03:56:53 np0005603623 nova_compute[226235]: 2026-01-31 08:56:53.370 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:56:53 np0005603623 nova_compute[226235]: 2026-01-31 08:56:53.566 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:53 np0005603623 nova_compute[226235]: 2026-01-31 08:56:53.567 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:56:53 np0005603623 nova_compute[226235]: 2026-01-31 08:56:53.714 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.543 226239 DEBUG nova.compute.manager [req-0d7faab3-ab3a-4d04-87e3-5f2da0a9a58c req-3e06bfed-1b4a-4511-b185-70549641c907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Received event network-vif-plugged-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.544 226239 DEBUG oslo_concurrency.lockutils [req-0d7faab3-ab3a-4d04-87e3-5f2da0a9a58c req-3e06bfed-1b4a-4511-b185-70549641c907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.545 226239 DEBUG oslo_concurrency.lockutils [req-0d7faab3-ab3a-4d04-87e3-5f2da0a9a58c req-3e06bfed-1b4a-4511-b185-70549641c907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.545 226239 DEBUG oslo_concurrency.lockutils [req-0d7faab3-ab3a-4d04-87e3-5f2da0a9a58c req-3e06bfed-1b4a-4511-b185-70549641c907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.545 226239 DEBUG nova.compute.manager [req-0d7faab3-ab3a-4d04-87e3-5f2da0a9a58c req-3e06bfed-1b4a-4511-b185-70549641c907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Processing event network-vif-plugged-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.546 226239 DEBUG nova.compute.manager [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.550 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849814.5497904, b23e0349-42a8-41d0-9eea-0407b7ffa806 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.550 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.552 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.556 226239 INFO nova.virt.libvirt.driver [-] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Instance spawned successfully.#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.556 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.657 226239 DEBUG nova.network.neutron [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updated VIF entry in instance network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.658 226239 DEBUG nova.network.neutron [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating instance_info_cache with network_info: [{"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:56:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:54.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.728 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:54.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.818 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.820 226239 DEBUG oslo_concurrency.lockutils [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.821 226239 DEBUG nova.compute.manager [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.821 226239 DEBUG nova.compute.manager [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing instance network info cache due to event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.822 226239 DEBUG oslo_concurrency.lockutils [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.822 226239 DEBUG oslo_concurrency.lockutils [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.822 226239 DEBUG nova.network.neutron [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.826 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.829 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.829 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.830 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.831 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.831 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:56:54 np0005603623 nova_compute[226235]: 2026-01-31 08:56:54.832 226239 DEBUG nova.virt.libvirt.driver [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:56:55 np0005603623 nova_compute[226235]: 2026-01-31 08:56:55.003 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:56:55 np0005603623 nova_compute[226235]: 2026-01-31 08:56:55.340 226239 INFO nova.compute.manager [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Took 13.10 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:56:55 np0005603623 nova_compute[226235]: 2026-01-31 08:56:55.341 226239 DEBUG nova.compute.manager [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:56:55 np0005603623 nova_compute[226235]: 2026-01-31 08:56:55.718 226239 INFO nova.compute.manager [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Took 16.03 seconds to build instance.#033[00m
Jan 31 03:56:55 np0005603623 nova_compute[226235]: 2026-01-31 08:56:55.956 226239 DEBUG oslo_concurrency.lockutils [None req-f9891346-4932-4a93-acbd-734c5d892be3 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:56 np0005603623 nova_compute[226235]: 2026-01-31 08:56:56.140 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:56:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:56.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:56:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:56.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:58 np0005603623 nova_compute[226235]: 2026-01-31 08:56:58.056 226239 DEBUG nova.compute.manager [req-775b779f-d33f-437f-8d2b-79dbe61737e8 req-7af23815-b444-4ada-99a5-0cfdf2f5a6c1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Received event network-vif-plugged-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:58 np0005603623 nova_compute[226235]: 2026-01-31 08:56:58.057 226239 DEBUG oslo_concurrency.lockutils [req-775b779f-d33f-437f-8d2b-79dbe61737e8 req-7af23815-b444-4ada-99a5-0cfdf2f5a6c1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:58 np0005603623 nova_compute[226235]: 2026-01-31 08:56:58.057 226239 DEBUG oslo_concurrency.lockutils [req-775b779f-d33f-437f-8d2b-79dbe61737e8 req-7af23815-b444-4ada-99a5-0cfdf2f5a6c1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:58 np0005603623 nova_compute[226235]: 2026-01-31 08:56:58.057 226239 DEBUG oslo_concurrency.lockutils [req-775b779f-d33f-437f-8d2b-79dbe61737e8 req-7af23815-b444-4ada-99a5-0cfdf2f5a6c1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:58 np0005603623 nova_compute[226235]: 2026-01-31 08:56:58.057 226239 DEBUG nova.compute.manager [req-775b779f-d33f-437f-8d2b-79dbe61737e8 req-7af23815-b444-4ada-99a5-0cfdf2f5a6c1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] No waiting events found dispatching network-vif-plugged-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:56:58 np0005603623 nova_compute[226235]: 2026-01-31 08:56:58.058 226239 WARNING nova.compute.manager [req-775b779f-d33f-437f-8d2b-79dbe61737e8 req-7af23815-b444-4ada-99a5-0cfdf2f5a6c1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Received unexpected event network-vif-plugged-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d for instance with vm_state active and task_state None.#033[00m
Jan 31 03:56:58 np0005603623 nova_compute[226235]: 2026-01-31 08:56:58.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:58.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:56:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:58.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:59 np0005603623 nova_compute[226235]: 2026-01-31 08:56:59.730 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:00 np0005603623 nova_compute[226235]: 2026-01-31 08:57:00.127 226239 DEBUG nova.network.neutron [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updated VIF entry in instance network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:57:00 np0005603623 nova_compute[226235]: 2026-01-31 08:57:00.128 226239 DEBUG nova.network.neutron [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating instance_info_cache with network_info: [{"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:00 np0005603623 nova_compute[226235]: 2026-01-31 08:57:00.222 226239 DEBUG oslo_concurrency.lockutils [req-032e3d0b-3028-4f9d-b674-c75ed1cf1894 req-b364c5fa-b98a-482a-8eb1-9ad134508dcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:57:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:00.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:57:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:00.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:01 np0005603623 nova_compute[226235]: 2026-01-31 08:57:01.196 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:01 np0005603623 nova_compute[226235]: 2026-01-31 08:57:01.247 226239 DEBUG nova.compute.manager [req-215d5a69-1ad4-4854-8c83-3d7eb50292c0 req-ca86c768-fc5b-477c-b600-4336048b86a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Received event network-changed-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:01 np0005603623 nova_compute[226235]: 2026-01-31 08:57:01.248 226239 DEBUG nova.compute.manager [req-215d5a69-1ad4-4854-8c83-3d7eb50292c0 req-ca86c768-fc5b-477c-b600-4336048b86a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Refreshing instance network info cache due to event network-changed-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:57:01 np0005603623 nova_compute[226235]: 2026-01-31 08:57:01.248 226239 DEBUG oslo_concurrency.lockutils [req-215d5a69-1ad4-4854-8c83-3d7eb50292c0 req-ca86c768-fc5b-477c-b600-4336048b86a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b23e0349-42a8-41d0-9eea-0407b7ffa806" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:01 np0005603623 nova_compute[226235]: 2026-01-31 08:57:01.248 226239 DEBUG oslo_concurrency.lockutils [req-215d5a69-1ad4-4854-8c83-3d7eb50292c0 req-ca86c768-fc5b-477c-b600-4336048b86a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b23e0349-42a8-41d0-9eea-0407b7ffa806" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:01 np0005603623 nova_compute[226235]: 2026-01-31 08:57:01.248 226239 DEBUG nova.network.neutron [req-215d5a69-1ad4-4854-8c83-3d7eb50292c0 req-ca86c768-fc5b-477c-b600-4336048b86a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Refreshing network info cache for port bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:57:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:02.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:57:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:02.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:57:03 np0005603623 nova_compute[226235]: 2026-01-31 08:57:03.137 226239 DEBUG nova.network.neutron [req-215d5a69-1ad4-4854-8c83-3d7eb50292c0 req-ca86c768-fc5b-477c-b600-4336048b86a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Updated VIF entry in instance network info cache for port bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:57:03 np0005603623 nova_compute[226235]: 2026-01-31 08:57:03.137 226239 DEBUG nova.network.neutron [req-215d5a69-1ad4-4854-8c83-3d7eb50292c0 req-ca86c768-fc5b-477c-b600-4336048b86a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Updating instance_info_cache with network_info: [{"id": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "address": "fa:16:3e:95:38:a3", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcb7b61b-e5", "ovs_interfaceid": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:03 np0005603623 nova_compute[226235]: 2026-01-31 08:57:03.170 226239 DEBUG oslo_concurrency.lockutils [req-215d5a69-1ad4-4854-8c83-3d7eb50292c0 req-ca86c768-fc5b-477c-b600-4336048b86a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b23e0349-42a8-41d0-9eea-0407b7ffa806" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:03 np0005603623 nova_compute[226235]: 2026-01-31 08:57:03.247 226239 DEBUG nova.compute.manager [req-b41c3e9b-e737-4494-bb3d-fd05b7ad1bbd req-76e994b0-a6b4-475b-8aa1-123544647bc9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:03 np0005603623 nova_compute[226235]: 2026-01-31 08:57:03.248 226239 DEBUG nova.compute.manager [req-b41c3e9b-e737-4494-bb3d-fd05b7ad1bbd req-76e994b0-a6b4-475b-8aa1-123544647bc9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing instance network info cache due to event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:57:03 np0005603623 nova_compute[226235]: 2026-01-31 08:57:03.249 226239 DEBUG oslo_concurrency.lockutils [req-b41c3e9b-e737-4494-bb3d-fd05b7ad1bbd req-76e994b0-a6b4-475b-8aa1-123544647bc9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:03 np0005603623 nova_compute[226235]: 2026-01-31 08:57:03.249 226239 DEBUG oslo_concurrency.lockutils [req-b41c3e9b-e737-4494-bb3d-fd05b7ad1bbd req-76e994b0-a6b4-475b-8aa1-123544647bc9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:03 np0005603623 nova_compute[226235]: 2026-01-31 08:57:03.249 226239 DEBUG nova.network.neutron [req-b41c3e9b-e737-4494-bb3d-fd05b7ad1bbd req-76e994b0-a6b4-475b-8aa1-123544647bc9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:57:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:04.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:04 np0005603623 nova_compute[226235]: 2026-01-31 08:57:04.733 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:04.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:05 np0005603623 nova_compute[226235]: 2026-01-31 08:57:05.910 226239 DEBUG nova.network.neutron [req-b41c3e9b-e737-4494-bb3d-fd05b7ad1bbd req-76e994b0-a6b4-475b-8aa1-123544647bc9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updated VIF entry in instance network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:57:05 np0005603623 nova_compute[226235]: 2026-01-31 08:57:05.910 226239 DEBUG nova.network.neutron [req-b41c3e9b-e737-4494-bb3d-fd05b7ad1bbd req-76e994b0-a6b4-475b-8aa1-123544647bc9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating instance_info_cache with network_info: [{"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:05 np0005603623 nova_compute[226235]: 2026-01-31 08:57:05.936 226239 DEBUG oslo_concurrency.lockutils [req-b41c3e9b-e737-4494-bb3d-fd05b7ad1bbd req-76e994b0-a6b4-475b-8aa1-123544647bc9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:06Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:38:a3 10.100.0.5
Jan 31 03:57:06 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:06Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:38:a3 10.100.0.5
Jan 31 03:57:06 np0005603623 nova_compute[226235]: 2026-01-31 08:57:06.197 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:06 np0005603623 nova_compute[226235]: 2026-01-31 08:57:06.236 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:06 np0005603623 nova_compute[226235]: 2026-01-31 08:57:06.269 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:06 np0005603623 nova_compute[226235]: 2026-01-31 08:57:06.270 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:57:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:57:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:06.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:57:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:57:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:06.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:57:07 np0005603623 nova_compute[226235]: 2026-01-31 08:57:07.253 226239 DEBUG oslo_concurrency.lockutils [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:07 np0005603623 nova_compute[226235]: 2026-01-31 08:57:07.253 226239 DEBUG oslo_concurrency.lockutils [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:07 np0005603623 nova_compute[226235]: 2026-01-31 08:57:07.268 226239 INFO nova.compute.manager [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Detaching volume 985c0898-7bd0-457c-b3bb-abe45d65168a#033[00m
Jan 31 03:57:07 np0005603623 nova_compute[226235]: 2026-01-31 08:57:07.498 226239 INFO nova.virt.block_device [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Attempting to driver detach volume 985c0898-7bd0-457c-b3bb-abe45d65168a from mountpoint /dev/vdb#033[00m
Jan 31 03:57:07 np0005603623 nova_compute[226235]: 2026-01-31 08:57:07.506 226239 DEBUG nova.virt.libvirt.driver [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Attempting to detach device vdb from instance e9903ecf-c775-4e84-8997-361061869fc6 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:57:07 np0005603623 nova_compute[226235]: 2026-01-31 08:57:07.507 226239 DEBUG nova.virt.libvirt.guest [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:57:07 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-985c0898-7bd0-457c-b3bb-abe45d65168a">
Jan 31 03:57:07 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:  <serial>985c0898-7bd0-457c-b3bb-abe45d65168a</serial>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:57:07 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:57:07 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:57:07 np0005603623 nova_compute[226235]: 2026-01-31 08:57:07.515 226239 INFO nova.virt.libvirt.driver [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Successfully detached device vdb from instance e9903ecf-c775-4e84-8997-361061869fc6 from the persistent domain config.#033[00m
Jan 31 03:57:07 np0005603623 nova_compute[226235]: 2026-01-31 08:57:07.515 226239 DEBUG nova.virt.libvirt.driver [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance e9903ecf-c775-4e84-8997-361061869fc6 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:57:07 np0005603623 nova_compute[226235]: 2026-01-31 08:57:07.516 226239 DEBUG nova.virt.libvirt.guest [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:57:07 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-985c0898-7bd0-457c-b3bb-abe45d65168a">
Jan 31 03:57:07 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:  <serial>985c0898-7bd0-457c-b3bb-abe45d65168a</serial>
Jan 31 03:57:07 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:57:07 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:57:07 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:57:07 np0005603623 nova_compute[226235]: 2026-01-31 08:57:07.634 226239 DEBUG nova.virt.libvirt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Received event <DeviceRemovedEvent: 1769849827.6343908, e9903ecf-c775-4e84-8997-361061869fc6 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:57:07 np0005603623 nova_compute[226235]: 2026-01-31 08:57:07.636 226239 DEBUG nova.virt.libvirt.driver [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance e9903ecf-c775-4e84-8997-361061869fc6 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:57:07 np0005603623 nova_compute[226235]: 2026-01-31 08:57:07.638 226239 INFO nova.virt.libvirt.driver [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Successfully detached device vdb from instance e9903ecf-c775-4e84-8997-361061869fc6 from the live domain config.#033[00m
Jan 31 03:57:08 np0005603623 nova_compute[226235]: 2026-01-31 08:57:08.412 226239 DEBUG nova.objects.instance [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'flavor' on Instance uuid e9903ecf-c775-4e84-8997-361061869fc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:08 np0005603623 nova_compute[226235]: 2026-01-31 08:57:08.461 226239 DEBUG oslo_concurrency.lockutils [None req-68ee357f-3ead-4a38-a728-93cd82434c9d cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:08.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:08.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:09 np0005603623 nova_compute[226235]: 2026-01-31 08:57:09.735 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:57:10 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3685118432' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:57:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:57:10 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3685118432' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:57:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:10.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:10.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.199 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.427 226239 DEBUG oslo_concurrency.lockutils [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.427 226239 DEBUG oslo_concurrency.lockutils [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.443 226239 INFO nova.compute.manager [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Detaching volume 75a603c9-1c6b-4103-ac69-0db6813e7404#033[00m
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.752 226239 INFO nova.virt.block_device [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Attempting to driver detach volume 75a603c9-1c6b-4103-ac69-0db6813e7404 from mountpoint /dev/vdc#033[00m
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.759 226239 DEBUG nova.virt.libvirt.driver [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Attempting to detach device vdc from instance e9903ecf-c775-4e84-8997-361061869fc6 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.760 226239 DEBUG nova.virt.libvirt.guest [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:57:11 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-75a603c9-1c6b-4103-ac69-0db6813e7404">
Jan 31 03:57:11 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:  <target dev="vdc" bus="virtio"/>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:  <serial>75a603c9-1c6b-4103-ac69-0db6813e7404</serial>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 31 03:57:11 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:57:11 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.768 226239 INFO nova.virt.libvirt.driver [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Successfully detached device vdc from instance e9903ecf-c775-4e84-8997-361061869fc6 from the persistent domain config.#033[00m
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.768 226239 DEBUG nova.virt.libvirt.driver [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance e9903ecf-c775-4e84-8997-361061869fc6 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.769 226239 DEBUG nova.virt.libvirt.guest [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:57:11 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-75a603c9-1c6b-4103-ac69-0db6813e7404">
Jan 31 03:57:11 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:  <target dev="vdc" bus="virtio"/>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:  <serial>75a603c9-1c6b-4103-ac69-0db6813e7404</serial>
Jan 31 03:57:11 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 31 03:57:11 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:57:11 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.872 226239 DEBUG nova.virt.libvirt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Received event <DeviceRemovedEvent: 1769849831.872087, e9903ecf-c775-4e84-8997-361061869fc6 => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.873 226239 DEBUG nova.virt.libvirt.driver [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance e9903ecf-c775-4e84-8997-361061869fc6 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:57:11 np0005603623 nova_compute[226235]: 2026-01-31 08:57:11.875 226239 INFO nova.virt.libvirt.driver [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Successfully detached device vdc from instance e9903ecf-c775-4e84-8997-361061869fc6 from the live domain config.#033[00m
Jan 31 03:57:11 np0005603623 podman[313927]: 2026-01-31 08:57:11.963260805 +0000 UTC m=+0.047498641 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:57:11 np0005603623 podman[313928]: 2026-01-31 08:57:11.984300185 +0000 UTC m=+0.068537851 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible)
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.197 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.198 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.198 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.198 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.198 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.269 226239 DEBUG nova.objects.instance [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'flavor' on Instance uuid e9903ecf-c775-4e84-8997-361061869fc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.360 226239 DEBUG oslo_concurrency.lockutils [None req-5abccdd4-ce6f-48b0-a3d6-94867e6b6044 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.932s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:57:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2538271858' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.611 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.708 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000b7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.708 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000b7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.712 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.712 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:57:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:12.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:12.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.857 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.858 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3842MB free_disk=20.896854400634766GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.859 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:12 np0005603623 nova_compute[226235]: 2026-01-31 08:57:12.859 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.195 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance e9903ecf-c775-4e84-8997-361061869fc6 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.195 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance b23e0349-42a8-41d0-9eea-0407b7ffa806 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.196 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.196 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.213 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.238 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.238 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.260 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.294 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.394 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:57:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1616483987' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.862 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.866 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.906 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.957 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:57:13 np0005603623 nova_compute[226235]: 2026-01-31 08:57:13.958 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:14 np0005603623 nova_compute[226235]: 2026-01-31 08:57:14.265 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:14 np0005603623 nova_compute[226235]: 2026-01-31 08:57:14.576 226239 DEBUG nova.compute.manager [req-9787be40-9669-4229-bde9-6d882bb83e53 req-07419bff-24db-4484-9f92-0b436a9d657e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:14 np0005603623 nova_compute[226235]: 2026-01-31 08:57:14.577 226239 DEBUG nova.compute.manager [req-9787be40-9669-4229-bde9-6d882bb83e53 req-07419bff-24db-4484-9f92-0b436a9d657e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing instance network info cache due to event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:57:14 np0005603623 nova_compute[226235]: 2026-01-31 08:57:14.577 226239 DEBUG oslo_concurrency.lockutils [req-9787be40-9669-4229-bde9-6d882bb83e53 req-07419bff-24db-4484-9f92-0b436a9d657e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:14 np0005603623 nova_compute[226235]: 2026-01-31 08:57:14.577 226239 DEBUG oslo_concurrency.lockutils [req-9787be40-9669-4229-bde9-6d882bb83e53 req-07419bff-24db-4484-9f92-0b436a9d657e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:14 np0005603623 nova_compute[226235]: 2026-01-31 08:57:14.577 226239 DEBUG nova.network.neutron [req-9787be40-9669-4229-bde9-6d882bb83e53 req-07419bff-24db-4484-9f92-0b436a9d657e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:57:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:57:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/246009028' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:57:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:57:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/246009028' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:57:14 np0005603623 nova_compute[226235]: 2026-01-31 08:57:14.736 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:14.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:14.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:15 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:57:15 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:57:15 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:57:15 np0005603623 nova_compute[226235]: 2026-01-31 08:57:15.959 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:15 np0005603623 nova_compute[226235]: 2026-01-31 08:57:15.959 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:57:15 np0005603623 nova_compute[226235]: 2026-01-31 08:57:15.960 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:57:16 np0005603623 nova_compute[226235]: 2026-01-31 08:57:16.201 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:16 np0005603623 nova_compute[226235]: 2026-01-31 08:57:16.255 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:16.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:16.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:17 np0005603623 nova_compute[226235]: 2026-01-31 08:57:17.296 226239 DEBUG nova.network.neutron [req-9787be40-9669-4229-bde9-6d882bb83e53 req-07419bff-24db-4484-9f92-0b436a9d657e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updated VIF entry in instance network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:57:17 np0005603623 nova_compute[226235]: 2026-01-31 08:57:17.296 226239 DEBUG nova.network.neutron [req-9787be40-9669-4229-bde9-6d882bb83e53 req-07419bff-24db-4484-9f92-0b436a9d657e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating instance_info_cache with network_info: [{"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:17 np0005603623 nova_compute[226235]: 2026-01-31 08:57:17.328 226239 DEBUG oslo_concurrency.lockutils [req-9787be40-9669-4229-bde9-6d882bb83e53 req-07419bff-24db-4484-9f92-0b436a9d657e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:17 np0005603623 nova_compute[226235]: 2026-01-31 08:57:17.329 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:17 np0005603623 nova_compute[226235]: 2026-01-31 08:57:17.329 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:57:17 np0005603623 nova_compute[226235]: 2026-01-31 08:57:17.329 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e9903ecf-c775-4e84-8997-361061869fc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:57:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:18.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:57:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:18.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:19 np0005603623 nova_compute[226235]: 2026-01-31 08:57:19.414 226239 DEBUG oslo_concurrency.lockutils [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:19 np0005603623 nova_compute[226235]: 2026-01-31 08:57:19.415 226239 DEBUG oslo_concurrency.lockutils [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:19 np0005603623 nova_compute[226235]: 2026-01-31 08:57:19.465 226239 DEBUG nova.objects.instance [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid b23e0349-42a8-41d0-9eea-0407b7ffa806 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:19 np0005603623 nova_compute[226235]: 2026-01-31 08:57:19.521 226239 DEBUG oslo_concurrency.lockutils [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:19 np0005603623 nova_compute[226235]: 2026-01-31 08:57:19.600 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating instance_info_cache with network_info: [{"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:19 np0005603623 nova_compute[226235]: 2026-01-31 08:57:19.638 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:19 np0005603623 nova_compute[226235]: 2026-01-31 08:57:19.638 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:57:19 np0005603623 nova_compute[226235]: 2026-01-31 08:57:19.638 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:19 np0005603623 nova_compute[226235]: 2026-01-31 08:57:19.639 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:19 np0005603623 nova_compute[226235]: 2026-01-31 08:57:19.738 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.108 226239 DEBUG oslo_concurrency.lockutils [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.108 226239 DEBUG oslo_concurrency.lockutils [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.108 226239 INFO nova.compute.manager [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Attaching volume af751b83-d2b1-494f-9ebb-7edf7087a67c to /dev/vdb#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.348 226239 DEBUG os_brick.utils [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.350 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.359 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.359 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[7aeb41ff-c47c-4e9d-97bf-cd95b8c5d944]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.361 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.367 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.367 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[7f647230-9e21-4bda-ba15-16c7faa95fee]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.368 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.374 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.374 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[c3058c6e-e522-4afc-b139-8c9f2d3e4b71]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.375 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[f25a705a-d951-4a72-af57-23978d4e1b5c]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.376 226239 DEBUG oslo_concurrency.processutils [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.396 226239 DEBUG oslo_concurrency.processutils [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "nvme version" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.398 226239 DEBUG os_brick.initiator.connectors.lightos [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.398 226239 DEBUG os_brick.initiator.connectors.lightos [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.398 226239 DEBUG os_brick.initiator.connectors.lightos [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.399 226239 DEBUG os_brick.utils [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] <== get_connector_properties: return (49ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:57:20 np0005603623 nova_compute[226235]: 2026-01-31 08:57:20.399 226239 DEBUG nova.virt.block_device [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Updating existing volume attachment record: a697c025-cb73-4bcc-b4d8-801e4a7d6792 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:57:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:57:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:57:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:20.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:20.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:21 np0005603623 nova_compute[226235]: 2026-01-31 08:57:21.203 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:57:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3894776916' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:57:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:21Z|00762|binding|INFO|Releasing lport 7a59c286-57bd-4dc4-87e7-a6bfcee69c68 from this chassis (sb_readonly=0)
Jan 31 03:57:21 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:21Z|00763|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:57:21 np0005603623 nova_compute[226235]: 2026-01-31 08:57:21.556 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:21 np0005603623 nova_compute[226235]: 2026-01-31 08:57:21.715 226239 DEBUG nova.objects.instance [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid b23e0349-42a8-41d0-9eea-0407b7ffa806 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:21 np0005603623 nova_compute[226235]: 2026-01-31 08:57:21.721 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:21 np0005603623 nova_compute[226235]: 2026-01-31 08:57:21.746 226239 DEBUG nova.virt.libvirt.driver [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Attempting to attach volume af751b83-d2b1-494f-9ebb-7edf7087a67c with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:57:21 np0005603623 nova_compute[226235]: 2026-01-31 08:57:21.750 226239 DEBUG nova.virt.libvirt.guest [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:57:21 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:21 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-af751b83-d2b1-494f-9ebb-7edf7087a67c">
Jan 31 03:57:21 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:21 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:21 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:21 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:57:21 np0005603623 nova_compute[226235]:  <auth username="openstack">
Jan 31 03:57:21 np0005603623 nova_compute[226235]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:57:21 np0005603623 nova_compute[226235]:  </auth>
Jan 31 03:57:21 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:57:21 np0005603623 nova_compute[226235]:  <serial>af751b83-d2b1-494f-9ebb-7edf7087a67c</serial>
Jan 31 03:57:21 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:57:21 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:57:21 np0005603623 nova_compute[226235]: 2026-01-31 08:57:21.854 226239 DEBUG nova.virt.libvirt.driver [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:21 np0005603623 nova_compute[226235]: 2026-01-31 08:57:21.854 226239 DEBUG nova.virt.libvirt.driver [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:21 np0005603623 nova_compute[226235]: 2026-01-31 08:57:21.854 226239 DEBUG nova.virt.libvirt.driver [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:21 np0005603623 nova_compute[226235]: 2026-01-31 08:57:21.855 226239 DEBUG nova.virt.libvirt.driver [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No VIF found with MAC fa:16:3e:95:38:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:57:22 np0005603623 nova_compute[226235]: 2026-01-31 08:57:22.350 226239 DEBUG oslo_concurrency.lockutils [None req-be31680a-8a9e-4ba6-9064-346c8d853688 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:57:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/989841347' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:57:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:22.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:22.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:24.487 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:57:24 np0005603623 nova_compute[226235]: 2026-01-31 08:57:24.488 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:24.489 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:57:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:24.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:24 np0005603623 nova_compute[226235]: 2026-01-31 08:57:24.769 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:57:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:24.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.149 226239 DEBUG oslo_concurrency.lockutils [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.150 226239 DEBUG oslo_concurrency.lockutils [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.178 226239 DEBUG nova.objects.instance [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid b23e0349-42a8-41d0-9eea-0407b7ffa806 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.271 226239 DEBUG oslo_concurrency.lockutils [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.679 226239 DEBUG oslo_concurrency.lockutils [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.679 226239 DEBUG oslo_concurrency.lockutils [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.679 226239 INFO nova.compute.manager [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Attaching volume d8da98d6-6c1a-48c1-8832-4fd207143449 to /dev/vdc#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.982 226239 DEBUG os_brick.utils [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.983 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.989 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.989 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[5c8ad905-68a2-40db-b9fa-77a300843280]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.990 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.994 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.004s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.994 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[2cc9a3e2-659f-4a8b-986b-8c77777ed385]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:25 np0005603623 nova_compute[226235]: 2026-01-31 08:57:25.996 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:26 np0005603623 nova_compute[226235]: 2026-01-31 08:57:26.001 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.005s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:26 np0005603623 nova_compute[226235]: 2026-01-31 08:57:26.001 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[bc9fa052-983a-4268-9461-540f252acfe6]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:26 np0005603623 nova_compute[226235]: 2026-01-31 08:57:26.003 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[44efa367-bc41-463d-ab3a-9461df7860d7]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:26 np0005603623 nova_compute[226235]: 2026-01-31 08:57:26.003 226239 DEBUG oslo_concurrency.processutils [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:26 np0005603623 nova_compute[226235]: 2026-01-31 08:57:26.022 226239 DEBUG oslo_concurrency.processutils [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "nvme version" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:26 np0005603623 nova_compute[226235]: 2026-01-31 08:57:26.024 226239 DEBUG os_brick.initiator.connectors.lightos [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:57:26 np0005603623 nova_compute[226235]: 2026-01-31 08:57:26.024 226239 DEBUG os_brick.initiator.connectors.lightos [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:57:26 np0005603623 nova_compute[226235]: 2026-01-31 08:57:26.024 226239 DEBUG os_brick.initiator.connectors.lightos [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:57:26 np0005603623 nova_compute[226235]: 2026-01-31 08:57:26.025 226239 DEBUG os_brick.utils [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] <== get_connector_properties: return (42ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:57:26 np0005603623 nova_compute[226235]: 2026-01-31 08:57:26.025 226239 DEBUG nova.virt.block_device [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Updating existing volume attachment record: fb8393c9-ab65-4a4f-89cd-2af5b8d15b45 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:57:26 np0005603623 nova_compute[226235]: 2026-01-31 08:57:26.204 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:26 np0005603623 nova_compute[226235]: 2026-01-31 08:57:26.591 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:26.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:57:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:26.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:57:27 np0005603623 nova_compute[226235]: 2026-01-31 08:57:27.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:27 np0005603623 nova_compute[226235]: 2026-01-31 08:57:27.427 226239 DEBUG nova.objects.instance [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid b23e0349-42a8-41d0-9eea-0407b7ffa806 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:27 np0005603623 nova_compute[226235]: 2026-01-31 08:57:27.507 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:27 np0005603623 nova_compute[226235]: 2026-01-31 08:57:27.510 226239 DEBUG nova.virt.libvirt.driver [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Attempting to attach volume d8da98d6-6c1a-48c1-8832-4fd207143449 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:57:27 np0005603623 nova_compute[226235]: 2026-01-31 08:57:27.512 226239 DEBUG nova.virt.libvirt.guest [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:57:27 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:27 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-d8da98d6-6c1a-48c1-8832-4fd207143449">
Jan 31 03:57:27 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:27 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:27 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:27 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:57:27 np0005603623 nova_compute[226235]:  <auth username="openstack">
Jan 31 03:57:27 np0005603623 nova_compute[226235]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:57:27 np0005603623 nova_compute[226235]:  </auth>
Jan 31 03:57:27 np0005603623 nova_compute[226235]:  <target dev="vdc" bus="virtio"/>
Jan 31 03:57:27 np0005603623 nova_compute[226235]:  <serial>d8da98d6-6c1a-48c1-8832-4fd207143449</serial>
Jan 31 03:57:27 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:57:27 np0005603623 nova_compute[226235]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:57:27 np0005603623 nova_compute[226235]: 2026-01-31 08:57:27.860 226239 DEBUG nova.virt.libvirt.driver [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:27 np0005603623 nova_compute[226235]: 2026-01-31 08:57:27.861 226239 DEBUG nova.virt.libvirt.driver [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:27 np0005603623 nova_compute[226235]: 2026-01-31 08:57:27.861 226239 DEBUG nova.virt.libvirt.driver [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:27 np0005603623 nova_compute[226235]: 2026-01-31 08:57:27.861 226239 DEBUG nova.virt.libvirt.driver [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:27 np0005603623 nova_compute[226235]: 2026-01-31 08:57:27.861 226239 DEBUG nova.virt.libvirt.driver [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No VIF found with MAC fa:16:3e:95:38:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:57:28 np0005603623 nova_compute[226235]: 2026-01-31 08:57:28.270 226239 DEBUG oslo_concurrency.lockutils [None req-33cfe3bb-2893-4c9c-8ed7-ae66ffba9d24 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:28.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:28.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:29 np0005603623 nova_compute[226235]: 2026-01-31 08:57:29.772 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:30.149 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:30.150 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:30.150 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:30 np0005603623 nova_compute[226235]: 2026-01-31 08:57:30.668 226239 DEBUG oslo_concurrency.lockutils [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:30 np0005603623 nova_compute[226235]: 2026-01-31 08:57:30.668 226239 DEBUG oslo_concurrency.lockutils [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:30 np0005603623 nova_compute[226235]: 2026-01-31 08:57:30.703 226239 INFO nova.compute.manager [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Detaching volume af751b83-d2b1-494f-9ebb-7edf7087a67c#033[00m
Jan 31 03:57:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:30.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:30.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:30 np0005603623 nova_compute[226235]: 2026-01-31 08:57:30.987 226239 INFO nova.virt.block_device [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Attempting to driver detach volume af751b83-d2b1-494f-9ebb-7edf7087a67c from mountpoint /dev/vdb#033[00m
Jan 31 03:57:30 np0005603623 nova_compute[226235]: 2026-01-31 08:57:30.994 226239 DEBUG nova.virt.libvirt.driver [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Attempting to detach device vdb from instance b23e0349-42a8-41d0-9eea-0407b7ffa806 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:57:30 np0005603623 nova_compute[226235]: 2026-01-31 08:57:30.995 226239 DEBUG nova.virt.libvirt.guest [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:57:30 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:30 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-af751b83-d2b1-494f-9ebb-7edf7087a67c">
Jan 31 03:57:30 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:30 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:30 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:30 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:57:30 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:57:30 np0005603623 nova_compute[226235]:  <serial>af751b83-d2b1-494f-9ebb-7edf7087a67c</serial>
Jan 31 03:57:30 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:57:30 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:57:30 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:57:31 np0005603623 nova_compute[226235]: 2026-01-31 08:57:31.001 226239 INFO nova.virt.libvirt.driver [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully detached device vdb from instance b23e0349-42a8-41d0-9eea-0407b7ffa806 from the persistent domain config.#033[00m
Jan 31 03:57:31 np0005603623 nova_compute[226235]: 2026-01-31 08:57:31.001 226239 DEBUG nova.virt.libvirt.driver [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance b23e0349-42a8-41d0-9eea-0407b7ffa806 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:57:31 np0005603623 nova_compute[226235]: 2026-01-31 08:57:31.002 226239 DEBUG nova.virt.libvirt.guest [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:57:31 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:31 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-af751b83-d2b1-494f-9ebb-7edf7087a67c">
Jan 31 03:57:31 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:31 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:31 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:31 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:57:31 np0005603623 nova_compute[226235]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:57:31 np0005603623 nova_compute[226235]:  <serial>af751b83-d2b1-494f-9ebb-7edf7087a67c</serial>
Jan 31 03:57:31 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:57:31 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:57:31 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:57:31 np0005603623 nova_compute[226235]: 2026-01-31 08:57:31.092 226239 DEBUG nova.virt.libvirt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Received event <DeviceRemovedEvent: 1769849851.0923538, b23e0349-42a8-41d0-9eea-0407b7ffa806 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:57:31 np0005603623 nova_compute[226235]: 2026-01-31 08:57:31.094 226239 DEBUG nova.virt.libvirt.driver [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance b23e0349-42a8-41d0-9eea-0407b7ffa806 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:57:31 np0005603623 nova_compute[226235]: 2026-01-31 08:57:31.096 226239 INFO nova.virt.libvirt.driver [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully detached device vdb from instance b23e0349-42a8-41d0-9eea-0407b7ffa806 from the live domain config.#033[00m
Jan 31 03:57:31 np0005603623 nova_compute[226235]: 2026-01-31 08:57:31.207 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:31 np0005603623 nova_compute[226235]: 2026-01-31 08:57:31.406 226239 DEBUG nova.objects.instance [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid b23e0349-42a8-41d0-9eea-0407b7ffa806 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:31 np0005603623 nova_compute[226235]: 2026-01-31 08:57:31.472 226239 DEBUG oslo_concurrency.lockutils [None req-f8e6f829-8ce2-4f60-847d-8ea1dba4faf0 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:31.490 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:57:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:32.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:57:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:32.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.178 226239 DEBUG oslo_concurrency.lockutils [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.178 226239 DEBUG oslo_concurrency.lockutils [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.217 226239 INFO nova.compute.manager [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Detaching volume d8da98d6-6c1a-48c1-8832-4fd207143449#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.450 226239 INFO nova.virt.block_device [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Attempting to driver detach volume d8da98d6-6c1a-48c1-8832-4fd207143449 from mountpoint /dev/vdc#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.456 226239 DEBUG nova.virt.libvirt.driver [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Attempting to detach device vdc from instance b23e0349-42a8-41d0-9eea-0407b7ffa806 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.457 226239 DEBUG nova.virt.libvirt.guest [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:57:33 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-d8da98d6-6c1a-48c1-8832-4fd207143449">
Jan 31 03:57:33 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:  <target dev="vdc" bus="virtio"/>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:  <serial>d8da98d6-6c1a-48c1-8832-4fd207143449</serial>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 31 03:57:33 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:57:33 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.464 226239 INFO nova.virt.libvirt.driver [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully detached device vdc from instance b23e0349-42a8-41d0-9eea-0407b7ffa806 from the persistent domain config.#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.464 226239 DEBUG nova.virt.libvirt.driver [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance b23e0349-42a8-41d0-9eea-0407b7ffa806 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.465 226239 DEBUG nova.virt.libvirt.guest [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:57:33 np0005603623 nova_compute[226235]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:  <source protocol="rbd" name="volumes/volume-d8da98d6-6c1a-48c1-8832-4fd207143449">
Jan 31 03:57:33 np0005603623 nova_compute[226235]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:  </source>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:  <target dev="vdc" bus="virtio"/>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:  <serial>d8da98d6-6c1a-48c1-8832-4fd207143449</serial>
Jan 31 03:57:33 np0005603623 nova_compute[226235]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 31 03:57:33 np0005603623 nova_compute[226235]: </disk>
Jan 31 03:57:33 np0005603623 nova_compute[226235]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.532 226239 DEBUG nova.virt.libvirt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Received event <DeviceRemovedEvent: 1769849853.5320294, b23e0349-42a8-41d0-9eea-0407b7ffa806 => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.533 226239 DEBUG nova.virt.libvirt.driver [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance b23e0349-42a8-41d0-9eea-0407b7ffa806 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.535 226239 INFO nova.virt.libvirt.driver [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully detached device vdc from instance b23e0349-42a8-41d0-9eea-0407b7ffa806 from the live domain config.#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.864 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:33 np0005603623 nova_compute[226235]: 2026-01-31 08:57:33.973 226239 DEBUG nova.objects.instance [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid b23e0349-42a8-41d0-9eea-0407b7ffa806 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:34 np0005603623 nova_compute[226235]: 2026-01-31 08:57:34.024 226239 DEBUG oslo_concurrency.lockutils [None req-ef723421-5b39-442c-9e4b-8442b413eb78 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:34.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:34 np0005603623 nova_compute[226235]: 2026-01-31 08:57:34.815 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:57:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:34.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.650 226239 DEBUG oslo_concurrency.lockutils [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.650 226239 DEBUG oslo_concurrency.lockutils [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.651 226239 DEBUG oslo_concurrency.lockutils [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.651 226239 DEBUG oslo_concurrency.lockutils [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.652 226239 DEBUG oslo_concurrency.lockutils [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.653 226239 INFO nova.compute.manager [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Terminating instance#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.654 226239 DEBUG nova.compute.manager [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:57:35 np0005603623 kernel: tapbcb7b61b-e5 (unregistering): left promiscuous mode
Jan 31 03:57:35 np0005603623 NetworkManager[48970]: <info>  [1769849855.6979] device (tapbcb7b61b-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.703 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:35Z|00764|binding|INFO|Releasing lport bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d from this chassis (sb_readonly=0)
Jan 31 03:57:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:35Z|00765|binding|INFO|Setting lport bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d down in Southbound
Jan 31 03:57:35 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:35Z|00766|binding|INFO|Removing iface tapbcb7b61b-e5 ovn-installed in OVS
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.708 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.713 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:38:a3 10.100.0.5'], port_security=['fa:16:3e:95:38:a3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b23e0349-42a8-41d0-9eea-0407b7ffa806', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1b5345e-f6dc-4309-b059-80678428d42d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab2d642eb03c4bda84a9a23e86f1fa4d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3374df0b-0f82-408b-a043-afddd7a50c2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a496b319-a305-495c-a6a2-a324cd91f494, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.715 143258 INFO neutron.agent.ovn.metadata.agent [-] Port bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d in datapath a1b5345e-f6dc-4309-b059-80678428d42d unbound from our chassis#033[00m
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.717 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1b5345e-f6dc-4309-b059-80678428d42d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.718 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[be4f5cfe-1337-45d3-bea7-c4fe66877c28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.718 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d namespace which is not needed anymore#033[00m
Jan 31 03:57:35 np0005603623 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b7.scope: Deactivated successfully.
Jan 31 03:57:35 np0005603623 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b7.scope: Consumed 13.568s CPU time.
Jan 31 03:57:35 np0005603623 systemd-machined[194379]: Machine qemu-86-instance-000000b7 terminated.
Jan 31 03:57:35 np0005603623 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[313848]: [NOTICE]   (313852) : haproxy version is 2.8.14-c23fe91
Jan 31 03:57:35 np0005603623 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[313848]: [NOTICE]   (313852) : path to executable is /usr/sbin/haproxy
Jan 31 03:57:35 np0005603623 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[313848]: [WARNING]  (313852) : Exiting Master process...
Jan 31 03:57:35 np0005603623 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[313848]: [ALERT]    (313852) : Current worker (313854) exited with code 143 (Terminated)
Jan 31 03:57:35 np0005603623 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[313848]: [WARNING]  (313852) : All workers exited. Exiting... (0)
Jan 31 03:57:35 np0005603623 systemd[1]: libpod-a3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90.scope: Deactivated successfully.
Jan 31 03:57:35 np0005603623 podman[314345]: 2026-01-31 08:57:35.827135715 +0000 UTC m=+0.040292576 container died a3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:57:35 np0005603623 systemd[1]: var-lib-containers-storage-overlay-df7b33469585a7da0bee20dbc2be208246ee5410d131bdadb76c3a7b450f221d-merged.mount: Deactivated successfully.
Jan 31 03:57:35 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90-userdata-shm.mount: Deactivated successfully.
Jan 31 03:57:35 np0005603623 podman[314345]: 2026-01-31 08:57:35.858322043 +0000 UTC m=+0.071478884 container cleanup a3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:57:35 np0005603623 systemd[1]: libpod-conmon-a3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90.scope: Deactivated successfully.
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.882 226239 INFO nova.virt.libvirt.driver [-] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Instance destroyed successfully.#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.883 226239 DEBUG nova.objects.instance [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'resources' on Instance uuid b23e0349-42a8-41d0-9eea-0407b7ffa806 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.898 226239 DEBUG nova.virt.libvirt.vif [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:56:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-2065541187',display_name='tempest-AttachVolumeTestJSON-server-2065541187',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-2065541187',id=183,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUL5cEcJVRlCiMbz2iSvyLHRRbwondnCB0J5PuEgcXNy+3njwhMoe7I/EWgOT4I7wc9pdVLAB8zhu2jDR6J0fzH2TLS8K30f9hMVWgCCgCcK+C+JzBwM0cKlSwqRiExQA==',key_name='tempest-keypair-717536484',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:56:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ab2d642eb03c4bda84a9a23e86f1fa4d',ramdisk_id='',reservation_id='r-wlcyw370',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-1437067745',owner_user_name='tempest-AttachVolumeTestJSON-1437067745-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:56:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53804fd0f3a14f95a4955e3bc6dcc8cb',uuid=b23e0349-42a8-41d0-9eea-0407b7ffa806,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "address": "fa:16:3e:95:38:a3", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcb7b61b-e5", "ovs_interfaceid": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.899 226239 DEBUG nova.network.os_vif_util [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converting VIF {"id": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "address": "fa:16:3e:95:38:a3", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcb7b61b-e5", "ovs_interfaceid": "bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.900 226239 DEBUG nova.network.os_vif_util [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:38:a3,bridge_name='br-int',has_traffic_filtering=True,id=bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcb7b61b-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.900 226239 DEBUG os_vif [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:38:a3,bridge_name='br-int',has_traffic_filtering=True,id=bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcb7b61b-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.902 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.902 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbcb7b61b-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.905 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.906 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.909 226239 INFO os_vif [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:38:a3,bridge_name='br-int',has_traffic_filtering=True,id=bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcb7b61b-e5')#033[00m
Jan 31 03:57:35 np0005603623 podman[314375]: 2026-01-31 08:57:35.914542996 +0000 UTC m=+0.038196799 container remove a3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.919 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[018927da-fedc-4e9e-b724-cea50aa48166]: (4, ('Sat Jan 31 08:57:35 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d (a3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90)\na3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90\nSat Jan 31 08:57:35 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d (a3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90)\na3512e38e7f3f711f7589f2871359c1fd928341bc410fa942099ba5a59daee90\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.921 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e478fd38-a737-4c71-bb97-9b7cf01fab2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.922 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1b5345e-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:35 np0005603623 kernel: tapa1b5345e-f0: left promiscuous mode
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.928 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c55eb5d6-7d60-45f4-aadf-bd2a3a820fe8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:35 np0005603623 nova_compute[226235]: 2026-01-31 08:57:35.930 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.944 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bba768c4-d849-40e7-b7f3-55965184515b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.946 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1f9f3479-1497-4d8f-8a94-cfb5c7779fbc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.958 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[05e58652-57b1-4f55-bcc6-1bbbe0c7c4db]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 895263, 'reachable_time': 27870, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 314419, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.961 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:57:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:35.961 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[d83b6cc7-d27d-4ff3-9377-fdc6ec8e18ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:35 np0005603623 systemd[1]: run-netns-ovnmeta\x2da1b5345e\x2df6dc\x2d4309\x2db059\x2d80678428d42d.mount: Deactivated successfully.
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.209 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.230 226239 DEBUG nova.compute.manager [req-a2d77c8b-3723-41e9-a87d-fa89d3127d2b req-b5874349-7b59-4893-adb8-f8c8995c6b24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Received event network-vif-unplugged-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.231 226239 DEBUG oslo_concurrency.lockutils [req-a2d77c8b-3723-41e9-a87d-fa89d3127d2b req-b5874349-7b59-4893-adb8-f8c8995c6b24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.231 226239 DEBUG oslo_concurrency.lockutils [req-a2d77c8b-3723-41e9-a87d-fa89d3127d2b req-b5874349-7b59-4893-adb8-f8c8995c6b24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.232 226239 DEBUG oslo_concurrency.lockutils [req-a2d77c8b-3723-41e9-a87d-fa89d3127d2b req-b5874349-7b59-4893-adb8-f8c8995c6b24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.232 226239 DEBUG nova.compute.manager [req-a2d77c8b-3723-41e9-a87d-fa89d3127d2b req-b5874349-7b59-4893-adb8-f8c8995c6b24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] No waiting events found dispatching network-vif-unplugged-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.233 226239 DEBUG nova.compute.manager [req-a2d77c8b-3723-41e9-a87d-fa89d3127d2b req-b5874349-7b59-4893-adb8-f8c8995c6b24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Received event network-vif-unplugged-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.361 226239 INFO nova.virt.libvirt.driver [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Deleting instance files /var/lib/nova/instances/b23e0349-42a8-41d0-9eea-0407b7ffa806_del#033[00m
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.363 226239 INFO nova.virt.libvirt.driver [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Deletion of /var/lib/nova/instances/b23e0349-42a8-41d0-9eea-0407b7ffa806_del complete#033[00m
Jan 31 03:57:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.449 226239 INFO nova.compute.manager [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Took 0.79 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.450 226239 DEBUG oslo.service.loopingcall [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.451 226239 DEBUG nova.compute.manager [-] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:57:36 np0005603623 nova_compute[226235]: 2026-01-31 08:57:36.451 226239 DEBUG nova.network.neutron [-] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:57:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:36.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:36.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:38 np0005603623 nova_compute[226235]: 2026-01-31 08:57:38.450 226239 DEBUG nova.compute.manager [req-c4b0f831-47ed-4841-ab79-f3978008c148 req-31dd8892-4af3-4037-af71-a24534af5335 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Received event network-vif-plugged-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:38 np0005603623 nova_compute[226235]: 2026-01-31 08:57:38.450 226239 DEBUG oslo_concurrency.lockutils [req-c4b0f831-47ed-4841-ab79-f3978008c148 req-31dd8892-4af3-4037-af71-a24534af5335 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:38 np0005603623 nova_compute[226235]: 2026-01-31 08:57:38.451 226239 DEBUG oslo_concurrency.lockutils [req-c4b0f831-47ed-4841-ab79-f3978008c148 req-31dd8892-4af3-4037-af71-a24534af5335 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:38 np0005603623 nova_compute[226235]: 2026-01-31 08:57:38.451 226239 DEBUG oslo_concurrency.lockutils [req-c4b0f831-47ed-4841-ab79-f3978008c148 req-31dd8892-4af3-4037-af71-a24534af5335 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:38 np0005603623 nova_compute[226235]: 2026-01-31 08:57:38.451 226239 DEBUG nova.compute.manager [req-c4b0f831-47ed-4841-ab79-f3978008c148 req-31dd8892-4af3-4037-af71-a24534af5335 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] No waiting events found dispatching network-vif-plugged-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:57:38 np0005603623 nova_compute[226235]: 2026-01-31 08:57:38.451 226239 WARNING nova.compute.manager [req-c4b0f831-47ed-4841-ab79-f3978008c148 req-31dd8892-4af3-4037-af71-a24534af5335 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Received unexpected event network-vif-plugged-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:57:38 np0005603623 nova_compute[226235]: 2026-01-31 08:57:38.509 226239 DEBUG nova.network.neutron [-] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:38 np0005603623 nova_compute[226235]: 2026-01-31 08:57:38.573 226239 INFO nova.compute.manager [-] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Took 2.12 seconds to deallocate network for instance.#033[00m
Jan 31 03:57:38 np0005603623 nova_compute[226235]: 2026-01-31 08:57:38.644 226239 DEBUG oslo_concurrency.lockutils [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:38 np0005603623 nova_compute[226235]: 2026-01-31 08:57:38.645 226239 DEBUG oslo_concurrency.lockutils [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:38 np0005603623 nova_compute[226235]: 2026-01-31 08:57:38.685 226239 DEBUG nova.compute.manager [req-834bf98a-6c22-4ab0-a010-9962cad003b3 req-e3ec6821-b0ec-4b21-9b13-71ea3367d723 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Received event network-vif-deleted-bcb7b61b-e51b-4bfe-a5a5-e01d818c0e2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:38 np0005603623 nova_compute[226235]: 2026-01-31 08:57:38.759 226239 DEBUG oslo_concurrency.processutils [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:38.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:38.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:57:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3463132735' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:57:39 np0005603623 nova_compute[226235]: 2026-01-31 08:57:39.168 226239 DEBUG oslo_concurrency.processutils [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:39 np0005603623 nova_compute[226235]: 2026-01-31 08:57:39.173 226239 DEBUG nova.compute.provider_tree [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:57:39 np0005603623 nova_compute[226235]: 2026-01-31 08:57:39.195 226239 DEBUG nova.scheduler.client.report [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:57:39 np0005603623 nova_compute[226235]: 2026-01-31 08:57:39.228 226239 DEBUG oslo_concurrency.lockutils [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:39 np0005603623 nova_compute[226235]: 2026-01-31 08:57:39.494 226239 INFO nova.scheduler.client.report [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Deleted allocations for instance b23e0349-42a8-41d0-9eea-0407b7ffa806#033[00m
Jan 31 03:57:39 np0005603623 nova_compute[226235]: 2026-01-31 08:57:39.587 226239 DEBUG oslo_concurrency.lockutils [None req-20a032fc-5f7b-49e8-9381-dacf8697977f 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b23e0349-42a8-41d0-9eea-0407b7ffa806" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:40.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:57:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:40.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:57:40 np0005603623 nova_compute[226235]: 2026-01-31 08:57:40.906 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:41 np0005603623 nova_compute[226235]: 2026-01-31 08:57:41.211 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:57:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:42.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:57:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:42.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:42 np0005603623 podman[314497]: 2026-01-31 08:57:42.965002115 +0000 UTC m=+0.051447285 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 03:57:42 np0005603623 podman[314498]: 2026-01-31 08:57:42.982717221 +0000 UTC m=+0.068772949 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:57:43 np0005603623 nova_compute[226235]: 2026-01-31 08:57:43.307 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:43 np0005603623 nova_compute[226235]: 2026-01-31 08:57:43.307 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:43 np0005603623 nova_compute[226235]: 2026-01-31 08:57:43.338 226239 DEBUG nova.compute.manager [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 03:57:43 np0005603623 nova_compute[226235]: 2026-01-31 08:57:43.704 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:57:43 np0005603623 nova_compute[226235]: 2026-01-31 08:57:43.704 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:57:43 np0005603623 nova_compute[226235]: 2026-01-31 08:57:43.713 226239 DEBUG nova.virt.hardware [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:57:43 np0005603623 nova_compute[226235]: 2026-01-31 08:57:43.714 226239 INFO nova.compute.claims [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Claim successful on node compute-2.ctlplane.example.com
Jan 31 03:57:43 np0005603623 nova_compute[226235]: 2026-01-31 08:57:43.895 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:57:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:57:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3775677690' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.353 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.358 226239 DEBUG nova.compute.provider_tree [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.376 226239 DEBUG nova.scheduler.client.report [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.405 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.406 226239 DEBUG nova.compute.manager [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.485 226239 DEBUG nova.compute.manager [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.486 226239 DEBUG nova.network.neutron [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.522 226239 INFO nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.567 226239 DEBUG nova.compute.manager [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:57:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:57:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2098857400' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:57:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:57:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2098857400' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:57:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:44.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.803 226239 DEBUG nova.compute.manager [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.804 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.804 226239 INFO nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Creating image(s)
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.825 226239 DEBUG nova.storage.rbd_utils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.847 226239 DEBUG nova.storage.rbd_utils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.868 226239 DEBUG nova.storage.rbd_utils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:57:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:44.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.872 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.922 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.922 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.923 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.923 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.942 226239 DEBUG nova.storage.rbd_utils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:57:44 np0005603623 nova_compute[226235]: 2026-01-31 08:57:44.945 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:57:45 np0005603623 nova_compute[226235]: 2026-01-31 08:57:45.206 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.261s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:57:45 np0005603623 nova_compute[226235]: 2026-01-31 08:57:45.305 226239 DEBUG nova.storage.rbd_utils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] resizing rbd image b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:57:45 np0005603623 nova_compute[226235]: 2026-01-31 08:57:45.405 226239 DEBUG nova.objects.instance [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lazy-loading 'migration_context' on Instance uuid b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:57:45 np0005603623 nova_compute[226235]: 2026-01-31 08:57:45.427 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:57:45 np0005603623 nova_compute[226235]: 2026-01-31 08:57:45.428 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Ensure instance console log exists: /var/lib/nova/instances/b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:57:45 np0005603623 nova_compute[226235]: 2026-01-31 08:57:45.428 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:57:45 np0005603623 nova_compute[226235]: 2026-01-31 08:57:45.429 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:57:45 np0005603623 nova_compute[226235]: 2026-01-31 08:57:45.429 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:57:45 np0005603623 nova_compute[226235]: 2026-01-31 08:57:45.910 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:46 np0005603623 nova_compute[226235]: 2026-01-31 08:57:46.262 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:57:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3720910465' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:57:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:57:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3720910465' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:57:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:46 np0005603623 nova_compute[226235]: 2026-01-31 08:57:46.466 226239 DEBUG nova.network.neutron [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Successfully created port: e6ed4c7b-198a-42b9-bcf7-79fcae00e769 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:57:46 np0005603623 nova_compute[226235]: 2026-01-31 08:57:46.715 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:57:46 np0005603623 nova_compute[226235]: 2026-01-31 08:57:46.715 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:57:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:46.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:46 np0005603623 nova_compute[226235]: 2026-01-31 08:57:46.851 226239 DEBUG nova.compute.manager [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 03:57:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:57:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:46.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.022 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.022 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.028 226239 DEBUG nova.virt.hardware [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.028 226239 INFO nova.compute.claims [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Claim successful on node compute-2.ctlplane.example.com
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.278 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:57:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:57:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1993068141' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.699 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.704 226239 DEBUG nova.compute.provider_tree [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.724 226239 DEBUG nova.scheduler.client.report [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.765 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.766 226239 DEBUG nova.compute.manager [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.834 226239 DEBUG nova.compute.manager [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.835 226239 DEBUG nova.network.neutron [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.874 226239 INFO nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.908 226239 DEBUG nova.compute.manager [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.916 226239 DEBUG nova.network.neutron [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Successfully updated port: e6ed4c7b-198a-42b9-bcf7-79fcae00e769 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.956 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "refresh_cache-b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.956 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquired lock "refresh_cache-b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:57:47 np0005603623 nova_compute[226235]: 2026-01-31 08:57:47.956 226239 DEBUG nova.network.neutron [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.041 226239 DEBUG nova.compute.manager [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.043 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.043 226239 INFO nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Creating image(s)
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.067 226239 DEBUG nova.storage.rbd_utils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.094 226239 DEBUG nova.storage.rbd_utils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.120 226239 DEBUG nova.storage.rbd_utils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.123 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.168 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.169 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.170 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.170 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.197 226239 DEBUG nova.storage.rbd_utils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.201 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.221 226239 DEBUG nova.policy [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd442c7ba12ed444ca6d4dcc5cfd36150', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.497 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.296s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.576 226239 DEBUG nova.storage.rbd_utils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] resizing rbd image 2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.700 226239 DEBUG nova.network.neutron [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.707 226239 DEBUG nova.objects.instance [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'migration_context' on Instance uuid 2450f89f-bcd8-4bab-8fbd-ae73ab968552 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.742 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.742 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Ensure instance console log exists: /var/lib/nova/instances/2450f89f-bcd8-4bab-8fbd-ae73ab968552/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.743 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.743 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:48 np0005603623 nova_compute[226235]: 2026-01-31 08:57:48.743 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:48.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:57:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:48.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:57:49 np0005603623 nova_compute[226235]: 2026-01-31 08:57:49.853 226239 DEBUG nova.network.neutron [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Successfully created port: 7f888fb7-e22f-4012-8f50-9248df3a9eac _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.047 226239 DEBUG nova.compute.manager [req-8cbb3557-7d85-4b3a-b0aa-1fd0a99724e0 req-10dabd9b-6484-40fc-8170-9f67567786c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Received event network-changed-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.047 226239 DEBUG nova.compute.manager [req-8cbb3557-7d85-4b3a-b0aa-1fd0a99724e0 req-10dabd9b-6484-40fc-8170-9f67567786c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Refreshing instance network info cache due to event network-changed-e6ed4c7b-198a-42b9-bcf7-79fcae00e769. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.047 226239 DEBUG oslo_concurrency.lockutils [req-8cbb3557-7d85-4b3a-b0aa-1fd0a99724e0 req-10dabd9b-6484-40fc-8170-9f67567786c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.179 226239 DEBUG nova.network.neutron [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Updating instance_info_cache with network_info: [{"id": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "address": "fa:16:3e:5d:5e:b9", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6ed4c7b-19", "ovs_interfaceid": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.206 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Releasing lock "refresh_cache-b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.206 226239 DEBUG nova.compute.manager [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Instance network_info: |[{"id": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "address": "fa:16:3e:5d:5e:b9", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6ed4c7b-19", "ovs_interfaceid": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.206 226239 DEBUG oslo_concurrency.lockutils [req-8cbb3557-7d85-4b3a-b0aa-1fd0a99724e0 req-10dabd9b-6484-40fc-8170-9f67567786c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.207 226239 DEBUG nova.network.neutron [req-8cbb3557-7d85-4b3a-b0aa-1fd0a99724e0 req-10dabd9b-6484-40fc-8170-9f67567786c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Refreshing network info cache for port e6ed4c7b-198a-42b9-bcf7-79fcae00e769 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.209 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Start _get_guest_xml network_info=[{"id": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "address": "fa:16:3e:5d:5e:b9", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6ed4c7b-19", "ovs_interfaceid": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.212 226239 WARNING nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.217 226239 DEBUG nova.virt.libvirt.host [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.218 226239 DEBUG nova.virt.libvirt.host [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.220 226239 DEBUG nova.virt.libvirt.host [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.221 226239 DEBUG nova.virt.libvirt.host [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.222 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.222 226239 DEBUG nova.virt.hardware [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.222 226239 DEBUG nova.virt.hardware [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.222 226239 DEBUG nova.virt.hardware [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.223 226239 DEBUG nova.virt.hardware [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.223 226239 DEBUG nova.virt.hardware [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.223 226239 DEBUG nova.virt.hardware [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.223 226239 DEBUG nova.virt.hardware [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.223 226239 DEBUG nova.virt.hardware [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.224 226239 DEBUG nova.virt.hardware [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.224 226239 DEBUG nova.virt.hardware [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.224 226239 DEBUG nova.virt.hardware [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.227 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:57:50 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2708497056' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.658 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.689 226239 DEBUG nova.storage.rbd_utils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.693 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:50.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:50.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.881 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849855.8811924, b23e0349-42a8-41d0-9eea-0407b7ffa806 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.882 226239 INFO nova.compute.manager [-] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.912 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:50 np0005603623 nova_compute[226235]: 2026-01-31 08:57:50.917 226239 DEBUG nova.compute.manager [None req-5321aea7-8ca0-47b5-bc98-17049c7c1b14 - - - - - -] [instance: b23e0349-42a8-41d0-9eea-0407b7ffa806] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:57:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:57:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/68729806' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.143 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.144 226239 DEBUG nova.virt.libvirt.vif [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:57:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1191175908',display_name='tempest-TestServerMultinode-server-1191175908',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1191175908',id=186,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c37e7d6d634448bfb3172894ad2af105',ramdisk_id='',reservation_id='r-gylvl88b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-893388561',owner_user_name='tempest-TestServerMultinode-8
93388561-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:57:44Z,user_data=None,user_id='4e364ad937544559bea978006e9ff229',uuid=b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "address": "fa:16:3e:5d:5e:b9", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6ed4c7b-19", "ovs_interfaceid": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.145 226239 DEBUG nova.network.os_vif_util [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Converting VIF {"id": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "address": "fa:16:3e:5d:5e:b9", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6ed4c7b-19", "ovs_interfaceid": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.145 226239 DEBUG nova.network.os_vif_util [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:5e:b9,bridge_name='br-int',has_traffic_filtering=True,id=e6ed4c7b-198a-42b9-bcf7-79fcae00e769,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6ed4c7b-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.146 226239 DEBUG nova.objects.instance [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lazy-loading 'pci_devices' on Instance uuid b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.166 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  <uuid>b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f</uuid>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  <name>instance-000000ba</name>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestServerMultinode-server-1191175908</nova:name>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:57:50</nova:creationTime>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <nova:user uuid="4e364ad937544559bea978006e9ff229">tempest-TestServerMultinode-893388561-project-admin</nova:user>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <nova:project uuid="c37e7d6d634448bfb3172894ad2af105">tempest-TestServerMultinode-893388561</nova:project>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <nova:port uuid="e6ed4c7b-198a-42b9-bcf7-79fcae00e769">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <entry name="serial">b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f</entry>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <entry name="uuid">b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f</entry>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk.config">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:5d:5e:b9"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <target dev="tape6ed4c7b-19"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f/console.log" append="off"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:57:51 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:57:51 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:57:51 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:57:51 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.167 226239 DEBUG nova.compute.manager [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Preparing to wait for external event network-vif-plugged-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.168 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.169 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.170 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.173 226239 DEBUG nova.virt.libvirt.vif [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:57:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1191175908',display_name='tempest-TestServerMultinode-server-1191175908',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1191175908',id=186,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c37e7d6d634448bfb3172894ad2af105',ramdisk_id='',reservation_id='r-gylvl88b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-893388561',owner_user_name='tempest-TestServerMultinode-893388561-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:57:44Z,user_data=None,user_id='4e364ad937544559bea978006e9ff229',uuid=b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "address": "fa:16:3e:5d:5e:b9", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6ed4c7b-19", "ovs_interfaceid": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.174 226239 DEBUG nova.network.os_vif_util [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Converting VIF {"id": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "address": "fa:16:3e:5d:5e:b9", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6ed4c7b-19", "ovs_interfaceid": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.175 226239 DEBUG nova.network.os_vif_util [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:5e:b9,bridge_name='br-int',has_traffic_filtering=True,id=e6ed4c7b-198a-42b9-bcf7-79fcae00e769,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6ed4c7b-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.177 226239 DEBUG os_vif [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:5e:b9,bridge_name='br-int',has_traffic_filtering=True,id=e6ed4c7b-198a-42b9-bcf7-79fcae00e769,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6ed4c7b-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.180 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.181 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.182 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.185 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.185 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape6ed4c7b-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.186 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape6ed4c7b-19, col_values=(('external_ids', {'iface-id': 'e6ed4c7b-198a-42b9-bcf7-79fcae00e769', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:5e:b9', 'vm-uuid': 'b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:51 np0005603623 NetworkManager[48970]: <info>  [1769849871.1882] manager: (tape6ed4c7b-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/360)
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.190 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.191 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.192 226239 INFO os_vif [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:5e:b9,bridge_name='br-int',has_traffic_filtering=True,id=e6ed4c7b-198a-42b9-bcf7-79fcae00e769,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6ed4c7b-19')#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.263 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.274 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.274 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.274 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] No VIF found with MAC fa:16:3e:5d:5e:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.275 226239 INFO nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Using config drive#033[00m
Jan 31 03:57:51 np0005603623 nova_compute[226235]: 2026-01-31 08:57:51.297 226239 DEBUG nova.storage.rbd_utils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:57:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:52 np0005603623 nova_compute[226235]: 2026-01-31 08:57:52.141 226239 INFO nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Creating config drive at /var/lib/nova/instances/b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f/disk.config#033[00m
Jan 31 03:57:52 np0005603623 nova_compute[226235]: 2026-01-31 08:57:52.145 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpay54l33_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:52 np0005603623 nova_compute[226235]: 2026-01-31 08:57:52.209 226239 DEBUG nova.network.neutron [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Successfully updated port: 7f888fb7-e22f-4012-8f50-9248df3a9eac _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:57:52 np0005603623 nova_compute[226235]: 2026-01-31 08:57:52.231 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "refresh_cache-2450f89f-bcd8-4bab-8fbd-ae73ab968552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:52 np0005603623 nova_compute[226235]: 2026-01-31 08:57:52.232 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquired lock "refresh_cache-2450f89f-bcd8-4bab-8fbd-ae73ab968552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:52 np0005603623 nova_compute[226235]: 2026-01-31 08:57:52.232 226239 DEBUG nova.network.neutron [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:57:52 np0005603623 nova_compute[226235]: 2026-01-31 08:57:52.275 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpay54l33_" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:52 np0005603623 nova_compute[226235]: 2026-01-31 08:57:52.398 226239 DEBUG nova.storage.rbd_utils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:57:52 np0005603623 nova_compute[226235]: 2026-01-31 08:57:52.402 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f/disk.config b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:52.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:52 np0005603623 nova_compute[226235]: 2026-01-31 08:57:52.852 226239 DEBUG nova.compute.manager [req-08864d69-4fad-454e-94b9-33fc5ffad587 req-154f4932-185b-473f-9957-0da6f25deb85 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Received event network-changed-7f888fb7-e22f-4012-8f50-9248df3a9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:52 np0005603623 nova_compute[226235]: 2026-01-31 08:57:52.853 226239 DEBUG nova.compute.manager [req-08864d69-4fad-454e-94b9-33fc5ffad587 req-154f4932-185b-473f-9957-0da6f25deb85 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Refreshing instance network info cache due to event network-changed-7f888fb7-e22f-4012-8f50-9248df3a9eac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:57:52 np0005603623 nova_compute[226235]: 2026-01-31 08:57:52.854 226239 DEBUG oslo_concurrency.lockutils [req-08864d69-4fad-454e-94b9-33fc5ffad587 req-154f4932-185b-473f-9957-0da6f25deb85 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-2450f89f-bcd8-4bab-8fbd-ae73ab968552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:57:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:52.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.024 226239 DEBUG nova.network.neutron [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.144 226239 DEBUG oslo_concurrency.processutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f/disk.config b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.742s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.145 226239 INFO nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Deleting local config drive /var/lib/nova/instances/b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f/disk.config because it was imported into RBD.#033[00m
Jan 31 03:57:53 np0005603623 kernel: tape6ed4c7b-19: entered promiscuous mode
Jan 31 03:57:53 np0005603623 NetworkManager[48970]: <info>  [1769849873.1844] manager: (tape6ed4c7b-19): new Tun device (/org/freedesktop/NetworkManager/Devices/361)
Jan 31 03:57:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:53Z|00767|binding|INFO|Claiming lport e6ed4c7b-198a-42b9-bcf7-79fcae00e769 for this chassis.
Jan 31 03:57:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:53Z|00768|binding|INFO|e6ed4c7b-198a-42b9-bcf7-79fcae00e769: Claiming fa:16:3e:5d:5e:b9 10.100.0.10
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.185 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:53Z|00769|binding|INFO|Setting lport e6ed4c7b-198a-42b9-bcf7-79fcae00e769 ovn-installed in OVS
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.194 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.196 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:53Z|00770|binding|INFO|Setting lport e6ed4c7b-198a-42b9-bcf7-79fcae00e769 up in Southbound
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.197 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:5e:b9 10.100.0.10'], port_security=['fa:16:3e:5d:5e:b9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c37e7d6d634448bfb3172894ad2af105', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd98fdedc-7ec4-4678-86fd-333fbe96f77f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f5a6fc0-3df3-4c2f-84cd-adc2af316a8e, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=e6ed4c7b-198a-42b9-bcf7-79fcae00e769) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.198 143258 INFO neutron.agent.ovn.metadata.agent [-] Port e6ed4c7b-198a-42b9-bcf7-79fcae00e769 in datapath b9195012-fef1-4e17-acdd-2b9ffc979da0 bound to our chassis#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.200 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b9195012-fef1-4e17-acdd-2b9ffc979da0#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.208 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e126b7f3-ab02-471b-ae05-e5f538fd51aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 systemd-machined[194379]: New machine qemu-87-instance-000000ba.
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.210 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb9195012-f1 in ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:57:53 np0005603623 systemd-udevd[315058]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.212 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb9195012-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.212 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[516dd15e-35c4-43dd-b0bd-ebd5511a081a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.213 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[62fc6344-0c23-4fdc-8fef-98d2b635ca98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 NetworkManager[48970]: <info>  [1769849873.2206] device (tape6ed4c7b-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:57:53 np0005603623 NetworkManager[48970]: <info>  [1769849873.2212] device (tape6ed4c7b-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:57:53 np0005603623 systemd[1]: Started Virtual Machine qemu-87-instance-000000ba.
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.220 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[172a024d-2559-43e3-910a-23d8f2e00873]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.230 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ddbf3eac-c1ef-43b4-b6a2-6b4ae63f3ddd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.253 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[304b6f99-d164-41e7-a135-a22c61a8cfda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.257 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[76b3c888-9602-41b0-ab61-5216fdfbc3cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 NetworkManager[48970]: <info>  [1769849873.2581] manager: (tapb9195012-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/362)
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.280 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[573b5d9c-9170-44cf-bf4e-7e2d393575ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.286 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[41341f84-1990-411d-8f87-48d2e8d468ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 NetworkManager[48970]: <info>  [1769849873.3055] device (tapb9195012-f0): carrier: link connected
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.312 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ef540e92-edcb-458e-9d28-982cb70ebf2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.325 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4daa31c8-8be8-41ab-ac27-0dc7b18c47f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9195012-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:f5:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 901373, 'reachable_time': 19534, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315090, 'error': None, 'target': 'ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.335 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8103458e-d371-4836-a416-f78c101b4936]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:f5e8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 901373, 'tstamp': 901373}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315091, 'error': None, 'target': 'ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.346 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d457573a-e790-407c-8cc7-afebdb2d857a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9195012-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:f5:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 901373, 'reachable_time': 19534, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 315092, 'error': None, 'target': 'ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.360 226239 DEBUG nova.network.neutron [req-8cbb3557-7d85-4b3a-b0aa-1fd0a99724e0 req-10dabd9b-6484-40fc-8170-9f67567786c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Updated VIF entry in instance network info cache for port e6ed4c7b-198a-42b9-bcf7-79fcae00e769. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.361 226239 DEBUG nova.network.neutron [req-8cbb3557-7d85-4b3a-b0aa-1fd0a99724e0 req-10dabd9b-6484-40fc-8170-9f67567786c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Updating instance_info_cache with network_info: [{"id": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "address": "fa:16:3e:5d:5e:b9", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6ed4c7b-19", "ovs_interfaceid": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.365 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8a2edd79-659d-4ff2-8abc-af99713d6574]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.381 226239 DEBUG oslo_concurrency.lockutils [req-8cbb3557-7d85-4b3a-b0aa-1fd0a99724e0 req-10dabd9b-6484-40fc-8170-9f67567786c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.402 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[508b41a7-5265-436d-aa87-9543a5a5b2b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.403 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9195012-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.403 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.404 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb9195012-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:53 np0005603623 NetworkManager[48970]: <info>  [1769849873.4061] manager: (tapb9195012-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/363)
Jan 31 03:57:53 np0005603623 kernel: tapb9195012-f0: entered promiscuous mode
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.408 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb9195012-f0, col_values=(('external_ids', {'iface-id': '1553dad0-d27d-4162-94ad-0b8a3a359f3a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:53 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:53Z|00771|binding|INFO|Releasing lport 1553dad0-d27d-4162-94ad-0b8a3a359f3a from this chassis (sb_readonly=0)
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.416 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.420 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.421 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b9195012-fef1-4e17-acdd-2b9ffc979da0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b9195012-fef1-4e17-acdd-2b9ffc979da0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.421 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[651891f1-21d0-405b-b9e3-f6ab0558c24c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.422 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-b9195012-fef1-4e17-acdd-2b9ffc979da0
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/b9195012-fef1-4e17-acdd-2b9ffc979da0.pid.haproxy
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID b9195012-fef1-4e17-acdd-2b9ffc979da0
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:57:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:53.422 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'env', 'PROCESS_TAG=haproxy-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b9195012-fef1-4e17-acdd-2b9ffc979da0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:57:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:57:53 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3496300286' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:57:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:57:53 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3496300286' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.684 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849873.6833942, b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.684 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] VM Started (Lifecycle Event)#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.708 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.713 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849873.6839397, b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.713 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.753 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.756 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:57:53 np0005603623 nova_compute[226235]: 2026-01-31 08:57:53.786 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:57:53 np0005603623 podman[315166]: 2026-01-31 08:57:53.702375763 +0000 UTC m=+0.021936859 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:57:53 np0005603623 podman[315166]: 2026-01-31 08:57:53.921588279 +0000 UTC m=+0.241149355 container create 3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:57:54 np0005603623 systemd[1]: Started libpod-conmon-3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61.scope.
Jan 31 03:57:54 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:57:54 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f1208a0b4f1658f77d736df263a91908dcba98e8eaa68a0c5f514a17ca2a0bd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:57:54 np0005603623 podman[315166]: 2026-01-31 08:57:54.063026847 +0000 UTC m=+0.382587943 container init 3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:57:54 np0005603623 podman[315166]: 2026-01-31 08:57:54.066658231 +0000 UTC m=+0.386219317 container start 3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 03:57:54 np0005603623 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[315182]: [NOTICE]   (315186) : New worker (315188) forked
Jan 31 03:57:54 np0005603623 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[315182]: [NOTICE]   (315186) : Loading success.
Jan 31 03:57:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:54.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:54.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:54 np0005603623 nova_compute[226235]: 2026-01-31 08:57:54.993 226239 DEBUG nova.network.neutron [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Updating instance_info_cache with network_info: [{"id": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "address": "fa:16:3e:06:b6:50", "network": {"id": "80db08a4-2d34-453a-a239-a7ada660bee1", "bridge": "br-int", "label": "tempest-network-smoke--1433603680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f888fb7-e2", "ovs_interfaceid": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.010 226239 DEBUG nova.compute.manager [req-f3b61bd7-a460-4f87-84f7-f86282deb47d req-049e25d2-c56d-4a1e-83c1-919eaa59989d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Received event network-vif-plugged-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.010 226239 DEBUG oslo_concurrency.lockutils [req-f3b61bd7-a460-4f87-84f7-f86282deb47d req-049e25d2-c56d-4a1e-83c1-919eaa59989d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.010 226239 DEBUG oslo_concurrency.lockutils [req-f3b61bd7-a460-4f87-84f7-f86282deb47d req-049e25d2-c56d-4a1e-83c1-919eaa59989d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.011 226239 DEBUG oslo_concurrency.lockutils [req-f3b61bd7-a460-4f87-84f7-f86282deb47d req-049e25d2-c56d-4a1e-83c1-919eaa59989d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.011 226239 DEBUG nova.compute.manager [req-f3b61bd7-a460-4f87-84f7-f86282deb47d req-049e25d2-c56d-4a1e-83c1-919eaa59989d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Processing event network-vif-plugged-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.011 226239 DEBUG nova.compute.manager [req-f3b61bd7-a460-4f87-84f7-f86282deb47d req-049e25d2-c56d-4a1e-83c1-919eaa59989d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Received event network-vif-plugged-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.011 226239 DEBUG oslo_concurrency.lockutils [req-f3b61bd7-a460-4f87-84f7-f86282deb47d req-049e25d2-c56d-4a1e-83c1-919eaa59989d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.011 226239 DEBUG oslo_concurrency.lockutils [req-f3b61bd7-a460-4f87-84f7-f86282deb47d req-049e25d2-c56d-4a1e-83c1-919eaa59989d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.012 226239 DEBUG oslo_concurrency.lockutils [req-f3b61bd7-a460-4f87-84f7-f86282deb47d req-049e25d2-c56d-4a1e-83c1-919eaa59989d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.012 226239 DEBUG nova.compute.manager [req-f3b61bd7-a460-4f87-84f7-f86282deb47d req-049e25d2-c56d-4a1e-83c1-919eaa59989d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] No waiting events found dispatching network-vif-plugged-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.012 226239 WARNING nova.compute.manager [req-f3b61bd7-a460-4f87-84f7-f86282deb47d req-049e25d2-c56d-4a1e-83c1-919eaa59989d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Received unexpected event network-vif-plugged-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.012 226239 DEBUG nova.compute.manager [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.016 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849875.015781, b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.016 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.018 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.021 226239 INFO nova.virt.libvirt.driver [-] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Instance spawned successfully.#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.022 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.060 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Releasing lock "refresh_cache-2450f89f-bcd8-4bab-8fbd-ae73ab968552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.060 226239 DEBUG nova.compute.manager [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Instance network_info: |[{"id": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "address": "fa:16:3e:06:b6:50", "network": {"id": "80db08a4-2d34-453a-a239-a7ada660bee1", "bridge": "br-int", "label": "tempest-network-smoke--1433603680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f888fb7-e2", "ovs_interfaceid": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.061 226239 DEBUG oslo_concurrency.lockutils [req-08864d69-4fad-454e-94b9-33fc5ffad587 req-154f4932-185b-473f-9957-0da6f25deb85 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-2450f89f-bcd8-4bab-8fbd-ae73ab968552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.061 226239 DEBUG nova.network.neutron [req-08864d69-4fad-454e-94b9-33fc5ffad587 req-154f4932-185b-473f-9957-0da6f25deb85 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Refreshing network info cache for port 7f888fb7-e22f-4012-8f50-9248df3a9eac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.063 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Start _get_guest_xml network_info=[{"id": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "address": "fa:16:3e:06:b6:50", "network": {"id": "80db08a4-2d34-453a-a239-a7ada660bee1", "bridge": "br-int", "label": "tempest-network-smoke--1433603680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f888fb7-e2", "ovs_interfaceid": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.066 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.072 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.075 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.076 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.076 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.077 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.077 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.078 226239 DEBUG nova.virt.libvirt.driver [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.083 226239 WARNING nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.091 226239 DEBUG nova.virt.libvirt.host [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.092 226239 DEBUG nova.virt.libvirt.host [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.097 226239 DEBUG nova.virt.libvirt.host [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.098 226239 DEBUG nova.virt.libvirt.host [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.099 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.100 226239 DEBUG nova.virt.hardware [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.100 226239 DEBUG nova.virt.hardware [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.100 226239 DEBUG nova.virt.hardware [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.101 226239 DEBUG nova.virt.hardware [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.101 226239 DEBUG nova.virt.hardware [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.101 226239 DEBUG nova.virt.hardware [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.101 226239 DEBUG nova.virt.hardware [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.102 226239 DEBUG nova.virt.hardware [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.102 226239 DEBUG nova.virt.hardware [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.102 226239 DEBUG nova.virt.hardware [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.102 226239 DEBUG nova.virt.hardware [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.105 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.131 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.175 226239 INFO nova.compute.manager [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Took 10.37 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.176 226239 DEBUG nova.compute.manager [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.340 226239 INFO nova.compute.manager [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Took 11.66 seconds to build instance.
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.414 226239 DEBUG oslo_concurrency.lockutils [None req-0fce739a-cff4-485a-b13a-06538f90bd4a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:57:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:57:55 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/101534847' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.562 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.596 226239 DEBUG nova.storage.rbd_utils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:57:55 np0005603623 nova_compute[226235]: 2026-01-31 08:57:55.601 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:57:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:57:56 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1044739718' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.034 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.035 226239 DEBUG nova.virt.libvirt.vif [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:57:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-72318832',display_name='tempest-TestNetworkBasicOps-server-72318832',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-72318832',id=187,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBf/CcSGfenjrIJY2a7u0nsdG9tlN4s/UTheBCPSx6l9SRae2pY2uByjcTYgzee1XtVV9LklbY40NCV7PMEvPHBNIU59v4AYC+T6/bE+p78+IC5i8eS8hJXzPuTEiCLLlQ==',key_name='tempest-TestNetworkBasicOps-1980659074',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-sq07ng9b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:57:47Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=2450f89f-bcd8-4bab-8fbd-ae73ab968552,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "address": "fa:16:3e:06:b6:50", "network": {"id": "80db08a4-2d34-453a-a239-a7ada660bee1", "bridge": "br-int", "label": "tempest-network-smoke--1433603680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f888fb7-e2", "ovs_interfaceid": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.036 226239 DEBUG nova.network.os_vif_util [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "address": "fa:16:3e:06:b6:50", "network": {"id": "80db08a4-2d34-453a-a239-a7ada660bee1", "bridge": "br-int", "label": "tempest-network-smoke--1433603680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f888fb7-e2", "ovs_interfaceid": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.037 226239 DEBUG nova.network.os_vif_util [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:b6:50,bridge_name='br-int',has_traffic_filtering=True,id=7f888fb7-e22f-4012-8f50-9248df3a9eac,network=Network(80db08a4-2d34-453a-a239-a7ada660bee1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f888fb7-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.038 226239 DEBUG nova.objects.instance [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2450f89f-bcd8-4bab-8fbd-ae73ab968552 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.057 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  <uuid>2450f89f-bcd8-4bab-8fbd-ae73ab968552</uuid>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  <name>instance-000000bb</name>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestNetworkBasicOps-server-72318832</nova:name>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:57:55</nova:creationTime>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <nova:user uuid="d442c7ba12ed444ca6d4dcc5cfd36150">tempest-TestNetworkBasicOps-104417095-project-member</nova:user>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <nova:project uuid="abf9393aa2b646feb00a3d887a9dee14">tempest-TestNetworkBasicOps-104417095</nova:project>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <nova:port uuid="7f888fb7-e22f-4012-8f50-9248df3a9eac">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <entry name="serial">2450f89f-bcd8-4bab-8fbd-ae73ab968552</entry>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <entry name="uuid">2450f89f-bcd8-4bab-8fbd-ae73ab968552</entry>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk.config">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:06:b6:50"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <target dev="tap7f888fb7-e2"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/2450f89f-bcd8-4bab-8fbd-ae73ab968552/console.log" append="off"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:57:56 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:57:56 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:57:56 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:57:56 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.058 226239 DEBUG nova.compute.manager [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Preparing to wait for external event network-vif-plugged-7f888fb7-e22f-4012-8f50-9248df3a9eac prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.058 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.059 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.059 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.060 226239 DEBUG nova.virt.libvirt.vif [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:57:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-72318832',display_name='tempest-TestNetworkBasicOps-server-72318832',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-72318832',id=187,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBf/CcSGfenjrIJY2a7u0nsdG9tlN4s/UTheBCPSx6l9SRae2pY2uByjcTYgzee1XtVV9LklbY40NCV7PMEvPHBNIU59v4AYC+T6/bE+p78+IC5i8eS8hJXzPuTEiCLLlQ==',key_name='tempest-TestNetworkBasicOps-1980659074',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-sq07ng9b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:57:47Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=2450f89f-bcd8-4bab-8fbd-ae73ab968552,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "address": "fa:16:3e:06:b6:50", "network": {"id": "80db08a4-2d34-453a-a239-a7ada660bee1", "bridge": "br-int", "label": "tempest-network-smoke--1433603680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f888fb7-e2", "ovs_interfaceid": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.060 226239 DEBUG nova.network.os_vif_util [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "address": "fa:16:3e:06:b6:50", "network": {"id": "80db08a4-2d34-453a-a239-a7ada660bee1", "bridge": "br-int", "label": "tempest-network-smoke--1433603680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f888fb7-e2", "ovs_interfaceid": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.060 226239 DEBUG nova.network.os_vif_util [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:b6:50,bridge_name='br-int',has_traffic_filtering=True,id=7f888fb7-e22f-4012-8f50-9248df3a9eac,network=Network(80db08a4-2d34-453a-a239-a7ada660bee1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f888fb7-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.061 226239 DEBUG os_vif [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:b6:50,bridge_name='br-int',has_traffic_filtering=True,id=7f888fb7-e22f-4012-8f50-9248df3a9eac,network=Network(80db08a4-2d34-453a-a239-a7ada660bee1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f888fb7-e2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.061 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.062 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.062 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.064 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.064 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f888fb7-e2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.065 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7f888fb7-e2, col_values=(('external_ids', {'iface-id': '7f888fb7-e22f-4012-8f50-9248df3a9eac', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:b6:50', 'vm-uuid': '2450f89f-bcd8-4bab-8fbd-ae73ab968552'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.066 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:56 np0005603623 NetworkManager[48970]: <info>  [1769849876.0672] manager: (tap7f888fb7-e2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/364)
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.068 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.071 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.072 226239 INFO os_vif [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:b6:50,bridge_name='br-int',has_traffic_filtering=True,id=7f888fb7-e22f-4012-8f50-9248df3a9eac,network=Network(80db08a4-2d34-453a-a239-a7ada660bee1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f888fb7-e2')#033[00m
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.143 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.144 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.144 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No VIF found with MAC fa:16:3e:06:b6:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.144 226239 INFO nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Using config drive#033[00m
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.164 226239 DEBUG nova.storage.rbd_utils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:57:56 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:56Z|00772|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:57:56 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:56Z|00773|binding|INFO|Releasing lport 1553dad0-d27d-4162-94ad-0b8a3a359f3a from this chassis (sb_readonly=0)
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.240 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.293 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:56 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:56Z|00774|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:57:56 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:56Z|00775|binding|INFO|Releasing lport 1553dad0-d27d-4162-94ad-0b8a3a359f3a from this chassis (sb_readonly=0)
Jan 31 03:57:56 np0005603623 nova_compute[226235]: 2026-01-31 08:57:56.311 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:56.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:56.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.254 226239 INFO nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Creating config drive at /var/lib/nova/instances/2450f89f-bcd8-4bab-8fbd-ae73ab968552/disk.config#033[00m
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.257 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2450f89f-bcd8-4bab-8fbd-ae73ab968552/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmptghlbno7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.378 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2450f89f-bcd8-4bab-8fbd-ae73ab968552/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmptghlbno7" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.405 226239 DEBUG nova.storage.rbd_utils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.410 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2450f89f-bcd8-4bab-8fbd-ae73ab968552/disk.config 2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.567 226239 DEBUG oslo_concurrency.processutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2450f89f-bcd8-4bab-8fbd-ae73ab968552/disk.config 2450f89f-bcd8-4bab-8fbd-ae73ab968552_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.157s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.569 226239 INFO nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Deleting local config drive /var/lib/nova/instances/2450f89f-bcd8-4bab-8fbd-ae73ab968552/disk.config because it was imported into RBD.#033[00m
Jan 31 03:57:57 np0005603623 NetworkManager[48970]: <info>  [1769849877.6080] manager: (tap7f888fb7-e2): new Tun device (/org/freedesktop/NetworkManager/Devices/365)
Jan 31 03:57:57 np0005603623 kernel: tap7f888fb7-e2: entered promiscuous mode
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.613 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.618 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:57Z|00776|binding|INFO|Claiming lport 7f888fb7-e22f-4012-8f50-9248df3a9eac for this chassis.
Jan 31 03:57:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:57Z|00777|binding|INFO|7f888fb7-e22f-4012-8f50-9248df3a9eac: Claiming fa:16:3e:06:b6:50 10.100.0.7
Jan 31 03:57:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:57Z|00778|binding|INFO|Setting lport 7f888fb7-e22f-4012-8f50-9248df3a9eac ovn-installed in OVS
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.641 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:57 np0005603623 systemd-machined[194379]: New machine qemu-88-instance-000000bb.
Jan 31 03:57:57 np0005603623 systemd[1]: Started Virtual Machine qemu-88-instance-000000bb.
Jan 31 03:57:57 np0005603623 systemd-udevd[315336]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:57:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:57Z|00779|binding|INFO|Setting lport 7f888fb7-e22f-4012-8f50-9248df3a9eac up in Southbound
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.675 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:b6:50 10.100.0.7'], port_security=['fa:16:3e:06:b6:50 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2450f89f-bcd8-4bab-8fbd-ae73ab968552', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80db08a4-2d34-453a-a239-a7ada660bee1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ccc60d81-26b0-49de-8e09-12a020a730d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=91e2daca-cac7-4f99-8d64-7b570d2bf474, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=7f888fb7-e22f-4012-8f50-9248df3a9eac) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:57:57 np0005603623 NetworkManager[48970]: <info>  [1769849877.6777] device (tap7f888fb7-e2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:57:57 np0005603623 NetworkManager[48970]: <info>  [1769849877.6787] device (tap7f888fb7-e2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.676 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 7f888fb7-e22f-4012-8f50-9248df3a9eac in datapath 80db08a4-2d34-453a-a239-a7ada660bee1 bound to our chassis#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.679 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 80db08a4-2d34-453a-a239-a7ada660bee1#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.687 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dfe449cc-a63b-4e8b-96c7-5317fe9034f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.692 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap80db08a4-21 in ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.693 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap80db08a4-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.693 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[afc4c2c5-3268-4df4-be3e-fc643babe443]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.694 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[26475a78-bdeb-4c36-b746-a32a9d8c71a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.703 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[8bc7013e-1f65-46f4-af14-36f35a66164e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.723 226239 DEBUG nova.network.neutron [req-08864d69-4fad-454e-94b9-33fc5ffad587 req-154f4932-185b-473f-9957-0da6f25deb85 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Updated VIF entry in instance network info cache for port 7f888fb7-e22f-4012-8f50-9248df3a9eac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.724 226239 DEBUG nova.network.neutron [req-08864d69-4fad-454e-94b9-33fc5ffad587 req-154f4932-185b-473f-9957-0da6f25deb85 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Updating instance_info_cache with network_info: [{"id": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "address": "fa:16:3e:06:b6:50", "network": {"id": "80db08a4-2d34-453a-a239-a7ada660bee1", "bridge": "br-int", "label": "tempest-network-smoke--1433603680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f888fb7-e2", "ovs_interfaceid": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.729 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fecec7eb-f7ad-4cfd-b8f2-155107794892]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.774 226239 DEBUG oslo_concurrency.lockutils [req-08864d69-4fad-454e-94b9-33fc5ffad587 req-154f4932-185b-473f-9957-0da6f25deb85 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-2450f89f-bcd8-4bab-8fbd-ae73ab968552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.789 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[a54df288-0727-47a1-ab10-b25561743a90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 NetworkManager[48970]: <info>  [1769849877.7940] manager: (tap80db08a4-20): new Veth device (/org/freedesktop/NetworkManager/Devices/366)
Jan 31 03:57:57 np0005603623 systemd-udevd[315340]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.793 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a888923f-0999-48a1-b7df-ad65928e2a2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.815 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[12899f4a-91e0-458f-aa9c-98839e28a6df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.817 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4c4055-6c14-4c33-a6fc-9bab385a1086]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 NetworkManager[48970]: <info>  [1769849877.8321] device (tap80db08a4-20): carrier: link connected
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.836 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d1df179b-11ff-4b5e-9802-8a3fa0ba9e89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #58. Immutable memtables: 14.
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.850 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[07b05477-a77f-431d-963d-be6960c82d94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap80db08a4-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:3e:53'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 228], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 901826, 'reachable_time': 24590, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315369, 'error': None, 'target': 'ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.861 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dfa60763-91e0-4a37-95f8-72680c0ed73b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef0:3e53'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 901826, 'tstamp': 901826}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315370, 'error': None, 'target': 'ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.875 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d20d05ec-861f-493a-81a3-54e56a776365]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap80db08a4-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:3e:53'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 228], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 901826, 'reachable_time': 24590, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 315371, 'error': None, 'target': 'ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.893 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[25d26f50-6fdc-4f6b-9805-4b9eabd60250]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.934 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[377396dd-08e6-4b2b-a6a9-74fc84086643]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.935 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80db08a4-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.936 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.936 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap80db08a4-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:57 np0005603623 NetworkManager[48970]: <info>  [1769849877.9390] manager: (tap80db08a4-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/367)
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.938 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:57 np0005603623 kernel: tap80db08a4-20: entered promiscuous mode
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.944 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap80db08a4-20, col_values=(('external_ids', {'iface-id': '31300652-e6db-4684-9385-9a9bdfcee2c5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:57 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:57Z|00780|binding|INFO|Releasing lport 31300652-e6db-4684-9385-9a9bdfcee2c5 from this chassis (sb_readonly=0)
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.945 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.949 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/80db08a4-2d34-453a-a239-a7ada660bee1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/80db08a4-2d34-453a-a239-a7ada660bee1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.950 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d75c4e64-3e91-43e8-abdd-26e1419019cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.950 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-80db08a4-2d34-453a-a239-a7ada660bee1
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/80db08a4-2d34-453a-a239-a7ada660bee1.pid.haproxy
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 80db08a4-2d34-453a-a239-a7ada660bee1
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:57:57 np0005603623 nova_compute[226235]: 2026-01-31 08:57:57.951 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:57.952 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1', 'env', 'PROCESS_TAG=haproxy-80db08a4-2d34-453a-a239-a7ada660bee1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/80db08a4-2d34-453a-a239-a7ada660bee1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.093 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849878.0928643, 2450f89f-bcd8-4bab-8fbd-ae73ab968552 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.093 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] VM Started (Lifecycle Event)#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.143 226239 DEBUG nova.compute.manager [req-c7efba7a-d249-486d-89ac-d8b361ad44cc req-300e9d01-61a2-4e6d-879a-2b0ddd852130 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Received event network-vif-plugged-7f888fb7-e22f-4012-8f50-9248df3a9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.143 226239 DEBUG oslo_concurrency.lockutils [req-c7efba7a-d249-486d-89ac-d8b361ad44cc req-300e9d01-61a2-4e6d-879a-2b0ddd852130 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.144 226239 DEBUG oslo_concurrency.lockutils [req-c7efba7a-d249-486d-89ac-d8b361ad44cc req-300e9d01-61a2-4e6d-879a-2b0ddd852130 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.144 226239 DEBUG oslo_concurrency.lockutils [req-c7efba7a-d249-486d-89ac-d8b361ad44cc req-300e9d01-61a2-4e6d-879a-2b0ddd852130 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.144 226239 DEBUG nova.compute.manager [req-c7efba7a-d249-486d-89ac-d8b361ad44cc req-300e9d01-61a2-4e6d-879a-2b0ddd852130 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Processing event network-vif-plugged-7f888fb7-e22f-4012-8f50-9248df3a9eac _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.145 226239 DEBUG nova.compute.manager [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.148 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.149 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.154 226239 INFO nova.virt.libvirt.driver [-] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Instance spawned successfully.#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.154 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.155 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.181 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.182 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849878.0929906, 2450f89f-bcd8-4bab-8fbd-ae73ab968552 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.182 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.194 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.194 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.195 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.196 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.196 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.197 226239 DEBUG nova.virt.libvirt.driver [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.211 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.215 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849878.1473584, 2450f89f-bcd8-4bab-8fbd-ae73ab968552 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.216 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.252 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.255 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:57:58 np0005603623 podman[315443]: 2026-01-31 08:57:58.283340581 +0000 UTC m=+0.048998858 container create f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.294 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.312 226239 INFO nova.compute.manager [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Took 10.27 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.313 226239 DEBUG nova.compute.manager [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.313 226239 DEBUG oslo_concurrency.lockutils [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.314 226239 DEBUG oslo_concurrency.lockutils [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.314 226239 DEBUG oslo_concurrency.lockutils [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.314 226239 DEBUG oslo_concurrency.lockutils [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.314 226239 DEBUG oslo_concurrency.lockutils [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:58 np0005603623 systemd[1]: Started libpod-conmon-f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2.scope.
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.316 226239 INFO nova.compute.manager [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Terminating instance#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.316 226239 DEBUG nova.compute.manager [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:57:58 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:57:58 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c63a03c8e6add801ade5f83000b4f485e2b2283fc26c4ae43d84b6101ed250d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:57:58 np0005603623 kernel: tape6ed4c7b-19 (unregistering): left promiscuous mode
Jan 31 03:57:58 np0005603623 podman[315443]: 2026-01-31 08:57:58.353947276 +0000 UTC m=+0.119605583 container init f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:57:58 np0005603623 NetworkManager[48970]: <info>  [1769849878.3542] device (tape6ed4c7b-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:57:58 np0005603623 podman[315443]: 2026-01-31 08:57:58.35850554 +0000 UTC m=+0.124163817 container start f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 03:57:58 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:58Z|00781|binding|INFO|Releasing lport e6ed4c7b-198a-42b9-bcf7-79fcae00e769 from this chassis (sb_readonly=0)
Jan 31 03:57:58 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:58Z|00782|binding|INFO|Setting lport e6ed4c7b-198a-42b9-bcf7-79fcae00e769 down in Southbound
Jan 31 03:57:58 np0005603623 podman[315443]: 2026-01-31 08:57:58.264217681 +0000 UTC m=+0.029875978 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.362 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:58 np0005603623 ovn_controller[133449]: 2026-01-31T08:57:58Z|00783|binding|INFO|Removing iface tape6ed4c7b-19 ovn-installed in OVS
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.368 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:58 np0005603623 neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1[315458]: [NOTICE]   (315464) : New worker (315467) forked
Jan 31 03:57:58 np0005603623 neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1[315458]: [NOTICE]   (315464) : Loading success.
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.378 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:5e:b9 10.100.0.10'], port_security=['fa:16:3e:5d:5e:b9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c37e7d6d634448bfb3172894ad2af105', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd98fdedc-7ec4-4678-86fd-333fbe96f77f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f5a6fc0-3df3-4c2f-84cd-adc2af316a8e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=e6ed4c7b-198a-42b9-bcf7-79fcae00e769) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:57:58 np0005603623 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000ba.scope: Deactivated successfully.
Jan 31 03:57:58 np0005603623 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000ba.scope: Consumed 3.831s CPU time.
Jan 31 03:57:58 np0005603623 systemd-machined[194379]: Machine qemu-87-instance-000000ba terminated.
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.416 143258 INFO neutron.agent.ovn.metadata.agent [-] Port e6ed4c7b-198a-42b9-bcf7-79fcae00e769 in datapath b9195012-fef1-4e17-acdd-2b9ffc979da0 unbound from our chassis#033[00m
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.417 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9195012-fef1-4e17-acdd-2b9ffc979da0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.418 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e994561b-bfd5-4722-869d-fe9521e9f3c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.418 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0 namespace which is not needed anymore#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.423 226239 INFO nova.compute.manager [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Took 11.46 seconds to build instance.#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.449 226239 DEBUG oslo_concurrency.lockutils [None req-f918fdb9-4d00-47c1-a475-cf9ce07bc144 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:58 np0005603623 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[315182]: [NOTICE]   (315186) : haproxy version is 2.8.14-c23fe91
Jan 31 03:57:58 np0005603623 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[315182]: [NOTICE]   (315186) : path to executable is /usr/sbin/haproxy
Jan 31 03:57:58 np0005603623 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[315182]: [WARNING]  (315186) : Exiting Master process...
Jan 31 03:57:58 np0005603623 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[315182]: [ALERT]    (315186) : Current worker (315188) exited with code 143 (Terminated)
Jan 31 03:57:58 np0005603623 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[315182]: [WARNING]  (315186) : All workers exited. Exiting... (0)
Jan 31 03:57:58 np0005603623 systemd[1]: libpod-3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61.scope: Deactivated successfully.
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.533 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:58 np0005603623 podman[315493]: 2026-01-31 08:57:58.53609186 +0000 UTC m=+0.049788142 container died 3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.537 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.548 226239 INFO nova.virt.libvirt.driver [-] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Instance destroyed successfully.#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.548 226239 DEBUG nova.objects.instance [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lazy-loading 'resources' on Instance uuid b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:58 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61-userdata-shm.mount: Deactivated successfully.
Jan 31 03:57:58 np0005603623 systemd[1]: var-lib-containers-storage-overlay-3f1208a0b4f1658f77d736df263a91908dcba98e8eaa68a0c5f514a17ca2a0bd-merged.mount: Deactivated successfully.
Jan 31 03:57:58 np0005603623 podman[315493]: 2026-01-31 08:57:58.577927313 +0000 UTC m=+0.091623595 container cleanup 3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:57:58 np0005603623 systemd[1]: libpod-conmon-3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61.scope: Deactivated successfully.
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.606 226239 DEBUG nova.virt.libvirt.vif [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:57:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1191175908',display_name='tempest-TestServerMultinode-server-1191175908',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1191175908',id=186,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:57:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c37e7d6d634448bfb3172894ad2af105',ramdisk_id='',reservation_id='r-gylvl88b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-893388561',owner_user_name='tempest-TestServerMultinode-893388561-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:57:55Z,user_data=None,user_id='4e364ad937544559bea978006e9ff229',uuid=b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "address": "fa:16:3e:5d:5e:b9", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6ed4c7b-19", "ovs_interfaceid": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.606 226239 DEBUG nova.network.os_vif_util [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Converting VIF {"id": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "address": "fa:16:3e:5d:5e:b9", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6ed4c7b-19", "ovs_interfaceid": "e6ed4c7b-198a-42b9-bcf7-79fcae00e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.607 226239 DEBUG nova.network.os_vif_util [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:5e:b9,bridge_name='br-int',has_traffic_filtering=True,id=e6ed4c7b-198a-42b9-bcf7-79fcae00e769,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6ed4c7b-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.607 226239 DEBUG os_vif [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:5e:b9,bridge_name='br-int',has_traffic_filtering=True,id=e6ed4c7b-198a-42b9-bcf7-79fcae00e769,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6ed4c7b-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.609 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.610 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6ed4c7b-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.611 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.613 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.615 226239 INFO os_vif [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:5e:b9,bridge_name='br-int',has_traffic_filtering=True,id=e6ed4c7b-198a-42b9-bcf7-79fcae00e769,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6ed4c7b-19')#033[00m
Jan 31 03:57:58 np0005603623 podman[315531]: 2026-01-31 08:57:58.636160269 +0000 UTC m=+0.043282548 container remove 3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.640 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[25420df4-3a37-46c0-8925-cfc2be23ee50]: (4, ('Sat Jan 31 08:57:58 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0 (3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61)\n3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61\nSat Jan 31 08:57:58 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0 (3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61)\n3a811005f2e590e3bdfc0de34185012125beb4aec84c808d4b9029123d9b6e61\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.642 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cb79cc89-42c1-4a98-bacd-a6ff6fb4ad6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.643 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9195012-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:58 np0005603623 kernel: tapb9195012-f0: left promiscuous mode
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.646 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.652 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0fccd6fa-af40-4603-aab5-ca745d4a0846]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:58 np0005603623 nova_compute[226235]: 2026-01-31 08:57:58.652 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.666 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9acc86a5-366f-4a35-a2ce-f435577efc0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.669 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[165049e1-1935-408e-855b-f1ee085c7a07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.682 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ea3d57f1-f9f5-416c-97e8-4e75c964a6ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 901367, 'reachable_time': 26788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315586, 'error': None, 'target': 'ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.685 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:57:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:57:58.685 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[3d4da81c-60dc-47b7-ba93-ebfa8414b8c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:58 np0005603623 systemd[1]: run-netns-ovnmeta\x2db9195012\x2dfef1\x2d4e17\x2dacdd\x2d2b9ffc979da0.mount: Deactivated successfully.
Jan 31 03:57:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:58.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:57:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:57:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:58.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:57:59 np0005603623 nova_compute[226235]: 2026-01-31 08:57:59.207 226239 DEBUG nova.compute.manager [req-c10e453b-78e5-4ba2-a70d-28c978985d70 req-668b94b4-d7a1-4240-b774-0ff2b97c0022 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Received event network-vif-unplugged-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:59 np0005603623 nova_compute[226235]: 2026-01-31 08:57:59.207 226239 DEBUG oslo_concurrency.lockutils [req-c10e453b-78e5-4ba2-a70d-28c978985d70 req-668b94b4-d7a1-4240-b774-0ff2b97c0022 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:59 np0005603623 nova_compute[226235]: 2026-01-31 08:57:59.209 226239 DEBUG oslo_concurrency.lockutils [req-c10e453b-78e5-4ba2-a70d-28c978985d70 req-668b94b4-d7a1-4240-b774-0ff2b97c0022 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:59 np0005603623 nova_compute[226235]: 2026-01-31 08:57:59.209 226239 DEBUG oslo_concurrency.lockutils [req-c10e453b-78e5-4ba2-a70d-28c978985d70 req-668b94b4-d7a1-4240-b774-0ff2b97c0022 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:59 np0005603623 nova_compute[226235]: 2026-01-31 08:57:59.209 226239 DEBUG nova.compute.manager [req-c10e453b-78e5-4ba2-a70d-28c978985d70 req-668b94b4-d7a1-4240-b774-0ff2b97c0022 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] No waiting events found dispatching network-vif-unplugged-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:57:59 np0005603623 nova_compute[226235]: 2026-01-31 08:57:59.210 226239 DEBUG nova.compute.manager [req-c10e453b-78e5-4ba2-a70d-28c978985d70 req-668b94b4-d7a1-4240-b774-0ff2b97c0022 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Received event network-vif-unplugged-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:58:00 np0005603623 nova_compute[226235]: 2026-01-31 08:58:00.289 226239 INFO nova.virt.libvirt.driver [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Deleting instance files /var/lib/nova/instances/b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_del#033[00m
Jan 31 03:58:00 np0005603623 nova_compute[226235]: 2026-01-31 08:58:00.290 226239 INFO nova.virt.libvirt.driver [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Deletion of /var/lib/nova/instances/b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f_del complete#033[00m
Jan 31 03:58:00 np0005603623 nova_compute[226235]: 2026-01-31 08:58:00.371 226239 INFO nova.compute.manager [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Took 2.05 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:58:00 np0005603623 nova_compute[226235]: 2026-01-31 08:58:00.371 226239 DEBUG oslo.service.loopingcall [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:58:00 np0005603623 nova_compute[226235]: 2026-01-31 08:58:00.372 226239 DEBUG nova.compute.manager [-] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:58:00 np0005603623 nova_compute[226235]: 2026-01-31 08:58:00.372 226239 DEBUG nova.network.neutron [-] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:58:00 np0005603623 nova_compute[226235]: 2026-01-31 08:58:00.697 226239 DEBUG nova.compute.manager [req-ecabbbfd-a8c0-4603-8fde-36a7fde5d85a req-3f8a32cd-ffb5-40ac-a602-016ffd27f471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Received event network-vif-plugged-7f888fb7-e22f-4012-8f50-9248df3a9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:00 np0005603623 nova_compute[226235]: 2026-01-31 08:58:00.698 226239 DEBUG oslo_concurrency.lockutils [req-ecabbbfd-a8c0-4603-8fde-36a7fde5d85a req-3f8a32cd-ffb5-40ac-a602-016ffd27f471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:00 np0005603623 nova_compute[226235]: 2026-01-31 08:58:00.698 226239 DEBUG oslo_concurrency.lockutils [req-ecabbbfd-a8c0-4603-8fde-36a7fde5d85a req-3f8a32cd-ffb5-40ac-a602-016ffd27f471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:00 np0005603623 nova_compute[226235]: 2026-01-31 08:58:00.699 226239 DEBUG oslo_concurrency.lockutils [req-ecabbbfd-a8c0-4603-8fde-36a7fde5d85a req-3f8a32cd-ffb5-40ac-a602-016ffd27f471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:00 np0005603623 nova_compute[226235]: 2026-01-31 08:58:00.699 226239 DEBUG nova.compute.manager [req-ecabbbfd-a8c0-4603-8fde-36a7fde5d85a req-3f8a32cd-ffb5-40ac-a602-016ffd27f471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] No waiting events found dispatching network-vif-plugged-7f888fb7-e22f-4012-8f50-9248df3a9eac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:58:00 np0005603623 nova_compute[226235]: 2026-01-31 08:58:00.699 226239 WARNING nova.compute.manager [req-ecabbbfd-a8c0-4603-8fde-36a7fde5d85a req-3f8a32cd-ffb5-40ac-a602-016ffd27f471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Received unexpected event network-vif-plugged-7f888fb7-e22f-4012-8f50-9248df3a9eac for instance with vm_state active and task_state None.#033[00m
Jan 31 03:58:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:00.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:58:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:00.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:58:01 np0005603623 nova_compute[226235]: 2026-01-31 08:58:01.300 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:01 np0005603623 nova_compute[226235]: 2026-01-31 08:58:01.426 226239 DEBUG nova.compute.manager [req-9f3e6522-9f4c-4a23-869d-9d813ffb9820 req-90e21315-470e-4f5d-a61e-d23d7758b431 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Received event network-vif-plugged-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:01 np0005603623 nova_compute[226235]: 2026-01-31 08:58:01.427 226239 DEBUG oslo_concurrency.lockutils [req-9f3e6522-9f4c-4a23-869d-9d813ffb9820 req-90e21315-470e-4f5d-a61e-d23d7758b431 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:01 np0005603623 nova_compute[226235]: 2026-01-31 08:58:01.427 226239 DEBUG oslo_concurrency.lockutils [req-9f3e6522-9f4c-4a23-869d-9d813ffb9820 req-90e21315-470e-4f5d-a61e-d23d7758b431 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:01 np0005603623 nova_compute[226235]: 2026-01-31 08:58:01.428 226239 DEBUG oslo_concurrency.lockutils [req-9f3e6522-9f4c-4a23-869d-9d813ffb9820 req-90e21315-470e-4f5d-a61e-d23d7758b431 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:01 np0005603623 nova_compute[226235]: 2026-01-31 08:58:01.428 226239 DEBUG nova.compute.manager [req-9f3e6522-9f4c-4a23-869d-9d813ffb9820 req-90e21315-470e-4f5d-a61e-d23d7758b431 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] No waiting events found dispatching network-vif-plugged-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:58:01 np0005603623 nova_compute[226235]: 2026-01-31 08:58:01.428 226239 WARNING nova.compute.manager [req-9f3e6522-9f4c-4a23-869d-9d813ffb9820 req-90e21315-470e-4f5d-a61e-d23d7758b431 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Received unexpected event network-vif-plugged-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:58:01 np0005603623 nova_compute[226235]: 2026-01-31 08:58:01.824 226239 DEBUG nova.network.neutron [-] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:58:01 np0005603623 nova_compute[226235]: 2026-01-31 08:58:01.853 226239 INFO nova.compute.manager [-] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Took 1.48 seconds to deallocate network for instance.#033[00m
Jan 31 03:58:01 np0005603623 nova_compute[226235]: 2026-01-31 08:58:01.945 226239 DEBUG oslo_concurrency.lockutils [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:01 np0005603623 nova_compute[226235]: 2026-01-31 08:58:01.945 226239 DEBUG oslo_concurrency.lockutils [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:02 np0005603623 nova_compute[226235]: 2026-01-31 08:58:02.073 226239 DEBUG oslo_concurrency.processutils [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:02 np0005603623 nova_compute[226235]: 2026-01-31 08:58:02.534 226239 DEBUG oslo_concurrency.processutils [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:02 np0005603623 nova_compute[226235]: 2026-01-31 08:58:02.539 226239 DEBUG nova.compute.provider_tree [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:58:02 np0005603623 nova_compute[226235]: 2026-01-31 08:58:02.562 226239 DEBUG nova.scheduler.client.report [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:58:02 np0005603623 nova_compute[226235]: 2026-01-31 08:58:02.598 226239 DEBUG oslo_concurrency.lockutils [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:02 np0005603623 nova_compute[226235]: 2026-01-31 08:58:02.648 226239 INFO nova.scheduler.client.report [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Deleted allocations for instance b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f#033[00m
Jan 31 03:58:02 np0005603623 nova_compute[226235]: 2026-01-31 08:58:02.739 226239 DEBUG oslo_concurrency.lockutils [None req-cc5d91a0-cc9b-46b9-bb47-ad4cac7b5dd4 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.425s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:58:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:02.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:58:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:02.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:02 np0005603623 nova_compute[226235]: 2026-01-31 08:58:02.893 226239 DEBUG nova.compute.manager [req-75e0d82b-7e82-4fe7-bf7b-5924af7bea35 req-0a8f60f4-063f-481c-bf5c-d9fc778db7dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Received event network-vif-deleted-e6ed4c7b-198a-42b9-bcf7-79fcae00e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:03 np0005603623 NetworkManager[48970]: <info>  [1769849883.4945] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/368)
Jan 31 03:58:03 np0005603623 NetworkManager[48970]: <info>  [1769849883.4956] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/369)
Jan 31 03:58:03 np0005603623 nova_compute[226235]: 2026-01-31 08:58:03.493 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:03 np0005603623 nova_compute[226235]: 2026-01-31 08:58:03.529 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:58:03Z|00784|binding|INFO|Releasing lport 31300652-e6db-4684-9385-9a9bdfcee2c5 from this chassis (sb_readonly=0)
Jan 31 03:58:03 np0005603623 ovn_controller[133449]: 2026-01-31T08:58:03Z|00785|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:58:03 np0005603623 nova_compute[226235]: 2026-01-31 08:58:03.556 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:03 np0005603623 nova_compute[226235]: 2026-01-31 08:58:03.611 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:03 np0005603623 nova_compute[226235]: 2026-01-31 08:58:03.973 226239 DEBUG nova.compute.manager [req-5a37ae04-9858-47ff-aee7-17e8c29ac684 req-251338a6-4786-403d-9649-0908cabac36b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:03 np0005603623 nova_compute[226235]: 2026-01-31 08:58:03.974 226239 DEBUG nova.compute.manager [req-5a37ae04-9858-47ff-aee7-17e8c29ac684 req-251338a6-4786-403d-9649-0908cabac36b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing instance network info cache due to event network-changed-f859815f-0923-45c1-a84d-2a128fb7fd57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:58:03 np0005603623 nova_compute[226235]: 2026-01-31 08:58:03.976 226239 DEBUG oslo_concurrency.lockutils [req-5a37ae04-9858-47ff-aee7-17e8c29ac684 req-251338a6-4786-403d-9649-0908cabac36b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:58:03 np0005603623 nova_compute[226235]: 2026-01-31 08:58:03.976 226239 DEBUG oslo_concurrency.lockutils [req-5a37ae04-9858-47ff-aee7-17e8c29ac684 req-251338a6-4786-403d-9649-0908cabac36b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:58:03 np0005603623 nova_compute[226235]: 2026-01-31 08:58:03.977 226239 DEBUG nova.network.neutron [req-5a37ae04-9858-47ff-aee7-17e8c29ac684 req-251338a6-4786-403d-9649-0908cabac36b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Refreshing network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:58:04 np0005603623 nova_compute[226235]: 2026-01-31 08:58:04.239 226239 DEBUG nova.compute.manager [req-4ac4c171-cf88-4a74-854c-f021c579526a req-9b578871-5381-40b9-bc5b-80c4b1840a96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Received event network-changed-7f888fb7-e22f-4012-8f50-9248df3a9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:04 np0005603623 nova_compute[226235]: 2026-01-31 08:58:04.240 226239 DEBUG nova.compute.manager [req-4ac4c171-cf88-4a74-854c-f021c579526a req-9b578871-5381-40b9-bc5b-80c4b1840a96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Refreshing instance network info cache due to event network-changed-7f888fb7-e22f-4012-8f50-9248df3a9eac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:58:04 np0005603623 nova_compute[226235]: 2026-01-31 08:58:04.240 226239 DEBUG oslo_concurrency.lockutils [req-4ac4c171-cf88-4a74-854c-f021c579526a req-9b578871-5381-40b9-bc5b-80c4b1840a96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-2450f89f-bcd8-4bab-8fbd-ae73ab968552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:58:04 np0005603623 nova_compute[226235]: 2026-01-31 08:58:04.240 226239 DEBUG oslo_concurrency.lockutils [req-4ac4c171-cf88-4a74-854c-f021c579526a req-9b578871-5381-40b9-bc5b-80c4b1840a96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-2450f89f-bcd8-4bab-8fbd-ae73ab968552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:58:04 np0005603623 nova_compute[226235]: 2026-01-31 08:58:04.241 226239 DEBUG nova.network.neutron [req-4ac4c171-cf88-4a74-854c-f021c579526a req-9b578871-5381-40b9-bc5b-80c4b1840a96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Refreshing network info cache for port 7f888fb7-e22f-4012-8f50-9248df3a9eac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:58:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:04.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:04.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e375 e375: 3 total, 3 up, 3 in
Jan 31 03:58:06 np0005603623 nova_compute[226235]: 2026-01-31 08:58:06.322 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e375 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:06 np0005603623 nova_compute[226235]: 2026-01-31 08:58:06.690 226239 DEBUG nova.network.neutron [req-4ac4c171-cf88-4a74-854c-f021c579526a req-9b578871-5381-40b9-bc5b-80c4b1840a96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Updated VIF entry in instance network info cache for port 7f888fb7-e22f-4012-8f50-9248df3a9eac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:58:06 np0005603623 nova_compute[226235]: 2026-01-31 08:58:06.691 226239 DEBUG nova.network.neutron [req-4ac4c171-cf88-4a74-854c-f021c579526a req-9b578871-5381-40b9-bc5b-80c4b1840a96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Updating instance_info_cache with network_info: [{"id": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "address": "fa:16:3e:06:b6:50", "network": {"id": "80db08a4-2d34-453a-a239-a7ada660bee1", "bridge": "br-int", "label": "tempest-network-smoke--1433603680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f888fb7-e2", "ovs_interfaceid": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:58:06 np0005603623 nova_compute[226235]: 2026-01-31 08:58:06.714 226239 DEBUG nova.network.neutron [req-5a37ae04-9858-47ff-aee7-17e8c29ac684 req-251338a6-4786-403d-9649-0908cabac36b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updated VIF entry in instance network info cache for port f859815f-0923-45c1-a84d-2a128fb7fd57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:58:06 np0005603623 nova_compute[226235]: 2026-01-31 08:58:06.715 226239 DEBUG nova.network.neutron [req-5a37ae04-9858-47ff-aee7-17e8c29ac684 req-251338a6-4786-403d-9649-0908cabac36b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating instance_info_cache with network_info: [{"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:58:06 np0005603623 nova_compute[226235]: 2026-01-31 08:58:06.755 226239 DEBUG oslo_concurrency.lockutils [req-5a37ae04-9858-47ff-aee7-17e8c29ac684 req-251338a6-4786-403d-9649-0908cabac36b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e9903ecf-c775-4e84-8997-361061869fc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:58:06 np0005603623 nova_compute[226235]: 2026-01-31 08:58:06.756 226239 DEBUG oslo_concurrency.lockutils [req-4ac4c171-cf88-4a74-854c-f021c579526a req-9b578871-5381-40b9-bc5b-80c4b1840a96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-2450f89f-bcd8-4bab-8fbd-ae73ab968552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:58:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:06.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:06.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e376 e376: 3 total, 3 up, 3 in
Jan 31 03:58:07 np0005603623 nova_compute[226235]: 2026-01-31 08:58:07.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:07 np0005603623 nova_compute[226235]: 2026-01-31 08:58:07.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:58:08 np0005603623 nova_compute[226235]: 2026-01-31 08:58:08.615 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:08.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:08.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:09 np0005603623 ovn_controller[133449]: 2026-01-31T08:58:09Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:06:b6:50 10.100.0.7
Jan 31 03:58:09 np0005603623 ovn_controller[133449]: 2026-01-31T08:58:09Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:06:b6:50 10.100.0.7
Jan 31 03:58:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:58:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:10.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:58:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:10.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:11 np0005603623 nova_compute[226235]: 2026-01-31 08:58:11.321 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e376 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:12 np0005603623 nova_compute[226235]: 2026-01-31 08:58:12.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:12 np0005603623 nova_compute[226235]: 2026-01-31 08:58:12.445 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:12 np0005603623 nova_compute[226235]: 2026-01-31 08:58:12.446 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:12 np0005603623 nova_compute[226235]: 2026-01-31 08:58:12.447 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:12 np0005603623 nova_compute[226235]: 2026-01-31 08:58:12.447 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:58:12 np0005603623 nova_compute[226235]: 2026-01-31 08:58:12.447 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:12.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:58:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3610323404' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:58:12 np0005603623 nova_compute[226235]: 2026-01-31 08:58:12.858 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:12.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:12 np0005603623 nova_compute[226235]: 2026-01-31 08:58:12.976 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:58:12 np0005603623 nova_compute[226235]: 2026-01-31 08:58:12.977 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:58:12 np0005603623 nova_compute[226235]: 2026-01-31 08:58:12.980 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000bb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:58:12 np0005603623 nova_compute[226235]: 2026-01-31 08:58:12.980 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000bb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.116 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.117 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3792MB free_disk=20.956371307373047GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.117 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.117 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.377 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance e9903ecf-c775-4e84-8997-361061869fc6 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.377 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 2450f89f-bcd8-4bab-8fbd-ae73ab968552 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.378 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.378 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.437 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.544 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849878.5440423, b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.545 226239 INFO nova.compute.manager [-] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.592 226239 DEBUG nova.compute.manager [None req-3ce50c45-19aa-411e-89d6-f321064a0c4a - - - - - -] [instance: b3aabf4d-ad0d-4f20-9281-1ab04ca22f3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.617 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:58:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/319403539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.858 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.863 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.887 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.945 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:58:13 np0005603623 nova_compute[226235]: 2026-01-31 08:58:13.946 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.829s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:13 np0005603623 podman[315689]: 2026-01-31 08:58:13.967559737 +0000 UTC m=+0.050191975 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:58:13 np0005603623 podman[315690]: 2026-01-31 08:58:13.989132674 +0000 UTC m=+0.071764272 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.391 226239 DEBUG oslo_concurrency.lockutils [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.391 226239 DEBUG oslo_concurrency.lockutils [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.392 226239 DEBUG oslo_concurrency.lockutils [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.392 226239 DEBUG oslo_concurrency.lockutils [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.392 226239 DEBUG oslo_concurrency.lockutils [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.393 226239 INFO nova.compute.manager [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Terminating instance#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.394 226239 DEBUG nova.compute.manager [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:58:14 np0005603623 kernel: tapf859815f-09 (unregistering): left promiscuous mode
Jan 31 03:58:14 np0005603623 NetworkManager[48970]: <info>  [1769849894.4788] device (tapf859815f-09): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:58:14 np0005603623 ovn_controller[133449]: 2026-01-31T08:58:14Z|00786|binding|INFO|Releasing lport f859815f-0923-45c1-a84d-2a128fb7fd57 from this chassis (sb_readonly=0)
Jan 31 03:58:14 np0005603623 ovn_controller[133449]: 2026-01-31T08:58:14Z|00787|binding|INFO|Setting lport f859815f-0923-45c1-a84d-2a128fb7fd57 down in Southbound
Jan 31 03:58:14 np0005603623 ovn_controller[133449]: 2026-01-31T08:58:14Z|00788|binding|INFO|Removing iface tapf859815f-09 ovn-installed in OVS
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.485 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.486 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.490 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.493 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:28:18 10.100.0.3'], port_security=['fa:16:3e:ee:28:18 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e9903ecf-c775-4e84-8997-361061869fc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '06b5fc9cfd4c49abb2d8b9f2f8a82c1f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '9d7b4c6b-30ca-4a01-b275-d4aa9d87b845', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe6e8b31-5a27-4e0f-b157-3b33899fa37b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=f859815f-0923-45c1-a84d-2a128fb7fd57) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.494 143258 INFO neutron.agent.ovn.metadata.agent [-] Port f859815f-0923-45c1-a84d-2a128fb7fd57 in datapath 405bd95c-1bad-49fb-83bf-a97a0c66786e unbound from our chassis#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.495 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 405bd95c-1bad-49fb-83bf-a97a0c66786e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.496 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5e6a147b-8c1b-4b12-a4f5-c7e3d108fd8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.496 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e namespace which is not needed anymore#033[00m
Jan 31 03:58:14 np0005603623 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b3.scope: Deactivated successfully.
Jan 31 03:58:14 np0005603623 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b3.scope: Consumed 17.848s CPU time.
Jan 31 03:58:14 np0005603623 systemd-machined[194379]: Machine qemu-85-instance-000000b3 terminated.
Jan 31 03:58:14 np0005603623 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[312711]: [NOTICE]   (312715) : haproxy version is 2.8.14-c23fe91
Jan 31 03:58:14 np0005603623 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[312711]: [NOTICE]   (312715) : path to executable is /usr/sbin/haproxy
Jan 31 03:58:14 np0005603623 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[312711]: [WARNING]  (312715) : Exiting Master process...
Jan 31 03:58:14 np0005603623 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[312711]: [ALERT]    (312715) : Current worker (312717) exited with code 143 (Terminated)
Jan 31 03:58:14 np0005603623 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[312711]: [WARNING]  (312715) : All workers exited. Exiting... (0)
Jan 31 03:58:14 np0005603623 systemd[1]: libpod-af3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc.scope: Deactivated successfully.
Jan 31 03:58:14 np0005603623 podman[315760]: 2026-01-31 08:58:14.597901662 +0000 UTC m=+0.040059928 container died af3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.610 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.613 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:14 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-af3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc-userdata-shm.mount: Deactivated successfully.
Jan 31 03:58:14 np0005603623 systemd[1]: var-lib-containers-storage-overlay-328bab48dbfd0b70cf017cf0c6b714945e66cb16082db3ac0bc893afa6b5bd30-merged.mount: Deactivated successfully.
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.629 226239 INFO nova.virt.libvirt.driver [-] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Instance destroyed successfully.#033[00m
Jan 31 03:58:14 np0005603623 podman[315760]: 2026-01-31 08:58:14.631200386 +0000 UTC m=+0.073358652 container cleanup af3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.630 226239 DEBUG nova.objects.instance [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'resources' on Instance uuid e9903ecf-c775-4e84-8997-361061869fc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:58:14 np0005603623 systemd[1]: libpod-conmon-af3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc.scope: Deactivated successfully.
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.651 226239 DEBUG nova.virt.libvirt.vif [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:55:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1802059624',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1802059624',id=179,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEXzK6zUN8P2oqgqYwcegkodZ7bCeyyyhmYXIteBKXOhNEu+drS3qyKalg8BzkpjD3Rc/+FviAhlBApTbimNmOyPmM7IztIR2VGri6qDWFeRA0jXOdg2vS/Kgt0ALKH9cg==',key_name='tempest-TestInstancesWithCinderVolumes-176277168',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:55:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='06b5fc9cfd4c49abb2d8b9f2f8a82c1f',ramdisk_id='',reservation_id='r-1s00kmp1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestInstancesWithCinderVolumes-2132464628',owner_user_name='tempest-TestInstancesWithCinderVolumes-2132464628-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:55:44Z,user_data=None,user_id='cfaebb011a374541b083e772a6c83f25',uuid=e9903ecf-c775-4e84-8997-361061869fc6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.652 226239 DEBUG nova.network.os_vif_util [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Converting VIF {"id": "f859815f-0923-45c1-a84d-2a128fb7fd57", "address": "fa:16:3e:ee:28:18", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf859815f-09", "ovs_interfaceid": "f859815f-0923-45c1-a84d-2a128fb7fd57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.653 226239 DEBUG nova.network.os_vif_util [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:28:18,bridge_name='br-int',has_traffic_filtering=True,id=f859815f-0923-45c1-a84d-2a128fb7fd57,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf859815f-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.654 226239 DEBUG os_vif [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:28:18,bridge_name='br-int',has_traffic_filtering=True,id=f859815f-0923-45c1-a84d-2a128fb7fd57,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf859815f-09') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.658 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.659 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf859815f-09, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.663 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.667 226239 INFO os_vif [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:28:18,bridge_name='br-int',has_traffic_filtering=True,id=f859815f-0923-45c1-a84d-2a128fb7fd57,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf859815f-09')#033[00m
Jan 31 03:58:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:58:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/66900949' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:58:14 np0005603623 podman[315797]: 2026-01-31 08:58:14.688587236 +0000 UTC m=+0.037708794 container remove af3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 03:58:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:58:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/66900949' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.692 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[08038362-1534-4989-a473-0eb15b80ca48]: (4, ('Sat Jan 31 08:58:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e (af3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc)\naf3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc\nSat Jan 31 08:58:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e (af3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc)\naf3b5ad1f0ae7c7d221e1ed643650a95212a23439efe58cb6cb5c95d0be050fc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.693 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c403e515-88b3-4383-ae0e-7c3cbcfcb264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.694 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405bd95c-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.695 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:14 np0005603623 kernel: tap405bd95c-10: left promiscuous mode
Jan 31 03:58:14 np0005603623 nova_compute[226235]: 2026-01-31 08:58:14.701 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.703 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1ab16255-fc85-4d74-81fa-c29fd609a537]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.717 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[92de1655-7853-4df0-9893-b5c0b05cc38f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.718 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[68a282bf-684b-4e8c-800c-a814947b3c95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.729 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[540b7c6f-d266-4203-977e-a2e19b7c0f18]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 888365, 'reachable_time': 43394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315831, 'error': None, 'target': 'ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.731 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:58:14 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:14.731 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[02f3e92d-bd56-4872-bd04-dc6556c65997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:14 np0005603623 systemd[1]: run-netns-ovnmeta\x2d405bd95c\x2d1bad\x2d49fb\x2d83bf\x2da97a0c66786e.mount: Deactivated successfully.
Jan 31 03:58:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:14.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:14.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:15 np0005603623 nova_compute[226235]: 2026-01-31 08:58:15.166 226239 DEBUG nova.compute.manager [req-1b7003cd-9f42-4a18-b081-91c99d3dc34d req-ce9a77fe-c064-490b-8253-70c643dce429 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-vif-unplugged-f859815f-0923-45c1-a84d-2a128fb7fd57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:15 np0005603623 nova_compute[226235]: 2026-01-31 08:58:15.166 226239 DEBUG oslo_concurrency.lockutils [req-1b7003cd-9f42-4a18-b081-91c99d3dc34d req-ce9a77fe-c064-490b-8253-70c643dce429 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:15 np0005603623 nova_compute[226235]: 2026-01-31 08:58:15.166 226239 DEBUG oslo_concurrency.lockutils [req-1b7003cd-9f42-4a18-b081-91c99d3dc34d req-ce9a77fe-c064-490b-8253-70c643dce429 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:15 np0005603623 nova_compute[226235]: 2026-01-31 08:58:15.166 226239 DEBUG oslo_concurrency.lockutils [req-1b7003cd-9f42-4a18-b081-91c99d3dc34d req-ce9a77fe-c064-490b-8253-70c643dce429 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:15 np0005603623 nova_compute[226235]: 2026-01-31 08:58:15.167 226239 DEBUG nova.compute.manager [req-1b7003cd-9f42-4a18-b081-91c99d3dc34d req-ce9a77fe-c064-490b-8253-70c643dce429 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] No waiting events found dispatching network-vif-unplugged-f859815f-0923-45c1-a84d-2a128fb7fd57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:58:15 np0005603623 nova_compute[226235]: 2026-01-31 08:58:15.167 226239 DEBUG nova.compute.manager [req-1b7003cd-9f42-4a18-b081-91c99d3dc34d req-ce9a77fe-c064-490b-8253-70c643dce429 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-vif-unplugged-f859815f-0923-45c1-a84d-2a128fb7fd57 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:58:15 np0005603623 nova_compute[226235]: 2026-01-31 08:58:15.815 226239 INFO nova.virt.libvirt.driver [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Deleting instance files /var/lib/nova/instances/e9903ecf-c775-4e84-8997-361061869fc6_del#033[00m
Jan 31 03:58:15 np0005603623 nova_compute[226235]: 2026-01-31 08:58:15.816 226239 INFO nova.virt.libvirt.driver [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Deletion of /var/lib/nova/instances/e9903ecf-c775-4e84-8997-361061869fc6_del complete#033[00m
Jan 31 03:58:15 np0005603623 nova_compute[226235]: 2026-01-31 08:58:15.870 226239 INFO nova.compute.manager [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Took 1.48 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:58:15 np0005603623 nova_compute[226235]: 2026-01-31 08:58:15.870 226239 DEBUG oslo.service.loopingcall [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:58:15 np0005603623 nova_compute[226235]: 2026-01-31 08:58:15.871 226239 DEBUG nova.compute.manager [-] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:58:15 np0005603623 nova_compute[226235]: 2026-01-31 08:58:15.871 226239 DEBUG nova.network.neutron [-] [instance: e9903ecf-c775-4e84-8997-361061869fc6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:58:16 np0005603623 nova_compute[226235]: 2026-01-31 08:58:16.324 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e376 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:16.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:16.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:16 np0005603623 nova_compute[226235]: 2026-01-31 08:58:16.947 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:16 np0005603623 nova_compute[226235]: 2026-01-31 08:58:16.947 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:58:16 np0005603623 nova_compute[226235]: 2026-01-31 08:58:16.977 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:58:17 np0005603623 ovn_controller[133449]: 2026-01-31T08:58:17Z|00789|binding|INFO|Releasing lport 31300652-e6db-4684-9385-9a9bdfcee2c5 from this chassis (sb_readonly=0)
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.206 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.360 226239 DEBUG nova.compute.manager [req-fedf3512-2da2-491d-9297-950fb12aa4b9 req-fb1531bb-33fc-4ad4-a40c-f4790bad608c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-vif-plugged-f859815f-0923-45c1-a84d-2a128fb7fd57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.361 226239 DEBUG oslo_concurrency.lockutils [req-fedf3512-2da2-491d-9297-950fb12aa4b9 req-fb1531bb-33fc-4ad4-a40c-f4790bad608c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e9903ecf-c775-4e84-8997-361061869fc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.361 226239 DEBUG oslo_concurrency.lockutils [req-fedf3512-2da2-491d-9297-950fb12aa4b9 req-fb1531bb-33fc-4ad4-a40c-f4790bad608c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.362 226239 DEBUG oslo_concurrency.lockutils [req-fedf3512-2da2-491d-9297-950fb12aa4b9 req-fb1531bb-33fc-4ad4-a40c-f4790bad608c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.362 226239 DEBUG nova.compute.manager [req-fedf3512-2da2-491d-9297-950fb12aa4b9 req-fb1531bb-33fc-4ad4-a40c-f4790bad608c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] No waiting events found dispatching network-vif-plugged-f859815f-0923-45c1-a84d-2a128fb7fd57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.362 226239 WARNING nova.compute.manager [req-fedf3512-2da2-491d-9297-950fb12aa4b9 req-fb1531bb-33fc-4ad4-a40c-f4790bad608c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received unexpected event network-vif-plugged-f859815f-0923-45c1-a84d-2a128fb7fd57 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.375 226239 INFO nova.compute.manager [None req-e1b75b71-e4aa-44df-b242-4197036696a2 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Get console output#033[00m
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.379 270602 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 03:58:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e377 e377: 3 total, 3 up, 3 in
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.482 226239 DEBUG nova.network.neutron [-] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.500 226239 INFO nova.compute.manager [-] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Took 1.63 seconds to deallocate network for instance.#033[00m
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.563 226239 DEBUG nova.compute.manager [req-a5f377ee-891b-434e-8871-033467acfea7 req-04b3ed0b-6c20-4561-b854-6446afb79f54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Received event network-vif-deleted-f859815f-0923-45c1-a84d-2a128fb7fd57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:17 np0005603623 nova_compute[226235]: 2026-01-31 08:58:17.958 226239 INFO nova.compute.manager [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Took 0.46 seconds to detach 1 volumes for instance.#033[00m
Jan 31 03:58:18 np0005603623 nova_compute[226235]: 2026-01-31 08:58:18.083 226239 INFO nova.compute.manager [None req-db4cedb4-7fb9-42ae-a5cd-179687fcc7e3 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Get console output#033[00m
Jan 31 03:58:18 np0005603623 nova_compute[226235]: 2026-01-31 08:58:18.087 270602 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 03:58:18 np0005603623 nova_compute[226235]: 2026-01-31 08:58:18.142 226239 DEBUG oslo_concurrency.lockutils [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:18 np0005603623 nova_compute[226235]: 2026-01-31 08:58:18.142 226239 DEBUG oslo_concurrency.lockutils [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:18 np0005603623 nova_compute[226235]: 2026-01-31 08:58:18.206 226239 DEBUG oslo_concurrency.processutils [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:58:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/19215101' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:58:18 np0005603623 nova_compute[226235]: 2026-01-31 08:58:18.624 226239 DEBUG oslo_concurrency.processutils [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:18 np0005603623 nova_compute[226235]: 2026-01-31 08:58:18.631 226239 DEBUG nova.compute.provider_tree [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:58:18 np0005603623 nova_compute[226235]: 2026-01-31 08:58:18.655 226239 DEBUG nova.scheduler.client.report [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:58:18 np0005603623 nova_compute[226235]: 2026-01-31 08:58:18.687 226239 DEBUG oslo_concurrency.lockutils [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.544s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:18 np0005603623 nova_compute[226235]: 2026-01-31 08:58:18.727 226239 INFO nova.scheduler.client.report [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Deleted allocations for instance e9903ecf-c775-4e84-8997-361061869fc6#033[00m
Jan 31 03:58:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:18.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:18 np0005603623 nova_compute[226235]: 2026-01-31 08:58:18.842 226239 DEBUG oslo_concurrency.lockutils [None req-1855051c-b48c-498c-b490-12c8b53cb2c8 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "e9903ecf-c775-4e84-8997-361061869fc6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.451s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:18.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.350 226239 DEBUG nova.compute.manager [req-251a6dda-a6b3-4482-8cdd-cad7c59683aa req-32dc1cc8-6008-4e61-ad45-3991a63fad0a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Received event network-changed-7f888fb7-e22f-4012-8f50-9248df3a9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.351 226239 DEBUG nova.compute.manager [req-251a6dda-a6b3-4482-8cdd-cad7c59683aa req-32dc1cc8-6008-4e61-ad45-3991a63fad0a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Refreshing instance network info cache due to event network-changed-7f888fb7-e22f-4012-8f50-9248df3a9eac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.351 226239 DEBUG oslo_concurrency.lockutils [req-251a6dda-a6b3-4482-8cdd-cad7c59683aa req-32dc1cc8-6008-4e61-ad45-3991a63fad0a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-2450f89f-bcd8-4bab-8fbd-ae73ab968552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.351 226239 DEBUG oslo_concurrency.lockutils [req-251a6dda-a6b3-4482-8cdd-cad7c59683aa req-32dc1cc8-6008-4e61-ad45-3991a63fad0a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-2450f89f-bcd8-4bab-8fbd-ae73ab968552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.351 226239 DEBUG nova.network.neutron [req-251a6dda-a6b3-4482-8cdd-cad7c59683aa req-32dc1cc8-6008-4e61-ad45-3991a63fad0a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Refreshing network info cache for port 7f888fb7-e22f-4012-8f50-9248df3a9eac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.448 226239 DEBUG oslo_concurrency.lockutils [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.449 226239 DEBUG oslo_concurrency.lockutils [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.449 226239 DEBUG oslo_concurrency.lockutils [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.449 226239 DEBUG oslo_concurrency.lockutils [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.449 226239 DEBUG oslo_concurrency.lockutils [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.451 226239 INFO nova.compute.manager [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Terminating instance#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.452 226239 DEBUG nova.compute.manager [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:58:19 np0005603623 kernel: tap7f888fb7-e2 (unregistering): left promiscuous mode
Jan 31 03:58:19 np0005603623 NetworkManager[48970]: <info>  [1769849899.5113] device (tap7f888fb7-e2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:58:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:58:19Z|00790|binding|INFO|Releasing lport 7f888fb7-e22f-4012-8f50-9248df3a9eac from this chassis (sb_readonly=0)
Jan 31 03:58:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:58:19Z|00791|binding|INFO|Setting lport 7f888fb7-e22f-4012-8f50-9248df3a9eac down in Southbound
Jan 31 03:58:19 np0005603623 ovn_controller[133449]: 2026-01-31T08:58:19Z|00792|binding|INFO|Removing iface tap7f888fb7-e2 ovn-installed in OVS
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.517 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.525 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.535 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:b6:50 10.100.0.7'], port_security=['fa:16:3e:06:b6:50 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2450f89f-bcd8-4bab-8fbd-ae73ab968552', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80db08a4-2d34-453a-a239-a7ada660bee1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ccc60d81-26b0-49de-8e09-12a020a730d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=91e2daca-cac7-4f99-8d64-7b570d2bf474, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=7f888fb7-e22f-4012-8f50-9248df3a9eac) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.536 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 7f888fb7-e22f-4012-8f50-9248df3a9eac in datapath 80db08a4-2d34-453a-a239-a7ada660bee1 unbound from our chassis#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.538 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 80db08a4-2d34-453a-a239-a7ada660bee1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.538 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b1954b31-e8bb-4818-af86-f40e141103cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.539 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1 namespace which is not needed anymore#033[00m
Jan 31 03:58:19 np0005603623 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000bb.scope: Deactivated successfully.
Jan 31 03:58:19 np0005603623 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000bb.scope: Consumed 13.043s CPU time.
Jan 31 03:58:19 np0005603623 systemd-machined[194379]: Machine qemu-88-instance-000000bb terminated.
Jan 31 03:58:19 np0005603623 neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1[315458]: [NOTICE]   (315464) : haproxy version is 2.8.14-c23fe91
Jan 31 03:58:19 np0005603623 neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1[315458]: [NOTICE]   (315464) : path to executable is /usr/sbin/haproxy
Jan 31 03:58:19 np0005603623 neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1[315458]: [WARNING]  (315464) : Exiting Master process...
Jan 31 03:58:19 np0005603623 neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1[315458]: [ALERT]    (315464) : Current worker (315467) exited with code 143 (Terminated)
Jan 31 03:58:19 np0005603623 neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1[315458]: [WARNING]  (315464) : All workers exited. Exiting... (0)
Jan 31 03:58:19 np0005603623 systemd[1]: libpod-f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2.scope: Deactivated successfully.
Jan 31 03:58:19 np0005603623 conmon[315458]: conmon f4addd6a96ca7cc7389d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2.scope/container/memory.events
Jan 31 03:58:19 np0005603623 podman[315931]: 2026-01-31 08:58:19.638176229 +0000 UTC m=+0.037870129 container died f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:58:19 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2-userdata-shm.mount: Deactivated successfully.
Jan 31 03:58:19 np0005603623 systemd[1]: var-lib-containers-storage-overlay-c63a03c8e6add801ade5f83000b4f485e2b2283fc26c4ae43d84b6101ed250d7-merged.mount: Deactivated successfully.
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.704 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.708 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.711 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:19 np0005603623 podman[315931]: 2026-01-31 08:58:19.712196651 +0000 UTC m=+0.111890551 container cleanup f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 03:58:19 np0005603623 systemd[1]: libpod-conmon-f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2.scope: Deactivated successfully.
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.721 226239 INFO nova.virt.libvirt.driver [-] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Instance destroyed successfully.#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.721 226239 DEBUG nova.objects.instance [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'resources' on Instance uuid 2450f89f-bcd8-4bab-8fbd-ae73ab968552 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:58:19 np0005603623 podman[315968]: 2026-01-31 08:58:19.759554347 +0000 UTC m=+0.032259133 container remove f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.762 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dcbb2e01-5dc5-46af-bf68-69e93727fb4f]: (4, ('Sat Jan 31 08:58:19 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1 (f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2)\nf4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2\nSat Jan 31 08:58:19 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1 (f4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2)\nf4addd6a96ca7cc7389dc4b6345107c10395564f83817ce18fdab4d9e6dc38a2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.765 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c7001a5e-08d2-4a7e-8f06-12c9012e619d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.765 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80db08a4-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.767 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:19 np0005603623 kernel: tap80db08a4-20: left promiscuous mode
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.772 226239 DEBUG nova.virt.libvirt.vif [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:57:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-72318832',display_name='tempest-TestNetworkBasicOps-server-72318832',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-72318832',id=187,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBf/CcSGfenjrIJY2a7u0nsdG9tlN4s/UTheBCPSx6l9SRae2pY2uByjcTYgzee1XtVV9LklbY40NCV7PMEvPHBNIU59v4AYC+T6/bE+p78+IC5i8eS8hJXzPuTEiCLLlQ==',key_name='tempest-TestNetworkBasicOps-1980659074',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:57:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-sq07ng9b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:57:58Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=2450f89f-bcd8-4bab-8fbd-ae73ab968552,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "address": "fa:16:3e:06:b6:50", "network": {"id": "80db08a4-2d34-453a-a239-a7ada660bee1", "bridge": "br-int", "label": "tempest-network-smoke--1433603680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f888fb7-e2", "ovs_interfaceid": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.773 226239 DEBUG nova.network.os_vif_util [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "address": "fa:16:3e:06:b6:50", "network": {"id": "80db08a4-2d34-453a-a239-a7ada660bee1", "bridge": "br-int", "label": "tempest-network-smoke--1433603680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f888fb7-e2", "ovs_interfaceid": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.774 226239 DEBUG nova.network.os_vif_util [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:b6:50,bridge_name='br-int',has_traffic_filtering=True,id=7f888fb7-e22f-4012-8f50-9248df3a9eac,network=Network(80db08a4-2d34-453a-a239-a7ada660bee1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f888fb7-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.774 226239 DEBUG os_vif [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:b6:50,bridge_name='br-int',has_traffic_filtering=True,id=7f888fb7-e22f-4012-8f50-9248df3a9eac,network=Network(80db08a4-2d34-453a-a239-a7ada660bee1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f888fb7-e2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.775 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.776 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0e5b73be-8702-4ac5-8093-7c7c396cd3e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.777 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f888fb7-e2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.778 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.779 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.779 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:19 np0005603623 nova_compute[226235]: 2026-01-31 08:58:19.782 226239 INFO os_vif [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:b6:50,bridge_name='br-int',has_traffic_filtering=True,id=7f888fb7-e22f-4012-8f50-9248df3a9eac,network=Network(80db08a4-2d34-453a-a239-a7ada660bee1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f888fb7-e2')#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.790 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d7b01db9-40ab-46ef-8bd3-79c9daaa7df4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.793 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[94aecadc-48cf-4274-b9fc-3fc1ea878e82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.804 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2539c50a-0e57-4a1c-b6f4-e49b803dea0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 901821, 'reachable_time': 19761, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 316004, 'error': None, 'target': 'ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.806 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-80db08a4-2d34-453a-a239-a7ada660bee1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:58:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:19.806 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[58be239f-089e-4155-ad99-7af890d39ad1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:19 np0005603623 systemd[1]: run-netns-ovnmeta\x2d80db08a4\x2d2d34\x2d453a\x2da239\x2da7ada660bee1.mount: Deactivated successfully.
Jan 31 03:58:20 np0005603623 nova_compute[226235]: 2026-01-31 08:58:20.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:20 np0005603623 nova_compute[226235]: 2026-01-31 08:58:20.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:58:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:58:20 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:58:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:20.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:20.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.067 226239 INFO nova.virt.libvirt.driver [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Deleting instance files /var/lib/nova/instances/2450f89f-bcd8-4bab-8fbd-ae73ab968552_del#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.068 226239 INFO nova.virt.libvirt.driver [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Deletion of /var/lib/nova/instances/2450f89f-bcd8-4bab-8fbd-ae73ab968552_del complete#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.153 226239 INFO nova.compute.manager [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Took 1.70 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.154 226239 DEBUG oslo.service.loopingcall [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.154 226239 DEBUG nova.compute.manager [-] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.155 226239 DEBUG nova.network.neutron [-] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.326 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.547 226239 DEBUG nova.compute.manager [req-9462ca53-2d58-4617-964e-002a04bf551f req-299b36fa-826d-44f4-bb03-2abf7804a379 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Received event network-vif-unplugged-7f888fb7-e22f-4012-8f50-9248df3a9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.547 226239 DEBUG oslo_concurrency.lockutils [req-9462ca53-2d58-4617-964e-002a04bf551f req-299b36fa-826d-44f4-bb03-2abf7804a379 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.548 226239 DEBUG oslo_concurrency.lockutils [req-9462ca53-2d58-4617-964e-002a04bf551f req-299b36fa-826d-44f4-bb03-2abf7804a379 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.548 226239 DEBUG oslo_concurrency.lockutils [req-9462ca53-2d58-4617-964e-002a04bf551f req-299b36fa-826d-44f4-bb03-2abf7804a379 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.548 226239 DEBUG nova.compute.manager [req-9462ca53-2d58-4617-964e-002a04bf551f req-299b36fa-826d-44f4-bb03-2abf7804a379 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] No waiting events found dispatching network-vif-unplugged-7f888fb7-e22f-4012-8f50-9248df3a9eac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.549 226239 DEBUG nova.compute.manager [req-9462ca53-2d58-4617-964e-002a04bf551f req-299b36fa-826d-44f4-bb03-2abf7804a379 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Received event network-vif-unplugged-7f888fb7-e22f-4012-8f50-9248df3a9eac for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.549 226239 DEBUG nova.compute.manager [req-9462ca53-2d58-4617-964e-002a04bf551f req-299b36fa-826d-44f4-bb03-2abf7804a379 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Received event network-vif-plugged-7f888fb7-e22f-4012-8f50-9248df3a9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.549 226239 DEBUG oslo_concurrency.lockutils [req-9462ca53-2d58-4617-964e-002a04bf551f req-299b36fa-826d-44f4-bb03-2abf7804a379 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.549 226239 DEBUG oslo_concurrency.lockutils [req-9462ca53-2d58-4617-964e-002a04bf551f req-299b36fa-826d-44f4-bb03-2abf7804a379 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.550 226239 DEBUG oslo_concurrency.lockutils [req-9462ca53-2d58-4617-964e-002a04bf551f req-299b36fa-826d-44f4-bb03-2abf7804a379 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.550 226239 DEBUG nova.compute.manager [req-9462ca53-2d58-4617-964e-002a04bf551f req-299b36fa-826d-44f4-bb03-2abf7804a379 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] No waiting events found dispatching network-vif-plugged-7f888fb7-e22f-4012-8f50-9248df3a9eac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:58:21 np0005603623 nova_compute[226235]: 2026-01-31 08:58:21.550 226239 WARNING nova.compute.manager [req-9462ca53-2d58-4617-964e-002a04bf551f req-299b36fa-826d-44f4-bb03-2abf7804a379 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Received unexpected event network-vif-plugged-7f888fb7-e22f-4012-8f50-9248df3a9eac for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:58:22 np0005603623 nova_compute[226235]: 2026-01-31 08:58:22.102 226239 DEBUG nova.network.neutron [req-251a6dda-a6b3-4482-8cdd-cad7c59683aa req-32dc1cc8-6008-4e61-ad45-3991a63fad0a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Updated VIF entry in instance network info cache for port 7f888fb7-e22f-4012-8f50-9248df3a9eac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:58:22 np0005603623 nova_compute[226235]: 2026-01-31 08:58:22.102 226239 DEBUG nova.network.neutron [req-251a6dda-a6b3-4482-8cdd-cad7c59683aa req-32dc1cc8-6008-4e61-ad45-3991a63fad0a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Updating instance_info_cache with network_info: [{"id": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "address": "fa:16:3e:06:b6:50", "network": {"id": "80db08a4-2d34-453a-a239-a7ada660bee1", "bridge": "br-int", "label": "tempest-network-smoke--1433603680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f888fb7-e2", "ovs_interfaceid": "7f888fb7-e22f-4012-8f50-9248df3a9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:58:22 np0005603623 nova_compute[226235]: 2026-01-31 08:58:22.161 226239 DEBUG oslo_concurrency.lockutils [req-251a6dda-a6b3-4482-8cdd-cad7c59683aa req-32dc1cc8-6008-4e61-ad45-3991a63fad0a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-2450f89f-bcd8-4bab-8fbd-ae73ab968552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:58:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:58:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:22.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:58:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:22.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:23 np0005603623 nova_compute[226235]: 2026-01-31 08:58:23.163 226239 DEBUG nova.network.neutron [-] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:58:23 np0005603623 nova_compute[226235]: 2026-01-31 08:58:23.269 226239 INFO nova.compute.manager [-] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Took 2.11 seconds to deallocate network for instance.#033[00m
Jan 31 03:58:23 np0005603623 nova_compute[226235]: 2026-01-31 08:58:23.311 226239 DEBUG nova.compute.manager [req-597fd752-c2eb-45f8-a612-fdbebbca681c req-a8ef9c49-20bb-499b-8a45-3acba630a16e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Received event network-vif-deleted-7f888fb7-e22f-4012-8f50-9248df3a9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:23 np0005603623 nova_compute[226235]: 2026-01-31 08:58:23.311 226239 INFO nova.compute.manager [req-597fd752-c2eb-45f8-a612-fdbebbca681c req-a8ef9c49-20bb-499b-8a45-3acba630a16e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Neutron deleted interface 7f888fb7-e22f-4012-8f50-9248df3a9eac; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:58:23 np0005603623 nova_compute[226235]: 2026-01-31 08:58:23.312 226239 DEBUG nova.network.neutron [req-597fd752-c2eb-45f8-a612-fdbebbca681c req-a8ef9c49-20bb-499b-8a45-3acba630a16e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:58:23 np0005603623 nova_compute[226235]: 2026-01-31 08:58:23.356 226239 DEBUG nova.compute.manager [req-597fd752-c2eb-45f8-a612-fdbebbca681c req-a8ef9c49-20bb-499b-8a45-3acba630a16e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Detach interface failed, port_id=7f888fb7-e22f-4012-8f50-9248df3a9eac, reason: Instance 2450f89f-bcd8-4bab-8fbd-ae73ab968552 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:58:23 np0005603623 nova_compute[226235]: 2026-01-31 08:58:23.441 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:23 np0005603623 nova_compute[226235]: 2026-01-31 08:58:23.508 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:23 np0005603623 nova_compute[226235]: 2026-01-31 08:58:23.638 226239 DEBUG oslo_concurrency.lockutils [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:23 np0005603623 nova_compute[226235]: 2026-01-31 08:58:23.638 226239 DEBUG oslo_concurrency.lockutils [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:23 np0005603623 nova_compute[226235]: 2026-01-31 08:58:23.717 226239 DEBUG oslo_concurrency.processutils [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:58:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2100569456' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:58:24 np0005603623 nova_compute[226235]: 2026-01-31 08:58:24.105 226239 DEBUG oslo_concurrency.processutils [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.388s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:24 np0005603623 nova_compute[226235]: 2026-01-31 08:58:24.110 226239 DEBUG nova.compute.provider_tree [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:58:24 np0005603623 nova_compute[226235]: 2026-01-31 08:58:24.249 226239 DEBUG nova.scheduler.client.report [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:58:24 np0005603623 nova_compute[226235]: 2026-01-31 08:58:24.330 226239 DEBUG oslo_concurrency.lockutils [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:24 np0005603623 nova_compute[226235]: 2026-01-31 08:58:24.523 226239 INFO nova.scheduler.client.report [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Deleted allocations for instance 2450f89f-bcd8-4bab-8fbd-ae73ab968552#033[00m
Jan 31 03:58:24 np0005603623 nova_compute[226235]: 2026-01-31 08:58:24.778 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:24.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:24 np0005603623 nova_compute[226235]: 2026-01-31 08:58:24.873 226239 DEBUG oslo_concurrency.lockutils [None req-bd520578-36ac-45f7-b862-929b3d9210e8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "2450f89f-bcd8-4bab-8fbd-ae73ab968552" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.424s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:24.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:26.248 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:58:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:26.249 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:58:26 np0005603623 nova_compute[226235]: 2026-01-31 08:58:26.249 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:26 np0005603623 nova_compute[226235]: 2026-01-31 08:58:26.327 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:58:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:58:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:58:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:26.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:58:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:26.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:28 np0005603623 nova_compute[226235]: 2026-01-31 08:58:28.157 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:28.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:28.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:29 np0005603623 nova_compute[226235]: 2026-01-31 08:58:29.627 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849894.6255364, e9903ecf-c775-4e84-8997-361061869fc6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:58:29 np0005603623 nova_compute[226235]: 2026-01-31 08:58:29.627 226239 INFO nova.compute.manager [-] [instance: e9903ecf-c775-4e84-8997-361061869fc6] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:58:29 np0005603623 nova_compute[226235]: 2026-01-31 08:58:29.651 226239 DEBUG nova.compute.manager [None req-d503b264-8f9c-44f5-b453-0ae8c442cd0c - - - - - -] [instance: e9903ecf-c775-4e84-8997-361061869fc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:58:29 np0005603623 nova_compute[226235]: 2026-01-31 08:58:29.780 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:30.150 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:30.150 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:30.150 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:30.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:30.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:31 np0005603623 nova_compute[226235]: 2026-01-31 08:58:31.328 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:58:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4054324948' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:58:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:58:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4054324948' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:58:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:32.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:32.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:34 np0005603623 nova_compute[226235]: 2026-01-31 08:58:34.719 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849899.7178771, 2450f89f-bcd8-4bab-8fbd-ae73ab968552 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:58:34 np0005603623 nova_compute[226235]: 2026-01-31 08:58:34.719 226239 INFO nova.compute.manager [-] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:58:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:58:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3374387525' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:58:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:58:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3374387525' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:58:34 np0005603623 nova_compute[226235]: 2026-01-31 08:58:34.754 226239 DEBUG nova.compute.manager [None req-34e77563-a4ee-43b1-8fcf-0aaa83ccb55c - - - - - -] [instance: 2450f89f-bcd8-4bab-8fbd-ae73ab968552] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:58:34 np0005603623 nova_compute[226235]: 2026-01-31 08:58:34.781 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:34.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:34.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:36 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:58:36.251 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:36 np0005603623 nova_compute[226235]: 2026-01-31 08:58:36.331 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:36.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:36.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:38.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e378 e378: 3 total, 3 up, 3 in
Jan 31 03:58:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:38.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:39 np0005603623 nova_compute[226235]: 2026-01-31 08:58:39.782 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:58:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:40.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:58:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:40.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:41 np0005603623 nova_compute[226235]: 2026-01-31 08:58:41.332 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:58:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:42.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:58:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:42.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:44 np0005603623 nova_compute[226235]: 2026-01-31 08:58:44.783 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:44.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:44.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:44 np0005603623 podman[316277]: 2026-01-31 08:58:44.95120728 +0000 UTC m=+0.047482600 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:58:45 np0005603623 podman[316278]: 2026-01-31 08:58:45.048503472 +0000 UTC m=+0.144544255 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:58:46 np0005603623 nova_compute[226235]: 2026-01-31 08:58:46.334 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:46.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:46.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e379 e379: 3 total, 3 up, 3 in
Jan 31 03:58:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:48.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:48.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:49 np0005603623 nova_compute[226235]: 2026-01-31 08:58:49.785 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:50.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:50.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:51 np0005603623 nova_compute[226235]: 2026-01-31 08:58:51.336 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:52.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:52.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:54 np0005603623 nova_compute[226235]: 2026-01-31 08:58:54.786 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:54.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:54.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:56 np0005603623 nova_compute[226235]: 2026-01-31 08:58:56.386 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:56.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:56.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:58.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:58:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:58.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:59 np0005603623 nova_compute[226235]: 2026-01-31 08:58:59.787 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:00.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:59:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:00.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:59:01 np0005603623 nova_compute[226235]: 2026-01-31 08:59:01.389 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:02.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:59:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:02.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:59:04 np0005603623 nova_compute[226235]: 2026-01-31 08:59:04.787 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:04.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:04.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:06 np0005603623 nova_compute[226235]: 2026-01-31 08:59:06.392 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:06.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:06.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:08.009 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:59:08 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:08.010 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:59:08 np0005603623 nova_compute[226235]: 2026-01-31 08:59:08.010 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:08 np0005603623 nova_compute[226235]: 2026-01-31 08:59:08.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:08 np0005603623 nova_compute[226235]: 2026-01-31 08:59:08.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:59:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:08.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:08.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:09.012 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:09 np0005603623 nova_compute[226235]: 2026-01-31 08:59:09.789 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:59:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:10.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:59:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:10.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:11 np0005603623 nova_compute[226235]: 2026-01-31 08:59:11.393 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:59:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:12.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:59:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:12.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #160. Immutable memtables: 0.
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.232397) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 101] Flushing memtable with next log file: 160
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849953232497, "job": 101, "event": "flush_started", "num_memtables": 1, "num_entries": 2411, "num_deletes": 253, "total_data_size": 5417849, "memory_usage": 5521984, "flush_reason": "Manual Compaction"}
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 101] Level-0 flush table #161: started
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849953292082, "cf_name": "default", "job": 101, "event": "table_file_creation", "file_number": 161, "file_size": 3539115, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 77127, "largest_seqno": 79533, "table_properties": {"data_size": 3529531, "index_size": 5949, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 21069, "raw_average_key_size": 20, "raw_value_size": 3509874, "raw_average_value_size": 3458, "num_data_blocks": 258, "num_entries": 1015, "num_filter_entries": 1015, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849761, "oldest_key_time": 1769849761, "file_creation_time": 1769849953, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 161, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 101] Flush lasted 59762 microseconds, and 6000 cpu microseconds.
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.292156) [db/flush_job.cc:967] [default] [JOB 101] Level-0 flush table #161: 3539115 bytes OK
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.292176) [db/memtable_list.cc:519] [default] Level-0 commit table #161 started
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.419624) [db/memtable_list.cc:722] [default] Level-0 commit table #161: memtable #1 done
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.419670) EVENT_LOG_v1 {"time_micros": 1769849953419660, "job": 101, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.419691) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 101] Try to delete WAL files size 5407151, prev total WAL file size 5407151, number of live WAL files 2.
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000157.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.420996) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036373737' seq:72057594037927935, type:22 .. '7061786F730037303239' seq:0, type:0; will stop at (end)
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 102] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 101 Base level 0, inputs: [161(3456KB)], [159(10179KB)]
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849953421815, "job": 102, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [161], "files_L6": [159], "score": -1, "input_data_size": 13962699, "oldest_snapshot_seqno": -1}
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 102] Generated table #162: 10002 keys, 12047959 bytes, temperature: kUnknown
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849953555324, "cf_name": "default", "job": 102, "event": "table_file_creation", "file_number": 162, "file_size": 12047959, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11984885, "index_size": 36977, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25029, "raw_key_size": 263265, "raw_average_key_size": 26, "raw_value_size": 11811513, "raw_average_value_size": 1180, "num_data_blocks": 1404, "num_entries": 10002, "num_filter_entries": 10002, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769849953, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 162, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.555564) [db/compaction/compaction_job.cc:1663] [default] [JOB 102] Compacted 1@0 + 1@6 files to L6 => 12047959 bytes
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.572086) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 104.5 rd, 90.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 9.9 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(7.3) write-amplify(3.4) OK, records in: 10529, records dropped: 527 output_compression: NoCompression
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.572118) EVENT_LOG_v1 {"time_micros": 1769849953572105, "job": 102, "event": "compaction_finished", "compaction_time_micros": 133561, "compaction_time_cpu_micros": 23715, "output_level": 6, "num_output_files": 1, "total_output_size": 12047959, "num_input_records": 10529, "num_output_records": 10002, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000161.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849953572653, "job": 102, "event": "table_file_deletion", "file_number": 161}
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000159.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849953573576, "job": 102, "event": "table_file_deletion", "file_number": 159}
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.420796) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.573626) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.573631) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.573633) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.573635) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:59:13 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-08:59:13.573637) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:59:14 np0005603623 nova_compute[226235]: 2026-01-31 08:59:14.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:14 np0005603623 nova_compute[226235]: 2026-01-31 08:59:14.188 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:14 np0005603623 nova_compute[226235]: 2026-01-31 08:59:14.188 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:14 np0005603623 nova_compute[226235]: 2026-01-31 08:59:14.188 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:14 np0005603623 nova_compute[226235]: 2026-01-31 08:59:14.188 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:59:14 np0005603623 nova_compute[226235]: 2026-01-31 08:59:14.188 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:59:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3491461315' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:59:14 np0005603623 nova_compute[226235]: 2026-01-31 08:59:14.611 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:14 np0005603623 nova_compute[226235]: 2026-01-31 08:59:14.736 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:59:14 np0005603623 nova_compute[226235]: 2026-01-31 08:59:14.738 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4226MB free_disk=20.946487426757812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:59:14 np0005603623 nova_compute[226235]: 2026-01-31 08:59:14.738 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:14 np0005603623 nova_compute[226235]: 2026-01-31 08:59:14.738 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:14 np0005603623 nova_compute[226235]: 2026-01-31 08:59:14.789 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:14.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:14.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:15 np0005603623 nova_compute[226235]: 2026-01-31 08:59:15.218 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:59:15 np0005603623 nova_compute[226235]: 2026-01-31 08:59:15.218 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:59:15 np0005603623 nova_compute[226235]: 2026-01-31 08:59:15.389 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:59:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2742732422' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:59:15 np0005603623 nova_compute[226235]: 2026-01-31 08:59:15.797 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:15 np0005603623 nova_compute[226235]: 2026-01-31 08:59:15.801 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:59:15 np0005603623 nova_compute[226235]: 2026-01-31 08:59:15.875 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:59:15 np0005603623 podman[316429]: 2026-01-31 08:59:15.954091253 +0000 UTC m=+0.045374095 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 31 03:59:15 np0005603623 podman[316430]: 2026-01-31 08:59:15.972276213 +0000 UTC m=+0.063354768 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:59:15 np0005603623 nova_compute[226235]: 2026-01-31 08:59:15.989 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:59:15 np0005603623 nova_compute[226235]: 2026-01-31 08:59:15.989 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:16 np0005603623 nova_compute[226235]: 2026-01-31 08:59:16.395 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:59:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:16.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:59:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:16.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:16 np0005603623 nova_compute[226235]: 2026-01-31 08:59:16.989 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:16 np0005603623 nova_compute[226235]: 2026-01-31 08:59:16.990 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:59:16 np0005603623 nova_compute[226235]: 2026-01-31 08:59:16.990 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:59:17 np0005603623 nova_compute[226235]: 2026-01-31 08:59:17.070 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:59:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:59:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:18.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:59:18 np0005603623 nova_compute[226235]: 2026-01-31 08:59:18.950 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:18 np0005603623 nova_compute[226235]: 2026-01-31 08:59:18.951 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:18 np0005603623 nova_compute[226235]: 2026-01-31 08:59:18.974 226239 DEBUG nova.compute.manager [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:59:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:18.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:19 np0005603623 nova_compute[226235]: 2026-01-31 08:59:19.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:19 np0005603623 nova_compute[226235]: 2026-01-31 08:59:19.199 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:19 np0005603623 nova_compute[226235]: 2026-01-31 08:59:19.200 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:19 np0005603623 nova_compute[226235]: 2026-01-31 08:59:19.209 226239 DEBUG nova.virt.hardware [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:59:19 np0005603623 nova_compute[226235]: 2026-01-31 08:59:19.209 226239 INFO nova.compute.claims [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 03:59:19 np0005603623 nova_compute[226235]: 2026-01-31 08:59:19.388 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:19 np0005603623 nova_compute[226235]: 2026-01-31 08:59:19.791 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:59:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/762735647' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:59:19 np0005603623 nova_compute[226235]: 2026-01-31 08:59:19.810 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:19 np0005603623 nova_compute[226235]: 2026-01-31 08:59:19.815 226239 DEBUG nova.compute.provider_tree [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:59:19 np0005603623 nova_compute[226235]: 2026-01-31 08:59:19.847 226239 DEBUG nova.scheduler.client.report [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:59:19 np0005603623 nova_compute[226235]: 2026-01-31 08:59:19.905 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:19 np0005603623 nova_compute[226235]: 2026-01-31 08:59:19.906 226239 DEBUG nova.compute.manager [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.007 226239 DEBUG nova.compute.manager [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.008 226239 DEBUG nova.network.neutron [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.039 226239 INFO nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.059 226239 DEBUG nova.compute.manager [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.304 226239 DEBUG nova.compute.manager [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.305 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.306 226239 INFO nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Creating image(s)#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.329 226239 DEBUG nova.storage.rbd_utils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] rbd image 3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.352 226239 DEBUG nova.storage.rbd_utils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] rbd image 3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.377 226239 DEBUG nova.storage.rbd_utils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] rbd image 3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.384 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.436 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.437 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.438 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.439 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.464 226239 DEBUG nova.storage.rbd_utils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] rbd image 3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.468 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.485 226239 DEBUG nova.policy [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3859f52c5b70471097d1e4ffa75ecc0e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1f293713f6854265a89a1a4a002088d5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.737 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.269s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.822 226239 DEBUG nova.storage.rbd_utils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] resizing rbd image 3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:59:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:59:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:20.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.970 226239 DEBUG nova.objects.instance [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lazy-loading 'migration_context' on Instance uuid 3f0d401f-df22-424f-b572-4eb9ab2df0f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:59:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:20.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.998 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.999 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Ensure instance console log exists: /var/lib/nova/instances/3f0d401f-df22-424f-b572-4eb9ab2df0f4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:59:20 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.999 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:21 np0005603623 nova_compute[226235]: 2026-01-31 08:59:20.999 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:21 np0005603623 nova_compute[226235]: 2026-01-31 08:59:21.000 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:21 np0005603623 nova_compute[226235]: 2026-01-31 08:59:21.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:21 np0005603623 nova_compute[226235]: 2026-01-31 08:59:21.397 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:22 np0005603623 nova_compute[226235]: 2026-01-31 08:59:22.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:22 np0005603623 nova_compute[226235]: 2026-01-31 08:59:22.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:22 np0005603623 nova_compute[226235]: 2026-01-31 08:59:22.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:22 np0005603623 ovn_controller[133449]: 2026-01-31T08:59:22Z|00793|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:59:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:22.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:59:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:22.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:59:23 np0005603623 nova_compute[226235]: 2026-01-31 08:59:23.389 226239 DEBUG nova.network.neutron [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Successfully created port: 8cd49adc-5281-4272-9c97-e9121d662fff _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:59:24 np0005603623 nova_compute[226235]: 2026-01-31 08:59:24.793 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:59:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:24.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:59:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:24.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:26 np0005603623 nova_compute[226235]: 2026-01-31 08:59:26.295 226239 DEBUG nova.network.neutron [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Successfully updated port: 8cd49adc-5281-4272-9c97-e9121d662fff _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:59:26 np0005603623 nova_compute[226235]: 2026-01-31 08:59:26.342 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:59:26 np0005603623 nova_compute[226235]: 2026-01-31 08:59:26.342 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquired lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:59:26 np0005603623 nova_compute[226235]: 2026-01-31 08:59:26.343 226239 DEBUG nova.network.neutron [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:59:26 np0005603623 nova_compute[226235]: 2026-01-31 08:59:26.398 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:26 np0005603623 nova_compute[226235]: 2026-01-31 08:59:26.793 226239 DEBUG nova.network.neutron [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:59:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:26.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:27.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.265 226239 DEBUG nova.compute.manager [req-b353f312-e091-4a16-839d-445b1beb7604 req-455502f8-01fc-4fde-ad49-f6b6aa41a15e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Received event network-changed-8cd49adc-5281-4272-9c97-e9121d662fff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.266 226239 DEBUG nova.compute.manager [req-b353f312-e091-4a16-839d-445b1beb7604 req-455502f8-01fc-4fde-ad49-f6b6aa41a15e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Refreshing instance network info cache due to event network-changed-8cd49adc-5281-4272-9c97-e9121d662fff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.266 226239 DEBUG oslo_concurrency.lockutils [req-b353f312-e091-4a16-839d-445b1beb7604 req-455502f8-01fc-4fde-ad49-f6b6aa41a15e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.355 226239 DEBUG nova.network.neutron [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Updating instance_info_cache with network_info: [{"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.568 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Releasing lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.568 226239 DEBUG nova.compute.manager [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Instance network_info: |[{"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.569 226239 DEBUG oslo_concurrency.lockutils [req-b353f312-e091-4a16-839d-445b1beb7604 req-455502f8-01fc-4fde-ad49-f6b6aa41a15e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.569 226239 DEBUG nova.network.neutron [req-b353f312-e091-4a16-839d-445b1beb7604 req-455502f8-01fc-4fde-ad49-f6b6aa41a15e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Refreshing network info cache for port 8cd49adc-5281-4272-9c97-e9121d662fff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.572 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Start _get_guest_xml network_info=[{"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.577 226239 WARNING nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.585 226239 DEBUG nova.virt.libvirt.host [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.586 226239 DEBUG nova.virt.libvirt.host [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.592 226239 DEBUG nova.virt.libvirt.host [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.593 226239 DEBUG nova.virt.libvirt.host [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.594 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.594 226239 DEBUG nova.virt.hardware [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.595 226239 DEBUG nova.virt.hardware [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.595 226239 DEBUG nova.virt.hardware [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.595 226239 DEBUG nova.virt.hardware [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.596 226239 DEBUG nova.virt.hardware [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.596 226239 DEBUG nova.virt.hardware [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.596 226239 DEBUG nova.virt.hardware [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.597 226239 DEBUG nova.virt.hardware [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.597 226239 DEBUG nova.virt.hardware [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.597 226239 DEBUG nova.virt.hardware [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.597 226239 DEBUG nova.virt.hardware [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:59:28 np0005603623 nova_compute[226235]: 2026-01-31 08:59:28.600 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:28.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:29.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:59:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2143885191' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:59:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:59:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:59:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.135 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.166 226239 DEBUG nova.storage.rbd_utils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] rbd image 3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.170 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:59:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3206800752' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.582 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.584 226239 DEBUG nova.virt.libvirt.vif [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:59:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-273568541',display_name='tempest-TestShelveInstance-server-273568541',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-273568541',id=190,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMSJMgmfnDpAAhB+HaJLz9QSwipQmTsA86IiQRxWFaiZVFUeEfcIK6d3P3mBAHHd/rEKxm6Cw/JZh8tqOgCxKABbrDqL+FM2acHOfaAtltHep9oak+RawMJvZFvKOagynQ==',key_name='tempest-TestShelveInstance-1861761101',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1f293713f6854265a89a1a4a002088d5',ramdisk_id='',reservation_id='r-179fgz65',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1813478377',owner_user_name='tempest-TestShelveInstance-1813478377-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:59:20Z,user_data=None,user_id='3859f52c5b70471097d1e4ffa75ecc0e',uuid=3f0d401f-df22-424f-b572-4eb9ab2df0f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.585 226239 DEBUG nova.network.os_vif_util [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Converting VIF {"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.585 226239 DEBUG nova.network.os_vif_util [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:a5:77,bridge_name='br-int',has_traffic_filtering=True,id=8cd49adc-5281-4272-9c97-e9121d662fff,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cd49adc-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.587 226239 DEBUG nova.objects.instance [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f0d401f-df22-424f-b572-4eb9ab2df0f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.671 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  <uuid>3f0d401f-df22-424f-b572-4eb9ab2df0f4</uuid>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  <name>instance-000000be</name>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestShelveInstance-server-273568541</nova:name>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 08:59:28</nova:creationTime>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <nova:user uuid="3859f52c5b70471097d1e4ffa75ecc0e">tempest-TestShelveInstance-1813478377-project-member</nova:user>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <nova:project uuid="1f293713f6854265a89a1a4a002088d5">tempest-TestShelveInstance-1813478377</nova:project>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <nova:port uuid="8cd49adc-5281-4272-9c97-e9121d662fff">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <system>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <entry name="serial">3f0d401f-df22-424f-b572-4eb9ab2df0f4</entry>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <entry name="uuid">3f0d401f-df22-424f-b572-4eb9ab2df0f4</entry>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    </system>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  <os>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  </os>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  <features>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  </features>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  </clock>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  <devices>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk.config">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      </source>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      </auth>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    </disk>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:ef:a5:77"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <target dev="tap8cd49adc-52"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    </interface>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/3f0d401f-df22-424f-b572-4eb9ab2df0f4/console.log" append="off"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    </serial>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <video>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    </video>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    </rng>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 03:59:29 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 03:59:29 np0005603623 nova_compute[226235]:  </devices>
Jan 31 03:59:29 np0005603623 nova_compute[226235]: </domain>
Jan 31 03:59:29 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.672 226239 DEBUG nova.compute.manager [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Preparing to wait for external event network-vif-plugged-8cd49adc-5281-4272-9c97-e9121d662fff prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.672 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.673 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.673 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.674 226239 DEBUG nova.virt.libvirt.vif [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:59:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-273568541',display_name='tempest-TestShelveInstance-server-273568541',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-273568541',id=190,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMSJMgmfnDpAAhB+HaJLz9QSwipQmTsA86IiQRxWFaiZVFUeEfcIK6d3P3mBAHHd/rEKxm6Cw/JZh8tqOgCxKABbrDqL+FM2acHOfaAtltHep9oak+RawMJvZFvKOagynQ==',key_name='tempest-TestShelveInstance-1861761101',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1f293713f6854265a89a1a4a002088d5',ramdisk_id='',reservation_id='r-179fgz65',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1813478377',owner_user_name='tempest-TestShelveInstance-1813478377-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:59:20Z,user_data=None,user_id='3859f52c5b70471097d1e4ffa75ecc0e',uuid=3f0d401f-df22-424f-b572-4eb9ab2df0f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.674 226239 DEBUG nova.network.os_vif_util [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Converting VIF {"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.675 226239 DEBUG nova.network.os_vif_util [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:a5:77,bridge_name='br-int',has_traffic_filtering=True,id=8cd49adc-5281-4272-9c97-e9121d662fff,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cd49adc-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.675 226239 DEBUG os_vif [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:a5:77,bridge_name='br-int',has_traffic_filtering=True,id=8cd49adc-5281-4272-9c97-e9121d662fff,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cd49adc-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.676 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.676 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.676 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.680 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.680 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8cd49adc-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.680 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8cd49adc-52, col_values=(('external_ids', {'iface-id': '8cd49adc-5281-4272-9c97-e9121d662fff', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:a5:77', 'vm-uuid': '3f0d401f-df22-424f-b572-4eb9ab2df0f4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.682 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:29 np0005603623 NetworkManager[48970]: <info>  [1769849969.6829] manager: (tap8cd49adc-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/370)
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.684 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.690 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.691 226239 INFO os_vif [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:a5:77,bridge_name='br-int',has_traffic_filtering=True,id=8cd49adc-5281-4272-9c97-e9121d662fff,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cd49adc-52')#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.909 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.910 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.910 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] No VIF found with MAC fa:16:3e:ef:a5:77, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.911 226239 INFO nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Using config drive#033[00m
Jan 31 03:59:29 np0005603623 nova_compute[226235]: 2026-01-31 08:59:29.949 226239 DEBUG nova.storage.rbd_utils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] rbd image 3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:30.151 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:30.151 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:30.151 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:59:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:59:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:30.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:31.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:31 np0005603623 nova_compute[226235]: 2026-01-31 08:59:31.397 226239 INFO nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Creating config drive at /var/lib/nova/instances/3f0d401f-df22-424f-b572-4eb9ab2df0f4/disk.config#033[00m
Jan 31 03:59:31 np0005603623 nova_compute[226235]: 2026-01-31 08:59:31.401 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f0d401f-df22-424f-b572-4eb9ab2df0f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmph_1jl7ps execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:31 np0005603623 nova_compute[226235]: 2026-01-31 08:59:31.447 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:31 np0005603623 nova_compute[226235]: 2026-01-31 08:59:31.535 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f0d401f-df22-424f-b572-4eb9ab2df0f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmph_1jl7ps" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:31 np0005603623 nova_compute[226235]: 2026-01-31 08:59:31.568 226239 DEBUG nova.storage.rbd_utils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] rbd image 3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:31 np0005603623 nova_compute[226235]: 2026-01-31 08:59:31.571 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3f0d401f-df22-424f-b572-4eb9ab2df0f4/disk.config 3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:31 np0005603623 nova_compute[226235]: 2026-01-31 08:59:31.961 226239 DEBUG oslo_concurrency.processutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3f0d401f-df22-424f-b572-4eb9ab2df0f4/disk.config 3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.389s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:31 np0005603623 nova_compute[226235]: 2026-01-31 08:59:31.961 226239 INFO nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Deleting local config drive /var/lib/nova/instances/3f0d401f-df22-424f-b572-4eb9ab2df0f4/disk.config because it was imported into RBD.#033[00m
Jan 31 03:59:32 np0005603623 NetworkManager[48970]: <info>  [1769849972.0118] manager: (tap8cd49adc-52): new Tun device (/org/freedesktop/NetworkManager/Devices/371)
Jan 31 03:59:32 np0005603623 kernel: tap8cd49adc-52: entered promiscuous mode
Jan 31 03:59:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:59:32Z|00794|binding|INFO|Claiming lport 8cd49adc-5281-4272-9c97-e9121d662fff for this chassis.
Jan 31 03:59:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:59:32Z|00795|binding|INFO|8cd49adc-5281-4272-9c97-e9121d662fff: Claiming fa:16:3e:ef:a5:77 10.100.0.4
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.013 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.016 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.020 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:32 np0005603623 systemd-udevd[316982]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.039 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:a5:77 10.100.0.4'], port_security=['fa:16:3e:ef:a5:77 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3f0d401f-df22-424f-b572-4eb9ab2df0f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f293713f6854265a89a1a4a002088d5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cc6e8c6f-bb2f-4c15-bc84-a1452e5550a6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05ac80f4-66e3-4e8c-b69d-f2f58ada92e8, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=8cd49adc-5281-4272-9c97-e9121d662fff) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.040 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 8cd49adc-5281-4272-9c97-e9121d662fff in datapath 1c62fa1c-f7d2-4937-9258-1d3a4456b207 bound to our chassis#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.043 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1c62fa1c-f7d2-4937-9258-1d3a4456b207#033[00m
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.045 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:59:32Z|00796|binding|INFO|Setting lport 8cd49adc-5281-4272-9c97-e9121d662fff ovn-installed in OVS
Jan 31 03:59:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:59:32Z|00797|binding|INFO|Setting lport 8cd49adc-5281-4272-9c97-e9121d662fff up in Southbound
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.048 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:32 np0005603623 systemd-machined[194379]: New machine qemu-89-instance-000000be.
Jan 31 03:59:32 np0005603623 NetworkManager[48970]: <info>  [1769849972.0536] device (tap8cd49adc-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:59:32 np0005603623 NetworkManager[48970]: <info>  [1769849972.0544] device (tap8cd49adc-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.053 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[87dfff19-3d05-4175-8ee5-9adae6c64ae1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.055 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1c62fa1c-f1 in ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.057 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1c62fa1c-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.058 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[780f7e60-5b21-407a-8951-0c4738d40a59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.058 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e3988a83-e809-41fb-9036-92370dead559]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 systemd[1]: Started Virtual Machine qemu-89-instance-000000be.
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.071 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[ed97b4ea-1b56-4b10-9df0-c067743b859e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.086 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5d73ebf1-01cc-4a05-ad65-61d2e76467a2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.105 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[5d2cbfc4-59e8-4cf7-ace6-1494f6151d7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.109 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6d1edc-90e4-4d10-8322-f6d2df26df73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 NetworkManager[48970]: <info>  [1769849972.1103] manager: (tap1c62fa1c-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/372)
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.131 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[bff7cd92-db54-450c-8f33-f6b483c213ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.134 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[26958000-6eb0-487e-be80-9bfb88ff9c64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 NetworkManager[48970]: <info>  [1769849972.1513] device (tap1c62fa1c-f0): carrier: link connected
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.154 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4d7036f8-264e-46a4-9ca2-041e4b5251cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.166 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b7f572-52e9-4251-b9a6-3a71db10d6f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1c62fa1c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:15:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 911258, 'reachable_time': 37782, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 317017, 'error': None, 'target': 'ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.178 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b78eaf44-7770-4db1-90be-d703d860f605]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:1552'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 911258, 'tstamp': 911258}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 317018, 'error': None, 'target': 'ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.190 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cc268671-7f90-4807-adbe-32fb2d3de16a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1c62fa1c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:15:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 911258, 'reachable_time': 37782, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 317019, 'error': None, 'target': 'ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.209 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e2d5ce36-fb22-4ab8-98e4-eb10e6abe310]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.245 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[982b74b4-ae09-474c-be00-ad3e00795bb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.247 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c62fa1c-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.247 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.249 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c62fa1c-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:32 np0005603623 NetworkManager[48970]: <info>  [1769849972.2517] manager: (tap1c62fa1c-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/373)
Jan 31 03:59:32 np0005603623 kernel: tap1c62fa1c-f0: entered promiscuous mode
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.252 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.253 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1c62fa1c-f0, col_values=(('external_ids', {'iface-id': '46e41546-aa3b-4838-b2c2-ba3b46cf445c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:32 np0005603623 ovn_controller[133449]: 2026-01-31T08:59:32Z|00798|binding|INFO|Releasing lport 46e41546-aa3b-4838-b2c2-ba3b46cf445c from this chassis (sb_readonly=0)
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.260 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.260 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1c62fa1c-f7d2-4937-9258-1d3a4456b207.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1c62fa1c-f7d2-4937-9258-1d3a4456b207.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.261 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0f47821b-0102-4ea4-9366-e854c7775f86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.261 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-1c62fa1c-f7d2-4937-9258-1d3a4456b207
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/1c62fa1c-f7d2-4937-9258-1d3a4456b207.pid.haproxy
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 1c62fa1c-f7d2-4937-9258-1d3a4456b207
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 31 03:59:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 08:59:32.262 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'env', 'PROCESS_TAG=haproxy-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1c62fa1c-f7d2-4937-9258-1d3a4456b207.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.294 226239 DEBUG nova.network.neutron [req-b353f312-e091-4a16-839d-445b1beb7604 req-455502f8-01fc-4fde-ad49-f6b6aa41a15e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Updated VIF entry in instance network info cache for port 8cd49adc-5281-4272-9c97-e9121d662fff. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.294 226239 DEBUG nova.network.neutron [req-b353f312-e091-4a16-839d-445b1beb7604 req-455502f8-01fc-4fde-ad49-f6b6aa41a15e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Updating instance_info_cache with network_info: [{"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:59:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e380 e380: 3 total, 3 up, 3 in
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.414 226239 DEBUG oslo_concurrency.lockutils [req-b353f312-e091-4a16-839d-445b1beb7604 req-455502f8-01fc-4fde-ad49-f6b6aa41a15e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.424 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849972.4241512, 3f0d401f-df22-424f-b572-4eb9ab2df0f4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.425 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] VM Started (Lifecycle Event)
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.565 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.571 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849972.4243345, 3f0d401f-df22-424f-b572-4eb9ab2df0f4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.571 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] VM Paused (Lifecycle Event)
Jan 31 03:59:32 np0005603623 podman[317094]: 2026-01-31 08:59:32.573064631 +0000 UTC m=+0.051966281 container create f8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:59:32 np0005603623 systemd[1]: Started libpod-conmon-f8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7.scope.
Jan 31 03:59:32 np0005603623 systemd[1]: Started libcrun container.
Jan 31 03:59:32 np0005603623 podman[317094]: 2026-01-31 08:59:32.542366368 +0000 UTC m=+0.021268068 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:59:32 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/229bfcb246ae29f166ff1394a49a39824bf4627c012581bd9d1bf07461aa0624/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:59:32 np0005603623 podman[317094]: 2026-01-31 08:59:32.645234525 +0000 UTC m=+0.124136195 container init f8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 03:59:32 np0005603623 podman[317094]: 2026-01-31 08:59:32.649246851 +0000 UTC m=+0.128148501 container start f8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:59:32 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[317109]: [NOTICE]   (317113) : New worker (317115) forked
Jan 31 03:59:32 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[317109]: [NOTICE]   (317113) : Loading success.
Jan 31 03:59:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:32.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.926 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.930 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:59:32 np0005603623 nova_compute[226235]: 2026-01-31 08:59:32.977 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:59:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:33.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.146 226239 DEBUG nova.compute.manager [req-3ba70a1e-590d-4e88-8bb4-32390e61dd23 req-09f1f120-0f39-4ec9-b704-0f6fed27c8b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Received event network-vif-plugged-8cd49adc-5281-4272-9c97-e9121d662fff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.146 226239 DEBUG oslo_concurrency.lockutils [req-3ba70a1e-590d-4e88-8bb4-32390e61dd23 req-09f1f120-0f39-4ec9-b704-0f6fed27c8b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.147 226239 DEBUG oslo_concurrency.lockutils [req-3ba70a1e-590d-4e88-8bb4-32390e61dd23 req-09f1f120-0f39-4ec9-b704-0f6fed27c8b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.147 226239 DEBUG oslo_concurrency.lockutils [req-3ba70a1e-590d-4e88-8bb4-32390e61dd23 req-09f1f120-0f39-4ec9-b704-0f6fed27c8b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.147 226239 DEBUG nova.compute.manager [req-3ba70a1e-590d-4e88-8bb4-32390e61dd23 req-09f1f120-0f39-4ec9-b704-0f6fed27c8b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Processing event network-vif-plugged-8cd49adc-5281-4272-9c97-e9121d662fff _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.148 226239 DEBUG nova.compute.manager [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.150 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769849973.150769, 3f0d401f-df22-424f-b572-4eb9ab2df0f4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.151 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] VM Resumed (Lifecycle Event)
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.152 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.155 226239 INFO nova.virt.libvirt.driver [-] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Instance spawned successfully.
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.155 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.195 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.198 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.210 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.211 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.211 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.212 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.212 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.213 226239 DEBUG nova.virt.libvirt.driver [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.441 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.776 226239 INFO nova.compute.manager [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Took 13.47 seconds to spawn the instance on the hypervisor.
Jan 31 03:59:33 np0005603623 nova_compute[226235]: 2026-01-31 08:59:33.777 226239 DEBUG nova.compute.manager [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:59:34 np0005603623 nova_compute[226235]: 2026-01-31 08:59:34.086 226239 INFO nova.compute.manager [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Took 15.01 seconds to build instance.
Jan 31 03:59:34 np0005603623 nova_compute[226235]: 2026-01-31 08:59:34.170 226239 DEBUG oslo_concurrency.lockutils [None req-4e918d84-d729-43e9-b2ce-bed82ca8041f 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:59:34 np0005603623 nova_compute[226235]: 2026-01-31 08:59:34.683 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:59:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:34.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:35.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:35 np0005603623 nova_compute[226235]: 2026-01-31 08:59:35.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:59:35 np0005603623 nova_compute[226235]: 2026-01-31 08:59:35.420 226239 DEBUG nova.compute.manager [req-afbdccf9-e930-4ecd-a00a-0f866319e30b req-1647fe3e-2746-492e-b070-75749f064841 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Received event network-vif-plugged-8cd49adc-5281-4272-9c97-e9121d662fff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:59:35 np0005603623 nova_compute[226235]: 2026-01-31 08:59:35.421 226239 DEBUG oslo_concurrency.lockutils [req-afbdccf9-e930-4ecd-a00a-0f866319e30b req-1647fe3e-2746-492e-b070-75749f064841 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:59:35 np0005603623 nova_compute[226235]: 2026-01-31 08:59:35.421 226239 DEBUG oslo_concurrency.lockutils [req-afbdccf9-e930-4ecd-a00a-0f866319e30b req-1647fe3e-2746-492e-b070-75749f064841 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:59:35 np0005603623 nova_compute[226235]: 2026-01-31 08:59:35.421 226239 DEBUG oslo_concurrency.lockutils [req-afbdccf9-e930-4ecd-a00a-0f866319e30b req-1647fe3e-2746-492e-b070-75749f064841 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:59:35 np0005603623 nova_compute[226235]: 2026-01-31 08:59:35.422 226239 DEBUG nova.compute.manager [req-afbdccf9-e930-4ecd-a00a-0f866319e30b req-1647fe3e-2746-492e-b070-75749f064841 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] No waiting events found dispatching network-vif-plugged-8cd49adc-5281-4272-9c97-e9121d662fff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:59:35 np0005603623 nova_compute[226235]: 2026-01-31 08:59:35.422 226239 WARNING nova.compute.manager [req-afbdccf9-e930-4ecd-a00a-0f866319e30b req-1647fe3e-2746-492e-b070-75749f064841 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Received unexpected event network-vif-plugged-8cd49adc-5281-4272-9c97-e9121d662fff for instance with vm_state active and task_state None.
Jan 31 03:59:35 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:59:35 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:59:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:36 np0005603623 nova_compute[226235]: 2026-01-31 08:59:36.449 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:59:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 03:59:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:36.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 03:59:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:37.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:38.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:59:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:39.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:59:39 np0005603623 nova_compute[226235]: 2026-01-31 08:59:39.728 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:59:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:40.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:41.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:41 np0005603623 nova_compute[226235]: 2026-01-31 08:59:41.450 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:59:41 np0005603623 nova_compute[226235]: 2026-01-31 08:59:41.715 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:59:41 np0005603623 NetworkManager[48970]: <info>  [1769849981.7194] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/374)
Jan 31 03:59:41 np0005603623 NetworkManager[48970]: <info>  [1769849981.7207] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/375)
Jan 31 03:59:41 np0005603623 nova_compute[226235]: 2026-01-31 08:59:41.769 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:59:41 np0005603623 ovn_controller[133449]: 2026-01-31T08:59:41Z|00799|binding|INFO|Releasing lport 46e41546-aa3b-4838-b2c2-ba3b46cf445c from this chassis (sb_readonly=0)
Jan 31 03:59:41 np0005603623 nova_compute[226235]: 2026-01-31 08:59:41.788 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:59:42 np0005603623 nova_compute[226235]: 2026-01-31 08:59:42.749 226239 DEBUG nova.compute.manager [req-8bdf2a0e-bc2f-4986-b81b-b8e87901d2a4 req-887f4b72-bdb8-4cac-8ea9-5170d7f81edc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Received event network-changed-8cd49adc-5281-4272-9c97-e9121d662fff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:59:42 np0005603623 nova_compute[226235]: 2026-01-31 08:59:42.749 226239 DEBUG nova.compute.manager [req-8bdf2a0e-bc2f-4986-b81b-b8e87901d2a4 req-887f4b72-bdb8-4cac-8ea9-5170d7f81edc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Refreshing instance network info cache due to event network-changed-8cd49adc-5281-4272-9c97-e9121d662fff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:59:42 np0005603623 nova_compute[226235]: 2026-01-31 08:59:42.749 226239 DEBUG oslo_concurrency.lockutils [req-8bdf2a0e-bc2f-4986-b81b-b8e87901d2a4 req-887f4b72-bdb8-4cac-8ea9-5170d7f81edc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:59:42 np0005603623 nova_compute[226235]: 2026-01-31 08:59:42.750 226239 DEBUG oslo_concurrency.lockutils [req-8bdf2a0e-bc2f-4986-b81b-b8e87901d2a4 req-887f4b72-bdb8-4cac-8ea9-5170d7f81edc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:59:42 np0005603623 nova_compute[226235]: 2026-01-31 08:59:42.750 226239 DEBUG nova.network.neutron [req-8bdf2a0e-bc2f-4986-b81b-b8e87901d2a4 req-887f4b72-bdb8-4cac-8ea9-5170d7f81edc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Refreshing network info cache for port 8cd49adc-5281-4272-9c97-e9121d662fff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:59:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:59:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:42.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:59:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:43.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:44 np0005603623 nova_compute[226235]: 2026-01-31 08:59:44.730 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:59:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:44.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:45.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:45 np0005603623 nova_compute[226235]: 2026-01-31 08:59:45.495 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:46 np0005603623 nova_compute[226235]: 2026-01-31 08:59:46.491 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:46 np0005603623 nova_compute[226235]: 2026-01-31 08:59:46.611 226239 DEBUG nova.network.neutron [req-8bdf2a0e-bc2f-4986-b81b-b8e87901d2a4 req-887f4b72-bdb8-4cac-8ea9-5170d7f81edc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Updated VIF entry in instance network info cache for port 8cd49adc-5281-4272-9c97-e9121d662fff. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:59:46 np0005603623 nova_compute[226235]: 2026-01-31 08:59:46.612 226239 DEBUG nova.network.neutron [req-8bdf2a0e-bc2f-4986-b81b-b8e87901d2a4 req-887f4b72-bdb8-4cac-8ea9-5170d7f81edc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Updating instance_info_cache with network_info: [{"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:59:46 np0005603623 nova_compute[226235]: 2026-01-31 08:59:46.666 226239 DEBUG oslo_concurrency.lockutils [req-8bdf2a0e-bc2f-4986-b81b-b8e87901d2a4 req-887f4b72-bdb8-4cac-8ea9-5170d7f81edc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:59:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:46.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:46 np0005603623 podman[317232]: 2026-01-31 08:59:46.940141356 +0000 UTC m=+0.038954682 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:59:46 np0005603623 podman[317233]: 2026-01-31 08:59:46.965188972 +0000 UTC m=+0.060892081 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:59:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:47.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:59:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:48.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:59:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:49.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:59:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2724464930' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:59:49 np0005603623 nova_compute[226235]: 2026-01-31 08:59:49.779 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:59:50Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ef:a5:77 10.100.0.4
Jan 31 03:59:50 np0005603623 ovn_controller[133449]: 2026-01-31T08:59:50Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ef:a5:77 10.100.0.4
Jan 31 03:59:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:59:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:50.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:59:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:51.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:51 np0005603623 nova_compute[226235]: 2026-01-31 08:59:51.495 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:52 np0005603623 nova_compute[226235]: 2026-01-31 08:59:52.379 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 03:59:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:52.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 03:59:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:53.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:54 np0005603623 nova_compute[226235]: 2026-01-31 08:59:54.782 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:54.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:55.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:56 np0005603623 nova_compute[226235]: 2026-01-31 08:59:56.496 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:56.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:57.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:58.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 03:59:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:59.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:59 np0005603623 nova_compute[226235]: 2026-01-31 08:59:59.784 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:59 np0005603623 nova_compute[226235]: 2026-01-31 08:59:59.785 226239 DEBUG oslo_concurrency.lockutils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:59 np0005603623 nova_compute[226235]: 2026-01-31 08:59:59.786 226239 DEBUG oslo_concurrency.lockutils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:59 np0005603623 nova_compute[226235]: 2026-01-31 08:59:59.786 226239 INFO nova.compute.manager [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Shelving#033[00m
Jan 31 03:59:59 np0005603623 nova_compute[226235]: 2026-01-31 08:59:59.892 226239 DEBUG nova.virt.libvirt.driver [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 04:00:00 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 04:00:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:00.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:01.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:01 np0005603623 ovn_controller[133449]: 2026-01-31T09:00:01Z|00800|binding|INFO|Releasing lport 46e41546-aa3b-4838-b2c2-ba3b46cf445c from this chassis (sb_readonly=0)
Jan 31 04:00:01 np0005603623 nova_compute[226235]: 2026-01-31 09:00:01.278 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:01 np0005603623 nova_compute[226235]: 2026-01-31 09:00:01.498 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:02 np0005603623 kernel: tap8cd49adc-52 (unregistering): left promiscuous mode
Jan 31 04:00:02 np0005603623 NetworkManager[48970]: <info>  [1769850002.2883] device (tap8cd49adc-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:00:02 np0005603623 ovn_controller[133449]: 2026-01-31T09:00:02Z|00801|binding|INFO|Releasing lport 8cd49adc-5281-4272-9c97-e9121d662fff from this chassis (sb_readonly=0)
Jan 31 04:00:02 np0005603623 ovn_controller[133449]: 2026-01-31T09:00:02Z|00802|binding|INFO|Setting lport 8cd49adc-5281-4272-9c97-e9121d662fff down in Southbound
Jan 31 04:00:02 np0005603623 ovn_controller[133449]: 2026-01-31T09:00:02Z|00803|binding|INFO|Removing iface tap8cd49adc-52 ovn-installed in OVS
Jan 31 04:00:02 np0005603623 nova_compute[226235]: 2026-01-31 09:00:02.323 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:02 np0005603623 nova_compute[226235]: 2026-01-31 09:00:02.324 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:02 np0005603623 nova_compute[226235]: 2026-01-31 09:00:02.328 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:02 np0005603623 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000be.scope: Deactivated successfully.
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.363 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:a5:77 10.100.0.4'], port_security=['fa:16:3e:ef:a5:77 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3f0d401f-df22-424f-b572-4eb9ab2df0f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f293713f6854265a89a1a4a002088d5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cc6e8c6f-bb2f-4c15-bc84-a1452e5550a6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.233'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05ac80f4-66e3-4e8c-b69d-f2f58ada92e8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=8cd49adc-5281-4272-9c97-e9121d662fff) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:00:02 np0005603623 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000be.scope: Consumed 12.867s CPU time.
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.365 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 8cd49adc-5281-4272-9c97-e9121d662fff in datapath 1c62fa1c-f7d2-4937-9258-1d3a4456b207 unbound from our chassis#033[00m
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.366 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1c62fa1c-f7d2-4937-9258-1d3a4456b207, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:00:02 np0005603623 systemd-machined[194379]: Machine qemu-89-instance-000000be terminated.
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.367 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[42226c33-73e0-4d9a-9583-04978a2fa5c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.368 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207 namespace which is not needed anymore#033[00m
Jan 31 04:00:02 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[317109]: [NOTICE]   (317113) : haproxy version is 2.8.14-c23fe91
Jan 31 04:00:02 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[317109]: [NOTICE]   (317113) : path to executable is /usr/sbin/haproxy
Jan 31 04:00:02 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[317109]: [WARNING]  (317113) : Exiting Master process...
Jan 31 04:00:02 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[317109]: [ALERT]    (317113) : Current worker (317115) exited with code 143 (Terminated)
Jan 31 04:00:02 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[317109]: [WARNING]  (317113) : All workers exited. Exiting... (0)
Jan 31 04:00:02 np0005603623 systemd[1]: libpod-f8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7.scope: Deactivated successfully.
Jan 31 04:00:02 np0005603623 podman[317357]: 2026-01-31 09:00:02.533627988 +0000 UTC m=+0.089425667 container died f8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:00:02 np0005603623 nova_compute[226235]: 2026-01-31 09:00:02.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:02 np0005603623 nova_compute[226235]: 2026-01-31 09:00:02.539 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:02 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7-userdata-shm.mount: Deactivated successfully.
Jan 31 04:00:02 np0005603623 systemd[1]: var-lib-containers-storage-overlay-229bfcb246ae29f166ff1394a49a39824bf4627c012581bd9d1bf07461aa0624-merged.mount: Deactivated successfully.
Jan 31 04:00:02 np0005603623 podman[317357]: 2026-01-31 09:00:02.574411777 +0000 UTC m=+0.130209446 container cleanup f8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:00:02 np0005603623 systemd[1]: libpod-conmon-f8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7.scope: Deactivated successfully.
Jan 31 04:00:02 np0005603623 podman[317400]: 2026-01-31 09:00:02.625781629 +0000 UTC m=+0.036336971 container remove f8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.629 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9258500e-b922-4866-8fd6-28132ee032b9]: (4, ('Sat Jan 31 09:00:02 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207 (f8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7)\nf8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7\nSat Jan 31 09:00:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207 (f8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7)\nf8fdba7f924634a2157d30168823ba49d1b83913a086d607fb215e8c43ba7bf7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.631 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6cac3ccc-fad3-4a63-ad71-9e57178fe455]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.632 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c62fa1c-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:00:02 np0005603623 kernel: tap1c62fa1c-f0: left promiscuous mode
Jan 31 04:00:02 np0005603623 nova_compute[226235]: 2026-01-31 09:00:02.634 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:02 np0005603623 nova_compute[226235]: 2026-01-31 09:00:02.640 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.643 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9bff46-c258-43ce-b32e-af423ecfb9cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.660 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[759df2f9-a8aa-411f-98e1-a8c20ec9d8f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.662 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ee2df1d7-8641-49a2-83ae-e67f0f451e55]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.673 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[44c2a796-eb45-4a2a-bcbd-7af687d03f58]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 911253, 'reachable_time': 17930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 317419, 'error': None, 'target': 'ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.675 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:00:02 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:02.675 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[4b017f3e-73a1-4e4f-a3b8-395aeccf8b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:02 np0005603623 systemd[1]: run-netns-ovnmeta\x2d1c62fa1c\x2df7d2\x2d4937\x2d9258\x2d1d3a4456b207.mount: Deactivated successfully.
Jan 31 04:00:02 np0005603623 nova_compute[226235]: 2026-01-31 09:00:02.906 226239 INFO nova.virt.libvirt.driver [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 04:00:02 np0005603623 nova_compute[226235]: 2026-01-31 09:00:02.910 226239 INFO nova.virt.libvirt.driver [-] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Instance destroyed successfully.#033[00m
Jan 31 04:00:02 np0005603623 nova_compute[226235]: 2026-01-31 09:00:02.911 226239 DEBUG nova.objects.instance [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lazy-loading 'numa_topology' on Instance uuid 3f0d401f-df22-424f-b572-4eb9ab2df0f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:00:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:02.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:03.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:03 np0005603623 nova_compute[226235]: 2026-01-31 09:00:03.268 226239 DEBUG nova.compute.manager [req-35155237-2b82-4aaa-b44f-bf7eeb1f66b7 req-9df6b9d8-3ae9-45c6-88d3-304b8612ca56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Received event network-vif-unplugged-8cd49adc-5281-4272-9c97-e9121d662fff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:00:03 np0005603623 nova_compute[226235]: 2026-01-31 09:00:03.270 226239 DEBUG oslo_concurrency.lockutils [req-35155237-2b82-4aaa-b44f-bf7eeb1f66b7 req-9df6b9d8-3ae9-45c6-88d3-304b8612ca56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:03 np0005603623 nova_compute[226235]: 2026-01-31 09:00:03.270 226239 DEBUG oslo_concurrency.lockutils [req-35155237-2b82-4aaa-b44f-bf7eeb1f66b7 req-9df6b9d8-3ae9-45c6-88d3-304b8612ca56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:03 np0005603623 nova_compute[226235]: 2026-01-31 09:00:03.270 226239 DEBUG oslo_concurrency.lockutils [req-35155237-2b82-4aaa-b44f-bf7eeb1f66b7 req-9df6b9d8-3ae9-45c6-88d3-304b8612ca56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:03 np0005603623 nova_compute[226235]: 2026-01-31 09:00:03.270 226239 DEBUG nova.compute.manager [req-35155237-2b82-4aaa-b44f-bf7eeb1f66b7 req-9df6b9d8-3ae9-45c6-88d3-304b8612ca56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] No waiting events found dispatching network-vif-unplugged-8cd49adc-5281-4272-9c97-e9121d662fff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:00:03 np0005603623 nova_compute[226235]: 2026-01-31 09:00:03.270 226239 WARNING nova.compute.manager [req-35155237-2b82-4aaa-b44f-bf7eeb1f66b7 req-9df6b9d8-3ae9-45c6-88d3-304b8612ca56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Received unexpected event network-vif-unplugged-8cd49adc-5281-4272-9c97-e9121d662fff for instance with vm_state active and task_state shelving.#033[00m
Jan 31 04:00:03 np0005603623 nova_compute[226235]: 2026-01-31 09:00:03.627 226239 INFO nova.virt.libvirt.driver [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Beginning cold snapshot process#033[00m
Jan 31 04:00:04 np0005603623 nova_compute[226235]: 2026-01-31 09:00:04.465 226239 DEBUG nova.virt.libvirt.imagebackend [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 31 04:00:04 np0005603623 nova_compute[226235]: 2026-01-31 09:00:04.785 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:04.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:04 np0005603623 nova_compute[226235]: 2026-01-31 09:00:04.917 226239 DEBUG nova.storage.rbd_utils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] creating snapshot(6b1d982aa60c49d392d3212b6e5aa610) on rbd image(3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 04:00:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e381 e381: 3 total, 3 up, 3 in
Jan 31 04:00:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:05.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:05 np0005603623 nova_compute[226235]: 2026-01-31 09:00:05.119 226239 DEBUG nova.storage.rbd_utils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] cloning vms/3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk@6b1d982aa60c49d392d3212b6e5aa610 to images/297f46fc-627f-48b0-8a66-2e6f3dab7554 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 04:00:05 np0005603623 nova_compute[226235]: 2026-01-31 09:00:05.259 226239 DEBUG nova.storage.rbd_utils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] flattening images/297f46fc-627f-48b0-8a66-2e6f3dab7554 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 04:00:05 np0005603623 nova_compute[226235]: 2026-01-31 09:00:05.614 226239 DEBUG nova.compute.manager [req-761e1b97-02b1-4776-9af5-28552ee28985 req-022c7b26-1b0f-43a0-b427-1dc98217bacb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Received event network-vif-plugged-8cd49adc-5281-4272-9c97-e9121d662fff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:00:05 np0005603623 nova_compute[226235]: 2026-01-31 09:00:05.614 226239 DEBUG oslo_concurrency.lockutils [req-761e1b97-02b1-4776-9af5-28552ee28985 req-022c7b26-1b0f-43a0-b427-1dc98217bacb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:05 np0005603623 nova_compute[226235]: 2026-01-31 09:00:05.614 226239 DEBUG oslo_concurrency.lockutils [req-761e1b97-02b1-4776-9af5-28552ee28985 req-022c7b26-1b0f-43a0-b427-1dc98217bacb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:05 np0005603623 nova_compute[226235]: 2026-01-31 09:00:05.615 226239 DEBUG oslo_concurrency.lockutils [req-761e1b97-02b1-4776-9af5-28552ee28985 req-022c7b26-1b0f-43a0-b427-1dc98217bacb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:05 np0005603623 nova_compute[226235]: 2026-01-31 09:00:05.615 226239 DEBUG nova.compute.manager [req-761e1b97-02b1-4776-9af5-28552ee28985 req-022c7b26-1b0f-43a0-b427-1dc98217bacb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] No waiting events found dispatching network-vif-plugged-8cd49adc-5281-4272-9c97-e9121d662fff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:00:05 np0005603623 nova_compute[226235]: 2026-01-31 09:00:05.615 226239 WARNING nova.compute.manager [req-761e1b97-02b1-4776-9af5-28552ee28985 req-022c7b26-1b0f-43a0-b427-1dc98217bacb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Received unexpected event network-vif-plugged-8cd49adc-5281-4272-9c97-e9121d662fff for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 31 04:00:05 np0005603623 nova_compute[226235]: 2026-01-31 09:00:05.719 226239 DEBUG nova.storage.rbd_utils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] removing snapshot(6b1d982aa60c49d392d3212b6e5aa610) on rbd image(3f0d401f-df22-424f-b572-4eb9ab2df0f4_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 04:00:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e382 e382: 3 total, 3 up, 3 in
Jan 31 04:00:06 np0005603623 nova_compute[226235]: 2026-01-31 09:00:06.127 226239 DEBUG nova.storage.rbd_utils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] creating snapshot(snap) on rbd image(297f46fc-627f-48b0-8a66-2e6f3dab7554) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 04:00:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:06 np0005603623 nova_compute[226235]: 2026-01-31 09:00:06.499 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:06.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:07.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e383 e383: 3 total, 3 up, 3 in
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #163. Immutable memtables: 0.
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.442672) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 103] Flushing memtable with next log file: 163
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850007442727, "job": 103, "event": "flush_started", "num_memtables": 1, "num_entries": 877, "num_deletes": 255, "total_data_size": 1598519, "memory_usage": 1630760, "flush_reason": "Manual Compaction"}
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 103] Level-0 flush table #164: started
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850007451596, "cf_name": "default", "job": 103, "event": "table_file_creation", "file_number": 164, "file_size": 1054346, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 79538, "largest_seqno": 80410, "table_properties": {"data_size": 1050233, "index_size": 1828, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9388, "raw_average_key_size": 19, "raw_value_size": 1041785, "raw_average_value_size": 2161, "num_data_blocks": 79, "num_entries": 482, "num_filter_entries": 482, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849954, "oldest_key_time": 1769849954, "file_creation_time": 1769850007, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 164, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 103] Flush lasted 8956 microseconds, and 2929 cpu microseconds.
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.451636) [db/flush_job.cc:967] [default] [JOB 103] Level-0 flush table #164: 1054346 bytes OK
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.451651) [db/memtable_list.cc:519] [default] Level-0 commit table #164 started
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.453873) [db/memtable_list.cc:722] [default] Level-0 commit table #164: memtable #1 done
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.453888) EVENT_LOG_v1 {"time_micros": 1769850007453884, "job": 103, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.453904) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 103] Try to delete WAL files size 1593997, prev total WAL file size 1593997, number of live WAL files 2.
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000160.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.454300) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033303134' seq:72057594037927935, type:22 .. '6C6F676D0033323635' seq:0, type:0; will stop at (end)
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 104] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 103 Base level 0, inputs: [164(1029KB)], [162(11MB)]
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850007454339, "job": 104, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [164], "files_L6": [162], "score": -1, "input_data_size": 13102305, "oldest_snapshot_seqno": -1}
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 104] Generated table #165: 9956 keys, 12971387 bytes, temperature: kUnknown
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850007526933, "cf_name": "default", "job": 104, "event": "table_file_creation", "file_number": 165, "file_size": 12971387, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12907191, "index_size": 38227, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24901, "raw_key_size": 263316, "raw_average_key_size": 26, "raw_value_size": 12733241, "raw_average_value_size": 1278, "num_data_blocks": 1454, "num_entries": 9956, "num_filter_entries": 9956, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769850007, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 165, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.527173) [db/compaction/compaction_job.cc:1663] [default] [JOB 104] Compacted 1@0 + 1@6 files to L6 => 12971387 bytes
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.528501) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 180.3 rd, 178.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 11.5 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(24.7) write-amplify(12.3) OK, records in: 10484, records dropped: 528 output_compression: NoCompression
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.528521) EVENT_LOG_v1 {"time_micros": 1769850007528512, "job": 104, "event": "compaction_finished", "compaction_time_micros": 72657, "compaction_time_cpu_micros": 22433, "output_level": 6, "num_output_files": 1, "total_output_size": 12971387, "num_input_records": 10484, "num_output_records": 9956, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000164.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850007528698, "job": 104, "event": "table_file_deletion", "file_number": 164}
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000162.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850007529538, "job": 104, "event": "table_file_deletion", "file_number": 162}
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.454247) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.529679) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.529684) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.529686) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.529688) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:00:07 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:00:07.529689) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:00:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:07.561 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=82, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=81) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:00:07 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:07.561 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:00:07 np0005603623 nova_compute[226235]: 2026-01-31 09:00:07.561 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:08.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:09.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e384 e384: 3 total, 3 up, 3 in
Jan 31 04:00:09 np0005603623 nova_compute[226235]: 2026-01-31 09:00:09.788 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:10 np0005603623 nova_compute[226235]: 2026-01-31 09:00:10.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:10 np0005603623 nova_compute[226235]: 2026-01-31 09:00:10.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:00:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:10.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:11.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:11 np0005603623 nova_compute[226235]: 2026-01-31 09:00:11.213 226239 INFO nova.virt.libvirt.driver [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Snapshot image upload complete#033[00m
Jan 31 04:00:11 np0005603623 nova_compute[226235]: 2026-01-31 09:00:11.214 226239 DEBUG nova.compute.manager [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:00:11 np0005603623 nova_compute[226235]: 2026-01-31 09:00:11.400 226239 INFO nova.compute.manager [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Shelve offloading#033[00m
Jan 31 04:00:11 np0005603623 nova_compute[226235]: 2026-01-31 09:00:11.405 226239 INFO nova.virt.libvirt.driver [-] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Instance destroyed successfully.#033[00m
Jan 31 04:00:11 np0005603623 nova_compute[226235]: 2026-01-31 09:00:11.406 226239 DEBUG nova.compute.manager [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:00:11 np0005603623 nova_compute[226235]: 2026-01-31 09:00:11.407 226239 DEBUG oslo_concurrency.lockutils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:00:11 np0005603623 nova_compute[226235]: 2026-01-31 09:00:11.407 226239 DEBUG oslo_concurrency.lockutils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquired lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:00:11 np0005603623 nova_compute[226235]: 2026-01-31 09:00:11.408 226239 DEBUG nova.network.neutron [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:00:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e384 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:11 np0005603623 nova_compute[226235]: 2026-01-31 09:00:11.501 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e385 e385: 3 total, 3 up, 3 in
Jan 31 04:00:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:12.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:13.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:13 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:13.563 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '82'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:00:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:00:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/75591692' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:00:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:00:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/75591692' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:00:14 np0005603623 nova_compute[226235]: 2026-01-31 09:00:14.791 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:14.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:15.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:15 np0005603623 nova_compute[226235]: 2026-01-31 09:00:15.926 226239 DEBUG nova.network.neutron [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Updating instance_info_cache with network_info: [{"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:00:15 np0005603623 nova_compute[226235]: 2026-01-31 09:00:15.959 226239 DEBUG oslo_concurrency.lockutils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Releasing lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:00:16 np0005603623 nova_compute[226235]: 2026-01-31 09:00:16.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:16 np0005603623 nova_compute[226235]: 2026-01-31 09:00:16.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:00:16 np0005603623 nova_compute[226235]: 2026-01-31 09:00:16.156 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:00:16 np0005603623 nova_compute[226235]: 2026-01-31 09:00:16.193 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:00:16 np0005603623 nova_compute[226235]: 2026-01-31 09:00:16.194 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:00:16 np0005603623 nova_compute[226235]: 2026-01-31 09:00:16.194 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:00:16 np0005603623 nova_compute[226235]: 2026-01-31 09:00:16.194 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3f0d401f-df22-424f-b572-4eb9ab2df0f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:00:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:16 np0005603623 nova_compute[226235]: 2026-01-31 09:00:16.504 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:16.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:00:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:17.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:00:17 np0005603623 nova_compute[226235]: 2026-01-31 09:00:17.544 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850002.54329, 3f0d401f-df22-424f-b572-4eb9ab2df0f4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:00:17 np0005603623 nova_compute[226235]: 2026-01-31 09:00:17.545 226239 INFO nova.compute.manager [-] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:00:17 np0005603623 nova_compute[226235]: 2026-01-31 09:00:17.587 226239 DEBUG nova.compute.manager [None req-ebc0be8a-177e-49e4-95fa-ab093bb71fbc - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:00:17 np0005603623 nova_compute[226235]: 2026-01-31 09:00:17.590 226239 DEBUG nova.compute.manager [None req-ebc0be8a-177e-49e4-95fa-ab093bb71fbc - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved, current task_state: shelving_offloading, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:00:17 np0005603623 nova_compute[226235]: 2026-01-31 09:00:17.631 226239 INFO nova.compute.manager [None req-ebc0be8a-177e-49e4-95fa-ab093bb71fbc - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] During sync_power_state the instance has a pending task (shelving_offloading). Skip.#033[00m
Jan 31 04:00:18 np0005603623 podman[317568]: 2026-01-31 09:00:18.008303402 +0000 UTC m=+0.102798306 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:00:18 np0005603623 podman[317569]: 2026-01-31 09:00:18.012653378 +0000 UTC m=+0.105410948 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.298 226239 INFO nova.virt.libvirt.driver [-] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Instance destroyed successfully.#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.298 226239 DEBUG nova.objects.instance [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lazy-loading 'resources' on Instance uuid 3f0d401f-df22-424f-b572-4eb9ab2df0f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.331 226239 DEBUG nova.virt.libvirt.vif [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:59:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-273568541',display_name='tempest-TestShelveInstance-server-273568541',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-273568541',id=190,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMSJMgmfnDpAAhB+HaJLz9QSwipQmTsA86IiQRxWFaiZVFUeEfcIK6d3P3mBAHHd/rEKxm6Cw/JZh8tqOgCxKABbrDqL+FM2acHOfaAtltHep9oak+RawMJvZFvKOagynQ==',key_name='tempest-TestShelveInstance-1861761101',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:59:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1f293713f6854265a89a1a4a002088d5',ramdisk_id='',reservation_id='r-179fgz65',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1813478377',owner_user_name='tempest-TestShelveInstance-1813478377-project-member',shelved_at='2026-01-31T09:00:11.214123',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='297f46fc-627f-48b0-8a66-2e6f3dab7554'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:00:04Z,user_data=None,user_id='3859f52c5b70471097d1e4ffa75ecc0e',uuid=3f0d401f-df22-424f-b572-4eb9ab2df0f4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.332 226239 DEBUG nova.network.os_vif_util [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Converting VIF {"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.333 226239 DEBUG nova.network.os_vif_util [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:a5:77,bridge_name='br-int',has_traffic_filtering=True,id=8cd49adc-5281-4272-9c97-e9121d662fff,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cd49adc-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.333 226239 DEBUG os_vif [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:a5:77,bridge_name='br-int',has_traffic_filtering=True,id=8cd49adc-5281-4272-9c97-e9121d662fff,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cd49adc-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.336 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.337 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cd49adc-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.338 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.340 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.344 226239 INFO os_vif [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:a5:77,bridge_name='br-int',has_traffic_filtering=True,id=8cd49adc-5281-4272-9c97-e9121d662fff,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cd49adc-52')#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.614 226239 DEBUG nova.compute.manager [req-70b6eaad-e9c6-4447-a993-f23afd7e9d9b req-9c36956b-60f9-46fd-8d0d-cf1514f1b21a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Received event network-changed-8cd49adc-5281-4272-9c97-e9121d662fff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.615 226239 DEBUG nova.compute.manager [req-70b6eaad-e9c6-4447-a993-f23afd7e9d9b req-9c36956b-60f9-46fd-8d0d-cf1514f1b21a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Refreshing instance network info cache due to event network-changed-8cd49adc-5281-4272-9c97-e9121d662fff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.615 226239 DEBUG oslo_concurrency.lockutils [req-70b6eaad-e9c6-4447-a993-f23afd7e9d9b req-9c36956b-60f9-46fd-8d0d-cf1514f1b21a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.852 226239 INFO nova.virt.libvirt.driver [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Deleting instance files /var/lib/nova/instances/3f0d401f-df22-424f-b572-4eb9ab2df0f4_del#033[00m
Jan 31 04:00:18 np0005603623 nova_compute[226235]: 2026-01-31 09:00:18.853 226239 INFO nova.virt.libvirt.driver [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Deletion of /var/lib/nova/instances/3f0d401f-df22-424f-b572-4eb9ab2df0f4_del complete#033[00m
Jan 31 04:00:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:18.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.007 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Updating instance_info_cache with network_info: [{"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cd49adc-52", "ovs_interfaceid": "8cd49adc-5281-4272-9c97-e9121d662fff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.059 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.059 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.059 226239 DEBUG oslo_concurrency.lockutils [req-70b6eaad-e9c6-4447-a993-f23afd7e9d9b req-9c36956b-60f9-46fd-8d0d-cf1514f1b21a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.059 226239 DEBUG nova.network.neutron [req-70b6eaad-e9c6-4447-a993-f23afd7e9d9b req-9c36956b-60f9-46fd-8d0d-cf1514f1b21a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Refreshing network info cache for port 8cd49adc-5281-4272-9c97-e9121d662fff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:00:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:19.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.061 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.140 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.140 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.141 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.141 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.141 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.281 226239 INFO nova.scheduler.client.report [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Deleted allocations for instance 3f0d401f-df22-424f-b572-4eb9ab2df0f4#033[00m
Jan 31 04:00:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:00:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1197078461' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.554 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.691 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.693 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4188MB free_disk=20.80600357055664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.693 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:19 np0005603623 nova_compute[226235]: 2026-01-31 09:00:19.693 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:20 np0005603623 nova_compute[226235]: 2026-01-31 09:00:20.524 226239 DEBUG oslo_concurrency.lockutils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:20.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:21.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:21 np0005603623 nova_compute[226235]: 2026-01-31 09:00:21.510 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:21 np0005603623 nova_compute[226235]: 2026-01-31 09:00:21.546 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:00:21 np0005603623 nova_compute[226235]: 2026-01-31 09:00:21.546 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:00:21 np0005603623 nova_compute[226235]: 2026-01-31 09:00:21.622 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:00:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:00:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2527080538' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:00:22 np0005603623 nova_compute[226235]: 2026-01-31 09:00:22.044 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:00:22 np0005603623 nova_compute[226235]: 2026-01-31 09:00:22.049 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:00:22 np0005603623 nova_compute[226235]: 2026-01-31 09:00:22.078 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:00:22 np0005603623 nova_compute[226235]: 2026-01-31 09:00:22.154 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:00:22 np0005603623 nova_compute[226235]: 2026-01-31 09:00:22.155 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:22 np0005603623 nova_compute[226235]: 2026-01-31 09:00:22.155 226239 DEBUG oslo_concurrency.lockutils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 1.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:22 np0005603623 nova_compute[226235]: 2026-01-31 09:00:22.159 226239 DEBUG oslo_concurrency.lockutils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:22 np0005603623 nova_compute[226235]: 2026-01-31 09:00:22.249 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:22 np0005603623 nova_compute[226235]: 2026-01-31 09:00:22.249 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:22 np0005603623 nova_compute[226235]: 2026-01-31 09:00:22.288 226239 DEBUG oslo_concurrency.lockutils [None req-9258c8de-48b9-4e42-8c24-83affe947377 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "3f0d401f-df22-424f-b572-4eb9ab2df0f4" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 22.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:22.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:23.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:23 np0005603623 nova_compute[226235]: 2026-01-31 09:00:23.156 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:23 np0005603623 nova_compute[226235]: 2026-01-31 09:00:23.156 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:23 np0005603623 nova_compute[226235]: 2026-01-31 09:00:23.340 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:24 np0005603623 nova_compute[226235]: 2026-01-31 09:00:24.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:24.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:25.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:25 np0005603623 nova_compute[226235]: 2026-01-31 09:00:25.278 226239 DEBUG nova.network.neutron [req-70b6eaad-e9c6-4447-a993-f23afd7e9d9b req-9c36956b-60f9-46fd-8d0d-cf1514f1b21a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Updated VIF entry in instance network info cache for port 8cd49adc-5281-4272-9c97-e9121d662fff. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:00:25 np0005603623 nova_compute[226235]: 2026-01-31 09:00:25.278 226239 DEBUG nova.network.neutron [req-70b6eaad-e9c6-4447-a993-f23afd7e9d9b req-9c36956b-60f9-46fd-8d0d-cf1514f1b21a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3f0d401f-df22-424f-b572-4eb9ab2df0f4] Updating instance_info_cache with network_info: [{"id": "8cd49adc-5281-4272-9c97-e9121d662fff", "address": "fa:16:3e:ef:a5:77", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": null, "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap8cd49adc-52", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:00:25 np0005603623 nova_compute[226235]: 2026-01-31 09:00:25.568 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:26 np0005603623 nova_compute[226235]: 2026-01-31 09:00:26.025 226239 DEBUG oslo_concurrency.lockutils [req-70b6eaad-e9c6-4447-a993-f23afd7e9d9b req-9c36956b-60f9-46fd-8d0d-cf1514f1b21a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-3f0d401f-df22-424f-b572-4eb9ab2df0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:00:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:26 np0005603623 nova_compute[226235]: 2026-01-31 09:00:26.512 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:26.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:00:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:27.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:00:28 np0005603623 nova_compute[226235]: 2026-01-31 09:00:28.344 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:28.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:29.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:29 np0005603623 nova_compute[226235]: 2026-01-31 09:00:29.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:30.152 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:30.153 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:30.153 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:30.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:31.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:31 np0005603623 nova_compute[226235]: 2026-01-31 09:00:31.538 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:32.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:33.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:33 np0005603623 nova_compute[226235]: 2026-01-31 09:00:33.346 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:33 np0005603623 nova_compute[226235]: 2026-01-31 09:00:33.516 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:34.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:35.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:36 np0005603623 nova_compute[226235]: 2026-01-31 09:00:36.540 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:36.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:37.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:00:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:00:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:00:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:00:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:00:38 np0005603623 nova_compute[226235]: 2026-01-31 09:00:38.347 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:38.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:39.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e386 e386: 3 total, 3 up, 3 in
Jan 31 04:00:40 np0005603623 nova_compute[226235]: 2026-01-31 09:00:40.298 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:40 np0005603623 nova_compute[226235]: 2026-01-31 09:00:40.384 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:40.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:41.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:41 np0005603623 nova_compute[226235]: 2026-01-31 09:00:41.541 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e387 e387: 3 total, 3 up, 3 in
Jan 31 04:00:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:42.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:43.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:00:43 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:00:43 np0005603623 nova_compute[226235]: 2026-01-31 09:00:43.349 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:44.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:45.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e387 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:46 np0005603623 nova_compute[226235]: 2026-01-31 09:00:46.542 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:46.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:47.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 e388: 3 total, 3 up, 3 in
Jan 31 04:00:48 np0005603623 nova_compute[226235]: 2026-01-31 09:00:48.357 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:48 np0005603623 podman[317975]: 2026-01-31 09:00:48.951652368 +0000 UTC m=+0.048221704 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Jan 31 04:00:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:48.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:48 np0005603623 podman[317976]: 2026-01-31 09:00:48.970279292 +0000 UTC m=+0.066749335 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 04:00:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:49.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:50.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:51.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:51 np0005603623 nova_compute[226235]: 2026-01-31 09:00:51.574 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:51.593 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=83, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=82) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:00:51 np0005603623 nova_compute[226235]: 2026-01-31 09:00:51.593 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:00:51.594 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:00:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:52.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:00:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:53.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:00:53 np0005603623 nova_compute[226235]: 2026-01-31 09:00:53.358 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:54.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:55.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:56 np0005603623 nova_compute[226235]: 2026-01-31 09:00:56.576 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:00:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:56.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:00:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:57.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:58 np0005603623 nova_compute[226235]: 2026-01-31 09:00:58.361 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:00:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:58.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:00:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:00:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:59.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:00.596 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '83'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:01:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:00.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:01:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:01.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:01 np0005603623 nova_compute[226235]: 2026-01-31 09:01:01.577 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:02.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:01:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:03.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:01:03 np0005603623 nova_compute[226235]: 2026-01-31 09:01:03.363 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:04.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:05.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:06 np0005603623 nova_compute[226235]: 2026-01-31 09:01:06.580 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:06.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:07.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:08 np0005603623 nova_compute[226235]: 2026-01-31 09:01:08.365 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:08.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:09.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:10.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:11.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:11 np0005603623 nova_compute[226235]: 2026-01-31 09:01:11.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:11 np0005603623 nova_compute[226235]: 2026-01-31 09:01:11.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:01:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:11 np0005603623 nova_compute[226235]: 2026-01-31 09:01:11.581 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.105 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "357f617a-600c-48b2-b5b1-420475f37e61" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.105 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.142 226239 DEBUG nova.compute.manager [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.250 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.250 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.258 226239 DEBUG nova.virt.hardware [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.258 226239 INFO nova.compute.claims [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.419 226239 DEBUG oslo_concurrency.processutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:01:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3378383625' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.809 226239 DEBUG oslo_concurrency.processutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.391s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.815 226239 DEBUG nova.compute.provider_tree [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.837 226239 DEBUG nova.scheduler.client.report [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.965 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.714s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:12 np0005603623 nova_compute[226235]: 2026-01-31 09:01:12.965 226239 DEBUG nova.compute.manager [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:01:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:12.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.062 226239 DEBUG nova.compute.manager [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.063 226239 DEBUG nova.network.neutron [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.117 226239 INFO nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:01:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:13.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.173 226239 DEBUG nova.compute.manager [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.248 226239 INFO nova.virt.block_device [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Booting with volume e4cba400-cc7c-48f0-94b8-a77f724bc898 at /dev/vda#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.367 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.646 226239 DEBUG nova.policy [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc42b92a5dd34d32b6b184bdc7acb092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76ce367a834b49dfb5b436848118b860', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.649 226239 DEBUG os_brick.utils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.650 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.658 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.659 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[b7f69b0e-f4d3-48fc-bfaf-711a55aedca5]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.660 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.665 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.666 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[c50f9528-5f48-4652-b01b-c513a2d032b8]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.667 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.672 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.005s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.672 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[b4b947ce-1928-4216-a8e4-4d1807d7b29e]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.673 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf692d2-5135-4984-935c-626b09bb53e3]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.673 226239 DEBUG oslo_concurrency.processutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.691 226239 DEBUG oslo_concurrency.processutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "nvme version" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.693 226239 DEBUG os_brick.initiator.connectors.lightos [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.693 226239 DEBUG os_brick.initiator.connectors.lightos [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.693 226239 DEBUG os_brick.initiator.connectors.lightos [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.694 226239 DEBUG os_brick.utils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] <== get_connector_properties: return (44ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 04:01:13 np0005603623 nova_compute[226235]: 2026-01-31 09:01:13.694 226239 DEBUG nova.virt.block_device [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Updating existing volume attachment record: f083d67f-52b0-4ca2-941c-f61672efe91d _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 04:01:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:01:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/159240594' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:01:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:01:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/159240594' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:01:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:01:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:14.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:01:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:01:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:15.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:01:15 np0005603623 nova_compute[226235]: 2026-01-31 09:01:15.802 226239 DEBUG nova.network.neutron [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Successfully created port: 6e590395-62f6-48d5-a65a-5272c690c918 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.305 226239 DEBUG nova.compute.manager [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.306 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.307 226239 INFO nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Creating image(s)#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.307 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.307 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Ensure instance console log exists: /var/lib/nova/instances/357f617a-600c-48b2-b5b1-420475f37e61/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.308 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.308 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.309 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.336 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.337 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.337 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.404 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.405 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.405 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.405 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.405 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.582 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:01:16 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1560879371' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.803 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.958 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.959 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4216MB free_disk=20.921878814697266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.959 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:16 np0005603623 nova_compute[226235]: 2026-01-31 09:01:16.959 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:16.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:17 np0005603623 nova_compute[226235]: 2026-01-31 09:01:17.045 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 357f617a-600c-48b2-b5b1-420475f37e61 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:01:17 np0005603623 nova_compute[226235]: 2026-01-31 09:01:17.045 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:01:17 np0005603623 nova_compute[226235]: 2026-01-31 09:01:17.045 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:01:17 np0005603623 nova_compute[226235]: 2026-01-31 09:01:17.093 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:17.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:01:17 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3529204547' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:01:17 np0005603623 nova_compute[226235]: 2026-01-31 09:01:17.528 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:17 np0005603623 nova_compute[226235]: 2026-01-31 09:01:17.533 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:01:17 np0005603623 nova_compute[226235]: 2026-01-31 09:01:17.562 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:01:17 np0005603623 nova_compute[226235]: 2026-01-31 09:01:17.601 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:01:17 np0005603623 nova_compute[226235]: 2026-01-31 09:01:17.602 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:18 np0005603623 nova_compute[226235]: 2026-01-31 09:01:18.370 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:18.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:01:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:19.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:01:19 np0005603623 nova_compute[226235]: 2026-01-31 09:01:19.559 226239 DEBUG nova.network.neutron [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Successfully updated port: 6e590395-62f6-48d5-a65a-5272c690c918 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:01:19 np0005603623 nova_compute[226235]: 2026-01-31 09:01:19.584 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "refresh_cache-357f617a-600c-48b2-b5b1-420475f37e61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:01:19 np0005603623 nova_compute[226235]: 2026-01-31 09:01:19.585 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquired lock "refresh_cache-357f617a-600c-48b2-b5b1-420475f37e61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:01:19 np0005603623 nova_compute[226235]: 2026-01-31 09:01:19.585 226239 DEBUG nova.network.neutron [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:01:19 np0005603623 nova_compute[226235]: 2026-01-31 09:01:19.776 226239 DEBUG nova.compute.manager [req-4563ef9b-4fbb-45d1-afc8-5dcd030e52b4 req-9deafc8e-3d38-4996-98b6-4eaa6b3dac90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Received event network-changed-6e590395-62f6-48d5-a65a-5272c690c918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:01:19 np0005603623 nova_compute[226235]: 2026-01-31 09:01:19.777 226239 DEBUG nova.compute.manager [req-4563ef9b-4fbb-45d1-afc8-5dcd030e52b4 req-9deafc8e-3d38-4996-98b6-4eaa6b3dac90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Refreshing instance network info cache due to event network-changed-6e590395-62f6-48d5-a65a-5272c690c918. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:01:19 np0005603623 nova_compute[226235]: 2026-01-31 09:01:19.777 226239 DEBUG oslo_concurrency.lockutils [req-4563ef9b-4fbb-45d1-afc8-5dcd030e52b4 req-9deafc8e-3d38-4996-98b6-4eaa6b3dac90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-357f617a-600c-48b2-b5b1-420475f37e61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:01:19 np0005603623 podman[318194]: 2026-01-31 09:01:19.797190175 +0000 UTC m=+0.056590807 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:01:19 np0005603623 podman[318195]: 2026-01-31 09:01:19.862449822 +0000 UTC m=+0.118939292 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:01:19 np0005603623 nova_compute[226235]: 2026-01-31 09:01:19.963 226239 DEBUG nova.network.neutron [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:01:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:20.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:21.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:21 np0005603623 nova_compute[226235]: 2026-01-31 09:01:21.419 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:21 np0005603623 nova_compute[226235]: 2026-01-31 09:01:21.584 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.738 226239 DEBUG nova.network.neutron [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Updating instance_info_cache with network_info: [{"id": "6e590395-62f6-48d5-a65a-5272c690c918", "address": "fa:16:3e:7c:90:2a", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e590395-62", "ovs_interfaceid": "6e590395-62f6-48d5-a65a-5272c690c918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.889 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Releasing lock "refresh_cache-357f617a-600c-48b2-b5b1-420475f37e61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.890 226239 DEBUG nova.compute.manager [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Instance network_info: |[{"id": "6e590395-62f6-48d5-a65a-5272c690c918", "address": "fa:16:3e:7c:90:2a", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e590395-62", "ovs_interfaceid": "6e590395-62f6-48d5-a65a-5272c690c918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.890 226239 DEBUG oslo_concurrency.lockutils [req-4563ef9b-4fbb-45d1-afc8-5dcd030e52b4 req-9deafc8e-3d38-4996-98b6-4eaa6b3dac90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-357f617a-600c-48b2-b5b1-420475f37e61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.890 226239 DEBUG nova.network.neutron [req-4563ef9b-4fbb-45d1-afc8-5dcd030e52b4 req-9deafc8e-3d38-4996-98b6-4eaa6b3dac90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Refreshing network info cache for port 6e590395-62f6-48d5-a65a-5272c690c918 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.893 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Start _get_guest_xml network_info=[{"id": "6e590395-62f6-48d5-a65a-5272c690c918", "address": "fa:16:3e:7c:90:2a", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e590395-62", "ovs_interfaceid": "6e590395-62f6-48d5-a65a-5272c690c918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': 'f083d67f-52b0-4ca2-941c-f61672efe91d', 'delete_on_termination': False, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-e4cba400-cc7c-48f0-94b8-a77f724bc898', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'e4cba400-cc7c-48f0-94b8-a77f724bc898', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': True, 'cacheable': False}, 'status': 'reserved', 'instance': '357f617a-600c-48b2-b5b1-420475f37e61', 'attached_at': '', 'detached_at': '', 'volume_id': 'e4cba400-cc7c-48f0-94b8-a77f724bc898', 'serial': 'e4cba400-cc7c-48f0-94b8-a77f724bc898'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.896 226239 WARNING nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.905 226239 DEBUG nova.virt.libvirt.host [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.906 226239 DEBUG nova.virt.libvirt.host [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.910 226239 DEBUG nova.virt.libvirt.host [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.911 226239 DEBUG nova.virt.libvirt.host [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.912 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.912 226239 DEBUG nova.virt.hardware [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.913 226239 DEBUG nova.virt.hardware [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.913 226239 DEBUG nova.virt.hardware [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.913 226239 DEBUG nova.virt.hardware [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.913 226239 DEBUG nova.virt.hardware [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.914 226239 DEBUG nova.virt.hardware [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.914 226239 DEBUG nova.virt.hardware [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.914 226239 DEBUG nova.virt.hardware [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.915 226239 DEBUG nova.virt.hardware [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.915 226239 DEBUG nova.virt.hardware [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.915 226239 DEBUG nova.virt.hardware [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.943 226239 DEBUG nova.storage.rbd_utils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 357f617a-600c-48b2-b5b1-420475f37e61_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:01:22 np0005603623 nova_compute[226235]: 2026-01-31 09:01:22.947 226239 DEBUG oslo_concurrency.processutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:01:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:22.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:01:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:23.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:23 np0005603623 nova_compute[226235]: 2026-01-31 09:01:23.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:23 np0005603623 nova_compute[226235]: 2026-01-31 09:01:23.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:01:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/625362502' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:01:23 np0005603623 nova_compute[226235]: 2026-01-31 09:01:23.358 226239 DEBUG oslo_concurrency.processutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:23 np0005603623 nova_compute[226235]: 2026-01-31 09:01:23.372 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:23 np0005603623 nova_compute[226235]: 2026-01-31 09:01:23.786 226239 DEBUG os_brick.encryptors [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Using volume encryption metadata '{'encryption_key_id': 'cf731dd8-1d28-4250-a5e1-54d441c738cb', 'control_location': 'front-end', 'cipher': 'aes-xts-plain64', 'key_size': 256, 'provider': 'luks'}' for connection: {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-e4cba400-cc7c-48f0-94b8-a77f724bc898', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'e4cba400-cc7c-48f0-94b8-a77f724bc898', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': True, 'cacheable': False}, 'status': 'reserved', 'instance': '357f617a-600c-48b2-b5b1-420475f37e61', 'attached_at': '', 'detached_at': '', 'volume_id': 'e4cba400-cc7c-48f0-94b8-a77f724bc898', 'serial': '} get_encryption_metadata /usr/lib/python3.9/site-packages/os_brick/encryptors/__init__.py:135#033[00m
Jan 31 04:01:23 np0005603623 nova_compute[226235]: 2026-01-31 09:01:23.789 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Creating Client object Client /usr/lib/python3.9/site-packages/barbicanclient/client.py:163#033[00m
Jan 31 04:01:23 np0005603623 nova_compute[226235]: 2026-01-31 09:01:23.873 226239 DEBUG barbicanclient.v1.secrets [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Getting secret - Secret href: https://barbican-internal.openstack.svc:9311/secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb get /usr/lib/python3.9/site-packages/barbicanclient/v1/secrets.py:514#033[00m
Jan 31 04:01:23 np0005603623 nova_compute[226235]: 2026-01-31 09:01:23.873 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:23 np0005603623 nova_compute[226235]: 2026-01-31 09:01:23.921 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:23 np0005603623 nova_compute[226235]: 2026-01-31 09:01:23.921 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.002 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.003 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.061 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.062 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.099 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.100 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.138 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.139 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.182 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.182 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.247 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.248 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.293 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.294 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.352 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.353 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.408 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.408 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.503 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.503 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.611 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.611 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.686 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.687 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.729 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.730 226239 INFO barbicanclient.base [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Calculated Secrets uuid ref: secrets/cf731dd8-1d28-4250-a5e1-54d441c738cb#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.767 226239 DEBUG barbicanclient.client [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.768 226239 DEBUG nova.virt.libvirt.host [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Secret XML: <secret ephemeral="no" private="no">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  <usage type="volume">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <volume>e4cba400-cc7c-48f0-94b8-a77f724bc898</volume>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  </usage>
Jan 31 04:01:24 np0005603623 nova_compute[226235]: </secret>
Jan 31 04:01:24 np0005603623 nova_compute[226235]: create_secret /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1131#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.831 226239 DEBUG nova.virt.libvirt.vif [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:01:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-285494724',display_name='tempest-TestVolumeBootPattern-server-285494724',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-285494724',id=194,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-r0wj6qp0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:01:13Z,user_data=None,user_id=
'dc42b92a5dd34d32b6b184bdc7acb092',uuid=357f617a-600c-48b2-b5b1-420475f37e61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e590395-62f6-48d5-a65a-5272c690c918", "address": "fa:16:3e:7c:90:2a", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e590395-62", "ovs_interfaceid": "6e590395-62f6-48d5-a65a-5272c690c918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.832 226239 DEBUG nova.network.os_vif_util [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "6e590395-62f6-48d5-a65a-5272c690c918", "address": "fa:16:3e:7c:90:2a", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e590395-62", "ovs_interfaceid": "6e590395-62f6-48d5-a65a-5272c690c918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.832 226239 DEBUG nova.network.os_vif_util [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:90:2a,bridge_name='br-int',has_traffic_filtering=True,id=6e590395-62f6-48d5-a65a-5272c690c918,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e590395-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.834 226239 DEBUG nova.objects.instance [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lazy-loading 'pci_devices' on Instance uuid 357f617a-600c-48b2-b5b1-420475f37e61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.875 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  <uuid>357f617a-600c-48b2-b5b1-420475f37e61</uuid>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  <name>instance-000000c2</name>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestVolumeBootPattern-server-285494724</nova:name>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:01:22</nova:creationTime>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <nova:user uuid="dc42b92a5dd34d32b6b184bdc7acb092">tempest-TestVolumeBootPattern-1392945362-project-member</nova:user>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <nova:project uuid="76ce367a834b49dfb5b436848118b860">tempest-TestVolumeBootPattern-1392945362</nova:project>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <nova:port uuid="6e590395-62f6-48d5-a65a-5272c690c918">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <entry name="serial">357f617a-600c-48b2-b5b1-420475f37e61</entry>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <entry name="uuid">357f617a-600c-48b2-b5b1-420475f37e61</entry>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/357f617a-600c-48b2-b5b1-420475f37e61_disk.config">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-e4cba400-cc7c-48f0-94b8-a77f724bc898">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <serial>e4cba400-cc7c-48f0-94b8-a77f724bc898</serial>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <encryption format="luks">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:        <secret type="passphrase" uuid="04971298-3e7d-4d93-9755-528a0908ee4b"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      </encryption>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:7c:90:2a"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <target dev="tap6e590395-62"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/357f617a-600c-48b2-b5b1-420475f37e61/console.log" append="off"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:01:24 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:01:24 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:01:24 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:01:24 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.877 226239 DEBUG nova.compute.manager [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Preparing to wait for external event network-vif-plugged-6e590395-62f6-48d5-a65a-5272c690c918 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.877 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "357f617a-600c-48b2-b5b1-420475f37e61-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.877 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.877 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.878 226239 DEBUG nova.virt.libvirt.vif [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:01:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-285494724',display_name='tempest-TestVolumeBootPattern-server-285494724',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-285494724',id=194,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-r0wj6qp0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:01:13Z,user_data=Non
e,user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=357f617a-600c-48b2-b5b1-420475f37e61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e590395-62f6-48d5-a65a-5272c690c918", "address": "fa:16:3e:7c:90:2a", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e590395-62", "ovs_interfaceid": "6e590395-62f6-48d5-a65a-5272c690c918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.878 226239 DEBUG nova.network.os_vif_util [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "6e590395-62f6-48d5-a65a-5272c690c918", "address": "fa:16:3e:7c:90:2a", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e590395-62", "ovs_interfaceid": "6e590395-62f6-48d5-a65a-5272c690c918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.879 226239 DEBUG nova.network.os_vif_util [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:90:2a,bridge_name='br-int',has_traffic_filtering=True,id=6e590395-62f6-48d5-a65a-5272c690c918,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e590395-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.879 226239 DEBUG os_vif [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:90:2a,bridge_name='br-int',has_traffic_filtering=True,id=6e590395-62f6-48d5-a65a-5272c690c918,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e590395-62') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.880 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.880 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.880 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.883 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.884 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e590395-62, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.884 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e590395-62, col_values=(('external_ids', {'iface-id': '6e590395-62f6-48d5-a65a-5272c690c918', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:90:2a', 'vm-uuid': '357f617a-600c-48b2-b5b1-420475f37e61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.885 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:24 np0005603623 NetworkManager[48970]: <info>  [1769850084.8864] manager: (tap6e590395-62): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/376)
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.888 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.890 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.890 226239 INFO os_vif [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:90:2a,bridge_name='br-int',has_traffic_filtering=True,id=6e590395-62f6-48d5-a65a-5272c690c918,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e590395-62')#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.971 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.972 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.972 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No VIF found with MAC fa:16:3e:7c:90:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.972 226239 INFO nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Using config drive#033[00m
Jan 31 04:01:24 np0005603623 nova_compute[226235]: 2026-01-31 09:01:24.995 226239 DEBUG nova.storage.rbd_utils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 357f617a-600c-48b2-b5b1-420475f37e61_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:01:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:24.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:25.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:25 np0005603623 nova_compute[226235]: 2026-01-31 09:01:25.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:25 np0005603623 nova_compute[226235]: 2026-01-31 09:01:25.869 226239 INFO nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Creating config drive at /var/lib/nova/instances/357f617a-600c-48b2-b5b1-420475f37e61/disk.config#033[00m
Jan 31 04:01:25 np0005603623 nova_compute[226235]: 2026-01-31 09:01:25.873 226239 DEBUG oslo_concurrency.processutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/357f617a-600c-48b2-b5b1-420475f37e61/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpro26i616 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:25 np0005603623 nova_compute[226235]: 2026-01-31 09:01:25.994 226239 DEBUG oslo_concurrency.processutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/357f617a-600c-48b2-b5b1-420475f37e61/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpro26i616" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.023 226239 DEBUG nova.storage.rbd_utils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 357f617a-600c-48b2-b5b1-420475f37e61_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.029 226239 DEBUG oslo_concurrency.processutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/357f617a-600c-48b2-b5b1-420475f37e61/disk.config 357f617a-600c-48b2-b5b1-420475f37e61_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.189 226239 DEBUG oslo_concurrency.processutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/357f617a-600c-48b2-b5b1-420475f37e61/disk.config 357f617a-600c-48b2-b5b1-420475f37e61_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.190 226239 INFO nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Deleting local config drive /var/lib/nova/instances/357f617a-600c-48b2-b5b1-420475f37e61/disk.config because it was imported into RBD.#033[00m
Jan 31 04:01:26 np0005603623 kernel: tap6e590395-62: entered promiscuous mode
Jan 31 04:01:26 np0005603623 NetworkManager[48970]: <info>  [1769850086.2305] manager: (tap6e590395-62): new Tun device (/org/freedesktop/NetworkManager/Devices/377)
Jan 31 04:01:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:26Z|00804|binding|INFO|Claiming lport 6e590395-62f6-48d5-a65a-5272c690c918 for this chassis.
Jan 31 04:01:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:26Z|00805|binding|INFO|6e590395-62f6-48d5-a65a-5272c690c918: Claiming fa:16:3e:7c:90:2a 10.100.0.4
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.231 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.233 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:26 np0005603623 systemd-udevd[318380]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:01:26 np0005603623 systemd-machined[194379]: New machine qemu-90-instance-000000c2.
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.260 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:26 np0005603623 NetworkManager[48970]: <info>  [1769850086.2633] device (tap6e590395-62): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:01:26 np0005603623 systemd[1]: Started Virtual Machine qemu-90-instance-000000c2.
Jan 31 04:01:26 np0005603623 NetworkManager[48970]: <info>  [1769850086.2640] device (tap6e590395-62): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.266 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:26Z|00806|binding|INFO|Setting lport 6e590395-62f6-48d5-a65a-5272c690c918 ovn-installed in OVS
Jan 31 04:01:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:26Z|00807|binding|INFO|Setting lport 6e590395-62f6-48d5-a65a-5272c690c918 up in Southbound
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.259 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:90:2a 10.100.0.4'], port_security=['fa:16:3e:7c:90:2a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '357f617a-600c-48b2-b5b1-420475f37e61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-650eb345-8346-4e8f-8e83-eeb0117654f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76ce367a834b49dfb5b436848118b860', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b400fbac-34cb-4e36-b84a-e8d6447b9bc2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ecdc171-9d09-4cba-9bb9-cd2f8ef8e6c3, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=6e590395-62f6-48d5-a65a-5272c690c918) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.261 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 6e590395-62f6-48d5-a65a-5272c690c918 in datapath 650eb345-8346-4e8f-8e83-eeb0117654f6 bound to our chassis#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.266 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 650eb345-8346-4e8f-8e83-eeb0117654f6#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.268 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.275 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1029268f-0918-4abf-a329-dfd89672beb2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.276 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap650eb345-81 in ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.277 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap650eb345-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.277 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[da270a20-17dc-4fab-903a-2104f18cf928]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.278 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bca6ec24-4af1-4bbc-8ce4-4645d4f5cfaf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.288 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[57925d06-ed85-483b-a8a7-87e4a9fae48a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.297 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4c52073f-0458-4cff-9370-5ab4d9179a34]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.319 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[8898a63a-96f9-42d0-a189-cfd19d90b078]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.324 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb9e9e3-d08e-4f9e-b24b-7714ae3c6f51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 NetworkManager[48970]: <info>  [1769850086.3251] manager: (tap650eb345-80): new Veth device (/org/freedesktop/NetworkManager/Devices/378)
Jan 31 04:01:26 np0005603623 systemd-udevd[318382]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.359 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[695ff097-4d7e-4a13-8f1a-329f30ddea1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.362 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[fcdb7825-eac4-49b1-bbf6-32f4d96e2c13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 NetworkManager[48970]: <info>  [1769850086.3891] device (tap650eb345-80): carrier: link connected
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.395 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1ffa8e6d-20e8-46d9-a16d-49a3beaba377]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.408 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd45d36-b77e-42df-96e6-8aa6a6aefbd0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap650eb345-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:27:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 922681, 'reachable_time': 40533, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 318413, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.420 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5c41cef0-7ac5-4b74-b53f-99865ae59440]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9f:27ec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 922681, 'tstamp': 922681}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 318414, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.438 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5a768b5b-e5bf-4ed9-97af-b80b3eb212d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap650eb345-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:27:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 922681, 'reachable_time': 40533, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 318415, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.471 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3aab22bb-0304-440e-b250-fa5b11e0fab4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.531 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e27dac71-bcb9-475f-9548-1d6090b85897]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.533 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650eb345-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.534 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.534 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap650eb345-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.537 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:26 np0005603623 kernel: tap650eb345-80: entered promiscuous mode
Jan 31 04:01:26 np0005603623 NetworkManager[48970]: <info>  [1769850086.5397] manager: (tap650eb345-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/379)
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.539 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.541 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap650eb345-80, col_values=(('external_ids', {'iface-id': '74bde109-0188-4ce3-87c3-02a3eb853dc2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.542 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:26Z|00808|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.544 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.545 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bdcf0536-125e-452d-9f61-e34cfb79b165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.546 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-650eb345-8346-4e8f-8e83-eeb0117654f6
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 650eb345-8346-4e8f-8e83-eeb0117654f6
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:01:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:26.547 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'env', 'PROCESS_TAG=haproxy-650eb345-8346-4e8f-8e83-eeb0117654f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/650eb345-8346-4e8f-8e83-eeb0117654f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.548 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.587 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.770 226239 DEBUG nova.network.neutron [req-4563ef9b-4fbb-45d1-afc8-5dcd030e52b4 req-9deafc8e-3d38-4996-98b6-4eaa6b3dac90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Updated VIF entry in instance network info cache for port 6e590395-62f6-48d5-a65a-5272c690c918. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.771 226239 DEBUG nova.network.neutron [req-4563ef9b-4fbb-45d1-afc8-5dcd030e52b4 req-9deafc8e-3d38-4996-98b6-4eaa6b3dac90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Updating instance_info_cache with network_info: [{"id": "6e590395-62f6-48d5-a65a-5272c690c918", "address": "fa:16:3e:7c:90:2a", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e590395-62", "ovs_interfaceid": "6e590395-62f6-48d5-a65a-5272c690c918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:01:26 np0005603623 nova_compute[226235]: 2026-01-31 09:01:26.863 226239 DEBUG oslo_concurrency.lockutils [req-4563ef9b-4fbb-45d1-afc8-5dcd030e52b4 req-9deafc8e-3d38-4996-98b6-4eaa6b3dac90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-357f617a-600c-48b2-b5b1-420475f37e61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:01:26 np0005603623 podman[318484]: 2026-01-31 09:01:26.897387484 +0000 UTC m=+0.056423302 container create 751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Jan 31 04:01:26 np0005603623 systemd[1]: Started libpod-conmon-751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999.scope.
Jan 31 04:01:26 np0005603623 podman[318484]: 2026-01-31 09:01:26.867174366 +0000 UTC m=+0.026210214 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:01:26 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:01:26 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15f0c4a608c2afb8379acd5f0240e2aaff2beb18d4bf8004e868fc9be57459e0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:01:26 np0005603623 podman[318484]: 2026-01-31 09:01:26.977987962 +0000 UTC m=+0.137023810 container init 751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 04:01:26 np0005603623 podman[318484]: 2026-01-31 09:01:26.982244336 +0000 UTC m=+0.141280154 container start 751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:01:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:26.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:27 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[318500]: [NOTICE]   (318504) : New worker (318506) forked
Jan 31 04:01:27 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[318500]: [NOTICE]   (318504) : Loading success.
Jan 31 04:01:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:27.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:27 np0005603623 nova_compute[226235]: 2026-01-31 09:01:27.546 226239 DEBUG nova.compute.manager [req-97041d38-7aee-4ff2-8e62-f7de86b452ec req-a6c55b97-23eb-4587-a758-558463bc32f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Received event network-vif-plugged-6e590395-62f6-48d5-a65a-5272c690c918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:01:27 np0005603623 nova_compute[226235]: 2026-01-31 09:01:27.547 226239 DEBUG oslo_concurrency.lockutils [req-97041d38-7aee-4ff2-8e62-f7de86b452ec req-a6c55b97-23eb-4587-a758-558463bc32f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "357f617a-600c-48b2-b5b1-420475f37e61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:27 np0005603623 nova_compute[226235]: 2026-01-31 09:01:27.547 226239 DEBUG oslo_concurrency.lockutils [req-97041d38-7aee-4ff2-8e62-f7de86b452ec req-a6c55b97-23eb-4587-a758-558463bc32f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:27 np0005603623 nova_compute[226235]: 2026-01-31 09:01:27.547 226239 DEBUG oslo_concurrency.lockutils [req-97041d38-7aee-4ff2-8e62-f7de86b452ec req-a6c55b97-23eb-4587-a758-558463bc32f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:27 np0005603623 nova_compute[226235]: 2026-01-31 09:01:27.547 226239 DEBUG nova.compute.manager [req-97041d38-7aee-4ff2-8e62-f7de86b452ec req-a6c55b97-23eb-4587-a758-558463bc32f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Processing event network-vif-plugged-6e590395-62f6-48d5-a65a-5272c690c918 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:01:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:29.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.026 226239 DEBUG nova.compute.manager [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.027 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850089.026936, 357f617a-600c-48b2-b5b1-420475f37e61 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.028 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] VM Started (Lifecycle Event)#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.031 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.036 226239 INFO nova.virt.libvirt.driver [-] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Instance spawned successfully.#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.037 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.099 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.103 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.108 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.109 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.109 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.110 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.110 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.110 226239 DEBUG nova.virt.libvirt.driver [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:01:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:29.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.177 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.177 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850089.0270386, 357f617a-600c-48b2-b5b1-420475f37e61 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.178 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.282 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.286 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850089.0300333, 357f617a-600c-48b2-b5b1-420475f37e61 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.287 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.315 226239 INFO nova.compute.manager [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Took 13.01 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.315 226239 DEBUG nova.compute.manager [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.375 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.378 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.481 226239 INFO nova.compute.manager [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Took 17.28 seconds to build instance.#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.579 226239 DEBUG oslo_concurrency.lockutils [None req-9a041f46-24dc-4683-b054-0ace0631f50f dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.474s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.933 226239 DEBUG nova.compute.manager [req-b400b039-fce9-484e-b1d0-2ad563aff20b req-900af59a-d888-4462-b392-cc8f5126f5a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Received event network-vif-plugged-6e590395-62f6-48d5-a65a-5272c690c918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.933 226239 DEBUG oslo_concurrency.lockutils [req-b400b039-fce9-484e-b1d0-2ad563aff20b req-900af59a-d888-4462-b392-cc8f5126f5a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "357f617a-600c-48b2-b5b1-420475f37e61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.934 226239 DEBUG oslo_concurrency.lockutils [req-b400b039-fce9-484e-b1d0-2ad563aff20b req-900af59a-d888-4462-b392-cc8f5126f5a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.934 226239 DEBUG oslo_concurrency.lockutils [req-b400b039-fce9-484e-b1d0-2ad563aff20b req-900af59a-d888-4462-b392-cc8f5126f5a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.934 226239 DEBUG nova.compute.manager [req-b400b039-fce9-484e-b1d0-2ad563aff20b req-900af59a-d888-4462-b392-cc8f5126f5a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] No waiting events found dispatching network-vif-plugged-6e590395-62f6-48d5-a65a-5272c690c918 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:01:29 np0005603623 nova_compute[226235]: 2026-01-31 09:01:29.934 226239 WARNING nova.compute.manager [req-b400b039-fce9-484e-b1d0-2ad563aff20b req-900af59a-d888-4462-b392-cc8f5126f5a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Received unexpected event network-vif-plugged-6e590395-62f6-48d5-a65a-5272c690c918 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:01:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:30.153 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:30.154 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:30.154 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:30 np0005603623 nova_compute[226235]: 2026-01-31 09:01:30.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:31.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:01:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:31.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:01:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:31 np0005603623 nova_compute[226235]: 2026-01-31 09:01:31.589 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:33.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:33.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:34 np0005603623 nova_compute[226235]: 2026-01-31 09:01:34.888 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:35.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:35.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:35 np0005603623 nova_compute[226235]: 2026-01-31 09:01:35.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:35 np0005603623 nova_compute[226235]: 2026-01-31 09:01:35.155 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:01:35 np0005603623 nova_compute[226235]: 2026-01-31 09:01:35.155 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:01:35 np0005603623 nova_compute[226235]: 2026-01-31 09:01:35.156 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:01:35 np0005603623 nova_compute[226235]: 2026-01-31 09:01:35.156 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:01:35 np0005603623 nova_compute[226235]: 2026-01-31 09:01:35.157 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:01:35 np0005603623 nova_compute[226235]: 2026-01-31 09:01:35.157 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:01:35 np0005603623 nova_compute[226235]: 2026-01-31 09:01:35.505 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.007 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.007 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.008 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] 357f617a-600c-48b2-b5b1-420475f37e61 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.008 226239 WARNING nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.008 226239 WARNING nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.008 226239 INFO nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Removable base files: /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.009 226239 INFO nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.009 226239 INFO nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.009 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.010 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.010 226239 DEBUG nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.010 226239 INFO nova.virt.libvirt.imagecache [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Jan 31 04:01:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.507 226239 DEBUG oslo_concurrency.lockutils [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "357f617a-600c-48b2-b5b1-420475f37e61" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.509 226239 DEBUG oslo_concurrency.lockutils [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.509 226239 DEBUG oslo_concurrency.lockutils [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "357f617a-600c-48b2-b5b1-420475f37e61-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.510 226239 DEBUG oslo_concurrency.lockutils [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.510 226239 DEBUG oslo_concurrency.lockutils [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.511 226239 INFO nova.compute.manager [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Terminating instance
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.513 226239 DEBUG nova.compute.manager [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.592 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:36 np0005603623 kernel: tap6e590395-62 (unregistering): left promiscuous mode
Jan 31 04:01:36 np0005603623 NetworkManager[48970]: <info>  [1769850096.8826] device (tap6e590395-62): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:01:36 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:36Z|00809|binding|INFO|Releasing lport 6e590395-62f6-48d5-a65a-5272c690c918 from this chassis (sb_readonly=0)
Jan 31 04:01:36 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:36Z|00810|binding|INFO|Setting lport 6e590395-62f6-48d5-a65a-5272c690c918 down in Southbound
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:36 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:36Z|00811|binding|INFO|Removing iface tap6e590395-62 ovn-installed in OVS
Jan 31 04:01:36 np0005603623 nova_compute[226235]: 2026-01-31 09:01:36.899 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:36 np0005603623 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000c2.scope: Deactivated successfully.
Jan 31 04:01:36 np0005603623 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000c2.scope: Consumed 3.185s CPU time.
Jan 31 04:01:36 np0005603623 systemd-machined[194379]: Machine qemu-90-instance-000000c2 terminated.
Jan 31 04:01:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:37.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.040 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:90:2a 10.100.0.4'], port_security=['fa:16:3e:7c:90:2a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '357f617a-600c-48b2-b5b1-420475f37e61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-650eb345-8346-4e8f-8e83-eeb0117654f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76ce367a834b49dfb5b436848118b860', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b400fbac-34cb-4e36-b84a-e8d6447b9bc2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ecdc171-9d09-4cba-9bb9-cd2f8ef8e6c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=6e590395-62f6-48d5-a65a-5272c690c918) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.041 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 6e590395-62f6-48d5-a65a-5272c690c918 in datapath 650eb345-8346-4e8f-8e83-eeb0117654f6 unbound from our chassis
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.044 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 650eb345-8346-4e8f-8e83-eeb0117654f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.045 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3678cc-9659-46e9-8b8a-81b4615495b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.046 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 namespace which is not needed anymore
Jan 31 04:01:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:37.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.151 226239 INFO nova.virt.libvirt.driver [-] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Instance destroyed successfully.
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.151 226239 DEBUG nova.objects.instance [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lazy-loading 'resources' on Instance uuid 357f617a-600c-48b2-b5b1-420475f37e61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 04:01:37 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[318500]: [NOTICE]   (318504) : haproxy version is 2.8.14-c23fe91
Jan 31 04:01:37 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[318500]: [NOTICE]   (318504) : path to executable is /usr/sbin/haproxy
Jan 31 04:01:37 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[318500]: [WARNING]  (318504) : Exiting Master process...
Jan 31 04:01:37 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[318500]: [WARNING]  (318504) : Exiting Master process...
Jan 31 04:01:37 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[318500]: [ALERT]    (318504) : Current worker (318506) exited with code 143 (Terminated)
Jan 31 04:01:37 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[318500]: [WARNING]  (318504) : All workers exited. Exiting... (0)
Jan 31 04:01:37 np0005603623 systemd[1]: libpod-751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999.scope: Deactivated successfully.
Jan 31 04:01:37 np0005603623 podman[318551]: 2026-01-31 09:01:37.220081823 +0000 UTC m=+0.106637386 container died 751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.359 226239 DEBUG nova.virt.libvirt.vif [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:01:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-285494724',display_name='tempest-TestVolumeBootPattern-server-285494724',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-285494724',id=194,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:01:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-r0wj6qp0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project
-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:01:29Z,user_data=None,user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=357f617a-600c-48b2-b5b1-420475f37e61,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e590395-62f6-48d5-a65a-5272c690c918", "address": "fa:16:3e:7c:90:2a", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e590395-62", "ovs_interfaceid": "6e590395-62f6-48d5-a65a-5272c690c918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.360 226239 DEBUG nova.network.os_vif_util [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "6e590395-62f6-48d5-a65a-5272c690c918", "address": "fa:16:3e:7c:90:2a", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e590395-62", "ovs_interfaceid": "6e590395-62f6-48d5-a65a-5272c690c918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.361 226239 DEBUG nova.network.os_vif_util [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:90:2a,bridge_name='br-int',has_traffic_filtering=True,id=6e590395-62f6-48d5-a65a-5272c690c918,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e590395-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.361 226239 DEBUG os_vif [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:90:2a,bridge_name='br-int',has_traffic_filtering=True,id=6e590395-62f6-48d5-a65a-5272c690c918,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e590395-62') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.362 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.363 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e590395-62, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.364 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.366 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.369 226239 INFO os_vif [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:90:2a,bridge_name='br-int',has_traffic_filtering=True,id=6e590395-62f6-48d5-a65a-5272c690c918,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e590395-62')
Jan 31 04:01:37 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999-userdata-shm.mount: Deactivated successfully.
Jan 31 04:01:37 np0005603623 systemd[1]: var-lib-containers-storage-overlay-15f0c4a608c2afb8379acd5f0240e2aaff2beb18d4bf8004e868fc9be57459e0-merged.mount: Deactivated successfully.
Jan 31 04:01:37 np0005603623 podman[318551]: 2026-01-31 09:01:37.622418815 +0000 UTC m=+0.508974378 container cleanup 751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 04:01:37 np0005603623 podman[318607]: 2026-01-31 09:01:37.688694033 +0000 UTC m=+0.051867227 container remove 751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:01:37 np0005603623 systemd[1]: libpod-conmon-751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999.scope: Deactivated successfully.
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.693 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[53e09a15-dea2-485b-956f-3d40cb63508c]: (4, ('Sat Jan 31 09:01:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 (751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999)\n751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999\nSat Jan 31 09:01:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 (751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999)\n751341066b071b04233d5124b27dcb3274c18d36637c75db3b5df4504a3cc999\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.695 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[11b0492b-4dc9-4c9f-a243-323364764364]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.696 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650eb345-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.698 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.699 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:37 np0005603623 kernel: tap650eb345-80: left promiscuous mode
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.701 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dd140261-35c8-4410-894f-8a8b68ae44f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.703 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.721 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[40a5091c-1622-4f51-9a83-ac1428b73b37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.722 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[54ce69ba-52f9-4fec-ae3c-b1839921cdec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.736 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dddc0895-4c4b-4c7f-a230-a19db0ccb84c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 922674, 'reachable_time': 28901, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 318622, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.739 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:01:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:37.739 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[ff129179-5510-462d-b44c-ecbe67ec347c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:37 np0005603623 systemd[1]: run-netns-ovnmeta\x2d650eb345\x2d8346\x2d4e8f\x2d8e83\x2deeb0117654f6.mount: Deactivated successfully.
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.870 226239 INFO nova.virt.libvirt.driver [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Deleting instance files /var/lib/nova/instances/357f617a-600c-48b2-b5b1-420475f37e61_del#033[00m
Jan 31 04:01:37 np0005603623 nova_compute[226235]: 2026-01-31 09:01:37.871 226239 INFO nova.virt.libvirt.driver [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Deletion of /var/lib/nova/instances/357f617a-600c-48b2-b5b1-420475f37e61_del complete#033[00m
Jan 31 04:01:38 np0005603623 nova_compute[226235]: 2026-01-31 09:01:38.005 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:38 np0005603623 nova_compute[226235]: 2026-01-31 09:01:38.228 226239 INFO nova.compute.manager [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Took 1.72 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:01:38 np0005603623 nova_compute[226235]: 2026-01-31 09:01:38.229 226239 DEBUG oslo.service.loopingcall [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:01:38 np0005603623 nova_compute[226235]: 2026-01-31 09:01:38.230 226239 DEBUG nova.compute.manager [-] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:01:38 np0005603623 nova_compute[226235]: 2026-01-31 09:01:38.230 226239 DEBUG nova.network.neutron [-] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:01:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:39.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:39.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:39 np0005603623 nova_compute[226235]: 2026-01-31 09:01:39.323 226239 DEBUG nova.compute.manager [req-4884959c-f8c8-43a8-9998-cd7ea7c9fb9c req-b0c563ce-d20b-40c1-848d-0fbb0061d369 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Received event network-vif-unplugged-6e590395-62f6-48d5-a65a-5272c690c918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:01:39 np0005603623 nova_compute[226235]: 2026-01-31 09:01:39.323 226239 DEBUG oslo_concurrency.lockutils [req-4884959c-f8c8-43a8-9998-cd7ea7c9fb9c req-b0c563ce-d20b-40c1-848d-0fbb0061d369 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "357f617a-600c-48b2-b5b1-420475f37e61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:39 np0005603623 nova_compute[226235]: 2026-01-31 09:01:39.324 226239 DEBUG oslo_concurrency.lockutils [req-4884959c-f8c8-43a8-9998-cd7ea7c9fb9c req-b0c563ce-d20b-40c1-848d-0fbb0061d369 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:39 np0005603623 nova_compute[226235]: 2026-01-31 09:01:39.324 226239 DEBUG oslo_concurrency.lockutils [req-4884959c-f8c8-43a8-9998-cd7ea7c9fb9c req-b0c563ce-d20b-40c1-848d-0fbb0061d369 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:39 np0005603623 nova_compute[226235]: 2026-01-31 09:01:39.324 226239 DEBUG nova.compute.manager [req-4884959c-f8c8-43a8-9998-cd7ea7c9fb9c req-b0c563ce-d20b-40c1-848d-0fbb0061d369 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] No waiting events found dispatching network-vif-unplugged-6e590395-62f6-48d5-a65a-5272c690c918 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:01:39 np0005603623 nova_compute[226235]: 2026-01-31 09:01:39.324 226239 DEBUG nova.compute.manager [req-4884959c-f8c8-43a8-9998-cd7ea7c9fb9c req-b0c563ce-d20b-40c1-848d-0fbb0061d369 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Received event network-vif-unplugged-6e590395-62f6-48d5-a65a-5272c690c918 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:01:39 np0005603623 nova_compute[226235]: 2026-01-31 09:01:39.689 226239 DEBUG nova.network.neutron [-] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:01:39 np0005603623 nova_compute[226235]: 2026-01-31 09:01:39.783 226239 INFO nova.compute.manager [-] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Took 1.55 seconds to deallocate network for instance.#033[00m
Jan 31 04:01:39 np0005603623 nova_compute[226235]: 2026-01-31 09:01:39.816 226239 DEBUG nova.compute.manager [req-8fc0f1e4-9500-4545-b81d-4287dc4700d2 req-b65c8c85-368a-41f9-b410-631bcdf448cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Received event network-vif-deleted-6e590395-62f6-48d5-a65a-5272c690c918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:01:39 np0005603623 nova_compute[226235]: 2026-01-31 09:01:39.816 226239 INFO nova.compute.manager [req-8fc0f1e4-9500-4545-b81d-4287dc4700d2 req-b65c8c85-368a-41f9-b410-631bcdf448cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Neutron deleted interface 6e590395-62f6-48d5-a65a-5272c690c918; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 04:01:39 np0005603623 nova_compute[226235]: 2026-01-31 09:01:39.816 226239 DEBUG nova.network.neutron [req-8fc0f1e4-9500-4545-b81d-4287dc4700d2 req-b65c8c85-368a-41f9-b410-631bcdf448cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:01:39 np0005603623 nova_compute[226235]: 2026-01-31 09:01:39.887 226239 DEBUG nova.compute.manager [req-8fc0f1e4-9500-4545-b81d-4287dc4700d2 req-b65c8c85-368a-41f9-b410-631bcdf448cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Detach interface failed, port_id=6e590395-62f6-48d5-a65a-5272c690c918, reason: Instance 357f617a-600c-48b2-b5b1-420475f37e61 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 04:01:40 np0005603623 nova_compute[226235]: 2026-01-31 09:01:40.220 226239 INFO nova.compute.manager [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Took 0.44 seconds to detach 1 volumes for instance.#033[00m
Jan 31 04:01:40 np0005603623 nova_compute[226235]: 2026-01-31 09:01:40.339 226239 DEBUG oslo_concurrency.lockutils [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:40 np0005603623 nova_compute[226235]: 2026-01-31 09:01:40.340 226239 DEBUG oslo_concurrency.lockutils [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:40 np0005603623 nova_compute[226235]: 2026-01-31 09:01:40.434 226239 DEBUG oslo_concurrency.processutils [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:01:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1988598809' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:01:40 np0005603623 nova_compute[226235]: 2026-01-31 09:01:40.862 226239 DEBUG oslo_concurrency.processutils [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:40 np0005603623 nova_compute[226235]: 2026-01-31 09:01:40.867 226239 DEBUG nova.compute.provider_tree [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:01:40 np0005603623 nova_compute[226235]: 2026-01-31 09:01:40.948 226239 DEBUG nova.scheduler.client.report [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:01:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:41.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:41.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:41 np0005603623 nova_compute[226235]: 2026-01-31 09:01:41.628 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:42 np0005603623 nova_compute[226235]: 2026-01-31 09:01:42.091 226239 DEBUG oslo_concurrency.lockutils [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:42 np0005603623 nova_compute[226235]: 2026-01-31 09:01:42.184 226239 DEBUG nova.compute.manager [req-7cf85914-4d14-4c48-b93d-a038ee08d401 req-fa36e1e2-38ca-4e66-87a1-4e98d2e237eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Received event network-vif-plugged-6e590395-62f6-48d5-a65a-5272c690c918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:01:42 np0005603623 nova_compute[226235]: 2026-01-31 09:01:42.185 226239 DEBUG oslo_concurrency.lockutils [req-7cf85914-4d14-4c48-b93d-a038ee08d401 req-fa36e1e2-38ca-4e66-87a1-4e98d2e237eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "357f617a-600c-48b2-b5b1-420475f37e61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:42 np0005603623 nova_compute[226235]: 2026-01-31 09:01:42.185 226239 DEBUG oslo_concurrency.lockutils [req-7cf85914-4d14-4c48-b93d-a038ee08d401 req-fa36e1e2-38ca-4e66-87a1-4e98d2e237eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:42 np0005603623 nova_compute[226235]: 2026-01-31 09:01:42.185 226239 DEBUG oslo_concurrency.lockutils [req-7cf85914-4d14-4c48-b93d-a038ee08d401 req-fa36e1e2-38ca-4e66-87a1-4e98d2e237eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:42 np0005603623 nova_compute[226235]: 2026-01-31 09:01:42.186 226239 DEBUG nova.compute.manager [req-7cf85914-4d14-4c48-b93d-a038ee08d401 req-fa36e1e2-38ca-4e66-87a1-4e98d2e237eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] No waiting events found dispatching network-vif-plugged-6e590395-62f6-48d5-a65a-5272c690c918 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:01:42 np0005603623 nova_compute[226235]: 2026-01-31 09:01:42.186 226239 WARNING nova.compute.manager [req-7cf85914-4d14-4c48-b93d-a038ee08d401 req-fa36e1e2-38ca-4e66-87a1-4e98d2e237eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Received unexpected event network-vif-plugged-6e590395-62f6-48d5-a65a-5272c690c918 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 04:01:42 np0005603623 nova_compute[226235]: 2026-01-31 09:01:42.360 226239 INFO nova.scheduler.client.report [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Deleted allocations for instance 357f617a-600c-48b2-b5b1-420475f37e61#033[00m
Jan 31 04:01:42 np0005603623 nova_compute[226235]: 2026-01-31 09:01:42.365 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:42 np0005603623 nova_compute[226235]: 2026-01-31 09:01:42.635 226239 DEBUG oslo_concurrency.lockutils [None req-d2f3af26-2e5d-4aa1-8179-44232da2133d dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "357f617a-600c-48b2-b5b1-420475f37e61" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:43.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:43.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:01:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:01:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:01:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:45.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:01:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:45.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:01:45 np0005603623 nova_compute[226235]: 2026-01-31 09:01:45.938 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:46 np0005603623 nova_compute[226235]: 2026-01-31 09:01:46.630 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:01:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:47.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:01:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:01:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:47.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:01:47 np0005603623 nova_compute[226235]: 2026-01-31 09:01:47.367 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:48 np0005603623 nova_compute[226235]: 2026-01-31 09:01:48.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:48 np0005603623 nova_compute[226235]: 2026-01-31 09:01:48.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:01:48 np0005603623 nova_compute[226235]: 2026-01-31 09:01:48.345 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:48 np0005603623 nova_compute[226235]: 2026-01-31 09:01:48.345 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:48 np0005603623 nova_compute[226235]: 2026-01-31 09:01:48.491 226239 DEBUG nova.compute.manager [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:01:48 np0005603623 nova_compute[226235]: 2026-01-31 09:01:48.652 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:48 np0005603623 nova_compute[226235]: 2026-01-31 09:01:48.652 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:48 np0005603623 nova_compute[226235]: 2026-01-31 09:01:48.659 226239 DEBUG nova.virt.hardware [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:01:48 np0005603623 nova_compute[226235]: 2026-01-31 09:01:48.660 226239 INFO nova.compute.claims [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 04:01:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:01:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:49.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:01:49 np0005603623 nova_compute[226235]: 2026-01-31 09:01:49.084 226239 DEBUG oslo_concurrency.processutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:49.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:01:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1653825721' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:01:49 np0005603623 nova_compute[226235]: 2026-01-31 09:01:49.468 226239 DEBUG oslo_concurrency.processutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.384s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:49 np0005603623 nova_compute[226235]: 2026-01-31 09:01:49.475 226239 DEBUG nova.compute.provider_tree [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:01:49 np0005603623 nova_compute[226235]: 2026-01-31 09:01:49.508 226239 DEBUG nova.scheduler.client.report [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:01:49 np0005603623 nova_compute[226235]: 2026-01-31 09:01:49.549 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.897s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:49 np0005603623 nova_compute[226235]: 2026-01-31 09:01:49.550 226239 DEBUG nova.compute.manager [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:01:49 np0005603623 nova_compute[226235]: 2026-01-31 09:01:49.654 226239 DEBUG nova.compute.manager [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:01:49 np0005603623 nova_compute[226235]: 2026-01-31 09:01:49.654 226239 DEBUG nova.network.neutron [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:01:49 np0005603623 nova_compute[226235]: 2026-01-31 09:01:49.823 226239 INFO nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:01:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:01:49 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:01:49 np0005603623 podman[318905]: 2026-01-31 09:01:49.966385739 +0000 UTC m=+0.061183890 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:01:50 np0005603623 podman[318906]: 2026-01-31 09:01:50.010923056 +0000 UTC m=+0.095202077 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.080 226239 DEBUG nova.compute.manager [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.270 226239 INFO nova.virt.block_device [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Booting with volume 766e40b4-9f67-4558-bd4a-c5d2d46d1ef1 at /dev/vda#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.292 226239 DEBUG nova.policy [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3859f52c5b70471097d1e4ffa75ecc0e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1f293713f6854265a89a1a4a002088d5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.512 226239 DEBUG os_brick.utils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.513 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.526 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.526 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[4255a18e-e5ae-4237-b836-6f2b7bd999fd]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.527 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.534 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.535 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[e62bb569-09ba-4ec2-bc44-3e7e1b08ce19]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.537 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.544 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.544 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[7788c4cb-e61f-458f-b096-46d9fbb62560]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.546 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[7bb208ae-5c48-4c95-91b8-2f23af23351c]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.546 226239 DEBUG oslo_concurrency.processutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.573 226239 DEBUG oslo_concurrency.processutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "nvme version" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.579 226239 DEBUG os_brick.initiator.connectors.lightos [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.579 226239 DEBUG os_brick.initiator.connectors.lightos [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.579 226239 DEBUG os_brick.initiator.connectors.lightos [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.580 226239 DEBUG os_brick.utils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] <== get_connector_properties: return (66ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 04:01:50 np0005603623 nova_compute[226235]: 2026-01-31 09:01:50.580 226239 DEBUG nova.virt.block_device [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Updating existing volume attachment record: dcfe61c4-06eb-423c-9ffb-cfecfb9cdddb _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 04:01:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:51.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:51.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:51 np0005603623 nova_compute[226235]: 2026-01-31 09:01:51.632 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:51.799 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=84, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=83) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:01:51 np0005603623 nova_compute[226235]: 2026-01-31 09:01:51.799 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:51.800 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.107 226239 DEBUG nova.network.neutron [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Successfully created port: acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.150 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850097.1490657, 357f617a-600c-48b2-b5b1-420475f37e61 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.150 226239 INFO nova.compute.manager [-] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.198 226239 DEBUG nova.compute.manager [None req-7d9a3f9f-ed1b-4fdf-bd80-74fc65305ed0 - - - - - -] [instance: 357f617a-600c-48b2-b5b1-420475f37e61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.372 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.546 226239 DEBUG nova.compute.manager [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.548 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.548 226239 INFO nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Creating image(s)#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.549 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.549 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Ensure instance console log exists: /var/lib/nova/instances/2fbbeeee-ff60-4a39-9bea-e3d59301b0ad/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.549 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.550 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:52 np0005603623 nova_compute[226235]: 2026-01-31 09:01:52.550 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:52 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:52.802 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '84'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:01:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:53.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:01:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:53.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:53 np0005603623 nova_compute[226235]: 2026-01-31 09:01:53.962 226239 DEBUG nova.network.neutron [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Successfully updated port: acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:01:54 np0005603623 nova_compute[226235]: 2026-01-31 09:01:54.000 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:01:54 np0005603623 nova_compute[226235]: 2026-01-31 09:01:54.001 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquired lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:01:54 np0005603623 nova_compute[226235]: 2026-01-31 09:01:54.001 226239 DEBUG nova.network.neutron [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:01:54 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e389 e389: 3 total, 3 up, 3 in
Jan 31 04:01:54 np0005603623 nova_compute[226235]: 2026-01-31 09:01:54.142 226239 DEBUG nova.compute.manager [req-310ade88-c356-4bd9-aaa0-b536161262cf req-49d32787-a98c-4b73-9869-9bb5249f0077 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Received event network-changed-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:01:54 np0005603623 nova_compute[226235]: 2026-01-31 09:01:54.143 226239 DEBUG nova.compute.manager [req-310ade88-c356-4bd9-aaa0-b536161262cf req-49d32787-a98c-4b73-9869-9bb5249f0077 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Refreshing instance network info cache due to event network-changed-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:01:54 np0005603623 nova_compute[226235]: 2026-01-31 09:01:54.143 226239 DEBUG oslo_concurrency.lockutils [req-310ade88-c356-4bd9-aaa0-b536161262cf req-49d32787-a98c-4b73-9869-9bb5249f0077 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:01:54 np0005603623 nova_compute[226235]: 2026-01-31 09:01:54.300 226239 DEBUG nova.network.neutron [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:01:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:55.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:55.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.000 226239 DEBUG nova.network.neutron [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Updating instance_info_cache with network_info: [{"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.179 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Releasing lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.179 226239 DEBUG nova.compute.manager [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Instance network_info: |[{"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.180 226239 DEBUG oslo_concurrency.lockutils [req-310ade88-c356-4bd9-aaa0-b536161262cf req-49d32787-a98c-4b73-9869-9bb5249f0077 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.180 226239 DEBUG nova.network.neutron [req-310ade88-c356-4bd9-aaa0-b536161262cf req-49d32787-a98c-4b73-9869-9bb5249f0077 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Refreshing network info cache for port acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.183 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Start _get_guest_xml network_info=[{"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': 'dcfe61c4-06eb-423c-9ffb-cfecfb9cdddb', 'delete_on_termination': True, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-766e40b4-9f67-4558-bd4a-c5d2d46d1ef1', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '766e40b4-9f67-4558-bd4a-c5d2d46d1ef1', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '2fbbeeee-ff60-4a39-9bea-e3d59301b0ad', 'attached_at': '', 'detached_at': '', 'volume_id': '766e40b4-9f67-4558-bd4a-c5d2d46d1ef1', 'serial': '766e40b4-9f67-4558-bd4a-c5d2d46d1ef1'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.187 226239 WARNING nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.191 226239 DEBUG nova.virt.libvirt.host [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.191 226239 DEBUG nova.virt.libvirt.host [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.196 226239 DEBUG nova.virt.libvirt.host [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.196 226239 DEBUG nova.virt.libvirt.host [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.197 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.198 226239 DEBUG nova.virt.hardware [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.198 226239 DEBUG nova.virt.hardware [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.198 226239 DEBUG nova.virt.hardware [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.198 226239 DEBUG nova.virt.hardware [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.199 226239 DEBUG nova.virt.hardware [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.199 226239 DEBUG nova.virt.hardware [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.199 226239 DEBUG nova.virt.hardware [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.199 226239 DEBUG nova.virt.hardware [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.199 226239 DEBUG nova.virt.hardware [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.200 226239 DEBUG nova.virt.hardware [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.200 226239 DEBUG nova.virt.hardware [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.231 226239 DEBUG nova.storage.rbd_utils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] rbd image 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.235 226239 DEBUG oslo_concurrency.processutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e389 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.633 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:01:56 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2617875344' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.674 226239 DEBUG oslo_concurrency.processutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.834 226239 DEBUG nova.virt.libvirt.vif [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:01:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-700374568',display_name='tempest-TestShelveInstance-server-700374568',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-700374568',id=195,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGtA3anYK1F9iXzxi+WbZdX9+H2hCPBQsf9eZ9YvPVN48lWV0Tj2M8EqzHWhivNuSFaYD9k1TbjDSy9xGFH7/SEr14KZUm/LE8cO61iZWeNARWc/E4iBetyQV/0Aqvvflw==',key_name='tempest-TestShelveInstance-1586472022',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1f293713f6854265a89a1a4a002088d5',ramdisk_id='',reservation_id='r-kzm36xl7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1813478377',owner_user_name='tempest-TestShelveInstance-1813478377-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:01:50Z,user_data=None,user_id='3859f52c5b70471097d1e4ffa75ecc0e',uuid=2fbbeeee-ff60-4a39-9bea-e3d59301b0ad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.834 226239 DEBUG nova.network.os_vif_util [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Converting VIF {"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.835 226239 DEBUG nova.network.os_vif_util [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:a6:e8,bridge_name='br-int',has_traffic_filtering=True,id=acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacfc2f5c-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.836 226239 DEBUG nova.objects.instance [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.904 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  <uuid>2fbbeeee-ff60-4a39-9bea-e3d59301b0ad</uuid>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  <name>instance-000000c3</name>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestShelveInstance-server-700374568</nova:name>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:01:56</nova:creationTime>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <nova:user uuid="3859f52c5b70471097d1e4ffa75ecc0e">tempest-TestShelveInstance-1813478377-project-member</nova:user>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <nova:project uuid="1f293713f6854265a89a1a4a002088d5">tempest-TestShelveInstance-1813478377</nova:project>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <nova:port uuid="acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <entry name="serial">2fbbeeee-ff60-4a39-9bea-e3d59301b0ad</entry>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <entry name="uuid">2fbbeeee-ff60-4a39-9bea-e3d59301b0ad</entry>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/2fbbeeee-ff60-4a39-9bea-e3d59301b0ad_disk.config">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-766e40b4-9f67-4558-bd4a-c5d2d46d1ef1">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <serial>766e40b4-9f67-4558-bd4a-c5d2d46d1ef1</serial>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:ad:a6:e8"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <target dev="tapacfc2f5c-0e"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/2fbbeeee-ff60-4a39-9bea-e3d59301b0ad/console.log" append="off"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:01:56 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:01:56 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:01:56 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:01:56 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.905 226239 DEBUG nova.compute.manager [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Preparing to wait for external event network-vif-plugged-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.905 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.906 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.906 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.906 226239 DEBUG nova.virt.libvirt.vif [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:01:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-700374568',display_name='tempest-TestShelveInstance-server-700374568',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-700374568',id=195,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGtA3anYK1F9iXzxi+WbZdX9+H2hCPBQsf9eZ9YvPVN48lWV0Tj2M8EqzHWhivNuSFaYD9k1TbjDSy9xGFH7/SEr14KZUm/LE8cO61iZWeNARWc/E4iBetyQV/0Aqvvflw==',key_name='tempest-TestShelveInstance-1586472022',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1f293713f6854265a89a1a4a002088d5',ramdisk_id='',reservation_id='r-kzm36xl7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1813478377',owner_user_name='tempest-TestShelveInstance-1813478377-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:01:50Z,user_data=None,user_id='3859f52c5b70471097d1e4ffa75ecc0e',uuid=2fbbeeee-ff60-4a39-9bea-e3d59301b0ad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.907 226239 DEBUG nova.network.os_vif_util [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Converting VIF {"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.907 226239 DEBUG nova.network.os_vif_util [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:a6:e8,bridge_name='br-int',has_traffic_filtering=True,id=acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacfc2f5c-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.907 226239 DEBUG os_vif [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:a6:e8,bridge_name='br-int',has_traffic_filtering=True,id=acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacfc2f5c-0e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.908 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.908 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.909 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.912 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.912 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapacfc2f5c-0e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.913 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapacfc2f5c-0e, col_values=(('external_ids', {'iface-id': 'acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:a6:e8', 'vm-uuid': '2fbbeeee-ff60-4a39-9bea-e3d59301b0ad'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.914 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:56 np0005603623 NetworkManager[48970]: <info>  [1769850116.9152] manager: (tapacfc2f5c-0e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/380)
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.916 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.919 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:56 np0005603623 nova_compute[226235]: 2026-01-31 09:01:56.920 226239 INFO os_vif [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:a6:e8,bridge_name='br-int',has_traffic_filtering=True,id=acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacfc2f5c-0e')#033[00m
Jan 31 04:01:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:57.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:57 np0005603623 nova_compute[226235]: 2026-01-31 09:01:57.159 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:01:57 np0005603623 nova_compute[226235]: 2026-01-31 09:01:57.159 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:01:57 np0005603623 nova_compute[226235]: 2026-01-31 09:01:57.160 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] No VIF found with MAC fa:16:3e:ad:a6:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:01:57 np0005603623 nova_compute[226235]: 2026-01-31 09:01:57.160 226239 INFO nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Using config drive#033[00m
Jan 31 04:01:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:57.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:57 np0005603623 nova_compute[226235]: 2026-01-31 09:01:57.183 226239 DEBUG nova.storage.rbd_utils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] rbd image 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:01:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e390 e390: 3 total, 3 up, 3 in
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.338 226239 INFO nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Creating config drive at /var/lib/nova/instances/2fbbeeee-ff60-4a39-9bea-e3d59301b0ad/disk.config#033[00m
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.342 226239 DEBUG oslo_concurrency.processutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2fbbeeee-ff60-4a39-9bea-e3d59301b0ad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmplg0e1txo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.462 226239 DEBUG oslo_concurrency.processutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2fbbeeee-ff60-4a39-9bea-e3d59301b0ad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmplg0e1txo" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.488 226239 DEBUG nova.storage.rbd_utils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] rbd image 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.491 226239 DEBUG oslo_concurrency.processutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2fbbeeee-ff60-4a39-9bea-e3d59301b0ad/disk.config 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.749 226239 DEBUG oslo_concurrency.processutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2fbbeeee-ff60-4a39-9bea-e3d59301b0ad/disk.config 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.258s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.749 226239 INFO nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Deleting local config drive /var/lib/nova/instances/2fbbeeee-ff60-4a39-9bea-e3d59301b0ad/disk.config because it was imported into RBD.#033[00m
Jan 31 04:01:58 np0005603623 kernel: tapacfc2f5c-0e: entered promiscuous mode
Jan 31 04:01:58 np0005603623 NetworkManager[48970]: <info>  [1769850118.7849] manager: (tapacfc2f5c-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/381)
Jan 31 04:01:58 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:58Z|00812|binding|INFO|Claiming lport acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 for this chassis.
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.785 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:58 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:58Z|00813|binding|INFO|acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1: Claiming fa:16:3e:ad:a6:e8 10.100.0.3
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.788 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.790 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:58 np0005603623 systemd-machined[194379]: New machine qemu-91-instance-000000c3.
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.811 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.814 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:a6:e8 10.100.0.3'], port_security=['fa:16:3e:ad:a6:e8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '2fbbeeee-ff60-4a39-9bea-e3d59301b0ad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f293713f6854265a89a1a4a002088d5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '400baeb3-ed1b-4018-bb0b-3e4e58b8921d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05ac80f4-66e3-4e8c-b69d-f2f58ada92e8, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:01:58 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:58Z|00814|binding|INFO|Setting lport acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 ovn-installed in OVS
Jan 31 04:01:58 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:58Z|00815|binding|INFO|Setting lport acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 up in Southbound
Jan 31 04:01:58 np0005603623 nova_compute[226235]: 2026-01-31 09:01:58.815 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:58 np0005603623 systemd-udevd[319076]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:01:58 np0005603623 systemd[1]: Started Virtual Machine qemu-91-instance-000000c3.
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.816 143258 INFO neutron.agent.ovn.metadata.agent [-] Port acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 in datapath 1c62fa1c-f7d2-4937-9258-1d3a4456b207 bound to our chassis#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.820 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1c62fa1c-f7d2-4937-9258-1d3a4456b207#033[00m
Jan 31 04:01:58 np0005603623 NetworkManager[48970]: <info>  [1769850118.8271] device (tapacfc2f5c-0e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:01:58 np0005603623 NetworkManager[48970]: <info>  [1769850118.8276] device (tapacfc2f5c-0e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.829 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d4c65e34-cc2d-490f-9453-e79b01c62e95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.831 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1c62fa1c-f1 in ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.833 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1c62fa1c-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.833 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb75090-c968-460b-8b83-47360ba78bd6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.833 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4ecde74f-f9cb-4d00-b38e-278cef0a4866]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.844 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[e3792dd0-5cc3-4d2a-a68a-970297044ea4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.854 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[35908ca7-0278-4ddd-be19-1b5bb774c31d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.875 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b41650d9-8311-4805-8ba5-de16318e4a6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 systemd-udevd[319079]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.879 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0f3a8493-a298-405f-ae7f-cbd74af9235e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 NetworkManager[48970]: <info>  [1769850118.8801] manager: (tap1c62fa1c-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/382)
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.902 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[5471e297-1261-4c69-bc25-c235ce377a41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.906 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e5a5aaf3-3905-47a0-a393-af98a2d4815a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 NetworkManager[48970]: <info>  [1769850118.9211] device (tap1c62fa1c-f0): carrier: link connected
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.922 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[48a1a464-8fc9-4d9e-b296-78327a52aa5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.936 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[87588419-bccd-4e1d-9104-02d53d632416]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1c62fa1c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:15:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 239], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 925935, 'reachable_time': 44726, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 319109, 'error': None, 'target': 'ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.950 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[68e8321f-8c29-4371-881d-f24ef55b5aa8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:1552'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 925935, 'tstamp': 925935}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 319110, 'error': None, 'target': 'ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.962 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[636b6ae7-e717-4c72-b0f1-95a8feea1a67]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1c62fa1c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:15:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 239], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 925935, 'reachable_time': 44726, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 319111, 'error': None, 'target': 'ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:58 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:58.983 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc55beb-1f4a-4cf3-a292-e78aca467334]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:59.019 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b176104c-79f6-43c6-a7b9-c8780b6bb588]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:59.020 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c62fa1c-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:59.021 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:59.021 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c62fa1c-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:59 np0005603623 kernel: tap1c62fa1c-f0: entered promiscuous mode
Jan 31 04:01:59 np0005603623 NetworkManager[48970]: <info>  [1769850119.0252] manager: (tap1c62fa1c-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/383)
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.024 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:59.030 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1c62fa1c-f0, col_values=(('external_ids', {'iface-id': '46e41546-aa3b-4838-b2c2-ba3b46cf445c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:01:59 np0005603623 ovn_controller[133449]: 2026-01-31T09:01:59Z|00816|binding|INFO|Releasing lport 46e41546-aa3b-4838-b2c2-ba3b46cf445c from this chassis (sb_readonly=0)
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.031 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:59.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.035 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:59.036 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1c62fa1c-f7d2-4937-9258-1d3a4456b207.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1c62fa1c-f7d2-4937-9258-1d3a4456b207.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:59.037 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3e4637-7ade-4465-b488-f8062c3c8d18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:59.037 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-1c62fa1c-f7d2-4937-9258-1d3a4456b207
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/1c62fa1c-f7d2-4937-9258-1d3a4456b207.pid.haproxy
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 1c62fa1c-f7d2-4937-9258-1d3a4456b207
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:01:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:01:59.038 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'env', 'PROCESS_TAG=haproxy-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1c62fa1c-f7d2-4937-9258-1d3a4456b207.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:01:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:01:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:59.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:59 np0005603623 podman[319157]: 2026-01-31 09:01:59.334668878 +0000 UTC m=+0.043368521 container create 6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 04:01:59 np0005603623 systemd[1]: Started libpod-conmon-6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132.scope.
Jan 31 04:01:59 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:01:59 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4494b4570a1b0e13c8a4ceb0e3eeb354f2b82d3b54355e8fc3f1fcd00ef241a5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:01:59 np0005603623 podman[319157]: 2026-01-31 09:01:59.387342551 +0000 UTC m=+0.096042224 container init 6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 04:01:59 np0005603623 podman[319157]: 2026-01-31 09:01:59.392991878 +0000 UTC m=+0.101691531 container start 6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:01:59 np0005603623 podman[319157]: 2026-01-31 09:01:59.311508032 +0000 UTC m=+0.020207695 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:01:59 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[319193]: [NOTICE]   (319202) : New worker (319204) forked
Jan 31 04:01:59 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[319193]: [NOTICE]   (319202) : Loading success.
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.470 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850119.4702437, 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.471 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] VM Started (Lifecycle Event)#033[00m
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.581 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.585 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850119.4711926, 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.585 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.627 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.629 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.653 226239 DEBUG nova.network.neutron [req-310ade88-c356-4bd9-aaa0-b536161262cf req-49d32787-a98c-4b73-9869-9bb5249f0077 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Updated VIF entry in instance network info cache for port acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.653 226239 DEBUG nova.network.neutron [req-310ade88-c356-4bd9-aaa0-b536161262cf req-49d32787-a98c-4b73-9869-9bb5249f0077 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Updating instance_info_cache with network_info: [{"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.694 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:01:59 np0005603623 nova_compute[226235]: 2026-01-31 09:01:59.697 226239 DEBUG oslo_concurrency.lockutils [req-310ade88-c356-4bd9-aaa0-b536161262cf req-49d32787-a98c-4b73-9869-9bb5249f0077 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.141 226239 DEBUG nova.compute.manager [req-052012b8-f8bb-4953-bc7f-c2bcec736370 req-1f9d6466-ff55-4128-8d3a-53799cb13063 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Received event network-vif-plugged-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.142 226239 DEBUG oslo_concurrency.lockutils [req-052012b8-f8bb-4953-bc7f-c2bcec736370 req-1f9d6466-ff55-4128-8d3a-53799cb13063 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.142 226239 DEBUG oslo_concurrency.lockutils [req-052012b8-f8bb-4953-bc7f-c2bcec736370 req-1f9d6466-ff55-4128-8d3a-53799cb13063 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.142 226239 DEBUG oslo_concurrency.lockutils [req-052012b8-f8bb-4953-bc7f-c2bcec736370 req-1f9d6466-ff55-4128-8d3a-53799cb13063 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.143 226239 DEBUG nova.compute.manager [req-052012b8-f8bb-4953-bc7f-c2bcec736370 req-1f9d6466-ff55-4128-8d3a-53799cb13063 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Processing event network-vif-plugged-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.143 226239 DEBUG nova.compute.manager [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.147 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850120.146399, 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.147 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.149 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.152 226239 INFO nova.virt.libvirt.driver [-] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Instance spawned successfully.#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.153 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.241 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.242 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.243 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.243 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.243 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.244 226239 DEBUG nova.virt.libvirt.driver [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.247 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.250 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.701 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.933 226239 INFO nova.compute.manager [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Took 8.39 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:02:00 np0005603623 nova_compute[226235]: 2026-01-31 09:02:00.934 226239 DEBUG nova.compute.manager [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:01.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:01 np0005603623 nova_compute[226235]: 2026-01-31 09:02:01.140 226239 INFO nova.compute.manager [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Took 12.52 seconds to build instance.#033[00m
Jan 31 04:02:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:02:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:01.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:02:01 np0005603623 nova_compute[226235]: 2026-01-31 09:02:01.426 226239 DEBUG oslo_concurrency.lockutils [None req-34efde99-f272-4f91-b341-842fd92572b1 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:01 np0005603623 nova_compute[226235]: 2026-01-31 09:02:01.634 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:01 np0005603623 nova_compute[226235]: 2026-01-31 09:02:01.914 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e391 e391: 3 total, 3 up, 3 in
Jan 31 04:02:02 np0005603623 nova_compute[226235]: 2026-01-31 09:02:02.851 226239 DEBUG nova.compute.manager [req-f27277ff-0bad-4a5a-90f6-8f9d08f23306 req-2a2bfe0d-503c-4ad0-9bc0-46745fe1a4a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Received event network-vif-plugged-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:02 np0005603623 nova_compute[226235]: 2026-01-31 09:02:02.852 226239 DEBUG oslo_concurrency.lockutils [req-f27277ff-0bad-4a5a-90f6-8f9d08f23306 req-2a2bfe0d-503c-4ad0-9bc0-46745fe1a4a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:02 np0005603623 nova_compute[226235]: 2026-01-31 09:02:02.852 226239 DEBUG oslo_concurrency.lockutils [req-f27277ff-0bad-4a5a-90f6-8f9d08f23306 req-2a2bfe0d-503c-4ad0-9bc0-46745fe1a4a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:02 np0005603623 nova_compute[226235]: 2026-01-31 09:02:02.853 226239 DEBUG oslo_concurrency.lockutils [req-f27277ff-0bad-4a5a-90f6-8f9d08f23306 req-2a2bfe0d-503c-4ad0-9bc0-46745fe1a4a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:02 np0005603623 nova_compute[226235]: 2026-01-31 09:02:02.853 226239 DEBUG nova.compute.manager [req-f27277ff-0bad-4a5a-90f6-8f9d08f23306 req-2a2bfe0d-503c-4ad0-9bc0-46745fe1a4a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] No waiting events found dispatching network-vif-plugged-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:02:02 np0005603623 nova_compute[226235]: 2026-01-31 09:02:02.854 226239 WARNING nova.compute.manager [req-f27277ff-0bad-4a5a-90f6-8f9d08f23306 req-2a2bfe0d-503c-4ad0-9bc0-46745fe1a4a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Received unexpected event network-vif-plugged-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:02:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:02:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:03.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:02:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:03.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:02:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:05.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:02:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:05.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:05 np0005603623 nova_compute[226235]: 2026-01-31 09:02:05.509 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:05 np0005603623 nova_compute[226235]: 2026-01-31 09:02:05.509 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:02:05 np0005603623 nova_compute[226235]: 2026-01-31 09:02:05.864 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:02:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e391 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:06 np0005603623 nova_compute[226235]: 2026-01-31 09:02:06.635 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:06 np0005603623 nova_compute[226235]: 2026-01-31 09:02:06.915 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:07.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:07.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:08 np0005603623 nova_compute[226235]: 2026-01-31 09:02:08.629 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:08 np0005603623 NetworkManager[48970]: <info>  [1769850128.6302] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/384)
Jan 31 04:02:08 np0005603623 NetworkManager[48970]: <info>  [1769850128.6311] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/385)
Jan 31 04:02:08 np0005603623 nova_compute[226235]: 2026-01-31 09:02:08.700 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:08 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:08Z|00817|binding|INFO|Releasing lport 46e41546-aa3b-4838-b2c2-ba3b46cf445c from this chassis (sb_readonly=0)
Jan 31 04:02:08 np0005603623 nova_compute[226235]: 2026-01-31 09:02:08.728 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e392 e392: 3 total, 3 up, 3 in
Jan 31 04:02:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:02:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:09.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:02:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:02:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:09.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:02:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:11.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:11.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:11 np0005603623 nova_compute[226235]: 2026-01-31 09:02:11.510 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:11 np0005603623 nova_compute[226235]: 2026-01-31 09:02:11.511 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:02:11 np0005603623 nova_compute[226235]: 2026-01-31 09:02:11.637 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:11 np0005603623 nova_compute[226235]: 2026-01-31 09:02:11.876 226239 DEBUG nova.compute.manager [req-3aec839d-bb94-4502-99a5-cd620f77fff2 req-38c8b4ea-c924-473c-8c34-10b81837942c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Received event network-changed-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:11 np0005603623 nova_compute[226235]: 2026-01-31 09:02:11.877 226239 DEBUG nova.compute.manager [req-3aec839d-bb94-4502-99a5-cd620f77fff2 req-38c8b4ea-c924-473c-8c34-10b81837942c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Refreshing instance network info cache due to event network-changed-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:02:11 np0005603623 nova_compute[226235]: 2026-01-31 09:02:11.877 226239 DEBUG oslo_concurrency.lockutils [req-3aec839d-bb94-4502-99a5-cd620f77fff2 req-38c8b4ea-c924-473c-8c34-10b81837942c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:02:11 np0005603623 nova_compute[226235]: 2026-01-31 09:02:11.878 226239 DEBUG oslo_concurrency.lockutils [req-3aec839d-bb94-4502-99a5-cd620f77fff2 req-38c8b4ea-c924-473c-8c34-10b81837942c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:02:11 np0005603623 nova_compute[226235]: 2026-01-31 09:02:11.878 226239 DEBUG nova.network.neutron [req-3aec839d-bb94-4502-99a5-cd620f77fff2 req-38c8b4ea-c924-473c-8c34-10b81837942c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Refreshing network info cache for port acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:02:11 np0005603623 nova_compute[226235]: 2026-01-31 09:02:11.917 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:13.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:13.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:13 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:13Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ad:a6:e8 10.100.0.3
Jan 31 04:02:13 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:13Z|00099|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ad:a6:e8 10.100.0.3
Jan 31 04:02:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:15.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:15.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:16 np0005603623 nova_compute[226235]: 2026-01-31 09:02:16.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:16 np0005603623 nova_compute[226235]: 2026-01-31 09:02:16.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:02:16 np0005603623 nova_compute[226235]: 2026-01-31 09:02:16.156 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:02:16 np0005603623 nova_compute[226235]: 2026-01-31 09:02:16.454 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:02:16 np0005603623 nova_compute[226235]: 2026-01-31 09:02:16.638 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:16 np0005603623 nova_compute[226235]: 2026-01-31 09:02:16.919 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:17.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:02:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:17.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:02:17 np0005603623 nova_compute[226235]: 2026-01-31 09:02:17.363 226239 DEBUG nova.network.neutron [req-3aec839d-bb94-4502-99a5-cd620f77fff2 req-38c8b4ea-c924-473c-8c34-10b81837942c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Updated VIF entry in instance network info cache for port acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:02:17 np0005603623 nova_compute[226235]: 2026-01-31 09:02:17.364 226239 DEBUG nova.network.neutron [req-3aec839d-bb94-4502-99a5-cd620f77fff2 req-38c8b4ea-c924-473c-8c34-10b81837942c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Updating instance_info_cache with network_info: [{"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:02:17 np0005603623 nova_compute[226235]: 2026-01-31 09:02:17.637 226239 DEBUG oslo_concurrency.lockutils [req-3aec839d-bb94-4502-99a5-cd620f77fff2 req-38c8b4ea-c924-473c-8c34-10b81837942c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:02:17 np0005603623 nova_compute[226235]: 2026-01-31 09:02:17.637 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:02:17 np0005603623 nova_compute[226235]: 2026-01-31 09:02:17.638 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:02:17 np0005603623 nova_compute[226235]: 2026-01-31 09:02:17.638 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:02:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:02:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:19.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:02:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:19.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:19 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:19Z|00818|memory|INFO|peak resident set size grew 51% in last 5430.4 seconds, from 16128 kB to 24324 kB
Jan 31 04:02:19 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:19Z|00819|memory|INFO|idl-cells-OVN_Southbound:11026 idl-cells-Open_vSwitch:870 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:373 lflow-cache-entries-cache-matches:292 lflow-cache-size-KB:1533 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:648 ofctrl_installed_flow_usage-KB:474 ofctrl_sb_flow_ref_usage-KB:244
Jan 31 04:02:20 np0005603623 podman[319299]: 2026-01-31 09:02:20.173207208 +0000 UTC m=+0.045589142 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 31 04:02:20 np0005603623 podman[319300]: 2026-01-31 09:02:20.196197819 +0000 UTC m=+0.068633454 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 31 04:02:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:02:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:21.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:02:21 np0005603623 nova_compute[226235]: 2026-01-31 09:02:21.138 226239 DEBUG oslo_concurrency.lockutils [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" by "nova.compute.manager.ComputeManager.shelve_offload_instance.<locals>.do_shelve_offload_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:21 np0005603623 nova_compute[226235]: 2026-01-31 09:02:21.138 226239 DEBUG oslo_concurrency.lockutils [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" acquired by "nova.compute.manager.ComputeManager.shelve_offload_instance.<locals>.do_shelve_offload_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:21 np0005603623 nova_compute[226235]: 2026-01-31 09:02:21.139 226239 INFO nova.compute.manager [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Shelve offloading#033[00m
Jan 31 04:02:21 np0005603623 nova_compute[226235]: 2026-01-31 09:02:21.172 226239 DEBUG nova.virt.libvirt.driver [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 04:02:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:21.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:21 np0005603623 nova_compute[226235]: 2026-01-31 09:02:21.641 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:21 np0005603623 nova_compute[226235]: 2026-01-31 09:02:21.921 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:22 np0005603623 nova_compute[226235]: 2026-01-31 09:02:22.670 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Updating instance_info_cache with network_info: [{"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:02:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:23.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:02:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:23.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:02:24 np0005603623 nova_compute[226235]: 2026-01-31 09:02:24.313 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:02:24 np0005603623 nova_compute[226235]: 2026-01-31 09:02:24.314 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:02:24 np0005603623 nova_compute[226235]: 2026-01-31 09:02:24.314 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:24 np0005603623 nova_compute[226235]: 2026-01-31 09:02:24.314 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:24 np0005603623 nova_compute[226235]: 2026-01-31 09:02:24.385 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:24 np0005603623 nova_compute[226235]: 2026-01-31 09:02:24.385 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:24 np0005603623 nova_compute[226235]: 2026-01-31 09:02:24.385 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:24 np0005603623 nova_compute[226235]: 2026-01-31 09:02:24.386 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:02:24 np0005603623 nova_compute[226235]: 2026-01-31 09:02:24.386 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:02:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2140582176' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:02:24 np0005603623 nova_compute[226235]: 2026-01-31 09:02:24.928 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.542s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.056 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000c3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.057 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000c3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:02:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:25.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.183 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.184 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4001MB free_disk=20.921737670898438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.184 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.184 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:25.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.316 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.317 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.317 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.354 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.420 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.420 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.455 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.486 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:02:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:02:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.0 total, 600.0 interval#012Cumulative writes: 16K writes, 81K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.16 GB, 0.03 MB/s#012Cumulative WAL: 16K writes, 16K syncs, 1.00 writes per sync, written: 0.16 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1687 writes, 8285 keys, 1687 commit groups, 1.0 writes per commit group, ingest: 16.37 MB, 0.03 MB/s#012Interval WAL: 1687 writes, 1687 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     43.5      2.32              0.24        52    0.045       0      0       0.0       0.0#012  L6      1/0   12.37 MB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   5.2     60.9     52.1     10.03              1.13        51    0.197    379K    27K       0.0       0.0#012 Sum      1/0   12.37 MB   0.0      0.6     0.1      0.5       0.6      0.1       0.0   6.2     49.5     50.5     12.36              1.36       103    0.120    379K    27K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.3     84.2     86.2      0.94              0.16        12    0.078     61K   3135       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) 
Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   0.0     60.9     52.1     10.03              1.13        51    0.197    379K    27K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     43.5      2.32              0.24        51    0.045       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.0 total, 600.0 interval#012Flush(GB): cumulative 0.099, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.61 GB write, 0.10 MB/s write, 0.60 GB read, 0.10 MB/s read, 12.4 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.13 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 66.61 MB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 0 last_secs: 0.000357 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(3787,63.86 MB,21.0065%) FilterBlock(103,1.03 MB,0.337194%) IndexBlock(103,1.73 MB,0.567783%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 04:02:25 np0005603623 nova_compute[226235]: 2026-01-31 09:02:25.603 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:26 np0005603623 kernel: tapacfc2f5c-0e (unregistering): left promiscuous mode
Jan 31 04:02:26 np0005603623 NetworkManager[48970]: <info>  [1769850146.0821] device (tapacfc2f5c-0e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1492181088' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:02:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:26Z|00820|binding|INFO|Releasing lport acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 from this chassis (sb_readonly=0)
Jan 31 04:02:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:26Z|00821|binding|INFO|Setting lport acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 down in Southbound
Jan 31 04:02:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:26Z|00822|binding|INFO|Removing iface tapacfc2f5c-0e ovn-installed in OVS
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.138 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.140 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.154 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.551s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.160 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:02:26 np0005603623 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000c3.scope: Deactivated successfully.
Jan 31 04:02:26 np0005603623 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000c3.scope: Consumed 13.014s CPU time.
Jan 31 04:02:26 np0005603623 systemd-machined[194379]: Machine qemu-91-instance-000000c3 terminated.
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #166. Immutable memtables: 0.
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.256356) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 105] Flushing memtable with next log file: 166
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850146256425, "job": 105, "event": "flush_started", "num_memtables": 1, "num_entries": 1807, "num_deletes": 255, "total_data_size": 3909476, "memory_usage": 3966792, "flush_reason": "Manual Compaction"}
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 105] Level-0 flush table #167: started
Jan 31 04:02:26 np0005603623 kernel: tapacfc2f5c-0e: entered promiscuous mode
Jan 31 04:02:26 np0005603623 kernel: tapacfc2f5c-0e (unregistering): left promiscuous mode
Jan 31 04:02:26 np0005603623 NetworkManager[48970]: <info>  [1769850146.3466] manager: (tapacfc2f5c-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/386)
Jan 31 04:02:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:26Z|00823|if_status|INFO|Not updating pb chassis for acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 now as sb is readonly
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.345 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:26Z|00824|binding|INFO|Releasing lport acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 from this chassis (sb_readonly=1)
Jan 31 04:02:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:26Z|00825|if_status|INFO|Dropped 8 log messages in last 727 seconds (most recently, 727 seconds ago) due to excessive rate
Jan 31 04:02:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:26Z|00826|if_status|INFO|Not setting lport acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 down as sb is readonly
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.358 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.361 226239 INFO nova.virt.libvirt.driver [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Instance shutdown successfully after 5 seconds.#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.364 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.366 226239 INFO nova.virt.libvirt.driver [-] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Instance destroyed successfully.#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.367 226239 DEBUG nova.objects.instance [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850146423013, "cf_name": "default", "job": 105, "event": "table_file_creation", "file_number": 167, "file_size": 2554761, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 80415, "largest_seqno": 82217, "table_properties": {"data_size": 2547338, "index_size": 4301, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16335, "raw_average_key_size": 20, "raw_value_size": 2532193, "raw_average_value_size": 3185, "num_data_blocks": 189, "num_entries": 795, "num_filter_entries": 795, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850008, "oldest_key_time": 1769850008, "file_creation_time": 1769850146, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 167, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 105] Flush lasted 166698 microseconds, and 5054 cpu microseconds.
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:02:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:26.430 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:a6:e8 10.100.0.3'], port_security=['fa:16:3e:ad:a6:e8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '2fbbeeee-ff60-4a39-9bea-e3d59301b0ad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f293713f6854265a89a1a4a002088d5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '400baeb3-ed1b-4018-bb0b-3e4e58b8921d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05ac80f4-66e3-4e8c-b69d-f2f58ada92e8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:02:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:26.431 143258 INFO neutron.agent.ovn.metadata.agent [-] Port acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 in datapath 1c62fa1c-f7d2-4937-9258-1d3a4456b207 unbound from our chassis#033[00m
Jan 31 04:02:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:26.433 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1c62fa1c-f7d2-4937-9258-1d3a4456b207, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:02:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:26.434 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4dbd883b-0897-4a34-ae8e-51e3cd4cd29d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:26.434 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207 namespace which is not needed anymore#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.462 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.423059) [db/flush_job.cc:967] [default] [JOB 105] Level-0 flush table #167: 2554761 bytes OK
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.423081) [db/memtable_list.cc:519] [default] Level-0 commit table #167 started
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.530105) [db/memtable_list.cc:722] [default] Level-0 commit table #167: memtable #1 done
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.530147) EVENT_LOG_v1 {"time_micros": 1769850146530138, "job": 105, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.530169) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 105] Try to delete WAL files size 3901228, prev total WAL file size 3901228, number of live WAL files 2.
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000163.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.530927) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037303238' seq:72057594037927935, type:22 .. '7061786F730037323830' seq:0, type:0; will stop at (end)
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 106] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 105 Base level 0, inputs: [167(2494KB)], [165(12MB)]
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850146530989, "job": 106, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [167], "files_L6": [165], "score": -1, "input_data_size": 15526148, "oldest_snapshot_seqno": -1}
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.573 226239 DEBUG nova.compute.manager [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.581 226239 DEBUG oslo_concurrency.lockutils [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.582 226239 DEBUG oslo_concurrency.lockutils [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquired lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.582 226239 DEBUG nova.network.neutron [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.643 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:26 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[319193]: [NOTICE]   (319202) : haproxy version is 2.8.14-c23fe91
Jan 31 04:02:26 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[319193]: [NOTICE]   (319202) : path to executable is /usr/sbin/haproxy
Jan 31 04:02:26 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[319193]: [WARNING]  (319202) : Exiting Master process...
Jan 31 04:02:26 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[319193]: [ALERT]    (319202) : Current worker (319204) exited with code 143 (Terminated)
Jan 31 04:02:26 np0005603623 neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207[319193]: [WARNING]  (319202) : All workers exited. Exiting... (0)
Jan 31 04:02:26 np0005603623 systemd[1]: libpod-6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132.scope: Deactivated successfully.
Jan 31 04:02:26 np0005603623 podman[319453]: 2026-01-31 09:02:26.671754502 +0000 UTC m=+0.170609294 container died 6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 106] Generated table #168: 10224 keys, 13698202 bytes, temperature: kUnknown
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850146696841, "cf_name": "default", "job": 106, "event": "table_file_creation", "file_number": 168, "file_size": 13698202, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13631384, "index_size": 40145, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25605, "raw_key_size": 269705, "raw_average_key_size": 26, "raw_value_size": 13451947, "raw_average_value_size": 1315, "num_data_blocks": 1532, "num_entries": 10224, "num_filter_entries": 10224, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769850146, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 168, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.760 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.761 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.697316) [db/compaction/compaction_job.cc:1663] [default] [JOB 106] Compacted 1@0 + 1@6 files to L6 => 13698202 bytes
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.772800) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 93.6 rd, 82.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 12.4 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(11.4) write-amplify(5.4) OK, records in: 10751, records dropped: 527 output_compression: NoCompression
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.772847) EVENT_LOG_v1 {"time_micros": 1769850146772828, "job": 106, "event": "compaction_finished", "compaction_time_micros": 165923, "compaction_time_cpu_micros": 25004, "output_level": 6, "num_output_files": 1, "total_output_size": 13698202, "num_input_records": 10751, "num_output_records": 10224, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000167.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850146774113, "job": 106, "event": "table_file_deletion", "file_number": 167}
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000165.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850146776976, "job": 106, "event": "table_file_deletion", "file_number": 165}
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.530844) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.777038) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.777042) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.777044) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.777045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:02:26 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:02:26.777046) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:02:26 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132-userdata-shm.mount: Deactivated successfully.
Jan 31 04:02:26 np0005603623 systemd[1]: var-lib-containers-storage-overlay-4494b4570a1b0e13c8a4ceb0e3eeb354f2b82d3b54355e8fc3f1fcd00ef241a5-merged.mount: Deactivated successfully.
Jan 31 04:02:26 np0005603623 nova_compute[226235]: 2026-01-31 09:02:26.923 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:27.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:02:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:27.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:02:27 np0005603623 podman[319453]: 2026-01-31 09:02:27.31179796 +0000 UTC m=+0.810652742 container cleanup 6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 04:02:27 np0005603623 systemd[1]: libpod-conmon-6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132.scope: Deactivated successfully.
Jan 31 04:02:27 np0005603623 nova_compute[226235]: 2026-01-31 09:02:27.343 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:27 np0005603623 nova_compute[226235]: 2026-01-31 09:02:27.343 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:27 np0005603623 nova_compute[226235]: 2026-01-31 09:02:27.602 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:27 np0005603623 nova_compute[226235]: 2026-01-31 09:02:27.602 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:27 np0005603623 nova_compute[226235]: 2026-01-31 09:02:27.602 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:27 np0005603623 nova_compute[226235]: 2026-01-31 09:02:27.603 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:27 np0005603623 nova_compute[226235]: 2026-01-31 09:02:27.627 226239 DEBUG nova.compute.manager [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:02:28 np0005603623 podman[319486]: 2026-01-31 09:02:28.516186642 +0000 UTC m=+1.188987730 container remove 6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:02:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:28.522 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb03bed-9920-425e-8650-faefacb1e359]: (4, ('Sat Jan 31 09:02:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207 (6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132)\n6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132\nSat Jan 31 09:02:27 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207 (6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132)\n6a656b34033f2af7230a48e88ebcf0c7ba494db53f295f1e30f0768d09acf132\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:28.524 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b8a8d2fa-19c0-4096-843c-7b802a5774e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:28.525 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c62fa1c-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:28 np0005603623 nova_compute[226235]: 2026-01-31 09:02:28.527 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:28 np0005603623 kernel: tap1c62fa1c-f0: left promiscuous mode
Jan 31 04:02:28 np0005603623 nova_compute[226235]: 2026-01-31 09:02:28.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:28.539 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0047405a-e755-45a5-9c72-4c6272864b75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:28.554 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[321c56bc-f058-4a7d-8b14-5c3f85098a19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:28.556 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[986035db-7f91-45d7-98f3-4353d03ec456]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:28.567 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f85fb748-fad4-4285-ae0f-d5015809b3ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 925930, 'reachable_time': 20650, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 319507, 'error': None, 'target': 'ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:28.571 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1c62fa1c-f7d2-4937-9258-1d3a4456b207 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:02:28 np0005603623 systemd[1]: run-netns-ovnmeta\x2d1c62fa1c\x2df7d2\x2d4937\x2d9258\x2d1d3a4456b207.mount: Deactivated successfully.
Jan 31 04:02:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:28.572 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[13046814-ea88-4189-9dba-f053a9cbf9e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:28 np0005603623 nova_compute[226235]: 2026-01-31 09:02:28.583 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:28 np0005603623 nova_compute[226235]: 2026-01-31 09:02:28.584 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:28 np0005603623 nova_compute[226235]: 2026-01-31 09:02:28.593 226239 DEBUG nova.virt.hardware [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:02:28 np0005603623 nova_compute[226235]: 2026-01-31 09:02:28.593 226239 INFO nova.compute.claims [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 04:02:29 np0005603623 nova_compute[226235]: 2026-01-31 09:02:29.007 226239 DEBUG nova.compute.manager [req-aaeeb478-1ed4-48fe-9cd6-fdfae8e2e918 req-824ceb05-d9c5-4409-8cd7-03aa0a74b1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Received event network-vif-unplugged-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:29 np0005603623 nova_compute[226235]: 2026-01-31 09:02:29.008 226239 DEBUG oslo_concurrency.lockutils [req-aaeeb478-1ed4-48fe-9cd6-fdfae8e2e918 req-824ceb05-d9c5-4409-8cd7-03aa0a74b1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:29 np0005603623 nova_compute[226235]: 2026-01-31 09:02:29.008 226239 DEBUG oslo_concurrency.lockutils [req-aaeeb478-1ed4-48fe-9cd6-fdfae8e2e918 req-824ceb05-d9c5-4409-8cd7-03aa0a74b1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:29 np0005603623 nova_compute[226235]: 2026-01-31 09:02:29.008 226239 DEBUG oslo_concurrency.lockutils [req-aaeeb478-1ed4-48fe-9cd6-fdfae8e2e918 req-824ceb05-d9c5-4409-8cd7-03aa0a74b1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:29 np0005603623 nova_compute[226235]: 2026-01-31 09:02:29.008 226239 DEBUG nova.compute.manager [req-aaeeb478-1ed4-48fe-9cd6-fdfae8e2e918 req-824ceb05-d9c5-4409-8cd7-03aa0a74b1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] No waiting events found dispatching network-vif-unplugged-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:02:29 np0005603623 nova_compute[226235]: 2026-01-31 09:02:29.009 226239 WARNING nova.compute.manager [req-aaeeb478-1ed4-48fe-9cd6-fdfae8e2e918 req-824ceb05-d9c5-4409-8cd7-03aa0a74b1d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Received unexpected event network-vif-unplugged-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 for instance with vm_state active and task_state shelving.#033[00m
Jan 31 04:02:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:29.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:29 np0005603623 nova_compute[226235]: 2026-01-31 09:02:29.080 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:29.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:02:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/231255030' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:02:29 np0005603623 nova_compute[226235]: 2026-01-31 09:02:29.480 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:29 np0005603623 nova_compute[226235]: 2026-01-31 09:02:29.485 226239 DEBUG nova.compute.provider_tree [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:02:29 np0005603623 nova_compute[226235]: 2026-01-31 09:02:29.631 226239 DEBUG nova.scheduler.client.report [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:02:30 np0005603623 nova_compute[226235]: 2026-01-31 09:02:30.061 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.476s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:30 np0005603623 nova_compute[226235]: 2026-01-31 09:02:30.062 226239 DEBUG nova.compute.manager [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:02:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:30.154 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:30.155 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:30 np0005603623 nova_compute[226235]: 2026-01-31 09:02:30.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:30.155 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:30 np0005603623 nova_compute[226235]: 2026-01-31 09:02:30.231 226239 DEBUG nova.compute.manager [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:02:30 np0005603623 nova_compute[226235]: 2026-01-31 09:02:30.231 226239 DEBUG nova.network.neutron [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:02:30 np0005603623 nova_compute[226235]: 2026-01-31 09:02:30.444 226239 INFO nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:02:30 np0005603623 nova_compute[226235]: 2026-01-31 09:02:30.586 226239 DEBUG nova.compute.manager [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:02:30 np0005603623 nova_compute[226235]: 2026-01-31 09:02:30.683 226239 DEBUG nova.policy [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd442c7ba12ed444ca6d4dcc5cfd36150', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:02:30 np0005603623 nova_compute[226235]: 2026-01-31 09:02:30.887 226239 DEBUG nova.network.neutron [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Updating instance_info_cache with network_info: [{"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:02:30 np0005603623 nova_compute[226235]: 2026-01-31 09:02:30.937 226239 DEBUG oslo_concurrency.lockutils [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Releasing lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.002 226239 DEBUG nova.compute.manager [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.003 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.003 226239 INFO nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Creating image(s)#033[00m
Jan 31 04:02:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:31.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.163 226239 DEBUG nova.storage.rbd_utils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.191 226239 DEBUG nova.storage.rbd_utils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:31.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.225 226239 DEBUG nova.storage.rbd_utils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.229 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.294 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.295 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.296 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.296 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.321 226239 DEBUG nova.storage.rbd_utils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.324 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.515 226239 DEBUG nova.compute.manager [req-edd46548-d835-41d7-9fdd-54c79aecf00b req-1e7604b4-309b-478f-9a75-fbe095fe4369 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Received event network-vif-plugged-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.515 226239 DEBUG oslo_concurrency.lockutils [req-edd46548-d835-41d7-9fdd-54c79aecf00b req-1e7604b4-309b-478f-9a75-fbe095fe4369 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.515 226239 DEBUG oslo_concurrency.lockutils [req-edd46548-d835-41d7-9fdd-54c79aecf00b req-1e7604b4-309b-478f-9a75-fbe095fe4369 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.516 226239 DEBUG oslo_concurrency.lockutils [req-edd46548-d835-41d7-9fdd-54c79aecf00b req-1e7604b4-309b-478f-9a75-fbe095fe4369 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.516 226239 DEBUG nova.compute.manager [req-edd46548-d835-41d7-9fdd-54c79aecf00b req-1e7604b4-309b-478f-9a75-fbe095fe4369 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] No waiting events found dispatching network-vif-plugged-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.516 226239 WARNING nova.compute.manager [req-edd46548-d835-41d7-9fdd-54c79aecf00b req-1e7604b4-309b-478f-9a75-fbe095fe4369 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Received unexpected event network-vif-plugged-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 for instance with vm_state active and task_state shelving.#033[00m
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.644 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:31 np0005603623 nova_compute[226235]: 2026-01-31 09:02:31.926 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:33.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:33 np0005603623 nova_compute[226235]: 2026-01-31 09:02:33.160 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.836s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:33.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:33 np0005603623 nova_compute[226235]: 2026-01-31 09:02:33.236 226239 DEBUG nova.storage.rbd_utils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] resizing rbd image 0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.231 226239 DEBUG nova.network.neutron [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Successfully created port: 61191e24-6a0b-4f1f-a895-c15502e9f067 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:02:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:34.281 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=85, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=84) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:02:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:34.282 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.307 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.311 226239 DEBUG nova.objects.instance [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'migration_context' on Instance uuid 0d650376-d2f8-4e6a-b19a-f8639cc5ce52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.343 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.343 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Ensure instance console log exists: /var/lib/nova/instances/0d650376-d2f8-4e6a-b19a-f8639cc5ce52/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.343 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.344 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.344 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.452 226239 INFO nova.virt.libvirt.driver [-] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Instance destroyed successfully.#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.452 226239 DEBUG nova.objects.instance [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lazy-loading 'resources' on Instance uuid 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.476 226239 DEBUG nova.virt.libvirt.vif [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:01:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-700374568',display_name='tempest-TestShelveInstance-server-700374568',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-700374568',id=195,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGtA3anYK1F9iXzxi+WbZdX9+H2hCPBQsf9eZ9YvPVN48lWV0Tj2M8EqzHWhivNuSFaYD9k1TbjDSy9xGFH7/SEr14KZUm/LE8cO61iZWeNARWc/E4iBetyQV/0Aqvvflw==',key_name='tempest-TestShelveInstance-1586472022',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:02:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1f293713f6854265a89a1a4a002088d5',ramdisk_id='',reservation_id='r-kzm36xl7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestShelveInstance-1813478377',owner_user_name='tempest-TestShelveInstance-1813478377-project-member'},tags=<?>,task_state='shelving',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:02:01Z,user_data=None,user_id='3859f52c5b70471097d1e4ffa75ecc0e',uuid=2fbbeeee-ff60-4a39-9bea-e3d59301b0ad,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": 
"192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.477 226239 DEBUG nova.network.os_vif_util [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Converting VIF {"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": "br-int", "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.478 226239 DEBUG nova.network.os_vif_util [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ad:a6:e8,bridge_name='br-int',has_traffic_filtering=True,id=acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacfc2f5c-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.478 226239 DEBUG os_vif [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:a6:e8,bridge_name='br-int',has_traffic_filtering=True,id=acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacfc2f5c-0e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.480 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.480 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapacfc2f5c-0e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.481 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.483 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.484 226239 INFO os_vif [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:a6:e8,bridge_name='br-int',has_traffic_filtering=True,id=acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1,network=Network(1c62fa1c-f7d2-4937-9258-1d3a4456b207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacfc2f5c-0e')#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.793 226239 DEBUG nova.compute.manager [req-0651bb8c-88ea-4796-9ddd-a6f867caa264 req-3eddae11-aad3-49bd-9bad-0e3e7507bf60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Received event network-changed-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.794 226239 DEBUG nova.compute.manager [req-0651bb8c-88ea-4796-9ddd-a6f867caa264 req-3eddae11-aad3-49bd-9bad-0e3e7507bf60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Refreshing instance network info cache due to event network-changed-acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.794 226239 DEBUG oslo_concurrency.lockutils [req-0651bb8c-88ea-4796-9ddd-a6f867caa264 req-3eddae11-aad3-49bd-9bad-0e3e7507bf60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.795 226239 DEBUG oslo_concurrency.lockutils [req-0651bb8c-88ea-4796-9ddd-a6f867caa264 req-3eddae11-aad3-49bd-9bad-0e3e7507bf60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:02:34 np0005603623 nova_compute[226235]: 2026-01-31 09:02:34.795 226239 DEBUG nova.network.neutron [req-0651bb8c-88ea-4796-9ddd-a6f867caa264 req-3eddae11-aad3-49bd-9bad-0e3e7507bf60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Refreshing network info cache for port acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:02:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:35.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:35.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:36 np0005603623 nova_compute[226235]: 2026-01-31 09:02:36.646 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:02:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:37.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:02:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:37.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:38 np0005603623 nova_compute[226235]: 2026-01-31 09:02:38.076 226239 INFO nova.virt.libvirt.driver [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Deleting instance files /var/lib/nova/instances/2fbbeeee-ff60-4a39-9bea-e3d59301b0ad_del#033[00m
Jan 31 04:02:38 np0005603623 nova_compute[226235]: 2026-01-31 09:02:38.077 226239 INFO nova.virt.libvirt.driver [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Deletion of /var/lib/nova/instances/2fbbeeee-ff60-4a39-9bea-e3d59301b0ad_del complete#033[00m
Jan 31 04:02:38 np0005603623 nova_compute[226235]: 2026-01-31 09:02:38.336 226239 DEBUG nova.network.neutron [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Successfully updated port: 61191e24-6a0b-4f1f-a895-c15502e9f067 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:02:38 np0005603623 nova_compute[226235]: 2026-01-31 09:02:38.371 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "refresh_cache-0d650376-d2f8-4e6a-b19a-f8639cc5ce52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:02:38 np0005603623 nova_compute[226235]: 2026-01-31 09:02:38.372 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquired lock "refresh_cache-0d650376-d2f8-4e6a-b19a-f8639cc5ce52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:02:38 np0005603623 nova_compute[226235]: 2026-01-31 09:02:38.372 226239 DEBUG nova.network.neutron [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:02:38 np0005603623 nova_compute[226235]: 2026-01-31 09:02:38.513 226239 DEBUG nova.compute.manager [req-4db3dc91-e06c-4762-ab11-1be31a197a26 req-e1598690-a196-4cee-8fa4-c6c264d8d2ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Received event network-changed-61191e24-6a0b-4f1f-a895-c15502e9f067 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:38 np0005603623 nova_compute[226235]: 2026-01-31 09:02:38.514 226239 DEBUG nova.compute.manager [req-4db3dc91-e06c-4762-ab11-1be31a197a26 req-e1598690-a196-4cee-8fa4-c6c264d8d2ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Refreshing instance network info cache due to event network-changed-61191e24-6a0b-4f1f-a895-c15502e9f067. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:02:38 np0005603623 nova_compute[226235]: 2026-01-31 09:02:38.514 226239 DEBUG oslo_concurrency.lockutils [req-4db3dc91-e06c-4762-ab11-1be31a197a26 req-e1598690-a196-4cee-8fa4-c6c264d8d2ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-0d650376-d2f8-4e6a-b19a-f8639cc5ce52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:02:38 np0005603623 nova_compute[226235]: 2026-01-31 09:02:38.687 226239 DEBUG nova.network.neutron [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:02:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:39.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:39 np0005603623 nova_compute[226235]: 2026-01-31 09:02:39.162 226239 DEBUG nova.network.neutron [req-0651bb8c-88ea-4796-9ddd-a6f867caa264 req-3eddae11-aad3-49bd-9bad-0e3e7507bf60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Updated VIF entry in instance network info cache for port acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:02:39 np0005603623 nova_compute[226235]: 2026-01-31 09:02:39.163 226239 DEBUG nova.network.neutron [req-0651bb8c-88ea-4796-9ddd-a6f867caa264 req-3eddae11-aad3-49bd-9bad-0e3e7507bf60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Updating instance_info_cache with network_info: [{"id": "acfc2f5c-0e80-48f1-ba84-7ec66c3d64b1", "address": "fa:16:3e:ad:a6:e8", "network": {"id": "1c62fa1c-f7d2-4937-9258-1d3a4456b207", "bridge": null, "label": "tempest-TestShelveInstance-1686478311-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f293713f6854265a89a1a4a002088d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapacfc2f5c-0e", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:02:39 np0005603623 nova_compute[226235]: 2026-01-31 09:02:39.209 226239 DEBUG oslo_concurrency.lockutils [req-0651bb8c-88ea-4796-9ddd-a6f867caa264 req-3eddae11-aad3-49bd-9bad-0e3e7507bf60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:02:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:39.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:39 np0005603623 nova_compute[226235]: 2026-01-31 09:02:39.409 226239 INFO nova.scheduler.client.report [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Deleted allocations for instance 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad#033[00m
Jan 31 04:02:39 np0005603623 nova_compute[226235]: 2026-01-31 09:02:39.482 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:39 np0005603623 nova_compute[226235]: 2026-01-31 09:02:39.556 226239 DEBUG oslo_concurrency.lockutils [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:39 np0005603623 nova_compute[226235]: 2026-01-31 09:02:39.557 226239 DEBUG oslo_concurrency.lockutils [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:39 np0005603623 nova_compute[226235]: 2026-01-31 09:02:39.648 226239 DEBUG oslo_concurrency.processutils [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:02:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4165647913' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:02:40 np0005603623 nova_compute[226235]: 2026-01-31 09:02:40.055 226239 DEBUG oslo_concurrency.processutils [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:40 np0005603623 nova_compute[226235]: 2026-01-31 09:02:40.059 226239 DEBUG nova.compute.provider_tree [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:02:40 np0005603623 nova_compute[226235]: 2026-01-31 09:02:40.094 226239 DEBUG nova.scheduler.client.report [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:02:40 np0005603623 nova_compute[226235]: 2026-01-31 09:02:40.141 226239 DEBUG oslo_concurrency.lockutils [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:40 np0005603623 nova_compute[226235]: 2026-01-31 09:02:40.213 226239 DEBUG oslo_concurrency.lockutils [None req-5b316df0-8dde-4c2c-aa9e-b1bef985d349 3859f52c5b70471097d1e4ffa75ecc0e 1f293713f6854265a89a1a4a002088d5 - - default default] Lock "2fbbeeee-ff60-4a39-9bea-e3d59301b0ad" "released" by "nova.compute.manager.ComputeManager.shelve_offload_instance.<locals>.do_shelve_offload_instance" :: held 19.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:40 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 04:02:40 np0005603623 nova_compute[226235]: 2026-01-31 09:02:40.350 226239 DEBUG nova.network.neutron [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Updating instance_info_cache with network_info: [{"id": "61191e24-6a0b-4f1f-a895-c15502e9f067", "address": "fa:16:3e:9e:df:1b", "network": {"id": "58d12028-6cf1-48b0-8622-9e4a18613610", "bridge": "br-int", "label": "tempest-network-smoke--1228215158", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61191e24-6a", "ovs_interfaceid": "61191e24-6a0b-4f1f-a895-c15502e9f067", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:02:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:41.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:41.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.256 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Releasing lock "refresh_cache-0d650376-d2f8-4e6a-b19a-f8639cc5ce52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.256 226239 DEBUG nova.compute.manager [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Instance network_info: |[{"id": "61191e24-6a0b-4f1f-a895-c15502e9f067", "address": "fa:16:3e:9e:df:1b", "network": {"id": "58d12028-6cf1-48b0-8622-9e4a18613610", "bridge": "br-int", "label": "tempest-network-smoke--1228215158", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61191e24-6a", "ovs_interfaceid": "61191e24-6a0b-4f1f-a895-c15502e9f067", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.256 226239 DEBUG oslo_concurrency.lockutils [req-4db3dc91-e06c-4762-ab11-1be31a197a26 req-e1598690-a196-4cee-8fa4-c6c264d8d2ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-0d650376-d2f8-4e6a-b19a-f8639cc5ce52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.257 226239 DEBUG nova.network.neutron [req-4db3dc91-e06c-4762-ab11-1be31a197a26 req-e1598690-a196-4cee-8fa4-c6c264d8d2ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Refreshing network info cache for port 61191e24-6a0b-4f1f-a895-c15502e9f067 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.259 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Start _get_guest_xml network_info=[{"id": "61191e24-6a0b-4f1f-a895-c15502e9f067", "address": "fa:16:3e:9e:df:1b", "network": {"id": "58d12028-6cf1-48b0-8622-9e4a18613610", "bridge": "br-int", "label": "tempest-network-smoke--1228215158", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61191e24-6a", "ovs_interfaceid": "61191e24-6a0b-4f1f-a895-c15502e9f067", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.262 226239 WARNING nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.266 226239 DEBUG nova.virt.libvirt.host [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.267 226239 DEBUG nova.virt.libvirt.host [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.272 226239 DEBUG nova.virt.libvirt.host [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.273 226239 DEBUG nova.virt.libvirt.host [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.275 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.276 226239 DEBUG nova.virt.hardware [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.277 226239 DEBUG nova.virt.hardware [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.277 226239 DEBUG nova.virt.hardware [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.278 226239 DEBUG nova.virt.hardware [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.278 226239 DEBUG nova.virt.hardware [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.279 226239 DEBUG nova.virt.hardware [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.279 226239 DEBUG nova.virt.hardware [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.280 226239 DEBUG nova.virt.hardware [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.280 226239 DEBUG nova.virt.hardware [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.281 226239 DEBUG nova.virt.hardware [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.281 226239 DEBUG nova.virt.hardware [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:02:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:41.284 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '85'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.287 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.362 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850146.3594007, 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.362 226239 INFO nova.compute.manager [-] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.384 226239 DEBUG nova.compute.manager [None req-9cdf9554-feb3-4e36-9aff-660b7a658190 - - - - - -] [instance: 2fbbeeee-ff60-4a39-9bea-e3d59301b0ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.647 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:02:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/410196146' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.731 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.759 226239 DEBUG nova.storage.rbd_utils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:41 np0005603623 nova_compute[226235]: 2026-01-31 09:02:41.762 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:02:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/706568901' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.155 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.393s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.157 226239 DEBUG nova.virt.libvirt.vif [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:02:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-583579837',display_name='tempest-TestNetworkBasicOps-server-583579837',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-583579837',id=198,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEy3xxnQwS9QtMrUkwVHcHmpmeYhKB5YWo/R2bmWdQq9YEvCyb+gT+4GjT28KiprJJz2dZS4ZGJGGhvzW1p3Ze2nwQi6n5tMoA/qRpwp3bGqxwS+holIW8WwQmWwrA+gIg==',key_name='tempest-TestNetworkBasicOps-371986021',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-r9w2hza2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:02:30Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=0d650376-d2f8-4e6a-b19a-f8639cc5ce52,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61191e24-6a0b-4f1f-a895-c15502e9f067", "address": "fa:16:3e:9e:df:1b", "network": {"id": "58d12028-6cf1-48b0-8622-9e4a18613610", "bridge": "br-int", "label": "tempest-network-smoke--1228215158", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61191e24-6a", "ovs_interfaceid": "61191e24-6a0b-4f1f-a895-c15502e9f067", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.157 226239 DEBUG nova.network.os_vif_util [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "61191e24-6a0b-4f1f-a895-c15502e9f067", "address": "fa:16:3e:9e:df:1b", "network": {"id": "58d12028-6cf1-48b0-8622-9e4a18613610", "bridge": "br-int", "label": "tempest-network-smoke--1228215158", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61191e24-6a", "ovs_interfaceid": "61191e24-6a0b-4f1f-a895-c15502e9f067", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.159 226239 DEBUG nova.network.os_vif_util [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:df:1b,bridge_name='br-int',has_traffic_filtering=True,id=61191e24-6a0b-4f1f-a895-c15502e9f067,network=Network(58d12028-6cf1-48b0-8622-9e4a18613610),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61191e24-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.160 226239 DEBUG nova.objects.instance [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0d650376-d2f8-4e6a-b19a-f8639cc5ce52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.195 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  <uuid>0d650376-d2f8-4e6a-b19a-f8639cc5ce52</uuid>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  <name>instance-000000c6</name>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestNetworkBasicOps-server-583579837</nova:name>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:02:41</nova:creationTime>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <nova:user uuid="d442c7ba12ed444ca6d4dcc5cfd36150">tempest-TestNetworkBasicOps-104417095-project-member</nova:user>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <nova:project uuid="abf9393aa2b646feb00a3d887a9dee14">tempest-TestNetworkBasicOps-104417095</nova:project>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <nova:port uuid="61191e24-6a0b-4f1f-a895-c15502e9f067">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <entry name="serial">0d650376-d2f8-4e6a-b19a-f8639cc5ce52</entry>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <entry name="uuid">0d650376-d2f8-4e6a-b19a-f8639cc5ce52</entry>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk.config">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:9e:df:1b"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <target dev="tap61191e24-6a"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/0d650376-d2f8-4e6a-b19a-f8639cc5ce52/console.log" append="off"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:02:42 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:02:42 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:02:42 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:02:42 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.196 226239 DEBUG nova.compute.manager [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Preparing to wait for external event network-vif-plugged-61191e24-6a0b-4f1f-a895-c15502e9f067 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.197 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.197 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.197 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.198 226239 DEBUG nova.virt.libvirt.vif [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:02:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-583579837',display_name='tempest-TestNetworkBasicOps-server-583579837',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-583579837',id=198,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEy3xxnQwS9QtMrUkwVHcHmpmeYhKB5YWo/R2bmWdQq9YEvCyb+gT+4GjT28KiprJJz2dZS4ZGJGGhvzW1p3Ze2nwQi6n5tMoA/qRpwp3bGqxwS+holIW8WwQmWwrA+gIg==',key_name='tempest-TestNetworkBasicOps-371986021',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-r9w2hza2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:02:30Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=0d650376-d2f8-4e6a-b19a-f8639cc5ce52,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61191e24-6a0b-4f1f-a895-c15502e9f067", "address": "fa:16:3e:9e:df:1b", "network": {"id": "58d12028-6cf1-48b0-8622-9e4a18613610", "bridge": "br-int", "label": "tempest-network-smoke--1228215158", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61191e24-6a", "ovs_interfaceid": "61191e24-6a0b-4f1f-a895-c15502e9f067", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.198 226239 DEBUG nova.network.os_vif_util [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "61191e24-6a0b-4f1f-a895-c15502e9f067", "address": "fa:16:3e:9e:df:1b", "network": {"id": "58d12028-6cf1-48b0-8622-9e4a18613610", "bridge": "br-int", "label": "tempest-network-smoke--1228215158", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61191e24-6a", "ovs_interfaceid": "61191e24-6a0b-4f1f-a895-c15502e9f067", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.199 226239 DEBUG nova.network.os_vif_util [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:df:1b,bridge_name='br-int',has_traffic_filtering=True,id=61191e24-6a0b-4f1f-a895-c15502e9f067,network=Network(58d12028-6cf1-48b0-8622-9e4a18613610),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61191e24-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.199 226239 DEBUG os_vif [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:df:1b,bridge_name='br-int',has_traffic_filtering=True,id=61191e24-6a0b-4f1f-a895-c15502e9f067,network=Network(58d12028-6cf1-48b0-8622-9e4a18613610),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61191e24-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.200 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.201 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.201 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.204 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.204 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61191e24-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.204 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap61191e24-6a, col_values=(('external_ids', {'iface-id': '61191e24-6a0b-4f1f-a895-c15502e9f067', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9e:df:1b', 'vm-uuid': '0d650376-d2f8-4e6a-b19a-f8639cc5ce52'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:42 np0005603623 NetworkManager[48970]: <info>  [1769850162.2068] manager: (tap61191e24-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/387)
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.207 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.210 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.211 226239 INFO os_vif [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:df:1b,bridge_name='br-int',has_traffic_filtering=True,id=61191e24-6a0b-4f1f-a895-c15502e9f067,network=Network(58d12028-6cf1-48b0-8622-9e4a18613610),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61191e24-6a')#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.412 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.413 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.413 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No VIF found with MAC fa:16:3e:9e:df:1b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.414 226239 INFO nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Using config drive#033[00m
Jan 31 04:02:42 np0005603623 nova_compute[226235]: 2026-01-31 09:02:42.441 226239 DEBUG nova.storage.rbd_utils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:43.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:43.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:43 np0005603623 nova_compute[226235]: 2026-01-31 09:02:43.918 226239 INFO nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Creating config drive at /var/lib/nova/instances/0d650376-d2f8-4e6a-b19a-f8639cc5ce52/disk.config#033[00m
Jan 31 04:02:43 np0005603623 nova_compute[226235]: 2026-01-31 09:02:43.922 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0d650376-d2f8-4e6a-b19a-f8639cc5ce52/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqxhu0hpn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.041 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0d650376-d2f8-4e6a-b19a-f8639cc5ce52/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqxhu0hpn" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.069 226239 DEBUG nova.storage.rbd_utils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.073 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0d650376-d2f8-4e6a-b19a-f8639cc5ce52/disk.config 0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.449 226239 DEBUG oslo_concurrency.processutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0d650376-d2f8-4e6a-b19a-f8639cc5ce52/disk.config 0d650376-d2f8-4e6a-b19a-f8639cc5ce52_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.376s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.450 226239 INFO nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Deleting local config drive /var/lib/nova/instances/0d650376-d2f8-4e6a-b19a-f8639cc5ce52/disk.config because it was imported into RBD.#033[00m
Jan 31 04:02:44 np0005603623 kernel: tap61191e24-6a: entered promiscuous mode
Jan 31 04:02:44 np0005603623 NetworkManager[48970]: <info>  [1769850164.4846] manager: (tap61191e24-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/388)
Jan 31 04:02:44 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:44Z|00827|binding|INFO|Claiming lport 61191e24-6a0b-4f1f-a895-c15502e9f067 for this chassis.
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.484 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:44 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:44Z|00828|binding|INFO|61191e24-6a0b-4f1f-a895-c15502e9f067: Claiming fa:16:3e:9e:df:1b 10.100.0.29
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.487 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.499 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:44 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:44Z|00829|binding|INFO|Setting lport 61191e24-6a0b-4f1f-a895-c15502e9f067 ovn-installed in OVS
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.504 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:44 np0005603623 systemd-udevd[319930]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:02:44 np0005603623 systemd-machined[194379]: New machine qemu-92-instance-000000c6.
Jan 31 04:02:44 np0005603623 NetworkManager[48970]: <info>  [1769850164.5219] device (tap61191e24-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:02:44 np0005603623 NetworkManager[48970]: <info>  [1769850164.5226] device (tap61191e24-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:02:44 np0005603623 systemd[1]: Started Virtual Machine qemu-92-instance-000000c6.
Jan 31 04:02:44 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:44Z|00830|binding|INFO|Setting lport 61191e24-6a0b-4f1f-a895-c15502e9f067 up in Southbound
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.547 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:df:1b 10.100.0.29'], port_security=['fa:16:3e:9e:df:1b 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': '0d650376-d2f8-4e6a-b19a-f8639cc5ce52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d12028-6cf1-48b0-8622-9e4a18613610', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7ac0caca-a209-4e29-a9eb-05d42413f872', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69ee6f93-cadf-4e2f-a073-fd82c56c8449, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=61191e24-6a0b-4f1f-a895-c15502e9f067) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.548 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 61191e24-6a0b-4f1f-a895-c15502e9f067 in datapath 58d12028-6cf1-48b0-8622-9e4a18613610 bound to our chassis#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.550 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58d12028-6cf1-48b0-8622-9e4a18613610#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.558 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[78e21f6f-5f2a-4787-9c0a-49a6b3b26405]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.559 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58d12028-61 in ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.560 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58d12028-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.560 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ff70a953-a9a5-41af-b799-6723ce809f4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.561 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[79e5e218-b58f-470c-aa13-795d188a5c1b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.569 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[3d634b3b-880c-4678-a63b-919ff82455b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.576 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[367b0f4c-145f-4189-8293-8047c41865d4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.595 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e48d64-8cac-4ff7-9623-a9cd3e922ba8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 NetworkManager[48970]: <info>  [1769850164.6006] manager: (tap58d12028-60): new Veth device (/org/freedesktop/NetworkManager/Devices/389)
Jan 31 04:02:44 np0005603623 systemd-udevd[319932]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.599 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[552c918e-f07c-4917-9d50-f0642c1e3505]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.623 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[46cb2f51-2954-4fdd-ba6a-d43ad0469d9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.625 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[952b0e9a-771b-4208-9d15-b0759a0199f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 NetworkManager[48970]: <info>  [1769850164.6408] device (tap58d12028-60): carrier: link connected
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.645 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d5a04ba4-34a1-4a1c-ba3f-d89414e3779d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.657 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[53b85216-f23d-4343-819f-18466a67f309]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58d12028-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:54:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 242], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 930507, 'reachable_time': 27060, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 319964, 'error': None, 'target': 'ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.668 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4c7010d5-3060-49d9-8430-5459fbc45698]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:543f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 930507, 'tstamp': 930507}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 319965, 'error': None, 'target': 'ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.679 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ee9c84-e390-47db-b9cf-9122816d63a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58d12028-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:54:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 242], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 930507, 'reachable_time': 27060, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 319966, 'error': None, 'target': 'ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.699 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[41094c0e-0867-40f6-8eec-72f8c9cb46de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.735 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[60979e81-ec0f-43c5-a052-4c1e6d058a53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.736 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d12028-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.737 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.737 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58d12028-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:44 np0005603623 kernel: tap58d12028-60: entered promiscuous mode
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.739 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:44 np0005603623 NetworkManager[48970]: <info>  [1769850164.7409] manager: (tap58d12028-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/390)
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.741 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58d12028-60, col_values=(('external_ids', {'iface-id': 'e67a29eb-6020-405c-9d90-f517b6b8d40e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:44 np0005603623 ovn_controller[133449]: 2026-01-31T09:02:44Z|00831|binding|INFO|Releasing lport e67a29eb-6020-405c-9d90-f517b6b8d40e from this chassis (sb_readonly=0)
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.742 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.746 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.747 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58d12028-6cf1-48b0-8622-9e4a18613610.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58d12028-6cf1-48b0-8622-9e4a18613610.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.748 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e49b534d-0b74-421d-affc-41aa1c080535]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.748 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-58d12028-6cf1-48b0-8622-9e4a18613610
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/58d12028-6cf1-48b0-8622-9e4a18613610.pid.haproxy
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 58d12028-6cf1-48b0-8622-9e4a18613610
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:02:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:02:44.749 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610', 'env', 'PROCESS_TAG=haproxy-58d12028-6cf1-48b0-8622-9e4a18613610', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58d12028-6cf1-48b0-8622-9e4a18613610.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.940 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850164.9404151, 0d650376-d2f8-4e6a-b19a-f8639cc5ce52 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:02:44 np0005603623 nova_compute[226235]: 2026-01-31 09:02:44.941 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] VM Started (Lifecycle Event)#033[00m
Jan 31 04:02:45 np0005603623 nova_compute[226235]: 2026-01-31 09:02:45.034 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:45 np0005603623 nova_compute[226235]: 2026-01-31 09:02:45.038 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850164.941298, 0d650376-d2f8-4e6a-b19a-f8639cc5ce52 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:02:45 np0005603623 nova_compute[226235]: 2026-01-31 09:02:45.038 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:02:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:45.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:45 np0005603623 podman[320038]: 2026-01-31 09:02:45.020320649 +0000 UTC m=+0.018675107 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:02:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:45.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:45 np0005603623 nova_compute[226235]: 2026-01-31 09:02:45.272 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:45 np0005603623 nova_compute[226235]: 2026-01-31 09:02:45.275 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:02:45 np0005603623 podman[320038]: 2026-01-31 09:02:45.320253358 +0000 UTC m=+0.318607796 container create 837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:02:45 np0005603623 nova_compute[226235]: 2026-01-31 09:02:45.336 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:02:45 np0005603623 systemd[1]: Started libpod-conmon-837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665.scope.
Jan 31 04:02:45 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:02:45 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6387a1782b3d1ca5d9a9abc701e1767f18715c06c743f5b2fbaeeb777db2ae9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:02:45 np0005603623 podman[320038]: 2026-01-31 09:02:45.565204602 +0000 UTC m=+0.563559050 container init 837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 04:02:45 np0005603623 podman[320038]: 2026-01-31 09:02:45.569531048 +0000 UTC m=+0.567885486 container start 837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:02:45 np0005603623 neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610[320053]: [NOTICE]   (320057) : New worker (320059) forked
Jan 31 04:02:45 np0005603623 neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610[320053]: [NOTICE]   (320057) : Loading success.
Jan 31 04:02:45 np0005603623 nova_compute[226235]: 2026-01-31 09:02:45.791 226239 DEBUG nova.network.neutron [req-4db3dc91-e06c-4762-ab11-1be31a197a26 req-e1598690-a196-4cee-8fa4-c6c264d8d2ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Updated VIF entry in instance network info cache for port 61191e24-6a0b-4f1f-a895-c15502e9f067. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:02:45 np0005603623 nova_compute[226235]: 2026-01-31 09:02:45.792 226239 DEBUG nova.network.neutron [req-4db3dc91-e06c-4762-ab11-1be31a197a26 req-e1598690-a196-4cee-8fa4-c6c264d8d2ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Updating instance_info_cache with network_info: [{"id": "61191e24-6a0b-4f1f-a895-c15502e9f067", "address": "fa:16:3e:9e:df:1b", "network": {"id": "58d12028-6cf1-48b0-8622-9e4a18613610", "bridge": "br-int", "label": "tempest-network-smoke--1228215158", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61191e24-6a", "ovs_interfaceid": "61191e24-6a0b-4f1f-a895-c15502e9f067", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:02:45 np0005603623 nova_compute[226235]: 2026-01-31 09:02:45.868 226239 DEBUG oslo_concurrency.lockutils [req-4db3dc91-e06c-4762-ab11-1be31a197a26 req-e1598690-a196-4cee-8fa4-c6c264d8d2ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-0d650376-d2f8-4e6a-b19a-f8639cc5ce52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:02:46 np0005603623 nova_compute[226235]: 2026-01-31 09:02:46.649 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:02:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:47.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:02:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:47.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.241 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.560 226239 DEBUG nova.compute.manager [req-efdaf7b4-1044-4bfd-ab1e-b17e261796e9 req-04ee4d9f-fcd9-42f7-9a1a-93d7a8f7fcba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Received event network-vif-plugged-61191e24-6a0b-4f1f-a895-c15502e9f067 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.561 226239 DEBUG oslo_concurrency.lockutils [req-efdaf7b4-1044-4bfd-ab1e-b17e261796e9 req-04ee4d9f-fcd9-42f7-9a1a-93d7a8f7fcba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.562 226239 DEBUG oslo_concurrency.lockutils [req-efdaf7b4-1044-4bfd-ab1e-b17e261796e9 req-04ee4d9f-fcd9-42f7-9a1a-93d7a8f7fcba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.562 226239 DEBUG oslo_concurrency.lockutils [req-efdaf7b4-1044-4bfd-ab1e-b17e261796e9 req-04ee4d9f-fcd9-42f7-9a1a-93d7a8f7fcba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.562 226239 DEBUG nova.compute.manager [req-efdaf7b4-1044-4bfd-ab1e-b17e261796e9 req-04ee4d9f-fcd9-42f7-9a1a-93d7a8f7fcba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Processing event network-vif-plugged-61191e24-6a0b-4f1f-a895-c15502e9f067 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.563 226239 DEBUG nova.compute.manager [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.566 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850167.5666056, 0d650376-d2f8-4e6a-b19a-f8639cc5ce52 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.567 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.569 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.572 226239 INFO nova.virt.libvirt.driver [-] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Instance spawned successfully.#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.572 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.786 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.791 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.791 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.792 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.792 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.793 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.793 226239 DEBUG nova.virt.libvirt.driver [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.799 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:02:47 np0005603623 nova_compute[226235]: 2026-01-31 09:02:47.946 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:02:48 np0005603623 nova_compute[226235]: 2026-01-31 09:02:48.011 226239 INFO nova.compute.manager [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Took 17.01 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:02:48 np0005603623 nova_compute[226235]: 2026-01-31 09:02:48.011 226239 DEBUG nova.compute.manager [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:48 np0005603623 nova_compute[226235]: 2026-01-31 09:02:48.193 226239 INFO nova.compute.manager [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Took 19.91 seconds to build instance.#033[00m
Jan 31 04:02:48 np0005603623 nova_compute[226235]: 2026-01-31 09:02:48.351 226239 DEBUG oslo_concurrency.lockutils [None req-dab0df74-cb19-40f1-9d67-1a0e2d23025b d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:49.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:02:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:49.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:02:49 np0005603623 podman[320242]: 2026-01-31 09:02:49.829468645 +0000 UTC m=+0.110964863 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507)
Jan 31 04:02:49 np0005603623 podman[320242]: 2026-01-31 09:02:49.915776392 +0000 UTC m=+0.197272610 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 04:02:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:02:50 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1226535535' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:02:50 np0005603623 podman[320346]: 2026-01-31 09:02:50.263211202 +0000 UTC m=+0.046708616 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 04:02:50 np0005603623 podman[320347]: 2026-01-31 09:02:50.29216887 +0000 UTC m=+0.073560008 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:02:50 np0005603623 podman[320433]: 2026-01-31 09:02:50.407185128 +0000 UTC m=+0.048416239 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 04:02:50 np0005603623 podman[320457]: 2026-01-31 09:02:50.468647706 +0000 UTC m=+0.050734442 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 04:02:50 np0005603623 podman[320433]: 2026-01-31 09:02:50.473809479 +0000 UTC m=+0.115040560 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 04:02:50 np0005603623 podman[320504]: 2026-01-31 09:02:50.615307627 +0000 UTC m=+0.041606416 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, io.buildah.version=1.28.2, architecture=x86_64, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=keepalived, release=1793, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, build-date=2023-02-22T09:23:20)
Jan 31 04:02:50 np0005603623 podman[320504]: 2026-01-31 09:02:50.6268637 +0000 UTC m=+0.053162459 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, release=1793, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Jan 31 04:02:50 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:02:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:51.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:02:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 6000.1 total, 600.0 interval
Cumulative writes: 70K writes, 299K keys, 70K commit groups, 1.0 writes per commit group, ingest: 0.31 GB, 0.05 MB/s
Cumulative WAL: 70K writes, 23K syncs, 2.95 writes per sync, written: 0.31 GB, 0.05 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 7633 writes, 29K keys, 7633 commit groups, 1.0 writes per commit group, ingest: 30.71 MB, 0.05 MB/s
Interval WAL: 7633 writes, 2972 syncs, 2.57 writes per sync, written: 0.03 GB, 0.05 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55e60fd28f30#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55e60fd28f30#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction,
Jan 31 04:02:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:51.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:51 np0005603623 nova_compute[226235]: 2026-01-31 09:02:51.559 226239 DEBUG nova.compute.manager [req-65d91cb8-8983-4d66-81da-270f411caa3a req-0369f57f-9c79-429c-83eb-c81d43687dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Received event network-vif-plugged-61191e24-6a0b-4f1f-a895-c15502e9f067 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:51 np0005603623 nova_compute[226235]: 2026-01-31 09:02:51.561 226239 DEBUG oslo_concurrency.lockutils [req-65d91cb8-8983-4d66-81da-270f411caa3a req-0369f57f-9c79-429c-83eb-c81d43687dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:51 np0005603623 nova_compute[226235]: 2026-01-31 09:02:51.561 226239 DEBUG oslo_concurrency.lockutils [req-65d91cb8-8983-4d66-81da-270f411caa3a req-0369f57f-9c79-429c-83eb-c81d43687dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:51 np0005603623 nova_compute[226235]: 2026-01-31 09:02:51.561 226239 DEBUG oslo_concurrency.lockutils [req-65d91cb8-8983-4d66-81da-270f411caa3a req-0369f57f-9c79-429c-83eb-c81d43687dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:51 np0005603623 nova_compute[226235]: 2026-01-31 09:02:51.562 226239 DEBUG nova.compute.manager [req-65d91cb8-8983-4d66-81da-270f411caa3a req-0369f57f-9c79-429c-83eb-c81d43687dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] No waiting events found dispatching network-vif-plugged-61191e24-6a0b-4f1f-a895-c15502e9f067 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:02:51 np0005603623 nova_compute[226235]: 2026-01-31 09:02:51.562 226239 WARNING nova.compute.manager [req-65d91cb8-8983-4d66-81da-270f411caa3a req-0369f57f-9c79-429c-83eb-c81d43687dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Received unexpected event network-vif-plugged-61191e24-6a0b-4f1f-a895-c15502e9f067 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:02:51 np0005603623 nova_compute[226235]: 2026-01-31 09:02:51.652 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:02:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:02:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:02:51 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:02:52 np0005603623 nova_compute[226235]: 2026-01-31 09:02:52.244 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:02:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:53.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:02:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:53.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:02:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:55.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:02:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:55.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:56 np0005603623 nova_compute[226235]: 2026-01-31 09:02:56.687 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:02:56 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:02:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:57.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:57.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:57 np0005603623 nova_compute[226235]: 2026-01-31 09:02:57.246 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:59.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:02:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:59.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:03:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:01.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:03:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:01.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e393 e393: 3 total, 3 up, 3 in
Jan 31 04:03:01 np0005603623 ovn_controller[133449]: 2026-01-31T09:03:01Z|00100|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9e:df:1b 10.100.0.29
Jan 31 04:03:01 np0005603623 ovn_controller[133449]: 2026-01-31T09:03:01Z|00101|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9e:df:1b 10.100.0.29
Jan 31 04:03:01 np0005603623 nova_compute[226235]: 2026-01-31 09:03:01.687 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e393 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:02 np0005603623 nova_compute[226235]: 2026-01-31 09:03:02.247 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:03.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:03.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:05.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:05.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:06 np0005603623 nova_compute[226235]: 2026-01-31 09:03:06.689 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e393 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:07.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:07.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:07 np0005603623 nova_compute[226235]: 2026-01-31 09:03:07.249 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 e394: 3 total, 3 up, 3 in
Jan 31 04:03:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:03:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:09.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:03:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:09.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:03:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:11.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:03:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:11.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:11 np0005603623 nova_compute[226235]: 2026-01-31 09:03:11.691 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:12 np0005603623 nova_compute[226235]: 2026-01-31 09:03:12.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:12 np0005603623 nova_compute[226235]: 2026-01-31 09:03:12.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:03:12 np0005603623 nova_compute[226235]: 2026-01-31 09:03:12.252 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #169. Immutable memtables: 0.
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.590229) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 107] Flushing memtable with next log file: 169
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850192590272, "job": 107, "event": "flush_started", "num_memtables": 1, "num_entries": 763, "num_deletes": 253, "total_data_size": 1370381, "memory_usage": 1390264, "flush_reason": "Manual Compaction"}
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 107] Level-0 flush table #170: started
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850192598366, "cf_name": "default", "job": 107, "event": "table_file_creation", "file_number": 170, "file_size": 643817, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 82222, "largest_seqno": 82980, "table_properties": {"data_size": 640521, "index_size": 1139, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 9075, "raw_average_key_size": 21, "raw_value_size": 633443, "raw_average_value_size": 1480, "num_data_blocks": 49, "num_entries": 428, "num_filter_entries": 428, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850147, "oldest_key_time": 1769850147, "file_creation_time": 1769850192, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 170, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 107] Flush lasted 8241 microseconds, and 2165 cpu microseconds.
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.598453) [db/flush_job.cc:967] [default] [JOB 107] Level-0 flush table #170: 643817 bytes OK
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.598491) [db/memtable_list.cc:519] [default] Level-0 commit table #170 started
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.606507) [db/memtable_list.cc:722] [default] Level-0 commit table #170: memtable #1 done
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.606522) EVENT_LOG_v1 {"time_micros": 1769850192606517, "job": 107, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.606539) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 107] Try to delete WAL files size 1366299, prev total WAL file size 1367014, number of live WAL files 2.
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000166.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.608429) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032373536' seq:72057594037927935, type:22 .. '6D6772737461740033303039' seq:0, type:0; will stop at (end)
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 108] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 107 Base level 0, inputs: [170(628KB)], [168(13MB)]
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850192608524, "job": 108, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [170], "files_L6": [168], "score": -1, "input_data_size": 14342019, "oldest_snapshot_seqno": -1}
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 108] Generated table #171: 10144 keys, 10722486 bytes, temperature: kUnknown
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850192733244, "cf_name": "default", "job": 108, "event": "table_file_creation", "file_number": 171, "file_size": 10722486, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10660386, "index_size": 35619, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25413, "raw_key_size": 268302, "raw_average_key_size": 26, "raw_value_size": 10486654, "raw_average_value_size": 1033, "num_data_blocks": 1343, "num_entries": 10144, "num_filter_entries": 10144, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769850192, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 171, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.733519) [db/compaction/compaction_job.cc:1663] [default] [JOB 108] Compacted 1@0 + 1@6 files to L6 => 10722486 bytes
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.736177) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 114.9 rd, 85.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 13.1 +0.0 blob) out(10.2 +0.0 blob), read-write-amplify(38.9) write-amplify(16.7) OK, records in: 10652, records dropped: 508 output_compression: NoCompression
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.736238) EVENT_LOG_v1 {"time_micros": 1769850192736217, "job": 108, "event": "compaction_finished", "compaction_time_micros": 124793, "compaction_time_cpu_micros": 22323, "output_level": 6, "num_output_files": 1, "total_output_size": 10722486, "num_input_records": 10652, "num_output_records": 10144, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000170.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850192736644, "job": 108, "event": "table_file_deletion", "file_number": 170}
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000168.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850192737978, "job": 108, "event": "table_file_deletion", "file_number": 168}
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.608326) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.738066) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.738071) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.738076) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.738079) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:03:12 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:03:12.738081) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:03:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:03:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:13.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:03:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:13.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:15.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:15.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.265 226239 DEBUG oslo_concurrency.lockutils [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.266 226239 DEBUG oslo_concurrency.lockutils [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.266 226239 DEBUG oslo_concurrency.lockutils [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.266 226239 DEBUG oslo_concurrency.lockutils [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.266 226239 DEBUG oslo_concurrency.lockutils [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.267 226239 INFO nova.compute.manager [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Terminating instance#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.268 226239 DEBUG nova.compute.manager [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:03:16 np0005603623 kernel: tap61191e24-6a (unregistering): left promiscuous mode
Jan 31 04:03:16 np0005603623 NetworkManager[48970]: <info>  [1769850196.3483] device (tap61191e24-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.381 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:16 np0005603623 ovn_controller[133449]: 2026-01-31T09:03:16Z|00832|binding|INFO|Releasing lport 61191e24-6a0b-4f1f-a895-c15502e9f067 from this chassis (sb_readonly=0)
Jan 31 04:03:16 np0005603623 ovn_controller[133449]: 2026-01-31T09:03:16Z|00833|binding|INFO|Setting lport 61191e24-6a0b-4f1f-a895-c15502e9f067 down in Southbound
Jan 31 04:03:16 np0005603623 ovn_controller[133449]: 2026-01-31T09:03:16Z|00834|binding|INFO|Removing iface tap61191e24-6a ovn-installed in OVS
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.384 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.393 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:16.398 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:df:1b 10.100.0.29'], port_security=['fa:16:3e:9e:df:1b 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': '0d650376-d2f8-4e6a-b19a-f8639cc5ce52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d12028-6cf1-48b0-8622-9e4a18613610', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ac0caca-a209-4e29-a9eb-05d42413f872', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69ee6f93-cadf-4e2f-a073-fd82c56c8449, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=61191e24-6a0b-4f1f-a895-c15502e9f067) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:03:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:16.400 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 61191e24-6a0b-4f1f-a895-c15502e9f067 in datapath 58d12028-6cf1-48b0-8622-9e4a18613610 unbound from our chassis#033[00m
Jan 31 04:03:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:16.402 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58d12028-6cf1-48b0-8622-9e4a18613610, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:03:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:16.403 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c095ace9-c3c9-44f4-8269-4b82835bdee6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:16.403 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610 namespace which is not needed anymore#033[00m
Jan 31 04:03:16 np0005603623 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000c6.scope: Deactivated successfully.
Jan 31 04:03:16 np0005603623 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000c6.scope: Consumed 12.746s CPU time.
Jan 31 04:03:16 np0005603623 systemd-machined[194379]: Machine qemu-92-instance-000000c6 terminated.
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.498 226239 INFO nova.virt.libvirt.driver [-] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Instance destroyed successfully.#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.500 226239 DEBUG nova.objects.instance [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'resources' on Instance uuid 0d650376-d2f8-4e6a-b19a-f8639cc5ce52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:03:16 np0005603623 neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610[320053]: [NOTICE]   (320057) : haproxy version is 2.8.14-c23fe91
Jan 31 04:03:16 np0005603623 neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610[320053]: [NOTICE]   (320057) : path to executable is /usr/sbin/haproxy
Jan 31 04:03:16 np0005603623 neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610[320053]: [WARNING]  (320057) : Exiting Master process...
Jan 31 04:03:16 np0005603623 neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610[320053]: [ALERT]    (320057) : Current worker (320059) exited with code 143 (Terminated)
Jan 31 04:03:16 np0005603623 neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610[320053]: [WARNING]  (320057) : All workers exited. Exiting... (0)
Jan 31 04:03:16 np0005603623 systemd[1]: libpod-837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665.scope: Deactivated successfully.
Jan 31 04:03:16 np0005603623 podman[320802]: 2026-01-31 09:03:16.532151938 +0000 UTC m=+0.054845442 container died 837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.538 226239 DEBUG nova.virt.libvirt.vif [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:02:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-583579837',display_name='tempest-TestNetworkBasicOps-server-583579837',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-583579837',id=198,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEy3xxnQwS9QtMrUkwVHcHmpmeYhKB5YWo/R2bmWdQq9YEvCyb+gT+4GjT28KiprJJz2dZS4ZGJGGhvzW1p3Ze2nwQi6n5tMoA/qRpwp3bGqxwS+holIW8WwQmWwrA+gIg==',key_name='tempest-TestNetworkBasicOps-371986021',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:02:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-r9w2hza2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:02:48Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=0d650376-d2f8-4e6a-b19a-f8639cc5ce52,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61191e24-6a0b-4f1f-a895-c15502e9f067", "address": "fa:16:3e:9e:df:1b", "network": {"id": "58d12028-6cf1-48b0-8622-9e4a18613610", "bridge": "br-int", "label": "tempest-network-smoke--1228215158", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61191e24-6a", "ovs_interfaceid": "61191e24-6a0b-4f1f-a895-c15502e9f067", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.539 226239 DEBUG nova.network.os_vif_util [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "61191e24-6a0b-4f1f-a895-c15502e9f067", "address": "fa:16:3e:9e:df:1b", "network": {"id": "58d12028-6cf1-48b0-8622-9e4a18613610", "bridge": "br-int", "label": "tempest-network-smoke--1228215158", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61191e24-6a", "ovs_interfaceid": "61191e24-6a0b-4f1f-a895-c15502e9f067", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.539 226239 DEBUG nova.network.os_vif_util [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:df:1b,bridge_name='br-int',has_traffic_filtering=True,id=61191e24-6a0b-4f1f-a895-c15502e9f067,network=Network(58d12028-6cf1-48b0-8622-9e4a18613610),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61191e24-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.540 226239 DEBUG os_vif [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:df:1b,bridge_name='br-int',has_traffic_filtering=True,id=61191e24-6a0b-4f1f-a895-c15502e9f067,network=Network(58d12028-6cf1-48b0-8622-9e4a18613610),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61191e24-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.541 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.542 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61191e24-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.543 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.544 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.547 226239 INFO os_vif [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:df:1b,bridge_name='br-int',has_traffic_filtering=True,id=61191e24-6a0b-4f1f-a895-c15502e9f067,network=Network(58d12028-6cf1-48b0-8622-9e4a18613610),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61191e24-6a')#033[00m
Jan 31 04:03:16 np0005603623 systemd[1]: var-lib-containers-storage-overlay-e6387a1782b3d1ca5d9a9abc701e1767f18715c06c743f5b2fbaeeb777db2ae9-merged.mount: Deactivated successfully.
Jan 31 04:03:16 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665-userdata-shm.mount: Deactivated successfully.
Jan 31 04:03:16 np0005603623 podman[320802]: 2026-01-31 09:03:16.671575182 +0000 UTC m=+0.194268676 container cleanup 837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:03:16 np0005603623 systemd[1]: libpod-conmon-837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665.scope: Deactivated successfully.
Jan 31 04:03:16 np0005603623 nova_compute[226235]: 2026-01-31 09:03:16.693 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:16 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:17 np0005603623 podman[320864]: 2026-01-31 09:03:17.026814665 +0000 UTC m=+0.335216296 container remove 837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:03:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:17.030 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9fade5fb-2856-4b99-9e68-e39a588ab0a5]: (4, ('Sat Jan 31 09:03:16 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610 (837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665)\n837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665\nSat Jan 31 09:03:16 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610 (837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665)\n837aff463bef565110677b5a365704f6d6a433cc1a65bfa345cba5f41742a665\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:17.032 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6a8d6ea1-d70c-4f9d-85dd-16c4fde68c82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:17.033 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d12028-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:17 np0005603623 kernel: tap58d12028-60: left promiscuous mode
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.035 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.039 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.040 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:17.042 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[762ad9ce-c45d-44d2-8218-2227d1f64315]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:17.057 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[498e9064-9b53-41d8-8aef-be7e898c970b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:17.059 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[76046fdb-9ea7-46d1-8eb6-4c75715fb144]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:17.069 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[75c153b8-eb29-4ccd-9174-c27d1395f90b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 930502, 'reachable_time': 39454, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 320879, 'error': None, 'target': 'ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:17 np0005603623 systemd[1]: run-netns-ovnmeta\x2d58d12028\x2d6cf1\x2d48b0\x2d8622\x2d9e4a18613610.mount: Deactivated successfully.
Jan 31 04:03:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:17.071 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58d12028-6cf1-48b0-8622-9e4a18613610 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:03:17 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:17.071 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[62186169-1d86-4ebd-a435-334ea086152c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:17.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.205 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.205 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:03:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:17.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.464 226239 INFO nova.virt.libvirt.driver [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Deleting instance files /var/lib/nova/instances/0d650376-d2f8-4e6a-b19a-f8639cc5ce52_del#033[00m
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.465 226239 INFO nova.virt.libvirt.driver [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Deletion of /var/lib/nova/instances/0d650376-d2f8-4e6a-b19a-f8639cc5ce52_del complete#033[00m
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.571 226239 INFO nova.compute.manager [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Took 1.30 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.571 226239 DEBUG oslo.service.loopingcall [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.572 226239 DEBUG nova.compute.manager [-] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:03:17 np0005603623 nova_compute[226235]: 2026-01-31 09:03:17.572 226239 DEBUG nova.network.neutron [-] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.022 226239 DEBUG nova.compute.manager [req-8ec39415-8379-4d68-9703-1e889b9a4be6 req-d78bebef-ef28-4029-8bd6-23f510f3ad3d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Received event network-vif-unplugged-61191e24-6a0b-4f1f-a895-c15502e9f067 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.023 226239 DEBUG oslo_concurrency.lockutils [req-8ec39415-8379-4d68-9703-1e889b9a4be6 req-d78bebef-ef28-4029-8bd6-23f510f3ad3d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.023 226239 DEBUG oslo_concurrency.lockutils [req-8ec39415-8379-4d68-9703-1e889b9a4be6 req-d78bebef-ef28-4029-8bd6-23f510f3ad3d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.023 226239 DEBUG oslo_concurrency.lockutils [req-8ec39415-8379-4d68-9703-1e889b9a4be6 req-d78bebef-ef28-4029-8bd6-23f510f3ad3d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.023 226239 DEBUG nova.compute.manager [req-8ec39415-8379-4d68-9703-1e889b9a4be6 req-d78bebef-ef28-4029-8bd6-23f510f3ad3d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] No waiting events found dispatching network-vif-unplugged-61191e24-6a0b-4f1f-a895-c15502e9f067 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.023 226239 DEBUG nova.compute.manager [req-8ec39415-8379-4d68-9703-1e889b9a4be6 req-d78bebef-ef28-4029-8bd6-23f510f3ad3d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Received event network-vif-unplugged-61191e24-6a0b-4f1f-a895-c15502e9f067 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.236 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.237 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.237 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.237 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.237 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:03:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1004008841' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.630 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.393s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.773 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.774 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4171MB free_disk=20.851333618164062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.774 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:18 np0005603623 nova_compute[226235]: 2026-01-31 09:03:18.774 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:19.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:19.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:20 np0005603623 podman[320929]: 2026-01-31 09:03:20.489616887 +0000 UTC m=+0.045272581 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:03:20 np0005603623 podman[320930]: 2026-01-31 09:03:20.544307083 +0000 UTC m=+0.098189911 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 31 04:03:20 np0005603623 nova_compute[226235]: 2026-01-31 09:03:20.930 226239 DEBUG nova.compute.manager [req-2143473c-9e0c-4e10-b6d7-3b363b3d4d61 req-3c73720d-fecf-4624-a449-057a9a7b3e9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Received event network-vif-plugged-61191e24-6a0b-4f1f-a895-c15502e9f067 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:03:20 np0005603623 nova_compute[226235]: 2026-01-31 09:03:20.930 226239 DEBUG oslo_concurrency.lockutils [req-2143473c-9e0c-4e10-b6d7-3b363b3d4d61 req-3c73720d-fecf-4624-a449-057a9a7b3e9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:20 np0005603623 nova_compute[226235]: 2026-01-31 09:03:20.930 226239 DEBUG oslo_concurrency.lockutils [req-2143473c-9e0c-4e10-b6d7-3b363b3d4d61 req-3c73720d-fecf-4624-a449-057a9a7b3e9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:20 np0005603623 nova_compute[226235]: 2026-01-31 09:03:20.930 226239 DEBUG oslo_concurrency.lockutils [req-2143473c-9e0c-4e10-b6d7-3b363b3d4d61 req-3c73720d-fecf-4624-a449-057a9a7b3e9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:20 np0005603623 nova_compute[226235]: 2026-01-31 09:03:20.931 226239 DEBUG nova.compute.manager [req-2143473c-9e0c-4e10-b6d7-3b363b3d4d61 req-3c73720d-fecf-4624-a449-057a9a7b3e9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] No waiting events found dispatching network-vif-plugged-61191e24-6a0b-4f1f-a895-c15502e9f067 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:03:20 np0005603623 nova_compute[226235]: 2026-01-31 09:03:20.931 226239 WARNING nova.compute.manager [req-2143473c-9e0c-4e10-b6d7-3b363b3d4d61 req-3c73720d-fecf-4624-a449-057a9a7b3e9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Received unexpected event network-vif-plugged-61191e24-6a0b-4f1f-a895-c15502e9f067 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:03:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:21.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.151 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0d650376-d2f8-4e6a-b19a-f8639cc5ce52 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.151 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.151 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.201 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:21.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.477 226239 DEBUG nova.network.neutron [-] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.545 226239 INFO nova.compute.manager [-] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Took 3.97 seconds to deallocate network for instance.#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.546 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:03:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3971060593' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.632 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.637 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.694 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.807 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.844 226239 DEBUG nova.compute.manager [req-4259fd22-aab4-4983-853f-8db66a23b654 req-f6879d4b-ff7a-4972-9af6-ea5c0733007a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Received event network-vif-deleted-61191e24-6a0b-4f1f-a895-c15502e9f067 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.850 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.850 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.916 226239 DEBUG oslo_concurrency.lockutils [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.916 226239 DEBUG oslo_concurrency.lockutils [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:21 np0005603623 nova_compute[226235]: 2026-01-31 09:03:21.994 226239 DEBUG oslo_concurrency.processutils [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:03:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3756029573' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:03:22 np0005603623 nova_compute[226235]: 2026-01-31 09:03:22.387 226239 DEBUG oslo_concurrency.processutils [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.393s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:22 np0005603623 nova_compute[226235]: 2026-01-31 09:03:22.391 226239 DEBUG nova.compute.provider_tree [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:03:22 np0005603623 nova_compute[226235]: 2026-01-31 09:03:22.482 226239 DEBUG nova.scheduler.client.report [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:03:22 np0005603623 nova_compute[226235]: 2026-01-31 09:03:22.960 226239 DEBUG oslo_concurrency.lockutils [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:23 np0005603623 nova_compute[226235]: 2026-01-31 09:03:23.034 226239 INFO nova.scheduler.client.report [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Deleted allocations for instance 0d650376-d2f8-4e6a-b19a-f8639cc5ce52#033[00m
Jan 31 04:03:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:23.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:23 np0005603623 nova_compute[226235]: 2026-01-31 09:03:23.164 226239 DEBUG oslo_concurrency.lockutils [None req-e98023ad-800c-493c-8b6a-d1a8367ee6f9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0d650376-d2f8-4e6a-b19a-f8639cc5ce52" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.898s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:23.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:03:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:25.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:03:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:25.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:25 np0005603623 nova_compute[226235]: 2026-01-31 09:03:25.851 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:26 np0005603623 nova_compute[226235]: 2026-01-31 09:03:26.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:26 np0005603623 nova_compute[226235]: 2026-01-31 09:03:26.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:26 np0005603623 nova_compute[226235]: 2026-01-31 09:03:26.549 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:26 np0005603623 nova_compute[226235]: 2026-01-31 09:03:26.697 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:03:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:27.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:03:27 np0005603623 nova_compute[226235]: 2026-01-31 09:03:27.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:27.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:28 np0005603623 nova_compute[226235]: 2026-01-31 09:03:28.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:29.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:29.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:29 np0005603623 nova_compute[226235]: 2026-01-31 09:03:29.314 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:29 np0005603623 nova_compute[226235]: 2026-01-31 09:03:29.435 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:30 np0005603623 nova_compute[226235]: 2026-01-31 09:03:30.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:30.156 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:30.156 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:30.156 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:03:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:31.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:03:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:31.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:31 np0005603623 nova_compute[226235]: 2026-01-31 09:03:31.497 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850196.4951987, 0d650376-d2f8-4e6a-b19a-f8639cc5ce52 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:03:31 np0005603623 nova_compute[226235]: 2026-01-31 09:03:31.497 226239 INFO nova.compute.manager [-] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:03:31 np0005603623 nova_compute[226235]: 2026-01-31 09:03:31.539 226239 DEBUG nova.compute.manager [None req-e7d7368d-4cac-42fd-92ac-23a7804d51c8 - - - - - -] [instance: 0d650376-d2f8-4e6a-b19a-f8639cc5ce52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:03:31 np0005603623 nova_compute[226235]: 2026-01-31 09:03:31.553 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:31 np0005603623 nova_compute[226235]: 2026-01-31 09:03:31.698 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:33.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:33.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:35.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:35.183 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=86, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=85) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:03:35 np0005603623 nova_compute[226235]: 2026-01-31 09:03:35.183 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:35.184 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:03:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:03:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:35.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:03:36 np0005603623 nova_compute[226235]: 2026-01-31 09:03:36.555 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:36 np0005603623 nova_compute[226235]: 2026-01-31 09:03:36.701 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:37.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:37.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.079 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "05185cfd-db91-4d53-8ae4-57a005be337f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.080 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.109 226239 DEBUG nova.compute.manager [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.226 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.227 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.235 226239 DEBUG nova.virt.hardware [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.236 226239 INFO nova.compute.claims [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 04:03:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:03:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3904878602' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:03:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:03:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3904878602' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.423 226239 DEBUG oslo_concurrency.processutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:03:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/203671473' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.828 226239 DEBUG oslo_concurrency.processutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.835 226239 DEBUG nova.compute.provider_tree [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.896 226239 DEBUG nova.scheduler.client.report [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.947 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.720s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:38 np0005603623 nova_compute[226235]: 2026-01-31 09:03:38.948 226239 DEBUG nova.compute.manager [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.040 226239 DEBUG nova.compute.manager [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.040 226239 DEBUG nova.network.neutron [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.072 226239 INFO nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.107 226239 DEBUG nova.compute.manager [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:03:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:39.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.201 226239 INFO nova.virt.block_device [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Booting with volume 47ca8e30-e640-4fca-a274-67bf157244e9 at /dev/vda#033[00m
Jan 31 04:03:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:39.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.499 226239 DEBUG os_brick.utils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.501 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.511 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.512 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a853b9-d45b-46ca-aef8-2cf6a5a2e55a]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.513 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.520 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.520 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[85c3997a-d020-4d6a-bec8-d5734f043bac]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.522 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.528 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.528 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[240fa693-a8ab-4b92-9987-87b8e9490d14]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.529 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[a40b2af5-a7ae-446c-9aca-d1a93ffb6040]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.530 226239 DEBUG oslo_concurrency.processutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.550 226239 DEBUG nova.policy [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc42b92a5dd34d32b6b184bdc7acb092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76ce367a834b49dfb5b436848118b860', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.554 226239 DEBUG oslo_concurrency.processutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "nvme version" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.556 226239 DEBUG os_brick.initiator.connectors.lightos [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.557 226239 DEBUG os_brick.initiator.connectors.lightos [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.557 226239 DEBUG os_brick.initiator.connectors.lightos [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.558 226239 DEBUG os_brick.utils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] <== get_connector_properties: return (57ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 04:03:39 np0005603623 nova_compute[226235]: 2026-01-31 09:03:39.558 226239 DEBUG nova.virt.block_device [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Updating existing volume attachment record: 89d38aa9-30c7-4827-aeb5-f11a82443d9e _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 04:03:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:41.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:41.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:03:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1348580643' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:03:41 np0005603623 nova_compute[226235]: 2026-01-31 09:03:41.572 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:41 np0005603623 nova_compute[226235]: 2026-01-31 09:03:41.703 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:41 np0005603623 nova_compute[226235]: 2026-01-31 09:03:41.912 226239 DEBUG nova.compute.manager [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:03:41 np0005603623 nova_compute[226235]: 2026-01-31 09:03:41.913 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:03:41 np0005603623 nova_compute[226235]: 2026-01-31 09:03:41.914 226239 INFO nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Creating image(s)#033[00m
Jan 31 04:03:41 np0005603623 nova_compute[226235]: 2026-01-31 09:03:41.914 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 04:03:41 np0005603623 nova_compute[226235]: 2026-01-31 09:03:41.914 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Ensure instance console log exists: /var/lib/nova/instances/05185cfd-db91-4d53-8ae4-57a005be337f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:03:41 np0005603623 nova_compute[226235]: 2026-01-31 09:03:41.915 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:41 np0005603623 nova_compute[226235]: 2026-01-31 09:03:41.915 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:41 np0005603623 nova_compute[226235]: 2026-01-31 09:03:41.915 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:42 np0005603623 nova_compute[226235]: 2026-01-31 09:03:42.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:43.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:43.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:43 np0005603623 nova_compute[226235]: 2026-01-31 09:03:43.761 226239 DEBUG nova.network.neutron [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Successfully created port: eac28070-b447-4e98-9966-378755225162 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:03:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:44.185 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '86'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:45 np0005603623 nova_compute[226235]: 2026-01-31 09:03:45.016 226239 DEBUG nova.network.neutron [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Successfully updated port: eac28070-b447-4e98-9966-378755225162 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:03:45 np0005603623 nova_compute[226235]: 2026-01-31 09:03:45.060 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:03:45 np0005603623 nova_compute[226235]: 2026-01-31 09:03:45.060 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquired lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:03:45 np0005603623 nova_compute[226235]: 2026-01-31 09:03:45.060 226239 DEBUG nova.network.neutron [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:03:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:03:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:45.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:03:45 np0005603623 nova_compute[226235]: 2026-01-31 09:03:45.140 226239 DEBUG nova.compute.manager [req-13667957-d1ae-4b5f-809a-941e10868de7 req-0a8abb06-e6ed-4a4c-91b5-9477f762ebda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Received event network-changed-eac28070-b447-4e98-9966-378755225162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:03:45 np0005603623 nova_compute[226235]: 2026-01-31 09:03:45.140 226239 DEBUG nova.compute.manager [req-13667957-d1ae-4b5f-809a-941e10868de7 req-0a8abb06-e6ed-4a4c-91b5-9477f762ebda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Refreshing instance network info cache due to event network-changed-eac28070-b447-4e98-9966-378755225162. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:03:45 np0005603623 nova_compute[226235]: 2026-01-31 09:03:45.141 226239 DEBUG oslo_concurrency.lockutils [req-13667957-d1ae-4b5f-809a-941e10868de7 req-0a8abb06-e6ed-4a4c-91b5-9477f762ebda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:03:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:45.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:45 np0005603623 nova_compute[226235]: 2026-01-31 09:03:45.309 226239 DEBUG nova.network.neutron [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:03:46 np0005603623 nova_compute[226235]: 2026-01-31 09:03:46.576 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:46 np0005603623 nova_compute[226235]: 2026-01-31 09:03:46.705 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:47.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:47.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.876 226239 DEBUG nova.network.neutron [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Updating instance_info_cache with network_info: [{"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.922 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Releasing lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.922 226239 DEBUG nova.compute.manager [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Instance network_info: |[{"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.923 226239 DEBUG oslo_concurrency.lockutils [req-13667957-d1ae-4b5f-809a-941e10868de7 req-0a8abb06-e6ed-4a4c-91b5-9477f762ebda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.923 226239 DEBUG nova.network.neutron [req-13667957-d1ae-4b5f-809a-941e10868de7 req-0a8abb06-e6ed-4a4c-91b5-9477f762ebda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Refreshing network info cache for port eac28070-b447-4e98-9966-378755225162 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.926 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Start _get_guest_xml network_info=[{"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': '89d38aa9-30c7-4827-aeb5-f11a82443d9e', 'delete_on_termination': True, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-47ca8e30-e640-4fca-a274-67bf157244e9', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '47ca8e30-e640-4fca-a274-67bf157244e9', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '05185cfd-db91-4d53-8ae4-57a005be337f', 'attached_at': '', 'detached_at': '', 'volume_id': '47ca8e30-e640-4fca-a274-67bf157244e9', 'serial': '47ca8e30-e640-4fca-a274-67bf157244e9'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.929 226239 WARNING nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.936 226239 DEBUG nova.virt.libvirt.host [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.936 226239 DEBUG nova.virt.libvirt.host [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.939 226239 DEBUG nova.virt.libvirt.host [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.939 226239 DEBUG nova.virt.libvirt.host [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.940 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.941 226239 DEBUG nova.virt.hardware [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.941 226239 DEBUG nova.virt.hardware [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.941 226239 DEBUG nova.virt.hardware [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.941 226239 DEBUG nova.virt.hardware [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.942 226239 DEBUG nova.virt.hardware [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.942 226239 DEBUG nova.virt.hardware [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.942 226239 DEBUG nova.virt.hardware [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.942 226239 DEBUG nova.virt.hardware [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.943 226239 DEBUG nova.virt.hardware [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.943 226239 DEBUG nova.virt.hardware [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.943 226239 DEBUG nova.virt.hardware [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.969 226239 DEBUG nova.storage.rbd_utils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 05185cfd-db91-4d53-8ae4-57a005be337f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:03:47 np0005603623 nova_compute[226235]: 2026-01-31 09:03:47.973 226239 DEBUG oslo_concurrency.processutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:03:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/92642890' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.397 226239 DEBUG oslo_concurrency.processutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.435 226239 DEBUG nova.virt.libvirt.vif [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-1556930195',display_name='tempest-TestVolumeBootPattern-volume-backed-server-1556930195',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-1556930195',id=199,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKF59rccHhsRb4D/CZFLX092sIK+/IvwyeTXpmNTOR//zJFz7TjDtGD2uvwMP0M17UQZZs9W7y6QnwkWlWcOVeaQO/cKF4TdqnQ7iouwZ3EUZ5flP3Y2CaP6XS5xR0AJeQ==',key_name='tempest-keypair-452510270',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-g4jls9jc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:03:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=05185cfd-db91-4d53-8ae4-57a005be337f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.436 226239 DEBUG nova.network.os_vif_util [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.437 226239 DEBUG nova.network.os_vif_util [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:e8:e5,bridge_name='br-int',has_traffic_filtering=True,id=eac28070-b447-4e98-9966-378755225162,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeac28070-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.438 226239 DEBUG nova.objects.instance [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lazy-loading 'pci_devices' on Instance uuid 05185cfd-db91-4d53-8ae4-57a005be337f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.456 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  <uuid>05185cfd-db91-4d53-8ae4-57a005be337f</uuid>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  <name>instance-000000c7</name>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestVolumeBootPattern-volume-backed-server-1556930195</nova:name>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:03:47</nova:creationTime>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <nova:user uuid="dc42b92a5dd34d32b6b184bdc7acb092">tempest-TestVolumeBootPattern-1392945362-project-member</nova:user>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <nova:project uuid="76ce367a834b49dfb5b436848118b860">tempest-TestVolumeBootPattern-1392945362</nova:project>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <nova:port uuid="eac28070-b447-4e98-9966-378755225162">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <entry name="serial">05185cfd-db91-4d53-8ae4-57a005be337f</entry>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <entry name="uuid">05185cfd-db91-4d53-8ae4-57a005be337f</entry>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/05185cfd-db91-4d53-8ae4-57a005be337f_disk.config">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-47ca8e30-e640-4fca-a274-67bf157244e9">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <serial>47ca8e30-e640-4fca-a274-67bf157244e9</serial>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:30:e8:e5"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <target dev="tapeac28070-b4"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/05185cfd-db91-4d53-8ae4-57a005be337f/console.log" append="off"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:03:48 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:03:48 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:03:48 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:03:48 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.457 226239 DEBUG nova.compute.manager [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Preparing to wait for external event network-vif-plugged-eac28070-b447-4e98-9966-378755225162 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.457 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.457 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.458 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.458 226239 DEBUG nova.virt.libvirt.vif [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-1556930195',display_name='tempest-TestVolumeBootPattern-volume-backed-server-1556930195',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-1556930195',id=199,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKF59rccHhsRb4D/CZFLX092sIK+/IvwyeTXpmNTOR//zJFz7TjDtGD2uvwMP0M17UQZZs9W7y6QnwkWlWcOVeaQO/cKF4TdqnQ7iouwZ3EUZ5flP3Y2CaP6XS5xR0AJeQ==',key_name='tempest-keypair-452510270',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-g4jls9jc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:03:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=05185cfd-db91-4d53-8ae4-57a005be337f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.459 226239 DEBUG nova.network.os_vif_util [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.460 226239 DEBUG nova.network.os_vif_util [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:e8:e5,bridge_name='br-int',has_traffic_filtering=True,id=eac28070-b447-4e98-9966-378755225162,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeac28070-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.460 226239 DEBUG os_vif [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:e8:e5,bridge_name='br-int',has_traffic_filtering=True,id=eac28070-b447-4e98-9966-378755225162,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeac28070-b4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.461 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.461 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.462 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.464 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.464 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeac28070-b4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.465 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeac28070-b4, col_values=(('external_ids', {'iface-id': 'eac28070-b447-4e98-9966-378755225162', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:e8:e5', 'vm-uuid': '05185cfd-db91-4d53-8ae4-57a005be337f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.466 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:48 np0005603623 NetworkManager[48970]: <info>  [1769850228.4672] manager: (tapeac28070-b4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/391)
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.469 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.470 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.471 226239 INFO os_vif [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:e8:e5,bridge_name='br-int',has_traffic_filtering=True,id=eac28070-b447-4e98-9966-378755225162,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeac28070-b4')#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.678 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.679 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.679 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No VIF found with MAC fa:16:3e:30:e8:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.680 226239 INFO nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Using config drive#033[00m
Jan 31 04:03:48 np0005603623 nova_compute[226235]: 2026-01-31 09:03:48.795 226239 DEBUG nova.storage.rbd_utils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 05185cfd-db91-4d53-8ae4-57a005be337f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:03:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:49.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:49.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:49 np0005603623 nova_compute[226235]: 2026-01-31 09:03:49.476 226239 INFO nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Creating config drive at /var/lib/nova/instances/05185cfd-db91-4d53-8ae4-57a005be337f/disk.config#033[00m
Jan 31 04:03:49 np0005603623 nova_compute[226235]: 2026-01-31 09:03:49.481 226239 DEBUG oslo_concurrency.processutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/05185cfd-db91-4d53-8ae4-57a005be337f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp91cyqwa1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:49 np0005603623 nova_compute[226235]: 2026-01-31 09:03:49.601 226239 DEBUG oslo_concurrency.processutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/05185cfd-db91-4d53-8ae4-57a005be337f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp91cyqwa1" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:49 np0005603623 nova_compute[226235]: 2026-01-31 09:03:49.629 226239 DEBUG nova.storage.rbd_utils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 05185cfd-db91-4d53-8ae4-57a005be337f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:03:49 np0005603623 nova_compute[226235]: 2026-01-31 09:03:49.632 226239 DEBUG oslo_concurrency.processutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/05185cfd-db91-4d53-8ae4-57a005be337f/disk.config 05185cfd-db91-4d53-8ae4-57a005be337f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.238 226239 DEBUG oslo_concurrency.processutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/05185cfd-db91-4d53-8ae4-57a005be337f/disk.config 05185cfd-db91-4d53-8ae4-57a005be337f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.606s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.239 226239 INFO nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Deleting local config drive /var/lib/nova/instances/05185cfd-db91-4d53-8ae4-57a005be337f/disk.config because it was imported into RBD.#033[00m
Jan 31 04:03:50 np0005603623 kernel: tapeac28070-b4: entered promiscuous mode
Jan 31 04:03:50 np0005603623 NetworkManager[48970]: <info>  [1769850230.2770] manager: (tapeac28070-b4): new Tun device (/org/freedesktop/NetworkManager/Devices/392)
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.277 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:50 np0005603623 ovn_controller[133449]: 2026-01-31T09:03:50Z|00835|binding|INFO|Claiming lport eac28070-b447-4e98-9966-378755225162 for this chassis.
Jan 31 04:03:50 np0005603623 ovn_controller[133449]: 2026-01-31T09:03:50Z|00836|binding|INFO|eac28070-b447-4e98-9966-378755225162: Claiming fa:16:3e:30:e8:e5 10.100.0.13
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.296 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:e8:e5 10.100.0.13'], port_security=['fa:16:3e:30:e8:e5 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '05185cfd-db91-4d53-8ae4-57a005be337f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-650eb345-8346-4e8f-8e83-eeb0117654f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76ce367a834b49dfb5b436848118b860', 'neutron:revision_number': '2', 'neutron:security_group_ids': '93695839-94d5-47ca-8baf-4134adc006a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ecdc171-9d09-4cba-9bb9-cd2f8ef8e6c3, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=eac28070-b447-4e98-9966-378755225162) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:03:50 np0005603623 systemd-udevd[321250]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.297 143258 INFO neutron.agent.ovn.metadata.agent [-] Port eac28070-b447-4e98-9966-378755225162 in datapath 650eb345-8346-4e8f-8e83-eeb0117654f6 bound to our chassis#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.299 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 650eb345-8346-4e8f-8e83-eeb0117654f6#033[00m
Jan 31 04:03:50 np0005603623 systemd-machined[194379]: New machine qemu-93-instance-000000c7.
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.302 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:50 np0005603623 ovn_controller[133449]: 2026-01-31T09:03:50Z|00837|binding|INFO|Setting lport eac28070-b447-4e98-9966-378755225162 ovn-installed in OVS
Jan 31 04:03:50 np0005603623 ovn_controller[133449]: 2026-01-31T09:03:50Z|00838|binding|INFO|Setting lport eac28070-b447-4e98-9966-378755225162 up in Southbound
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.307 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ed22e9c7-23f0-4bb3-8efc-328cef987f84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.307 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap650eb345-81 in ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.309 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap650eb345-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.309 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4bd9b0da-02ba-4886-ac9b-443165cd4b24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.310 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f34f7d03-a164-4133-b4b8-5bfc876d2ef3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.309 226239 DEBUG nova.network.neutron [req-13667957-d1ae-4b5f-809a-941e10868de7 req-0a8abb06-e6ed-4a4c-91b5-9477f762ebda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Updated VIF entry in instance network info cache for port eac28070-b447-4e98-9966-378755225162. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.310 226239 DEBUG nova.network.neutron [req-13667957-d1ae-4b5f-809a-941e10868de7 req-0a8abb06-e6ed-4a4c-91b5-9477f762ebda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Updating instance_info_cache with network_info: [{"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.312 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:50 np0005603623 NetworkManager[48970]: <info>  [1769850230.3150] device (tapeac28070-b4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:03:50 np0005603623 NetworkManager[48970]: <info>  [1769850230.3157] device (tapeac28070-b4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:03:50 np0005603623 systemd[1]: Started Virtual Machine qemu-93-instance-000000c7.
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.318 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[8c84dae2-137c-4eb3-8289-2dd1e18e84f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.337 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a8abfc9e-863f-4924-8691-97098becdcb2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.337 226239 DEBUG oslo_concurrency.lockutils [req-13667957-d1ae-4b5f-809a-941e10868de7 req-0a8abb06-e6ed-4a4c-91b5-9477f762ebda fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.356 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e467cc39-e116-4001-a942-68d1b1e4407c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.361 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[35840fa7-1687-402e-965a-9565e8427e88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 NetworkManager[48970]: <info>  [1769850230.3628] manager: (tap650eb345-80): new Veth device (/org/freedesktop/NetworkManager/Devices/393)
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.383 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[20650cb3-655f-42c8-8d19-755944e25a63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.386 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[93dc8476-482a-4592-9bb8-732762fea307]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 NetworkManager[48970]: <info>  [1769850230.3994] device (tap650eb345-80): carrier: link connected
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.401 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3b70f9b6-0d86-4b10-8fd7-0993708a6812]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.414 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[026dfa04-fe09-4ed7-a522-6c293ee2b0aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap650eb345-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:27:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 245], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 937082, 'reachable_time': 33618, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 321284, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.425 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6ee9e41a-f7d9-4aaf-a45d-2d2b343ace9e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9f:27ec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 937082, 'tstamp': 937082}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 321285, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.442 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[71b332bd-4188-45fd-bd4c-536fae54172c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap650eb345-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:27:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 245], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 937082, 'reachable_time': 33618, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 321286, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.463 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[15e3848b-8efe-447f-9ec8-090a893c34c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.520 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[61236de3-a25d-4eb5-88b1-6f74f6f727de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.522 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650eb345-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.522 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.523 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap650eb345-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:50 np0005603623 kernel: tap650eb345-80: entered promiscuous mode
Jan 31 04:03:50 np0005603623 NetworkManager[48970]: <info>  [1769850230.5266] manager: (tap650eb345-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/394)
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.528 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.529 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap650eb345-80, col_values=(('external_ids', {'iface-id': '74bde109-0188-4ce3-87c3-02a3eb853dc2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:50 np0005603623 ovn_controller[133449]: 2026-01-31T09:03:50Z|00839|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.538 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.540 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[21bd33db-5c33-47a2-a1a7-aff9f4040499]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.540 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-650eb345-8346-4e8f-8e83-eeb0117654f6
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 650eb345-8346-4e8f-8e83-eeb0117654f6
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:03:50 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:03:50.541 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'env', 'PROCESS_TAG=haproxy-650eb345-8346-4e8f-8e83-eeb0117654f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/650eb345-8346-4e8f-8e83-eeb0117654f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.758 226239 DEBUG nova.compute.manager [req-d4546e34-61b2-47d0-a4c3-07a3fa8fd789 req-97095633-ba97-423d-8061-8ee1ccc14426 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Received event network-vif-plugged-eac28070-b447-4e98-9966-378755225162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.758 226239 DEBUG oslo_concurrency.lockutils [req-d4546e34-61b2-47d0-a4c3-07a3fa8fd789 req-97095633-ba97-423d-8061-8ee1ccc14426 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.759 226239 DEBUG oslo_concurrency.lockutils [req-d4546e34-61b2-47d0-a4c3-07a3fa8fd789 req-97095633-ba97-423d-8061-8ee1ccc14426 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.759 226239 DEBUG oslo_concurrency.lockutils [req-d4546e34-61b2-47d0-a4c3-07a3fa8fd789 req-97095633-ba97-423d-8061-8ee1ccc14426 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.759 226239 DEBUG nova.compute.manager [req-d4546e34-61b2-47d0-a4c3-07a3fa8fd789 req-97095633-ba97-423d-8061-8ee1ccc14426 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Processing event network-vif-plugged-eac28070-b447-4e98-9966-378755225162 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.832 226239 DEBUG nova.compute.manager [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.833 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850230.832445, 05185cfd-db91-4d53-8ae4-57a005be337f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.834 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] VM Started (Lifecycle Event)#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.837 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.839 226239 INFO nova.virt.libvirt.driver [-] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Instance spawned successfully.#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.840 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.860 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.862 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.869 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.869 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.870 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.870 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.870 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.871 226239 DEBUG nova.virt.libvirt.driver [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.914 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.915 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850230.8340206, 05185cfd-db91-4d53-8ae4-57a005be337f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.915 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.939 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:03:50 np0005603623 podman[321361]: 2026-01-31 09:03:50.844911377 +0000 UTC m=+0.026774271 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.943 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850230.836577, 05185cfd-db91-4d53-8ae4-57a005be337f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:03:50 np0005603623 nova_compute[226235]: 2026-01-31 09:03:50.943 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:03:50 np0005603623 podman[321371]: 2026-01-31 09:03:50.993107796 +0000 UTC m=+0.088430735 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:03:51 np0005603623 podman[321372]: 2026-01-31 09:03:51.013283799 +0000 UTC m=+0.106771231 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 04:03:51 np0005603623 nova_compute[226235]: 2026-01-31 09:03:51.021 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:03:51 np0005603623 nova_compute[226235]: 2026-01-31 09:03:51.024 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:03:51 np0005603623 nova_compute[226235]: 2026-01-31 09:03:51.036 226239 INFO nova.compute.manager [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Took 9.12 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:03:51 np0005603623 nova_compute[226235]: 2026-01-31 09:03:51.036 226239 DEBUG nova.compute.manager [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:03:51 np0005603623 podman[321361]: 2026-01-31 09:03:51.057894158 +0000 UTC m=+0.239757022 container create 2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 04:03:51 np0005603623 nova_compute[226235]: 2026-01-31 09:03:51.061 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:03:51 np0005603623 systemd[1]: Started libpod-conmon-2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14.scope.
Jan 31 04:03:51 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:03:51 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c7c7070dfdfa5743f1bd6c78236a037c6a7e632dd15fcb4bb6899a29c189ded/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:03:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:51.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:51 np0005603623 nova_compute[226235]: 2026-01-31 09:03:51.145 226239 INFO nova.compute.manager [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Took 12.95 seconds to build instance.#033[00m
Jan 31 04:03:51 np0005603623 nova_compute[226235]: 2026-01-31 09:03:51.190 226239 DEBUG oslo_concurrency.lockutils [None req-e0d7deac-bb27-4510-9d9a-13143c43b283 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:51 np0005603623 podman[321361]: 2026-01-31 09:03:51.256076705 +0000 UTC m=+0.437939609 container init 2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 04:03:51 np0005603623 podman[321361]: 2026-01-31 09:03:51.260243447 +0000 UTC m=+0.442106321 container start 2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 31 04:03:51 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[321422]: [NOTICE]   (321426) : New worker (321428) forked
Jan 31 04:03:51 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[321422]: [NOTICE]   (321426) : Loading success.
Jan 31 04:03:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:51.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:51 np0005603623 nova_compute[226235]: 2026-01-31 09:03:51.708 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:52 np0005603623 nova_compute[226235]: 2026-01-31 09:03:52.933 226239 DEBUG nova.compute.manager [req-62fdd0a1-4e77-4a25-b69c-395ee6888303 req-46f495d3-e4a8-48fe-bacf-7a7d4d0119f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Received event network-vif-plugged-eac28070-b447-4e98-9966-378755225162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:03:52 np0005603623 nova_compute[226235]: 2026-01-31 09:03:52.934 226239 DEBUG oslo_concurrency.lockutils [req-62fdd0a1-4e77-4a25-b69c-395ee6888303 req-46f495d3-e4a8-48fe-bacf-7a7d4d0119f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:52 np0005603623 nova_compute[226235]: 2026-01-31 09:03:52.934 226239 DEBUG oslo_concurrency.lockutils [req-62fdd0a1-4e77-4a25-b69c-395ee6888303 req-46f495d3-e4a8-48fe-bacf-7a7d4d0119f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:52 np0005603623 nova_compute[226235]: 2026-01-31 09:03:52.934 226239 DEBUG oslo_concurrency.lockutils [req-62fdd0a1-4e77-4a25-b69c-395ee6888303 req-46f495d3-e4a8-48fe-bacf-7a7d4d0119f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:52 np0005603623 nova_compute[226235]: 2026-01-31 09:03:52.934 226239 DEBUG nova.compute.manager [req-62fdd0a1-4e77-4a25-b69c-395ee6888303 req-46f495d3-e4a8-48fe-bacf-7a7d4d0119f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] No waiting events found dispatching network-vif-plugged-eac28070-b447-4e98-9966-378755225162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:03:52 np0005603623 nova_compute[226235]: 2026-01-31 09:03:52.935 226239 WARNING nova.compute.manager [req-62fdd0a1-4e77-4a25-b69c-395ee6888303 req-46f495d3-e4a8-48fe-bacf-7a7d4d0119f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Received unexpected event network-vif-plugged-eac28070-b447-4e98-9966-378755225162 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:03:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:53.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:53.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:53 np0005603623 nova_compute[226235]: 2026-01-31 09:03:53.468 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:55.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:55.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:55 np0005603623 NetworkManager[48970]: <info>  [1769850235.8852] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/395)
Jan 31 04:03:55 np0005603623 NetworkManager[48970]: <info>  [1769850235.8865] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/396)
Jan 31 04:03:55 np0005603623 nova_compute[226235]: 2026-01-31 09:03:55.887 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:55 np0005603623 nova_compute[226235]: 2026-01-31 09:03:55.974 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:55 np0005603623 ovn_controller[133449]: 2026-01-31T09:03:55Z|00840|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:03:55 np0005603623 nova_compute[226235]: 2026-01-31 09:03:55.992 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:56 np0005603623 nova_compute[226235]: 2026-01-31 09:03:56.709 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:56 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:57.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:57.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:57 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:03:57 np0005603623 nova_compute[226235]: 2026-01-31 09:03:57.600 226239 DEBUG nova.compute.manager [req-ed9fdd1f-5809-4fc2-956b-0fdefb26e5fa req-019b72d8-4d22-48c8-829b-f692acd89139 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Received event network-changed-eac28070-b447-4e98-9966-378755225162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:03:57 np0005603623 nova_compute[226235]: 2026-01-31 09:03:57.602 226239 DEBUG nova.compute.manager [req-ed9fdd1f-5809-4fc2-956b-0fdefb26e5fa req-019b72d8-4d22-48c8-829b-f692acd89139 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Refreshing instance network info cache due to event network-changed-eac28070-b447-4e98-9966-378755225162. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:03:57 np0005603623 nova_compute[226235]: 2026-01-31 09:03:57.602 226239 DEBUG oslo_concurrency.lockutils [req-ed9fdd1f-5809-4fc2-956b-0fdefb26e5fa req-019b72d8-4d22-48c8-829b-f692acd89139 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:03:57 np0005603623 nova_compute[226235]: 2026-01-31 09:03:57.602 226239 DEBUG oslo_concurrency.lockutils [req-ed9fdd1f-5809-4fc2-956b-0fdefb26e5fa req-019b72d8-4d22-48c8-829b-f692acd89139 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:03:57 np0005603623 nova_compute[226235]: 2026-01-31 09:03:57.603 226239 DEBUG nova.network.neutron [req-ed9fdd1f-5809-4fc2-956b-0fdefb26e5fa req-019b72d8-4d22-48c8-829b-f692acd89139 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Refreshing network info cache for port eac28070-b447-4e98-9966-378755225162 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:03:58 np0005603623 nova_compute[226235]: 2026-01-31 09:03:58.470 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:03:58 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:03:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:59.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:03:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:59.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:00 np0005603623 nova_compute[226235]: 2026-01-31 09:04:00.241 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:00 np0005603623 nova_compute[226235]: 2026-01-31 09:04:00.811 226239 DEBUG nova.network.neutron [req-ed9fdd1f-5809-4fc2-956b-0fdefb26e5fa req-019b72d8-4d22-48c8-829b-f692acd89139 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Updated VIF entry in instance network info cache for port eac28070-b447-4e98-9966-378755225162. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:04:00 np0005603623 nova_compute[226235]: 2026-01-31 09:04:00.811 226239 DEBUG nova.network.neutron [req-ed9fdd1f-5809-4fc2-956b-0fdefb26e5fa req-019b72d8-4d22-48c8-829b-f692acd89139 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Updating instance_info_cache with network_info: [{"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:04:00 np0005603623 nova_compute[226235]: 2026-01-31 09:04:00.857 226239 DEBUG oslo_concurrency.lockutils [req-ed9fdd1f-5809-4fc2-956b-0fdefb26e5fa req-019b72d8-4d22-48c8-829b-f692acd89139 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:04:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:01.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:01.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:01 np0005603623 nova_compute[226235]: 2026-01-31 09:04:01.711 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:03.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:03.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:03 np0005603623 nova_compute[226235]: 2026-01-31 09:04:03.486 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:05.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:05.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:05 np0005603623 ovn_controller[133449]: 2026-01-31T09:04:05Z|00102|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:30:e8:e5 10.100.0.13
Jan 31 04:04:05 np0005603623 ovn_controller[133449]: 2026-01-31T09:04:05Z|00103|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:30:e8:e5 10.100.0.13
Jan 31 04:04:06 np0005603623 nova_compute[226235]: 2026-01-31 09:04:06.714 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:07.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:07.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:08 np0005603623 nova_compute[226235]: 2026-01-31 09:04:08.489 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:09.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:09.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:04:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:04:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:11.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:11.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:11 np0005603623 nova_compute[226235]: 2026-01-31 09:04:11.716 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:04:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2831984058' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:04:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:04:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2831984058' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:04:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:04:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:13.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:04:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:13.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:13 np0005603623 nova_compute[226235]: 2026-01-31 09:04:13.343 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:13 np0005603623 nova_compute[226235]: 2026-01-31 09:04:13.490 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:14 np0005603623 nova_compute[226235]: 2026-01-31 09:04:14.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:14 np0005603623 nova_compute[226235]: 2026-01-31 09:04:14.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:04:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:15.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:15.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:16 np0005603623 nova_compute[226235]: 2026-01-31 09:04:16.718 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:17.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:17.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:18 np0005603623 nova_compute[226235]: 2026-01-31 09:04:18.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e395 e395: 3 total, 3 up, 3 in
Jan 31 04:04:18 np0005603623 nova_compute[226235]: 2026-01-31 09:04:18.538 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:18 np0005603623 nova_compute[226235]: 2026-01-31 09:04:18.593 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:18 np0005603623 nova_compute[226235]: 2026-01-31 09:04:18.594 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:18 np0005603623 nova_compute[226235]: 2026-01-31 09:04:18.594 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:18 np0005603623 nova_compute[226235]: 2026-01-31 09:04:18.594 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:04:18 np0005603623 nova_compute[226235]: 2026-01-31 09:04:18.595 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:04:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:04:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2765689244' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:04:19 np0005603623 nova_compute[226235]: 2026-01-31 09:04:19.003 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:04:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:19.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:19 np0005603623 nova_compute[226235]: 2026-01-31 09:04:19.205 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000c7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:04:19 np0005603623 nova_compute[226235]: 2026-01-31 09:04:19.206 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000c7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:04:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:19.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:19 np0005603623 nova_compute[226235]: 2026-01-31 09:04:19.320 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:04:19 np0005603623 nova_compute[226235]: 2026-01-31 09:04:19.321 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3998MB free_disk=20.94245147705078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:04:19 np0005603623 nova_compute[226235]: 2026-01-31 09:04:19.321 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:19 np0005603623 nova_compute[226235]: 2026-01-31 09:04:19.322 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:19 np0005603623 nova_compute[226235]: 2026-01-31 09:04:19.906 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 05185cfd-db91-4d53-8ae4-57a005be337f actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:04:19 np0005603623 nova_compute[226235]: 2026-01-31 09:04:19.906 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:04:19 np0005603623 nova_compute[226235]: 2026-01-31 09:04:19.906 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:04:20 np0005603623 nova_compute[226235]: 2026-01-31 09:04:20.670 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:04:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:21.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:04:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/259443406' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:04:21 np0005603623 nova_compute[226235]: 2026-01-31 09:04:21.282 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.612s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:04:21 np0005603623 nova_compute[226235]: 2026-01-31 09:04:21.287 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:04:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:21.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e396 e396: 3 total, 3 up, 3 in
Jan 31 04:04:21 np0005603623 nova_compute[226235]: 2026-01-31 09:04:21.708 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:04:21 np0005603623 nova_compute[226235]: 2026-01-31 09:04:21.720 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:22 np0005603623 podman[321779]: 2026-01-31 09:04:22.005159598 +0000 UTC m=+0.078503423 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 31 04:04:22 np0005603623 podman[321778]: 2026-01-31 09:04:22.005307163 +0000 UTC m=+0.079787134 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 04:04:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:22 np0005603623 nova_compute[226235]: 2026-01-31 09:04:22.287 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:04:22 np0005603623 nova_compute[226235]: 2026-01-31 09:04:22.288 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.966s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:23.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:23 np0005603623 nova_compute[226235]: 2026-01-31 09:04:23.288 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:23 np0005603623 nova_compute[226235]: 2026-01-31 09:04:23.289 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:04:23 np0005603623 nova_compute[226235]: 2026-01-31 09:04:23.289 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:04:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:23.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:23 np0005603623 nova_compute[226235]: 2026-01-31 09:04:23.540 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:23 np0005603623 nova_compute[226235]: 2026-01-31 09:04:23.596 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:23 np0005603623 nova_compute[226235]: 2026-01-31 09:04:23.918 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:04:23 np0005603623 nova_compute[226235]: 2026-01-31 09:04:23.918 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:04:23 np0005603623 nova_compute[226235]: 2026-01-31 09:04:23.918 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:04:23 np0005603623 nova_compute[226235]: 2026-01-31 09:04:23.918 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 05185cfd-db91-4d53-8ae4-57a005be337f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:04:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:04:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:25.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:04:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:25.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e397 e397: 3 total, 3 up, 3 in
Jan 31 04:04:26 np0005603623 nova_compute[226235]: 2026-01-31 09:04:26.723 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:04:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:27.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:04:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:27.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:28 np0005603623 nova_compute[226235]: 2026-01-31 09:04:28.542 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:29.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:29.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:29 np0005603623 nova_compute[226235]: 2026-01-31 09:04:29.824 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Updating instance_info_cache with network_info: [{"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:04:29 np0005603623 nova_compute[226235]: 2026-01-31 09:04:29.847 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:04:29 np0005603623 nova_compute[226235]: 2026-01-31 09:04:29.847 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:04:29 np0005603623 nova_compute[226235]: 2026-01-31 09:04:29.848 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:29 np0005603623 nova_compute[226235]: 2026-01-31 09:04:29.848 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:29 np0005603623 nova_compute[226235]: 2026-01-31 09:04:29.849 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:29 np0005603623 nova_compute[226235]: 2026-01-31 09:04:29.849 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:30 np0005603623 nova_compute[226235]: 2026-01-31 09:04:30.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:30 np0005603623 nova_compute[226235]: 2026-01-31 09:04:30.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:04:30.156 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:04:30.157 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:04:30.157 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:31.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:31.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:31 np0005603623 nova_compute[226235]: 2026-01-31 09:04:31.726 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e398 e398: 3 total, 3 up, 3 in
Jan 31 04:04:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:33.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:33.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:33 np0005603623 nova_compute[226235]: 2026-01-31 09:04:33.596 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:04:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:35.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:04:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:35.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:36 np0005603623 nova_compute[226235]: 2026-01-31 09:04:36.726 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:37.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:37.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:37 np0005603623 nova_compute[226235]: 2026-01-31 09:04:37.976 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:38 np0005603623 nova_compute[226235]: 2026-01-31 09:04:38.598 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:38 np0005603623 ovn_controller[133449]: 2026-01-31T09:04:38Z|00841|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:04:38 np0005603623 nova_compute[226235]: 2026-01-31 09:04:38.831 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:39.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:39.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:41.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:41.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:41 np0005603623 nova_compute[226235]: 2026-01-31 09:04:41.727 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:43.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:43.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:43 np0005603623 nova_compute[226235]: 2026-01-31 09:04:43.634 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:45 np0005603623 nova_compute[226235]: 2026-01-31 09:04:45.010 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:45.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:45.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:46 np0005603623 nova_compute[226235]: 2026-01-31 09:04:46.730 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:47.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:47.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:04:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4081490716' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:04:48 np0005603623 nova_compute[226235]: 2026-01-31 09:04:48.669 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:48 np0005603623 ovn_controller[133449]: 2026-01-31T09:04:48Z|00842|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:04:48 np0005603623 nova_compute[226235]: 2026-01-31 09:04:48.736 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:49.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:49.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:51.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:04:51.294 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=87, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=86) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:04:51 np0005603623 nova_compute[226235]: 2026-01-31 09:04:51.295 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:04:51.295 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:04:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:51.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:51 np0005603623 nova_compute[226235]: 2026-01-31 09:04:51.731 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:52 np0005603623 podman[321891]: 2026-01-31 09:04:52.986049994 +0000 UTC m=+0.066125056 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:04:52 np0005603623 podman[321890]: 2026-01-31 09:04:52.986110446 +0000 UTC m=+0.069072568 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 04:04:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:53.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:53.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:53 np0005603623 nova_compute[226235]: 2026-01-31 09:04:53.704 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:55.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:55.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:56 np0005603623 nova_compute[226235]: 2026-01-31 09:04:56.733 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:04:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:57.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:04:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:57.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:58 np0005603623 nova_compute[226235]: 2026-01-31 09:04:58.707 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:04:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:59.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:04:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:04:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:59.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:00.297 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '87'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:01.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:01.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:01 np0005603623 nova_compute[226235]: 2026-01-31 09:05:01.735 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:05:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:03.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:05:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:03.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e399 e399: 3 total, 3 up, 3 in
Jan 31 04:05:03 np0005603623 nova_compute[226235]: 2026-01-31 09:05:03.750 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e400 e400: 3 total, 3 up, 3 in
Jan 31 04:05:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:05:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:05.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:05:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:05.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e401 e401: 3 total, 3 up, 3 in
Jan 31 04:05:06 np0005603623 nova_compute[226235]: 2026-01-31 09:05:06.062 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:06 np0005603623 nova_compute[226235]: 2026-01-31 09:05:06.737 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:06 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e402 e402: 3 total, 3 up, 3 in
Jan 31 04:05:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:07.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:07.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:08 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #59. Immutable memtables: 15.
Jan 31 04:05:08 np0005603623 nova_compute[226235]: 2026-01-31 09:05:08.752 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:09.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:09.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:05:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:11.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:05:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:05:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:05:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:05:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:11.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:11 np0005603623 nova_compute[226235]: 2026-01-31 09:05:11.738 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e403 e403: 3 total, 3 up, 3 in
Jan 31 04:05:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:05:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:13.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:05:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:13.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:13 np0005603623 nova_compute[226235]: 2026-01-31 09:05:13.757 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:14 np0005603623 nova_compute[226235]: 2026-01-31 09:05:14.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:14 np0005603623 nova_compute[226235]: 2026-01-31 09:05:14.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:05:14 np0005603623 nova_compute[226235]: 2026-01-31 09:05:14.440 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:05:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:15.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:05:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:15.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:16 np0005603623 nova_compute[226235]: 2026-01-31 09:05:16.739 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:17.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:17.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:05:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:05:18 np0005603623 nova_compute[226235]: 2026-01-31 09:05:18.761 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:19 np0005603623 nova_compute[226235]: 2026-01-31 09:05:19.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:19 np0005603623 nova_compute[226235]: 2026-01-31 09:05:19.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:05:19 np0005603623 nova_compute[226235]: 2026-01-31 09:05:19.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:05:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:19.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:19.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:19 np0005603623 nova_compute[226235]: 2026-01-31 09:05:19.420 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:05:19 np0005603623 nova_compute[226235]: 2026-01-31 09:05:19.421 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:05:19 np0005603623 nova_compute[226235]: 2026-01-31 09:05:19.421 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:05:19 np0005603623 nova_compute[226235]: 2026-01-31 09:05:19.421 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 05185cfd-db91-4d53-8ae4-57a005be337f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:05:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:21.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:21.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:21 np0005603623 nova_compute[226235]: 2026-01-31 09:05:21.741 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:21 np0005603623 nova_compute[226235]: 2026-01-31 09:05:21.910 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Updating instance_info_cache with network_info: [{"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:05:21 np0005603623 nova_compute[226235]: 2026-01-31 09:05:21.948 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-05185cfd-db91-4d53-8ae4-57a005be337f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:05:21 np0005603623 nova_compute[226235]: 2026-01-31 09:05:21.949 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:05:21 np0005603623 nova_compute[226235]: 2026-01-31 09:05:21.950 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.019 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.019 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.019 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.020 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.020 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:05:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/712220276' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.420 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.507 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000c7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.507 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000c7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.655 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.656 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4011MB free_disk=20.98794174194336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.656 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.656 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.767 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 05185cfd-db91-4d53-8ae4-57a005be337f actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.767 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.767 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:05:22 np0005603623 nova_compute[226235]: 2026-01-31 09:05:22.884 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:23.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:05:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4206512553' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:05:23 np0005603623 nova_compute[226235]: 2026-01-31 09:05:23.276 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.391s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:23 np0005603623 nova_compute[226235]: 2026-01-31 09:05:23.282 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:05:23 np0005603623 nova_compute[226235]: 2026-01-31 09:05:23.306 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:05:23 np0005603623 nova_compute[226235]: 2026-01-31 09:05:23.309 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:05:23 np0005603623 nova_compute[226235]: 2026-01-31 09:05:23.310 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:23.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:23 np0005603623 nova_compute[226235]: 2026-01-31 09:05:23.809 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:23 np0005603623 podman[322272]: 2026-01-31 09:05:23.971391938 +0000 UTC m=+0.058619830 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:05:23 np0005603623 podman[322273]: 2026-01-31 09:05:23.987012728 +0000 UTC m=+0.071700500 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:05:24 np0005603623 nova_compute[226235]: 2026-01-31 09:05:24.514 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:24 np0005603623 nova_compute[226235]: 2026-01-31 09:05:24.963 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Acquiring lock "7255c305-4d2b-4335-9af0-2be77f7f097a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:24 np0005603623 nova_compute[226235]: 2026-01-31 09:05:24.964 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:24 np0005603623 nova_compute[226235]: 2026-01-31 09:05:24.995 226239 DEBUG nova.compute.manager [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:05:25 np0005603623 nova_compute[226235]: 2026-01-31 09:05:25.091 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:25 np0005603623 nova_compute[226235]: 2026-01-31 09:05:25.091 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:25 np0005603623 nova_compute[226235]: 2026-01-31 09:05:25.098 226239 DEBUG nova.virt.hardware [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:05:25 np0005603623 nova_compute[226235]: 2026-01-31 09:05:25.098 226239 INFO nova.compute.claims [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 04:05:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:05:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:25.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:05:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:25.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:25 np0005603623 nova_compute[226235]: 2026-01-31 09:05:25.404 226239 DEBUG oslo_concurrency.processutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:05:25 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4074832310' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:05:25 np0005603623 nova_compute[226235]: 2026-01-31 09:05:25.851 226239 DEBUG oslo_concurrency.processutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:25 np0005603623 nova_compute[226235]: 2026-01-31 09:05:25.859 226239 DEBUG nova.compute.provider_tree [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:05:25 np0005603623 nova_compute[226235]: 2026-01-31 09:05:25.902 226239 DEBUG nova.scheduler.client.report [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:05:25 np0005603623 nova_compute[226235]: 2026-01-31 09:05:25.948 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:25 np0005603623 nova_compute[226235]: 2026-01-31 09:05:25.950 226239 DEBUG nova.compute.manager [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.023 226239 DEBUG nova.compute.manager [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.024 226239 DEBUG nova.network.neutron [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.055 226239 INFO nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.082 226239 DEBUG nova.compute.manager [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.163 226239 INFO nova.virt.block_device [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Booting with volume f3799711-34bd-4212-9be5-a6ded13ee858 at /dev/vda#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.279 226239 DEBUG nova.policy [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4bd7d1bbf3a8497b8b26f8df83fe8067', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c8017efbc64b4244b349174c29a41000', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.451 226239 DEBUG os_brick.utils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.453 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.461 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.461 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[ef31e22d-2fdf-421b-8198-e648ffc9acac]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.463 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.467 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.005s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.468 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[a1ae458f-3ca3-43a0-873f-1f9360c4cc50]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.470 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.477 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.477 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0a187e-8f73-44d0-8a74-ccf5adc34f44]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.479 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[c541bf93-34a4-4243-98a3-891c5623496d]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.480 226239 DEBUG oslo_concurrency.processutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.502 226239 DEBUG oslo_concurrency.processutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] CMD "nvme version" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.511 226239 DEBUG os_brick.initiator.connectors.lightos [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.511 226239 DEBUG os_brick.initiator.connectors.lightos [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.512 226239 DEBUG os_brick.initiator.connectors.lightos [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.512 226239 DEBUG os_brick.utils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] <== get_connector_properties: return (61ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.513 226239 DEBUG nova.virt.block_device [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updating existing volume attachment record: 825aef87-7f0c-42d5-af7a-3bc39e839174 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 04:05:26 np0005603623 nova_compute[226235]: 2026-01-31 09:05:26.744 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:27 np0005603623 nova_compute[226235]: 2026-01-31 09:05:27.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:05:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:27.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:05:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:27.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:27 np0005603623 nova_compute[226235]: 2026-01-31 09:05:27.841 226239 DEBUG nova.network.neutron [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Successfully created port: 04892866-81a0-44a3-99fa-2493a125f99a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:05:28 np0005603623 nova_compute[226235]: 2026-01-31 09:05:28.480 226239 DEBUG nova.compute.manager [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:05:28 np0005603623 nova_compute[226235]: 2026-01-31 09:05:28.482 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:05:28 np0005603623 nova_compute[226235]: 2026-01-31 09:05:28.482 226239 INFO nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Creating image(s)#033[00m
Jan 31 04:05:28 np0005603623 nova_compute[226235]: 2026-01-31 09:05:28.483 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 04:05:28 np0005603623 nova_compute[226235]: 2026-01-31 09:05:28.483 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Ensure instance console log exists: /var/lib/nova/instances/7255c305-4d2b-4335-9af0-2be77f7f097a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:05:28 np0005603623 nova_compute[226235]: 2026-01-31 09:05:28.484 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:28 np0005603623 nova_compute[226235]: 2026-01-31 09:05:28.484 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:28 np0005603623 nova_compute[226235]: 2026-01-31 09:05:28.484 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:28 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:28Z|00843|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:05:28 np0005603623 nova_compute[226235]: 2026-01-31 09:05:28.599 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:28 np0005603623 nova_compute[226235]: 2026-01-31 09:05:28.811 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:29 np0005603623 nova_compute[226235]: 2026-01-31 09:05:29.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:05:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:29.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:05:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:05:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:29.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:05:29 np0005603623 nova_compute[226235]: 2026-01-31 09:05:29.470 226239 DEBUG nova.network.neutron [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Successfully updated port: 04892866-81a0-44a3-99fa-2493a125f99a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:05:29 np0005603623 nova_compute[226235]: 2026-01-31 09:05:29.491 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Acquiring lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:05:29 np0005603623 nova_compute[226235]: 2026-01-31 09:05:29.491 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Acquired lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 04:05:29 np0005603623 nova_compute[226235]: 2026-01-31 09:05:29.491 226239 DEBUG nova.network.neutron [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 04:05:29 np0005603623 nova_compute[226235]: 2026-01-31 09:05:29.741 226239 DEBUG nova.compute.manager [req-a675c5c2-be9b-441a-b435-4e0b4f54c0ce req-dac15074-04c0-4c99-a501-e0fa46bc7950 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received event network-changed-04892866-81a0-44a3-99fa-2493a125f99a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:05:29 np0005603623 nova_compute[226235]: 2026-01-31 09:05:29.741 226239 DEBUG nova.compute.manager [req-a675c5c2-be9b-441a-b435-4e0b4f54c0ce req-dac15074-04c0-4c99-a501-e0fa46bc7950 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Refreshing instance network info cache due to event network-changed-04892866-81a0-44a3-99fa-2493a125f99a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 04:05:29 np0005603623 nova_compute[226235]: 2026-01-31 09:05:29.742 226239 DEBUG oslo_concurrency.lockutils [req-a675c5c2-be9b-441a-b435-4e0b4f54c0ce req-dac15074-04c0-4c99-a501-e0fa46bc7950 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 04:05:29 np0005603623 nova_compute[226235]: 2026-01-31 09:05:29.971 226239 DEBUG nova.network.neutron [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 04:05:30 np0005603623 nova_compute[226235]: 2026-01-31 09:05:30.152 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:05:30 np0005603623 nova_compute[226235]: 2026-01-31 09:05:30.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:05:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:30.157 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:05:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:30.157 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:05:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:30.158 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:05:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:31.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:05:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:31.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.669 226239 DEBUG nova.network.neutron [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updating instance_info_cache with network_info: [{"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.745 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.914 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Releasing lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.915 226239 DEBUG nova.compute.manager [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Instance network_info: |[{"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.915 226239 DEBUG oslo_concurrency.lockutils [req-a675c5c2-be9b-441a-b435-4e0b4f54c0ce req-dac15074-04c0-4c99-a501-e0fa46bc7950 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.916 226239 DEBUG nova.network.neutron [req-a675c5c2-be9b-441a-b435-4e0b4f54c0ce req-dac15074-04c0-4c99-a501-e0fa46bc7950 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Refreshing network info cache for port 04892866-81a0-44a3-99fa-2493a125f99a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.919 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Start _get_guest_xml network_info=[{"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': '825aef87-7f0c-42d5-af7a-3bc39e839174', 'delete_on_termination': False, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-f3799711-34bd-4212-9be5-a6ded13ee858', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'f3799711-34bd-4212-9be5-a6ded13ee858', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '7255c305-4d2b-4335-9af0-2be77f7f097a', 'attached_at': '', 'detached_at': '', 'volume_id': 'f3799711-34bd-4212-9be5-a6ded13ee858', 'serial': 'f3799711-34bd-4212-9be5-a6ded13ee858'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.923 226239 WARNING nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.990 226239 DEBUG nova.virt.libvirt.host [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.991 226239 DEBUG nova.virt.libvirt.host [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.993 226239 DEBUG nova.virt.libvirt.host [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.994 226239 DEBUG nova.virt.libvirt.host [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.995 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.995 226239 DEBUG nova.virt.hardware [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.996 226239 DEBUG nova.virt.hardware [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.996 226239 DEBUG nova.virt.hardware [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.996 226239 DEBUG nova.virt.hardware [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.996 226239 DEBUG nova.virt.hardware [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.996 226239 DEBUG nova.virt.hardware [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.997 226239 DEBUG nova.virt.hardware [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.997 226239 DEBUG nova.virt.hardware [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.997 226239 DEBUG nova.virt.hardware [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.997 226239 DEBUG nova.virt.hardware [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 04:05:31 np0005603623 nova_compute[226235]: 2026-01-31 09:05:31.997 226239 DEBUG nova.virt.hardware [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.028 226239 DEBUG nova.storage.rbd_utils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] rbd image 7255c305-4d2b-4335-9af0-2be77f7f097a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.033 226239 DEBUG oslo_concurrency.processutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:05:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:05:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/489954830' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.433 226239 DEBUG oslo_concurrency.processutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.495 226239 DEBUG nova.virt.libvirt.vif [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:05:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBackupRestore-server-24996787',display_name='tempest-TestVolumeBackupRestore-server-24996787',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebackuprestore-server-24996787',id=204,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDiUuyTkBrBOg5gGHaaMVuI2XTjHgVpG5AlWT1Uy1XvC67KFAXePhqQCRO3vYd53cKQdeQH68O6uBB76SiY4XS9YRqOalJGVNrTbO6XCdiKTmeFqiIqAGHxPMGsjL9U1Cw==',key_name='tempest-TestVolumeBackupRestore-882231007',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c8017efbc64b4244b349174c29a41000',ramdisk_id='',reservation_id='r-qk5gvh2l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBackupRestore-586195710',owner_user_name='tempest-TestVolumeBackupRestore-586195710-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:05:26Z,user_data=None,user_id='4bd7d1bbf3a8497b8b26f8df83fe8067',uuid=7255c305-4d2b-4335-9af0-2be77f7f097a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.495 226239 DEBUG nova.network.os_vif_util [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Converting VIF {"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.496 226239 DEBUG nova.network.os_vif_util [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:bb:07,bridge_name='br-int',has_traffic_filtering=True,id=04892866-81a0-44a3-99fa-2493a125f99a,network=Network(246a163d-3e9c-48ad-b266-efe63aa146b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04892866-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.497 226239 DEBUG nova.objects.instance [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7255c305-4d2b-4335-9af0-2be77f7f097a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.531 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  <uuid>7255c305-4d2b-4335-9af0-2be77f7f097a</uuid>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  <name>instance-000000cc</name>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestVolumeBackupRestore-server-24996787</nova:name>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:05:31</nova:creationTime>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <nova:user uuid="4bd7d1bbf3a8497b8b26f8df83fe8067">tempest-TestVolumeBackupRestore-586195710-project-member</nova:user>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <nova:project uuid="c8017efbc64b4244b349174c29a41000">tempest-TestVolumeBackupRestore-586195710</nova:project>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <nova:port uuid="04892866-81a0-44a3-99fa-2493a125f99a">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <entry name="serial">7255c305-4d2b-4335-9af0-2be77f7f097a</entry>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <entry name="uuid">7255c305-4d2b-4335-9af0-2be77f7f097a</entry>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/7255c305-4d2b-4335-9af0-2be77f7f097a_disk.config">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-f3799711-34bd-4212-9be5-a6ded13ee858">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <serial>f3799711-34bd-4212-9be5-a6ded13ee858</serial>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:e9:bb:07"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <target dev="tap04892866-81"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/7255c305-4d2b-4335-9af0-2be77f7f097a/console.log" append="off"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:05:32 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:05:32 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:05:32 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:05:32 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.532 226239 DEBUG nova.compute.manager [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Preparing to wait for external event network-vif-plugged-04892866-81a0-44a3-99fa-2493a125f99a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.532 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Acquiring lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.533 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.533 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.533 226239 DEBUG nova.virt.libvirt.vif [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:05:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBackupRestore-server-24996787',display_name='tempest-TestVolumeBackupRestore-server-24996787',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebackuprestore-server-24996787',id=204,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDiUuyTkBrBOg5gGHaaMVuI2XTjHgVpG5AlWT1Uy1XvC67KFAXePhqQCRO3vYd53cKQdeQH68O6uBB76SiY4XS9YRqOalJGVNrTbO6XCdiKTmeFqiIqAGHxPMGsjL9U1Cw==',key_name='tempest-TestVolumeBackupRestore-882231007',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c8017efbc64b4244b349174c29a41000',ramdisk_id='',reservation_id='r-qk5gvh2l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBackupRestore-586195710',owner_user_name='tempest-TestVolumeBackupRestore-586195710-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:05:26Z,user_data=None,user_id='4bd7d1bbf3a8497b8b26f8df83fe8067',uuid=7255c305-4d2b-4335-9af0-2be77f7f097a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.534 226239 DEBUG nova.network.os_vif_util [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Converting VIF {"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.534 226239 DEBUG nova.network.os_vif_util [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:bb:07,bridge_name='br-int',has_traffic_filtering=True,id=04892866-81a0-44a3-99fa-2493a125f99a,network=Network(246a163d-3e9c-48ad-b266-efe63aa146b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04892866-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.534 226239 DEBUG os_vif [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:bb:07,bridge_name='br-int',has_traffic_filtering=True,id=04892866-81a0-44a3-99fa-2493a125f99a,network=Network(246a163d-3e9c-48ad-b266-efe63aa146b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04892866-81') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.535 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.535 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.536 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.540 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.541 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04892866-81, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.542 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap04892866-81, col_values=(('external_ids', {'iface-id': '04892866-81a0-44a3-99fa-2493a125f99a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:bb:07', 'vm-uuid': '7255c305-4d2b-4335-9af0-2be77f7f097a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.544 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:32 np0005603623 NetworkManager[48970]: <info>  [1769850332.5450] manager: (tap04892866-81): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/397)
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.547 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.549 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.551 226239 INFO os_vif [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:bb:07,bridge_name='br-int',has_traffic_filtering=True,id=04892866-81a0-44a3-99fa-2493a125f99a,network=Network(246a163d-3e9c-48ad-b266-efe63aa146b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04892866-81')#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.639 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.639 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.639 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] No VIF found with MAC fa:16:3e:e9:bb:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.640 226239 INFO nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Using config drive#033[00m
Jan 31 04:05:32 np0005603623 nova_compute[226235]: 2026-01-31 09:05:32.662 226239 DEBUG nova.storage.rbd_utils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] rbd image 7255c305-4d2b-4335-9af0-2be77f7f097a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:05:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:05:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:33.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:05:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:33.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:34 np0005603623 nova_compute[226235]: 2026-01-31 09:05:34.699 226239 INFO nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Creating config drive at /var/lib/nova/instances/7255c305-4d2b-4335-9af0-2be77f7f097a/disk.config#033[00m
Jan 31 04:05:34 np0005603623 nova_compute[226235]: 2026-01-31 09:05:34.702 226239 DEBUG oslo_concurrency.processutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7255c305-4d2b-4335-9af0-2be77f7f097a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpi3gd8f5c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:34 np0005603623 nova_compute[226235]: 2026-01-31 09:05:34.824 226239 DEBUG oslo_concurrency.processutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7255c305-4d2b-4335-9af0-2be77f7f097a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpi3gd8f5c" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:34 np0005603623 nova_compute[226235]: 2026-01-31 09:05:34.849 226239 DEBUG nova.storage.rbd_utils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] rbd image 7255c305-4d2b-4335-9af0-2be77f7f097a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:05:34 np0005603623 nova_compute[226235]: 2026-01-31 09:05:34.852 226239 DEBUG oslo_concurrency.processutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7255c305-4d2b-4335-9af0-2be77f7f097a/disk.config 7255c305-4d2b-4335-9af0-2be77f7f097a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:34 np0005603623 nova_compute[226235]: 2026-01-31 09:05:34.991 226239 DEBUG oslo_concurrency.processutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7255c305-4d2b-4335-9af0-2be77f7f097a/disk.config 7255c305-4d2b-4335-9af0-2be77f7f097a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:34 np0005603623 nova_compute[226235]: 2026-01-31 09:05:34.993 226239 INFO nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Deleting local config drive /var/lib/nova/instances/7255c305-4d2b-4335-9af0-2be77f7f097a/disk.config because it was imported into RBD.#033[00m
Jan 31 04:05:35 np0005603623 NetworkManager[48970]: <info>  [1769850335.0212] manager: (tap04892866-81): new Tun device (/org/freedesktop/NetworkManager/Devices/398)
Jan 31 04:05:35 np0005603623 kernel: tap04892866-81: entered promiscuous mode
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.022 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:35 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:35Z|00844|binding|INFO|Claiming lport 04892866-81a0-44a3-99fa-2493a125f99a for this chassis.
Jan 31 04:05:35 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:35Z|00845|binding|INFO|04892866-81a0-44a3-99fa-2493a125f99a: Claiming fa:16:3e:e9:bb:07 10.100.0.11
Jan 31 04:05:35 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:35Z|00846|binding|INFO|Setting lport 04892866-81a0-44a3-99fa-2493a125f99a ovn-installed in OVS
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.030 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.031 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.037 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:bb:07 10.100.0.11'], port_security=['fa:16:3e:e9:bb:07 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '7255c305-4d2b-4335-9af0-2be77f7f097a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-246a163d-3e9c-48ad-b266-efe63aa146b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c8017efbc64b4244b349174c29a41000', 'neutron:revision_number': '2', 'neutron:security_group_ids': '01d5e967-3f9e-4120-a2d3-584214a51bf6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e97c9db-e3ff-43c0-98d3-8300ecb9d187, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=04892866-81a0-44a3-99fa-2493a125f99a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:05:35 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:35Z|00847|binding|INFO|Setting lport 04892866-81a0-44a3-99fa-2493a125f99a up in Southbound
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.039 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 04892866-81a0-44a3-99fa-2493a125f99a in datapath 246a163d-3e9c-48ad-b266-efe63aa146b3 bound to our chassis#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.040 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 246a163d-3e9c-48ad-b266-efe63aa146b3#033[00m
Jan 31 04:05:35 np0005603623 systemd-udevd[322464]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:05:35 np0005603623 systemd-machined[194379]: New machine qemu-94-instance-000000cc.
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.052 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1d8c49a9-705f-41a3-b3c3-3bb16e5b146c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.054 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap246a163d-31 in ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.056 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap246a163d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.056 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[09255462-2d97-4e4c-876d-12de69a5efc2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.058 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[876903fe-e480-4107-808b-8eb9a2df5bf1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 NetworkManager[48970]: <info>  [1769850335.0612] device (tap04892866-81): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:05:35 np0005603623 NetworkManager[48970]: <info>  [1769850335.0632] device (tap04892866-81): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:05:35 np0005603623 systemd[1]: Started Virtual Machine qemu-94-instance-000000cc.
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.069 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[69f04f27-26f7-4402-ac7f-acb13e9d335d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.087 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3d2cf545-69a1-45ea-b70c-4230b4ed8e58]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.110 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[868194df-e20b-4dd5-ac13-3d2b0218badf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 NetworkManager[48970]: <info>  [1769850335.1173] manager: (tap246a163d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/399)
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.117 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[28350a98-5b3a-405c-84b5-1aefb9c1a488]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 systemd-udevd[322468]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.141 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[7ad432f7-3a0c-44e1-a98b-9a481ba8ab90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.143 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[309e7084-bc4a-40ec-9779-ebb1fcf9c837]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 NetworkManager[48970]: <info>  [1769850335.1645] device (tap246a163d-30): carrier: link connected
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.170 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8b866e-40a9-40e2-aacf-85a3df67255e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.182 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dd081a81-0d66-4548-b16c-91d61c4ceb16]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap246a163d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fa:16:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 247], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 947559, 'reachable_time': 30129, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 322500, 'error': None, 'target': 'ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.192 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c14fc4d0-d886-46b8-a825-f048dd5dc490]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefa:1647'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 947559, 'tstamp': 947559}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 322501, 'error': None, 'target': 'ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.203 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[311d9e9c-c8ab-4f76-9ff2-4571f2a9dbf3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap246a163d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fa:16:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 247], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 947559, 'reachable_time': 30129, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 322502, 'error': None, 'target': 'ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.221 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bd987850-9524-45cd-b92c-811e78296bcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:35.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.248 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1fbbe7f3-c506-4691-a359-016393f74143]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.253 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap246a163d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.253 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.253 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap246a163d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.294 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:35 np0005603623 kernel: tap246a163d-30: entered promiscuous mode
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.299 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:35 np0005603623 NetworkManager[48970]: <info>  [1769850335.2996] manager: (tap246a163d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/400)
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.305 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap246a163d-30, col_values=(('external_ids', {'iface-id': '2dca4b95-44ff-4e96-a06f-5892705799a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.307 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.308 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:35 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:35Z|00848|binding|INFO|Releasing lport 2dca4b95-44ff-4e96-a06f-5892705799a3 from this chassis (sb_readonly=0)
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.309 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/246a163d-3e9c-48ad-b266-efe63aa146b3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/246a163d-3e9c-48ad-b266-efe63aa146b3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.309 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb9415a-bedb-48e1-8040-39b34af02db8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.310 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-246a163d-3e9c-48ad-b266-efe63aa146b3
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/246a163d-3e9c-48ad-b266-efe63aa146b3.pid.haproxy
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 246a163d-3e9c-48ad-b266-efe63aa146b3
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:05:35 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:35.311 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3', 'env', 'PROCESS_TAG=haproxy-246a163d-3e9c-48ad-b266-efe63aa146b3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/246a163d-3e9c-48ad-b266-efe63aa146b3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.313 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:35.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.451 226239 DEBUG nova.network.neutron [req-a675c5c2-be9b-441a-b435-4e0b4f54c0ce req-dac15074-04c0-4c99-a501-e0fa46bc7950 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updated VIF entry in instance network info cache for port 04892866-81a0-44a3-99fa-2493a125f99a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.452 226239 DEBUG nova.network.neutron [req-a675c5c2-be9b-441a-b435-4e0b4f54c0ce req-dac15074-04c0-4c99-a501-e0fa46bc7950 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updating instance_info_cache with network_info: [{"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.477 226239 DEBUG oslo_concurrency.lockutils [req-a675c5c2-be9b-441a-b435-4e0b4f54c0ce req-dac15074-04c0-4c99-a501-e0fa46bc7950 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:05:35 np0005603623 podman[322540]: 2026-01-31 09:05:35.602861496 +0000 UTC m=+0.039221902 container create f8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:05:35 np0005603623 systemd[1]: Started libpod-conmon-f8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647.scope.
Jan 31 04:05:35 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:05:35 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a05449295f468bc95192ad43ef15d7f8c187c8ceb2c01e5a638c8d4852301fb8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:05:35 np0005603623 podman[322540]: 2026-01-31 09:05:35.667626007 +0000 UTC m=+0.103986443 container init f8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:05:35 np0005603623 podman[322540]: 2026-01-31 09:05:35.672091167 +0000 UTC m=+0.108451573 container start f8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 04:05:35 np0005603623 podman[322540]: 2026-01-31 09:05:35.580427352 +0000 UTC m=+0.016787778 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:05:35 np0005603623 neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3[322587]: [NOTICE]   (322592) : New worker (322595) forked
Jan 31 04:05:35 np0005603623 neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3[322587]: [NOTICE]   (322592) : Loading success.
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.721 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850335.7206244, 7255c305-4d2b-4335-9af0-2be77f7f097a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.721 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] VM Started (Lifecycle Event)#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.774 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.778 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850335.7215283, 7255c305-4d2b-4335-9af0-2be77f7f097a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.778 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.817 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.821 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:05:35 np0005603623 nova_compute[226235]: 2026-01-31 09:05:35.854 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.044 226239 DEBUG nova.compute.manager [req-53e8a340-e7f6-4953-ad49-20e80edbe6a9 req-4a7caef9-91c8-404d-92b5-c81a317674f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received event network-vif-plugged-04892866-81a0-44a3-99fa-2493a125f99a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.045 226239 DEBUG oslo_concurrency.lockutils [req-53e8a340-e7f6-4953-ad49-20e80edbe6a9 req-4a7caef9-91c8-404d-92b5-c81a317674f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.045 226239 DEBUG oslo_concurrency.lockutils [req-53e8a340-e7f6-4953-ad49-20e80edbe6a9 req-4a7caef9-91c8-404d-92b5-c81a317674f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.045 226239 DEBUG oslo_concurrency.lockutils [req-53e8a340-e7f6-4953-ad49-20e80edbe6a9 req-4a7caef9-91c8-404d-92b5-c81a317674f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.045 226239 DEBUG nova.compute.manager [req-53e8a340-e7f6-4953-ad49-20e80edbe6a9 req-4a7caef9-91c8-404d-92b5-c81a317674f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Processing event network-vif-plugged-04892866-81a0-44a3-99fa-2493a125f99a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.046 226239 DEBUG nova.compute.manager [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.050 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850336.04979, 7255c305-4d2b-4335-9af0-2be77f7f097a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.050 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.052 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.056 226239 INFO nova.virt.libvirt.driver [-] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Instance spawned successfully.#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.057 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.078 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.083 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.086 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.087 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.087 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.087 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.088 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.088 226239 DEBUG nova.virt.libvirt.driver [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.121 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.170 226239 INFO nova.compute.manager [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Took 7.69 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.171 226239 DEBUG nova.compute.manager [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.495 226239 INFO nova.compute.manager [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Took 11.44 seconds to build instance.#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.543 226239 DEBUG oslo_concurrency.lockutils [None req-a55dbff1-9604-41f7-81ea-89b04df9ec30 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.751 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:36 np0005603623 nova_compute[226235]: 2026-01-31 09:05:36.866 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:05:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:37.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:05:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:37.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:37 np0005603623 nova_compute[226235]: 2026-01-31 09:05:37.585 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:05:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/620449192' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:05:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:05:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/620449192' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:05:38 np0005603623 nova_compute[226235]: 2026-01-31 09:05:38.196 226239 DEBUG nova.compute.manager [req-7b1eb216-5c7f-4140-a8bc-f995fffffc78 req-2be26a4d-4f59-4375-87ae-887adb26b921 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received event network-vif-plugged-04892866-81a0-44a3-99fa-2493a125f99a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:05:38 np0005603623 nova_compute[226235]: 2026-01-31 09:05:38.197 226239 DEBUG oslo_concurrency.lockutils [req-7b1eb216-5c7f-4140-a8bc-f995fffffc78 req-2be26a4d-4f59-4375-87ae-887adb26b921 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:38 np0005603623 nova_compute[226235]: 2026-01-31 09:05:38.197 226239 DEBUG oslo_concurrency.lockutils [req-7b1eb216-5c7f-4140-a8bc-f995fffffc78 req-2be26a4d-4f59-4375-87ae-887adb26b921 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:38 np0005603623 nova_compute[226235]: 2026-01-31 09:05:38.197 226239 DEBUG oslo_concurrency.lockutils [req-7b1eb216-5c7f-4140-a8bc-f995fffffc78 req-2be26a4d-4f59-4375-87ae-887adb26b921 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:38 np0005603623 nova_compute[226235]: 2026-01-31 09:05:38.197 226239 DEBUG nova.compute.manager [req-7b1eb216-5c7f-4140-a8bc-f995fffffc78 req-2be26a4d-4f59-4375-87ae-887adb26b921 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] No waiting events found dispatching network-vif-plugged-04892866-81a0-44a3-99fa-2493a125f99a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:05:38 np0005603623 nova_compute[226235]: 2026-01-31 09:05:38.198 226239 WARNING nova.compute.manager [req-7b1eb216-5c7f-4140-a8bc-f995fffffc78 req-2be26a4d-4f59-4375-87ae-887adb26b921 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received unexpected event network-vif-plugged-04892866-81a0-44a3-99fa-2493a125f99a for instance with vm_state active and task_state None.#033[00m
Jan 31 04:05:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:05:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:39.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:05:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:39.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e404 e404: 3 total, 3 up, 3 in
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.114 226239 DEBUG oslo_concurrency.lockutils [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "05185cfd-db91-4d53-8ae4-57a005be337f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.115 226239 DEBUG oslo_concurrency.lockutils [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.116 226239 DEBUG oslo_concurrency.lockutils [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.116 226239 DEBUG oslo_concurrency.lockutils [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.117 226239 DEBUG oslo_concurrency.lockutils [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.118 226239 INFO nova.compute.manager [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Terminating instance#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.120 226239 DEBUG nova.compute.manager [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:05:41 np0005603623 kernel: tapeac28070-b4 (unregistering): left promiscuous mode
Jan 31 04:05:41 np0005603623 NetworkManager[48970]: <info>  [1769850341.1888] device (tapeac28070-b4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.197 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:41 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:41Z|00849|binding|INFO|Releasing lport eac28070-b447-4e98-9966-378755225162 from this chassis (sb_readonly=0)
Jan 31 04:05:41 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:41Z|00850|binding|INFO|Setting lport eac28070-b447-4e98-9966-378755225162 down in Southbound
Jan 31 04:05:41 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:41Z|00851|binding|INFO|Removing iface tapeac28070-b4 ovn-installed in OVS
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.206 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:e8:e5 10.100.0.13'], port_security=['fa:16:3e:30:e8:e5 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '05185cfd-db91-4d53-8ae4-57a005be337f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-650eb345-8346-4e8f-8e83-eeb0117654f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76ce367a834b49dfb5b436848118b860', 'neutron:revision_number': '4', 'neutron:security_group_ids': '93695839-94d5-47ca-8baf-4134adc006a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.238'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ecdc171-9d09-4cba-9bb9-cd2f8ef8e6c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=eac28070-b447-4e98-9966-378755225162) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.208 143258 INFO neutron.agent.ovn.metadata.agent [-] Port eac28070-b447-4e98-9966-378755225162 in datapath 650eb345-8346-4e8f-8e83-eeb0117654f6 unbound from our chassis#033[00m
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.209 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 650eb345-8346-4e8f-8e83-eeb0117654f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.210 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.210 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5a5929d7-cdc4-4423-b42b-58c2d5686e80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.211 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 namespace which is not needed anymore#033[00m
Jan 31 04:05:41 np0005603623 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000c7.scope: Deactivated successfully.
Jan 31 04:05:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:41.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:41 np0005603623 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000c7.scope: Consumed 15.623s CPU time.
Jan 31 04:05:41 np0005603623 systemd-machined[194379]: Machine qemu-93-instance-000000c7 terminated.
Jan 31 04:05:41 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[321422]: [NOTICE]   (321426) : haproxy version is 2.8.14-c23fe91
Jan 31 04:05:41 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[321422]: [NOTICE]   (321426) : path to executable is /usr/sbin/haproxy
Jan 31 04:05:41 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[321422]: [WARNING]  (321426) : Exiting Master process...
Jan 31 04:05:41 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[321422]: [ALERT]    (321426) : Current worker (321428) exited with code 143 (Terminated)
Jan 31 04:05:41 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[321422]: [WARNING]  (321426) : All workers exited. Exiting... (0)
Jan 31 04:05:41 np0005603623 systemd[1]: libpod-2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14.scope: Deactivated successfully.
Jan 31 04:05:41 np0005603623 podman[322676]: 2026-01-31 09:05:41.321641257 +0000 UTC m=+0.036515176 container died 2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 04:05:41 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14-userdata-shm.mount: Deactivated successfully.
Jan 31 04:05:41 np0005603623 systemd[1]: var-lib-containers-storage-overlay-1c7c7070dfdfa5743f1bd6c78236a037c6a7e632dd15fcb4bb6899a29c189ded-merged.mount: Deactivated successfully.
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.360 226239 INFO nova.virt.libvirt.driver [-] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Instance destroyed successfully.#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.360 226239 DEBUG nova.objects.instance [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lazy-loading 'resources' on Instance uuid 05185cfd-db91-4d53-8ae4-57a005be337f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:05:41 np0005603623 podman[322676]: 2026-01-31 09:05:41.365567026 +0000 UTC m=+0.080440925 container cleanup 2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:05:41 np0005603623 systemd[1]: libpod-conmon-2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14.scope: Deactivated successfully.
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.392 226239 DEBUG nova.virt.libvirt.vif [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-1556930195',display_name='tempest-TestVolumeBootPattern-volume-backed-server-1556930195',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-1556930195',id=199,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKF59rccHhsRb4D/CZFLX092sIK+/IvwyeTXpmNTOR//zJFz7TjDtGD2uvwMP0M17UQZZs9W7y6QnwkWlWcOVeaQO/cKF4TdqnQ7iouwZ3EUZ5flP3Y2CaP6XS5xR0AJeQ==',key_name='tempest-keypair-452510270',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:03:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-g4jls9jc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:03:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=05185cfd-db91-4d53-8ae4-57a005be337f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 
4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.393 226239 DEBUG nova.network.os_vif_util [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "eac28070-b447-4e98-9966-378755225162", "address": "fa:16:3e:30:e8:e5", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeac28070-b4", "ovs_interfaceid": "eac28070-b447-4e98-9966-378755225162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.394 226239 DEBUG nova.network.os_vif_util [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:e8:e5,bridge_name='br-int',has_traffic_filtering=True,id=eac28070-b447-4e98-9966-378755225162,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeac28070-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.394 226239 DEBUG os_vif [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:e8:e5,bridge_name='br-int',has_traffic_filtering=True,id=eac28070-b447-4e98-9966-378755225162,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeac28070-b4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.395 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.396 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeac28070-b4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.397 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.400 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.402 226239 INFO os_vif [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:e8:e5,bridge_name='br-int',has_traffic_filtering=True,id=eac28070-b447-4e98-9966-378755225162,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeac28070-b4')#033[00m
Jan 31 04:05:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:41.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:41 np0005603623 podman[322716]: 2026-01-31 09:05:41.416800022 +0000 UTC m=+0.034163753 container remove 2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.421 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ede3fe17-c9fe-486d-94fe-915e06368a89]: (4, ('Sat Jan 31 09:05:41 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 (2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14)\n2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14\nSat Jan 31 09:05:41 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 (2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14)\n2356c43e0cb4c56ef6ce8367838f208c59d78edca346602c0903252ae6a4fb14\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.426 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[54e05ce7-b929-414b-914d-42a876f9baac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.427 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650eb345-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:05:41 np0005603623 kernel: tap650eb345-80: left promiscuous mode
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.432 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.435 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.436 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7157b50b-41f7-4a9d-8a34-d2f857d68d4d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.448 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[33c94672-779c-4a1b-ae46-ea3c6a2dc68c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.450 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ba567d2a-464f-41db-86fb-1b0dd6c0a366]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.460 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9f16d630-d0da-4c07-8a4a-08b865ffffc4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 937078, 'reachable_time': 38844, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 322745, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:41 np0005603623 systemd[1]: run-netns-ovnmeta\x2d650eb345\x2d8346\x2d4e8f\x2d8e83\x2deeb0117654f6.mount: Deactivated successfully.
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.462 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 31 04:05:41 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:41.463 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[9b4e5fc3-8ee0-46f7-bb88-527ad573bae6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.617 226239 INFO nova.virt.libvirt.driver [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Deleting instance files /var/lib/nova/instances/05185cfd-db91-4d53-8ae4-57a005be337f_del
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.618 226239 INFO nova.virt.libvirt.driver [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Deletion of /var/lib/nova/instances/05185cfd-db91-4d53-8ae4-57a005be337f_del complete
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.751 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.782 226239 DEBUG nova.compute.manager [req-69d2e99c-fd74-4960-bb20-9be2e2f204b2 req-0ff9fe54-772d-4831-87b3-376b495193d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Received event network-vif-unplugged-eac28070-b447-4e98-9966-378755225162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.782 226239 DEBUG oslo_concurrency.lockutils [req-69d2e99c-fd74-4960-bb20-9be2e2f204b2 req-0ff9fe54-772d-4831-87b3-376b495193d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.782 226239 DEBUG oslo_concurrency.lockutils [req-69d2e99c-fd74-4960-bb20-9be2e2f204b2 req-0ff9fe54-772d-4831-87b3-376b495193d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.783 226239 DEBUG oslo_concurrency.lockutils [req-69d2e99c-fd74-4960-bb20-9be2e2f204b2 req-0ff9fe54-772d-4831-87b3-376b495193d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.783 226239 DEBUG nova.compute.manager [req-69d2e99c-fd74-4960-bb20-9be2e2f204b2 req-0ff9fe54-772d-4831-87b3-376b495193d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] No waiting events found dispatching network-vif-unplugged-eac28070-b447-4e98-9966-378755225162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.783 226239 DEBUG nova.compute.manager [req-69d2e99c-fd74-4960-bb20-9be2e2f204b2 req-0ff9fe54-772d-4831-87b3-376b495193d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Received event network-vif-unplugged-eac28070-b447-4e98-9966-378755225162 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.818 226239 INFO nova.compute.manager [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Took 0.70 seconds to destroy the instance on the hypervisor.
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.819 226239 DEBUG oslo.service.loopingcall [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.819 226239 DEBUG nova.compute.manager [-] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 31 04:05:41 np0005603623 nova_compute[226235]: 2026-01-31 09:05:41.819 226239 DEBUG nova.network.neutron [-] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 31 04:05:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:43 np0005603623 nova_compute[226235]: 2026-01-31 09:05:43.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:05:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:43.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:43.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:43 np0005603623 nova_compute[226235]: 2026-01-31 09:05:43.754 226239 DEBUG nova.network.neutron [-] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 04:05:43 np0005603623 nova_compute[226235]: 2026-01-31 09:05:43.791 226239 INFO nova.compute.manager [-] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Took 1.97 seconds to deallocate network for instance.
Jan 31 04:05:43 np0005603623 nova_compute[226235]: 2026-01-31 09:05:43.923 226239 DEBUG nova.compute.manager [req-1eb93246-4922-4a44-bdaa-51eab4aabacb req-47d1c894-0518-49df-bf0c-6747133e62e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received event network-changed-04892866-81a0-44a3-99fa-2493a125f99a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:05:43 np0005603623 nova_compute[226235]: 2026-01-31 09:05:43.924 226239 DEBUG nova.compute.manager [req-1eb93246-4922-4a44-bdaa-51eab4aabacb req-47d1c894-0518-49df-bf0c-6747133e62e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Refreshing instance network info cache due to event network-changed-04892866-81a0-44a3-99fa-2493a125f99a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 04:05:43 np0005603623 nova_compute[226235]: 2026-01-31 09:05:43.924 226239 DEBUG oslo_concurrency.lockutils [req-1eb93246-4922-4a44-bdaa-51eab4aabacb req-47d1c894-0518-49df-bf0c-6747133e62e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 04:05:43 np0005603623 nova_compute[226235]: 2026-01-31 09:05:43.924 226239 DEBUG oslo_concurrency.lockutils [req-1eb93246-4922-4a44-bdaa-51eab4aabacb req-47d1c894-0518-49df-bf0c-6747133e62e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 04:05:43 np0005603623 nova_compute[226235]: 2026-01-31 09:05:43.924 226239 DEBUG nova.network.neutron [req-1eb93246-4922-4a44-bdaa-51eab4aabacb req-47d1c894-0518-49df-bf0c-6747133e62e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Refreshing network info cache for port 04892866-81a0-44a3-99fa-2493a125f99a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 04:05:43 np0005603623 nova_compute[226235]: 2026-01-31 09:05:43.985 226239 DEBUG nova.compute.manager [req-26739682-b2d5-455f-82a8-e8f64c4dfa44 req-12d6a2fc-878a-4dbe-b16d-e61fe9dd6b83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Received event network-vif-deleted-eac28070-b447-4e98-9966-378755225162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.024 226239 DEBUG nova.compute.manager [req-9a761eb8-c07b-429c-a130-0b50743a4800 req-5281a38a-23b1-4a05-bbb0-9acb4b05f8e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Received event network-vif-plugged-eac28070-b447-4e98-9966-378755225162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.025 226239 DEBUG oslo_concurrency.lockutils [req-9a761eb8-c07b-429c-a130-0b50743a4800 req-5281a38a-23b1-4a05-bbb0-9acb4b05f8e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.025 226239 DEBUG oslo_concurrency.lockutils [req-9a761eb8-c07b-429c-a130-0b50743a4800 req-5281a38a-23b1-4a05-bbb0-9acb4b05f8e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.025 226239 DEBUG oslo_concurrency.lockutils [req-9a761eb8-c07b-429c-a130-0b50743a4800 req-5281a38a-23b1-4a05-bbb0-9acb4b05f8e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.025 226239 DEBUG nova.compute.manager [req-9a761eb8-c07b-429c-a130-0b50743a4800 req-5281a38a-23b1-4a05-bbb0-9acb4b05f8e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] No waiting events found dispatching network-vif-plugged-eac28070-b447-4e98-9966-378755225162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.025 226239 WARNING nova.compute.manager [req-9a761eb8-c07b-429c-a130-0b50743a4800 req-5281a38a-23b1-4a05-bbb0-9acb4b05f8e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Received unexpected event network-vif-plugged-eac28070-b447-4e98-9966-378755225162 for instance with vm_state active and task_state deleting.
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.153 226239 INFO nova.compute.manager [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Took 0.36 seconds to detach 1 volumes for instance.
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.154 226239 DEBUG nova.compute.manager [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Deleting volume: 47ca8e30-e640-4fca-a274-67bf157244e9 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.434 226239 DEBUG oslo_concurrency.lockutils [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.434 226239 DEBUG oslo_concurrency.lockutils [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.547 226239 DEBUG oslo_concurrency.processutils [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.877 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:05:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:05:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/58092302' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.963 226239 DEBUG oslo_concurrency.processutils [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:05:44 np0005603623 nova_compute[226235]: 2026-01-31 09:05:44.969 226239 DEBUG nova.compute.provider_tree [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 04:05:45 np0005603623 nova_compute[226235]: 2026-01-31 09:05:45.006 226239 DEBUG nova.scheduler.client.report [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:05:45 np0005603623 nova_compute[226235]: 2026-01-31 09:05:45.035 226239 DEBUG oslo_concurrency.lockutils [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:05:45 np0005603623 nova_compute[226235]: 2026-01-31 09:05:45.079 226239 INFO nova.scheduler.client.report [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Deleted allocations for instance 05185cfd-db91-4d53-8ae4-57a005be337f
Jan 31 04:05:45 np0005603623 nova_compute[226235]: 2026-01-31 09:05:45.157 226239 DEBUG oslo_concurrency.lockutils [None req-105cdf53-017c-466d-b7b6-b28afa17462c dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "05185cfd-db91-4d53-8ae4-57a005be337f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:05:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:45.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:45.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:05:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/901373021' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:05:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:05:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/901373021' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:05:46 np0005603623 nova_compute[226235]: 2026-01-31 09:05:46.064 226239 DEBUG nova.compute.manager [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received event network-changed-04892866-81a0-44a3-99fa-2493a125f99a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:05:46 np0005603623 nova_compute[226235]: 2026-01-31 09:05:46.065 226239 DEBUG nova.compute.manager [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Refreshing instance network info cache due to event network-changed-04892866-81a0-44a3-99fa-2493a125f99a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 04:05:46 np0005603623 nova_compute[226235]: 2026-01-31 09:05:46.065 226239 DEBUG oslo_concurrency.lockutils [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 04:05:46 np0005603623 nova_compute[226235]: 2026-01-31 09:05:46.254 226239 DEBUG nova.network.neutron [req-1eb93246-4922-4a44-bdaa-51eab4aabacb req-47d1c894-0518-49df-bf0c-6747133e62e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updated VIF entry in instance network info cache for port 04892866-81a0-44a3-99fa-2493a125f99a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 04:05:46 np0005603623 nova_compute[226235]: 2026-01-31 09:05:46.255 226239 DEBUG nova.network.neutron [req-1eb93246-4922-4a44-bdaa-51eab4aabacb req-47d1c894-0518-49df-bf0c-6747133e62e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updating instance_info_cache with network_info: [{"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:05:46 np0005603623 nova_compute[226235]: 2026-01-31 09:05:46.290 226239 DEBUG oslo_concurrency.lockutils [req-1eb93246-4922-4a44-bdaa-51eab4aabacb req-47d1c894-0518-49df-bf0c-6747133e62e5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:05:46 np0005603623 nova_compute[226235]: 2026-01-31 09:05:46.291 226239 DEBUG oslo_concurrency.lockutils [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:05:46 np0005603623 nova_compute[226235]: 2026-01-31 09:05:46.291 226239 DEBUG nova.network.neutron [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Refreshing network info cache for port 04892866-81a0-44a3-99fa-2493a125f99a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:05:46 np0005603623 nova_compute[226235]: 2026-01-31 09:05:46.398 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:46 np0005603623 nova_compute[226235]: 2026-01-31 09:05:46.753 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:47.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:47.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e405 e405: 3 total, 3 up, 3 in
Jan 31 04:05:47 np0005603623 nova_compute[226235]: 2026-01-31 09:05:47.952 226239 DEBUG nova.network.neutron [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updated VIF entry in instance network info cache for port 04892866-81a0-44a3-99fa-2493a125f99a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:05:47 np0005603623 nova_compute[226235]: 2026-01-31 09:05:47.953 226239 DEBUG nova.network.neutron [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updating instance_info_cache with network_info: [{"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:05:47 np0005603623 nova_compute[226235]: 2026-01-31 09:05:47.985 226239 DEBUG oslo_concurrency.lockutils [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:05:47 np0005603623 nova_compute[226235]: 2026-01-31 09:05:47.986 226239 DEBUG nova.compute.manager [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received event network-changed-04892866-81a0-44a3-99fa-2493a125f99a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:05:47 np0005603623 nova_compute[226235]: 2026-01-31 09:05:47.986 226239 DEBUG nova.compute.manager [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Refreshing instance network info cache due to event network-changed-04892866-81a0-44a3-99fa-2493a125f99a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:05:47 np0005603623 nova_compute[226235]: 2026-01-31 09:05:47.986 226239 DEBUG oslo_concurrency.lockutils [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:05:47 np0005603623 nova_compute[226235]: 2026-01-31 09:05:47.986 226239 DEBUG oslo_concurrency.lockutils [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:05:47 np0005603623 nova_compute[226235]: 2026-01-31 09:05:47.987 226239 DEBUG nova.network.neutron [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Refreshing network info cache for port 04892866-81a0-44a3-99fa-2493a125f99a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:05:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e406 e406: 3 total, 3 up, 3 in
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #172. Immutable memtables: 0.
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.017425) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 109] Flushing memtable with next log file: 172
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850349017546, "job": 109, "event": "flush_started", "num_memtables": 1, "num_entries": 1936, "num_deletes": 254, "total_data_size": 4361053, "memory_usage": 4421680, "flush_reason": "Manual Compaction"}
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 109] Level-0 flush table #173: started
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850349058250, "cf_name": "default", "job": 109, "event": "table_file_creation", "file_number": 173, "file_size": 2864427, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 82985, "largest_seqno": 84916, "table_properties": {"data_size": 2856398, "index_size": 4842, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 17462, "raw_average_key_size": 20, "raw_value_size": 2840038, "raw_average_value_size": 3368, "num_data_blocks": 212, "num_entries": 843, "num_filter_entries": 843, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850192, "oldest_key_time": 1769850192, "file_creation_time": 1769850349, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 173, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 109] Flush lasted 40910 microseconds, and 4806 cpu microseconds.
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.058312) [db/flush_job.cc:967] [default] [JOB 109] Level-0 flush table #173: 2864427 bytes OK
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.058360) [db/memtable_list.cc:519] [default] Level-0 commit table #173 started
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.063513) [db/memtable_list.cc:722] [default] Level-0 commit table #173: memtable #1 done
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.063554) EVENT_LOG_v1 {"time_micros": 1769850349063533, "job": 109, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.063572) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 109] Try to delete WAL files size 4352287, prev total WAL file size 4352287, number of live WAL files 2.
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000169.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.064186) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037323739' seq:72057594037927935, type:22 .. '7061786F730037353331' seq:0, type:0; will stop at (end)
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 110] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 109 Base level 0, inputs: [173(2797KB)], [171(10MB)]
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850349064251, "job": 110, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [173], "files_L6": [171], "score": -1, "input_data_size": 13586913, "oldest_snapshot_seqno": -1}
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 110] Generated table #174: 10463 keys, 11763295 bytes, temperature: kUnknown
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850349207021, "cf_name": "default", "job": 110, "event": "table_file_creation", "file_number": 174, "file_size": 11763295, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11698092, "index_size": 37954, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26181, "raw_key_size": 275959, "raw_average_key_size": 26, "raw_value_size": 11517685, "raw_average_value_size": 1100, "num_data_blocks": 1436, "num_entries": 10463, "num_filter_entries": 10463, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769850349, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 174, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.207313) [db/compaction/compaction_job.cc:1663] [default] [JOB 110] Compacted 1@0 + 1@6 files to L6 => 11763295 bytes
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.212262) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 95.1 rd, 82.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 10.2 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(8.9) write-amplify(4.1) OK, records in: 10987, records dropped: 524 output_compression: NoCompression
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.212290) EVENT_LOG_v1 {"time_micros": 1769850349212278, "job": 110, "event": "compaction_finished", "compaction_time_micros": 142858, "compaction_time_cpu_micros": 23519, "output_level": 6, "num_output_files": 1, "total_output_size": 11763295, "num_input_records": 10987, "num_output_records": 10463, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000173.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850349212967, "job": 110, "event": "table_file_deletion", "file_number": 173}
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000171.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850349213911, "job": 110, "event": "table_file_deletion", "file_number": 171}
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.064107) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.214001) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.214005) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.214007) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.214008) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:05:49 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:05:49.214010) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:05:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:49.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:49.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:49 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:49Z|00104|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e9:bb:07 10.100.0.11
Jan 31 04:05:49 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:49Z|00105|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e9:bb:07 10.100.0.11
Jan 31 04:05:49 np0005603623 nova_compute[226235]: 2026-01-31 09:05:49.955 226239 DEBUG nova.network.neutron [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updated VIF entry in instance network info cache for port 04892866-81a0-44a3-99fa-2493a125f99a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:05:49 np0005603623 nova_compute[226235]: 2026-01-31 09:05:49.955 226239 DEBUG nova.network.neutron [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updating instance_info_cache with network_info: [{"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:05:49 np0005603623 nova_compute[226235]: 2026-01-31 09:05:49.974 226239 DEBUG oslo_concurrency.lockutils [req-722ce5b1-ba74-42d0-b71a-f93e0b413c06 req-319924f2-90f5-4f51-8065-c63199fb0a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:05:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:51.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:51 np0005603623 nova_compute[226235]: 2026-01-31 09:05:51.401 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:51.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:51 np0005603623 nova_compute[226235]: 2026-01-31 09:05:51.754 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:53.126 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=88, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=87) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:05:53 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:53.127 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:05:53 np0005603623 nova_compute[226235]: 2026-01-31 09:05:53.127 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:53.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:05:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:53.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:05:54 np0005603623 podman[322778]: 2026-01-31 09:05:54.972075211 +0000 UTC m=+0.052901480 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 04:05:54 np0005603623 podman[322779]: 2026-01-31 09:05:54.994201625 +0000 UTC m=+0.074989233 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 04:05:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:55.129 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '88'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:55.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:05:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:55.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.614 226239 DEBUG nova.compute.manager [req-1a48c2d7-3f22-4f90-955c-07b5b854d02f req-3acae65a-a247-46f9-961e-c69a7a16a4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received event network-changed-04892866-81a0-44a3-99fa-2493a125f99a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.615 226239 DEBUG nova.compute.manager [req-1a48c2d7-3f22-4f90-955c-07b5b854d02f req-3acae65a-a247-46f9-961e-c69a7a16a4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Refreshing instance network info cache due to event network-changed-04892866-81a0-44a3-99fa-2493a125f99a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.615 226239 DEBUG oslo_concurrency.lockutils [req-1a48c2d7-3f22-4f90-955c-07b5b854d02f req-3acae65a-a247-46f9-961e-c69a7a16a4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.616 226239 DEBUG oslo_concurrency.lockutils [req-1a48c2d7-3f22-4f90-955c-07b5b854d02f req-3acae65a-a247-46f9-961e-c69a7a16a4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.616 226239 DEBUG nova.network.neutron [req-1a48c2d7-3f22-4f90-955c-07b5b854d02f req-3acae65a-a247-46f9-961e-c69a7a16a4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Refreshing network info cache for port 04892866-81a0-44a3-99fa-2493a125f99a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.705 226239 DEBUG oslo_concurrency.lockutils [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Acquiring lock "7255c305-4d2b-4335-9af0-2be77f7f097a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.705 226239 DEBUG oslo_concurrency.lockutils [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.705 226239 DEBUG oslo_concurrency.lockutils [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Acquiring lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.706 226239 DEBUG oslo_concurrency.lockutils [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.706 226239 DEBUG oslo_concurrency.lockutils [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.707 226239 INFO nova.compute.manager [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Terminating instance#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.708 226239 DEBUG nova.compute.manager [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:05:55 np0005603623 kernel: tap04892866-81 (unregistering): left promiscuous mode
Jan 31 04:05:55 np0005603623 NetworkManager[48970]: <info>  [1769850355.7889] device (tap04892866-81): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.795 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:55 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:55Z|00852|binding|INFO|Releasing lport 04892866-81a0-44a3-99fa-2493a125f99a from this chassis (sb_readonly=0)
Jan 31 04:05:55 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:55Z|00853|binding|INFO|Setting lport 04892866-81a0-44a3-99fa-2493a125f99a down in Southbound
Jan 31 04:05:55 np0005603623 ovn_controller[133449]: 2026-01-31T09:05:55Z|00854|binding|INFO|Removing iface tap04892866-81 ovn-installed in OVS
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.798 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:55.805 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:bb:07 10.100.0.11'], port_security=['fa:16:3e:e9:bb:07 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '7255c305-4d2b-4335-9af0-2be77f7f097a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-246a163d-3e9c-48ad-b266-efe63aa146b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c8017efbc64b4244b349174c29a41000', 'neutron:revision_number': '4', 'neutron:security_group_ids': '01d5e967-3f9e-4120-a2d3-584214a51bf6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e97c9db-e3ff-43c0-98d3-8300ecb9d187, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=04892866-81a0-44a3-99fa-2493a125f99a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:05:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:55.807 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 04892866-81a0-44a3-99fa-2493a125f99a in datapath 246a163d-3e9c-48ad-b266-efe63aa146b3 unbound from our chassis#033[00m
Jan 31 04:05:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:55.809 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 246a163d-3e9c-48ad-b266-efe63aa146b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.809 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:55.810 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e9281cdd-23d5-40da-a1b2-2a6bc684f06c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:55.811 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3 namespace which is not needed anymore#033[00m
Jan 31 04:05:55 np0005603623 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000cc.scope: Deactivated successfully.
Jan 31 04:05:55 np0005603623 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000cc.scope: Consumed 12.716s CPU time.
Jan 31 04:05:55 np0005603623 systemd-machined[194379]: Machine qemu-94-instance-000000cc terminated.
Jan 31 04:05:55 np0005603623 neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3[322587]: [NOTICE]   (322592) : haproxy version is 2.8.14-c23fe91
Jan 31 04:05:55 np0005603623 neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3[322587]: [NOTICE]   (322592) : path to executable is /usr/sbin/haproxy
Jan 31 04:05:55 np0005603623 neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3[322587]: [WARNING]  (322592) : Exiting Master process...
Jan 31 04:05:55 np0005603623 neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3[322587]: [ALERT]    (322592) : Current worker (322595) exited with code 143 (Terminated)
Jan 31 04:05:55 np0005603623 neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3[322587]: [WARNING]  (322592) : All workers exited. Exiting... (0)
Jan 31 04:05:55 np0005603623 systemd[1]: libpod-f8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647.scope: Deactivated successfully.
Jan 31 04:05:55 np0005603623 podman[322847]: 2026-01-31 09:05:55.920819034 +0000 UTC m=+0.043566097 container died f8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.924 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.930 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.940 226239 INFO nova.virt.libvirt.driver [-] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Instance destroyed successfully.#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.941 226239 DEBUG nova.objects.instance [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lazy-loading 'resources' on Instance uuid 7255c305-4d2b-4335-9af0-2be77f7f097a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:05:55 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647-userdata-shm.mount: Deactivated successfully.
Jan 31 04:05:55 np0005603623 systemd[1]: var-lib-containers-storage-overlay-a05449295f468bc95192ad43ef15d7f8c187c8ceb2c01e5a638c8d4852301fb8-merged.mount: Deactivated successfully.
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.957 226239 DEBUG nova.virt.libvirt.vif [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:05:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBackupRestore-server-24996787',display_name='tempest-TestVolumeBackupRestore-server-24996787',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebackuprestore-server-24996787',id=204,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDiUuyTkBrBOg5gGHaaMVuI2XTjHgVpG5AlWT1Uy1XvC67KFAXePhqQCRO3vYd53cKQdeQH68O6uBB76SiY4XS9YRqOalJGVNrTbO6XCdiKTmeFqiIqAGHxPMGsjL9U1Cw==',key_name='tempest-TestVolumeBackupRestore-882231007',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:05:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c8017efbc64b4244b349174c29a41000',ramdisk_id='',reservation_id='r-qk5gvh2l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBackupRestore-586195710',owner_user_name='tempest-TestVolumeBackupRestore-586195710-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:05:36Z,user_data=None,user_id='4bd7d1bbf3a8497b8b26f8df83fe8067',uuid=7255c305-4d2b-4335-9af0-2be77f7f097a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.957 226239 DEBUG nova.network.os_vif_util [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Converting VIF {"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.958 226239 DEBUG nova.network.os_vif_util [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:bb:07,bridge_name='br-int',has_traffic_filtering=True,id=04892866-81a0-44a3-99fa-2493a125f99a,network=Network(246a163d-3e9c-48ad-b266-efe63aa146b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04892866-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.959 226239 DEBUG os_vif [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:bb:07,bridge_name='br-int',has_traffic_filtering=True,id=04892866-81a0-44a3-99fa-2493a125f99a,network=Network(246a163d-3e9c-48ad-b266-efe63aa146b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04892866-81') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.961 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.961 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04892866-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.962 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.964 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:55 np0005603623 podman[322847]: 2026-01-31 09:05:55.965321411 +0000 UTC m=+0.088068474 container cleanup f8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 04:05:55 np0005603623 nova_compute[226235]: 2026-01-31 09:05:55.966 226239 INFO os_vif [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:bb:07,bridge_name='br-int',has_traffic_filtering=True,id=04892866-81a0-44a3-99fa-2493a125f99a,network=Network(246a163d-3e9c-48ad-b266-efe63aa146b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04892866-81')#033[00m
Jan 31 04:05:55 np0005603623 systemd[1]: libpod-conmon-f8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647.scope: Deactivated successfully.
Jan 31 04:05:56 np0005603623 podman[322889]: 2026-01-31 09:05:56.017975553 +0000 UTC m=+0.039138119 container remove f8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:05:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:56.021 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[98e09649-0bf1-46ec-9d95-6ebba2511991]: (4, ('Sat Jan 31 09:05:55 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3 (f8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647)\nf8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647\nSat Jan 31 09:05:55 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3 (f8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647)\nf8f76e07682fba94d0cd3c01d11684b65b01c7b209acbe4e2ac7b6b11412e647\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:56.023 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[afc576a5-03d3-4c74-8391-4895ed3c811b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:56.023 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap246a163d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.025 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.030 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:56 np0005603623 kernel: tap246a163d-30: left promiscuous mode
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.032 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:56.034 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[de21afa4-d9d3-48b4-b6a0-b492ad04d6c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:56.044 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0cafbac7-9bfc-429c-9a64-78bc627b3996]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:56.045 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb43c42-5a42-4918-b459-47adab4d47c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:56.055 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2c4b84d0-156f-484a-a747-4173d3988d07]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 947553, 'reachable_time': 37879, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 322922, 'error': None, 'target': 'ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:56 np0005603623 systemd[1]: run-netns-ovnmeta\x2d246a163d\x2d3e9c\x2d48ad\x2db266\x2defe63aa146b3.mount: Deactivated successfully.
Jan 31 04:05:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:56.059 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-246a163d-3e9c-48ad-b266-efe63aa146b3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:05:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:05:56.059 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[eb220b9b-791a-4ae8-9fc4-d3bca4d556a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.210 226239 DEBUG nova.compute.manager [req-6f3171ec-7850-4f34-973a-6c865dfdbcd1 req-85d09e1f-5bf1-489b-942f-d73b71d3a748 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received event network-vif-unplugged-04892866-81a0-44a3-99fa-2493a125f99a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.211 226239 DEBUG oslo_concurrency.lockutils [req-6f3171ec-7850-4f34-973a-6c865dfdbcd1 req-85d09e1f-5bf1-489b-942f-d73b71d3a748 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.212 226239 DEBUG oslo_concurrency.lockutils [req-6f3171ec-7850-4f34-973a-6c865dfdbcd1 req-85d09e1f-5bf1-489b-942f-d73b71d3a748 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.212 226239 DEBUG oslo_concurrency.lockutils [req-6f3171ec-7850-4f34-973a-6c865dfdbcd1 req-85d09e1f-5bf1-489b-942f-d73b71d3a748 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.212 226239 DEBUG nova.compute.manager [req-6f3171ec-7850-4f34-973a-6c865dfdbcd1 req-85d09e1f-5bf1-489b-942f-d73b71d3a748 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] No waiting events found dispatching network-vif-unplugged-04892866-81a0-44a3-99fa-2493a125f99a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.212 226239 DEBUG nova.compute.manager [req-6f3171ec-7850-4f34-973a-6c865dfdbcd1 req-85d09e1f-5bf1-489b-942f-d73b71d3a748 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received event network-vif-unplugged-04892866-81a0-44a3-99fa-2493a125f99a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.359 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850341.3576794, 05185cfd-db91-4d53-8ae4-57a005be337f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.359 226239 INFO nova.compute.manager [-] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.395 226239 DEBUG nova.compute.manager [None req-c5b779cb-b171-4bbc-b3ef-af0315b2631e - - - - - -] [instance: 05185cfd-db91-4d53-8ae4-57a005be337f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.755 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.883 226239 INFO nova.virt.libvirt.driver [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Deleting instance files /var/lib/nova/instances/7255c305-4d2b-4335-9af0-2be77f7f097a_del#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.884 226239 INFO nova.virt.libvirt.driver [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Deletion of /var/lib/nova/instances/7255c305-4d2b-4335-9af0-2be77f7f097a_del complete#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.967 226239 INFO nova.compute.manager [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Took 1.26 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.968 226239 DEBUG oslo.service.loopingcall [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.968 226239 DEBUG nova.compute.manager [-] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:05:56 np0005603623 nova_compute[226235]: 2026-01-31 09:05:56.968 226239 DEBUG nova.network.neutron [-] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:05:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:57.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:57.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:57 np0005603623 nova_compute[226235]: 2026-01-31 09:05:57.647 226239 DEBUG nova.network.neutron [req-1a48c2d7-3f22-4f90-955c-07b5b854d02f req-3acae65a-a247-46f9-961e-c69a7a16a4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updated VIF entry in instance network info cache for port 04892866-81a0-44a3-99fa-2493a125f99a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:05:57 np0005603623 nova_compute[226235]: 2026-01-31 09:05:57.648 226239 DEBUG nova.network.neutron [req-1a48c2d7-3f22-4f90-955c-07b5b854d02f req-3acae65a-a247-46f9-961e-c69a7a16a4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updating instance_info_cache with network_info: [{"id": "04892866-81a0-44a3-99fa-2493a125f99a", "address": "fa:16:3e:e9:bb:07", "network": {"id": "246a163d-3e9c-48ad-b266-efe63aa146b3", "bridge": "br-int", "label": "tempest-TestVolumeBackupRestore-1170900981-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8017efbc64b4244b349174c29a41000", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04892866-81", "ovs_interfaceid": "04892866-81a0-44a3-99fa-2493a125f99a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:05:57 np0005603623 nova_compute[226235]: 2026-01-31 09:05:57.681 226239 DEBUG oslo_concurrency.lockutils [req-1a48c2d7-3f22-4f90-955c-07b5b854d02f req-3acae65a-a247-46f9-961e-c69a7a16a4ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-7255c305-4d2b-4335-9af0-2be77f7f097a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:05:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e407 e407: 3 total, 3 up, 3 in
Jan 31 04:05:58 np0005603623 nova_compute[226235]: 2026-01-31 09:05:58.404 226239 DEBUG nova.compute.manager [req-52f52cfb-2d51-4be7-8c68-e70c37679958 req-52cea9ad-6f4f-4d88-aee5-eb77be0f57ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received event network-vif-plugged-04892866-81a0-44a3-99fa-2493a125f99a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:05:58 np0005603623 nova_compute[226235]: 2026-01-31 09:05:58.405 226239 DEBUG oslo_concurrency.lockutils [req-52f52cfb-2d51-4be7-8c68-e70c37679958 req-52cea9ad-6f4f-4d88-aee5-eb77be0f57ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:58 np0005603623 nova_compute[226235]: 2026-01-31 09:05:58.405 226239 DEBUG oslo_concurrency.lockutils [req-52f52cfb-2d51-4be7-8c68-e70c37679958 req-52cea9ad-6f4f-4d88-aee5-eb77be0f57ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:58 np0005603623 nova_compute[226235]: 2026-01-31 09:05:58.406 226239 DEBUG oslo_concurrency.lockutils [req-52f52cfb-2d51-4be7-8c68-e70c37679958 req-52cea9ad-6f4f-4d88-aee5-eb77be0f57ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:58 np0005603623 nova_compute[226235]: 2026-01-31 09:05:58.406 226239 DEBUG nova.compute.manager [req-52f52cfb-2d51-4be7-8c68-e70c37679958 req-52cea9ad-6f4f-4d88-aee5-eb77be0f57ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] No waiting events found dispatching network-vif-plugged-04892866-81a0-44a3-99fa-2493a125f99a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:05:58 np0005603623 nova_compute[226235]: 2026-01-31 09:05:58.407 226239 WARNING nova.compute.manager [req-52f52cfb-2d51-4be7-8c68-e70c37679958 req-52cea9ad-6f4f-4d88-aee5-eb77be0f57ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received unexpected event network-vif-plugged-04892866-81a0-44a3-99fa-2493a125f99a for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:05:58 np0005603623 nova_compute[226235]: 2026-01-31 09:05:58.581 226239 DEBUG nova.network.neutron [-] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:05:58 np0005603623 nova_compute[226235]: 2026-01-31 09:05:58.634 226239 INFO nova.compute.manager [-] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Took 1.67 seconds to deallocate network for instance.#033[00m
Jan 31 04:05:58 np0005603623 nova_compute[226235]: 2026-01-31 09:05:58.964 226239 INFO nova.compute.manager [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Took 0.33 seconds to detach 1 volumes for instance.#033[00m
Jan 31 04:05:59 np0005603623 nova_compute[226235]: 2026-01-31 09:05:59.067 226239 DEBUG oslo_concurrency.lockutils [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:59 np0005603623 nova_compute[226235]: 2026-01-31 09:05:59.068 226239 DEBUG oslo_concurrency.lockutils [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:59 np0005603623 nova_compute[226235]: 2026-01-31 09:05:59.129 226239 DEBUG oslo_concurrency.processutils [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:59.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:05:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:59.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:05:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1504136551' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:05:59 np0005603623 nova_compute[226235]: 2026-01-31 09:05:59.545 226239 DEBUG oslo_concurrency.processutils [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:59 np0005603623 nova_compute[226235]: 2026-01-31 09:05:59.551 226239 DEBUG nova.compute.provider_tree [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:05:59 np0005603623 nova_compute[226235]: 2026-01-31 09:05:59.594 226239 DEBUG nova.scheduler.client.report [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:05:59 np0005603623 nova_compute[226235]: 2026-01-31 09:05:59.636 226239 DEBUG oslo_concurrency.lockutils [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:59 np0005603623 nova_compute[226235]: 2026-01-31 09:05:59.688 226239 INFO nova.scheduler.client.report [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Deleted allocations for instance 7255c305-4d2b-4335-9af0-2be77f7f097a#033[00m
Jan 31 04:05:59 np0005603623 nova_compute[226235]: 2026-01-31 09:05:59.830 226239 DEBUG oslo_concurrency.lockutils [None req-0521de74-28c5-4d1e-804f-ac1ce2e5a358 4bd7d1bbf3a8497b8b26f8df83fe8067 c8017efbc64b4244b349174c29a41000 - - default default] Lock "7255c305-4d2b-4335-9af0-2be77f7f097a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:59 np0005603623 nova_compute[226235]: 2026-01-31 09:05:59.926 226239 DEBUG nova.compute.manager [req-50a28eca-3da1-40e4-9fa6-e3cc4dbc4d77 req-7930908e-8840-4b9a-bd0a-c21534db661e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Received event network-vif-deleted-04892866-81a0-44a3-99fa-2493a125f99a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:00 np0005603623 nova_compute[226235]: 2026-01-31 09:06:00.964 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:01 np0005603623 nova_compute[226235]: 2026-01-31 09:06:01.012 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:01.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:01.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:01 np0005603623 nova_compute[226235]: 2026-01-31 09:06:01.756 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e408 e408: 3 total, 3 up, 3 in
Jan 31 04:06:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:03.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:03.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e409 e409: 3 total, 3 up, 3 in
Jan 31 04:06:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:05.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:05.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:05 np0005603623 nova_compute[226235]: 2026-01-31 09:06:05.967 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:06 np0005603623 nova_compute[226235]: 2026-01-31 09:06:06.758 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:07.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:07.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:07 np0005603623 nova_compute[226235]: 2026-01-31 09:06:07.511 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:09.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:09.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:10 np0005603623 nova_compute[226235]: 2026-01-31 09:06:10.939 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850355.9380643, 7255c305-4d2b-4335-9af0-2be77f7f097a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:06:10 np0005603623 nova_compute[226235]: 2026-01-31 09:06:10.939 226239 INFO nova.compute.manager [-] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:06:10 np0005603623 nova_compute[226235]: 2026-01-31 09:06:10.970 226239 DEBUG nova.compute.manager [None req-ace0ed74-c51e-4773-8706-75392cda3dfc - - - - - -] [instance: 7255c305-4d2b-4335-9af0-2be77f7f097a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:06:11 np0005603623 nova_compute[226235]: 2026-01-31 09:06:11.014 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:11.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:11.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:11 np0005603623 nova_compute[226235]: 2026-01-31 09:06:11.760 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:12 np0005603623 nova_compute[226235]: 2026-01-31 09:06:12.116 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:12 np0005603623 nova_compute[226235]: 2026-01-31 09:06:12.212 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 e410: 3 total, 3 up, 3 in
Jan 31 04:06:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:13.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:13.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:14 np0005603623 nova_compute[226235]: 2026-01-31 09:06:14.346 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "e48e9071-e65c-4dc9-bee0-7d382977f14a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:14 np0005603623 nova_compute[226235]: 2026-01-31 09:06:14.346 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:14 np0005603623 nova_compute[226235]: 2026-01-31 09:06:14.384 226239 DEBUG nova.compute.manager [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:06:14 np0005603623 nova_compute[226235]: 2026-01-31 09:06:14.474 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:14 np0005603623 nova_compute[226235]: 2026-01-31 09:06:14.474 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:14 np0005603623 nova_compute[226235]: 2026-01-31 09:06:14.483 226239 DEBUG nova.virt.hardware [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:06:14 np0005603623 nova_compute[226235]: 2026-01-31 09:06:14.483 226239 INFO nova.compute.claims [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 04:06:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:06:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2452743407' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:06:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:06:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2452743407' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:06:14 np0005603623 nova_compute[226235]: 2026-01-31 09:06:14.735 226239 DEBUG oslo_concurrency.processutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.012 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Acquiring lock "eff12f1f-d1ce-444d-8d39-ead849efd65a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.013 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.036 226239 DEBUG nova.compute.manager [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.118 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:06:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1818806626' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.142 226239 DEBUG oslo_concurrency.processutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.147 226239 DEBUG nova.compute.provider_tree [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.180 226239 DEBUG nova.scheduler.client.report [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.224 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.225 226239 DEBUG nova.compute.manager [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.228 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.242 226239 DEBUG nova.virt.hardware [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.242 226239 INFO nova.compute.claims [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 04:06:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 31 04:06:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:15.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.329 226239 DEBUG nova.compute.manager [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.330 226239 DEBUG nova.network.neutron [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.378 226239 INFO nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.406 226239 DEBUG nova.compute.manager [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:06:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:15.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.466 226239 INFO nova.virt.block_device [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Booting with volume b457f046-29d9-4842-87e2-ad26874ea748 at /dev/vda#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.471 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.729 226239 DEBUG nova.policy [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc42b92a5dd34d32b6b184bdc7acb092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76ce367a834b49dfb5b436848118b860', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.754 226239 DEBUG os_brick.utils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.755 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.763 236401 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.763 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a34bba-5b07-4e7b-a3f2-a7ab8f661487]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.764 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.768 236401 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.004s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.768 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[1658d0ab-8ea4-4980-b019-6a2578908264]: (4, ('InitiatorName=iqn.1994-05.com.redhat:22dda56d75f7', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.769 236401 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.775 236401 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.005s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.775 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[55f8d8b6-5c77-4bc9-9ee8-49c166a070ef]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.776 236401 DEBUG oslo.privsep.daemon [-] privsep: reply[adc62f2e-bd0a-44eb-af80-424ace6ea5a5]: (4, '4e15465d-7c03-4925-9fc3-ba6a686b7adc') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.776 226239 DEBUG oslo_concurrency.processutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.793 226239 DEBUG oslo_concurrency.processutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "nvme version" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.795 226239 DEBUG os_brick.initiator.connectors.lightos [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.795 226239 DEBUG os_brick.initiator.connectors.lightos [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.795 226239 DEBUG os_brick.initiator.connectors.lightos [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.796 226239 DEBUG os_brick.utils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] <== get_connector_properties: return (41ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:22dda56d75f7', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': '4e15465d-7c03-4925-9fc3-ba6a686b7adc', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.796 226239 DEBUG nova.virt.block_device [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Updating existing volume attachment record: dd219e33-3c3b-490b-be2d-e751477d9a93 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 04:06:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:06:15 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3478514699' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.871 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.875 226239 DEBUG nova.compute.provider_tree [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.900 226239 DEBUG nova.scheduler.client.report [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.934 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.934 226239 DEBUG nova.compute.manager [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.995 226239 DEBUG nova.compute.manager [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:06:15 np0005603623 nova_compute[226235]: 2026-01-31 09:06:15.995 226239 DEBUG nova.network.neutron [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.016 226239 INFO nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.018 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.046 226239 DEBUG nova.compute.manager [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.181 226239 DEBUG nova.compute.manager [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.183 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.183 226239 INFO nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Creating image(s)#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.204 226239 DEBUG nova.storage.rbd_utils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] rbd image eff12f1f-d1ce-444d-8d39-ead849efd65a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.227 226239 DEBUG nova.storage.rbd_utils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] rbd image eff12f1f-d1ce-444d-8d39-ead849efd65a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.251 226239 DEBUG nova.storage.rbd_utils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] rbd image eff12f1f-d1ce-444d-8d39-ead849efd65a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.255 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.313 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.314 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.315 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.315 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.335 226239 DEBUG nova.storage.rbd_utils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] rbd image eff12f1f-d1ce-444d-8d39-ead849efd65a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.339 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 eff12f1f-d1ce-444d-8d39-ead849efd65a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.700 226239 DEBUG nova.policy [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '47dc950da7924a109657b08e4b8b55b7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4d103de0dcec4f3f8bc7f8b3bec01a34', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.705 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 eff12f1f-d1ce-444d-8d39-ead849efd65a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.366s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.765 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.771 226239 DEBUG nova.storage.rbd_utils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] resizing rbd image eff12f1f-d1ce-444d-8d39-ead849efd65a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.887 226239 DEBUG nova.objects.instance [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lazy-loading 'migration_context' on Instance uuid eff12f1f-d1ce-444d-8d39-ead849efd65a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.911 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.912 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Ensure instance console log exists: /var/lib/nova/instances/eff12f1f-d1ce-444d-8d39-ead849efd65a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.912 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.913 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:16 np0005603623 nova_compute[226235]: 2026-01-31 09:06:16.913 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:17.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:17 np0005603623 nova_compute[226235]: 2026-01-31 09:06:17.300 226239 DEBUG nova.compute.manager [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:06:17 np0005603623 nova_compute[226235]: 2026-01-31 09:06:17.301 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:06:17 np0005603623 nova_compute[226235]: 2026-01-31 09:06:17.302 226239 INFO nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Creating image(s)#033[00m
Jan 31 04:06:17 np0005603623 nova_compute[226235]: 2026-01-31 09:06:17.302 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 04:06:17 np0005603623 nova_compute[226235]: 2026-01-31 09:06:17.302 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Ensure instance console log exists: /var/lib/nova/instances/e48e9071-e65c-4dc9-bee0-7d382977f14a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:06:17 np0005603623 nova_compute[226235]: 2026-01-31 09:06:17.303 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:17 np0005603623 nova_compute[226235]: 2026-01-31 09:06:17.303 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:17 np0005603623 nova_compute[226235]: 2026-01-31 09:06:17.303 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:17.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:17 np0005603623 nova_compute[226235]: 2026-01-31 09:06:17.692 226239 DEBUG nova.network.neutron [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Successfully created port: 8ea4b25c-0126-4805-a1e3-bb05f9021074 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:06:18 np0005603623 nova_compute[226235]: 2026-01-31 09:06:18.586 226239 DEBUG nova.network.neutron [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Successfully created port: c5eba141-81b4-4e12-b26f-8a6e0965b727 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:06:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 04:06:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 04:06:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 04:06:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:06:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.183 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.183 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.183 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:06:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:19.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.402 226239 DEBUG nova.network.neutron [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Successfully updated port: 8ea4b25c-0126-4805-a1e3-bb05f9021074 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.426 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "refresh_cache-e48e9071-e65c-4dc9-bee0-7d382977f14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.427 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquired lock "refresh_cache-e48e9071-e65c-4dc9-bee0-7d382977f14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.427 226239 DEBUG nova.network.neutron [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:06:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:19.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.569 226239 DEBUG nova.compute.manager [req-a043055c-372a-448d-9862-5eeac1f3b67a req-ad11835c-c53f-41b8-93f9-443c16e14752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Received event network-changed-8ea4b25c-0126-4805-a1e3-bb05f9021074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.570 226239 DEBUG nova.compute.manager [req-a043055c-372a-448d-9862-5eeac1f3b67a req-ad11835c-c53f-41b8-93f9-443c16e14752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Refreshing instance network info cache due to event network-changed-8ea4b25c-0126-4805-a1e3-bb05f9021074. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.570 226239 DEBUG oslo_concurrency.lockutils [req-a043055c-372a-448d-9862-5eeac1f3b67a req-ad11835c-c53f-41b8-93f9-443c16e14752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e48e9071-e65c-4dc9-bee0-7d382977f14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:06:19 np0005603623 nova_compute[226235]: 2026-01-31 09:06:19.777 226239 DEBUG nova.network.neutron [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:06:20 np0005603623 nova_compute[226235]: 2026-01-31 09:06:20.688 226239 DEBUG nova.network.neutron [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Successfully updated port: c5eba141-81b4-4e12-b26f-8a6e0965b727 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:06:20 np0005603623 nova_compute[226235]: 2026-01-31 09:06:20.717 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Acquiring lock "refresh_cache-eff12f1f-d1ce-444d-8d39-ead849efd65a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:06:20 np0005603623 nova_compute[226235]: 2026-01-31 09:06:20.717 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Acquired lock "refresh_cache-eff12f1f-d1ce-444d-8d39-ead849efd65a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:06:20 np0005603623 nova_compute[226235]: 2026-01-31 09:06:20.718 226239 DEBUG nova.network.neutron [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.021 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.149 226239 DEBUG nova.network.neutron [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.188 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.188 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.188 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.189 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.189 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:21.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:21.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:06:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/469451409' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.574 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.386s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.706 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.707 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4116MB free_disk=20.876556396484375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.707 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.707 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.762 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.814 226239 DEBUG nova.network.neutron [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Updating instance_info_cache with network_info: [{"id": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "address": "fa:16:3e:52:66:9d", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ea4b25c-01", "ovs_interfaceid": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.826 226239 DEBUG nova.compute.manager [req-55bef873-2b9b-44ca-94f4-07ef6096f8c7 req-3014aac9-2f6d-4072-a3fd-f1340c1b97c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Received event network-changed-c5eba141-81b4-4e12-b26f-8a6e0965b727 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.826 226239 DEBUG nova.compute.manager [req-55bef873-2b9b-44ca-94f4-07ef6096f8c7 req-3014aac9-2f6d-4072-a3fd-f1340c1b97c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Refreshing instance network info cache due to event network-changed-c5eba141-81b4-4e12-b26f-8a6e0965b727. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.827 226239 DEBUG oslo_concurrency.lockutils [req-55bef873-2b9b-44ca-94f4-07ef6096f8c7 req-3014aac9-2f6d-4072-a3fd-f1340c1b97c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-eff12f1f-d1ce-444d-8d39-ead849efd65a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.848 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance e48e9071-e65c-4dc9-bee0-7d382977f14a actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.849 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance eff12f1f-d1ce-444d-8d39-ead849efd65a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.849 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.849 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.852 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Releasing lock "refresh_cache-e48e9071-e65c-4dc9-bee0-7d382977f14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.852 226239 DEBUG nova.compute.manager [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Instance network_info: |[{"id": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "address": "fa:16:3e:52:66:9d", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ea4b25c-01", "ovs_interfaceid": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.852 226239 DEBUG oslo_concurrency.lockutils [req-a043055c-372a-448d-9862-5eeac1f3b67a req-ad11835c-c53f-41b8-93f9-443c16e14752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e48e9071-e65c-4dc9-bee0-7d382977f14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.853 226239 DEBUG nova.network.neutron [req-a043055c-372a-448d-9862-5eeac1f3b67a req-ad11835c-c53f-41b8-93f9-443c16e14752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Refreshing network info cache for port 8ea4b25c-0126-4805-a1e3-bb05f9021074 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.855 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Start _get_guest_xml network_info=[{"id": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "address": "fa:16:3e:52:66:9d", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ea4b25c-01", "ovs_interfaceid": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': 'dd219e33-3c3b-490b-be2d-e751477d9a93', 'delete_on_termination': False, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-b457f046-29d9-4842-87e2-ad26874ea748', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'b457f046-29d9-4842-87e2-ad26874ea748', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'e48e9071-e65c-4dc9-bee0-7d382977f14a', 'attached_at': '', 'detached_at': '', 'volume_id': 'b457f046-29d9-4842-87e2-ad26874ea748', 'serial': 'b457f046-29d9-4842-87e2-ad26874ea748'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.859 226239 WARNING nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.863 226239 DEBUG nova.virt.libvirt.host [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.863 226239 DEBUG nova.virt.libvirt.host [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.867 226239 DEBUG nova.virt.libvirt.host [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.867 226239 DEBUG nova.virt.libvirt.host [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.868 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.869 226239 DEBUG nova.virt.hardware [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.869 226239 DEBUG nova.virt.hardware [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.869 226239 DEBUG nova.virt.hardware [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.870 226239 DEBUG nova.virt.hardware [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.870 226239 DEBUG nova.virt.hardware [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.870 226239 DEBUG nova.virt.hardware [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.870 226239 DEBUG nova.virt.hardware [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.871 226239 DEBUG nova.virt.hardware [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.871 226239 DEBUG nova.virt.hardware [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.871 226239 DEBUG nova.virt.hardware [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.871 226239 DEBUG nova.virt.hardware [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.897 226239 DEBUG nova.storage.rbd_utils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image e48e9071-e65c-4dc9-bee0-7d382977f14a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.901 226239 DEBUG oslo_concurrency.processutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:21 np0005603623 nova_compute[226235]: 2026-01-31 09:06:21.975 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.378 226239 DEBUG oslo_concurrency.processutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:06:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/146174279' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.397 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.402 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.424 226239 DEBUG nova.virt.libvirt.vif [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:06:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1753899210',display_name='tempest-TestVolumeBootPattern-server-1753899210',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1753899210',id=206,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNWvTxk1zh2OCmPH3tumEbxR7y880uhj4vJDAspX9r3EATf0w5oe5DG3NVBcNRbWTPcgVwlnXcyaRQZseLc7edDTe4kwfjogsRoplvkAsMWW9sCSaJlX0XBkMxl/Ghv8Fw==',key_name='tempest-TestVolumeBootPattern-11482540',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-171izjoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:06:15Z,user_data=None,user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=e48e9071-e65c-4dc9-bee0-7d382977f14a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "address": "fa:16:3e:52:66:9d", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ea4b25c-01", "ovs_interfaceid": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.425 226239 DEBUG nova.network.os_vif_util [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "address": "fa:16:3e:52:66:9d", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ea4b25c-01", "ovs_interfaceid": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.426 226239 DEBUG nova.network.os_vif_util [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:66:9d,bridge_name='br-int',has_traffic_filtering=True,id=8ea4b25c-0126-4805-a1e3-bb05f9021074,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ea4b25c-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.427 226239 DEBUG nova.objects.instance [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lazy-loading 'pci_devices' on Instance uuid e48e9071-e65c-4dc9-bee0-7d382977f14a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.431 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.458 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  <uuid>e48e9071-e65c-4dc9-bee0-7d382977f14a</uuid>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  <name>instance-000000ce</name>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestVolumeBootPattern-server-1753899210</nova:name>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:06:21</nova:creationTime>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <nova:user uuid="dc42b92a5dd34d32b6b184bdc7acb092">tempest-TestVolumeBootPattern-1392945362-project-member</nova:user>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <nova:project uuid="76ce367a834b49dfb5b436848118b860">tempest-TestVolumeBootPattern-1392945362</nova:project>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <nova:port uuid="8ea4b25c-0126-4805-a1e3-bb05f9021074">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <entry name="serial">e48e9071-e65c-4dc9-bee0-7d382977f14a</entry>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <entry name="uuid">e48e9071-e65c-4dc9-bee0-7d382977f14a</entry>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/e48e9071-e65c-4dc9-bee0-7d382977f14a_disk.config">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="volumes/volume-b457f046-29d9-4842-87e2-ad26874ea748">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <serial>b457f046-29d9-4842-87e2-ad26874ea748</serial>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:52:66:9d"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <target dev="tap8ea4b25c-01"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/e48e9071-e65c-4dc9-bee0-7d382977f14a/console.log" append="off"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:06:22 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:06:22 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:06:22 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:06:22 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.460 226239 DEBUG nova.compute.manager [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Preparing to wait for external event network-vif-plugged-8ea4b25c-0126-4805-a1e3-bb05f9021074 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.460 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.460 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.460 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.461 226239 DEBUG nova.virt.libvirt.vif [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:06:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1753899210',display_name='tempest-TestVolumeBootPattern-server-1753899210',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1753899210',id=206,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNWvTxk1zh2OCmPH3tumEbxR7y880uhj4vJDAspX9r3EATf0w5oe5DG3NVBcNRbWTPcgVwlnXcyaRQZseLc7edDTe4kwfjogsRoplvkAsMWW9sCSaJlX0XBkMxl/Ghv8Fw==',key_name='tempest-TestVolumeBootPattern-11482540',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-171izjoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:06:15Z,user_data=None,user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=e48e9071-e65c-4dc9-bee0-7d382977f14a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "address": "fa:16:3e:52:66:9d", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ea4b25c-01", "ovs_interfaceid": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.461 226239 DEBUG nova.network.os_vif_util [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "address": "fa:16:3e:52:66:9d", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ea4b25c-01", "ovs_interfaceid": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.462 226239 DEBUG nova.network.os_vif_util [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:66:9d,bridge_name='br-int',has_traffic_filtering=True,id=8ea4b25c-0126-4805-a1e3-bb05f9021074,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ea4b25c-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.462 226239 DEBUG os_vif [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:66:9d,bridge_name='br-int',has_traffic_filtering=True,id=8ea4b25c-0126-4805-a1e3-bb05f9021074,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ea4b25c-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.463 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.463 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.464 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.466 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.466 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.467 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.467 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8ea4b25c-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.468 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8ea4b25c-01, col_values=(('external_ids', {'iface-id': '8ea4b25c-0126-4805-a1e3-bb05f9021074', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:66:9d', 'vm-uuid': 'e48e9071-e65c-4dc9-bee0-7d382977f14a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.469 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:22 np0005603623 NetworkManager[48970]: <info>  [1769850382.4701] manager: (tap8ea4b25c-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/401)
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.626 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.628 226239 INFO os_vif [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:66:9d,bridge_name='br-int',has_traffic_filtering=True,id=8ea4b25c-0126-4805-a1e3-bb05f9021074,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ea4b25c-01')#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.711 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.711 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.711 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No VIF found with MAC fa:16:3e:52:66:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.712 226239 INFO nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Using config drive#033[00m
Jan 31 04:06:22 np0005603623 nova_compute[226235]: 2026-01-31 09:06:22.737 226239 DEBUG nova.storage.rbd_utils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image e48e9071-e65c-4dc9-bee0-7d382977f14a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:06:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:23.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:23.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.669 226239 INFO nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Creating config drive at /var/lib/nova/instances/e48e9071-e65c-4dc9-bee0-7d382977f14a/disk.config#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.673 226239 DEBUG oslo_concurrency.processutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e48e9071-e65c-4dc9-bee0-7d382977f14a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp5x19fre1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.796 226239 DEBUG oslo_concurrency.processutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e48e9071-e65c-4dc9-bee0-7d382977f14a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp5x19fre1" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.823 226239 DEBUG nova.storage.rbd_utils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image e48e9071-e65c-4dc9-bee0-7d382977f14a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.827 226239 DEBUG oslo_concurrency.processutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e48e9071-e65c-4dc9-bee0-7d382977f14a/disk.config e48e9071-e65c-4dc9-bee0-7d382977f14a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.927 226239 DEBUG nova.network.neutron [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Updating instance_info_cache with network_info: [{"id": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "address": "fa:16:3e:fa:64:8e", "network": {"id": "38986396-fb3a-4b5b-b8ee-59048b4b9ff2", "bridge": "br-int", "label": "tempest-network-smoke--1874028344", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d103de0dcec4f3f8bc7f8b3bec01a34", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5eba141-81", "ovs_interfaceid": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.946 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Releasing lock "refresh_cache-eff12f1f-d1ce-444d-8d39-ead849efd65a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.947 226239 DEBUG nova.compute.manager [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Instance network_info: |[{"id": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "address": "fa:16:3e:fa:64:8e", "network": {"id": "38986396-fb3a-4b5b-b8ee-59048b4b9ff2", "bridge": "br-int", "label": "tempest-network-smoke--1874028344", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d103de0dcec4f3f8bc7f8b3bec01a34", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5eba141-81", "ovs_interfaceid": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.947 226239 DEBUG oslo_concurrency.lockutils [req-55bef873-2b9b-44ca-94f4-07ef6096f8c7 req-3014aac9-2f6d-4072-a3fd-f1340c1b97c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-eff12f1f-d1ce-444d-8d39-ead849efd65a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.947 226239 DEBUG nova.network.neutron [req-55bef873-2b9b-44ca-94f4-07ef6096f8c7 req-3014aac9-2f6d-4072-a3fd-f1340c1b97c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Refreshing network info cache for port c5eba141-81b4-4e12-b26f-8a6e0965b727 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.950 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Start _get_guest_xml network_info=[{"id": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "address": "fa:16:3e:fa:64:8e", "network": {"id": "38986396-fb3a-4b5b-b8ee-59048b4b9ff2", "bridge": "br-int", "label": "tempest-network-smoke--1874028344", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d103de0dcec4f3f8bc7f8b3bec01a34", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5eba141-81", "ovs_interfaceid": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.954 226239 WARNING nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.960 226239 DEBUG nova.virt.libvirt.host [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.961 226239 DEBUG nova.virt.libvirt.host [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.964 226239 DEBUG nova.virt.libvirt.host [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.964 226239 DEBUG nova.virt.libvirt.host [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.966 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.966 226239 DEBUG nova.virt.hardware [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.967 226239 DEBUG nova.virt.hardware [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.967 226239 DEBUG nova.virt.hardware [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.967 226239 DEBUG nova.virt.hardware [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.967 226239 DEBUG nova.virt.hardware [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.968 226239 DEBUG nova.virt.hardware [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.969 226239 DEBUG nova.virt.hardware [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.970 226239 DEBUG nova.virt.hardware [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.970 226239 DEBUG nova.virt.hardware [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.970 226239 DEBUG nova.virt.hardware [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.971 226239 DEBUG nova.virt.hardware [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:06:24 np0005603623 nova_compute[226235]: 2026-01-31 09:06:24.975 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.002 226239 DEBUG oslo_concurrency.processutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e48e9071-e65c-4dc9-bee0-7d382977f14a/disk.config e48e9071-e65c-4dc9-bee0-7d382977f14a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.175s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.003 226239 INFO nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Deleting local config drive /var/lib/nova/instances/e48e9071-e65c-4dc9-bee0-7d382977f14a/disk.config because it was imported into RBD.#033[00m
Jan 31 04:06:25 np0005603623 kernel: tap8ea4b25c-01: entered promiscuous mode
Jan 31 04:06:25 np0005603623 NetworkManager[48970]: <info>  [1769850385.0562] manager: (tap8ea4b25c-01): new Tun device (/org/freedesktop/NetworkManager/Devices/402)
Jan 31 04:06:25 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:25Z|00855|binding|INFO|Claiming lport 8ea4b25c-0126-4805-a1e3-bb05f9021074 for this chassis.
Jan 31 04:06:25 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:25Z|00856|binding|INFO|8ea4b25c-0126-4805-a1e3-bb05f9021074: Claiming fa:16:3e:52:66:9d 10.100.0.12
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.062 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.076 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:66:9d 10.100.0.12'], port_security=['fa:16:3e:52:66:9d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'e48e9071-e65c-4dc9-bee0-7d382977f14a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-650eb345-8346-4e8f-8e83-eeb0117654f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76ce367a834b49dfb5b436848118b860', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cea15428-ed6f-44a7-98e5-24c0fab7b796', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ecdc171-9d09-4cba-9bb9-cd2f8ef8e6c3, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=8ea4b25c-0126-4805-a1e3-bb05f9021074) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.078 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 8ea4b25c-0126-4805-a1e3-bb05f9021074 in datapath 650eb345-8346-4e8f-8e83-eeb0117654f6 bound to our chassis#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.081 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 650eb345-8346-4e8f-8e83-eeb0117654f6#033[00m
Jan 31 04:06:25 np0005603623 systemd-machined[194379]: New machine qemu-95-instance-000000ce.
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.089 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a9d27356-8be1-4bef-9fb8-4d881d7ccb10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.092 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.091 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap650eb345-81 in ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.092 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap650eb345-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.092 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[44b6ba08-d7f3-43ba-baaf-eec0c7d7fa04]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.094 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9d9d54-7130-4d11-a3f0-785245ba1c3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 systemd[1]: Started Virtual Machine qemu-95-instance-000000ce.
Jan 31 04:06:25 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:25Z|00857|binding|INFO|Setting lport 8ea4b25c-0126-4805-a1e3-bb05f9021074 ovn-installed in OVS
Jan 31 04:06:25 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:25Z|00858|binding|INFO|Setting lport 8ea4b25c-0126-4805-a1e3-bb05f9021074 up in Southbound
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.100 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.107 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[9b37905b-7035-4ee8-803e-e2d2d0e62f30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 systemd-udevd[323782]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.123 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e550aa28-2d1e-4cd5-b295-cf817d9a1455]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 NetworkManager[48970]: <info>  [1769850385.1377] device (tap8ea4b25c-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:06:25 np0005603623 NetworkManager[48970]: <info>  [1769850385.1382] device (tap8ea4b25c-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:06:25 np0005603623 podman[323735]: 2026-01-31 09:06:25.151514117 +0000 UTC m=+0.073691542 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.152 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[f6df8e4e-44a3-4d37-b0fa-a782ee2c80d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 NetworkManager[48970]: <info>  [1769850385.1581] manager: (tap650eb345-80): new Veth device (/org/freedesktop/NetworkManager/Devices/403)
Jan 31 04:06:25 np0005603623 systemd-udevd[323799]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.157 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ae68fec6-bec4-4d6c-b488-a56c569bbc04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 podman[323738]: 2026-01-31 09:06:25.174517749 +0000 UTC m=+0.097328644 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.187 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[9ced483f-19c8-40d7-9bc8-c9dff6353c8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.190 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[a3b1544d-3e47-4ac9-9547-0664191ad125]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 NetworkManager[48970]: <info>  [1769850385.2053] device (tap650eb345-80): carrier: link connected
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.210 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[79f6e337-cf97-4d74-99fb-c87534419c52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.221 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3c38ddd5-7a6d-4060-808c-10fd749c5575]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap650eb345-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:27:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 251], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 952563, 'reachable_time': 22816, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 323831, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.232 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3c498ed9-b346-4bbe-81cc-aa7672a649f5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9f:27ec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 952563, 'tstamp': 952563}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 323832, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.242 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b4e85503-c9ef-4756-8aa8-ccf03ac307d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap650eb345-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:27:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 251], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 952563, 'reachable_time': 22816, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 323833, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.260 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[39b215ec-31db-4638-8d9e-d4d21f050fc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.300 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[457cff94-82cf-4c41-8f9d-d964cfe95e2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.301 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650eb345-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.301 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:06:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:25.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.302 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap650eb345-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.304 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:25 np0005603623 NetworkManager[48970]: <info>  [1769850385.3047] manager: (tap650eb345-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/404)
Jan 31 04:06:25 np0005603623 kernel: tap650eb345-80: entered promiscuous mode
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.306 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap650eb345-80, col_values=(('external_ids', {'iface-id': '74bde109-0188-4ce3-87c3-02a3eb853dc2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.308 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.309 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:06:25 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:25Z|00859|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.309 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5d2f6ddf-90fe-4305-98ae-903dfcc76668]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.310 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-650eb345-8346-4e8f-8e83-eeb0117654f6
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 650eb345-8346-4e8f-8e83-eeb0117654f6
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:06:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:25.311 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'env', 'PROCESS_TAG=haproxy-650eb345-8346-4e8f-8e83-eeb0117654f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/650eb345-8346-4e8f-8e83-eeb0117654f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.315 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.421 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.446 226239 DEBUG nova.storage.rbd_utils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] rbd image eff12f1f-d1ce-444d-8d39-ead849efd65a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.455 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:25.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.476 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:25 np0005603623 podman[323923]: 2026-01-31 09:06:25.608194404 +0000 UTC m=+0.045240631 container create a0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Jan 31 04:06:25 np0005603623 systemd[1]: Started libpod-conmon-a0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562.scope.
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.639 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850385.6393147, e48e9071-e65c-4dc9-bee0-7d382977f14a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.640 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] VM Started (Lifecycle Event)#033[00m
Jan 31 04:06:25 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:06:25 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6fcf083e6ec477a4290dfa6a1b70d4e83157162f68b7c3147024946312893b8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:06:25 np0005603623 podman[323923]: 2026-01-31 09:06:25.656622043 +0000 UTC m=+0.093668290 container init a0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 04:06:25 np0005603623 podman[323923]: 2026-01-31 09:06:25.661379662 +0000 UTC m=+0.098425879 container start a0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:06:25 np0005603623 podman[323923]: 2026-01-31 09:06:25.584090288 +0000 UTC m=+0.021136515 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.675 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:06:25 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[323961]: [NOTICE]   (323965) : New worker (323967) forked
Jan 31 04:06:25 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[323961]: [NOTICE]   (323965) : Loading success.
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.680 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850385.642251, e48e9071-e65c-4dc9-bee0-7d382977f14a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.680 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.706 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.709 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.733 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:06:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:06:25 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2269048728' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.890 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.892 226239 DEBUG nova.virt.libvirt.vif [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:06:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-680379111-access_point-537838458',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-680379111-access_point-537838458',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-680379111-acc',id=207,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgM7X9M9wZlSF1gyfdYak2nFokjAMhj//zRi82vONcQIZQBgxY0V5vu8trZHrF2NFzC4L6xfyfURFS38dJDUkA6k9r7Oe+j0g6OpQQYsDS30vFo6R9Q4mijKm1iKtRQig==',key_name='tempest-TestSecurityGroupsBasicOps-508360795',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d103de0dcec4f3f8bc7f8b3bec01a34',ramdisk_id='',reservation_id='r-9a7z3o7m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-680379111',owner_user_name='tempest-TestSecurityGroupsBasicOps-680379111-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:06:16Z,user_data=None,user_id='47dc950da7924a109657b08e4b8b55b7',uuid=eff12f1f-d1ce-444d-8d39-ead849efd65a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "address": "fa:16:3e:fa:64:8e", "network": {"id": "38986396-fb3a-4b5b-b8ee-59048b4b9ff2", "bridge": "br-int", "label": "tempest-network-smoke--1874028344", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "4d103de0dcec4f3f8bc7f8b3bec01a34", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5eba141-81", "ovs_interfaceid": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.892 226239 DEBUG nova.network.os_vif_util [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Converting VIF {"id": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "address": "fa:16:3e:fa:64:8e", "network": {"id": "38986396-fb3a-4b5b-b8ee-59048b4b9ff2", "bridge": "br-int", "label": "tempest-network-smoke--1874028344", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d103de0dcec4f3f8bc7f8b3bec01a34", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5eba141-81", "ovs_interfaceid": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.893 226239 DEBUG nova.network.os_vif_util [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:64:8e,bridge_name='br-int',has_traffic_filtering=True,id=c5eba141-81b4-4e12-b26f-8a6e0965b727,network=Network(38986396-fb3a-4b5b-b8ee-59048b4b9ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5eba141-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.894 226239 DEBUG nova.objects.instance [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lazy-loading 'pci_devices' on Instance uuid eff12f1f-d1ce-444d-8d39-ead849efd65a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.914 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  <uuid>eff12f1f-d1ce-444d-8d39-ead849efd65a</uuid>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  <name>instance-000000cf</name>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-680379111-access_point-537838458</nova:name>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:06:24</nova:creationTime>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <nova:user uuid="47dc950da7924a109657b08e4b8b55b7">tempest-TestSecurityGroupsBasicOps-680379111-project-member</nova:user>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <nova:project uuid="4d103de0dcec4f3f8bc7f8b3bec01a34">tempest-TestSecurityGroupsBasicOps-680379111</nova:project>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <nova:port uuid="c5eba141-81b4-4e12-b26f-8a6e0965b727">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <entry name="serial">eff12f1f-d1ce-444d-8d39-ead849efd65a</entry>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <entry name="uuid">eff12f1f-d1ce-444d-8d39-ead849efd65a</entry>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/eff12f1f-d1ce-444d-8d39-ead849efd65a_disk">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/eff12f1f-d1ce-444d-8d39-ead849efd65a_disk.config">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:fa:64:8e"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <target dev="tapc5eba141-81"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/eff12f1f-d1ce-444d-8d39-ead849efd65a/console.log" append="off"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:06:25 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:06:25 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:06:25 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:06:25 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.914 226239 DEBUG nova.compute.manager [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Preparing to wait for external event network-vif-plugged-c5eba141-81b4-4e12-b26f-8a6e0965b727 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.915 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Acquiring lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.915 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.915 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.916 226239 DEBUG nova.virt.libvirt.vif [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:06:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-680379111-access_point-537838458',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-680379111-access_point-537838458',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-680379111-acc',id=207,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgM7X9M9wZlSF1gyfdYak2nFokjAMhj//zRi82vONcQIZQBgxY0V5vu8trZHrF2NFzC4L6xfyfURFS38dJDUkA6k9r7Oe+j0g6OpQQYsDS30vFo6R9Q4mijKm1iKtRQig==',key_name='tempest-TestSecurityGroupsBasicOps-508360795',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d103de0dcec4f3f8bc7f8b3bec01a34',ramdisk_id='',reservation_id='r-9a7z3o7m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-680379111',owner_user_name='tempest-TestSecurityGroupsBasicOps-680379111-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:06:16Z,user_data=None,user_id='47dc950da7924a109657b08e4b8b55b7',uuid=eff12f1f-d1ce-444d-8d39-ead849efd65a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "address": "fa:16:3e:fa:64:8e", "network": {"id": "38986396-fb3a-4b5b-b8ee-59048b4b9ff2", "bridge": "br-int", "label": "tempest-network-smoke--1874028344", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "4d103de0dcec4f3f8bc7f8b3bec01a34", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5eba141-81", "ovs_interfaceid": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.916 226239 DEBUG nova.network.os_vif_util [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Converting VIF {"id": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "address": "fa:16:3e:fa:64:8e", "network": {"id": "38986396-fb3a-4b5b-b8ee-59048b4b9ff2", "bridge": "br-int", "label": "tempest-network-smoke--1874028344", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d103de0dcec4f3f8bc7f8b3bec01a34", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5eba141-81", "ovs_interfaceid": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.916 226239 DEBUG nova.network.os_vif_util [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:64:8e,bridge_name='br-int',has_traffic_filtering=True,id=c5eba141-81b4-4e12-b26f-8a6e0965b727,network=Network(38986396-fb3a-4b5b-b8ee-59048b4b9ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5eba141-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.917 226239 DEBUG os_vif [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:64:8e,bridge_name='br-int',has_traffic_filtering=True,id=c5eba141-81b4-4e12-b26f-8a6e0965b727,network=Network(38986396-fb3a-4b5b-b8ee-59048b4b9ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5eba141-81') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.917 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.917 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.918 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.920 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.920 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5eba141-81, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.921 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc5eba141-81, col_values=(('external_ids', {'iface-id': 'c5eba141-81b4-4e12-b26f-8a6e0965b727', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:64:8e', 'vm-uuid': 'eff12f1f-d1ce-444d-8d39-ead849efd65a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.922 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:25 np0005603623 NetworkManager[48970]: <info>  [1769850385.9240] manager: (tapc5eba141-81): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/405)
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.926 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.928 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.929 226239 INFO os_vif [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:64:8e,bridge_name='br-int',has_traffic_filtering=True,id=c5eba141-81b4-4e12-b26f-8a6e0965b727,network=Network(38986396-fb3a-4b5b-b8ee-59048b4b9ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5eba141-81')#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.979 226239 DEBUG nova.compute.manager [req-3a8912f7-86c4-427d-86b2-3a3529e1c1d0 req-d95b052a-92dd-4b6f-ad6d-8c37ffb0a7e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Received event network-vif-plugged-8ea4b25c-0126-4805-a1e3-bb05f9021074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.980 226239 DEBUG oslo_concurrency.lockutils [req-3a8912f7-86c4-427d-86b2-3a3529e1c1d0 req-d95b052a-92dd-4b6f-ad6d-8c37ffb0a7e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.980 226239 DEBUG oslo_concurrency.lockutils [req-3a8912f7-86c4-427d-86b2-3a3529e1c1d0 req-d95b052a-92dd-4b6f-ad6d-8c37ffb0a7e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.980 226239 DEBUG oslo_concurrency.lockutils [req-3a8912f7-86c4-427d-86b2-3a3529e1c1d0 req-d95b052a-92dd-4b6f-ad6d-8c37ffb0a7e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.980 226239 DEBUG nova.compute.manager [req-3a8912f7-86c4-427d-86b2-3a3529e1c1d0 req-d95b052a-92dd-4b6f-ad6d-8c37ffb0a7e9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Processing event network-vif-plugged-8ea4b25c-0126-4805-a1e3-bb05f9021074 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.981 226239 DEBUG nova.compute.manager [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.985 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850385.9852955, e48e9071-e65c-4dc9-bee0-7d382977f14a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.986 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.988 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.991 226239 INFO nova.virt.libvirt.driver [-] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Instance spawned successfully.#033[00m
Jan 31 04:06:25 np0005603623 nova_compute[226235]: 2026-01-31 09:06:25.992 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.005 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.006 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.006 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] No VIF found with MAC fa:16:3e:fa:64:8e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.007 226239 INFO nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Using config drive#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.034 226239 DEBUG nova.storage.rbd_utils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] rbd image eff12f1f-d1ce-444d-8d39-ead849efd65a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.043 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.047 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.048 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.048 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.048 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.049 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.049 226239 DEBUG nova.virt.libvirt.driver [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.055 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.084 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.129 226239 INFO nova.compute.manager [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Took 8.83 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.129 226239 DEBUG nova.compute.manager [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.246 226239 INFO nova.compute.manager [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Took 11.81 seconds to build instance.#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.276 226239 DEBUG oslo_concurrency.lockutils [None req-fd0e8c55-cbd5-482c-8427-a59503fb66c4 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:26 np0005603623 nova_compute[226235]: 2026-01-31 09:06:26.764 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:27 np0005603623 nova_compute[226235]: 2026-01-31 09:06:27.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:27.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:27.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:27 np0005603623 nova_compute[226235]: 2026-01-31 09:06:27.669 226239 INFO nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Creating config drive at /var/lib/nova/instances/eff12f1f-d1ce-444d-8d39-ead849efd65a/disk.config#033[00m
Jan 31 04:06:27 np0005603623 nova_compute[226235]: 2026-01-31 09:06:27.673 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eff12f1f-d1ce-444d-8d39-ead849efd65a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmptee22h3l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:27 np0005603623 nova_compute[226235]: 2026-01-31 09:06:27.722 226239 DEBUG nova.network.neutron [req-a043055c-372a-448d-9862-5eeac1f3b67a req-ad11835c-c53f-41b8-93f9-443c16e14752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Updated VIF entry in instance network info cache for port 8ea4b25c-0126-4805-a1e3-bb05f9021074. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:06:27 np0005603623 nova_compute[226235]: 2026-01-31 09:06:27.723 226239 DEBUG nova.network.neutron [req-a043055c-372a-448d-9862-5eeac1f3b67a req-ad11835c-c53f-41b8-93f9-443c16e14752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Updating instance_info_cache with network_info: [{"id": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "address": "fa:16:3e:52:66:9d", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ea4b25c-01", "ovs_interfaceid": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:06:27 np0005603623 nova_compute[226235]: 2026-01-31 09:06:27.749 226239 DEBUG oslo_concurrency.lockutils [req-a043055c-372a-448d-9862-5eeac1f3b67a req-ad11835c-c53f-41b8-93f9-443c16e14752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e48e9071-e65c-4dc9-bee0-7d382977f14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:06:27 np0005603623 nova_compute[226235]: 2026-01-31 09:06:27.794 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eff12f1f-d1ce-444d-8d39-ead849efd65a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmptee22h3l" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:27 np0005603623 nova_compute[226235]: 2026-01-31 09:06:27.819 226239 DEBUG nova.storage.rbd_utils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] rbd image eff12f1f-d1ce-444d-8d39-ead849efd65a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:06:27 np0005603623 nova_compute[226235]: 2026-01-31 09:06:27.822 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/eff12f1f-d1ce-444d-8d39-ead849efd65a/disk.config eff12f1f-d1ce-444d-8d39-ead849efd65a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:27 np0005603623 nova_compute[226235]: 2026-01-31 09:06:27.966 226239 DEBUG oslo_concurrency.processutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/eff12f1f-d1ce-444d-8d39-ead849efd65a/disk.config eff12f1f-d1ce-444d-8d39-ead849efd65a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:27 np0005603623 nova_compute[226235]: 2026-01-31 09:06:27.967 226239 INFO nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Deleting local config drive /var/lib/nova/instances/eff12f1f-d1ce-444d-8d39-ead849efd65a/disk.config because it was imported into RBD.#033[00m
Jan 31 04:06:28 np0005603623 kernel: tapc5eba141-81: entered promiscuous mode
Jan 31 04:06:28 np0005603623 NetworkManager[48970]: <info>  [1769850388.0043] manager: (tapc5eba141-81): new Tun device (/org/freedesktop/NetworkManager/Devices/406)
Jan 31 04:06:28 np0005603623 systemd-udevd[323822]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:06:28 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:28Z|00860|binding|INFO|Claiming lport c5eba141-81b4-4e12-b26f-8a6e0965b727 for this chassis.
Jan 31 04:06:28 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:28Z|00861|binding|INFO|c5eba141-81b4-4e12-b26f-8a6e0965b727: Claiming fa:16:3e:fa:64:8e 10.100.0.8
Jan 31 04:06:28 np0005603623 NetworkManager[48970]: <info>  [1769850388.0127] device (tapc5eba141-81): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:06:28 np0005603623 NetworkManager[48970]: <info>  [1769850388.0131] device (tapc5eba141-81): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.011 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.018 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:64:8e 10.100.0.8'], port_security=['fa:16:3e:fa:64:8e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eff12f1f-d1ce-444d-8d39-ead849efd65a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38986396-fb3a-4b5b-b8ee-59048b4b9ff2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d103de0dcec4f3f8bc7f8b3bec01a34', 'neutron:revision_number': '2', 'neutron:security_group_ids': '13209e92-378b-49f3-97e4-1aa68fc1f020 c12c0ff4-8bf2-46b4-adb4-e72c424153f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d93e478-add3-4b37-b54b-67462a306e66, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=c5eba141-81b4-4e12-b26f-8a6e0965b727) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.019 143258 INFO neutron.agent.ovn.metadata.agent [-] Port c5eba141-81b4-4e12-b26f-8a6e0965b727 in datapath 38986396-fb3a-4b5b-b8ee-59048b4b9ff2 bound to our chassis#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.021 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 38986396-fb3a-4b5b-b8ee-59048b4b9ff2#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.028 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2fbb0d4b-1b54-452d-aa33-cf623ae4abd0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.029 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap38986396-f1 in ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.031 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap38986396-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.031 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[63988142-d944-4790-b288-b5f63c93bb1f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.032 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d767f9b2-5a06-4779-a5d6-0f1171d6cc03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.033 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:28 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:28Z|00862|binding|INFO|Setting lport c5eba141-81b4-4e12-b26f-8a6e0965b727 ovn-installed in OVS
Jan 31 04:06:28 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:28Z|00863|binding|INFO|Setting lport c5eba141-81b4-4e12-b26f-8a6e0965b727 up in Southbound
Jan 31 04:06:28 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 04:06:28 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 04:06:28 np0005603623 systemd-machined[194379]: New machine qemu-96-instance-000000cf.
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.044 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.040 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[70bf4f1c-6aff-4098-8ce1-78d6ac919fbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 systemd[1]: Started Virtual Machine qemu-96-instance-000000cf.
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.056 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d6e747af-29f3-4322-a339-c3c358b9092d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.074 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d4cec5-ce47-4452-9903-bca132fc8514]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.078 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d78bb3ab-0f4f-4ef6-89d6-84a1054f0224]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 NetworkManager[48970]: <info>  [1769850388.0836] manager: (tap38986396-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/407)
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.101 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4f07da03-8400-499f-9bfc-034ee13f8862]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 systemd-udevd[324066]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.104 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5cd95d-5636-48e1-81ee-448673d646bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 NetworkManager[48970]: <info>  [1769850388.1236] device (tap38986396-f0): carrier: link connected
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.128 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[4dbb4e24-fe34-48b4-9208-8977819f8774]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.137 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ca354654-7ef3-403a-8c7a-49cf2e5c0c13]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap38986396-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:b7:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 952855, 'reachable_time': 25957, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 324086, 'error': None, 'target': 'ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.145 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ff4fd9a4-744a-4a0c-b816-b8af6c28664c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:b740'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 952855, 'tstamp': 952855}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 324087, 'error': None, 'target': 'ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.154 226239 DEBUG nova.compute.manager [req-496a604d-c58c-43e9-9294-008c5f1bfd30 req-f09e2afe-917e-4b10-9878-04aaa4a5b87e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Received event network-vif-plugged-8ea4b25c-0126-4805-a1e3-bb05f9021074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.154 226239 DEBUG oslo_concurrency.lockutils [req-496a604d-c58c-43e9-9294-008c5f1bfd30 req-f09e2afe-917e-4b10-9878-04aaa4a5b87e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.154 226239 DEBUG oslo_concurrency.lockutils [req-496a604d-c58c-43e9-9294-008c5f1bfd30 req-f09e2afe-917e-4b10-9878-04aaa4a5b87e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.155 226239 DEBUG oslo_concurrency.lockutils [req-496a604d-c58c-43e9-9294-008c5f1bfd30 req-f09e2afe-917e-4b10-9878-04aaa4a5b87e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.155 226239 DEBUG nova.compute.manager [req-496a604d-c58c-43e9-9294-008c5f1bfd30 req-f09e2afe-917e-4b10-9878-04aaa4a5b87e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] No waiting events found dispatching network-vif-plugged-8ea4b25c-0126-4805-a1e3-bb05f9021074 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.155 226239 WARNING nova.compute.manager [req-496a604d-c58c-43e9-9294-008c5f1bfd30 req-f09e2afe-917e-4b10-9878-04aaa4a5b87e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Received unexpected event network-vif-plugged-8ea4b25c-0126-4805-a1e3-bb05f9021074 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.155 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[325452f0-a49b-403b-963a-2faa1883b6b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap38986396-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:b7:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 952855, 'reachable_time': 25957, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 324088, 'error': None, 'target': 'ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.173 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4339b86e-74d2-49b4-bd3d-2fd779fda6fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.204 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[83181920-b77b-4f31-9021-04d53644ae52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.206 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap38986396-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.206 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.206 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap38986396-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.208 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:28 np0005603623 NetworkManager[48970]: <info>  [1769850388.2087] manager: (tap38986396-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/408)
Jan 31 04:06:28 np0005603623 kernel: tap38986396-f0: entered promiscuous mode
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.209 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.210 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap38986396-f0, col_values=(('external_ids', {'iface-id': 'b626bf32-64d8-4511-bbcf-dc852c82e121'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.211 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:28 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:28Z|00864|binding|INFO|Releasing lport b626bf32-64d8-4511-bbcf-dc852c82e121 from this chassis (sb_readonly=0)
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.215 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.216 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/38986396-fb3a-4b5b-b8ee-59048b4b9ff2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/38986396-fb3a-4b5b-b8ee-59048b4b9ff2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.217 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[534dbf07-617a-4ea8-8a5c-b397e2211965]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.217 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-38986396-fb3a-4b5b-b8ee-59048b4b9ff2
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/38986396-fb3a-4b5b-b8ee-59048b4b9ff2.pid.haproxy
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 38986396-fb3a-4b5b-b8ee-59048b4b9ff2
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:06:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:28.218 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2', 'env', 'PROCESS_TAG=haproxy-38986396-fb3a-4b5b-b8ee-59048b4b9ff2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/38986396-fb3a-4b5b-b8ee-59048b4b9ff2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:06:28 np0005603623 podman[324138]: 2026-01-31 09:06:28.502727127 +0000 UTC m=+0.044450985 container create 5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 04:06:28 np0005603623 systemd[1]: Started libpod-conmon-5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6.scope.
Jan 31 04:06:28 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:06:28 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70033cc0e3f2639d329fbcb611bb996338a790fba9ef3db75fe1a2416f515aa6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.570 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850388.5700693, eff12f1f-d1ce-444d-8d39-ead849efd65a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.571 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] VM Started (Lifecycle Event)#033[00m
Jan 31 04:06:28 np0005603623 podman[324138]: 2026-01-31 09:06:28.578715251 +0000 UTC m=+0.120439129 container init 5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 04:06:28 np0005603623 podman[324138]: 2026-01-31 09:06:28.482494672 +0000 UTC m=+0.024218550 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:06:28 np0005603623 podman[324138]: 2026-01-31 09:06:28.583053208 +0000 UTC m=+0.124777076 container start 5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.597 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.601 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850388.5710673, eff12f1f-d1ce-444d-8d39-ead849efd65a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.602 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:06:28 np0005603623 neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2[324174]: [NOTICE]   (324180) : New worker (324182) forked
Jan 31 04:06:28 np0005603623 neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2[324174]: [NOTICE]   (324180) : Loading success.
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.606 226239 DEBUG nova.compute.manager [req-0eb8a843-e836-4784-8a5e-9aa5eda29492 req-4c908180-33c6-48bb-872b-40939481cfd3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Received event network-vif-plugged-c5eba141-81b4-4e12-b26f-8a6e0965b727 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.606 226239 DEBUG oslo_concurrency.lockutils [req-0eb8a843-e836-4784-8a5e-9aa5eda29492 req-4c908180-33c6-48bb-872b-40939481cfd3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.607 226239 DEBUG oslo_concurrency.lockutils [req-0eb8a843-e836-4784-8a5e-9aa5eda29492 req-4c908180-33c6-48bb-872b-40939481cfd3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.607 226239 DEBUG oslo_concurrency.lockutils [req-0eb8a843-e836-4784-8a5e-9aa5eda29492 req-4c908180-33c6-48bb-872b-40939481cfd3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.607 226239 DEBUG nova.compute.manager [req-0eb8a843-e836-4784-8a5e-9aa5eda29492 req-4c908180-33c6-48bb-872b-40939481cfd3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Processing event network-vif-plugged-c5eba141-81b4-4e12-b26f-8a6e0965b727 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.608 226239 DEBUG nova.compute.manager [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.612 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.616 226239 INFO nova.virt.libvirt.driver [-] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Instance spawned successfully.#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.617 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.636 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.642 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850388.610834, eff12f1f-d1ce-444d-8d39-ead849efd65a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.643 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.650 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.650 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.651 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.651 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.652 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.652 226239 DEBUG nova.virt.libvirt.driver [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.686 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.689 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.744 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.766 226239 INFO nova.compute.manager [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Took 12.58 seconds to spawn the instance on the hypervisor.
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.767 226239 DEBUG nova.compute.manager [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.953 226239 INFO nova.compute.manager [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Took 13.86 seconds to build instance.
Jan 31 04:06:28 np0005603623 nova_compute[226235]: 2026-01-31 09:06:28.980 226239 DEBUG oslo_concurrency.lockutils [None req-b16b2e85-50a1-4748-ba5b-af6d4436111f 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.967s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:06:29 np0005603623 nova_compute[226235]: 2026-01-31 09:06:29.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:06:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:29.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:29.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:29 np0005603623 nova_compute[226235]: 2026-01-31 09:06:29.689 226239 DEBUG nova.network.neutron [req-55bef873-2b9b-44ca-94f4-07ef6096f8c7 req-3014aac9-2f6d-4072-a3fd-f1340c1b97c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Updated VIF entry in instance network info cache for port c5eba141-81b4-4e12-b26f-8a6e0965b727. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 04:06:29 np0005603623 nova_compute[226235]: 2026-01-31 09:06:29.689 226239 DEBUG nova.network.neutron [req-55bef873-2b9b-44ca-94f4-07ef6096f8c7 req-3014aac9-2f6d-4072-a3fd-f1340c1b97c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Updating instance_info_cache with network_info: [{"id": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "address": "fa:16:3e:fa:64:8e", "network": {"id": "38986396-fb3a-4b5b-b8ee-59048b4b9ff2", "bridge": "br-int", "label": "tempest-network-smoke--1874028344", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d103de0dcec4f3f8bc7f8b3bec01a34", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5eba141-81", "ovs_interfaceid": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 04:06:29 np0005603623 nova_compute[226235]: 2026-01-31 09:06:29.710 226239 DEBUG oslo_concurrency.lockutils [req-55bef873-2b9b-44ca-94f4-07ef6096f8c7 req-3014aac9-2f6d-4072-a3fd-f1340c1b97c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-eff12f1f-d1ce-444d-8d39-ead849efd65a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 04:06:30 np0005603623 nova_compute[226235]: 2026-01-31 09:06:30.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:06:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:30.158 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:06:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:30.159 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:06:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:30.159 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:06:30 np0005603623 nova_compute[226235]: 2026-01-31 09:06:30.819 226239 DEBUG nova.compute.manager [req-f753bf7a-44a4-46c4-8a9d-f45edf2c426d req-76d0a6d4-2cb1-48f1-be12-e00e7d78e32e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Received event network-vif-plugged-c5eba141-81b4-4e12-b26f-8a6e0965b727 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:06:30 np0005603623 nova_compute[226235]: 2026-01-31 09:06:30.820 226239 DEBUG oslo_concurrency.lockutils [req-f753bf7a-44a4-46c4-8a9d-f45edf2c426d req-76d0a6d4-2cb1-48f1-be12-e00e7d78e32e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:06:30 np0005603623 nova_compute[226235]: 2026-01-31 09:06:30.820 226239 DEBUG oslo_concurrency.lockutils [req-f753bf7a-44a4-46c4-8a9d-f45edf2c426d req-76d0a6d4-2cb1-48f1-be12-e00e7d78e32e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:06:30 np0005603623 nova_compute[226235]: 2026-01-31 09:06:30.820 226239 DEBUG oslo_concurrency.lockutils [req-f753bf7a-44a4-46c4-8a9d-f45edf2c426d req-76d0a6d4-2cb1-48f1-be12-e00e7d78e32e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:06:30 np0005603623 nova_compute[226235]: 2026-01-31 09:06:30.821 226239 DEBUG nova.compute.manager [req-f753bf7a-44a4-46c4-8a9d-f45edf2c426d req-76d0a6d4-2cb1-48f1-be12-e00e7d78e32e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] No waiting events found dispatching network-vif-plugged-c5eba141-81b4-4e12-b26f-8a6e0965b727 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 04:06:30 np0005603623 nova_compute[226235]: 2026-01-31 09:06:30.821 226239 WARNING nova.compute.manager [req-f753bf7a-44a4-46c4-8a9d-f45edf2c426d req-76d0a6d4-2cb1-48f1-be12-e00e7d78e32e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Received unexpected event network-vif-plugged-c5eba141-81b4-4e12-b26f-8a6e0965b727 for instance with vm_state active and task_state None.
Jan 31 04:06:30 np0005603623 nova_compute[226235]: 2026-01-31 09:06:30.923 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:06:31 np0005603623 nova_compute[226235]: 2026-01-31 09:06:31.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:06:31 np0005603623 nova_compute[226235]: 2026-01-31 09:06:31.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:06:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:06:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:31.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:06:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:31.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:31 np0005603623 nova_compute[226235]: 2026-01-31 09:06:31.765 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #175. Immutable memtables: 0.
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:32.931783) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 111] Flushing memtable with next log file: 175
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850392931865, "job": 111, "event": "flush_started", "num_memtables": 1, "num_entries": 847, "num_deletes": 259, "total_data_size": 1514602, "memory_usage": 1530976, "flush_reason": "Manual Compaction"}
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 111] Level-0 flush table #176: started
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850392937445, "cf_name": "default", "job": 111, "event": "table_file_creation", "file_number": 176, "file_size": 987887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 84921, "largest_seqno": 85763, "table_properties": {"data_size": 983844, "index_size": 1758, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9779, "raw_average_key_size": 20, "raw_value_size": 975365, "raw_average_value_size": 2011, "num_data_blocks": 75, "num_entries": 485, "num_filter_entries": 485, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850350, "oldest_key_time": 1769850350, "file_creation_time": 1769850392, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 176, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 111] Flush lasted 5745 microseconds, and 2424 cpu microseconds.
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:32.937519) [db/flush_job.cc:967] [default] [JOB 111] Level-0 flush table #176: 987887 bytes OK
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:32.937547) [db/memtable_list.cc:519] [default] Level-0 commit table #176 started
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:32.940932) [db/memtable_list.cc:722] [default] Level-0 commit table #176: memtable #1 done
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:32.940950) EVENT_LOG_v1 {"time_micros": 1769850392940944, "job": 111, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:32.940968) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 111] Try to delete WAL files size 1510118, prev total WAL file size 1510118, number of live WAL files 2.
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000172.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:32.941560) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033323634' seq:72057594037927935, type:22 .. '6C6F676D0033353135' seq:0, type:0; will stop at (end)
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 112] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 111 Base level 0, inputs: [176(964KB)], [174(11MB)]
Jan 31 04:06:32 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850392941605, "job": 112, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [176], "files_L6": [174], "score": -1, "input_data_size": 12751182, "oldest_snapshot_seqno": -1}
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 112] Generated table #177: 10410 keys, 12593955 bytes, temperature: kUnknown
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850393102377, "cf_name": "default", "job": 112, "event": "table_file_creation", "file_number": 177, "file_size": 12593955, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12527833, "index_size": 39001, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26053, "raw_key_size": 275867, "raw_average_key_size": 26, "raw_value_size": 12347218, "raw_average_value_size": 1186, "num_data_blocks": 1477, "num_entries": 10410, "num_filter_entries": 10410, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769850392, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 177, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:33.102659) [db/compaction/compaction_job.cc:1663] [default] [JOB 112] Compacted 1@0 + 1@6 files to L6 => 12593955 bytes
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:33.110736) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 79.3 rd, 78.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 11.2 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(25.7) write-amplify(12.7) OK, records in: 10948, records dropped: 538 output_compression: NoCompression
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:33.110779) EVENT_LOG_v1 {"time_micros": 1769850393110762, "job": 112, "event": "compaction_finished", "compaction_time_micros": 160890, "compaction_time_cpu_micros": 21950, "output_level": 6, "num_output_files": 1, "total_output_size": 12593955, "num_input_records": 10948, "num_output_records": 10410, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000176.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850393111088, "job": 112, "event": "table_file_deletion", "file_number": 176}
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000174.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850393111852, "job": 112, "event": "table_file_deletion", "file_number": 174}
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:32.941517) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:33.111879) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:33.111884) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:33.111885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:33.111887) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:06:33 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:06:33.111888) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:06:33 np0005603623 nova_compute[226235]: 2026-01-31 09:06:33.281 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:06:33 np0005603623 NetworkManager[48970]: <info>  [1769850393.2825] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/409)
Jan 31 04:06:33 np0005603623 NetworkManager[48970]: <info>  [1769850393.2833] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/410)
Jan 31 04:06:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:33.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:33 np0005603623 nova_compute[226235]: 2026-01-31 09:06:33.325 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:06:33 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:33Z|00865|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:06:33 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:33Z|00866|binding|INFO|Releasing lport b626bf32-64d8-4511-bbcf-dc852c82e121 from this chassis (sb_readonly=0)
Jan 31 04:06:33 np0005603623 nova_compute[226235]: 2026-01-31 09:06:33.342 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:06:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:33.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:34 np0005603623 nova_compute[226235]: 2026-01-31 09:06:34.224 226239 DEBUG nova.compute.manager [req-738b3af5-23c7-4cc2-8a28-678c0f787aee req-b56aca03-0084-4af5-bca9-59c86d9e85b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Received event network-changed-c5eba141-81b4-4e12-b26f-8a6e0965b727 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:06:34 np0005603623 nova_compute[226235]: 2026-01-31 09:06:34.224 226239 DEBUG nova.compute.manager [req-738b3af5-23c7-4cc2-8a28-678c0f787aee req-b56aca03-0084-4af5-bca9-59c86d9e85b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Refreshing instance network info cache due to event network-changed-c5eba141-81b4-4e12-b26f-8a6e0965b727. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 04:06:34 np0005603623 nova_compute[226235]: 2026-01-31 09:06:34.224 226239 DEBUG oslo_concurrency.lockutils [req-738b3af5-23c7-4cc2-8a28-678c0f787aee req-b56aca03-0084-4af5-bca9-59c86d9e85b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-eff12f1f-d1ce-444d-8d39-ead849efd65a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 04:06:34 np0005603623 nova_compute[226235]: 2026-01-31 09:06:34.225 226239 DEBUG oslo_concurrency.lockutils [req-738b3af5-23c7-4cc2-8a28-678c0f787aee req-b56aca03-0084-4af5-bca9-59c86d9e85b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-eff12f1f-d1ce-444d-8d39-ead849efd65a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 04:06:34 np0005603623 nova_compute[226235]: 2026-01-31 09:06:34.225 226239 DEBUG nova.network.neutron [req-738b3af5-23c7-4cc2-8a28-678c0f787aee req-b56aca03-0084-4af5-bca9-59c86d9e85b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Refreshing network info cache for port c5eba141-81b4-4e12-b26f-8a6e0965b727 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 04:06:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:35.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:35.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:35 np0005603623 nova_compute[226235]: 2026-01-31 09:06:35.926 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:06:36 np0005603623 nova_compute[226235]: 2026-01-31 09:06:36.767 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:06:37 np0005603623 nova_compute[226235]: 2026-01-31 09:06:37.010 226239 DEBUG nova.compute.manager [req-d8e3c29e-5376-4170-9dff-b82a98bfc4dd req-56770042-6a2a-4788-843f-1d4a9d9a3471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Received event network-changed-8ea4b25c-0126-4805-a1e3-bb05f9021074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:06:37 np0005603623 nova_compute[226235]: 2026-01-31 09:06:37.011 226239 DEBUG nova.compute.manager [req-d8e3c29e-5376-4170-9dff-b82a98bfc4dd req-56770042-6a2a-4788-843f-1d4a9d9a3471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Refreshing instance network info cache due to event network-changed-8ea4b25c-0126-4805-a1e3-bb05f9021074. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 04:06:37 np0005603623 nova_compute[226235]: 2026-01-31 09:06:37.012 226239 DEBUG oslo_concurrency.lockutils [req-d8e3c29e-5376-4170-9dff-b82a98bfc4dd req-56770042-6a2a-4788-843f-1d4a9d9a3471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e48e9071-e65c-4dc9-bee0-7d382977f14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 04:06:37 np0005603623 nova_compute[226235]: 2026-01-31 09:06:37.012 226239 DEBUG oslo_concurrency.lockutils [req-d8e3c29e-5376-4170-9dff-b82a98bfc4dd req-56770042-6a2a-4788-843f-1d4a9d9a3471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e48e9071-e65c-4dc9-bee0-7d382977f14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 04:06:37 np0005603623 nova_compute[226235]: 2026-01-31 09:06:37.012 226239 DEBUG nova.network.neutron [req-d8e3c29e-5376-4170-9dff-b82a98bfc4dd req-56770042-6a2a-4788-843f-1d4a9d9a3471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Refreshing network info cache for port 8ea4b25c-0126-4805-a1e3-bb05f9021074 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 04:06:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:37.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:37.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:38 np0005603623 nova_compute[226235]: 2026-01-31 09:06:38.955 226239 DEBUG nova.network.neutron [req-738b3af5-23c7-4cc2-8a28-678c0f787aee req-b56aca03-0084-4af5-bca9-59c86d9e85b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Updated VIF entry in instance network info cache for port c5eba141-81b4-4e12-b26f-8a6e0965b727. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 04:06:38 np0005603623 nova_compute[226235]: 2026-01-31 09:06:38.955 226239 DEBUG nova.network.neutron [req-738b3af5-23c7-4cc2-8a28-678c0f787aee req-b56aca03-0084-4af5-bca9-59c86d9e85b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Updating instance_info_cache with network_info: [{"id": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "address": "fa:16:3e:fa:64:8e", "network": {"id": "38986396-fb3a-4b5b-b8ee-59048b4b9ff2", "bridge": "br-int", "label": "tempest-network-smoke--1874028344", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d103de0dcec4f3f8bc7f8b3bec01a34", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5eba141-81", "ovs_interfaceid": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 04:06:38 np0005603623 nova_compute[226235]: 2026-01-31 09:06:38.984 226239 DEBUG oslo_concurrency.lockutils [req-738b3af5-23c7-4cc2-8a28-678c0f787aee req-b56aca03-0084-4af5-bca9-59c86d9e85b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-eff12f1f-d1ce-444d-8d39-ead849efd65a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 04:06:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:06:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:39.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:06:39 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:39Z|00106|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:52:66:9d 10.100.0.12
Jan 31 04:06:39 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:39Z|00107|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:66:9d 10.100.0.12
Jan 31 04:06:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:39.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:40 np0005603623 nova_compute[226235]: 2026-01-31 09:06:40.169 226239 DEBUG nova.network.neutron [req-d8e3c29e-5376-4170-9dff-b82a98bfc4dd req-56770042-6a2a-4788-843f-1d4a9d9a3471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Updated VIF entry in instance network info cache for port 8ea4b25c-0126-4805-a1e3-bb05f9021074. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:06:40 np0005603623 nova_compute[226235]: 2026-01-31 09:06:40.170 226239 DEBUG nova.network.neutron [req-d8e3c29e-5376-4170-9dff-b82a98bfc4dd req-56770042-6a2a-4788-843f-1d4a9d9a3471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Updating instance_info_cache with network_info: [{"id": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "address": "fa:16:3e:52:66:9d", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ea4b25c-01", "ovs_interfaceid": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:06:40 np0005603623 nova_compute[226235]: 2026-01-31 09:06:40.199 226239 DEBUG oslo_concurrency.lockutils [req-d8e3c29e-5376-4170-9dff-b82a98bfc4dd req-56770042-6a2a-4788-843f-1d4a9d9a3471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e48e9071-e65c-4dc9-bee0-7d382977f14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:06:40 np0005603623 nova_compute[226235]: 2026-01-31 09:06:40.929 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:41 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:41Z|00108|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fa:64:8e 10.100.0.8
Jan 31 04:06:41 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:41Z|00109|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fa:64:8e 10.100.0.8
Jan 31 04:06:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:41.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:41.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:41 np0005603623 nova_compute[226235]: 2026-01-31 09:06:41.769 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:42 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:42Z|00867|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:06:42 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:42Z|00868|binding|INFO|Releasing lport b626bf32-64d8-4511-bbcf-dc852c82e121 from this chassis (sb_readonly=0)
Jan 31 04:06:42 np0005603623 nova_compute[226235]: 2026-01-31 09:06:42.229 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:43.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:43.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:45.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:06:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:45.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:06:45 np0005603623 nova_compute[226235]: 2026-01-31 09:06:45.933 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:46 np0005603623 nova_compute[226235]: 2026-01-31 09:06:46.771 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:47.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:47.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:48 np0005603623 nova_compute[226235]: 2026-01-31 09:06:48.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:48 np0005603623 nova_compute[226235]: 2026-01-31 09:06:48.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:06:48 np0005603623 nova_compute[226235]: 2026-01-31 09:06:48.855 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:49.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:49.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:50 np0005603623 nova_compute[226235]: 2026-01-31 09:06:50.937 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.086 226239 DEBUG oslo_concurrency.lockutils [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "e48e9071-e65c-4dc9-bee0-7d382977f14a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.087 226239 DEBUG oslo_concurrency.lockutils [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.087 226239 DEBUG oslo_concurrency.lockutils [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.087 226239 DEBUG oslo_concurrency.lockutils [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.088 226239 DEBUG oslo_concurrency.lockutils [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.089 226239 INFO nova.compute.manager [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Terminating instance#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.090 226239 DEBUG nova.compute.manager [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:06:51 np0005603623 kernel: tap8ea4b25c-01 (unregistering): left promiscuous mode
Jan 31 04:06:51 np0005603623 NetworkManager[48970]: <info>  [1769850411.1517] device (tap8ea4b25c-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.156 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:51 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:51Z|00869|binding|INFO|Releasing lport 8ea4b25c-0126-4805-a1e3-bb05f9021074 from this chassis (sb_readonly=0)
Jan 31 04:06:51 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:51Z|00870|binding|INFO|Setting lport 8ea4b25c-0126-4805-a1e3-bb05f9021074 down in Southbound
Jan 31 04:06:51 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:51Z|00871|binding|INFO|Removing iface tap8ea4b25c-01 ovn-installed in OVS
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.159 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.166 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.169 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:66:9d 10.100.0.12'], port_security=['fa:16:3e:52:66:9d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'e48e9071-e65c-4dc9-bee0-7d382977f14a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-650eb345-8346-4e8f-8e83-eeb0117654f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76ce367a834b49dfb5b436848118b860', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cea15428-ed6f-44a7-98e5-24c0fab7b796', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.197'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ecdc171-9d09-4cba-9bb9-cd2f8ef8e6c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=8ea4b25c-0126-4805-a1e3-bb05f9021074) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.170 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 8ea4b25c-0126-4805-a1e3-bb05f9021074 in datapath 650eb345-8346-4e8f-8e83-eeb0117654f6 unbound from our chassis#033[00m
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.171 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 650eb345-8346-4e8f-8e83-eeb0117654f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.172 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc5f32a-7efa-4f42-acc2-5b3d891c32e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.173 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 namespace which is not needed anymore#033[00m
Jan 31 04:06:51 np0005603623 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000ce.scope: Deactivated successfully.
Jan 31 04:06:51 np0005603623 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000ce.scope: Consumed 12.939s CPU time.
Jan 31 04:06:51 np0005603623 systemd-machined[194379]: Machine qemu-95-instance-000000ce terminated.
Jan 31 04:06:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:51.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.341 226239 INFO nova.virt.libvirt.driver [-] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Instance destroyed successfully.#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.342 226239 DEBUG nova.objects.instance [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lazy-loading 'resources' on Instance uuid e48e9071-e65c-4dc9-bee0-7d382977f14a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:06:51 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[323961]: [NOTICE]   (323965) : haproxy version is 2.8.14-c23fe91
Jan 31 04:06:51 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[323961]: [NOTICE]   (323965) : path to executable is /usr/sbin/haproxy
Jan 31 04:06:51 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[323961]: [WARNING]  (323965) : Exiting Master process...
Jan 31 04:06:51 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[323961]: [WARNING]  (323965) : Exiting Master process...
Jan 31 04:06:51 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[323961]: [ALERT]    (323965) : Current worker (323967) exited with code 143 (Terminated)
Jan 31 04:06:51 np0005603623 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[323961]: [WARNING]  (323965) : All workers exited. Exiting... (0)
Jan 31 04:06:51 np0005603623 systemd[1]: libpod-a0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562.scope: Deactivated successfully.
Jan 31 04:06:51 np0005603623 podman[324279]: 2026-01-31 09:06:51.363648312 +0000 UTC m=+0.045360574 container died a0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.363 226239 DEBUG nova.virt.libvirt.vif [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:06:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1753899210',display_name='tempest-TestVolumeBootPattern-server-1753899210',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1753899210',id=206,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNWvTxk1zh2OCmPH3tumEbxR7y880uhj4vJDAspX9r3EATf0w5oe5DG3NVBcNRbWTPcgVwlnXcyaRQZseLc7edDTe4kwfjogsRoplvkAsMWW9sCSaJlX0XBkMxl/Ghv8Fw==',key_name='tempest-TestVolumeBootPattern-11482540',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:06:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-171izjoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:06:26Z,user_data=None,user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=e48e9071-e65c-4dc9-bee0-7d382977f14a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "address": "fa:16:3e:52:66:9d", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ea4b25c-01", "ovs_interfaceid": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.364 226239 DEBUG nova.network.os_vif_util [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "address": "fa:16:3e:52:66:9d", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ea4b25c-01", "ovs_interfaceid": "8ea4b25c-0126-4805-a1e3-bb05f9021074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.364 226239 DEBUG nova.network.os_vif_util [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:66:9d,bridge_name='br-int',has_traffic_filtering=True,id=8ea4b25c-0126-4805-a1e3-bb05f9021074,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ea4b25c-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.365 226239 DEBUG os_vif [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:66:9d,bridge_name='br-int',has_traffic_filtering=True,id=8ea4b25c-0126-4805-a1e3-bb05f9021074,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ea4b25c-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.367 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.367 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ea4b25c-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.369 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.370 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.372 226239 INFO os_vif [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:66:9d,bridge_name='br-int',has_traffic_filtering=True,id=8ea4b25c-0126-4805-a1e3-bb05f9021074,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ea4b25c-01')#033[00m
Jan 31 04:06:51 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562-userdata-shm.mount: Deactivated successfully.
Jan 31 04:06:51 np0005603623 systemd[1]: var-lib-containers-storage-overlay-d6fcf083e6ec477a4290dfa6a1b70d4e83157162f68b7c3147024946312893b8-merged.mount: Deactivated successfully.
Jan 31 04:06:51 np0005603623 podman[324279]: 2026-01-31 09:06:51.398130744 +0000 UTC m=+0.079843006 container cleanup a0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 04:06:51 np0005603623 systemd[1]: libpod-conmon-a0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562.scope: Deactivated successfully.
Jan 31 04:06:51 np0005603623 podman[324333]: 2026-01-31 09:06:51.453578763 +0000 UTC m=+0.040724988 container remove a0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.456 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bf219aea-50ae-4a12-86fc-4920f6248e57]: (4, ('Sat Jan 31 09:06:51 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 (a0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562)\na0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562\nSat Jan 31 09:06:51 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 (a0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562)\na0cbfc5e66a2671971c3b7254cb0f5388ace74b30b42358557d7b4d7410a3562\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.458 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ec0d38be-8be8-41d4-9f84-90d41a8aeb04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.459 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650eb345-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.460 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:51 np0005603623 kernel: tap650eb345-80: left promiscuous mode
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.465 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.468 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ca73bf2e-d37d-467f-ac3d-8f3821c70802]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.485 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a8bbe061-6d1b-4a93-8595-8b3eb19ef3fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.485 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9d4954ac-0055-4741-8f44-2bd1ff37826b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:51.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.497 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d356333a-a50e-4e54-a3b6-66504b18f76b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 952557, 'reachable_time': 23058, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 324351, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:51 np0005603623 systemd[1]: run-netns-ovnmeta\x2d650eb345\x2d8346\x2d4e8f\x2d8e83\x2deeb0117654f6.mount: Deactivated successfully.
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.500 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:06:51 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:51.501 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[859acef1-9d4f-4e32-a787-6f8599200e0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.774 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.869 226239 INFO nova.virt.libvirt.driver [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Deleting instance files /var/lib/nova/instances/e48e9071-e65c-4dc9-bee0-7d382977f14a_del#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.870 226239 INFO nova.virt.libvirt.driver [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Deletion of /var/lib/nova/instances/e48e9071-e65c-4dc9-bee0-7d382977f14a_del complete#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.922 226239 INFO nova.compute.manager [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Took 0.83 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.923 226239 DEBUG oslo.service.loopingcall [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.924 226239 DEBUG nova.compute.manager [-] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:06:51 np0005603623 nova_compute[226235]: 2026-01-31 09:06:51.924 226239 DEBUG nova.network.neutron [-] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:06:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:53 np0005603623 nova_compute[226235]: 2026-01-31 09:06:53.058 226239 DEBUG nova.compute.manager [req-02a33dc0-5c70-4e9c-88c6-2ea65fdf06d7 req-308116c8-16cd-48a9-8d32-9f0c8b4dcc74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Received event network-vif-unplugged-8ea4b25c-0126-4805-a1e3-bb05f9021074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:53 np0005603623 nova_compute[226235]: 2026-01-31 09:06:53.059 226239 DEBUG oslo_concurrency.lockutils [req-02a33dc0-5c70-4e9c-88c6-2ea65fdf06d7 req-308116c8-16cd-48a9-8d32-9f0c8b4dcc74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:53 np0005603623 nova_compute[226235]: 2026-01-31 09:06:53.060 226239 DEBUG oslo_concurrency.lockutils [req-02a33dc0-5c70-4e9c-88c6-2ea65fdf06d7 req-308116c8-16cd-48a9-8d32-9f0c8b4dcc74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:53 np0005603623 nova_compute[226235]: 2026-01-31 09:06:53.060 226239 DEBUG oslo_concurrency.lockutils [req-02a33dc0-5c70-4e9c-88c6-2ea65fdf06d7 req-308116c8-16cd-48a9-8d32-9f0c8b4dcc74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:53 np0005603623 nova_compute[226235]: 2026-01-31 09:06:53.061 226239 DEBUG nova.compute.manager [req-02a33dc0-5c70-4e9c-88c6-2ea65fdf06d7 req-308116c8-16cd-48a9-8d32-9f0c8b4dcc74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] No waiting events found dispatching network-vif-unplugged-8ea4b25c-0126-4805-a1e3-bb05f9021074 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:06:53 np0005603623 nova_compute[226235]: 2026-01-31 09:06:53.061 226239 DEBUG nova.compute.manager [req-02a33dc0-5c70-4e9c-88c6-2ea65fdf06d7 req-308116c8-16cd-48a9-8d32-9f0c8b4dcc74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Received event network-vif-unplugged-8ea4b25c-0126-4805-a1e3-bb05f9021074 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:06:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:53.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:53.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:54 np0005603623 nova_compute[226235]: 2026-01-31 09:06:54.132 226239 DEBUG nova.network.neutron [-] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:06:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:54.161 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=89, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=88) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:06:54 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:54.162 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:06:54 np0005603623 nova_compute[226235]: 2026-01-31 09:06:54.162 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:54 np0005603623 nova_compute[226235]: 2026-01-31 09:06:54.206 226239 INFO nova.compute.manager [-] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Took 2.28 seconds to deallocate network for instance.#033[00m
Jan 31 04:06:54 np0005603623 nova_compute[226235]: 2026-01-31 09:06:54.856 226239 INFO nova.compute.manager [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Took 0.65 seconds to detach 1 volumes for instance.#033[00m
Jan 31 04:06:54 np0005603623 nova_compute[226235]: 2026-01-31 09:06:54.935 226239 DEBUG oslo_concurrency.lockutils [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:54 np0005603623 nova_compute[226235]: 2026-01-31 09:06:54.935 226239 DEBUG oslo_concurrency.lockutils [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.042 226239 DEBUG oslo_concurrency.processutils [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.169 226239 DEBUG nova.compute.manager [req-27314fee-0da9-4623-948a-e6ab20ae34f6 req-3d6c3335-44bb-4e65-bd4f-d1298e7d462e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Received event network-vif-plugged-8ea4b25c-0126-4805-a1e3-bb05f9021074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.169 226239 DEBUG oslo_concurrency.lockutils [req-27314fee-0da9-4623-948a-e6ab20ae34f6 req-3d6c3335-44bb-4e65-bd4f-d1298e7d462e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.170 226239 DEBUG oslo_concurrency.lockutils [req-27314fee-0da9-4623-948a-e6ab20ae34f6 req-3d6c3335-44bb-4e65-bd4f-d1298e7d462e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.170 226239 DEBUG oslo_concurrency.lockutils [req-27314fee-0da9-4623-948a-e6ab20ae34f6 req-3d6c3335-44bb-4e65-bd4f-d1298e7d462e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.170 226239 DEBUG nova.compute.manager [req-27314fee-0da9-4623-948a-e6ab20ae34f6 req-3d6c3335-44bb-4e65-bd4f-d1298e7d462e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] No waiting events found dispatching network-vif-plugged-8ea4b25c-0126-4805-a1e3-bb05f9021074 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.170 226239 WARNING nova.compute.manager [req-27314fee-0da9-4623-948a-e6ab20ae34f6 req-3d6c3335-44bb-4e65-bd4f-d1298e7d462e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Received unexpected event network-vif-plugged-8ea4b25c-0126-4805-a1e3-bb05f9021074 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.170 226239 DEBUG nova.compute.manager [req-27314fee-0da9-4623-948a-e6ab20ae34f6 req-3d6c3335-44bb-4e65-bd4f-d1298e7d462e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Received event network-vif-deleted-8ea4b25c-0126-4805-a1e3-bb05f9021074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:55.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:55 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:06:55 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2041208249' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.438 226239 DEBUG oslo_concurrency.processutils [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.443 226239 DEBUG nova.compute.provider_tree [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.464 226239 DEBUG nova.scheduler.client.report [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:06:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:55.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.507 226239 DEBUG oslo_concurrency.lockutils [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.547 226239 INFO nova.scheduler.client.report [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Deleted allocations for instance e48e9071-e65c-4dc9-bee0-7d382977f14a#033[00m
Jan 31 04:06:55 np0005603623 nova_compute[226235]: 2026-01-31 09:06:55.646 226239 DEBUG oslo_concurrency.lockutils [None req-00de2a9d-1e87-4539-9939-ec1b5aef9b36 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "e48e9071-e65c-4dc9-bee0-7d382977f14a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:55 np0005603623 podman[324378]: 2026-01-31 09:06:55.961735928 +0000 UTC m=+0.057399173 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 04:06:55 np0005603623 podman[324377]: 2026-01-31 09:06:55.965876287 +0000 UTC m=+0.062036896 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 04:06:56 np0005603623 nova_compute[226235]: 2026-01-31 09:06:56.402 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:56 np0005603623 nova_compute[226235]: 2026-01-31 09:06:56.776 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:56 np0005603623 nova_compute[226235]: 2026-01-31 09:06:56.898 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:57.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:57.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:58 np0005603623 nova_compute[226235]: 2026-01-31 09:06:58.671 226239 DEBUG nova.compute.manager [req-3ee52304-98f1-4346-a504-a4862bce3752 req-ec1e9274-87f0-4efc-9e32-a2e40d9c4a95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Received event network-changed-c5eba141-81b4-4e12-b26f-8a6e0965b727 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:58 np0005603623 nova_compute[226235]: 2026-01-31 09:06:58.672 226239 DEBUG nova.compute.manager [req-3ee52304-98f1-4346-a504-a4862bce3752 req-ec1e9274-87f0-4efc-9e32-a2e40d9c4a95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Refreshing instance network info cache due to event network-changed-c5eba141-81b4-4e12-b26f-8a6e0965b727. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:06:58 np0005603623 nova_compute[226235]: 2026-01-31 09:06:58.672 226239 DEBUG oslo_concurrency.lockutils [req-3ee52304-98f1-4346-a504-a4862bce3752 req-ec1e9274-87f0-4efc-9e32-a2e40d9c4a95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-eff12f1f-d1ce-444d-8d39-ead849efd65a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:06:58 np0005603623 nova_compute[226235]: 2026-01-31 09:06:58.672 226239 DEBUG oslo_concurrency.lockutils [req-3ee52304-98f1-4346-a504-a4862bce3752 req-ec1e9274-87f0-4efc-9e32-a2e40d9c4a95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-eff12f1f-d1ce-444d-8d39-ead849efd65a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:06:58 np0005603623 nova_compute[226235]: 2026-01-31 09:06:58.672 226239 DEBUG nova.network.neutron [req-3ee52304-98f1-4346-a504-a4862bce3752 req-ec1e9274-87f0-4efc-9e32-a2e40d9c4a95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Refreshing network info cache for port c5eba141-81b4-4e12-b26f-8a6e0965b727 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:06:58 np0005603623 nova_compute[226235]: 2026-01-31 09:06:58.984 226239 DEBUG oslo_concurrency.lockutils [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Acquiring lock "eff12f1f-d1ce-444d-8d39-ead849efd65a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:58 np0005603623 nova_compute[226235]: 2026-01-31 09:06:58.985 226239 DEBUG oslo_concurrency.lockutils [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:58 np0005603623 nova_compute[226235]: 2026-01-31 09:06:58.986 226239 DEBUG oslo_concurrency.lockutils [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Acquiring lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:58 np0005603623 nova_compute[226235]: 2026-01-31 09:06:58.986 226239 DEBUG oslo_concurrency.lockutils [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:58 np0005603623 nova_compute[226235]: 2026-01-31 09:06:58.986 226239 DEBUG oslo_concurrency.lockutils [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:58 np0005603623 nova_compute[226235]: 2026-01-31 09:06:58.987 226239 INFO nova.compute.manager [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Terminating instance#033[00m
Jan 31 04:06:58 np0005603623 nova_compute[226235]: 2026-01-31 09:06:58.988 226239 DEBUG nova.compute.manager [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:06:59 np0005603623 kernel: tapc5eba141-81 (unregistering): left promiscuous mode
Jan 31 04:06:59 np0005603623 NetworkManager[48970]: <info>  [1769850419.0294] device (tapc5eba141-81): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.030 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.034 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:59 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:59Z|00872|binding|INFO|Releasing lport c5eba141-81b4-4e12-b26f-8a6e0965b727 from this chassis (sb_readonly=0)
Jan 31 04:06:59 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:59Z|00873|binding|INFO|Setting lport c5eba141-81b4-4e12-b26f-8a6e0965b727 down in Southbound
Jan 31 04:06:59 np0005603623 ovn_controller[133449]: 2026-01-31T09:06:59Z|00874|binding|INFO|Removing iface tapc5eba141-81 ovn-installed in OVS
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.035 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.040 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.054 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:64:8e 10.100.0.8'], port_security=['fa:16:3e:fa:64:8e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eff12f1f-d1ce-444d-8d39-ead849efd65a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38986396-fb3a-4b5b-b8ee-59048b4b9ff2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d103de0dcec4f3f8bc7f8b3bec01a34', 'neutron:revision_number': '4', 'neutron:security_group_ids': '13209e92-378b-49f3-97e4-1aa68fc1f020 c12c0ff4-8bf2-46b4-adb4-e72c424153f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d93e478-add3-4b37-b54b-67462a306e66, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=c5eba141-81b4-4e12-b26f-8a6e0965b727) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.056 143258 INFO neutron.agent.ovn.metadata.agent [-] Port c5eba141-81b4-4e12-b26f-8a6e0965b727 in datapath 38986396-fb3a-4b5b-b8ee-59048b4b9ff2 unbound from our chassis#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.059 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 38986396-fb3a-4b5b-b8ee-59048b4b9ff2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.060 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b17a8e20-a666-4265-a60a-e1b765192c84]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.060 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2 namespace which is not needed anymore#033[00m
Jan 31 04:06:59 np0005603623 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000cf.scope: Deactivated successfully.
Jan 31 04:06:59 np0005603623 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000cf.scope: Consumed 12.691s CPU time.
Jan 31 04:06:59 np0005603623 systemd-machined[194379]: Machine qemu-96-instance-000000cf terminated.
Jan 31 04:06:59 np0005603623 neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2[324174]: [NOTICE]   (324180) : haproxy version is 2.8.14-c23fe91
Jan 31 04:06:59 np0005603623 neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2[324174]: [NOTICE]   (324180) : path to executable is /usr/sbin/haproxy
Jan 31 04:06:59 np0005603623 neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2[324174]: [WARNING]  (324180) : Exiting Master process...
Jan 31 04:06:59 np0005603623 neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2[324174]: [ALERT]    (324180) : Current worker (324182) exited with code 143 (Terminated)
Jan 31 04:06:59 np0005603623 neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2[324174]: [WARNING]  (324180) : All workers exited. Exiting... (0)
Jan 31 04:06:59 np0005603623 systemd[1]: libpod-5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6.scope: Deactivated successfully.
Jan 31 04:06:59 np0005603623 conmon[324174]: conmon 5e8b9fee0d77585409ff <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6.scope/container/memory.events
Jan 31 04:06:59 np0005603623 podman[324446]: 2026-01-31 09:06:59.163361245 +0000 UTC m=+0.036401243 container died 5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:06:59 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6-userdata-shm.mount: Deactivated successfully.
Jan 31 04:06:59 np0005603623 systemd[1]: var-lib-containers-storage-overlay-70033cc0e3f2639d329fbcb611bb996338a790fba9ef3db75fe1a2416f515aa6-merged.mount: Deactivated successfully.
Jan 31 04:06:59 np0005603623 podman[324446]: 2026-01-31 09:06:59.197763545 +0000 UTC m=+0.070803543 container cleanup 5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 04:06:59 np0005603623 systemd[1]: libpod-conmon-5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6.scope: Deactivated successfully.
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.214 226239 INFO nova.virt.libvirt.driver [-] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Instance destroyed successfully.#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.214 226239 DEBUG nova.objects.instance [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lazy-loading 'resources' on Instance uuid eff12f1f-d1ce-444d-8d39-ead849efd65a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.229 226239 DEBUG nova.virt.libvirt.vif [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:06:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-680379111-access_point-537838458',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-680379111-access_point-537838458',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-680379111-acc',id=207,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgM7X9M9wZlSF1gyfdYak2nFokjAMhj//zRi82vONcQIZQBgxY0V5vu8trZHrF2NFzC4L6xfyfURFS38dJDUkA6k9r7Oe+j0g6OpQQYsDS30vFo6R9Q4mijKm1iKtRQig==',key_name='tempest-TestSecurityGroupsBasicOps-508360795',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:06:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4d103de0dcec4f3f8bc7f8b3bec01a34',ramdisk_id='',reservation_id='r-9a7z3o7m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-680379111',owner_user_name='tempest-TestSecurityGroupsBasicOps-680379111-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:06:28Z,user_data=None,user_id='47dc950da7924a109657b08e4b8b55b7',uuid=eff12f1f-d1ce-444d-8d39-ead849efd65a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "address": "fa:16:3e:fa:64:8e", "network": {"id": "38986396-fb3a-4b5b-b8ee-59048b4b9ff2", "bridge": "br-int", "label": "tempest-network-smoke--1874028344", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d103de0dcec4f3f8bc7f8b3bec01a34", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5eba141-81", "ovs_interfaceid": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.229 226239 DEBUG nova.network.os_vif_util [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Converting VIF {"id": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "address": "fa:16:3e:fa:64:8e", "network": {"id": "38986396-fb3a-4b5b-b8ee-59048b4b9ff2", "bridge": "br-int", "label": "tempest-network-smoke--1874028344", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d103de0dcec4f3f8bc7f8b3bec01a34", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5eba141-81", "ovs_interfaceid": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.230 226239 DEBUG nova.network.os_vif_util [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fa:64:8e,bridge_name='br-int',has_traffic_filtering=True,id=c5eba141-81b4-4e12-b26f-8a6e0965b727,network=Network(38986396-fb3a-4b5b-b8ee-59048b4b9ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5eba141-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.231 226239 DEBUG os_vif [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:64:8e,bridge_name='br-int',has_traffic_filtering=True,id=c5eba141-81b4-4e12-b26f-8a6e0965b727,network=Network(38986396-fb3a-4b5b-b8ee-59048b4b9ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5eba141-81') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.232 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.233 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5eba141-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.234 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.235 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.237 226239 INFO os_vif [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:64:8e,bridge_name='br-int',has_traffic_filtering=True,id=c5eba141-81b4-4e12-b26f-8a6e0965b727,network=Network(38986396-fb3a-4b5b-b8ee-59048b4b9ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5eba141-81')#033[00m
Jan 31 04:06:59 np0005603623 podman[324481]: 2026-01-31 09:06:59.259084018 +0000 UTC m=+0.040127990 container remove 5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.262 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[166c964e-d24d-45aa-b400-946e27974581]: (4, ('Sat Jan 31 09:06:59 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2 (5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6)\n5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6\nSat Jan 31 09:06:59 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2 (5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6)\n5e8b9fee0d77585409ffeeb8e3406a5cebc960ce9c77013c859b60fbaa824ab6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.264 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7640ae8a-a80e-459f-8661-de6292e8c8d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.265 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap38986396-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:59 np0005603623 kernel: tap38986396-f0: left promiscuous mode
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.268 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.274 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.276 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[54916450-da7e-49ee-a1eb-e864ad552f32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.288 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[86dbe415-1f4f-4f7d-b948-354b0be9a498]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.289 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f3612b88-1a70-41a7-bd2b-fdc1480829ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.306 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[a8db200c-d592-49e0-81dc-21009f252a1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 952850, 'reachable_time': 24481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 324519, 'error': None, 'target': 'ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.309 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-38986396-fb3a-4b5b-b8ee-59048b4b9ff2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:06:59 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:06:59.309 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[82ae3080-fa5e-4e1d-a5a7-3a58fdf27f07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:59 np0005603623 systemd[1]: run-netns-ovnmeta\x2d38986396\x2dfb3a\x2d4b5b\x2db8ee\x2d59048b4b9ff2.mount: Deactivated successfully.
Jan 31 04:06:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:06:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:59.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:06:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:06:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:06:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:59.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.624 226239 INFO nova.virt.libvirt.driver [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Deleting instance files /var/lib/nova/instances/eff12f1f-d1ce-444d-8d39-ead849efd65a_del#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.625 226239 INFO nova.virt.libvirt.driver [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Deletion of /var/lib/nova/instances/eff12f1f-d1ce-444d-8d39-ead849efd65a_del complete#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.682 226239 INFO nova.compute.manager [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Took 0.69 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.682 226239 DEBUG oslo.service.loopingcall [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.682 226239 DEBUG nova.compute.manager [-] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:06:59 np0005603623 nova_compute[226235]: 2026-01-31 09:06:59.683 226239 DEBUG nova.network.neutron [-] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:07:00 np0005603623 nova_compute[226235]: 2026-01-31 09:07:00.494 226239 DEBUG nova.compute.manager [req-f4228401-75dc-4506-b800-4691dc80e91e req-ca74201e-62ba-44ad-a850-8d1f8378a25e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Received event network-vif-unplugged-c5eba141-81b4-4e12-b26f-8a6e0965b727 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:00 np0005603623 nova_compute[226235]: 2026-01-31 09:07:00.494 226239 DEBUG oslo_concurrency.lockutils [req-f4228401-75dc-4506-b800-4691dc80e91e req-ca74201e-62ba-44ad-a850-8d1f8378a25e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:00 np0005603623 nova_compute[226235]: 2026-01-31 09:07:00.495 226239 DEBUG oslo_concurrency.lockutils [req-f4228401-75dc-4506-b800-4691dc80e91e req-ca74201e-62ba-44ad-a850-8d1f8378a25e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:00 np0005603623 nova_compute[226235]: 2026-01-31 09:07:00.495 226239 DEBUG oslo_concurrency.lockutils [req-f4228401-75dc-4506-b800-4691dc80e91e req-ca74201e-62ba-44ad-a850-8d1f8378a25e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:00 np0005603623 nova_compute[226235]: 2026-01-31 09:07:00.495 226239 DEBUG nova.compute.manager [req-f4228401-75dc-4506-b800-4691dc80e91e req-ca74201e-62ba-44ad-a850-8d1f8378a25e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] No waiting events found dispatching network-vif-unplugged-c5eba141-81b4-4e12-b26f-8a6e0965b727 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:07:00 np0005603623 nova_compute[226235]: 2026-01-31 09:07:00.495 226239 DEBUG nova.compute.manager [req-f4228401-75dc-4506-b800-4691dc80e91e req-ca74201e-62ba-44ad-a850-8d1f8378a25e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Received event network-vif-unplugged-c5eba141-81b4-4e12-b26f-8a6e0965b727 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:07:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:01.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:01 np0005603623 nova_compute[226235]: 2026-01-31 09:07:01.360 226239 DEBUG nova.network.neutron [-] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:07:01 np0005603623 nova_compute[226235]: 2026-01-31 09:07:01.401 226239 INFO nova.compute.manager [-] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Took 1.72 seconds to deallocate network for instance.#033[00m
Jan 31 04:07:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:07:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:01.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:07:01 np0005603623 nova_compute[226235]: 2026-01-31 09:07:01.581 226239 DEBUG oslo_concurrency.lockutils [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:01 np0005603623 nova_compute[226235]: 2026-01-31 09:07:01.582 226239 DEBUG oslo_concurrency.lockutils [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:01 np0005603623 nova_compute[226235]: 2026-01-31 09:07:01.720 226239 DEBUG oslo_concurrency.processutils [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:01 np0005603623 nova_compute[226235]: 2026-01-31 09:07:01.746 226239 DEBUG nova.compute.manager [req-7da575b4-4b78-4f96-b9fc-93e9068b5149 req-a4289992-767e-4eeb-9182-b32a21c89e22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Received event network-vif-deleted-c5eba141-81b4-4e12-b26f-8a6e0965b727 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:01 np0005603623 nova_compute[226235]: 2026-01-31 09:07:01.779 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:07:02 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3594360946' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:07:02 np0005603623 nova_compute[226235]: 2026-01-31 09:07:02.133 226239 DEBUG oslo_concurrency.processutils [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:02 np0005603623 nova_compute[226235]: 2026-01-31 09:07:02.138 226239 DEBUG nova.compute.provider_tree [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:07:02 np0005603623 nova_compute[226235]: 2026-01-31 09:07:02.158 226239 DEBUG nova.scheduler.client.report [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:07:02 np0005603623 nova_compute[226235]: 2026-01-31 09:07:02.244 226239 DEBUG oslo_concurrency.lockutils [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:02 np0005603623 nova_compute[226235]: 2026-01-31 09:07:02.299 226239 DEBUG nova.network.neutron [req-3ee52304-98f1-4346-a504-a4862bce3752 req-ec1e9274-87f0-4efc-9e32-a2e40d9c4a95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Updated VIF entry in instance network info cache for port c5eba141-81b4-4e12-b26f-8a6e0965b727. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:07:02 np0005603623 nova_compute[226235]: 2026-01-31 09:07:02.301 226239 DEBUG nova.network.neutron [req-3ee52304-98f1-4346-a504-a4862bce3752 req-ec1e9274-87f0-4efc-9e32-a2e40d9c4a95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Updating instance_info_cache with network_info: [{"id": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "address": "fa:16:3e:fa:64:8e", "network": {"id": "38986396-fb3a-4b5b-b8ee-59048b4b9ff2", "bridge": "br-int", "label": "tempest-network-smoke--1874028344", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d103de0dcec4f3f8bc7f8b3bec01a34", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5eba141-81", "ovs_interfaceid": "c5eba141-81b4-4e12-b26f-8a6e0965b727", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:07:02 np0005603623 nova_compute[226235]: 2026-01-31 09:07:02.307 226239 INFO nova.scheduler.client.report [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Deleted allocations for instance eff12f1f-d1ce-444d-8d39-ead849efd65a#033[00m
Jan 31 04:07:02 np0005603623 nova_compute[226235]: 2026-01-31 09:07:02.343 226239 DEBUG oslo_concurrency.lockutils [req-3ee52304-98f1-4346-a504-a4862bce3752 req-ec1e9274-87f0-4efc-9e32-a2e40d9c4a95 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-eff12f1f-d1ce-444d-8d39-ead849efd65a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:07:02 np0005603623 nova_compute[226235]: 2026-01-31 09:07:02.440 226239 DEBUG oslo_concurrency.lockutils [None req-bcecd73b-5d9f-4d62-9073-e9b70ba3341e 47dc950da7924a109657b08e4b8b55b7 4d103de0dcec4f3f8bc7f8b3bec01a34 - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:03 np0005603623 nova_compute[226235]: 2026-01-31 09:07:03.116 226239 DEBUG nova.compute.manager [req-12c9035b-2645-42ab-acdf-0c98c4ae2e3a req-4639fafc-1392-4086-840c-4853d406467f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Received event network-vif-plugged-c5eba141-81b4-4e12-b26f-8a6e0965b727 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:03 np0005603623 nova_compute[226235]: 2026-01-31 09:07:03.117 226239 DEBUG oslo_concurrency.lockutils [req-12c9035b-2645-42ab-acdf-0c98c4ae2e3a req-4639fafc-1392-4086-840c-4853d406467f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:03 np0005603623 nova_compute[226235]: 2026-01-31 09:07:03.117 226239 DEBUG oslo_concurrency.lockutils [req-12c9035b-2645-42ab-acdf-0c98c4ae2e3a req-4639fafc-1392-4086-840c-4853d406467f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:03 np0005603623 nova_compute[226235]: 2026-01-31 09:07:03.117 226239 DEBUG oslo_concurrency.lockutils [req-12c9035b-2645-42ab-acdf-0c98c4ae2e3a req-4639fafc-1392-4086-840c-4853d406467f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "eff12f1f-d1ce-444d-8d39-ead849efd65a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:03 np0005603623 nova_compute[226235]: 2026-01-31 09:07:03.118 226239 DEBUG nova.compute.manager [req-12c9035b-2645-42ab-acdf-0c98c4ae2e3a req-4639fafc-1392-4086-840c-4853d406467f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] No waiting events found dispatching network-vif-plugged-c5eba141-81b4-4e12-b26f-8a6e0965b727 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:07:03 np0005603623 nova_compute[226235]: 2026-01-31 09:07:03.118 226239 WARNING nova.compute.manager [req-12c9035b-2645-42ab-acdf-0c98c4ae2e3a req-4639fafc-1392-4086-840c-4853d406467f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Received unexpected event network-vif-plugged-c5eba141-81b4-4e12-b26f-8a6e0965b727 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 04:07:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:03.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:03.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:04.164 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '89'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:04 np0005603623 nova_compute[226235]: 2026-01-31 09:07:04.237 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:05.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:05.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:06 np0005603623 nova_compute[226235]: 2026-01-31 09:07:06.341 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850411.3402903, e48e9071-e65c-4dc9-bee0-7d382977f14a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:07:06 np0005603623 nova_compute[226235]: 2026-01-31 09:07:06.341 226239 INFO nova.compute.manager [-] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:07:06 np0005603623 nova_compute[226235]: 2026-01-31 09:07:06.377 226239 DEBUG nova.compute.manager [None req-30da80ca-00cb-413e-8e99-0ad3d34b0c48 - - - - - -] [instance: e48e9071-e65c-4dc9-bee0-7d382977f14a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:06 np0005603623 nova_compute[226235]: 2026-01-31 09:07:06.797 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:07:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:07.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:07:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:07:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:07.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:07:08 np0005603623 nova_compute[226235]: 2026-01-31 09:07:08.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:09 np0005603623 nova_compute[226235]: 2026-01-31 09:07:09.290 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:09.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:09.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:10 np0005603623 nova_compute[226235]: 2026-01-31 09:07:10.113 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:10 np0005603623 nova_compute[226235]: 2026-01-31 09:07:10.191 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:11.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:11.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:11 np0005603623 nova_compute[226235]: 2026-01-31 09:07:11.798 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:12 np0005603623 nova_compute[226235]: 2026-01-31 09:07:12.194 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:12 np0005603623 nova_compute[226235]: 2026-01-31 09:07:12.194 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:12 np0005603623 nova_compute[226235]: 2026-01-31 09:07:12.223 226239 DEBUG nova.compute.manager [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:07:12 np0005603623 nova_compute[226235]: 2026-01-31 09:07:12.372 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:12 np0005603623 nova_compute[226235]: 2026-01-31 09:07:12.373 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:12 np0005603623 nova_compute[226235]: 2026-01-31 09:07:12.380 226239 DEBUG nova.virt.hardware [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:07:12 np0005603623 nova_compute[226235]: 2026-01-31 09:07:12.380 226239 INFO nova.compute.claims [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 04:07:12 np0005603623 nova_compute[226235]: 2026-01-31 09:07:12.638 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:07:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3477333697' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.087 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.092 226239 DEBUG nova.compute.provider_tree [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.134 226239 DEBUG nova.scheduler.client.report [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.176 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.177 226239 DEBUG nova.compute.manager [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.297 226239 DEBUG nova.compute.manager [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.298 226239 DEBUG nova.network.neutron [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:07:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:07:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:13.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.484 226239 INFO nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:07:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:13.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.529 226239 DEBUG nova.compute.manager [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.723 226239 DEBUG nova.compute.manager [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.725 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.725 226239 INFO nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Creating image(s)#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.752 226239 DEBUG nova.storage.rbd_utils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.782 226239 DEBUG nova.storage.rbd_utils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.804 226239 DEBUG nova.storage.rbd_utils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.807 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.855 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.856 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.856 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.856 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.879 226239 DEBUG nova.storage.rbd_utils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:13 np0005603623 nova_compute[226235]: 2026-01-31 09:07:13.882 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.195 226239 DEBUG nova.policy [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd442c7ba12ed444ca6d4dcc5cfd36150', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.212 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850419.2122924, eff12f1f-d1ce-444d-8d39-ead849efd65a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.213 226239 INFO nova.compute.manager [-] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.218 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.336s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.286 226239 DEBUG nova.compute.manager [None req-3d0b8e4a-e749-463d-86eb-3dd46d019b45 - - - - - -] [instance: eff12f1f-d1ce-444d-8d39-ead849efd65a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.291 226239 DEBUG nova.storage.rbd_utils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] resizing rbd image bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.328 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.417 226239 DEBUG nova.objects.instance [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'migration_context' on Instance uuid bd9f81d8-7c9d-42e3-969b-7b404bfaff36 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.458 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.458 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Ensure instance console log exists: /var/lib/nova/instances/bd9f81d8-7c9d-42e3-969b-7b404bfaff36/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.459 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.459 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:14 np0005603623 nova_compute[226235]: 2026-01-31 09:07:14.459 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:15.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:15.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:16 np0005603623 nova_compute[226235]: 2026-01-31 09:07:16.212 226239 DEBUG nova.network.neutron [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Successfully created port: 989482db-88b2-40b8-9cb3-59cb766c98e3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:07:16 np0005603623 nova_compute[226235]: 2026-01-31 09:07:16.800 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:17 np0005603623 nova_compute[226235]: 2026-01-31 09:07:17.294 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:17 np0005603623 nova_compute[226235]: 2026-01-31 09:07:17.295 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:07:17 np0005603623 nova_compute[226235]: 2026-01-31 09:07:17.352 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:07:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:17.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:17 np0005603623 nova_compute[226235]: 2026-01-31 09:07:17.460 226239 DEBUG nova.network.neutron [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Successfully updated port: 989482db-88b2-40b8-9cb3-59cb766c98e3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:07:17 np0005603623 nova_compute[226235]: 2026-01-31 09:07:17.489 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:07:17 np0005603623 nova_compute[226235]: 2026-01-31 09:07:17.489 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquired lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:07:17 np0005603623 nova_compute[226235]: 2026-01-31 09:07:17.489 226239 DEBUG nova.network.neutron [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:07:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:07:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:17.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:07:17 np0005603623 nova_compute[226235]: 2026-01-31 09:07:17.689 226239 DEBUG nova.compute.manager [req-efb9caed-7b28-49d0-840c-263d6df5f357 req-cc259013-ab2a-4ce3-86ff-f550e8fc4fa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-changed-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:17 np0005603623 nova_compute[226235]: 2026-01-31 09:07:17.689 226239 DEBUG nova.compute.manager [req-efb9caed-7b28-49d0-840c-263d6df5f357 req-cc259013-ab2a-4ce3-86ff-f550e8fc4fa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Refreshing instance network info cache due to event network-changed-989482db-88b2-40b8-9cb3-59cb766c98e3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:07:17 np0005603623 nova_compute[226235]: 2026-01-31 09:07:17.690 226239 DEBUG oslo_concurrency.lockutils [req-efb9caed-7b28-49d0-840c-263d6df5f357 req-cc259013-ab2a-4ce3-86ff-f550e8fc4fa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:07:18 np0005603623 nova_compute[226235]: 2026-01-31 09:07:18.044 226239 DEBUG nova.network.neutron [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:07:18 np0005603623 nova_compute[226235]: 2026-01-31 09:07:18.212 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:18 np0005603623 nova_compute[226235]: 2026-01-31 09:07:18.213 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:07:19 np0005603623 nova_compute[226235]: 2026-01-31 09:07:19.330 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:19.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:07:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:19.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.191 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.191 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.246 226239 DEBUG nova.network.neutron [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updating instance_info_cache with network_info: [{"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.283 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Releasing lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.284 226239 DEBUG nova.compute.manager [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Instance network_info: |[{"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.284 226239 DEBUG oslo_concurrency.lockutils [req-efb9caed-7b28-49d0-840c-263d6df5f357 req-cc259013-ab2a-4ce3-86ff-f550e8fc4fa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.284 226239 DEBUG nova.network.neutron [req-efb9caed-7b28-49d0-840c-263d6df5f357 req-cc259013-ab2a-4ce3-86ff-f550e8fc4fa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Refreshing network info cache for port 989482db-88b2-40b8-9cb3-59cb766c98e3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.286 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Start _get_guest_xml network_info=[{"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.291 226239 WARNING nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.295 226239 DEBUG nova.virt.libvirt.host [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.295 226239 DEBUG nova.virt.libvirt.host [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.299 226239 DEBUG nova.virt.libvirt.host [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.299 226239 DEBUG nova.virt.libvirt.host [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.300 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.301 226239 DEBUG nova.virt.hardware [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.301 226239 DEBUG nova.virt.hardware [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.301 226239 DEBUG nova.virt.hardware [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.301 226239 DEBUG nova.virt.hardware [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.302 226239 DEBUG nova.virt.hardware [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.302 226239 DEBUG nova.virt.hardware [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.302 226239 DEBUG nova.virt.hardware [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.302 226239 DEBUG nova.virt.hardware [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.303 226239 DEBUG nova.virt.hardware [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.303 226239 DEBUG nova.virt.hardware [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.303 226239 DEBUG nova.virt.hardware [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.305 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:07:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:21.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:07:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000064s ======
Jan 31 04:07:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:21.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 31 04:07:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:07:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1111764618' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.707 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.742 226239 DEBUG nova.storage.rbd_utils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.746 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:21 np0005603623 nova_compute[226235]: 2026-01-31 09:07:21.801 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:07:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/756256532' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.152 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.153 226239 DEBUG nova.virt.libvirt.vif [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:07:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1788706990',display_name='tempest-TestNetworkBasicOps-server-1788706990',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1788706990',id=209,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFEUtQI/j7LXzuU0YrvuHFu6rkogJclFlco3io4wOyAEhRnhgfAZfdIyNjEUp2XoZclbiiMzCEa3Qt2b6cDsAqGG0bmO017oWolVcCZbo0JT42U5etFVtTzsJWjXNhAHJw==',key_name='tempest-TestNetworkBasicOps-1516398044',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-ozydntjb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:07:13Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=bd9f81d8-7c9d-42e3-969b-7b404bfaff36,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.154 226239 DEBUG nova.network.os_vif_util [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.155 226239 DEBUG nova.network.os_vif_util [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:2c:dd,bridge_name='br-int',has_traffic_filtering=True,id=989482db-88b2-40b8-9cb3-59cb766c98e3,network=Network(6b575155-2651-409f-ab96-5a6cf52f7f88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989482db-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.156 226239 DEBUG nova.objects.instance [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'pci_devices' on Instance uuid bd9f81d8-7c9d-42e3-969b-7b404bfaff36 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.183 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  <uuid>bd9f81d8-7c9d-42e3-969b-7b404bfaff36</uuid>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  <name>instance-000000d1</name>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestNetworkBasicOps-server-1788706990</nova:name>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:07:21</nova:creationTime>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <nova:user uuid="d442c7ba12ed444ca6d4dcc5cfd36150">tempest-TestNetworkBasicOps-104417095-project-member</nova:user>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <nova:project uuid="abf9393aa2b646feb00a3d887a9dee14">tempest-TestNetworkBasicOps-104417095</nova:project>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <nova:port uuid="989482db-88b2-40b8-9cb3-59cb766c98e3">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <entry name="serial">bd9f81d8-7c9d-42e3-969b-7b404bfaff36</entry>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <entry name="uuid">bd9f81d8-7c9d-42e3-969b-7b404bfaff36</entry>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk.config">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:8b:2c:dd"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <target dev="tap989482db-88"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/bd9f81d8-7c9d-42e3-969b-7b404bfaff36/console.log" append="off"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:07:22 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:07:22 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:07:22 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:07:22 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.185 226239 DEBUG nova.compute.manager [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Preparing to wait for external event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.185 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.186 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.186 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.187 226239 DEBUG nova.virt.libvirt.vif [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:07:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1788706990',display_name='tempest-TestNetworkBasicOps-server-1788706990',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1788706990',id=209,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFEUtQI/j7LXzuU0YrvuHFu6rkogJclFlco3io4wOyAEhRnhgfAZfdIyNjEUp2XoZclbiiMzCEa3Qt2b6cDsAqGG0bmO017oWolVcCZbo0JT42U5etFVtTzsJWjXNhAHJw==',key_name='tempest-TestNetworkBasicOps-1516398044',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-ozydntjb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:07:13Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=bd9f81d8-7c9d-42e3-969b-7b404bfaff36,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.187 226239 DEBUG nova.network.os_vif_util [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.187 226239 DEBUG nova.network.os_vif_util [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:2c:dd,bridge_name='br-int',has_traffic_filtering=True,id=989482db-88b2-40b8-9cb3-59cb766c98e3,network=Network(6b575155-2651-409f-ab96-5a6cf52f7f88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989482db-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.188 226239 DEBUG os_vif [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:2c:dd,bridge_name='br-int',has_traffic_filtering=True,id=989482db-88b2-40b8-9cb3-59cb766c98e3,network=Network(6b575155-2651-409f-ab96-5a6cf52f7f88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989482db-88') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.188 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.189 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.189 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.191 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.192 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap989482db-88, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.192 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap989482db-88, col_values=(('external_ids', {'iface-id': '989482db-88b2-40b8-9cb3-59cb766c98e3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:2c:dd', 'vm-uuid': 'bd9f81d8-7c9d-42e3-969b-7b404bfaff36'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:22 np0005603623 NetworkManager[48970]: <info>  [1769850442.1942] manager: (tap989482db-88): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/411)
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.194 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.197 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.200 226239 INFO os_vif [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:2c:dd,bridge_name='br-int',has_traffic_filtering=True,id=989482db-88b2-40b8-9cb3-59cb766c98e3,network=Network(6b575155-2651-409f-ab96-5a6cf52f7f88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989482db-88')#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.285 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.286 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.286 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No VIF found with MAC fa:16:3e:8b:2c:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.286 226239 INFO nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Using config drive#033[00m
Jan 31 04:07:22 np0005603623 nova_compute[226235]: 2026-01-31 09:07:22.310 226239 DEBUG nova.storage.rbd_utils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.209 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.210 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.210 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.210 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.211 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:23.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.458 226239 INFO nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Creating config drive at /var/lib/nova/instances/bd9f81d8-7c9d-42e3-969b-7b404bfaff36/disk.config#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.462 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bd9f81d8-7c9d-42e3-969b-7b404bfaff36/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpurh4umyp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:07:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:23.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.585 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bd9f81d8-7c9d-42e3-969b-7b404bfaff36/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpurh4umyp" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.610 226239 DEBUG nova.storage.rbd_utils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2004205469' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.614 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bd9f81d8-7c9d-42e3-969b-7b404bfaff36/disk.config bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.636 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.694 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000d1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.695 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000d1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.773 226239 DEBUG oslo_concurrency.processutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bd9f81d8-7c9d-42e3-969b-7b404bfaff36/disk.config bd9f81d8-7c9d-42e3-969b-7b404bfaff36_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.774 226239 INFO nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Deleting local config drive /var/lib/nova/instances/bd9f81d8-7c9d-42e3-969b-7b404bfaff36/disk.config because it was imported into RBD.#033[00m
Jan 31 04:07:23 np0005603623 kernel: tap989482db-88: entered promiscuous mode
Jan 31 04:07:23 np0005603623 NetworkManager[48970]: <info>  [1769850443.8091] manager: (tap989482db-88): new Tun device (/org/freedesktop/NetworkManager/Devices/412)
Jan 31 04:07:23 np0005603623 ovn_controller[133449]: 2026-01-31T09:07:23Z|00875|binding|INFO|Claiming lport 989482db-88b2-40b8-9cb3-59cb766c98e3 for this chassis.
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.810 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:23 np0005603623 ovn_controller[133449]: 2026-01-31T09:07:23Z|00876|binding|INFO|989482db-88b2-40b8-9cb3-59cb766c98e3: Claiming fa:16:3e:8b:2c:dd 10.100.0.9
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.812 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.822 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.823 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4115MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.823 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.824 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:23 np0005603623 systemd-udevd[325001]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:07:23 np0005603623 systemd-machined[194379]: New machine qemu-97-instance-000000d1.
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.834 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.835 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:2c:dd 10.100.0.9'], port_security=['fa:16:3e:8b:2c:dd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'bd9f81d8-7c9d-42e3-969b-7b404bfaff36', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b575155-2651-409f-ab96-5a6cf52f7f88', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c2c6300a-1c14-467d-a580-f801f9bbd79f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d18ce706-6e0f-4295-834a-6178a6f7a4c4, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=989482db-88b2-40b8-9cb3-59cb766c98e3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.837 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 989482db-88b2-40b8-9cb3-59cb766c98e3 in datapath 6b575155-2651-409f-ab96-5a6cf52f7f88 bound to our chassis#033[00m
Jan 31 04:07:23 np0005603623 ovn_controller[133449]: 2026-01-31T09:07:23Z|00877|binding|INFO|Setting lport 989482db-88b2-40b8-9cb3-59cb766c98e3 ovn-installed in OVS
Jan 31 04:07:23 np0005603623 ovn_controller[133449]: 2026-01-31T09:07:23Z|00878|binding|INFO|Setting lport 989482db-88b2-40b8-9cb3-59cb766c98e3 up in Southbound
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.838 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6b575155-2651-409f-ab96-5a6cf52f7f88#033[00m
Jan 31 04:07:23 np0005603623 nova_compute[226235]: 2026-01-31 09:07:23.839 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:23 np0005603623 NetworkManager[48970]: <info>  [1769850443.8411] device (tap989482db-88): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:07:23 np0005603623 NetworkManager[48970]: <info>  [1769850443.8417] device (tap989482db-88): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:07:23 np0005603623 systemd[1]: Started Virtual Machine qemu-97-instance-000000d1.
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.847 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f922bb78-96c1-4e72-8a5c-4066bb66e4ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.848 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6b575155-21 in ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.849 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6b575155-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.849 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9b630f97-98f8-46db-80bc-1939a67ded46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.850 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[84ac04bd-10ef-42e9-af33-58dcded7c5be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.859 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[f2819716-11f4-47ac-8be3-99a480480312]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.867 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c645b5-ea3e-4a5e-9dac-676a3e8ee007]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.888 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[b34c419e-433c-43ae-8dae-32de3b448ea4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.892 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6ab24ecf-1c1d-4016-857d-c5536b7910d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 systemd-udevd[325004]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:07:23 np0005603623 NetworkManager[48970]: <info>  [1769850443.8932] manager: (tap6b575155-20): new Veth device (/org/freedesktop/NetworkManager/Devices/413)
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.911 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[6541d43c-5943-403e-8ad0-473e2a5fc3dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.914 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[bddb7f93-151d-4b33-a80b-6141ad997fbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 NetworkManager[48970]: <info>  [1769850443.9299] device (tap6b575155-20): carrier: link connected
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.932 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[68f932ac-e0c2-4c6a-8f53-aac763709b86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.943 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[69cf339f-cc4f-4438-a429-52f5788e206d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b575155-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:3f:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 257], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 958435, 'reachable_time': 29459, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 325035, 'error': None, 'target': 'ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.953 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[008df78a-0bda-4c1a-a11b-753999e97b5d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:3f5d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 958435, 'tstamp': 958435}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 325036, 'error': None, 'target': 'ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.966 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b2c4f784-29c8-4dd3-82a0-81aa79f2ff26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b575155-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:3f:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 257], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 958435, 'reachable_time': 29459, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 325042, 'error': None, 'target': 'ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:23.990 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[32a881d7-6d37-4abd-b1d2-301b70099460]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.097 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance bd9f81d8-7c9d-42e3-969b-7b404bfaff36 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.097 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.097 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:24.120 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0d73cb7c-2d52-43d9-86f6-4a6f733b3ec8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:24.128 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b575155-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:24.128 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:24.139 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b575155-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.141 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:24 np0005603623 NetworkManager[48970]: <info>  [1769850444.1420] manager: (tap6b575155-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/414)
Jan 31 04:07:24 np0005603623 kernel: tap6b575155-20: entered promiscuous mode
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.143 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:24.144 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6b575155-20, col_values=(('external_ids', {'iface-id': '49cdbdd6-d7fe-466c-bd27-e6f93335cffd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.145 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.147 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:24.148 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6b575155-2651-409f-ab96-5a6cf52f7f88.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6b575155-2651-409f-ab96-5a6cf52f7f88.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:07:24 np0005603623 ovn_controller[133449]: 2026-01-31T09:07:24Z|00879|binding|INFO|Releasing lport 49cdbdd6-d7fe-466c-bd27-e6f93335cffd from this chassis (sb_readonly=0)
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:24.150 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[30cb214a-fafe-4a8a-b829-3aee4f9ca0c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:24.151 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-6b575155-2651-409f-ab96-5a6cf52f7f88
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/6b575155-2651-409f-ab96-5a6cf52f7f88.pid.haproxy
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 6b575155-2651-409f-ab96-5a6cf52f7f88
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:07:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:24.151 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88', 'env', 'PROCESS_TAG=haproxy-6b575155-2651-409f-ab96-5a6cf52f7f88', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6b575155-2651-409f-ab96-5a6cf52f7f88.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.160 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.216 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:24 np0005603623 podman[325220]: 2026-01-31 09:07:24.457490541 +0000 UTC m=+0.042870065 container create dbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 04:07:24 np0005603623 systemd[1]: Started libpod-conmon-dbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86.scope.
Jan 31 04:07:24 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:07:24 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e94a64f511ed0d688a836112c8cfc8dc286cb00bf12f1c705a629236f5ee106/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:07:24 np0005603623 podman[325220]: 2026-01-31 09:07:24.508783581 +0000 UTC m=+0.094163115 container init dbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:07:24 np0005603623 podman[325220]: 2026-01-31 09:07:24.514083647 +0000 UTC m=+0.099463161 container start dbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 04:07:24 np0005603623 podman[325220]: 2026-01-31 09:07:24.435231743 +0000 UTC m=+0.020611287 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:07:24 np0005603623 neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88[325269]: [NOTICE]   (325275) : New worker (325277) forked
Jan 31 04:07:24 np0005603623 neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88[325269]: [NOTICE]   (325275) : Loading success.
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.570 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850444.5700111, bd9f81d8-7c9d-42e3-969b-7b404bfaff36 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.571 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] VM Started (Lifecycle Event)#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.586 226239 DEBUG nova.compute.manager [req-3d932db6-138a-48b6-ac9a-19cf5dc6891f req-086486bc-9c48-40c8-9d3c-48854a19f98a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.586 226239 DEBUG oslo_concurrency.lockutils [req-3d932db6-138a-48b6-ac9a-19cf5dc6891f req-086486bc-9c48-40c8-9d3c-48854a19f98a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.586 226239 DEBUG oslo_concurrency.lockutils [req-3d932db6-138a-48b6-ac9a-19cf5dc6891f req-086486bc-9c48-40c8-9d3c-48854a19f98a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.587 226239 DEBUG oslo_concurrency.lockutils [req-3d932db6-138a-48b6-ac9a-19cf5dc6891f req-086486bc-9c48-40c8-9d3c-48854a19f98a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.587 226239 DEBUG nova.compute.manager [req-3d932db6-138a-48b6-ac9a-19cf5dc6891f req-086486bc-9c48-40c8-9d3c-48854a19f98a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Processing event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.587 226239 DEBUG nova.compute.manager [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.590 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.593 226239 INFO nova.virt.libvirt.driver [-] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Instance spawned successfully.#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.593 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.610 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.612 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.632 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.633 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.633 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.634 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.634 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.634 226239 DEBUG nova.virt.libvirt.driver [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.637 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.638 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850444.571254, bd9f81d8-7c9d-42e3-969b-7b404bfaff36 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.638 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:07:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:07:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/834331980' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.654 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.658 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.700 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.702 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.707 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850444.5900915, bd9f81d8-7c9d-42e3-969b-7b404bfaff36 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.707 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:07:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:07:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:07:24 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.814 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.815 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.815 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.818 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.838 226239 INFO nova.compute.manager [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Took 11.11 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.838 226239 DEBUG nova.compute.manager [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.882 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:07:24 np0005603623 nova_compute[226235]: 2026-01-31 09:07:24.990 226239 INFO nova.compute.manager [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Took 12.67 seconds to build instance.#033[00m
Jan 31 04:07:25 np0005603623 nova_compute[226235]: 2026-01-31 09:07:25.051 226239 DEBUG oslo_concurrency.lockutils [None req-7e9fec77-5886-4ead-8360-a08b5960a5fa d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:25 np0005603623 nova_compute[226235]: 2026-01-31 09:07:25.255 226239 DEBUG nova.network.neutron [req-efb9caed-7b28-49d0-840c-263d6df5f357 req-cc259013-ab2a-4ce3-86ff-f550e8fc4fa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updated VIF entry in instance network info cache for port 989482db-88b2-40b8-9cb3-59cb766c98e3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:07:25 np0005603623 nova_compute[226235]: 2026-01-31 09:07:25.256 226239 DEBUG nova.network.neutron [req-efb9caed-7b28-49d0-840c-263d6df5f357 req-cc259013-ab2a-4ce3-86ff-f550e8fc4fa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updating instance_info_cache with network_info: [{"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:07:25 np0005603623 nova_compute[226235]: 2026-01-31 09:07:25.282 226239 DEBUG oslo_concurrency.lockutils [req-efb9caed-7b28-49d0-840c-263d6df5f357 req-cc259013-ab2a-4ce3-86ff-f550e8fc4fa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:07:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:25.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:07:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:25.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:07:25 np0005603623 nova_compute[226235]: 2026-01-31 09:07:25.816 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:26 np0005603623 nova_compute[226235]: 2026-01-31 09:07:26.806 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:26 np0005603623 podman[325291]: 2026-01-31 09:07:26.961263417 +0000 UTC m=+0.053161909 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:07:26 np0005603623 nova_compute[226235]: 2026-01-31 09:07:26.989 226239 DEBUG nova.compute.manager [req-8b61411c-0a0f-453d-9a4b-c863b88e0765 req-012c0b02-e026-404e-bb71-a211b7518736 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:26 np0005603623 nova_compute[226235]: 2026-01-31 09:07:26.989 226239 DEBUG oslo_concurrency.lockutils [req-8b61411c-0a0f-453d-9a4b-c863b88e0765 req-012c0b02-e026-404e-bb71-a211b7518736 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:26 np0005603623 nova_compute[226235]: 2026-01-31 09:07:26.989 226239 DEBUG oslo_concurrency.lockutils [req-8b61411c-0a0f-453d-9a4b-c863b88e0765 req-012c0b02-e026-404e-bb71-a211b7518736 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:26 np0005603623 nova_compute[226235]: 2026-01-31 09:07:26.990 226239 DEBUG oslo_concurrency.lockutils [req-8b61411c-0a0f-453d-9a4b-c863b88e0765 req-012c0b02-e026-404e-bb71-a211b7518736 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:26 np0005603623 nova_compute[226235]: 2026-01-31 09:07:26.990 226239 DEBUG nova.compute.manager [req-8b61411c-0a0f-453d-9a4b-c863b88e0765 req-012c0b02-e026-404e-bb71-a211b7518736 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] No waiting events found dispatching network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:07:26 np0005603623 nova_compute[226235]: 2026-01-31 09:07:26.990 226239 WARNING nova.compute.manager [req-8b61411c-0a0f-453d-9a4b-c863b88e0765 req-012c0b02-e026-404e-bb71-a211b7518736 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received unexpected event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:07:26 np0005603623 podman[325292]: 2026-01-31 09:07:26.999161615 +0000 UTC m=+0.086020709 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 04:07:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:27 np0005603623 nova_compute[226235]: 2026-01-31 09:07:27.193 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:27.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:27.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:28 np0005603623 nova_compute[226235]: 2026-01-31 09:07:28.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:07:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3812384837' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:07:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:29.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:29.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:30 np0005603623 nova_compute[226235]: 2026-01-31 09:07:30.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:30 np0005603623 nova_compute[226235]: 2026-01-31 09:07:30.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:30.159 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:30.159 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:30.160 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:07:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:07:31 np0005603623 nova_compute[226235]: 2026-01-31 09:07:31.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:31.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:31.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:31.769 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=90, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=89) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:07:31 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:31.770 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:07:31 np0005603623 nova_compute[226235]: 2026-01-31 09:07:31.771 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:31 np0005603623 nova_compute[226235]: 2026-01-31 09:07:31.804 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:32 np0005603623 nova_compute[226235]: 2026-01-31 09:07:32.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:32 np0005603623 nova_compute[226235]: 2026-01-31 09:07:32.195 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:32 np0005603623 nova_compute[226235]: 2026-01-31 09:07:32.884 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:32 np0005603623 NetworkManager[48970]: <info>  [1769850452.8852] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/415)
Jan 31 04:07:32 np0005603623 NetworkManager[48970]: <info>  [1769850452.8864] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/416)
Jan 31 04:07:32 np0005603623 nova_compute[226235]: 2026-01-31 09:07:32.923 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:32 np0005603623 ovn_controller[133449]: 2026-01-31T09:07:32Z|00880|binding|INFO|Releasing lport 49cdbdd6-d7fe-466c-bd27-e6f93335cffd from this chassis (sb_readonly=0)
Jan 31 04:07:32 np0005603623 nova_compute[226235]: 2026-01-31 09:07:32.936 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:33.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:33.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:33 np0005603623 nova_compute[226235]: 2026-01-31 09:07:33.584 226239 DEBUG nova.compute.manager [req-5c6483a8-047a-4fb8-94bb-defe2c4fe071 req-6eb2cb91-ac06-408d-965b-1a8f0b6989f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-changed-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:33 np0005603623 nova_compute[226235]: 2026-01-31 09:07:33.585 226239 DEBUG nova.compute.manager [req-5c6483a8-047a-4fb8-94bb-defe2c4fe071 req-6eb2cb91-ac06-408d-965b-1a8f0b6989f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Refreshing instance network info cache due to event network-changed-989482db-88b2-40b8-9cb3-59cb766c98e3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:07:33 np0005603623 nova_compute[226235]: 2026-01-31 09:07:33.585 226239 DEBUG oslo_concurrency.lockutils [req-5c6483a8-047a-4fb8-94bb-defe2c4fe071 req-6eb2cb91-ac06-408d-965b-1a8f0b6989f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:07:33 np0005603623 nova_compute[226235]: 2026-01-31 09:07:33.585 226239 DEBUG oslo_concurrency.lockutils [req-5c6483a8-047a-4fb8-94bb-defe2c4fe071 req-6eb2cb91-ac06-408d-965b-1a8f0b6989f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:07:33 np0005603623 nova_compute[226235]: 2026-01-31 09:07:33.586 226239 DEBUG nova.network.neutron [req-5c6483a8-047a-4fb8-94bb-defe2c4fe071 req-6eb2cb91-ac06-408d-965b-1a8f0b6989f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Refreshing network info cache for port 989482db-88b2-40b8-9cb3-59cb766c98e3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:07:34 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:07:34.772 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '90'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:34 np0005603623 nova_compute[226235]: 2026-01-31 09:07:34.842 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:35.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:35.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:36 np0005603623 nova_compute[226235]: 2026-01-31 09:07:36.069 226239 DEBUG nova.network.neutron [req-5c6483a8-047a-4fb8-94bb-defe2c4fe071 req-6eb2cb91-ac06-408d-965b-1a8f0b6989f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updated VIF entry in instance network info cache for port 989482db-88b2-40b8-9cb3-59cb766c98e3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:07:36 np0005603623 nova_compute[226235]: 2026-01-31 09:07:36.070 226239 DEBUG nova.network.neutron [req-5c6483a8-047a-4fb8-94bb-defe2c4fe071 req-6eb2cb91-ac06-408d-965b-1a8f0b6989f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updating instance_info_cache with network_info: [{"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:07:36 np0005603623 nova_compute[226235]: 2026-01-31 09:07:36.211 226239 DEBUG oslo_concurrency.lockutils [req-5c6483a8-047a-4fb8-94bb-defe2c4fe071 req-6eb2cb91-ac06-408d-965b-1a8f0b6989f9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:07:36 np0005603623 ovn_controller[133449]: 2026-01-31T09:07:36Z|00110|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:2c:dd 10.100.0.9
Jan 31 04:07:36 np0005603623 ovn_controller[133449]: 2026-01-31T09:07:36Z|00111|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:2c:dd 10.100.0.9
Jan 31 04:07:36 np0005603623 nova_compute[226235]: 2026-01-31 09:07:36.806 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:37 np0005603623 nova_compute[226235]: 2026-01-31 09:07:37.197 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:37.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:37.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:07:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:39.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:07:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:39.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:07:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:41.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:07:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:41.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:41 np0005603623 nova_compute[226235]: 2026-01-31 09:07:41.798 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:41 np0005603623 nova_compute[226235]: 2026-01-31 09:07:41.807 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:42 np0005603623 nova_compute[226235]: 2026-01-31 09:07:42.199 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 e411: 3 total, 3 up, 3 in
Jan 31 04:07:43 np0005603623 nova_compute[226235]: 2026-01-31 09:07:43.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:43.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:43.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:45.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:07:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:45.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:07:46 np0005603623 nova_compute[226235]: 2026-01-31 09:07:46.808 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:47 np0005603623 nova_compute[226235]: 2026-01-31 09:07:47.201 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:47.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:47.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:07:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:49.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:07:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:49.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:51.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:51.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:51 np0005603623 nova_compute[226235]: 2026-01-31 09:07:51.863 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:52 np0005603623 nova_compute[226235]: 2026-01-31 09:07:52.202 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:53.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:07:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:53.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:07:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:55.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:55.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:56 np0005603623 nova_compute[226235]: 2026-01-31 09:07:56.865 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:57 np0005603623 nova_compute[226235]: 2026-01-31 09:07:57.205 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:07:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:57.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:07:57 np0005603623 ceph-mgr[77391]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 04:07:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:57.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:57 np0005603623 podman[325455]: 2026-01-31 09:07:57.971287403 +0000 UTC m=+0.055986587 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 31 04:07:57 np0005603623 podman[325456]: 2026-01-31 09:07:57.987278976 +0000 UTC m=+0.072644230 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:07:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:59.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:07:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:59.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:01.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:01.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:01 np0005603623 nova_compute[226235]: 2026-01-31 09:08:01.867 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:02 np0005603623 nova_compute[226235]: 2026-01-31 09:08:02.207 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:08:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:03.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:08:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:03.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:05.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:08:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:05.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:08:06 np0005603623 nova_compute[226235]: 2026-01-31 09:08:06.869 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:07 np0005603623 nova_compute[226235]: 2026-01-31 09:08:07.208 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:08:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:07.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:08:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:07.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:09.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:09.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:11.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:11.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:11 np0005603623 nova_compute[226235]: 2026-01-31 09:08:11.923 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:12 np0005603623 nova_compute[226235]: 2026-01-31 09:08:12.210 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:13.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:13.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:08:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3969049839' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:08:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:08:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3969049839' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:08:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:15.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:08:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:15.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:08:16 np0005603623 nova_compute[226235]: 2026-01-31 09:08:16.924 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:17 np0005603623 nova_compute[226235]: 2026-01-31 09:08:17.211 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:17.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:08:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:17.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:08:18 np0005603623 nova_compute[226235]: 2026-01-31 09:08:18.196 226239 INFO nova.compute.manager [None req-6647582e-4e07-48b7-a650-dfbaa051f9b0 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Get console output#033[00m
Jan 31 04:08:18 np0005603623 nova_compute[226235]: 2026-01-31 09:08:18.202 270602 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 04:08:19 np0005603623 nova_compute[226235]: 2026-01-31 09:08:19.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:19 np0005603623 nova_compute[226235]: 2026-01-31 09:08:19.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:08:19 np0005603623 nova_compute[226235]: 2026-01-31 09:08:19.181 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:19.181 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=91, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=90) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:08:19 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:19.183 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:08:19 np0005603623 nova_compute[226235]: 2026-01-31 09:08:19.402 226239 DEBUG nova.compute.manager [req-186aaddb-a56f-4ba4-84b4-b5c56f083a69 req-c5bb8ac0-fa21-4edd-8e33-ae75b9104f87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-changed-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:19 np0005603623 nova_compute[226235]: 2026-01-31 09:08:19.402 226239 DEBUG nova.compute.manager [req-186aaddb-a56f-4ba4-84b4-b5c56f083a69 req-c5bb8ac0-fa21-4edd-8e33-ae75b9104f87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Refreshing instance network info cache due to event network-changed-989482db-88b2-40b8-9cb3-59cb766c98e3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:08:19 np0005603623 nova_compute[226235]: 2026-01-31 09:08:19.402 226239 DEBUG oslo_concurrency.lockutils [req-186aaddb-a56f-4ba4-84b4-b5c56f083a69 req-c5bb8ac0-fa21-4edd-8e33-ae75b9104f87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:08:19 np0005603623 nova_compute[226235]: 2026-01-31 09:08:19.403 226239 DEBUG oslo_concurrency.lockutils [req-186aaddb-a56f-4ba4-84b4-b5c56f083a69 req-c5bb8ac0-fa21-4edd-8e33-ae75b9104f87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:08:19 np0005603623 nova_compute[226235]: 2026-01-31 09:08:19.403 226239 DEBUG nova.network.neutron [req-186aaddb-a56f-4ba4-84b4-b5c56f083a69 req-c5bb8ac0-fa21-4edd-8e33-ae75b9104f87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Refreshing network info cache for port 989482db-88b2-40b8-9cb3-59cb766c98e3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:08:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:19.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:08:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:19.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:08:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:20.186 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '91'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:08:20 np0005603623 nova_compute[226235]: 2026-01-31 09:08:20.507 226239 INFO nova.compute.manager [None req-063f6352-8c61-405a-8a8d-e5172e57c2a9 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Get console output#033[00m
Jan 31 04:08:20 np0005603623 nova_compute[226235]: 2026-01-31 09:08:20.511 270602 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 04:08:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:21.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.582 226239 DEBUG nova.compute.manager [req-7d61a432-d0e4-4966-8b20-8f93115184af req-f30a7d09-aeb2-4196-848d-c6681ff3e055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-vif-unplugged-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.583 226239 DEBUG oslo_concurrency.lockutils [req-7d61a432-d0e4-4966-8b20-8f93115184af req-f30a7d09-aeb2-4196-848d-c6681ff3e055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.583 226239 DEBUG oslo_concurrency.lockutils [req-7d61a432-d0e4-4966-8b20-8f93115184af req-f30a7d09-aeb2-4196-848d-c6681ff3e055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.584 226239 DEBUG oslo_concurrency.lockutils [req-7d61a432-d0e4-4966-8b20-8f93115184af req-f30a7d09-aeb2-4196-848d-c6681ff3e055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.584 226239 DEBUG nova.compute.manager [req-7d61a432-d0e4-4966-8b20-8f93115184af req-f30a7d09-aeb2-4196-848d-c6681ff3e055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] No waiting events found dispatching network-vif-unplugged-989482db-88b2-40b8-9cb3-59cb766c98e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.584 226239 WARNING nova.compute.manager [req-7d61a432-d0e4-4966-8b20-8f93115184af req-f30a7d09-aeb2-4196-848d-c6681ff3e055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received unexpected event network-vif-unplugged-989482db-88b2-40b8-9cb3-59cb766c98e3 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.585 226239 DEBUG nova.compute.manager [req-7d61a432-d0e4-4966-8b20-8f93115184af req-f30a7d09-aeb2-4196-848d-c6681ff3e055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.585 226239 DEBUG oslo_concurrency.lockutils [req-7d61a432-d0e4-4966-8b20-8f93115184af req-f30a7d09-aeb2-4196-848d-c6681ff3e055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.586 226239 DEBUG oslo_concurrency.lockutils [req-7d61a432-d0e4-4966-8b20-8f93115184af req-f30a7d09-aeb2-4196-848d-c6681ff3e055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.586 226239 DEBUG oslo_concurrency.lockutils [req-7d61a432-d0e4-4966-8b20-8f93115184af req-f30a7d09-aeb2-4196-848d-c6681ff3e055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.586 226239 DEBUG nova.compute.manager [req-7d61a432-d0e4-4966-8b20-8f93115184af req-f30a7d09-aeb2-4196-848d-c6681ff3e055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] No waiting events found dispatching network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.587 226239 WARNING nova.compute.manager [req-7d61a432-d0e4-4966-8b20-8f93115184af req-f30a7d09-aeb2-4196-848d-c6681ff3e055 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received unexpected event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:08:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:21.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.616 226239 DEBUG nova.network.neutron [req-186aaddb-a56f-4ba4-84b4-b5c56f083a69 req-c5bb8ac0-fa21-4edd-8e33-ae75b9104f87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updated VIF entry in instance network info cache for port 989482db-88b2-40b8-9cb3-59cb766c98e3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.617 226239 DEBUG nova.network.neutron [req-186aaddb-a56f-4ba4-84b4-b5c56f083a69 req-c5bb8ac0-fa21-4edd-8e33-ae75b9104f87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updating instance_info_cache with network_info: [{"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.657 226239 DEBUG oslo_concurrency.lockutils [req-186aaddb-a56f-4ba4-84b4-b5c56f083a69 req-c5bb8ac0-fa21-4edd-8e33-ae75b9104f87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:08:21 np0005603623 nova_compute[226235]: 2026-01-31 09:08:21.926 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.212 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.367 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.368 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.368 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.368 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bd9f81d8-7c9d-42e3-969b-7b404bfaff36 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.621 226239 DEBUG nova.compute.manager [req-975c2818-4cf1-4a08-a05b-9b15cf7846c0 req-8b47f2c4-df0c-46cb-9243-327df188bdb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-changed-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.621 226239 DEBUG nova.compute.manager [req-975c2818-4cf1-4a08-a05b-9b15cf7846c0 req-8b47f2c4-df0c-46cb-9243-327df188bdb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Refreshing instance network info cache due to event network-changed-989482db-88b2-40b8-9cb3-59cb766c98e3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.621 226239 DEBUG oslo_concurrency.lockutils [req-975c2818-4cf1-4a08-a05b-9b15cf7846c0 req-8b47f2c4-df0c-46cb-9243-327df188bdb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.943 226239 INFO nova.compute.manager [None req-56051c2e-bae0-4f18-9759-d14b064efc69 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Get console output#033[00m
Jan 31 04:08:22 np0005603623 nova_compute[226235]: 2026-01-31 09:08:22.946 270602 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 04:08:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:23.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:08:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:23.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:08:23 np0005603623 nova_compute[226235]: 2026-01-31 09:08:23.703 226239 DEBUG nova.compute.manager [req-37c3ce1a-5df5-4eeb-8422-2027b2805d5e req-e374001f-35d9-47ea-ba77-723ae3f6f68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:23 np0005603623 nova_compute[226235]: 2026-01-31 09:08:23.703 226239 DEBUG oslo_concurrency.lockutils [req-37c3ce1a-5df5-4eeb-8422-2027b2805d5e req-e374001f-35d9-47ea-ba77-723ae3f6f68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:23 np0005603623 nova_compute[226235]: 2026-01-31 09:08:23.703 226239 DEBUG oslo_concurrency.lockutils [req-37c3ce1a-5df5-4eeb-8422-2027b2805d5e req-e374001f-35d9-47ea-ba77-723ae3f6f68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:23 np0005603623 nova_compute[226235]: 2026-01-31 09:08:23.704 226239 DEBUG oslo_concurrency.lockutils [req-37c3ce1a-5df5-4eeb-8422-2027b2805d5e req-e374001f-35d9-47ea-ba77-723ae3f6f68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:23 np0005603623 nova_compute[226235]: 2026-01-31 09:08:23.704 226239 DEBUG nova.compute.manager [req-37c3ce1a-5df5-4eeb-8422-2027b2805d5e req-e374001f-35d9-47ea-ba77-723ae3f6f68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] No waiting events found dispatching network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:08:23 np0005603623 nova_compute[226235]: 2026-01-31 09:08:23.704 226239 WARNING nova.compute.manager [req-37c3ce1a-5df5-4eeb-8422-2027b2805d5e req-e374001f-35d9-47ea-ba77-723ae3f6f68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received unexpected event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:08:23 np0005603623 nova_compute[226235]: 2026-01-31 09:08:23.704 226239 DEBUG nova.compute.manager [req-37c3ce1a-5df5-4eeb-8422-2027b2805d5e req-e374001f-35d9-47ea-ba77-723ae3f6f68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:23 np0005603623 nova_compute[226235]: 2026-01-31 09:08:23.704 226239 DEBUG oslo_concurrency.lockutils [req-37c3ce1a-5df5-4eeb-8422-2027b2805d5e req-e374001f-35d9-47ea-ba77-723ae3f6f68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:23 np0005603623 nova_compute[226235]: 2026-01-31 09:08:23.705 226239 DEBUG oslo_concurrency.lockutils [req-37c3ce1a-5df5-4eeb-8422-2027b2805d5e req-e374001f-35d9-47ea-ba77-723ae3f6f68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:23 np0005603623 nova_compute[226235]: 2026-01-31 09:08:23.705 226239 DEBUG oslo_concurrency.lockutils [req-37c3ce1a-5df5-4eeb-8422-2027b2805d5e req-e374001f-35d9-47ea-ba77-723ae3f6f68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:23 np0005603623 nova_compute[226235]: 2026-01-31 09:08:23.705 226239 DEBUG nova.compute.manager [req-37c3ce1a-5df5-4eeb-8422-2027b2805d5e req-e374001f-35d9-47ea-ba77-723ae3f6f68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] No waiting events found dispatching network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:08:23 np0005603623 nova_compute[226235]: 2026-01-31 09:08:23.705 226239 WARNING nova.compute.manager [req-37c3ce1a-5df5-4eeb-8422-2027b2805d5e req-e374001f-35d9-47ea-ba77-723ae3f6f68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received unexpected event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.235 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updating instance_info_cache with network_info: [{"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.255 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.256 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.256 226239 DEBUG oslo_concurrency.lockutils [req-975c2818-4cf1-4a08-a05b-9b15cf7846c0 req-8b47f2c4-df0c-46cb-9243-327df188bdb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.256 226239 DEBUG nova.network.neutron [req-975c2818-4cf1-4a08-a05b-9b15cf7846c0 req-8b47f2c4-df0c-46cb-9243-327df188bdb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Refreshing network info cache for port 989482db-88b2-40b8-9cb3-59cb766c98e3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.258 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.289 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.289 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.289 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.290 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.290 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:08:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4134445694' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.723 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.793 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000d1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.794 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000d1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.868 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "cfd916b8-0d40-401d-b967-86b08f49eaed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.869 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.902 226239 DEBUG nova.compute.manager [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.941 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.942 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3960MB free_disk=20.851383209228516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.942 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:24 np0005603623 nova_compute[226235]: 2026-01-31 09:08:24.943 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.023 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.041 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance bd9f81d8-7c9d-42e3-969b-7b404bfaff36 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.060 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance cfd916b8-0d40-401d-b967-86b08f49eaed has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.060 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.061 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.074 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.100 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.101 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.125 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.146 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.251 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:25.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:25.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:08:25 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3154596343' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.662 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.667 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.681 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.705 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.705 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.762s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.705 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.712 226239 DEBUG nova.virt.hardware [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.712 226239 INFO nova.compute.claims [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 04:08:25 np0005603623 nova_compute[226235]: 2026-01-31 09:08:25.868 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:08:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2995973815' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.272 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.276 226239 DEBUG nova.compute.provider_tree [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.295 226239 DEBUG nova.scheduler.client.report [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.318 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.319 226239 DEBUG nova.compute.manager [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.372 226239 DEBUG nova.compute.manager [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.372 226239 DEBUG nova.network.neutron [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.400 226239 INFO nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.421 226239 DEBUG nova.compute.manager [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.549 226239 DEBUG nova.compute.manager [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.550 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.551 226239 INFO nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Creating image(s)#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.572 226239 DEBUG nova.storage.rbd_utils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image cfd916b8-0d40-401d-b967-86b08f49eaed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.595 226239 DEBUG nova.storage.rbd_utils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image cfd916b8-0d40-401d-b967-86b08f49eaed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.621 226239 DEBUG nova.storage.rbd_utils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image cfd916b8-0d40-401d-b967-86b08f49eaed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.625 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.673 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.674 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.675 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.675 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.695 226239 DEBUG nova.storage.rbd_utils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image cfd916b8-0d40-401d-b967-86b08f49eaed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.698 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 cfd916b8-0d40-401d-b967-86b08f49eaed_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:26 np0005603623 nova_compute[226235]: 2026-01-31 09:08:26.928 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.047 226239 DEBUG nova.policy [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ebd43008d7a64b8bbf97a2304b1f78b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:08:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.113 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 cfd916b8-0d40-401d-b967-86b08f49eaed_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.188 226239 DEBUG nova.storage.rbd_utils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] resizing rbd image cfd916b8-0d40-401d-b967-86b08f49eaed_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.233 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:27.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.481 226239 DEBUG nova.objects.instance [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'migration_context' on Instance uuid cfd916b8-0d40-401d-b967-86b08f49eaed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.507 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.507 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Ensure instance console log exists: /var/lib/nova/instances/cfd916b8-0d40-401d-b967-86b08f49eaed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.508 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.508 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.508 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.602 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:27.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.859 226239 DEBUG nova.network.neutron [req-975c2818-4cf1-4a08-a05b-9b15cf7846c0 req-8b47f2c4-df0c-46cb-9243-327df188bdb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updated VIF entry in instance network info cache for port 989482db-88b2-40b8-9cb3-59cb766c98e3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.860 226239 DEBUG nova.network.neutron [req-975c2818-4cf1-4a08-a05b-9b15cf7846c0 req-8b47f2c4-df0c-46cb-9243-327df188bdb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updating instance_info_cache with network_info: [{"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:08:27 np0005603623 nova_compute[226235]: 2026-01-31 09:08:27.883 226239 DEBUG oslo_concurrency.lockutils [req-975c2818-4cf1-4a08-a05b-9b15cf7846c0 req-8b47f2c4-df0c-46cb-9243-327df188bdb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:08:28 np0005603623 nova_compute[226235]: 2026-01-31 09:08:28.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:28 np0005603623 podman[325850]: 2026-01-31 09:08:28.955567074 +0000 UTC m=+0.050731403 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:08:28 np0005603623 nova_compute[226235]: 2026-01-31 09:08:28.960 226239 DEBUG nova.network.neutron [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Successfully created port: ffc6313d-bd98-4ce5-b22e-8e055d354e40 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:08:28 np0005603623 podman[325851]: 2026-01-31 09:08:28.971247805 +0000 UTC m=+0.066351292 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Jan 31 04:08:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:08:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:29.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:08:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:29.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.106 226239 DEBUG nova.compute.manager [req-d83d8d77-7078-4725-8992-a8fdb49a0776 req-6530168c-6a75-4f72-92e3-e25b9514962b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-changed-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.107 226239 DEBUG nova.compute.manager [req-d83d8d77-7078-4725-8992-a8fdb49a0776 req-6530168c-6a75-4f72-92e3-e25b9514962b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Refreshing instance network info cache due to event network-changed-989482db-88b2-40b8-9cb3-59cb766c98e3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.107 226239 DEBUG oslo_concurrency.lockutils [req-d83d8d77-7078-4725-8992-a8fdb49a0776 req-6530168c-6a75-4f72-92e3-e25b9514962b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.107 226239 DEBUG oslo_concurrency.lockutils [req-d83d8d77-7078-4725-8992-a8fdb49a0776 req-6530168c-6a75-4f72-92e3-e25b9514962b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.107 226239 DEBUG nova.network.neutron [req-d83d8d77-7078-4725-8992-a8fdb49a0776 req-6530168c-6a75-4f72-92e3-e25b9514962b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Refreshing network info cache for port 989482db-88b2-40b8-9cb3-59cb766c98e3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.160 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.160 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.161 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.220 226239 DEBUG oslo_concurrency.lockutils [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.221 226239 DEBUG oslo_concurrency.lockutils [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.221 226239 DEBUG oslo_concurrency.lockutils [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.221 226239 DEBUG oslo_concurrency.lockutils [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.221 226239 DEBUG oslo_concurrency.lockutils [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.222 226239 INFO nova.compute.manager [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Terminating instance#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.223 226239 DEBUG nova.compute.manager [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:08:30 np0005603623 kernel: tap989482db-88 (unregistering): left promiscuous mode
Jan 31 04:08:30 np0005603623 NetworkManager[48970]: <info>  [1769850510.3583] device (tap989482db-88): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:08:30 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:30Z|00881|binding|INFO|Releasing lport 989482db-88b2-40b8-9cb3-59cb766c98e3 from this chassis (sb_readonly=0)
Jan 31 04:08:30 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:30Z|00882|binding|INFO|Setting lport 989482db-88b2-40b8-9cb3-59cb766c98e3 down in Southbound
Jan 31 04:08:30 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:30Z|00883|binding|INFO|Removing iface tap989482db-88 ovn-installed in OVS
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.364 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.366 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.377 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:2c:dd 10.100.0.9'], port_security=['fa:16:3e:8b:2c:dd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'bd9f81d8-7c9d-42e3-969b-7b404bfaff36', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b575155-2651-409f-ab96-5a6cf52f7f88', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c2c6300a-1c14-467d-a580-f801f9bbd79f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d18ce706-6e0f-4295-834a-6178a6f7a4c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=989482db-88b2-40b8-9cb3-59cb766c98e3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.378 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 989482db-88b2-40b8-9cb3-59cb766c98e3 in datapath 6b575155-2651-409f-ab96-5a6cf52f7f88 unbound from our chassis#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.379 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.380 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b575155-2651-409f-ab96-5a6cf52f7f88, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.382 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[40cabfd4-3cd7-4c43-a09c-d82620c36237]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.382 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88 namespace which is not needed anymore#033[00m
Jan 31 04:08:30 np0005603623 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000d1.scope: Deactivated successfully.
Jan 31 04:08:30 np0005603623 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000d1.scope: Consumed 14.592s CPU time.
Jan 31 04:08:30 np0005603623 systemd-machined[194379]: Machine qemu-97-instance-000000d1 terminated.
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.455 226239 INFO nova.virt.libvirt.driver [-] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Instance destroyed successfully.#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.456 226239 DEBUG nova.objects.instance [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'resources' on Instance uuid bd9f81d8-7c9d-42e3-969b-7b404bfaff36 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.480 226239 DEBUG nova.virt.libvirt.vif [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:07:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1788706990',display_name='tempest-TestNetworkBasicOps-server-1788706990',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1788706990',id=209,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFEUtQI/j7LXzuU0YrvuHFu6rkogJclFlco3io4wOyAEhRnhgfAZfdIyNjEUp2XoZclbiiMzCEa3Qt2b6cDsAqGG0bmO017oWolVcCZbo0JT42U5etFVtTzsJWjXNhAHJw==',key_name='tempest-TestNetworkBasicOps-1516398044',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:07:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-ozydntjb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:07:24Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=bd9f81d8-7c9d-42e3-969b-7b404bfaff36,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.480 226239 DEBUG nova.network.os_vif_util [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.481 226239 DEBUG nova.network.os_vif_util [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:2c:dd,bridge_name='br-int',has_traffic_filtering=True,id=989482db-88b2-40b8-9cb3-59cb766c98e3,network=Network(6b575155-2651-409f-ab96-5a6cf52f7f88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989482db-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.481 226239 DEBUG os_vif [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:2c:dd,bridge_name='br-int',has_traffic_filtering=True,id=989482db-88b2-40b8-9cb3-59cb766c98e3,network=Network(6b575155-2651-409f-ab96-5a6cf52f7f88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989482db-88') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.483 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.484 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap989482db-88, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.485 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.486 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.488 226239 INFO os_vif [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:2c:dd,bridge_name='br-int',has_traffic_filtering=True,id=989482db-88b2-40b8-9cb3-59cb766c98e3,network=Network(6b575155-2651-409f-ab96-5a6cf52f7f88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989482db-88')#033[00m
Jan 31 04:08:30 np0005603623 neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88[325269]: [NOTICE]   (325275) : haproxy version is 2.8.14-c23fe91
Jan 31 04:08:30 np0005603623 neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88[325269]: [NOTICE]   (325275) : path to executable is /usr/sbin/haproxy
Jan 31 04:08:30 np0005603623 neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88[325269]: [WARNING]  (325275) : Exiting Master process...
Jan 31 04:08:30 np0005603623 neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88[325269]: [WARNING]  (325275) : Exiting Master process...
Jan 31 04:08:30 np0005603623 neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88[325269]: [ALERT]    (325275) : Current worker (325277) exited with code 143 (Terminated)
Jan 31 04:08:30 np0005603623 neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88[325269]: [WARNING]  (325275) : All workers exited. Exiting... (0)
Jan 31 04:08:30 np0005603623 systemd[1]: libpod-dbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86.scope: Deactivated successfully.
Jan 31 04:08:30 np0005603623 podman[326056]: 2026-01-31 09:08:30.500244471 +0000 UTC m=+0.045044734 container died dbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 04:08:30 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86-userdata-shm.mount: Deactivated successfully.
Jan 31 04:08:30 np0005603623 systemd[1]: var-lib-containers-storage-overlay-8e94a64f511ed0d688a836112c8cfc8dc286cb00bf12f1c705a629236f5ee106-merged.mount: Deactivated successfully.
Jan 31 04:08:30 np0005603623 podman[326056]: 2026-01-31 09:08:30.543166018 +0000 UTC m=+0.087966261 container cleanup dbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 04:08:30 np0005603623 systemd[1]: libpod-conmon-dbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86.scope: Deactivated successfully.
Jan 31 04:08:30 np0005603623 podman[326109]: 2026-01-31 09:08:30.595247891 +0000 UTC m=+0.038373214 container remove dbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.598 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0b747703-6861-44e6-b6a0-51cd3ba791e6]: (4, ('Sat Jan 31 09:08:30 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88 (dbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86)\ndbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86\nSat Jan 31 09:08:30 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88 (dbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86)\ndbc2847dd08a4827b6459bda431c9f6c8216645fa710b72f27c7f25fa0b4eb86\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.600 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9d6c5fc2-6c09-46b2-9f47-d47ef9b4f8d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.601 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b575155-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:08:30 np0005603623 kernel: tap6b575155-20: left promiscuous mode
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.603 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:30 np0005603623 nova_compute[226235]: 2026-01-31 09:08:30.608 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.611 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d02b043e-f585-421b-acff-8243b01e3490]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.629 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f24136d3-438f-402b-a420-c024293967c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.630 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dde3dc2a-8134-4ad3-832c-5d8d05f2f020]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.641 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[75797e9e-ef72-4000-a10a-675f96042471]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 958431, 'reachable_time': 25262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 326124, 'error': None, 'target': 'ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.644 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6b575155-2651-409f-ab96-5a6cf52f7f88 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:08:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:30.644 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[66f200c2-de0d-46de-99a4-de576463beed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:30 np0005603623 systemd[1]: run-netns-ovnmeta\x2d6b575155\x2d2651\x2d409f\x2dab96\x2d5a6cf52f7f88.mount: Deactivated successfully.
Jan 31 04:08:31 np0005603623 nova_compute[226235]: 2026-01-31 09:08:31.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:31 np0005603623 nova_compute[226235]: 2026-01-31 09:08:31.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:31.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:08:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:08:31 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:08:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:31.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:31 np0005603623 nova_compute[226235]: 2026-01-31 09:08:31.930 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:31 np0005603623 nova_compute[226235]: 2026-01-31 09:08:31.974 226239 DEBUG nova.network.neutron [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Successfully updated port: ffc6313d-bd98-4ce5-b22e-8e055d354e40 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:08:31 np0005603623 nova_compute[226235]: 2026-01-31 09:08:31.991 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "refresh_cache-cfd916b8-0d40-401d-b967-86b08f49eaed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:08:31 np0005603623 nova_compute[226235]: 2026-01-31 09:08:31.991 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquired lock "refresh_cache-cfd916b8-0d40-401d-b967-86b08f49eaed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:08:31 np0005603623 nova_compute[226235]: 2026-01-31 09:08:31.991 226239 DEBUG nova.network.neutron [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:08:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.112 226239 DEBUG nova.compute.manager [req-7b5908cb-1be7-4966-8408-357b785c2b82 req-f83fcc4f-5d05-4ece-a3fe-ecce212e42cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Received event network-changed-ffc6313d-bd98-4ce5-b22e-8e055d354e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.112 226239 DEBUG nova.compute.manager [req-7b5908cb-1be7-4966-8408-357b785c2b82 req-f83fcc4f-5d05-4ece-a3fe-ecce212e42cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Refreshing instance network info cache due to event network-changed-ffc6313d-bd98-4ce5-b22e-8e055d354e40. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.113 226239 DEBUG oslo_concurrency.lockutils [req-7b5908cb-1be7-4966-8408-357b785c2b82 req-f83fcc4f-5d05-4ece-a3fe-ecce212e42cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-cfd916b8-0d40-401d-b967-86b08f49eaed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.142 226239 INFO nova.virt.libvirt.driver [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Deleting instance files /var/lib/nova/instances/bd9f81d8-7c9d-42e3-969b-7b404bfaff36_del#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.143 226239 INFO nova.virt.libvirt.driver [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Deletion of /var/lib/nova/instances/bd9f81d8-7c9d-42e3-969b-7b404bfaff36_del complete#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.210 226239 DEBUG nova.compute.manager [req-4d5b374b-4a92-4779-ae4a-e5cf314696d3 req-a0906bcd-87b3-4cf8-9fa9-ee0da78b4f96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-vif-unplugged-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.211 226239 DEBUG oslo_concurrency.lockutils [req-4d5b374b-4a92-4779-ae4a-e5cf314696d3 req-a0906bcd-87b3-4cf8-9fa9-ee0da78b4f96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.211 226239 DEBUG oslo_concurrency.lockutils [req-4d5b374b-4a92-4779-ae4a-e5cf314696d3 req-a0906bcd-87b3-4cf8-9fa9-ee0da78b4f96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.211 226239 DEBUG oslo_concurrency.lockutils [req-4d5b374b-4a92-4779-ae4a-e5cf314696d3 req-a0906bcd-87b3-4cf8-9fa9-ee0da78b4f96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.211 226239 DEBUG nova.compute.manager [req-4d5b374b-4a92-4779-ae4a-e5cf314696d3 req-a0906bcd-87b3-4cf8-9fa9-ee0da78b4f96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] No waiting events found dispatching network-vif-unplugged-989482db-88b2-40b8-9cb3-59cb766c98e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.212 226239 DEBUG nova.compute.manager [req-4d5b374b-4a92-4779-ae4a-e5cf314696d3 req-a0906bcd-87b3-4cf8-9fa9-ee0da78b4f96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-vif-unplugged-989482db-88b2-40b8-9cb3-59cb766c98e3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.212 226239 DEBUG nova.compute.manager [req-4d5b374b-4a92-4779-ae4a-e5cf314696d3 req-a0906bcd-87b3-4cf8-9fa9-ee0da78b4f96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.212 226239 DEBUG oslo_concurrency.lockutils [req-4d5b374b-4a92-4779-ae4a-e5cf314696d3 req-a0906bcd-87b3-4cf8-9fa9-ee0da78b4f96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.212 226239 DEBUG oslo_concurrency.lockutils [req-4d5b374b-4a92-4779-ae4a-e5cf314696d3 req-a0906bcd-87b3-4cf8-9fa9-ee0da78b4f96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.213 226239 DEBUG oslo_concurrency.lockutils [req-4d5b374b-4a92-4779-ae4a-e5cf314696d3 req-a0906bcd-87b3-4cf8-9fa9-ee0da78b4f96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.213 226239 DEBUG nova.compute.manager [req-4d5b374b-4a92-4779-ae4a-e5cf314696d3 req-a0906bcd-87b3-4cf8-9fa9-ee0da78b4f96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] No waiting events found dispatching network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.213 226239 WARNING nova.compute.manager [req-4d5b374b-4a92-4779-ae4a-e5cf314696d3 req-a0906bcd-87b3-4cf8-9fa9-ee0da78b4f96 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received unexpected event network-vif-plugged-989482db-88b2-40b8-9cb3-59cb766c98e3 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.220 226239 INFO nova.compute.manager [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Took 2.00 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.221 226239 DEBUG oslo.service.loopingcall [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.221 226239 DEBUG nova.compute.manager [-] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.221 226239 DEBUG nova.network.neutron [-] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:08:32 np0005603623 nova_compute[226235]: 2026-01-31 09:08:32.226 226239 DEBUG nova.network.neutron [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:08:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:33.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:33.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.035 226239 DEBUG nova.network.neutron [req-d83d8d77-7078-4725-8992-a8fdb49a0776 req-6530168c-6a75-4f72-92e3-e25b9514962b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updated VIF entry in instance network info cache for port 989482db-88b2-40b8-9cb3-59cb766c98e3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.035 226239 DEBUG nova.network.neutron [req-d83d8d77-7078-4725-8992-a8fdb49a0776 req-6530168c-6a75-4f72-92e3-e25b9514962b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updating instance_info_cache with network_info: [{"id": "989482db-88b2-40b8-9cb3-59cb766c98e3", "address": "fa:16:3e:8b:2c:dd", "network": {"id": "6b575155-2651-409f-ab96-5a6cf52f7f88", "bridge": "br-int", "label": "tempest-network-smoke--862739051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap989482db-88", "ovs_interfaceid": "989482db-88b2-40b8-9cb3-59cb766c98e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.057 226239 DEBUG oslo_concurrency.lockutils [req-d83d8d77-7078-4725-8992-a8fdb49a0776 req-6530168c-6a75-4f72-92e3-e25b9514962b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-bd9f81d8-7c9d-42e3-969b-7b404bfaff36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.199 226239 DEBUG nova.network.neutron [-] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.225 226239 INFO nova.compute.manager [-] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Took 2.00 seconds to deallocate network for instance.#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.289 226239 DEBUG oslo_concurrency.lockutils [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.290 226239 DEBUG oslo_concurrency.lockutils [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.314 226239 DEBUG nova.compute.manager [req-8ebf7e1e-db97-4232-8bbe-8033a356d4ba req-11b37bd2-62a6-4ed6-8337-26ffa9d77efc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Received event network-vif-deleted-989482db-88b2-40b8-9cb3-59cb766c98e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.352 226239 DEBUG oslo_concurrency.processutils [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:08:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3141119628' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.787 226239 DEBUG oslo_concurrency.processutils [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.794 226239 DEBUG nova.compute.provider_tree [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.819 226239 DEBUG nova.scheduler.client.report [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.874 226239 DEBUG oslo_concurrency.lockutils [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.908 226239 INFO nova.scheduler.client.report [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Deleted allocations for instance bd9f81d8-7c9d-42e3-969b-7b404bfaff36#033[00m
Jan 31 04:08:34 np0005603623 nova_compute[226235]: 2026-01-31 09:08:34.998 226239 DEBUG oslo_concurrency.lockutils [None req-fab03955-ecad-486c-9a7e-78cf6bd1c794 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "bd9f81d8-7c9d-42e3-969b-7b404bfaff36" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.777s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.007 226239 DEBUG nova.network.neutron [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Updating instance_info_cache with network_info: [{"id": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "address": "fa:16:3e:3d:51:96", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc6313d-bd", "ovs_interfaceid": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.032 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Releasing lock "refresh_cache-cfd916b8-0d40-401d-b967-86b08f49eaed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.032 226239 DEBUG nova.compute.manager [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Instance network_info: |[{"id": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "address": "fa:16:3e:3d:51:96", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc6313d-bd", "ovs_interfaceid": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.033 226239 DEBUG oslo_concurrency.lockutils [req-7b5908cb-1be7-4966-8408-357b785c2b82 req-f83fcc4f-5d05-4ece-a3fe-ecce212e42cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-cfd916b8-0d40-401d-b967-86b08f49eaed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.033 226239 DEBUG nova.network.neutron [req-7b5908cb-1be7-4966-8408-357b785c2b82 req-f83fcc4f-5d05-4ece-a3fe-ecce212e42cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Refreshing network info cache for port ffc6313d-bd98-4ce5-b22e-8e055d354e40 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.036 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Start _get_guest_xml network_info=[{"id": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "address": "fa:16:3e:3d:51:96", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc6313d-bd", "ovs_interfaceid": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.040 226239 WARNING nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.045 226239 DEBUG nova.virt.libvirt.host [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.045 226239 DEBUG nova.virt.libvirt.host [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.048 226239 DEBUG nova.virt.libvirt.host [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.049 226239 DEBUG nova.virt.libvirt.host [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.050 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.050 226239 DEBUG nova.virt.hardware [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.050 226239 DEBUG nova.virt.hardware [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.050 226239 DEBUG nova.virt.hardware [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.051 226239 DEBUG nova.virt.hardware [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.051 226239 DEBUG nova.virt.hardware [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.051 226239 DEBUG nova.virt.hardware [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.051 226239 DEBUG nova.virt.hardware [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.052 226239 DEBUG nova.virt.hardware [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.052 226239 DEBUG nova.virt.hardware [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.052 226239 DEBUG nova.virt.hardware [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.052 226239 DEBUG nova.virt.hardware [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.055 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.003000095s ======
Jan 31 04:08:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:35.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 31 04:08:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:08:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1397964414' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.484 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.505 226239 DEBUG nova.storage.rbd_utils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image cfd916b8-0d40-401d-b967-86b08f49eaed_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.508 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.524 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:35.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:08:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1177225280' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.918 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.919 226239 DEBUG nova.virt.libvirt.vif [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:08:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-0-1286502888',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-0-1286502888',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ge',id=213,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOzLIomxM/9LD6pUmebgfRrGKD5wh7paXInFcqd2L0Y2NS24kEHuU2xVFiRKZyesRVQNhc+hG031RV8i0F2/Jgjlcab+v9GRiy/RiVhAhjy1pjZJBvE/J+e5Z7tpOEKxw==',key_name='tempest-TestSecurityGroupsBasicOps-542781411',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-t6fig6iy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:08:26Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=cfd916b8-0d40-401d-b967-86b08f49eaed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "address": "fa:16:3e:3d:51:96", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc6313d-bd", "ovs_interfaceid": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.920 226239 DEBUG nova.network.os_vif_util [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "address": "fa:16:3e:3d:51:96", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc6313d-bd", "ovs_interfaceid": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.921 226239 DEBUG nova.network.os_vif_util [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:51:96,bridge_name='br-int',has_traffic_filtering=True,id=ffc6313d-bd98-4ce5-b22e-8e055d354e40,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc6313d-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.922 226239 DEBUG nova.objects.instance [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'pci_devices' on Instance uuid cfd916b8-0d40-401d-b967-86b08f49eaed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.944 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  <uuid>cfd916b8-0d40-401d-b967-86b08f49eaed</uuid>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  <name>instance-000000d5</name>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-0-1286502888</nova:name>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:08:35</nova:creationTime>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <nova:user uuid="ebd43008d7a64b8bbf97a2304b1f78b6">tempest-TestSecurityGroupsBasicOps-1802479850-project-member</nova:user>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <nova:project uuid="0c7930b92fc3471f87d9fe78ee56e71e">tempest-TestSecurityGroupsBasicOps-1802479850</nova:project>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <nova:port uuid="ffc6313d-bd98-4ce5-b22e-8e055d354e40">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <entry name="serial">cfd916b8-0d40-401d-b967-86b08f49eaed</entry>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <entry name="uuid">cfd916b8-0d40-401d-b967-86b08f49eaed</entry>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/cfd916b8-0d40-401d-b967-86b08f49eaed_disk">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/cfd916b8-0d40-401d-b967-86b08f49eaed_disk.config">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:3d:51:96"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <target dev="tapffc6313d-bd"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/cfd916b8-0d40-401d-b967-86b08f49eaed/console.log" append="off"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:08:35 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:08:35 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:08:35 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:08:35 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.945 226239 DEBUG nova.compute.manager [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Preparing to wait for external event network-vif-plugged-ffc6313d-bd98-4ce5-b22e-8e055d354e40 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.946 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.946 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.946 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.947 226239 DEBUG nova.virt.libvirt.vif [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:08:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-0-1286502888',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-0-1286502888',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ge',id=213,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOzLIomxM/9LD6pUmebgfRrGKD5wh7paXInFcqd2L0Y2NS24kEHuU2xVFiRKZyesRVQNhc+hG031RV8i0F2/Jgjlcab+v9GRiy/RiVhAhjy1pjZJBvE/J+e5Z7tpOEKxw==',key_name='tempest-TestSecurityGroupsBasicOps-542781411',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-t6fig6iy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:08:26Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=cfd916b8-0d40-401d-b967-86b08f49eaed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "address": "fa:16:3e:3d:51:96", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc6313d-bd", "ovs_interfaceid": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.947 226239 DEBUG nova.network.os_vif_util [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "address": "fa:16:3e:3d:51:96", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc6313d-bd", "ovs_interfaceid": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.948 226239 DEBUG nova.network.os_vif_util [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:51:96,bridge_name='br-int',has_traffic_filtering=True,id=ffc6313d-bd98-4ce5-b22e-8e055d354e40,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc6313d-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.948 226239 DEBUG os_vif [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:51:96,bridge_name='br-int',has_traffic_filtering=True,id=ffc6313d-bd98-4ce5-b22e-8e055d354e40,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc6313d-bd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.949 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.949 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.949 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.952 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.953 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapffc6313d-bd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.953 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapffc6313d-bd, col_values=(('external_ids', {'iface-id': 'ffc6313d-bd98-4ce5-b22e-8e055d354e40', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3d:51:96', 'vm-uuid': 'cfd916b8-0d40-401d-b967-86b08f49eaed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.954 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:35 np0005603623 NetworkManager[48970]: <info>  [1769850515.9556] manager: (tapffc6313d-bd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/417)
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.957 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.958 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:35 np0005603623 nova_compute[226235]: 2026-01-31 09:08:35.959 226239 INFO os_vif [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:51:96,bridge_name='br-int',has_traffic_filtering=True,id=ffc6313d-bd98-4ce5-b22e-8e055d354e40,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc6313d-bd')#033[00m
Jan 31 04:08:36 np0005603623 nova_compute[226235]: 2026-01-31 09:08:36.029 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:08:36 np0005603623 nova_compute[226235]: 2026-01-31 09:08:36.030 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:08:36 np0005603623 nova_compute[226235]: 2026-01-31 09:08:36.030 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No VIF found with MAC fa:16:3e:3d:51:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:08:36 np0005603623 nova_compute[226235]: 2026-01-31 09:08:36.030 226239 INFO nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Using config drive#033[00m
Jan 31 04:08:36 np0005603623 nova_compute[226235]: 2026-01-31 09:08:36.052 226239 DEBUG nova.storage.rbd_utils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image cfd916b8-0d40-401d-b967-86b08f49eaed_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:08:36 np0005603623 nova_compute[226235]: 2026-01-31 09:08:36.931 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:36 np0005603623 nova_compute[226235]: 2026-01-31 09:08:36.995 226239 INFO nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Creating config drive at /var/lib/nova/instances/cfd916b8-0d40-401d-b967-86b08f49eaed/disk.config#033[00m
Jan 31 04:08:36 np0005603623 nova_compute[226235]: 2026-01-31 09:08:36.998 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cfd916b8-0d40-401d-b967-86b08f49eaed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpkoz74rfl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:37 np0005603623 nova_compute[226235]: 2026-01-31 09:08:37.117 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cfd916b8-0d40-401d-b967-86b08f49eaed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpkoz74rfl" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:37 np0005603623 nova_compute[226235]: 2026-01-31 09:08:37.141 226239 DEBUG nova.storage.rbd_utils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image cfd916b8-0d40-401d-b967-86b08f49eaed_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:08:37 np0005603623 nova_compute[226235]: 2026-01-31 09:08:37.144 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cfd916b8-0d40-401d-b967-86b08f49eaed/disk.config cfd916b8-0d40-401d-b967-86b08f49eaed_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:08:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:37.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:37.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:37 np0005603623 nova_compute[226235]: 2026-01-31 09:08:37.889 226239 DEBUG oslo_concurrency.processutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cfd916b8-0d40-401d-b967-86b08f49eaed/disk.config cfd916b8-0d40-401d-b967-86b08f49eaed_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.744s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:37 np0005603623 nova_compute[226235]: 2026-01-31 09:08:37.890 226239 INFO nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Deleting local config drive /var/lib/nova/instances/cfd916b8-0d40-401d-b967-86b08f49eaed/disk.config because it was imported into RBD.#033[00m
Jan 31 04:08:37 np0005603623 kernel: tapffc6313d-bd: entered promiscuous mode
Jan 31 04:08:37 np0005603623 NetworkManager[48970]: <info>  [1769850517.9277] manager: (tapffc6313d-bd): new Tun device (/org/freedesktop/NetworkManager/Devices/418)
Jan 31 04:08:37 np0005603623 nova_compute[226235]: 2026-01-31 09:08:37.929 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:37 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:37Z|00884|binding|INFO|Claiming lport ffc6313d-bd98-4ce5-b22e-8e055d354e40 for this chassis.
Jan 31 04:08:37 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:37Z|00885|binding|INFO|ffc6313d-bd98-4ce5-b22e-8e055d354e40: Claiming fa:16:3e:3d:51:96 10.100.0.9
Jan 31 04:08:37 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:37Z|00886|binding|INFO|Setting lport ffc6313d-bd98-4ce5-b22e-8e055d354e40 ovn-installed in OVS
Jan 31 04:08:37 np0005603623 nova_compute[226235]: 2026-01-31 09:08:37.937 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:37 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:37Z|00887|binding|INFO|Setting lport ffc6313d-bd98-4ce5-b22e-8e055d354e40 up in Southbound
Jan 31 04:08:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:37.940 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:51:96 10.100.0.9'], port_security=['fa:16:3e:3d:51:96 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cfd916b8-0d40-401d-b967-86b08f49eaed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f00beb61-464a-48b1-b739-7a38012ca3e6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ea5e5c3-41ce-42a3-ba60-7739a7d8baec, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ffc6313d-bd98-4ce5-b22e-8e055d354e40) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:08:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:37.941 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ffc6313d-bd98-4ce5-b22e-8e055d354e40 in datapath a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 bound to our chassis#033[00m
Jan 31 04:08:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:37.943 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98#033[00m
Jan 31 04:08:37 np0005603623 systemd-udevd[326338]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:08:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:37.952 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ef04a3b3-90bb-4ee5-a254-885172836ba1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:37.953 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa43e5cb4-61 in ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:08:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:37.954 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa43e5cb4-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:08:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:37.954 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6637b180-5c59-41df-8f20-ab9ea1fc0f89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:37 np0005603623 systemd-machined[194379]: New machine qemu-98-instance-000000d5.
Jan 31 04:08:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:37.955 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[22a5c45c-4338-428d-a6ac-b2398364c079]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:37 np0005603623 NetworkManager[48970]: <info>  [1769850517.9647] device (tapffc6313d-bd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:08:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:37.965 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[c9154805-8487-4748-a7a7-7532825ded48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:37 np0005603623 NetworkManager[48970]: <info>  [1769850517.9662] device (tapffc6313d-bd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:08:37 np0005603623 systemd[1]: Started Virtual Machine qemu-98-instance-000000d5.
Jan 31 04:08:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:37.974 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca3cbdf-cf52-45c8-b09f-e44a488deb34]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:37 np0005603623 nova_compute[226235]: 2026-01-31 09:08:37.996 226239 DEBUG nova.network.neutron [req-7b5908cb-1be7-4966-8408-357b785c2b82 req-f83fcc4f-5d05-4ece-a3fe-ecce212e42cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Updated VIF entry in instance network info cache for port ffc6313d-bd98-4ce5-b22e-8e055d354e40. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:08:37 np0005603623 nova_compute[226235]: 2026-01-31 09:08:37.996 226239 DEBUG nova.network.neutron [req-7b5908cb-1be7-4966-8408-357b785c2b82 req-f83fcc4f-5d05-4ece-a3fe-ecce212e42cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Updating instance_info_cache with network_info: [{"id": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "address": "fa:16:3e:3d:51:96", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc6313d-bd", "ovs_interfaceid": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:08:37 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:37.997 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[98817910-7934-4bf7-8122-2cda8e18285d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.001 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[67aa8938-41c2-4658-9031-81d8366e8c37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:38 np0005603623 NetworkManager[48970]: <info>  [1769850518.0023] manager: (tapa43e5cb4-60): new Veth device (/org/freedesktop/NetworkManager/Devices/419)
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.018 226239 DEBUG oslo_concurrency.lockutils [req-7b5908cb-1be7-4966-8408-357b785c2b82 req-f83fcc4f-5d05-4ece-a3fe-ecce212e42cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-cfd916b8-0d40-401d-b967-86b08f49eaed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.022 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ba54b6-6746-4ee4-b8a0-91f8ef431fe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.025 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[805f9dd8-9afb-4a89-9279-26f8afc91c89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:38 np0005603623 NetworkManager[48970]: <info>  [1769850518.0395] device (tapa43e5cb4-60): carrier: link connected
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.043 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[e728d75d-cbaf-4d29-886c-2741c7385a6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.056 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[874c1119-cde8-4f4d-a869-60a5ffd20f59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa43e5cb4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:22:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 260], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 965846, 'reachable_time': 27184, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 326370, 'error': None, 'target': 'ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.067 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[caff224b-e83c-4e74-924e-f737904b3220]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feff:227e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 965846, 'tstamp': 965846}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 326371, 'error': None, 'target': 'ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.078 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b28837c6-7768-43c5-91c0-6e236e3fef5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa43e5cb4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:22:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 260], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 965846, 'reachable_time': 27184, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 326372, 'error': None, 'target': 'ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.099 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2fb46b27-58a6-445a-b707-87c7f9e619f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.140 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1c925221-366d-4dd7-8900-b37f09da6e44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.142 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa43e5cb4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.142 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.142 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa43e5cb4-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:08:38 np0005603623 NetworkManager[48970]: <info>  [1769850518.1449] manager: (tapa43e5cb4-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/420)
Jan 31 04:08:38 np0005603623 kernel: tapa43e5cb4-60: entered promiscuous mode
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.144 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.146 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa43e5cb4-60, col_values=(('external_ids', {'iface-id': 'b8c8a7b7-27e3-462c-bcf8-67bad4f07d3a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.147 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:38 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:38Z|00888|binding|INFO|Releasing lport b8c8a7b7-27e3-462c-bcf8-67bad4f07d3a from this chassis (sb_readonly=0)
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.148 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.153 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f497b18e-0fb6-4482-962d-770fa9621a70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.153 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.154 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98.pid.haproxy
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:08:38 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:38.154 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'env', 'PROCESS_TAG=haproxy-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.218 226239 DEBUG nova.compute.manager [req-e33bbf41-83f2-4680-97e6-df78b424f306 req-22948084-c2f7-4bf2-bbfb-64990e47cf27 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Received event network-vif-plugged-ffc6313d-bd98-4ce5-b22e-8e055d354e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.218 226239 DEBUG oslo_concurrency.lockutils [req-e33bbf41-83f2-4680-97e6-df78b424f306 req-22948084-c2f7-4bf2-bbfb-64990e47cf27 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.219 226239 DEBUG oslo_concurrency.lockutils [req-e33bbf41-83f2-4680-97e6-df78b424f306 req-22948084-c2f7-4bf2-bbfb-64990e47cf27 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.219 226239 DEBUG oslo_concurrency.lockutils [req-e33bbf41-83f2-4680-97e6-df78b424f306 req-22948084-c2f7-4bf2-bbfb-64990e47cf27 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.220 226239 DEBUG nova.compute.manager [req-e33bbf41-83f2-4680-97e6-df78b424f306 req-22948084-c2f7-4bf2-bbfb-64990e47cf27 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Processing event network-vif-plugged-ffc6313d-bd98-4ce5-b22e-8e055d354e40 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:08:38 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:08:38 np0005603623 podman[326404]: 2026-01-31 09:08:38.439205223 +0000 UTC m=+0.017052626 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:08:38 np0005603623 podman[326404]: 2026-01-31 09:08:38.586196125 +0000 UTC m=+0.164043508 container create 053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 04:08:38 np0005603623 systemd[1]: Started libpod-conmon-053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254.scope.
Jan 31 04:08:38 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:08:38 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0397c701295319e8bd4a50316a402429ff82ddc753c8611f9ea3c3222f876303/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:08:38 np0005603623 podman[326404]: 2026-01-31 09:08:38.725881537 +0000 UTC m=+0.303728940 container init 053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 04:08:38 np0005603623 podman[326404]: 2026-01-31 09:08:38.731796332 +0000 UTC m=+0.309643715 container start 053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:08:38 np0005603623 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[326421]: [NOTICE]   (326443) : New worker (326445) forked
Jan 31 04:08:38 np0005603623 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[326421]: [NOTICE]   (326443) : Loading success.
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.961 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850518.9611661, cfd916b8-0d40-401d-b967-86b08f49eaed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.962 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] VM Started (Lifecycle Event)#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.964 226239 DEBUG nova.compute.manager [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.967 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.970 226239 INFO nova.virt.libvirt.driver [-] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Instance spawned successfully.#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.970 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.983 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.986 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.997 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.998 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.998 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.998 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.999 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:08:38 np0005603623 nova_compute[226235]: 2026-01-31 09:08:38.999 226239 DEBUG nova.virt.libvirt.driver [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.012 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.013 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850518.961313, cfd916b8-0d40-401d-b967-86b08f49eaed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.013 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.037 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.039 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850518.9660769, cfd916b8-0d40-401d-b967-86b08f49eaed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.040 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.064 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.066 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.074 226239 INFO nova.compute.manager [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Took 12.52 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.075 226239 DEBUG nova.compute.manager [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.088 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.155 226239 INFO nova.compute.manager [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Took 14.17 seconds to build instance.#033[00m
Jan 31 04:08:39 np0005603623 nova_compute[226235]: 2026-01-31 09:08:39.177 226239 DEBUG oslo_concurrency.lockutils [None req-cf358187-2efd-48d6-ad33-8f69d600d193 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:08:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:39.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:08:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:39.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:40 np0005603623 nova_compute[226235]: 2026-01-31 09:08:40.392 226239 DEBUG nova.compute.manager [req-7e5f6481-9f88-4b3f-a1ac-b8d073e5c311 req-4d921c04-c562-44e6-91ab-229de6b544a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Received event network-vif-plugged-ffc6313d-bd98-4ce5-b22e-8e055d354e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:40 np0005603623 nova_compute[226235]: 2026-01-31 09:08:40.392 226239 DEBUG oslo_concurrency.lockutils [req-7e5f6481-9f88-4b3f-a1ac-b8d073e5c311 req-4d921c04-c562-44e6-91ab-229de6b544a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:08:40 np0005603623 nova_compute[226235]: 2026-01-31 09:08:40.393 226239 DEBUG oslo_concurrency.lockutils [req-7e5f6481-9f88-4b3f-a1ac-b8d073e5c311 req-4d921c04-c562-44e6-91ab-229de6b544a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:08:40 np0005603623 nova_compute[226235]: 2026-01-31 09:08:40.393 226239 DEBUG oslo_concurrency.lockutils [req-7e5f6481-9f88-4b3f-a1ac-b8d073e5c311 req-4d921c04-c562-44e6-91ab-229de6b544a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:08:40 np0005603623 nova_compute[226235]: 2026-01-31 09:08:40.393 226239 DEBUG nova.compute.manager [req-7e5f6481-9f88-4b3f-a1ac-b8d073e5c311 req-4d921c04-c562-44e6-91ab-229de6b544a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] No waiting events found dispatching network-vif-plugged-ffc6313d-bd98-4ce5-b22e-8e055d354e40 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 04:08:40 np0005603623 nova_compute[226235]: 2026-01-31 09:08:40.393 226239 WARNING nova.compute.manager [req-7e5f6481-9f88-4b3f-a1ac-b8d073e5c311 req-4d921c04-c562-44e6-91ab-229de6b544a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Received unexpected event network-vif-plugged-ffc6313d-bd98-4ce5-b22e-8e055d354e40 for instance with vm_state active and task_state None.
Jan 31 04:08:40 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:40Z|00889|binding|INFO|Releasing lport b8c8a7b7-27e3-462c-bcf8-67bad4f07d3a from this chassis (sb_readonly=0)
Jan 31 04:08:40 np0005603623 nova_compute[226235]: 2026-01-31 09:08:40.891 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:40 np0005603623 nova_compute[226235]: 2026-01-31 09:08:40.954 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:41.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:08:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:41.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:08:41 np0005603623 nova_compute[226235]: 2026-01-31 09:08:41.933 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:08:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:43.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:08:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:43.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:08:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/887187949' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:08:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:08:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/887187949' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:08:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e412 e412: 3 total, 3 up, 3 in
Jan 31 04:08:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:45.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:45 np0005603623 nova_compute[226235]: 2026-01-31 09:08:45.454 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850510.453418, bd9f81d8-7c9d-42e3-969b-7b404bfaff36 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 04:08:45 np0005603623 nova_compute[226235]: 2026-01-31 09:08:45.455 226239 INFO nova.compute.manager [-] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] VM Stopped (Lifecycle Event)
Jan 31 04:08:45 np0005603623 nova_compute[226235]: 2026-01-31 09:08:45.478 226239 DEBUG nova.compute.manager [None req-c8ea02b0-3bd1-472d-b776-23fe78e360c1 - - - - - -] [instance: bd9f81d8-7c9d-42e3-969b-7b404bfaff36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 04:08:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:45.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:45 np0005603623 nova_compute[226235]: 2026-01-31 09:08:45.956 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:46 np0005603623 nova_compute[226235]: 2026-01-31 09:08:46.158 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:46 np0005603623 nova_compute[226235]: 2026-01-31 09:08:46.936 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e412 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:47.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:47.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:49.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:08:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:49.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:08:50 np0005603623 nova_compute[226235]: 2026-01-31 09:08:50.959 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:51 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:51Z|00112|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3d:51:96 10.100.0.9
Jan 31 04:08:51 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:51Z|00113|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3d:51:96 10.100.0.9
Jan 31 04:08:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:08:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:51.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:08:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:51.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:51 np0005603623 nova_compute[226235]: 2026-01-31 09:08:51.660 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:51 np0005603623 nova_compute[226235]: 2026-01-31 09:08:51.947 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e412 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 e413: 3 total, 3 up, 3 in
Jan 31 04:08:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:08:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:53.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:08:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:53.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:08:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:55.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:08:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:55.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:55 np0005603623 nova_compute[226235]: 2026-01-31 09:08:55.962 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.298 226239 DEBUG oslo_concurrency.lockutils [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "cfd916b8-0d40-401d-b967-86b08f49eaed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.298 226239 DEBUG oslo_concurrency.lockutils [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.298 226239 DEBUG oslo_concurrency.lockutils [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.299 226239 DEBUG oslo_concurrency.lockutils [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.299 226239 DEBUG oslo_concurrency.lockutils [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.300 226239 INFO nova.compute.manager [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Terminating instance
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.301 226239 DEBUG nova.compute.manager [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 04:08:56 np0005603623 kernel: tapffc6313d-bd (unregistering): left promiscuous mode
Jan 31 04:08:56 np0005603623 NetworkManager[48970]: <info>  [1769850536.4433] device (tapffc6313d-bd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.442 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:56 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:56Z|00890|binding|INFO|Releasing lport ffc6313d-bd98-4ce5-b22e-8e055d354e40 from this chassis (sb_readonly=0)
Jan 31 04:08:56 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:56Z|00891|binding|INFO|Setting lport ffc6313d-bd98-4ce5-b22e-8e055d354e40 down in Southbound
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.448 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:56 np0005603623 ovn_controller[133449]: 2026-01-31T09:08:56Z|00892|binding|INFO|Removing iface tapffc6313d-bd ovn-installed in OVS
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.450 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.456 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.461 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:51:96 10.100.0.9'], port_security=['fa:16:3e:3d:51:96 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cfd916b8-0d40-401d-b967-86b08f49eaed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f00beb61-464a-48b1-b739-7a38012ca3e6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ea5e5c3-41ce-42a3-ba60-7739a7d8baec, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=ffc6313d-bd98-4ce5-b22e-8e055d354e40) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.464 143258 INFO neutron.agent.ovn.metadata.agent [-] Port ffc6313d-bd98-4ce5-b22e-8e055d354e40 in datapath a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 unbound from our chassis
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.466 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.467 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[eb1d19af-f850-4071-af77-8ff8f91039cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.468 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 namespace which is not needed anymore
Jan 31 04:08:56 np0005603623 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000d5.scope: Deactivated successfully.
Jan 31 04:08:56 np0005603623 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000d5.scope: Consumed 12.459s CPU time.
Jan 31 04:08:56 np0005603623 systemd-machined[194379]: Machine qemu-98-instance-000000d5 terminated.
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.531 226239 INFO nova.virt.libvirt.driver [-] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Instance destroyed successfully.
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.532 226239 DEBUG nova.objects.instance [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'resources' on Instance uuid cfd916b8-0d40-401d-b967-86b08f49eaed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.559 226239 DEBUG nova.virt.libvirt.vif [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:08:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-0-1286502888',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-0-1286502888',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ge',id=213,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOzLIomxM/9LD6pUmebgfRrGKD5wh7paXInFcqd2L0Y2NS24kEHuU2xVFiRKZyesRVQNhc+hG031RV8i0F2/Jgjlcab+v9GRiy/RiVhAhjy1pjZJBvE/J+e5Z7tpOEKxw==',key_name='tempest-TestSecurityGroupsBasicOps-542781411',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:08:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-t6fig6iy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:08:39Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=cfd916b8-0d40-401d-b967-86b08f49eaed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "address": "fa:16:3e:3d:51:96", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc6313d-bd", "ovs_interfaceid": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.559 226239 DEBUG nova.network.os_vif_util [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "address": "fa:16:3e:3d:51:96", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc6313d-bd", "ovs_interfaceid": "ffc6313d-bd98-4ce5-b22e-8e055d354e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.560 226239 DEBUG nova.network.os_vif_util [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:51:96,bridge_name='br-int',has_traffic_filtering=True,id=ffc6313d-bd98-4ce5-b22e-8e055d354e40,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc6313d-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.561 226239 DEBUG os_vif [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:51:96,bridge_name='br-int',has_traffic_filtering=True,id=ffc6313d-bd98-4ce5-b22e-8e055d354e40,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc6313d-bd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.562 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.562 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapffc6313d-bd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.566 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.568 226239 INFO os_vif [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:51:96,bridge_name='br-int',has_traffic_filtering=True,id=ffc6313d-bd98-4ce5-b22e-8e055d354e40,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc6313d-bd')
Jan 31 04:08:56 np0005603623 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[326421]: [NOTICE]   (326443) : haproxy version is 2.8.14-c23fe91
Jan 31 04:08:56 np0005603623 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[326421]: [NOTICE]   (326443) : path to executable is /usr/sbin/haproxy
Jan 31 04:08:56 np0005603623 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[326421]: [ALERT]    (326443) : Current worker (326445) exited with code 143 (Terminated)
Jan 31 04:08:56 np0005603623 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[326421]: [WARNING]  (326443) : All workers exited. Exiting... (0)
Jan 31 04:08:56 np0005603623 systemd[1]: libpod-053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254.scope: Deactivated successfully.
Jan 31 04:08:56 np0005603623 podman[326571]: 2026-01-31 09:08:56.583785261 +0000 UTC m=+0.042803174 container died 053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 04:08:56 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254-userdata-shm.mount: Deactivated successfully.
Jan 31 04:08:56 np0005603623 systemd[1]: var-lib-containers-storage-overlay-0397c701295319e8bd4a50316a402429ff82ddc753c8611f9ea3c3222f876303-merged.mount: Deactivated successfully.
Jan 31 04:08:56 np0005603623 podman[326571]: 2026-01-31 09:08:56.613974188 +0000 UTC m=+0.072992061 container cleanup 053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:08:56 np0005603623 systemd[1]: libpod-conmon-053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254.scope: Deactivated successfully.
Jan 31 04:08:56 np0005603623 podman[326621]: 2026-01-31 09:08:56.66916053 +0000 UTC m=+0.038804060 container remove 053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.672 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[31f298eb-402a-4c97-9e17-c3eef4d9be92]: (4, ('Sat Jan 31 09:08:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 (053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254)\n053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254\nSat Jan 31 09:08:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 (053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254)\n053de6ad45c3d7cd3a1971017a60eae217d9dbd5d9de12458edfb3d206a2a254\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.674 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2b9d162b-b14d-4bc8-bc0c-80457e6ed00e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.675 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa43e5cb4-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.710 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:56 np0005603623 kernel: tapa43e5cb4-60: left promiscuous mode
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.716 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.720 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[753e2c23-afa9-4d71-8bf4-85a968135c8c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.740 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b1681910-dba0-4aad-97fd-51c3b5e3b456]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.741 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c9f470ab-41ea-457a-9f6d-b53786069b82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.753 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cd895a28-f0a6-41fb-b078-bbc7c464f3d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 965842, 'reachable_time': 24243, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 326636, 'error': None, 'target': 'ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.755 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:08:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:08:56.755 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[286c9de5-6983-4a9e-b70e-b220bdd2736f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:56 np0005603623 systemd[1]: run-netns-ovnmeta\x2da43e5cb4\x2d640f\x2d4ef5\x2d9c18\x2d9b6b0fe32a98.mount: Deactivated successfully.
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.879 226239 DEBUG nova.compute.manager [req-b3d99ef5-a3f9-47dd-86eb-db616c281740 req-4420a840-3c0b-4a2c-9944-28ebb2cc0277 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Received event network-vif-unplugged-ffc6313d-bd98-4ce5-b22e-8e055d354e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.880 226239 DEBUG oslo_concurrency.lockutils [req-b3d99ef5-a3f9-47dd-86eb-db616c281740 req-4420a840-3c0b-4a2c-9944-28ebb2cc0277 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.880 226239 DEBUG oslo_concurrency.lockutils [req-b3d99ef5-a3f9-47dd-86eb-db616c281740 req-4420a840-3c0b-4a2c-9944-28ebb2cc0277 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.880 226239 DEBUG oslo_concurrency.lockutils [req-b3d99ef5-a3f9-47dd-86eb-db616c281740 req-4420a840-3c0b-4a2c-9944-28ebb2cc0277 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.880 226239 DEBUG nova.compute.manager [req-b3d99ef5-a3f9-47dd-86eb-db616c281740 req-4420a840-3c0b-4a2c-9944-28ebb2cc0277 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] No waiting events found dispatching network-vif-unplugged-ffc6313d-bd98-4ce5-b22e-8e055d354e40 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.880 226239 DEBUG nova.compute.manager [req-b3d99ef5-a3f9-47dd-86eb-db616c281740 req-4420a840-3c0b-4a2c-9944-28ebb2cc0277 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Received event network-vif-unplugged-ffc6313d-bd98-4ce5-b22e-8e055d354e40 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:08:56 np0005603623 nova_compute[226235]: 2026-01-31 09:08:56.950 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:57 np0005603623 nova_compute[226235]: 2026-01-31 09:08:57.299 226239 INFO nova.virt.libvirt.driver [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Deleting instance files /var/lib/nova/instances/cfd916b8-0d40-401d-b967-86b08f49eaed_del#033[00m
Jan 31 04:08:57 np0005603623 nova_compute[226235]: 2026-01-31 09:08:57.300 226239 INFO nova.virt.libvirt.driver [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Deletion of /var/lib/nova/instances/cfd916b8-0d40-401d-b967-86b08f49eaed_del complete#033[00m
Jan 31 04:08:57 np0005603623 nova_compute[226235]: 2026-01-31 09:08:57.364 226239 INFO nova.compute.manager [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Took 1.06 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:08:57 np0005603623 nova_compute[226235]: 2026-01-31 09:08:57.364 226239 DEBUG oslo.service.loopingcall [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:08:57 np0005603623 nova_compute[226235]: 2026-01-31 09:08:57.364 226239 DEBUG nova.compute.manager [-] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:08:57 np0005603623 nova_compute[226235]: 2026-01-31 09:08:57.365 226239 DEBUG nova.network.neutron [-] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:08:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:57.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:57.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.026 226239 DEBUG nova.compute.manager [req-dcd6ce75-a528-483f-b820-414561b44305 req-61d011fd-5054-4a4c-becf-9f258e57b1ff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Received event network-vif-plugged-ffc6313d-bd98-4ce5-b22e-8e055d354e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.027 226239 DEBUG oslo_concurrency.lockutils [req-dcd6ce75-a528-483f-b820-414561b44305 req-61d011fd-5054-4a4c-becf-9f258e57b1ff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.028 226239 DEBUG oslo_concurrency.lockutils [req-dcd6ce75-a528-483f-b820-414561b44305 req-61d011fd-5054-4a4c-becf-9f258e57b1ff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.028 226239 DEBUG oslo_concurrency.lockutils [req-dcd6ce75-a528-483f-b820-414561b44305 req-61d011fd-5054-4a4c-becf-9f258e57b1ff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.028 226239 DEBUG nova.compute.manager [req-dcd6ce75-a528-483f-b820-414561b44305 req-61d011fd-5054-4a4c-becf-9f258e57b1ff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] No waiting events found dispatching network-vif-plugged-ffc6313d-bd98-4ce5-b22e-8e055d354e40 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.029 226239 WARNING nova.compute.manager [req-dcd6ce75-a528-483f-b820-414561b44305 req-61d011fd-5054-4a4c-becf-9f258e57b1ff fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Received unexpected event network-vif-plugged-ffc6313d-bd98-4ce5-b22e-8e055d354e40 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.361 226239 DEBUG nova.network.neutron [-] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.392 226239 INFO nova.compute.manager [-] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Took 2.03 seconds to deallocate network for instance.#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.459 226239 DEBUG oslo_concurrency.lockutils [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.460 226239 DEBUG oslo_concurrency.lockutils [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:59.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.464 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.510 226239 DEBUG oslo_concurrency.processutils [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.528 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:08:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:59.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:59 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:08:59 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3148587545' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.918 226239 DEBUG oslo_concurrency.processutils [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.924 226239 DEBUG nova.compute.provider_tree [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.945 226239 DEBUG nova.scheduler.client.report [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:08:59 np0005603623 nova_compute[226235]: 2026-01-31 09:08:59.971 226239 DEBUG oslo_concurrency.lockutils [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.511s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:59 np0005603623 podman[326660]: 2026-01-31 09:08:59.9754212 +0000 UTC m=+0.072881077 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:08:59 np0005603623 podman[326661]: 2026-01-31 09:08:59.975392299 +0000 UTC m=+0.071618138 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Jan 31 04:09:00 np0005603623 nova_compute[226235]: 2026-01-31 09:09:00.005 226239 INFO nova.scheduler.client.report [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Deleted allocations for instance cfd916b8-0d40-401d-b967-86b08f49eaed#033[00m
Jan 31 04:09:00 np0005603623 nova_compute[226235]: 2026-01-31 09:09:00.074 226239 DEBUG oslo_concurrency.lockutils [None req-c6c76d22-9bcf-4b2b-9833-8cdf9fd444a0 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "cfd916b8-0d40-401d-b967-86b08f49eaed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:00 np0005603623 nova_compute[226235]: 2026-01-31 09:09:00.589 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "0814449a-7dc1-4f82-a204-a793c4867d69" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:00 np0005603623 nova_compute[226235]: 2026-01-31 09:09:00.590 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:00 np0005603623 nova_compute[226235]: 2026-01-31 09:09:00.610 226239 DEBUG nova.compute.manager [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:09:00 np0005603623 nova_compute[226235]: 2026-01-31 09:09:00.700 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:00 np0005603623 nova_compute[226235]: 2026-01-31 09:09:00.701 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:00 np0005603623 nova_compute[226235]: 2026-01-31 09:09:00.707 226239 DEBUG nova.virt.hardware [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:09:00 np0005603623 nova_compute[226235]: 2026-01-31 09:09:00.708 226239 INFO nova.compute.claims [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 04:09:00 np0005603623 nova_compute[226235]: 2026-01-31 09:09:00.853 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.174 226239 DEBUG nova.compute.manager [req-fb0aa942-ad03-46e5-8e98-6963687e8c21 req-4a1aa2a1-f18e-4b03-a67f-58625e11f344 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Received event network-vif-deleted-ffc6313d-bd98-4ce5-b22e-8e055d354e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:01 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:09:01 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2968019034' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.257 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.261 226239 DEBUG nova.compute.provider_tree [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.280 226239 DEBUG nova.scheduler.client.report [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.313 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.314 226239 DEBUG nova.compute.manager [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.368 226239 DEBUG nova.compute.manager [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.368 226239 DEBUG nova.network.neutron [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.397 226239 INFO nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.433 226239 DEBUG nova.compute.manager [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 04:09:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:01.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.576 226239 DEBUG nova.compute.manager [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.579 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.579 226239 INFO nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Creating image(s)
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.625 226239 DEBUG nova.storage.rbd_utils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0814449a-7dc1-4f82-a204-a793c4867d69_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 04:09:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:01.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.671 226239 DEBUG nova.storage.rbd_utils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0814449a-7dc1-4f82-a204-a793c4867d69_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.705 226239 DEBUG nova.storage.rbd_utils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0814449a-7dc1-4f82-a204-a793c4867d69_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.710 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.735 226239 DEBUG nova.policy [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd442c7ba12ed444ca6d4dcc5cfd36150', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.738 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.775 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.776 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.777 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.777 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.809 226239 DEBUG nova.storage.rbd_utils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0814449a-7dc1-4f82-a204-a793c4867d69_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.814 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 0814449a-7dc1-4f82-a204-a793c4867d69_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:09:01 np0005603623 nova_compute[226235]: 2026-01-31 09:09:01.952 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:02 np0005603623 nova_compute[226235]: 2026-01-31 09:09:02.480 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 0814449a-7dc1-4f82-a204-a793c4867d69_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.666s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:09:02 np0005603623 nova_compute[226235]: 2026-01-31 09:09:02.547 226239 DEBUG nova.storage.rbd_utils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] resizing rbd image 0814449a-7dc1-4f82-a204-a793c4867d69_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 04:09:02 np0005603623 nova_compute[226235]: 2026-01-31 09:09:02.870 226239 DEBUG nova.objects.instance [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'migration_context' on Instance uuid 0814449a-7dc1-4f82-a204-a793c4867d69 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 04:09:02 np0005603623 nova_compute[226235]: 2026-01-31 09:09:02.887 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 04:09:02 np0005603623 nova_compute[226235]: 2026-01-31 09:09:02.888 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Ensure instance console log exists: /var/lib/nova/instances/0814449a-7dc1-4f82-a204-a793c4867d69/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 04:09:02 np0005603623 nova_compute[226235]: 2026-01-31 09:09:02.888 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:09:02 np0005603623 nova_compute[226235]: 2026-01-31 09:09:02.889 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:09:02 np0005603623 nova_compute[226235]: 2026-01-31 09:09:02.889 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:09:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:03.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:03.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:04 np0005603623 nova_compute[226235]: 2026-01-31 09:09:04.192 226239 DEBUG nova.network.neutron [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Successfully created port: 15887f40-38e5-4ffb-bb22-fe2276411b0d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 04:09:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:05.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #178. Immutable memtables: 0.
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.492982) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 113] Flushing memtable with next log file: 178
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850545493107, "job": 113, "event": "flush_started", "num_memtables": 1, "num_entries": 1814, "num_deletes": 252, "total_data_size": 4119331, "memory_usage": 4169312, "flush_reason": "Manual Compaction"}
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 113] Level-0 flush table #179: started
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850545508429, "cf_name": "default", "job": 113, "event": "table_file_creation", "file_number": 179, "file_size": 2706276, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 85769, "largest_seqno": 87577, "table_properties": {"data_size": 2698889, "index_size": 4329, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15990, "raw_average_key_size": 20, "raw_value_size": 2683866, "raw_average_value_size": 3405, "num_data_blocks": 190, "num_entries": 788, "num_filter_entries": 788, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850393, "oldest_key_time": 1769850393, "file_creation_time": 1769850545, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 179, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 113] Flush lasted 15572 microseconds, and 7524 cpu microseconds.
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.508554) [db/flush_job.cc:967] [default] [JOB 113] Level-0 flush table #179: 2706276 bytes OK
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.508579) [db/memtable_list.cc:519] [default] Level-0 commit table #179 started
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.509748) [db/memtable_list.cc:722] [default] Level-0 commit table #179: memtable #1 done
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.509769) EVENT_LOG_v1 {"time_micros": 1769850545509762, "job": 113, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.509791) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 113] Try to delete WAL files size 4111133, prev total WAL file size 4111133, number of live WAL files 2.
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000175.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.510665) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037353330' seq:72057594037927935, type:22 .. '7061786F730037373832' seq:0, type:0; will stop at (end)
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 114] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 113 Base level 0, inputs: [179(2642KB)], [177(12MB)]
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850545510718, "job": 114, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [179], "files_L6": [177], "score": -1, "input_data_size": 15300231, "oldest_snapshot_seqno": -1}
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 114] Generated table #180: 10677 keys, 13445599 bytes, temperature: kUnknown
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850545600145, "cf_name": "default", "job": 114, "event": "table_file_creation", "file_number": 180, "file_size": 13445599, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13377032, "index_size": 40742, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26757, "raw_key_size": 282193, "raw_average_key_size": 26, "raw_value_size": 13191053, "raw_average_value_size": 1235, "num_data_blocks": 1546, "num_entries": 10677, "num_filter_entries": 10677, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769850545, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 180, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.600412) [db/compaction/compaction_job.cc:1663] [default] [JOB 114] Compacted 1@0 + 1@6 files to L6 => 13445599 bytes
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.601645) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 170.9 rd, 150.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 12.0 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(10.6) write-amplify(5.0) OK, records in: 11198, records dropped: 521 output_compression: NoCompression
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.601661) EVENT_LOG_v1 {"time_micros": 1769850545601653, "job": 114, "event": "compaction_finished", "compaction_time_micros": 89550, "compaction_time_cpu_micros": 23972, "output_level": 6, "num_output_files": 1, "total_output_size": 13445599, "num_input_records": 11198, "num_output_records": 10677, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000179.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850545602000, "job": 114, "event": "table_file_deletion", "file_number": 179}
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000177.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850545603240, "job": 114, "event": "table_file_deletion", "file_number": 177}
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.510611) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.603332) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.603338) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.603340) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.603342) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:09:05 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:09:05.603343) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:09:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:05.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:06 np0005603623 nova_compute[226235]: 2026-01-31 09:09:06.010 226239 DEBUG nova.network.neutron [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Successfully updated port: 15887f40-38e5-4ffb-bb22-fe2276411b0d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 04:09:06 np0005603623 nova_compute[226235]: 2026-01-31 09:09:06.034 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 04:09:06 np0005603623 nova_compute[226235]: 2026-01-31 09:09:06.034 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquired lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 04:09:06 np0005603623 nova_compute[226235]: 2026-01-31 09:09:06.034 226239 DEBUG nova.network.neutron [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 04:09:06 np0005603623 nova_compute[226235]: 2026-01-31 09:09:06.099 226239 DEBUG nova.compute.manager [req-b48e42f2-102b-4880-9998-8966dc801de4 req-086901f0-46ae-4eef-b136-06145c9f7ad1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Received event network-changed-15887f40-38e5-4ffb-bb22-fe2276411b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:09:06 np0005603623 nova_compute[226235]: 2026-01-31 09:09:06.100 226239 DEBUG nova.compute.manager [req-b48e42f2-102b-4880-9998-8966dc801de4 req-086901f0-46ae-4eef-b136-06145c9f7ad1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Refreshing instance network info cache due to event network-changed-15887f40-38e5-4ffb-bb22-fe2276411b0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 04:09:06 np0005603623 nova_compute[226235]: 2026-01-31 09:09:06.100 226239 DEBUG oslo_concurrency.lockutils [req-b48e42f2-102b-4880-9998-8966dc801de4 req-086901f0-46ae-4eef-b136-06145c9f7ad1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 04:09:06 np0005603623 nova_compute[226235]: 2026-01-31 09:09:06.273 226239 DEBUG nova.network.neutron [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 04:09:06 np0005603623 nova_compute[226235]: 2026-01-31 09:09:06.740 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:06 np0005603623 nova_compute[226235]: 2026-01-31 09:09:06.954 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:07.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:07.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.351 226239 DEBUG nova.network.neutron [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Updating instance_info_cache with network_info: [{"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.370 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Releasing lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.371 226239 DEBUG nova.compute.manager [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Instance network_info: |[{"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.372 226239 DEBUG oslo_concurrency.lockutils [req-b48e42f2-102b-4880-9998-8966dc801de4 req-086901f0-46ae-4eef-b136-06145c9f7ad1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.372 226239 DEBUG nova.network.neutron [req-b48e42f2-102b-4880-9998-8966dc801de4 req-086901f0-46ae-4eef-b136-06145c9f7ad1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Refreshing network info cache for port 15887f40-38e5-4ffb-bb22-fe2276411b0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.377 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Start _get_guest_xml network_info=[{"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.383 226239 WARNING nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.389 226239 DEBUG nova.virt.libvirt.host [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.390 226239 DEBUG nova.virt.libvirt.host [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.394 226239 DEBUG nova.virt.libvirt.host [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.395 226239 DEBUG nova.virt.libvirt.host [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.397 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.397 226239 DEBUG nova.virt.hardware [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.398 226239 DEBUG nova.virt.hardware [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.398 226239 DEBUG nova.virt.hardware [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.399 226239 DEBUG nova.virt.hardware [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.399 226239 DEBUG nova.virt.hardware [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.400 226239 DEBUG nova.virt.hardware [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.400 226239 DEBUG nova.virt.hardware [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.401 226239 DEBUG nova.virt.hardware [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.401 226239 DEBUG nova.virt.hardware [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.402 226239 DEBUG nova.virt.hardware [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.402 226239 DEBUG nova.virt.hardware [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.407 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:09:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:09.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:09.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:09 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:09:09 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4279525860' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.852 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.875 226239 DEBUG nova.storage.rbd_utils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0814449a-7dc1-4f82-a204-a793c4867d69_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 04:09:09 np0005603623 nova_compute[226235]: 2026-01-31 09:09:09.878 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:09:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:09:10 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/232391877' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.296 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.297 226239 DEBUG nova.virt.libvirt.vif [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1546550354',display_name='tempest-TestNetworkBasicOps-server-1546550354',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1546550354',id=214,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEaqKioUa1HkPeYonPNuJnlCnVNvcGfuTPmT25tLMWjRfxazsN/y2+GpIz5S9URuuJ+OShDdxnT4urjNzwUEEtbxThOGjGDYWgMQGOYwZTWn6bfH0MXFcray2n4OsFILuQ==',key_name='tempest-TestNetworkBasicOps-330648990',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-l6mo0qox',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:09:01Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=0814449a-7dc1-4f82-a204-a793c4867d69,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.297 226239 DEBUG nova.network.os_vif_util [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.298 226239 DEBUG nova.network.os_vif_util [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:aa:32,bridge_name='br-int',has_traffic_filtering=True,id=15887f40-38e5-4ffb-bb22-fe2276411b0d,network=Network(a79751eb-00bd-46af-a104-84d2e5b78aa7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15887f40-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.299 226239 DEBUG nova.objects.instance [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0814449a-7dc1-4f82-a204-a793c4867d69 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.323 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  <uuid>0814449a-7dc1-4f82-a204-a793c4867d69</uuid>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  <name>instance-000000d6</name>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestNetworkBasicOps-server-1546550354</nova:name>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:09:09</nova:creationTime>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <nova:user uuid="d442c7ba12ed444ca6d4dcc5cfd36150">tempest-TestNetworkBasicOps-104417095-project-member</nova:user>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <nova:project uuid="abf9393aa2b646feb00a3d887a9dee14">tempest-TestNetworkBasicOps-104417095</nova:project>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <nova:port uuid="15887f40-38e5-4ffb-bb22-fe2276411b0d">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <entry name="serial">0814449a-7dc1-4f82-a204-a793c4867d69</entry>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <entry name="uuid">0814449a-7dc1-4f82-a204-a793c4867d69</entry>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/0814449a-7dc1-4f82-a204-a793c4867d69_disk">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/0814449a-7dc1-4f82-a204-a793c4867d69_disk.config">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:de:aa:32"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <target dev="tap15887f40-38"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/0814449a-7dc1-4f82-a204-a793c4867d69/console.log" append="off"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:09:10 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:09:10 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:09:10 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:09:10 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.324 226239 DEBUG nova.compute.manager [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Preparing to wait for external event network-vif-plugged-15887f40-38e5-4ffb-bb22-fe2276411b0d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.325 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.325 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.326 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.326 226239 DEBUG nova.virt.libvirt.vif [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1546550354',display_name='tempest-TestNetworkBasicOps-server-1546550354',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1546550354',id=214,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEaqKioUa1HkPeYonPNuJnlCnVNvcGfuTPmT25tLMWjRfxazsN/y2+GpIz5S9URuuJ+OShDdxnT4urjNzwUEEtbxThOGjGDYWgMQGOYwZTWn6bfH0MXFcray2n4OsFILuQ==',key_name='tempest-TestNetworkBasicOps-330648990',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-l6mo0qox',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:09:01Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=0814449a-7dc1-4f82-a204-a793c4867d69,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.327 226239 DEBUG nova.network.os_vif_util [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.327 226239 DEBUG nova.network.os_vif_util [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:aa:32,bridge_name='br-int',has_traffic_filtering=True,id=15887f40-38e5-4ffb-bb22-fe2276411b0d,network=Network(a79751eb-00bd-46af-a104-84d2e5b78aa7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15887f40-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.328 226239 DEBUG os_vif [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:aa:32,bridge_name='br-int',has_traffic_filtering=True,id=15887f40-38e5-4ffb-bb22-fe2276411b0d,network=Network(a79751eb-00bd-46af-a104-84d2e5b78aa7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15887f40-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.329 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.330 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.330 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.333 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.333 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15887f40-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.334 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap15887f40-38, col_values=(('external_ids', {'iface-id': '15887f40-38e5-4ffb-bb22-fe2276411b0d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:aa:32', 'vm-uuid': '0814449a-7dc1-4f82-a204-a793c4867d69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.336 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:10 np0005603623 NetworkManager[48970]: <info>  [1769850550.3372] manager: (tap15887f40-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/421)
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.340 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.341 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.342 226239 INFO os_vif [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:aa:32,bridge_name='br-int',has_traffic_filtering=True,id=15887f40-38e5-4ffb-bb22-fe2276411b0d,network=Network(a79751eb-00bd-46af-a104-84d2e5b78aa7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15887f40-38')
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.401 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.402 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.402 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No VIF found with MAC fa:16:3e:de:aa:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.403 226239 INFO nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Using config drive
Jan 31 04:09:10 np0005603623 nova_compute[226235]: 2026-01-31 09:09:10.427 226239 DEBUG nova.storage.rbd_utils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0814449a-7dc1-4f82-a204-a793c4867d69_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.265 226239 INFO nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Creating config drive at /var/lib/nova/instances/0814449a-7dc1-4f82-a204-a793c4867d69/disk.config
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.269 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0814449a-7dc1-4f82-a204-a793c4867d69/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp8x1u05ef execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.394 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0814449a-7dc1-4f82-a204-a793c4867d69/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp8x1u05ef" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.420 226239 DEBUG nova.storage.rbd_utils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 0814449a-7dc1-4f82-a204-a793c4867d69_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.424 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0814449a-7dc1-4f82-a204-a793c4867d69/disk.config 0814449a-7dc1-4f82-a204-a793c4867d69_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:09:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:11.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.531 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850536.5299237, cfd916b8-0d40-401d-b967-86b08f49eaed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.531 226239 INFO nova.compute.manager [-] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] VM Stopped (Lifecycle Event)
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.561 226239 DEBUG nova.compute.manager [None req-74537409-310a-4f74-9797-a4191608459b - - - - - -] [instance: cfd916b8-0d40-401d-b967-86b08f49eaed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.569 226239 DEBUG oslo_concurrency.processutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0814449a-7dc1-4f82-a204-a793c4867d69/disk.config 0814449a-7dc1-4f82-a204-a793c4867d69_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.570 226239 INFO nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Deleting local config drive /var/lib/nova/instances/0814449a-7dc1-4f82-a204-a793c4867d69/disk.config because it was imported into RBD.
Jan 31 04:09:11 np0005603623 kernel: tap15887f40-38: entered promiscuous mode
Jan 31 04:09:11 np0005603623 NetworkManager[48970]: <info>  [1769850551.6115] manager: (tap15887f40-38): new Tun device (/org/freedesktop/NetworkManager/Devices/422)
Jan 31 04:09:11 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:11Z|00893|binding|INFO|Claiming lport 15887f40-38e5-4ffb-bb22-fe2276411b0d for this chassis.
Jan 31 04:09:11 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:11Z|00894|binding|INFO|15887f40-38e5-4ffb-bb22-fe2276411b0d: Claiming fa:16:3e:de:aa:32 10.100.0.10
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.612 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.616 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.618 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.626 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:aa:32 10.100.0.10'], port_security=['fa:16:3e:de:aa:32 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '0814449a-7dc1-4f82-a204-a793c4867d69', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a79751eb-00bd-46af-a104-84d2e5b78aa7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '2', 'neutron:security_group_ids': '375fe5bc-fb57-4bdb-9034-2adaae05eb90', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ab4b272-4425-49ee-8131-a3e99987a09a, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=15887f40-38e5-4ffb-bb22-fe2276411b0d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.627 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 15887f40-38e5-4ffb-bb22-fe2276411b0d in datapath a79751eb-00bd-46af-a104-84d2e5b78aa7 bound to our chassis
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.628 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a79751eb-00bd-46af-a104-84d2e5b78aa7
Jan 31 04:09:11 np0005603623 systemd-udevd[327083]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.637 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4f23ca0f-2d42-445e-b151-3d94e62bf321]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.638 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa79751eb-01 in ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.640 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa79751eb-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.640 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[fa5e8b5e-892e-4676-9b6b-5ac7499853fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.641 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[819131b0-2a0d-4329-8984-145cd5dbd20e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:11 np0005603623 NetworkManager[48970]: <info>  [1769850551.6436] device (tap15887f40-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:09:11 np0005603623 NetworkManager[48970]: <info>  [1769850551.6442] device (tap15887f40-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.646 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:11 np0005603623 systemd-machined[194379]: New machine qemu-99-instance-000000d6.
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.651 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[18d79d26-fbee-41b6-9ce4-ec824f248764]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:11 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:11Z|00895|binding|INFO|Setting lport 15887f40-38e5-4ffb-bb22-fe2276411b0d ovn-installed in OVS
Jan 31 04:09:11 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:11Z|00896|binding|INFO|Setting lport 15887f40-38e5-4ffb-bb22-fe2276411b0d up in Southbound
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.654 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:11 np0005603623 systemd[1]: Started Virtual Machine qemu-99-instance-000000d6.
Jan 31 04:09:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:11.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.664 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4e56ea8f-0ea5-4862-aab3-1543803c768e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.684 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[12b8c1d4-8956-4559-af08-1c93d76d8fee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:11 np0005603623 NetworkManager[48970]: <info>  [1769850551.6895] manager: (tapa79751eb-00): new Veth device (/org/freedesktop/NetworkManager/Devices/423)
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.688 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[ec69d306-c6c3-48e1-bf05-a42d6b90c685]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:11 np0005603623 systemd-udevd[327088]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.713 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[95528cf7-8fcd-4786-b794-c3dca260fcfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.716 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[79661b2d-0150-4839-8507-9e95757b8ddc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:11 np0005603623 NetworkManager[48970]: <info>  [1769850551.7339] device (tapa79751eb-00): carrier: link connected
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.739 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[aade689a-b2d9-4758-bffb-f528791bca7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.753 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8f325616-4043-473e-afd1-96105209d940]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa79751eb-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:71:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 263], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 969216, 'reachable_time': 35860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 327117, 'error': None, 'target': 'ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.765 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8946147e-ed8f-45df-85f8-995d2c2cf3b1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6d:71b6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 969216, 'tstamp': 969216}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 327118, 'error': None, 'target': 'ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.785 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0ea28c-0c13-47e7-8df0-e6b6bc603daf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa79751eb-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:71:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 263], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 969216, 'reachable_time': 35860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 327119, 'error': None, 'target': 'ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.816 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[09197aa4-a133-407b-8eef-9bb05df412cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.864 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce1f398-4189-488c-ac6b-2e4aab97628a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.866 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa79751eb-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.866 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.867 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa79751eb-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.869 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:11 np0005603623 NetworkManager[48970]: <info>  [1769850551.8699] manager: (tapa79751eb-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/424)
Jan 31 04:09:11 np0005603623 kernel: tapa79751eb-00: entered promiscuous mode
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.876 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa79751eb-00, col_values=(('external_ids', {'iface-id': '799caae4-3b07-428a-8115-83dc29773dc5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:11 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:11Z|00897|binding|INFO|Releasing lport 799caae4-3b07-428a-8115-83dc29773dc5 from this chassis (sb_readonly=0)
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.877 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.881 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a79751eb-00bd-46af-a104-84d2e5b78aa7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a79751eb-00bd-46af-a104-84d2e5b78aa7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.883 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.882 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[30d53a11-46ec-406e-8ec5-ff794abcafb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.884 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-a79751eb-00bd-46af-a104-84d2e5b78aa7
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/a79751eb-00bd-46af-a104-84d2e5b78aa7.pid.haproxy
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID a79751eb-00bd-46af-a104-84d2e5b78aa7
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:09:11 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:11.885 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7', 'env', 'PROCESS_TAG=haproxy-a79751eb-00bd-46af-a104-84d2e5b78aa7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a79751eb-00bd-46af-a104-84d2e5b78aa7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:09:11 np0005603623 nova_compute[226235]: 2026-01-31 09:09:11.955 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:12 np0005603623 podman[327187]: 2026-01-31 09:09:12.213529898 +0000 UTC m=+0.044671172 container create 64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 04:09:12 np0005603623 systemd[1]: Started libpod-conmon-64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e.scope.
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.260 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850552.2597337, 0814449a-7dc1-4f82-a204-a793c4867d69 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.261 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] VM Started (Lifecycle Event)#033[00m
Jan 31 04:09:12 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:09:12 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/890f322295a825c908aeef869e9afb42fafdc2c5e1cab2bd75474cfb8bcc253f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:09:12 np0005603623 podman[327187]: 2026-01-31 09:09:12.282905564 +0000 UTC m=+0.114046868 container init 64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:09:12 np0005603623 podman[327187]: 2026-01-31 09:09:12.189476774 +0000 UTC m=+0.020618068 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.288 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:09:12 np0005603623 podman[327187]: 2026-01-31 09:09:12.291882476 +0000 UTC m=+0.123023760 container start 64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.293 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850552.2610903, 0814449a-7dc1-4f82-a204-a793c4867d69 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.294 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:09:12 np0005603623 neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7[327209]: [NOTICE]   (327213) : New worker (327215) forked
Jan 31 04:09:12 np0005603623 neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7[327209]: [NOTICE]   (327213) : Loading success.
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.320 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.324 226239 DEBUG nova.compute.manager [req-d92242bb-e2fa-4140-aa1c-345ef4d2e6dd req-1f2a3b73-45a3-420b-b30f-ef239e8626dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Received event network-vif-plugged-15887f40-38e5-4ffb-bb22-fe2276411b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.325 226239 DEBUG oslo_concurrency.lockutils [req-d92242bb-e2fa-4140-aa1c-345ef4d2e6dd req-1f2a3b73-45a3-420b-b30f-ef239e8626dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.325 226239 DEBUG oslo_concurrency.lockutils [req-d92242bb-e2fa-4140-aa1c-345ef4d2e6dd req-1f2a3b73-45a3-420b-b30f-ef239e8626dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.326 226239 DEBUG oslo_concurrency.lockutils [req-d92242bb-e2fa-4140-aa1c-345ef4d2e6dd req-1f2a3b73-45a3-420b-b30f-ef239e8626dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.326 226239 DEBUG nova.compute.manager [req-d92242bb-e2fa-4140-aa1c-345ef4d2e6dd req-1f2a3b73-45a3-420b-b30f-ef239e8626dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Processing event network-vif-plugged-15887f40-38e5-4ffb-bb22-fe2276411b0d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.327 226239 DEBUG nova.compute.manager [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.342 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.343 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850552.3387809, 0814449a-7dc1-4f82-a204-a793c4867d69 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.344 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.348 226239 INFO nova.virt.libvirt.driver [-] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Instance spawned successfully.#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.349 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.365 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.368 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.380 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.381 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.381 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.382 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.383 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.383 226239 DEBUG nova.virt.libvirt.driver [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.414 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.417 226239 DEBUG nova.network.neutron [req-b48e42f2-102b-4880-9998-8966dc801de4 req-086901f0-46ae-4eef-b136-06145c9f7ad1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Updated VIF entry in instance network info cache for port 15887f40-38e5-4ffb-bb22-fe2276411b0d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.418 226239 DEBUG nova.network.neutron [req-b48e42f2-102b-4880-9998-8966dc801de4 req-086901f0-46ae-4eef-b136-06145c9f7ad1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Updating instance_info_cache with network_info: [{"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.447 226239 DEBUG oslo_concurrency.lockutils [req-b48e42f2-102b-4880-9998-8966dc801de4 req-086901f0-46ae-4eef-b136-06145c9f7ad1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.461 226239 INFO nova.compute.manager [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Took 10.88 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.462 226239 DEBUG nova.compute.manager [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.579 226239 INFO nova.compute.manager [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Took 11.89 seconds to build instance.#033[00m
Jan 31 04:09:12 np0005603623 nova_compute[226235]: 2026-01-31 09:09:12.598 226239 DEBUG oslo_concurrency.lockutils [None req-2e4aa5e1-3426-4b19-aefa-ccb3f56529f8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:13.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:13.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:14 np0005603623 nova_compute[226235]: 2026-01-31 09:09:14.554 226239 DEBUG nova.compute.manager [req-e4e3e569-6368-4c68-9282-5234fa6b71c6 req-4a9eb253-560d-43bf-8daa-f4eb50531d58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Received event network-vif-plugged-15887f40-38e5-4ffb-bb22-fe2276411b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:14 np0005603623 nova_compute[226235]: 2026-01-31 09:09:14.555 226239 DEBUG oslo_concurrency.lockutils [req-e4e3e569-6368-4c68-9282-5234fa6b71c6 req-4a9eb253-560d-43bf-8daa-f4eb50531d58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:14 np0005603623 nova_compute[226235]: 2026-01-31 09:09:14.555 226239 DEBUG oslo_concurrency.lockutils [req-e4e3e569-6368-4c68-9282-5234fa6b71c6 req-4a9eb253-560d-43bf-8daa-f4eb50531d58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:14 np0005603623 nova_compute[226235]: 2026-01-31 09:09:14.555 226239 DEBUG oslo_concurrency.lockutils [req-e4e3e569-6368-4c68-9282-5234fa6b71c6 req-4a9eb253-560d-43bf-8daa-f4eb50531d58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:14 np0005603623 nova_compute[226235]: 2026-01-31 09:09:14.555 226239 DEBUG nova.compute.manager [req-e4e3e569-6368-4c68-9282-5234fa6b71c6 req-4a9eb253-560d-43bf-8daa-f4eb50531d58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] No waiting events found dispatching network-vif-plugged-15887f40-38e5-4ffb-bb22-fe2276411b0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:09:14 np0005603623 nova_compute[226235]: 2026-01-31 09:09:14.556 226239 WARNING nova.compute.manager [req-e4e3e569-6368-4c68-9282-5234fa6b71c6 req-4a9eb253-560d-43bf-8daa-f4eb50531d58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Received unexpected event network-vif-plugged-15887f40-38e5-4ffb-bb22-fe2276411b0d for instance with vm_state active and task_state None.#033[00m
Jan 31 04:09:15 np0005603623 nova_compute[226235]: 2026-01-31 09:09:15.337 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:15.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:15.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:16 np0005603623 NetworkManager[48970]: <info>  [1769850556.1488] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/425)
Jan 31 04:09:16 np0005603623 NetworkManager[48970]: <info>  [1769850556.1496] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/426)
Jan 31 04:09:16 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:16Z|00898|binding|INFO|Releasing lport 799caae4-3b07-428a-8115-83dc29773dc5 from this chassis (sb_readonly=0)
Jan 31 04:09:16 np0005603623 nova_compute[226235]: 2026-01-31 09:09:16.152 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:16 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:16Z|00899|binding|INFO|Releasing lport 799caae4-3b07-428a-8115-83dc29773dc5 from this chassis (sb_readonly=0)
Jan 31 04:09:16 np0005603623 nova_compute[226235]: 2026-01-31 09:09:16.165 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:16 np0005603623 nova_compute[226235]: 2026-01-31 09:09:16.505 226239 DEBUG nova.compute.manager [req-99e32415-e041-4b74-b0d3-41156f8e1020 req-3c6e147e-7593-4618-9e1f-0470ccf670a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Received event network-changed-15887f40-38e5-4ffb-bb22-fe2276411b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:16 np0005603623 nova_compute[226235]: 2026-01-31 09:09:16.505 226239 DEBUG nova.compute.manager [req-99e32415-e041-4b74-b0d3-41156f8e1020 req-3c6e147e-7593-4618-9e1f-0470ccf670a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Refreshing instance network info cache due to event network-changed-15887f40-38e5-4ffb-bb22-fe2276411b0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:09:16 np0005603623 nova_compute[226235]: 2026-01-31 09:09:16.506 226239 DEBUG oslo_concurrency.lockutils [req-99e32415-e041-4b74-b0d3-41156f8e1020 req-3c6e147e-7593-4618-9e1f-0470ccf670a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:09:16 np0005603623 nova_compute[226235]: 2026-01-31 09:09:16.506 226239 DEBUG oslo_concurrency.lockutils [req-99e32415-e041-4b74-b0d3-41156f8e1020 req-3c6e147e-7593-4618-9e1f-0470ccf670a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:09:16 np0005603623 nova_compute[226235]: 2026-01-31 09:09:16.507 226239 DEBUG nova.network.neutron [req-99e32415-e041-4b74-b0d3-41156f8e1020 req-3c6e147e-7593-4618-9e1f-0470ccf670a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Refreshing network info cache for port 15887f40-38e5-4ffb-bb22-fe2276411b0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:09:16 np0005603623 nova_compute[226235]: 2026-01-31 09:09:16.956 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:17 np0005603623 nova_compute[226235]: 2026-01-31 09:09:17.401 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:17.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:17.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:18 np0005603623 nova_compute[226235]: 2026-01-31 09:09:18.302 226239 DEBUG nova.network.neutron [req-99e32415-e041-4b74-b0d3-41156f8e1020 req-3c6e147e-7593-4618-9e1f-0470ccf670a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Updated VIF entry in instance network info cache for port 15887f40-38e5-4ffb-bb22-fe2276411b0d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:09:18 np0005603623 nova_compute[226235]: 2026-01-31 09:09:18.302 226239 DEBUG nova.network.neutron [req-99e32415-e041-4b74-b0d3-41156f8e1020 req-3c6e147e-7593-4618-9e1f-0470ccf670a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Updating instance_info_cache with network_info: [{"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:09:18 np0005603623 nova_compute[226235]: 2026-01-31 09:09:18.340 226239 DEBUG oslo_concurrency.lockutils [req-99e32415-e041-4b74-b0d3-41156f8e1020 req-3c6e147e-7593-4618-9e1f-0470ccf670a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:09:19 np0005603623 nova_compute[226235]: 2026-01-31 09:09:19.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:19 np0005603623 nova_compute[226235]: 2026-01-31 09:09:19.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:09:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:19.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:19.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:20 np0005603623 nova_compute[226235]: 2026-01-31 09:09:20.036 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:20.035 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=92, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=91) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:09:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:20.038 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:09:20 np0005603623 nova_compute[226235]: 2026-01-31 09:09:20.338 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:21.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:21.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:21 np0005603623 nova_compute[226235]: 2026-01-31 09:09:21.834 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:21 np0005603623 nova_compute[226235]: 2026-01-31 09:09:21.957 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:23 np0005603623 nova_compute[226235]: 2026-01-31 09:09:23.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:23 np0005603623 nova_compute[226235]: 2026-01-31 09:09:23.156 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:09:23 np0005603623 nova_compute[226235]: 2026-01-31 09:09:23.156 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:09:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:23.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:23.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:24 np0005603623 nova_compute[226235]: 2026-01-31 09:09:24.028 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:09:24 np0005603623 nova_compute[226235]: 2026-01-31 09:09:24.028 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:09:24 np0005603623 nova_compute[226235]: 2026-01-31 09:09:24.029 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:09:24 np0005603623 nova_compute[226235]: 2026-01-31 09:09:24.029 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0814449a-7dc1-4f82-a204-a793c4867d69 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:09:25 np0005603623 nova_compute[226235]: 2026-01-31 09:09:25.376 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:25.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:25.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.373 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Updating instance_info_cache with network_info: [{"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.387 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.387 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.388 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.388 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.407 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.408 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.408 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.408 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.409 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:09:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3116808261' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.828 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.907 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000d6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.907 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000d6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:09:26 np0005603623 nova_compute[226235]: 2026-01-31 09:09:26.960 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:27 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:27.040 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '92'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.052 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.053 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3975MB free_disk=20.95781707763672GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.054 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.054 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.209 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 0814449a-7dc1-4f82-a204-a793c4867d69 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.209 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.210 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.310 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:27.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:27.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:09:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3520127858' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.731 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.735 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.753 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.783 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:09:27 np0005603623 nova_compute[226235]: 2026-01-31 09:09:27.783 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:28 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:28Z|00114|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:de:aa:32 10.100.0.10
Jan 31 04:09:28 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:28Z|00115|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:de:aa:32 10.100.0.10
Jan 31 04:09:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:29.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:29.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:30.161 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:30.162 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:30.162 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:30 np0005603623 nova_compute[226235]: 2026-01-31 09:09:30.378 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:30 np0005603623 podman[327330]: 2026-01-31 09:09:30.973362518 +0000 UTC m=+0.056781903 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:09:31 np0005603623 podman[327331]: 2026-01-31 09:09:31.002133891 +0000 UTC m=+0.085553526 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:09:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:31.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:31 np0005603623 nova_compute[226235]: 2026-01-31 09:09:31.550 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:31 np0005603623 nova_compute[226235]: 2026-01-31 09:09:31.550 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:31 np0005603623 nova_compute[226235]: 2026-01-31 09:09:31.550 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:31.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:31 np0005603623 nova_compute[226235]: 2026-01-31 09:09:31.961 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:32 np0005603623 nova_compute[226235]: 2026-01-31 09:09:32.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:33 np0005603623 nova_compute[226235]: 2026-01-31 09:09:33.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:33.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:33.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:35 np0005603623 nova_compute[226235]: 2026-01-31 09:09:35.379 226239 INFO nova.compute.manager [None req-9887520f-fac3-47ab-8fc5-b47d8de9a198 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Get console output#033[00m
Jan 31 04:09:35 np0005603623 nova_compute[226235]: 2026-01-31 09:09:35.381 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:35 np0005603623 nova_compute[226235]: 2026-01-31 09:09:35.387 270602 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 04:09:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:35.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:35.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:36 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:36Z|00900|binding|INFO|Releasing lport 799caae4-3b07-428a-8115-83dc29773dc5 from this chassis (sb_readonly=0)
Jan 31 04:09:36 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:36Z|00901|binding|INFO|Releasing lport 799caae4-3b07-428a-8115-83dc29773dc5 from this chassis (sb_readonly=0)
Jan 31 04:09:36 np0005603623 nova_compute[226235]: 2026-01-31 09:09:36.357 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:36 np0005603623 nova_compute[226235]: 2026-01-31 09:09:36.362 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:36 np0005603623 nova_compute[226235]: 2026-01-31 09:09:36.963 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:37.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:37 np0005603623 nova_compute[226235]: 2026-01-31 09:09:37.622 226239 INFO nova.compute.manager [None req-b1129839-0d70-4e7c-bdd6-25dfc8605e2c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Get console output#033[00m
Jan 31 04:09:37 np0005603623 nova_compute[226235]: 2026-01-31 09:09:37.630 270602 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 04:09:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:37.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:38 np0005603623 NetworkManager[48970]: <info>  [1769850578.4118] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/427)
Jan 31 04:09:38 np0005603623 NetworkManager[48970]: <info>  [1769850578.4125] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/428)
Jan 31 04:09:38 np0005603623 nova_compute[226235]: 2026-01-31 09:09:38.411 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:38 np0005603623 nova_compute[226235]: 2026-01-31 09:09:38.464 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:38 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:38Z|00902|binding|INFO|Releasing lport 799caae4-3b07-428a-8115-83dc29773dc5 from this chassis (sb_readonly=0)
Jan 31 04:09:38 np0005603623 nova_compute[226235]: 2026-01-31 09:09:38.478 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:38 np0005603623 nova_compute[226235]: 2026-01-31 09:09:38.833 226239 INFO nova.compute.manager [None req-5c176396-3c91-44a1-9a0e-de75b8adf2d3 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Get console output#033[00m
Jan 31 04:09:38 np0005603623 nova_compute[226235]: 2026-01-31 09:09:38.838 270602 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 04:09:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:39.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:09:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:39.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:09:39 np0005603623 nova_compute[226235]: 2026-01-31 09:09:39.969 226239 DEBUG nova.compute.manager [req-d45694c1-a229-4093-bb19-9f47b61f7ace req-8e40beed-df84-4488-b202-989017d48832 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Received event network-changed-15887f40-38e5-4ffb-bb22-fe2276411b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:39 np0005603623 nova_compute[226235]: 2026-01-31 09:09:39.969 226239 DEBUG nova.compute.manager [req-d45694c1-a229-4093-bb19-9f47b61f7ace req-8e40beed-df84-4488-b202-989017d48832 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Refreshing instance network info cache due to event network-changed-15887f40-38e5-4ffb-bb22-fe2276411b0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:09:39 np0005603623 nova_compute[226235]: 2026-01-31 09:09:39.969 226239 DEBUG oslo_concurrency.lockutils [req-d45694c1-a229-4093-bb19-9f47b61f7ace req-8e40beed-df84-4488-b202-989017d48832 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:09:39 np0005603623 nova_compute[226235]: 2026-01-31 09:09:39.970 226239 DEBUG oslo_concurrency.lockutils [req-d45694c1-a229-4093-bb19-9f47b61f7ace req-8e40beed-df84-4488-b202-989017d48832 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:09:39 np0005603623 nova_compute[226235]: 2026-01-31 09:09:39.970 226239 DEBUG nova.network.neutron [req-d45694c1-a229-4093-bb19-9f47b61f7ace req-8e40beed-df84-4488-b202-989017d48832 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Refreshing network info cache for port 15887f40-38e5-4ffb-bb22-fe2276411b0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.053 226239 DEBUG oslo_concurrency.lockutils [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "0814449a-7dc1-4f82-a204-a793c4867d69" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.054 226239 DEBUG oslo_concurrency.lockutils [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.054 226239 DEBUG oslo_concurrency.lockutils [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.054 226239 DEBUG oslo_concurrency.lockutils [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.055 226239 DEBUG oslo_concurrency.lockutils [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.056 226239 INFO nova.compute.manager [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Terminating instance#033[00m
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.057 226239 DEBUG nova.compute.manager [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:09:40 np0005603623 kernel: tap15887f40-38 (unregistering): left promiscuous mode
Jan 31 04:09:40 np0005603623 NetworkManager[48970]: <info>  [1769850580.1176] device (tap15887f40-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.128 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:40 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:40Z|00903|binding|INFO|Releasing lport 15887f40-38e5-4ffb-bb22-fe2276411b0d from this chassis (sb_readonly=0)
Jan 31 04:09:40 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:40Z|00904|binding|INFO|Setting lport 15887f40-38e5-4ffb-bb22-fe2276411b0d down in Southbound
Jan 31 04:09:40 np0005603623 ovn_controller[133449]: 2026-01-31T09:09:40Z|00905|binding|INFO|Removing iface tap15887f40-38 ovn-installed in OVS
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.136 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.139 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:aa:32 10.100.0.10'], port_security=['fa:16:3e:de:aa:32 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '0814449a-7dc1-4f82-a204-a793c4867d69', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a79751eb-00bd-46af-a104-84d2e5b78aa7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '4', 'neutron:security_group_ids': '375fe5bc-fb57-4bdb-9034-2adaae05eb90', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ab4b272-4425-49ee-8131-a3e99987a09a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=15887f40-38e5-4ffb-bb22-fe2276411b0d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.140 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 15887f40-38e5-4ffb-bb22-fe2276411b0d in datapath a79751eb-00bd-46af-a104-84d2e5b78aa7 unbound from our chassis
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.142 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a79751eb-00bd-46af-a104-84d2e5b78aa7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.143 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e26990fe-6cf5-4826-a09b-644ba963de52]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.144 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7 namespace which is not needed anymore
Jan 31 04:09:40 np0005603623 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000d6.scope: Deactivated successfully.
Jan 31 04:09:40 np0005603623 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000d6.scope: Consumed 13.276s CPU time.
Jan 31 04:09:40 np0005603623 systemd-machined[194379]: Machine qemu-99-instance-000000d6 terminated.
Jan 31 04:09:40 np0005603623 neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7[327209]: [NOTICE]   (327213) : haproxy version is 2.8.14-c23fe91
Jan 31 04:09:40 np0005603623 neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7[327209]: [NOTICE]   (327213) : path to executable is /usr/sbin/haproxy
Jan 31 04:09:40 np0005603623 neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7[327209]: [WARNING]  (327213) : Exiting Master process...
Jan 31 04:09:40 np0005603623 neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7[327209]: [ALERT]    (327213) : Current worker (327215) exited with code 143 (Terminated)
Jan 31 04:09:40 np0005603623 neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7[327209]: [WARNING]  (327213) : All workers exited. Exiting... (0)
Jan 31 04:09:40 np0005603623 systemd[1]: libpod-64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e.scope: Deactivated successfully.
Jan 31 04:09:40 np0005603623 podman[327532]: 2026-01-31 09:09:40.253696389 +0000 UTC m=+0.039212461 container died 64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.273 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:40 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e-userdata-shm.mount: Deactivated successfully.
Jan 31 04:09:40 np0005603623 systemd[1]: var-lib-containers-storage-overlay-890f322295a825c908aeef869e9afb42fafdc2c5e1cab2bd75474cfb8bcc253f-merged.mount: Deactivated successfully.
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.279 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.286 226239 INFO nova.virt.libvirt.driver [-] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Instance destroyed successfully.
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.287 226239 DEBUG nova.objects.instance [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'resources' on Instance uuid 0814449a-7dc1-4f82-a204-a793c4867d69 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 04:09:40 np0005603623 podman[327532]: 2026-01-31 09:09:40.289007797 +0000 UTC m=+0.074523869 container cleanup 64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 04:09:40 np0005603623 systemd[1]: libpod-conmon-64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e.scope: Deactivated successfully.
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.304 226239 DEBUG nova.virt.libvirt.vif [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1546550354',display_name='tempest-TestNetworkBasicOps-server-1546550354',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1546550354',id=214,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEaqKioUa1HkPeYonPNuJnlCnVNvcGfuTPmT25tLMWjRfxazsN/y2+GpIz5S9URuuJ+OShDdxnT4urjNzwUEEtbxThOGjGDYWgMQGOYwZTWn6bfH0MXFcray2n4OsFILuQ==',key_name='tempest-TestNetworkBasicOps-330648990',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:09:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-l6mo0qox',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:09:12Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=0814449a-7dc1-4f82-a204-a793c4867d69,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.305 226239 DEBUG nova.network.os_vif_util [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.306 226239 DEBUG nova.network.os_vif_util [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:de:aa:32,bridge_name='br-int',has_traffic_filtering=True,id=15887f40-38e5-4ffb-bb22-fe2276411b0d,network=Network(a79751eb-00bd-46af-a104-84d2e5b78aa7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15887f40-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.307 226239 DEBUG os_vif [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:aa:32,bridge_name='br-int',has_traffic_filtering=True,id=15887f40-38e5-4ffb-bb22-fe2276411b0d,network=Network(a79751eb-00bd-46af-a104-84d2e5b78aa7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15887f40-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.309 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.310 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15887f40-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.311 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.314 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.316 226239 INFO os_vif [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:aa:32,bridge_name='br-int',has_traffic_filtering=True,id=15887f40-38e5-4ffb-bb22-fe2276411b0d,network=Network(a79751eb-00bd-46af-a104-84d2e5b78aa7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15887f40-38')
Jan 31 04:09:40 np0005603623 podman[327570]: 2026-01-31 09:09:40.349242996 +0000 UTC m=+0.039982085 container remove 64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.354 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[47664c63-faac-4755-8c69-43e86b42a145]: (4, ('Sat Jan 31 09:09:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7 (64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e)\n64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e\nSat Jan 31 09:09:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7 (64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e)\n64a0ca35911597411ba4f1b7c8392a3e3c681dadeb61d2d3f7e1dac848f0922e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.356 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4ed6adbf-c95a-4a22-b196-f9313ec78f24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.357 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa79751eb-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.358 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:40 np0005603623 kernel: tapa79751eb-00: left promiscuous mode
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.364 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.367 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bb056bb1-4668-49f9-be45-6ec1e98b3ea4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.377 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9803130c-5839-4227-9f16-31f6680e18bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.379 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f3248a68-9df6-40c9-93c1-204654adf378]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.397 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[42c9c1f6-b649-44c5-bbee-77b9f368e99d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 969211, 'reachable_time': 42382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 327603, 'error': None, 'target': 'ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:40 np0005603623 systemd[1]: run-netns-ovnmeta\x2da79751eb\x2d00bd\x2d46af\x2da104\x2d84d2e5b78aa7.mount: Deactivated successfully.
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.401 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a79751eb-00bd-46af-a104-84d2e5b78aa7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 31 04:09:40 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:09:40.401 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[edac8a83-b2d8-48d6-85ac-9b7fc90014df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:09:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:09:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:09:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:09:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:09:40 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.722 226239 INFO nova.virt.libvirt.driver [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Deleting instance files /var/lib/nova/instances/0814449a-7dc1-4f82-a204-a793c4867d69_del
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.723 226239 INFO nova.virt.libvirt.driver [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Deletion of /var/lib/nova/instances/0814449a-7dc1-4f82-a204-a793c4867d69_del complete
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.816 226239 INFO nova.compute.manager [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Took 0.76 seconds to destroy the instance on the hypervisor.
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.816 226239 DEBUG oslo.service.loopingcall [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.816 226239 DEBUG nova.compute.manager [-] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 31 04:09:40 np0005603623 nova_compute[226235]: 2026-01-31 09:09:40.817 226239 DEBUG nova.network.neutron [-] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 31 04:09:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:09:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:41.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:09:41 np0005603623 nova_compute[226235]: 2026-01-31 09:09:41.517 226239 DEBUG nova.network.neutron [req-d45694c1-a229-4093-bb19-9f47b61f7ace req-8e40beed-df84-4488-b202-989017d48832 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Updated VIF entry in instance network info cache for port 15887f40-38e5-4ffb-bb22-fe2276411b0d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 04:09:41 np0005603623 nova_compute[226235]: 2026-01-31 09:09:41.518 226239 DEBUG nova.network.neutron [req-d45694c1-a229-4093-bb19-9f47b61f7ace req-8e40beed-df84-4488-b202-989017d48832 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Updating instance_info_cache with network_info: [{"id": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "address": "fa:16:3e:de:aa:32", "network": {"id": "a79751eb-00bd-46af-a104-84d2e5b78aa7", "bridge": "br-int", "label": "tempest-network-smoke--1249603954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15887f40-38", "ovs_interfaceid": "15887f40-38e5-4ffb-bb22-fe2276411b0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 04:09:41 np0005603623 nova_compute[226235]: 2026-01-31 09:09:41.541 226239 DEBUG oslo_concurrency.lockutils [req-d45694c1-a229-4093-bb19-9f47b61f7ace req-8e40beed-df84-4488-b202-989017d48832 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-0814449a-7dc1-4f82-a204-a793c4867d69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 04:09:41 np0005603623 nova_compute[226235]: 2026-01-31 09:09:41.585 226239 DEBUG nova.network.neutron [-] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 04:09:41 np0005603623 nova_compute[226235]: 2026-01-31 09:09:41.611 226239 INFO nova.compute.manager [-] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Took 0.79 seconds to deallocate network for instance.
Jan 31 04:09:41 np0005603623 nova_compute[226235]: 2026-01-31 09:09:41.668 226239 DEBUG oslo_concurrency.lockutils [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:09:41 np0005603623 nova_compute[226235]: 2026-01-31 09:09:41.668 226239 DEBUG oslo_concurrency.lockutils [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:09:41 np0005603623 nova_compute[226235]: 2026-01-31 09:09:41.696 226239 DEBUG nova.compute.manager [req-9d6bf239-dacc-4d5a-bb1d-cb57e51952f8 req-f5c831aa-0431-4973-b37c-df3a2213bef7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Received event network-vif-deleted-15887f40-38e5-4ffb-bb22-fe2276411b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:09:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:41.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:41 np0005603623 nova_compute[226235]: 2026-01-31 09:09:41.742 226239 DEBUG oslo_concurrency.processutils [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.009 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.089 226239 DEBUG nova.compute.manager [req-7337ad32-ae3c-4ca4-a9f5-b134d69d4fa9 req-3bf6ad97-e935-4600-9a44-709c1caa3f34 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Received event network-vif-unplugged-15887f40-38e5-4ffb-bb22-fe2276411b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.091 226239 DEBUG oslo_concurrency.lockutils [req-7337ad32-ae3c-4ca4-a9f5-b134d69d4fa9 req-3bf6ad97-e935-4600-9a44-709c1caa3f34 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.091 226239 DEBUG oslo_concurrency.lockutils [req-7337ad32-ae3c-4ca4-a9f5-b134d69d4fa9 req-3bf6ad97-e935-4600-9a44-709c1caa3f34 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.092 226239 DEBUG oslo_concurrency.lockutils [req-7337ad32-ae3c-4ca4-a9f5-b134d69d4fa9 req-3bf6ad97-e935-4600-9a44-709c1caa3f34 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.092 226239 DEBUG nova.compute.manager [req-7337ad32-ae3c-4ca4-a9f5-b134d69d4fa9 req-3bf6ad97-e935-4600-9a44-709c1caa3f34 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] No waiting events found dispatching network-vif-unplugged-15887f40-38e5-4ffb-bb22-fe2276411b0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.093 226239 WARNING nova.compute.manager [req-7337ad32-ae3c-4ca4-a9f5-b134d69d4fa9 req-3bf6ad97-e935-4600-9a44-709c1caa3f34 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Received unexpected event network-vif-unplugged-15887f40-38e5-4ffb-bb22-fe2276411b0d for instance with vm_state deleted and task_state None.#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.093 226239 DEBUG nova.compute.manager [req-7337ad32-ae3c-4ca4-a9f5-b134d69d4fa9 req-3bf6ad97-e935-4600-9a44-709c1caa3f34 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Received event network-vif-plugged-15887f40-38e5-4ffb-bb22-fe2276411b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.093 226239 DEBUG oslo_concurrency.lockutils [req-7337ad32-ae3c-4ca4-a9f5-b134d69d4fa9 req-3bf6ad97-e935-4600-9a44-709c1caa3f34 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.094 226239 DEBUG oslo_concurrency.lockutils [req-7337ad32-ae3c-4ca4-a9f5-b134d69d4fa9 req-3bf6ad97-e935-4600-9a44-709c1caa3f34 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.094 226239 DEBUG oslo_concurrency.lockutils [req-7337ad32-ae3c-4ca4-a9f5-b134d69d4fa9 req-3bf6ad97-e935-4600-9a44-709c1caa3f34 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.095 226239 DEBUG nova.compute.manager [req-7337ad32-ae3c-4ca4-a9f5-b134d69d4fa9 req-3bf6ad97-e935-4600-9a44-709c1caa3f34 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] No waiting events found dispatching network-vif-plugged-15887f40-38e5-4ffb-bb22-fe2276411b0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.095 226239 WARNING nova.compute.manager [req-7337ad32-ae3c-4ca4-a9f5-b134d69d4fa9 req-3bf6ad97-e935-4600-9a44-709c1caa3f34 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Received unexpected event network-vif-plugged-15887f40-38e5-4ffb-bb22-fe2276411b0d for instance with vm_state deleted and task_state None.#033[00m
Jan 31 04:09:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:09:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3622667864' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.173 226239 DEBUG oslo_concurrency.processutils [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.179 226239 DEBUG nova.compute.provider_tree [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.193 226239 DEBUG nova.scheduler.client.report [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.214 226239 DEBUG oslo_concurrency.lockutils [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.545s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.243 226239 INFO nova.scheduler.client.report [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Deleted allocations for instance 0814449a-7dc1-4f82-a204-a793c4867d69#033[00m
Jan 31 04:09:42 np0005603623 nova_compute[226235]: 2026-01-31 09:09:42.316 226239 DEBUG oslo_concurrency.lockutils [None req-545ca87c-53d0-407d-9009-c314b2e8f06e d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "0814449a-7dc1-4f82-a204-a793c4867d69" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:43.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:09:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:43.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:09:45 np0005603623 nova_compute[226235]: 2026-01-31 09:09:45.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:45 np0005603623 nova_compute[226235]: 2026-01-31 09:09:45.312 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:45.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:45.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:46 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:09:46 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:09:47 np0005603623 nova_compute[226235]: 2026-01-31 09:09:47.010 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:09:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:47.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:09:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:47.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:49.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:49.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:49 np0005603623 nova_compute[226235]: 2026-01-31 09:09:49.998 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:50 np0005603623 nova_compute[226235]: 2026-01-31 09:09:50.031 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:50 np0005603623 nova_compute[226235]: 2026-01-31 09:09:50.313 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:51.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:09:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:51.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:09:52 np0005603623 nova_compute[226235]: 2026-01-31 09:09:52.012 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:09:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:53.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:09:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:53.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:55 np0005603623 nova_compute[226235]: 2026-01-31 09:09:55.286 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850580.2850018, 0814449a-7dc1-4f82-a204-a793c4867d69 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:09:55 np0005603623 nova_compute[226235]: 2026-01-31 09:09:55.287 226239 INFO nova.compute.manager [-] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:09:55 np0005603623 nova_compute[226235]: 2026-01-31 09:09:55.314 226239 DEBUG nova.compute.manager [None req-1d3b023e-0324-46b1-8d1c-d0033f9343d8 - - - - - -] [instance: 0814449a-7dc1-4f82-a204-a793c4867d69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:09:55 np0005603623 nova_compute[226235]: 2026-01-31 09:09:55.315 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:09:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:55.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:09:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:55.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:57 np0005603623 nova_compute[226235]: 2026-01-31 09:09:57.014 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:57.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:57.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:59.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:09:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:59.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:00 np0005603623 nova_compute[226235]: 2026-01-31 09:10:00.316 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:00 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 04:10:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:01.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:01.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:01 np0005603623 podman[327740]: 2026-01-31 09:10:01.960734305 +0000 UTC m=+0.049392091 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:10:01 np0005603623 podman[327741]: 2026-01-31 09:10:01.993117811 +0000 UTC m=+0.079899517 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 31 04:10:02 np0005603623 nova_compute[226235]: 2026-01-31 09:10:02.015 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:03.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:03.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:10:05 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3655800037' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:10:05 np0005603623 nova_compute[226235]: 2026-01-31 09:10:05.319 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:05.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:05.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:07 np0005603623 nova_compute[226235]: 2026-01-31 09:10:07.018 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:07.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:07.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:09.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:09.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:10 np0005603623 nova_compute[226235]: 2026-01-31 09:10:10.321 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:11.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:11.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:12 np0005603623 nova_compute[226235]: 2026-01-31 09:10:12.020 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:13.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:13.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:15 np0005603623 nova_compute[226235]: 2026-01-31 09:10:15.323 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:15.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:15.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:17 np0005603623 nova_compute[226235]: 2026-01-31 09:10:17.022 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:17.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:17.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:19.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:19.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:10:20.168 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=93, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=92) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:10:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:10:20.168 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:10:20 np0005603623 nova_compute[226235]: 2026-01-31 09:10:20.220 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:20 np0005603623 nova_compute[226235]: 2026-01-31 09:10:20.325 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:21 np0005603623 nova_compute[226235]: 2026-01-31 09:10:21.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:21 np0005603623 nova_compute[226235]: 2026-01-31 09:10:21.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:10:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:21.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:21.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:22 np0005603623 nova_compute[226235]: 2026-01-31 09:10:22.023 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:23 np0005603623 nova_compute[226235]: 2026-01-31 09:10:23.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:23 np0005603623 nova_compute[226235]: 2026-01-31 09:10:23.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:10:23 np0005603623 nova_compute[226235]: 2026-01-31 09:10:23.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:10:23 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:10:23.171 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '93'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:10:23 np0005603623 nova_compute[226235]: 2026-01-31 09:10:23.176 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:10:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:10:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:23.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:10:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:23.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:25 np0005603623 nova_compute[226235]: 2026-01-31 09:10:25.328 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:25.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:25.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.176 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.177 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.177 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.177 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.177 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:10:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:10:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/828397436' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.584 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.711 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.712 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4153MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.712 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.712 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.786 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.787 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:10:26 np0005603623 nova_compute[226235]: 2026-01-31 09:10:26.812 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:10:27 np0005603623 nova_compute[226235]: 2026-01-31 09:10:27.025 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:10:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4279472505' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:10:27 np0005603623 nova_compute[226235]: 2026-01-31 09:10:27.220 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:10:27 np0005603623 nova_compute[226235]: 2026-01-31 09:10:27.225 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:10:27 np0005603623 nova_compute[226235]: 2026-01-31 09:10:27.242 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:10:27 np0005603623 nova_compute[226235]: 2026-01-31 09:10:27.267 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:10:27 np0005603623 nova_compute[226235]: 2026-01-31 09:10:27.268 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:27.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:27.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:29 np0005603623 nova_compute[226235]: 2026-01-31 09:10:29.268 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:29.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:29.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:10:30.162 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:10:30.162 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:10:30.162 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:30 np0005603623 nova_compute[226235]: 2026-01-31 09:10:30.330 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:31 np0005603623 nova_compute[226235]: 2026-01-31 09:10:31.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:31.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:10:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:31.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:10:32 np0005603623 nova_compute[226235]: 2026-01-31 09:10:32.026 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:32 np0005603623 nova_compute[226235]: 2026-01-31 09:10:32.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:32 np0005603623 podman[327945]: 2026-01-31 09:10:32.971147875 +0000 UTC m=+0.067540580 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 31 04:10:32 np0005603623 podman[327946]: 2026-01-31 09:10:32.979093044 +0000 UTC m=+0.073710054 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller)
Jan 31 04:10:33 np0005603623 nova_compute[226235]: 2026-01-31 09:10:33.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:10:33 np0005603623 nova_compute[226235]: 2026-01-31 09:10:33.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:10:33 np0005603623 nova_compute[226235]: 2026-01-31 09:10:33.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:10:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:33.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:33.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:35 np0005603623 nova_compute[226235]: 2026-01-31 09:10:35.333 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:10:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:35.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:35.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:37 np0005603623 nova_compute[226235]: 2026-01-31 09:10:37.029 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:10:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:37.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:37.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:39.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:39.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:40 np0005603623 nova_compute[226235]: 2026-01-31 09:10:40.336 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:10:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:41.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:10:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:41.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:10:42 np0005603623 nova_compute[226235]: 2026-01-31 09:10:42.032 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:10:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:42 np0005603623 ovn_controller[133449]: 2026-01-31T09:10:42Z|00906|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 04:10:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:43.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:43.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:45 np0005603623 nova_compute[226235]: 2026-01-31 09:10:45.338 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:10:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:45.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:45.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:47 np0005603623 nova_compute[226235]: 2026-01-31 09:10:47.034 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:10:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:47.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:47.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:10:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:10:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:10:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:10:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:10:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:49.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:49.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:50 np0005603623 nova_compute[226235]: 2026-01-31 09:10:50.340 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:10:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:51.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:51.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:52 np0005603623 nova_compute[226235]: 2026-01-31 09:10:52.035 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:10:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:52 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #60. Immutable memtables: 16.
Jan 31 04:10:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:53.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:53.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:54 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:10:54 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:10:55 np0005603623 nova_compute[226235]: 2026-01-31 09:10:55.342 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:10:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:55.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:10:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:55.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:10:57 np0005603623 nova_compute[226235]: 2026-01-31 09:10:57.038 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:10:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:10:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:57.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:10:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:10:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:57.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:10:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:59.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:10:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:10:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:59.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:11:00 np0005603623 nova_compute[226235]: 2026-01-31 09:11:00.352 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:11:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:11:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:01.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:11:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:01.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:02 np0005603623 nova_compute[226235]: 2026-01-31 09:11:02.041 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:11:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:03.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:03.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:03 np0005603623 podman[328261]: 2026-01-31 09:11:03.905149703 +0000 UTC m=+0.049860325 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 04:11:03 np0005603623 podman[328262]: 2026-01-31 09:11:03.933085389 +0000 UTC m=+0.075986615 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 04:11:05 np0005603623 nova_compute[226235]: 2026-01-31 09:11:05.355 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:11:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:05.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:05.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:07 np0005603623 nova_compute[226235]: 2026-01-31 09:11:07.046 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:11:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:07.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:07.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e414 e414: 3 total, 3 up, 3 in
Jan 31 04:11:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:09.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:09.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:10 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e415 e415: 3 total, 3 up, 3 in
Jan 31 04:11:10 np0005603623 nova_compute[226235]: 2026-01-31 09:11:10.357 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:11:11 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e416 e416: 3 total, 3 up, 3 in
Jan 31 04:11:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:11.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:11.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:12 np0005603623 nova_compute[226235]: 2026-01-31 09:11:12.047 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:11:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:12 np0005603623 nova_compute[226235]: 2026-01-31 09:11:12.738 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "94b8b94c-865c-4d70-af81-218fd54902b0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:11:12 np0005603623 nova_compute[226235]: 2026-01-31 09:11:12.738 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:11:12 np0005603623 nova_compute[226235]: 2026-01-31 09:11:12.762 226239 DEBUG nova.compute.manager [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 04:11:12 np0005603623 nova_compute[226235]: 2026-01-31 09:11:12.893 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:11:12 np0005603623 nova_compute[226235]: 2026-01-31 09:11:12.894 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:11:12 np0005603623 nova_compute[226235]: 2026-01-31 09:11:12.902 226239 DEBUG nova.virt.hardware [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 04:11:12 np0005603623 nova_compute[226235]: 2026-01-31 09:11:12.902 226239 INFO nova.compute.claims [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Claim successful on node compute-2.ctlplane.example.com
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.045 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:11:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:11:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3829480722' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.445 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.451 226239 DEBUG nova.compute.provider_tree [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.468 226239 DEBUG nova.scheduler.client.report [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.491 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.492 226239 DEBUG nova.compute.manager [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.543 226239 DEBUG nova.compute.manager [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.543 226239 DEBUG nova.network.neutron [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.575 226239 INFO nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.592 226239 DEBUG nova.compute.manager [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:11:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:11:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:13.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.684 226239 DEBUG nova.compute.manager [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.686 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.686 226239 INFO nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Creating image(s)#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.712 226239 DEBUG nova.storage.rbd_utils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 94b8b94c-865c-4d70-af81-218fd54902b0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.735 226239 DEBUG nova.storage.rbd_utils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 94b8b94c-865c-4d70-af81-218fd54902b0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.760 226239 DEBUG nova.storage.rbd_utils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 94b8b94c-865c-4d70-af81-218fd54902b0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.764 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.815 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.816 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.817 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.817 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:13.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.841 226239 DEBUG nova.storage.rbd_utils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 94b8b94c-865c-4d70-af81-218fd54902b0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:13 np0005603623 nova_compute[226235]: 2026-01-31 09:11:13.844 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 94b8b94c-865c-4d70-af81-218fd54902b0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:14 np0005603623 nova_compute[226235]: 2026-01-31 09:11:14.161 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 94b8b94c-865c-4d70-af81-218fd54902b0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.317s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:14 np0005603623 nova_compute[226235]: 2026-01-31 09:11:14.229 226239 DEBUG nova.storage.rbd_utils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] resizing rbd image 94b8b94c-865c-4d70-af81-218fd54902b0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 04:11:14 np0005603623 nova_compute[226235]: 2026-01-31 09:11:14.268 226239 DEBUG nova.policy [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ebd43008d7a64b8bbf97a2304b1f78b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:11:14 np0005603623 nova_compute[226235]: 2026-01-31 09:11:14.439 226239 DEBUG nova.objects.instance [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'migration_context' on Instance uuid 94b8b94c-865c-4d70-af81-218fd54902b0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:11:14 np0005603623 nova_compute[226235]: 2026-01-31 09:11:14.465 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 04:11:14 np0005603623 nova_compute[226235]: 2026-01-31 09:11:14.465 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Ensure instance console log exists: /var/lib/nova/instances/94b8b94c-865c-4d70-af81-218fd54902b0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:11:14 np0005603623 nova_compute[226235]: 2026-01-31 09:11:14.466 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:14 np0005603623 nova_compute[226235]: 2026-01-31 09:11:14.466 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:14 np0005603623 nova_compute[226235]: 2026-01-31 09:11:14.466 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:11:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2048993436' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:11:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:11:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2048993436' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:11:15 np0005603623 nova_compute[226235]: 2026-01-31 09:11:15.358 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:15.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:15.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:17 np0005603623 nova_compute[226235]: 2026-01-31 09:11:17.048 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:17 np0005603623 nova_compute[226235]: 2026-01-31 09:11:17.230 226239 DEBUG nova.network.neutron [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Successfully created port: d2ae39fe-704a-46ab-ae15-c8171d1d776f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:11:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:17.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:17.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e417 e417: 3 total, 3 up, 3 in
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.264 226239 DEBUG nova.network.neutron [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Successfully updated port: d2ae39fe-704a-46ab-ae15-c8171d1d776f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.288 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.288 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquired lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.289 226239 DEBUG nova.network.neutron [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.309 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "c893e608-08e4-4eab-a992-b241e484ea48" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.309 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.327 226239 DEBUG nova.compute.manager [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.375 226239 DEBUG nova.compute.manager [req-c4173561-251b-4282-9dfa-29faabd53342 req-44ab950f-aaf7-4904-b772-7295df7750d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Received event network-changed-d2ae39fe-704a-46ab-ae15-c8171d1d776f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.375 226239 DEBUG nova.compute.manager [req-c4173561-251b-4282-9dfa-29faabd53342 req-44ab950f-aaf7-4904-b772-7295df7750d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Refreshing instance network info cache due to event network-changed-d2ae39fe-704a-46ab-ae15-c8171d1d776f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.376 226239 DEBUG oslo_concurrency.lockutils [req-c4173561-251b-4282-9dfa-29faabd53342 req-44ab950f-aaf7-4904-b772-7295df7750d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.438 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.438 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.445 226239 DEBUG nova.virt.hardware [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.445 226239 INFO nova.compute.claims [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.460 226239 DEBUG nova.network.neutron [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.584 226239 DEBUG oslo_concurrency.processutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:11:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3602965755' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.986 226239 DEBUG oslo_concurrency.processutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:18 np0005603623 nova_compute[226235]: 2026-01-31 09:11:18.991 226239 DEBUG nova.compute.provider_tree [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.103 226239 DEBUG nova.scheduler.client.report [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.140 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.140 226239 DEBUG nova.compute.manager [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.199 226239 DEBUG nova.compute.manager [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.200 226239 DEBUG nova.network.neutron [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.220 226239 INFO nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.252 226239 DEBUG nova.compute.manager [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.342 226239 DEBUG nova.compute.manager [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.343 226239 DEBUG nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.344 226239 INFO nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Creating image(s)#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.369 226239 DEBUG nova.storage.rbd_utils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image c893e608-08e4-4eab-a992-b241e484ea48_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.395 226239 DEBUG nova.storage.rbd_utils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image c893e608-08e4-4eab-a992-b241e484ea48_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.419 226239 DEBUG nova.storage.rbd_utils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image c893e608-08e4-4eab-a992-b241e484ea48_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.423 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "6e12329c936450299dc23ec72ff67fa31642df64" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.424 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "6e12329c936450299dc23ec72ff67fa31642df64" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.434 226239 DEBUG nova.policy [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b7233f93367f4dcd8eb2b6b115680192', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c0be57039fd34aa9a2d05d9086ccff13', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:11:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:19.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.694 226239 DEBUG nova.virt.libvirt.imagebackend [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/f398889c-e272-449f-a032-36096e95d18f/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/f398889c-e272-449f-a032-36096e95d18f/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.741 226239 DEBUG nova.virt.libvirt.imagebackend [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Selected location: {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/f398889c-e272-449f-a032-36096e95d18f/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 31 04:11:19 np0005603623 nova_compute[226235]: 2026-01-31 09:11:19.742 226239 DEBUG nova.storage.rbd_utils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] cloning images/f398889c-e272-449f-a032-36096e95d18f@snap to None/c893e608-08e4-4eab-a992-b241e484ea48_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 04:11:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:19.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.019 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "6e12329c936450299dc23ec72ff67fa31642df64" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.146 226239 DEBUG nova.network.neutron [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Updating instance_info_cache with network_info: [{"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.153 226239 DEBUG nova.objects.instance [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lazy-loading 'migration_context' on Instance uuid c893e608-08e4-4eab-a992-b241e484ea48 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.174 226239 DEBUG nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.174 226239 DEBUG nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Ensure instance console log exists: /var/lib/nova/instances/c893e608-08e4-4eab-a992-b241e484ea48/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.175 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.175 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.175 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.179 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Releasing lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.179 226239 DEBUG nova.compute.manager [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Instance network_info: |[{"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.180 226239 DEBUG oslo_concurrency.lockutils [req-c4173561-251b-4282-9dfa-29faabd53342 req-44ab950f-aaf7-4904-b772-7295df7750d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.180 226239 DEBUG nova.network.neutron [req-c4173561-251b-4282-9dfa-29faabd53342 req-44ab950f-aaf7-4904-b772-7295df7750d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Refreshing network info cache for port d2ae39fe-704a-46ab-ae15-c8171d1d776f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.182 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Start _get_guest_xml network_info=[{"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.186 226239 WARNING nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.190 226239 DEBUG nova.virt.libvirt.host [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.191 226239 DEBUG nova.virt.libvirt.host [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.193 226239 DEBUG nova.virt.libvirt.host [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.194 226239 DEBUG nova.virt.libvirt.host [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.195 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.195 226239 DEBUG nova.virt.hardware [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.195 226239 DEBUG nova.virt.hardware [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.196 226239 DEBUG nova.virt.hardware [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.196 226239 DEBUG nova.virt.hardware [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.196 226239 DEBUG nova.virt.hardware [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.196 226239 DEBUG nova.virt.hardware [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.196 226239 DEBUG nova.virt.hardware [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.197 226239 DEBUG nova.virt.hardware [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.197 226239 DEBUG nova.virt.hardware [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.197 226239 DEBUG nova.virt.hardware [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.197 226239 DEBUG nova.virt.hardware [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.200 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:20.279 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=94, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=93) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:11:20 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:20.281 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.281 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.359 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.389 226239 DEBUG nova.network.neutron [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Successfully created port: 0b63d009-60f2-4cf2-afea-679373e18e95 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:11:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:11:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1141431745' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.634 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.660 226239 DEBUG nova.storage.rbd_utils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 94b8b94c-865c-4d70-af81-218fd54902b0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:20 np0005603623 nova_compute[226235]: 2026-01-31 09:11:20.664 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:11:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3422245583' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.062 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.398s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.064 226239 DEBUG nova.virt.libvirt.vif [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:11:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-1-1734004012',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-1-1734004012',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ge',id=218,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKIc4iaayDmLhEo3vI6YGRzp7m9GW6fzslqwq++gP9ecVHJRq1tSjzVnTPtJw3RUxXTQDWiA7Ya9j/CawC++Id9BLZED+RHeJDZ4JXh3gvgziK3fUhGR6gajupFnKxcV3w==',key_name='tempest-TestSecurityGroupsBasicOps-762998316',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-pdb07s60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:11:13Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=94b8b94c-865c-4d70-af81-218fd54902b0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.064 226239 DEBUG nova.network.os_vif_util [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.065 226239 DEBUG nova.network.os_vif_util [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:58:48,bridge_name='br-int',has_traffic_filtering=True,id=d2ae39fe-704a-46ab-ae15-c8171d1d776f,network=Network(919288ff-a51c-4b6d-81b3-cc76704eca9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ae39fe-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.067 226239 DEBUG nova.objects.instance [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'pci_devices' on Instance uuid 94b8b94c-865c-4d70-af81-218fd54902b0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.091 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  <uuid>94b8b94c-865c-4d70-af81-218fd54902b0</uuid>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  <name>instance-000000da</name>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-1-1734004012</nova:name>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:11:20</nova:creationTime>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <nova:user uuid="ebd43008d7a64b8bbf97a2304b1f78b6">tempest-TestSecurityGroupsBasicOps-1802479850-project-member</nova:user>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <nova:project uuid="0c7930b92fc3471f87d9fe78ee56e71e">tempest-TestSecurityGroupsBasicOps-1802479850</nova:project>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <nova:port uuid="d2ae39fe-704a-46ab-ae15-c8171d1d776f">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <entry name="serial">94b8b94c-865c-4d70-af81-218fd54902b0</entry>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <entry name="uuid">94b8b94c-865c-4d70-af81-218fd54902b0</entry>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/94b8b94c-865c-4d70-af81-218fd54902b0_disk">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/94b8b94c-865c-4d70-af81-218fd54902b0_disk.config">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:f1:58:48"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <target dev="tapd2ae39fe-70"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/94b8b94c-865c-4d70-af81-218fd54902b0/console.log" append="off"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:11:21 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:11:21 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:11:21 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:11:21 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.092 226239 DEBUG nova.compute.manager [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Preparing to wait for external event network-vif-plugged-d2ae39fe-704a-46ab-ae15-c8171d1d776f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.092 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.092 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.092 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.093 226239 DEBUG nova.virt.libvirt.vif [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:11:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-1-1734004012',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-1-1734004012',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ge',id=218,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKIc4iaayDmLhEo3vI6YGRzp7m9GW6fzslqwq++gP9ecVHJRq1tSjzVnTPtJw3RUxXTQDWiA7Ya9j/CawC++Id9BLZED+RHeJDZ4JXh3gvgziK3fUhGR6gajupFnKxcV3w==',key_name='tempest-TestSecurityGroupsBasicOps-762998316',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-pdb07s60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:11:13Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=94b8b94c-865c-4d70-af81-218fd54902b0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.093 226239 DEBUG nova.network.os_vif_util [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.094 226239 DEBUG nova.network.os_vif_util [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:58:48,bridge_name='br-int',has_traffic_filtering=True,id=d2ae39fe-704a-46ab-ae15-c8171d1d776f,network=Network(919288ff-a51c-4b6d-81b3-cc76704eca9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ae39fe-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.094 226239 DEBUG os_vif [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:58:48,bridge_name='br-int',has_traffic_filtering=True,id=d2ae39fe-704a-46ab-ae15-c8171d1d776f,network=Network(919288ff-a51c-4b6d-81b3-cc76704eca9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ae39fe-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.095 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.095 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.095 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.098 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.098 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2ae39fe-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.098 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd2ae39fe-70, col_values=(('external_ids', {'iface-id': 'd2ae39fe-704a-46ab-ae15-c8171d1d776f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f1:58:48', 'vm-uuid': '94b8b94c-865c-4d70-af81-218fd54902b0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.100 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:21 np0005603623 NetworkManager[48970]: <info>  [1769850681.1009] manager: (tapd2ae39fe-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/429)
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.103 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.106 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.107 226239 INFO os_vif [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:58:48,bridge_name='br-int',has_traffic_filtering=True,id=d2ae39fe-704a-46ab-ae15-c8171d1d776f,network=Network(919288ff-a51c-4b6d-81b3-cc76704eca9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ae39fe-70')#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.173 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.173 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.174 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No VIF found with MAC fa:16:3e:f1:58:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.174 226239 INFO nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Using config drive#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.201 226239 DEBUG nova.storage.rbd_utils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 94b8b94c-865c-4d70-af81-218fd54902b0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.468 226239 DEBUG nova.network.neutron [req-c4173561-251b-4282-9dfa-29faabd53342 req-44ab950f-aaf7-4904-b772-7295df7750d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Updated VIF entry in instance network info cache for port d2ae39fe-704a-46ab-ae15-c8171d1d776f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.469 226239 DEBUG nova.network.neutron [req-c4173561-251b-4282-9dfa-29faabd53342 req-44ab950f-aaf7-4904-b772-7295df7750d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Updating instance_info_cache with network_info: [{"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:11:21 np0005603623 nova_compute[226235]: 2026-01-31 09:11:21.497 226239 DEBUG oslo_concurrency.lockutils [req-c4173561-251b-4282-9dfa-29faabd53342 req-44ab950f-aaf7-4904-b772-7295df7750d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:11:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:11:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:21.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:11:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:21.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.049 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.127 226239 DEBUG nova.network.neutron [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Successfully updated port: 0b63d009-60f2-4cf2-afea-679373e18e95 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.149 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "refresh_cache-c893e608-08e4-4eab-a992-b241e484ea48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.150 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquired lock "refresh_cache-c893e608-08e4-4eab-a992-b241e484ea48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.150 226239 DEBUG nova.network.neutron [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.154 226239 INFO nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Creating config drive at /var/lib/nova/instances/94b8b94c-865c-4d70-af81-218fd54902b0/disk.config#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.158 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/94b8b94c-865c-4d70-af81-218fd54902b0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpxyqcgduh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.230 226239 DEBUG nova.compute.manager [req-22a17314-8a35-47b5-a0c1-fde6303f78e6 req-e94f13f6-f1f0-4a10-ad72-30468ccc50d5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Received event network-changed-0b63d009-60f2-4cf2-afea-679373e18e95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.230 226239 DEBUG nova.compute.manager [req-22a17314-8a35-47b5-a0c1-fde6303f78e6 req-e94f13f6-f1f0-4a10-ad72-30468ccc50d5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Refreshing instance network info cache due to event network-changed-0b63d009-60f2-4cf2-afea-679373e18e95. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.231 226239 DEBUG oslo_concurrency.lockutils [req-22a17314-8a35-47b5-a0c1-fde6303f78e6 req-e94f13f6-f1f0-4a10-ad72-30468ccc50d5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c893e608-08e4-4eab-a992-b241e484ea48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.285 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/94b8b94c-865c-4d70-af81-218fd54902b0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpxyqcgduh" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.311 226239 DEBUG nova.storage.rbd_utils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 94b8b94c-865c-4d70-af81-218fd54902b0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.314 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/94b8b94c-865c-4d70-af81-218fd54902b0/disk.config 94b8b94c-865c-4d70-af81-218fd54902b0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.338 226239 DEBUG nova.network.neutron [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.475 226239 DEBUG oslo_concurrency.processutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/94b8b94c-865c-4d70-af81-218fd54902b0/disk.config 94b8b94c-865c-4d70-af81-218fd54902b0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.476 226239 INFO nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Deleting local config drive /var/lib/nova/instances/94b8b94c-865c-4d70-af81-218fd54902b0/disk.config because it was imported into RBD.#033[00m
Jan 31 04:11:22 np0005603623 kernel: tapd2ae39fe-70: entered promiscuous mode
Jan 31 04:11:22 np0005603623 NetworkManager[48970]: <info>  [1769850682.5096] manager: (tapd2ae39fe-70): new Tun device (/org/freedesktop/NetworkManager/Devices/430)
Jan 31 04:11:22 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:22Z|00907|binding|INFO|Claiming lport d2ae39fe-704a-46ab-ae15-c8171d1d776f for this chassis.
Jan 31 04:11:22 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:22Z|00908|binding|INFO|d2ae39fe-704a-46ab-ae15-c8171d1d776f: Claiming fa:16:3e:f1:58:48 10.100.0.5
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.510 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.515 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:22 np0005603623 systemd-udevd[328861]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.530 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:22 np0005603623 NetworkManager[48970]: <info>  [1769850682.5326] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/431)
Jan 31 04:11:22 np0005603623 NetworkManager[48970]: <info>  [1769850682.5334] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/432)
Jan 31 04:11:22 np0005603623 systemd-machined[194379]: New machine qemu-100-instance-000000da.
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.536 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:58:48 10.100.0.5'], port_security=['fa:16:3e:f1:58:48 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '94b8b94c-865c-4d70-af81-218fd54902b0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-919288ff-a51c-4b6d-81b3-cc76704eca9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a940e54a-7391-4617-93be-b1a956d3558c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be7ac7a5-1e86-4304-8ddd-d276d05956e0, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=d2ae39fe-704a-46ab-ae15-c8171d1d776f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.538 143258 INFO neutron.agent.ovn.metadata.agent [-] Port d2ae39fe-704a-46ab-ae15-c8171d1d776f in datapath 919288ff-a51c-4b6d-81b3-cc76704eca9e bound to our chassis#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.539 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 919288ff-a51c-4b6d-81b3-cc76704eca9e#033[00m
Jan 31 04:11:22 np0005603623 NetworkManager[48970]: <info>  [1769850682.5422] device (tapd2ae39fe-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:11:22 np0005603623 NetworkManager[48970]: <info>  [1769850682.5430] device (tapd2ae39fe-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:11:22 np0005603623 systemd[1]: Started Virtual Machine qemu-100-instance-000000da.
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.547 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e58cfe92-28aa-4124-8e44-bede2c655b81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.548 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap919288ff-a1 in ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.550 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap919288ff-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.550 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0ed01587-f1c8-4358-9bc5-41cc64e71f30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.552 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1d58ed09-3fea-4c57-ba49-7da9997e2f87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.560 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[86b1d233-622f-4822-9a39-da733c5d7e44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.581 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[99d954a7-37fb-4010-a2be-b2c90b3ee06c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.587 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.601 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[1fde1ada-4550-48cb-971f-5c6339162fe9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.605 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:22 np0005603623 NetworkManager[48970]: <info>  [1769850682.6075] manager: (tap919288ff-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/433)
Jan 31 04:11:22 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:22Z|00909|binding|INFO|Setting lport d2ae39fe-704a-46ab-ae15-c8171d1d776f ovn-installed in OVS
Jan 31 04:11:22 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:22Z|00910|binding|INFO|Setting lport d2ae39fe-704a-46ab-ae15-c8171d1d776f up in Southbound
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.606 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2c9bf96e-4655-4e21-8fe4-fa092aa301a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.608 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.629 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[916f7a69-9644-4b74-9cee-c188352f57ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.632 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[7469c128-8084-4993-a513-e8041fec1131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 NetworkManager[48970]: <info>  [1769850682.6446] device (tap919288ff-a0): carrier: link connected
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.647 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[2a5059eb-8a34-4a0b-b5dc-4223a4648780]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.659 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[58aad79d-dc2b-40ab-a536-0ad63f0f1a96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap919288ff-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:27:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 266], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 982307, 'reachable_time': 27316, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 328896, 'error': None, 'target': 'ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.669 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[cc29485a-8f1c-4291-b1bd-8350f9e0ef41]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe63:2710'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 982307, 'tstamp': 982307}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 328897, 'error': None, 'target': 'ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.682 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e7c62d2c-267b-4b23-bdc4-43c53e71a8b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap919288ff-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:27:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 266], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 982307, 'reachable_time': 27316, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 328904, 'error': None, 'target': 'ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.703 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8eb25388-ce67-4362-b102-20d20690e149]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.740 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[3995f510-14fb-4048-8c5c-8f241a8acd1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.741 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap919288ff-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.742 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.742 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap919288ff-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:22 np0005603623 NetworkManager[48970]: <info>  [1769850682.7442] manager: (tap919288ff-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/434)
Jan 31 04:11:22 np0005603623 kernel: tap919288ff-a0: entered promiscuous mode
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.743 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.749 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap919288ff-a0, col_values=(('external_ids', {'iface-id': 'da214d50-142b-42ea-aa25-6c69e8caf69b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.750 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:22 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:22Z|00911|binding|INFO|Releasing lport da214d50-142b-42ea-aa25-6c69e8caf69b from this chassis (sb_readonly=0)
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.751 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/919288ff-a51c-4b6d-81b3-cc76704eca9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/919288ff-a51c-4b6d-81b3-cc76704eca9e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.751 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3cff2d-4c52-45fb-9564-b16e10cc8f37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.752 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-919288ff-a51c-4b6d-81b3-cc76704eca9e
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/919288ff-a51c-4b6d-81b3-cc76704eca9e.pid.haproxy
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 919288ff-a51c-4b6d-81b3-cc76704eca9e
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:11:22 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:22.753 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e', 'env', 'PROCESS_TAG=haproxy-919288ff-a51c-4b6d-81b3-cc76704eca9e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/919288ff-a51c-4b6d-81b3-cc76704eca9e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.756 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.835 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850682.8347995, 94b8b94c-865c-4d70-af81-218fd54902b0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.835 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] VM Started (Lifecycle Event)#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.866 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.869 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850682.8357265, 94b8b94c-865c-4d70-af81-218fd54902b0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.869 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.887 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.890 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:11:22 np0005603623 nova_compute[226235]: 2026-01-31 09:11:22.922 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:11:23 np0005603623 podman[328972]: 2026-01-31 09:11:23.069586676 +0000 UTC m=+0.045947223 container create f0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:11:23 np0005603623 systemd[1]: Started libpod-conmon-f0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f.scope.
Jan 31 04:11:23 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:11:23 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74059a7eafa5d4fc0988a8197e0bb5d5cba2b0dde39c36a60f15956c7063c736/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:11:23 np0005603623 podman[328972]: 2026-01-31 09:11:23.044086596 +0000 UTC m=+0.020447193 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:11:23 np0005603623 podman[328972]: 2026-01-31 09:11:23.145647022 +0000 UTC m=+0.122007569 container init f0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 04:11:23 np0005603623 podman[328972]: 2026-01-31 09:11:23.149824123 +0000 UTC m=+0.126184670 container start f0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:11:23 np0005603623 neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e[328987]: [NOTICE]   (328991) : New worker (328993) forked
Jan 31 04:11:23 np0005603623 neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e[328987]: [NOTICE]   (328991) : Loading success.
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.250 226239 DEBUG nova.compute.manager [req-6032ddc3-4a63-49df-94fd-190cd25f21a6 req-c19098dc-d218-4046-8884-d47e7184d70c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Received event network-vif-plugged-d2ae39fe-704a-46ab-ae15-c8171d1d776f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.250 226239 DEBUG oslo_concurrency.lockutils [req-6032ddc3-4a63-49df-94fd-190cd25f21a6 req-c19098dc-d218-4046-8884-d47e7184d70c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.251 226239 DEBUG oslo_concurrency.lockutils [req-6032ddc3-4a63-49df-94fd-190cd25f21a6 req-c19098dc-d218-4046-8884-d47e7184d70c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.251 226239 DEBUG oslo_concurrency.lockutils [req-6032ddc3-4a63-49df-94fd-190cd25f21a6 req-c19098dc-d218-4046-8884-d47e7184d70c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.251 226239 DEBUG nova.compute.manager [req-6032ddc3-4a63-49df-94fd-190cd25f21a6 req-c19098dc-d218-4046-8884-d47e7184d70c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Processing event network-vif-plugged-d2ae39fe-704a-46ab-ae15-c8171d1d776f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.252 226239 DEBUG nova.compute.manager [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.255 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850683.255216, 94b8b94c-865c-4d70-af81-218fd54902b0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.255 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.257 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.260 226239 INFO nova.virt.libvirt.driver [-] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Instance spawned successfully.#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.261 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.277 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.281 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.287 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.288 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.288 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.288 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.289 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.289 226239 DEBUG nova.virt.libvirt.driver [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.315 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.379 226239 INFO nova.compute.manager [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Took 9.69 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.379 226239 DEBUG nova.compute.manager [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.468 226239 INFO nova.compute.manager [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Took 10.62 seconds to build instance.#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.490 226239 DEBUG oslo_concurrency.lockutils [None req-82c1f9ef-920a-4290-97aa-6310d3376f2d ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.752s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.545 226239 DEBUG nova.network.neutron [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Updating instance_info_cache with network_info: [{"id": "0b63d009-60f2-4cf2-afea-679373e18e95", "address": "fa:16:3e:05:73:e5", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b63d009-60", "ovs_interfaceid": "0b63d009-60f2-4cf2-afea-679373e18e95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.574 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Releasing lock "refresh_cache-c893e608-08e4-4eab-a992-b241e484ea48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.575 226239 DEBUG nova.compute.manager [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Instance network_info: |[{"id": "0b63d009-60f2-4cf2-afea-679373e18e95", "address": "fa:16:3e:05:73:e5", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b63d009-60", "ovs_interfaceid": "0b63d009-60f2-4cf2-afea-679373e18e95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.576 226239 DEBUG oslo_concurrency.lockutils [req-22a17314-8a35-47b5-a0c1-fde6303f78e6 req-e94f13f6-f1f0-4a10-ad72-30468ccc50d5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c893e608-08e4-4eab-a992-b241e484ea48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.576 226239 DEBUG nova.network.neutron [req-22a17314-8a35-47b5-a0c1-fde6303f78e6 req-e94f13f6-f1f0-4a10-ad72-30468ccc50d5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Refreshing network info cache for port 0b63d009-60f2-4cf2-afea-679373e18e95 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.579 226239 DEBUG nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Start _get_guest_xml network_info=[{"id": "0b63d009-60f2-4cf2-afea-679373e18e95", "address": "fa:16:3e:05:73:e5", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b63d009-60", "ovs_interfaceid": "0b63d009-60f2-4cf2-afea-679373e18e95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T09:11:06Z,direct_url=<?>,disk_format='raw',id=f398889c-e272-449f-a032-36096e95d18f,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-707513110',owner='c0be57039fd34aa9a2d05d9086ccff13',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T09:11:14Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': 'f398889c-e272-449f-a032-36096e95d18f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.587 226239 WARNING nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.594 226239 DEBUG nova.virt.libvirt.host [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.595 226239 DEBUG nova.virt.libvirt.host [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.602 226239 DEBUG nova.virt.libvirt.host [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.603 226239 DEBUG nova.virt.libvirt.host [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.604 226239 DEBUG nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.604 226239 DEBUG nova.virt.hardware [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T09:11:06Z,direct_url=<?>,disk_format='raw',id=f398889c-e272-449f-a032-36096e95d18f,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-707513110',owner='c0be57039fd34aa9a2d05d9086ccff13',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T09:11:14Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.605 226239 DEBUG nova.virt.hardware [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.605 226239 DEBUG nova.virt.hardware [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.605 226239 DEBUG nova.virt.hardware [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.605 226239 DEBUG nova.virt.hardware [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.606 226239 DEBUG nova.virt.hardware [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.606 226239 DEBUG nova.virt.hardware [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.606 226239 DEBUG nova.virt.hardware [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.607 226239 DEBUG nova.virt.hardware [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.607 226239 DEBUG nova.virt.hardware [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.607 226239 DEBUG nova.virt.hardware [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:11:23 np0005603623 nova_compute[226235]: 2026-01-31 09:11:23.610 226239 DEBUG oslo_concurrency.processutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:23.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:23.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:11:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2122974663' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.093 226239 DEBUG oslo_concurrency.processutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.122 226239 DEBUG nova.storage.rbd_utils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image c893e608-08e4-4eab-a992-b241e484ea48_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.128 226239 DEBUG oslo_concurrency.processutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:11:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1001211175' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.576 226239 DEBUG oslo_concurrency.processutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.579 226239 DEBUG nova.virt.libvirt.vif [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:11:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1862064566',display_name='tempest-TestSnapshotPattern-server-1862064566',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1862064566',id=219,image_ref='f398889c-e272-449f-a032-36096e95d18f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKpOUB83OLuX57XNlwrNvNm9A+WqJpxKoMPAfeJaapb6yUdxSW7lKu+x7yQPy1sLXzgH0zh++G8qQAE8XC1z1HkX9voX2EUsKpZ4pSJBKy01SGRO5BQ6pTGTKexhMNezAw==',key_name='tempest-TestSnapshotPattern-475590621',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c0be57039fd34aa9a2d05d9086ccff13',ramdisk_id='',reservation_id='r-x7vjslr0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='f0b8c4bd-36b6-479c-9160-9adb3a86dc6f',image_min_disk='1',image_min_ram='0',image_owner_id='c0be57039fd34aa9a2d05d9086ccff13',image_owner_project_name='tempest-TestSnapshotPattern-418405266',image_owner_user_name='tempest-TestSnapshotPattern-418405266-project-member',image_user_id='b7233f93367f4dcd8eb2b6b115680192',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-418405266',owner_user_name='tempest-TestSnapshotPattern-418405266-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:11:19Z,user_data=None,user_id='b7233f93367f4dcd8eb2b6b115680192',uuid=c893e608-08e4-4eab-a992-b241e484ea48,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b63d009-60f2-4cf2-afea-679373e18e95", "address": "fa:16:3e:05:73:e5", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b63d009-60", "ovs_interfaceid": "0b63d009-60f2-4cf2-afea-679373e18e95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.580 226239 DEBUG nova.network.os_vif_util [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Converting VIF {"id": "0b63d009-60f2-4cf2-afea-679373e18e95", "address": "fa:16:3e:05:73:e5", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b63d009-60", "ovs_interfaceid": "0b63d009-60f2-4cf2-afea-679373e18e95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.580 226239 DEBUG nova.network.os_vif_util [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:73:e5,bridge_name='br-int',has_traffic_filtering=True,id=0b63d009-60f2-4cf2-afea-679373e18e95,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b63d009-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.582 226239 DEBUG nova.objects.instance [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lazy-loading 'pci_devices' on Instance uuid c893e608-08e4-4eab-a992-b241e484ea48 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.601 226239 DEBUG nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  <uuid>c893e608-08e4-4eab-a992-b241e484ea48</uuid>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  <name>instance-000000db</name>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <nova:name>tempest-TestSnapshotPattern-server-1862064566</nova:name>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:11:23</nova:creationTime>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <nova:user uuid="b7233f93367f4dcd8eb2b6b115680192">tempest-TestSnapshotPattern-418405266-project-member</nova:user>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <nova:project uuid="c0be57039fd34aa9a2d05d9086ccff13">tempest-TestSnapshotPattern-418405266</nova:project>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="f398889c-e272-449f-a032-36096e95d18f"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <nova:port uuid="0b63d009-60f2-4cf2-afea-679373e18e95">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <entry name="serial">c893e608-08e4-4eab-a992-b241e484ea48</entry>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <entry name="uuid">c893e608-08e4-4eab-a992-b241e484ea48</entry>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/c893e608-08e4-4eab-a992-b241e484ea48_disk">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/c893e608-08e4-4eab-a992-b241e484ea48_disk.config">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:05:73:e5"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <target dev="tap0b63d009-60"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/c893e608-08e4-4eab-a992-b241e484ea48/console.log" append="off"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <input type="keyboard" bus="usb"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:11:24 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:11:24 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:11:24 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:11:24 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.607 226239 DEBUG nova.compute.manager [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Preparing to wait for external event network-vif-plugged-0b63d009-60f2-4cf2-afea-679373e18e95 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.608 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "c893e608-08e4-4eab-a992-b241e484ea48-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.608 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.608 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.609 226239 DEBUG nova.virt.libvirt.vif [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:11:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1862064566',display_name='tempest-TestSnapshotPattern-server-1862064566',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1862064566',id=219,image_ref='f398889c-e272-449f-a032-36096e95d18f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKpOUB83OLuX57XNlwrNvNm9A+WqJpxKoMPAfeJaapb6yUdxSW7lKu+x7yQPy1sLXzgH0zh++G8qQAE8XC1z1HkX9voX2EUsKpZ4pSJBKy01SGRO5BQ6pTGTKexhMNezAw==',key_name='tempest-TestSnapshotPattern-475590621',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c0be57039fd34aa9a2d05d9086ccff13',ramdisk_id='',reservation_id='r-x7vjslr0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='f0b8c4bd-36b6-479c-9160-9adb3a86dc6f',image_min_disk='1',image_min_ram='0',image_owner_id='c0be57039fd34aa9a2d05d9086ccff13',image_owner_project_name='tempest-TestSnapshotPattern-418405266',image_owner_user_name='tempest-TestSnapshotPattern-418405266-project-member',image_user_id='b7233f93367f4dcd8eb2b6b115680192',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-418405266',owner_user_name='tempest-TestSnapshotPattern-418405266-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:11:19Z,user_data=None,user_id='b7233f93367f4dcd8eb2b6b115680192',uuid=c893e608-08e4-4eab-a992-b241e484ea48,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b63d009-60f2-4cf2-afea-679373e18e95", "address": "fa:16:3e:05:73:e5", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b63d009-60", "ovs_interfaceid": "0b63d009-60f2-4cf2-afea-679373e18e95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.610 226239 DEBUG nova.network.os_vif_util [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Converting VIF {"id": "0b63d009-60f2-4cf2-afea-679373e18e95", "address": "fa:16:3e:05:73:e5", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b63d009-60", "ovs_interfaceid": "0b63d009-60f2-4cf2-afea-679373e18e95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.610 226239 DEBUG nova.network.os_vif_util [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:73:e5,bridge_name='br-int',has_traffic_filtering=True,id=0b63d009-60f2-4cf2-afea-679373e18e95,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b63d009-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.611 226239 DEBUG os_vif [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:73:e5,bridge_name='br-int',has_traffic_filtering=True,id=0b63d009-60f2-4cf2-afea-679373e18e95,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b63d009-60') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.611 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.612 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.612 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.615 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.615 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b63d009-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.615 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b63d009-60, col_values=(('external_ids', {'iface-id': '0b63d009-60f2-4cf2-afea-679373e18e95', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:73:e5', 'vm-uuid': 'c893e608-08e4-4eab-a992-b241e484ea48'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:24 np0005603623 NetworkManager[48970]: <info>  [1769850684.6181] manager: (tap0b63d009-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/435)
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.619 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.622 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.623 226239 INFO os_vif [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:73:e5,bridge_name='br-int',has_traffic_filtering=True,id=0b63d009-60f2-4cf2-afea-679373e18e95,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b63d009-60')#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.693 226239 DEBUG nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.693 226239 DEBUG nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.694 226239 DEBUG nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] No VIF found with MAC fa:16:3e:05:73:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.694 226239 INFO nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Using config drive#033[00m
Jan 31 04:11:24 np0005603623 nova_compute[226235]: 2026-01-31 09:11:24.717 226239 DEBUG nova.storage.rbd_utils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image c893e608-08e4-4eab-a992-b241e484ea48_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.170 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.362 226239 DEBUG nova.compute.manager [req-ff71d2d4-bf25-4da5-b0cf-e11bc1a081bb req-3165d870-ffd0-4796-860e-05ac7f554907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Received event network-vif-plugged-d2ae39fe-704a-46ab-ae15-c8171d1d776f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.362 226239 DEBUG oslo_concurrency.lockutils [req-ff71d2d4-bf25-4da5-b0cf-e11bc1a081bb req-3165d870-ffd0-4796-860e-05ac7f554907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.363 226239 DEBUG oslo_concurrency.lockutils [req-ff71d2d4-bf25-4da5-b0cf-e11bc1a081bb req-3165d870-ffd0-4796-860e-05ac7f554907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.363 226239 DEBUG oslo_concurrency.lockutils [req-ff71d2d4-bf25-4da5-b0cf-e11bc1a081bb req-3165d870-ffd0-4796-860e-05ac7f554907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.363 226239 DEBUG nova.compute.manager [req-ff71d2d4-bf25-4da5-b0cf-e11bc1a081bb req-3165d870-ffd0-4796-860e-05ac7f554907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] No waiting events found dispatching network-vif-plugged-d2ae39fe-704a-46ab-ae15-c8171d1d776f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.363 226239 WARNING nova.compute.manager [req-ff71d2d4-bf25-4da5-b0cf-e11bc1a081bb req-3165d870-ffd0-4796-860e-05ac7f554907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Received unexpected event network-vif-plugged-d2ae39fe-704a-46ab-ae15-c8171d1d776f for instance with vm_state active and task_state None.#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.371 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.371 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.371 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.371 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 94b8b94c-865c-4d70-af81-218fd54902b0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.403 226239 INFO nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Creating config drive at /var/lib/nova/instances/c893e608-08e4-4eab-a992-b241e484ea48/disk.config#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.407 226239 DEBUG oslo_concurrency.processutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c893e608-08e4-4eab-a992-b241e484ea48/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp2evgfa3r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.491 226239 DEBUG nova.network.neutron [req-22a17314-8a35-47b5-a0c1-fde6303f78e6 req-e94f13f6-f1f0-4a10-ad72-30468ccc50d5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Updated VIF entry in instance network info cache for port 0b63d009-60f2-4cf2-afea-679373e18e95. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.492 226239 DEBUG nova.network.neutron [req-22a17314-8a35-47b5-a0c1-fde6303f78e6 req-e94f13f6-f1f0-4a10-ad72-30468ccc50d5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Updating instance_info_cache with network_info: [{"id": "0b63d009-60f2-4cf2-afea-679373e18e95", "address": "fa:16:3e:05:73:e5", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b63d009-60", "ovs_interfaceid": "0b63d009-60f2-4cf2-afea-679373e18e95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.515 226239 DEBUG oslo_concurrency.lockutils [req-22a17314-8a35-47b5-a0c1-fde6303f78e6 req-e94f13f6-f1f0-4a10-ad72-30468ccc50d5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c893e608-08e4-4eab-a992-b241e484ea48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.528 226239 DEBUG oslo_concurrency.processutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c893e608-08e4-4eab-a992-b241e484ea48/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp2evgfa3r" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.551 226239 DEBUG nova.storage.rbd_utils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image c893e608-08e4-4eab-a992-b241e484ea48_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.553 226239 DEBUG oslo_concurrency.processutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c893e608-08e4-4eab-a992-b241e484ea48/disk.config c893e608-08e4-4eab-a992-b241e484ea48_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:25.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.689 226239 DEBUG oslo_concurrency.processutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c893e608-08e4-4eab-a992-b241e484ea48/disk.config c893e608-08e4-4eab-a992-b241e484ea48_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.690 226239 INFO nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Deleting local config drive /var/lib/nova/instances/c893e608-08e4-4eab-a992-b241e484ea48/disk.config because it was imported into RBD.#033[00m
Jan 31 04:11:25 np0005603623 NetworkManager[48970]: <info>  [1769850685.7191] manager: (tap0b63d009-60): new Tun device (/org/freedesktop/NetworkManager/Devices/436)
Jan 31 04:11:25 np0005603623 kernel: tap0b63d009-60: entered promiscuous mode
Jan 31 04:11:25 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:25Z|00912|binding|INFO|Claiming lport 0b63d009-60f2-4cf2-afea-679373e18e95 for this chassis.
Jan 31 04:11:25 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:25Z|00913|binding|INFO|0b63d009-60f2-4cf2-afea-679373e18e95: Claiming fa:16:3e:05:73:e5 10.100.0.10
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.725 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.727 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.731 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:25 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:25Z|00914|binding|INFO|Setting lport 0b63d009-60f2-4cf2-afea-679373e18e95 ovn-installed in OVS
Jan 31 04:11:25 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:25Z|00915|binding|INFO|Setting lport 0b63d009-60f2-4cf2-afea-679373e18e95 up in Southbound
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.733 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.735 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.730 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:73:e5 10.100.0.10'], port_security=['fa:16:3e:05:73:e5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c893e608-08e4-4eab-a992-b241e484ea48', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08c50cf6-ab45-467a-a4d2-628200ead973', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c0be57039fd34aa9a2d05d9086ccff13', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1b8739db-40d8-4ddd-aaaf-640fa0a0d612', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=df4ff59b-6bd6-4dbb-9483-bf928652de1a, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=0b63d009-60f2-4cf2-afea-679373e18e95) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.732 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 0b63d009-60f2-4cf2-afea-679373e18e95 in datapath 08c50cf6-ab45-467a-a4d2-628200ead973 bound to our chassis#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.734 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08c50cf6-ab45-467a-a4d2-628200ead973#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.747 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5f7fd839-a4d8-42fe-840f-2ad000f47639]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.748 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap08c50cf6-a1 in ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.749 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap08c50cf6-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.750 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[eed399d8-dde1-4c8e-a410-b3e2e1a4232d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.751 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4a101d77-3462-4284-b571-2031b7b3e9f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 systemd-udevd[329190]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:11:25 np0005603623 systemd-machined[194379]: New machine qemu-101-instance-000000db.
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.759 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[39f01548-a661-426a-96a9-92b9f605c6f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 NetworkManager[48970]: <info>  [1769850685.7651] device (tap0b63d009-60): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:11:25 np0005603623 NetworkManager[48970]: <info>  [1769850685.7660] device (tap0b63d009-60): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:11:25 np0005603623 systemd[1]: Started Virtual Machine qemu-101-instance-000000db.
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.780 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b730b2c4-a1a2-4550-a3ef-428785975186]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.801 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[90cc016d-0c80-4845-8a13-7106111f4f4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 NetworkManager[48970]: <info>  [1769850685.8071] manager: (tap08c50cf6-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/437)
Jan 31 04:11:25 np0005603623 systemd-udevd[329194]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.806 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[53287af4-c7f0-4845-b309-ebbd83b06300]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.836 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ddbe8f40-028e-408b-ad7c-f520ca38d209]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.838 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[33988ba2-73cd-490c-b7ed-94c1e300b848]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:25.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:25 np0005603623 NetworkManager[48970]: <info>  [1769850685.8534] device (tap08c50cf6-a0): carrier: link connected
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.857 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[c979048e-8387-4dd4-bfcc-a769509428cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.872 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a755e6-f64f-4495-94f9-20765eb8d915]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08c50cf6-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:ae:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 268], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 982628, 'reachable_time': 16982, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 329222, 'error': None, 'target': 'ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.886 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[f485738a-002e-4a36-9a28-3ae6d4634c6a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe15:ae8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 982628, 'tstamp': 982628}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 329223, 'error': None, 'target': 'ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.898 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[163dc914-d2ed-4293-90b5-bc2ec565f395]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08c50cf6-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:ae:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 268], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 982628, 'reachable_time': 16982, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 329224, 'error': None, 'target': 'ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.926 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6b90d53e-5850-4f1f-83ce-0d6f6882cb32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.968 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e0b55ce0-dc51-454a-b589-d0f96bdf6696]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.970 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08c50cf6-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.970 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.971 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08c50cf6-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:25 np0005603623 NetworkManager[48970]: <info>  [1769850685.9897] manager: (tap08c50cf6-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/438)
Jan 31 04:11:25 np0005603623 kernel: tap08c50cf6-a0: entered promiscuous mode
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.989 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.991 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.994 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08c50cf6-a0, col_values=(('external_ids', {'iface-id': 'e8e83b87-bd31-4052-864a-9e5dbf11e897'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:25 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:25Z|00916|binding|INFO|Releasing lport e8e83b87-bd31-4052-864a-9e5dbf11e897 from this chassis (sb_readonly=0)
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.995 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:25 np0005603623 nova_compute[226235]: 2026-01-31 09:11:25.996 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.996 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/08c50cf6-ab45-467a-a4d2-628200ead973.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/08c50cf6-ab45-467a-a4d2-628200ead973.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.997 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4480df-994f-4d97-810f-73150d97e75e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.998 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-08c50cf6-ab45-467a-a4d2-628200ead973
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/08c50cf6-ab45-467a-a4d2-628200ead973.pid.haproxy
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 08c50cf6-ab45-467a-a4d2-628200ead973
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:11:25 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:25.998 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973', 'env', 'PROCESS_TAG=haproxy-08c50cf6-ab45-467a-a4d2-628200ead973', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/08c50cf6-ab45-467a-a4d2-628200ead973.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.000 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.249 226239 DEBUG nova.compute.manager [req-d2815769-0a94-4b0c-8bd4-302cc9c07d61 req-6ed83385-7cdf-415f-94ad-a909d73dfc5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Received event network-vif-plugged-0b63d009-60f2-4cf2-afea-679373e18e95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.249 226239 DEBUG oslo_concurrency.lockutils [req-d2815769-0a94-4b0c-8bd4-302cc9c07d61 req-6ed83385-7cdf-415f-94ad-a909d73dfc5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c893e608-08e4-4eab-a992-b241e484ea48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.250 226239 DEBUG oslo_concurrency.lockutils [req-d2815769-0a94-4b0c-8bd4-302cc9c07d61 req-6ed83385-7cdf-415f-94ad-a909d73dfc5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.250 226239 DEBUG oslo_concurrency.lockutils [req-d2815769-0a94-4b0c-8bd4-302cc9c07d61 req-6ed83385-7cdf-415f-94ad-a909d73dfc5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.250 226239 DEBUG nova.compute.manager [req-d2815769-0a94-4b0c-8bd4-302cc9c07d61 req-6ed83385-7cdf-415f-94ad-a909d73dfc5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Processing event network-vif-plugged-0b63d009-60f2-4cf2-afea-679373e18e95 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:11:26 np0005603623 podman[329256]: 2026-01-31 09:11:26.293074868 +0000 UTC m=+0.047796250 container create 994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 04:11:26 np0005603623 systemd[1]: Started libpod-conmon-994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f.scope.
Jan 31 04:11:26 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:11:26 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/839732a2390952220ed89078acd0a6523bbdc7787aa1d760166b551047c11a06/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:11:26 np0005603623 podman[329256]: 2026-01-31 09:11:26.364331634 +0000 UTC m=+0.119053046 container init 994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 04:11:26 np0005603623 podman[329256]: 2026-01-31 09:11:26.368962299 +0000 UTC m=+0.123683681 container start 994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 04:11:26 np0005603623 podman[329256]: 2026-01-31 09:11:26.273212025 +0000 UTC m=+0.027933427 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:11:26 np0005603623 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[329270]: [NOTICE]   (329274) : New worker (329276) forked
Jan 31 04:11:26 np0005603623 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[329270]: [NOTICE]   (329274) : Loading success.
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.793 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850686.7934368, c893e608-08e4-4eab-a992-b241e484ea48 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.794 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c893e608-08e4-4eab-a992-b241e484ea48] VM Started (Lifecycle Event)#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.797 226239 DEBUG nova.compute.manager [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.800 226239 DEBUG nova.virt.libvirt.driver [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.803 226239 INFO nova.virt.libvirt.driver [-] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Instance spawned successfully.#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.803 226239 INFO nova.compute.manager [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Took 7.46 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.803 226239 DEBUG nova.compute.manager [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.843 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.845 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.874 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c893e608-08e4-4eab-a992-b241e484ea48] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.874 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850686.7936206, c893e608-08e4-4eab-a992-b241e484ea48 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.874 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c893e608-08e4-4eab-a992-b241e484ea48] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.884 226239 INFO nova.compute.manager [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Took 8.49 seconds to build instance.#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.886 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Updating instance_info_cache with network_info: [{"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.901 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.905 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850686.7993631, c893e608-08e4-4eab-a992-b241e484ea48 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.905 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c893e608-08e4-4eab-a992-b241e484ea48] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.907 226239 DEBUG oslo_concurrency.lockutils [None req-7578559e-b54b-4074-95a7-a3b85a25008f b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.908 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.908 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.923 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:11:26 np0005603623 nova_compute[226235]: 2026-01-31 09:11:26.926 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:11:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.097 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.176 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.177 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.177 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.177 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.178 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:27.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:11:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3354774569' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.641 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.732 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000db as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.733 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000db as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.736 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000da as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.736 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000da as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:11:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:27.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.874 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.875 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3900MB free_disk=20.876209259033203GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.875 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.875 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.943 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 94b8b94c-865c-4d70-af81-218fd54902b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.943 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance c893e608-08e4-4eab-a992-b241e484ea48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.943 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.943 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:11:27 np0005603623 nova_compute[226235]: 2026-01-31 09:11:27.990 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:28 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:28.282 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '94'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.327 226239 DEBUG nova.compute.manager [req-adf54f1f-cc8d-4842-9f51-9a9e8b520246 req-d6bc1063-795a-4d40-9742-773d79700c45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Received event network-changed-d2ae39fe-704a-46ab-ae15-c8171d1d776f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.328 226239 DEBUG nova.compute.manager [req-adf54f1f-cc8d-4842-9f51-9a9e8b520246 req-d6bc1063-795a-4d40-9742-773d79700c45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Refreshing instance network info cache due to event network-changed-d2ae39fe-704a-46ab-ae15-c8171d1d776f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.328 226239 DEBUG oslo_concurrency.lockutils [req-adf54f1f-cc8d-4842-9f51-9a9e8b520246 req-d6bc1063-795a-4d40-9742-773d79700c45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.329 226239 DEBUG oslo_concurrency.lockutils [req-adf54f1f-cc8d-4842-9f51-9a9e8b520246 req-d6bc1063-795a-4d40-9742-773d79700c45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.329 226239 DEBUG nova.network.neutron [req-adf54f1f-cc8d-4842-9f51-9a9e8b520246 req-d6bc1063-795a-4d40-9742-773d79700c45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Refreshing network info cache for port d2ae39fe-704a-46ab-ae15-c8171d1d776f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.333 226239 DEBUG nova.compute.manager [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Received event network-vif-plugged-0b63d009-60f2-4cf2-afea-679373e18e95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.333 226239 DEBUG oslo_concurrency.lockutils [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c893e608-08e4-4eab-a992-b241e484ea48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.333 226239 DEBUG oslo_concurrency.lockutils [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.334 226239 DEBUG oslo_concurrency.lockutils [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.334 226239 DEBUG nova.compute.manager [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] No waiting events found dispatching network-vif-plugged-0b63d009-60f2-4cf2-afea-679373e18e95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.334 226239 WARNING nova.compute.manager [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Received unexpected event network-vif-plugged-0b63d009-60f2-4cf2-afea-679373e18e95 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.334 226239 DEBUG nova.compute.manager [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Received event network-changed-d2ae39fe-704a-46ab-ae15-c8171d1d776f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.334 226239 DEBUG nova.compute.manager [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Refreshing instance network info cache due to event network-changed-d2ae39fe-704a-46ab-ae15-c8171d1d776f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.335 226239 DEBUG oslo_concurrency.lockutils [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:11:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:11:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/830516827' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.415 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.420 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.450 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.478 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:11:28 np0005603623 nova_compute[226235]: 2026-01-31 09:11:28.479 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:29 np0005603623 nova_compute[226235]: 2026-01-31 09:11:29.259 226239 DEBUG nova.compute.manager [req-167192e5-669f-4bc4-b104-ee2d5d3a29d6 req-2f24b5da-a335-4a97-a9b1-d52da5cd7a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Received event network-changed-0b63d009-60f2-4cf2-afea-679373e18e95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:11:29 np0005603623 nova_compute[226235]: 2026-01-31 09:11:29.259 226239 DEBUG nova.compute.manager [req-167192e5-669f-4bc4-b104-ee2d5d3a29d6 req-2f24b5da-a335-4a97-a9b1-d52da5cd7a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Refreshing instance network info cache due to event network-changed-0b63d009-60f2-4cf2-afea-679373e18e95. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:11:29 np0005603623 nova_compute[226235]: 2026-01-31 09:11:29.260 226239 DEBUG oslo_concurrency.lockutils [req-167192e5-669f-4bc4-b104-ee2d5d3a29d6 req-2f24b5da-a335-4a97-a9b1-d52da5cd7a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c893e608-08e4-4eab-a992-b241e484ea48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:11:29 np0005603623 nova_compute[226235]: 2026-01-31 09:11:29.260 226239 DEBUG oslo_concurrency.lockutils [req-167192e5-669f-4bc4-b104-ee2d5d3a29d6 req-2f24b5da-a335-4a97-a9b1-d52da5cd7a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c893e608-08e4-4eab-a992-b241e484ea48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:11:29 np0005603623 nova_compute[226235]: 2026-01-31 09:11:29.260 226239 DEBUG nova.network.neutron [req-167192e5-669f-4bc4-b104-ee2d5d3a29d6 req-2f24b5da-a335-4a97-a9b1-d52da5cd7a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Refreshing network info cache for port 0b63d009-60f2-4cf2-afea-679373e18e95 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:11:29 np0005603623 nova_compute[226235]: 2026-01-31 09:11:29.478 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:29 np0005603623 nova_compute[226235]: 2026-01-31 09:11:29.492 226239 DEBUG nova.network.neutron [req-adf54f1f-cc8d-4842-9f51-9a9e8b520246 req-d6bc1063-795a-4d40-9742-773d79700c45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Updated VIF entry in instance network info cache for port d2ae39fe-704a-46ab-ae15-c8171d1d776f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:11:29 np0005603623 nova_compute[226235]: 2026-01-31 09:11:29.492 226239 DEBUG nova.network.neutron [req-adf54f1f-cc8d-4842-9f51-9a9e8b520246 req-d6bc1063-795a-4d40-9742-773d79700c45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Updating instance_info_cache with network_info: [{"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:11:29 np0005603623 nova_compute[226235]: 2026-01-31 09:11:29.507 226239 DEBUG oslo_concurrency.lockutils [req-adf54f1f-cc8d-4842-9f51-9a9e8b520246 req-d6bc1063-795a-4d40-9742-773d79700c45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:11:29 np0005603623 nova_compute[226235]: 2026-01-31 09:11:29.508 226239 DEBUG oslo_concurrency.lockutils [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:11:29 np0005603623 nova_compute[226235]: 2026-01-31 09:11:29.508 226239 DEBUG nova.network.neutron [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Refreshing network info cache for port d2ae39fe-704a-46ab-ae15-c8171d1d776f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:11:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:29.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:29 np0005603623 nova_compute[226235]: 2026-01-31 09:11:29.618 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:11:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:29.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:11:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:30.163 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:30.163 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:30.164 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:30 np0005603623 nova_compute[226235]: 2026-01-31 09:11:30.610 226239 DEBUG nova.network.neutron [req-167192e5-669f-4bc4-b104-ee2d5d3a29d6 req-2f24b5da-a335-4a97-a9b1-d52da5cd7a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Updated VIF entry in instance network info cache for port 0b63d009-60f2-4cf2-afea-679373e18e95. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:11:30 np0005603623 nova_compute[226235]: 2026-01-31 09:11:30.611 226239 DEBUG nova.network.neutron [req-167192e5-669f-4bc4-b104-ee2d5d3a29d6 req-2f24b5da-a335-4a97-a9b1-d52da5cd7a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Updating instance_info_cache with network_info: [{"id": "0b63d009-60f2-4cf2-afea-679373e18e95", "address": "fa:16:3e:05:73:e5", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b63d009-60", "ovs_interfaceid": "0b63d009-60f2-4cf2-afea-679373e18e95", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:11:30 np0005603623 nova_compute[226235]: 2026-01-31 09:11:30.633 226239 DEBUG oslo_concurrency.lockutils [req-167192e5-669f-4bc4-b104-ee2d5d3a29d6 req-2f24b5da-a335-4a97-a9b1-d52da5cd7a9a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c893e608-08e4-4eab-a992-b241e484ea48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:11:30 np0005603623 nova_compute[226235]: 2026-01-31 09:11:30.753 226239 DEBUG nova.network.neutron [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Updated VIF entry in instance network info cache for port d2ae39fe-704a-46ab-ae15-c8171d1d776f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:11:30 np0005603623 nova_compute[226235]: 2026-01-31 09:11:30.753 226239 DEBUG nova.network.neutron [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Updating instance_info_cache with network_info: [{"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:11:30 np0005603623 nova_compute[226235]: 2026-01-31 09:11:30.785 226239 DEBUG oslo_concurrency.lockutils [req-0adfc873-75c4-4f50-ad19-129f9b89a4ce req-271da435-11d0-4397-8bd4-33d28cb82ca4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-94b8b94c-865c-4d70-af81-218fd54902b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:11:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:31.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:31.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:32 np0005603623 nova_compute[226235]: 2026-01-31 09:11:32.143 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:32 np0005603623 nova_compute[226235]: 2026-01-31 09:11:32.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:33 np0005603623 nova_compute[226235]: 2026-01-31 09:11:33.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:33.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:33.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:34 np0005603623 nova_compute[226235]: 2026-01-31 09:11:34.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:34 np0005603623 nova_compute[226235]: 2026-01-31 09:11:34.622 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:34 np0005603623 podman[329377]: 2026-01-31 09:11:34.978623721 +0000 UTC m=+0.059378724 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:11:35 np0005603623 podman[329378]: 2026-01-31 09:11:35.006472354 +0000 UTC m=+0.087980261 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 31 04:11:35 np0005603623 nova_compute[226235]: 2026-01-31 09:11:35.152 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:35 np0005603623 nova_compute[226235]: 2026-01-31 09:11:35.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 31 04:11:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:35.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 31 04:11:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:35.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:36 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:36Z|00116|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f1:58:48 10.100.0.5
Jan 31 04:11:36 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:36Z|00117|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f1:58:48 10.100.0.5
Jan 31 04:11:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:37 np0005603623 nova_compute[226235]: 2026-01-31 09:11:37.146 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:11:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:37.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:11:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:37.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #181. Immutable memtables: 0.
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.036858) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 115] Flushing memtable with next log file: 181
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850698036938, "job": 115, "event": "flush_started", "num_memtables": 1, "num_entries": 1764, "num_deletes": 253, "total_data_size": 4141945, "memory_usage": 4191864, "flush_reason": "Manual Compaction"}
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 115] Level-0 flush table #182: started
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850698048574, "cf_name": "default", "job": 115, "event": "table_file_creation", "file_number": 182, "file_size": 1654898, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 87582, "largest_seqno": 89341, "table_properties": {"data_size": 1649225, "index_size": 2812, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15106, "raw_average_key_size": 21, "raw_value_size": 1636655, "raw_average_value_size": 2298, "num_data_blocks": 125, "num_entries": 712, "num_filter_entries": 712, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850546, "oldest_key_time": 1769850546, "file_creation_time": 1769850698, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 182, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 115] Flush lasted 11793 microseconds, and 4131 cpu microseconds.
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.048646) [db/flush_job.cc:967] [default] [JOB 115] Level-0 flush table #182: 1654898 bytes OK
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.048669) [db/memtable_list.cc:519] [default] Level-0 commit table #182 started
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.050573) [db/memtable_list.cc:722] [default] Level-0 commit table #182: memtable #1 done
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.050623) EVENT_LOG_v1 {"time_micros": 1769850698050611, "job": 115, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.050650) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 115] Try to delete WAL files size 4133892, prev total WAL file size 4133892, number of live WAL files 2.
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000178.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.051616) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033303038' seq:72057594037927935, type:22 .. '6D6772737461740033323631' seq:0, type:0; will stop at (end)
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 116] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 115 Base level 0, inputs: [182(1616KB)], [180(12MB)]
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850698051665, "job": 116, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [182], "files_L6": [180], "score": -1, "input_data_size": 15100497, "oldest_snapshot_seqno": -1}
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 116] Generated table #183: 10929 keys, 12261405 bytes, temperature: kUnknown
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850698158834, "cf_name": "default", "job": 116, "event": "table_file_creation", "file_number": 183, "file_size": 12261405, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12193776, "index_size": 39228, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27333, "raw_key_size": 287764, "raw_average_key_size": 26, "raw_value_size": 12006089, "raw_average_value_size": 1098, "num_data_blocks": 1486, "num_entries": 10929, "num_filter_entries": 10929, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769850698, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 183, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.159160) [db/compaction/compaction_job.cc:1663] [default] [JOB 116] Compacted 1@0 + 1@6 files to L6 => 12261405 bytes
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.160541) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 140.8 rd, 114.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 12.8 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(16.5) write-amplify(7.4) OK, records in: 11389, records dropped: 460 output_compression: NoCompression
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.160561) EVENT_LOG_v1 {"time_micros": 1769850698160552, "job": 116, "event": "compaction_finished", "compaction_time_micros": 107274, "compaction_time_cpu_micros": 22625, "output_level": 6, "num_output_files": 1, "total_output_size": 12261405, "num_input_records": 11389, "num_output_records": 10929, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000182.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850698160961, "job": 116, "event": "table_file_deletion", "file_number": 182}
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000180.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850698162652, "job": 116, "event": "table_file_deletion", "file_number": 180}
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.051553) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.162723) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.162727) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.162729) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.162732) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:11:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:11:38.162735) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:11:39 np0005603623 nova_compute[226235]: 2026-01-31 09:11:39.626 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:39.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:39.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:40 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:40Z|00118|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.4 does not match offer 10.100.0.10
Jan 31 04:11:40 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:40Z|00119|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:05:73:e5 10.100.0.10
Jan 31 04:11:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:41.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:41.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:42 np0005603623 nova_compute[226235]: 2026-01-31 09:11:42.147 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.399 226239 DEBUG oslo_concurrency.lockutils [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "94b8b94c-865c-4d70-af81-218fd54902b0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.400 226239 DEBUG oslo_concurrency.lockutils [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.400 226239 DEBUG oslo_concurrency.lockutils [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.400 226239 DEBUG oslo_concurrency.lockutils [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.401 226239 DEBUG oslo_concurrency.lockutils [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.402 226239 INFO nova.compute.manager [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Terminating instance#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.403 226239 DEBUG nova.compute.manager [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:11:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:43.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:43 np0005603623 kernel: tapd2ae39fe-70 (unregistering): left promiscuous mode
Jan 31 04:11:43 np0005603623 NetworkManager[48970]: <info>  [1769850703.7534] device (tapd2ae39fe-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:11:43 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:43Z|00917|binding|INFO|Releasing lport d2ae39fe-704a-46ab-ae15-c8171d1d776f from this chassis (sb_readonly=0)
Jan 31 04:11:43 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:43Z|00918|binding|INFO|Setting lport d2ae39fe-704a-46ab-ae15-c8171d1d776f down in Southbound
Jan 31 04:11:43 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:43Z|00919|binding|INFO|Removing iface tapd2ae39fe-70 ovn-installed in OVS
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.762 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.764 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.771 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:43.771 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:58:48 10.100.0.5', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '94b8b94c-865c-4d70-af81-218fd54902b0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-919288ff-a51c-4b6d-81b3-cc76704eca9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be7ac7a5-1e86-4304-8ddd-d276d05956e0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=d2ae39fe-704a-46ab-ae15-c8171d1d776f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:43.773 143258 INFO neutron.agent.ovn.metadata.agent [-] Port d2ae39fe-704a-46ab-ae15-c8171d1d776f in datapath 919288ff-a51c-4b6d-81b3-cc76704eca9e unbound from our chassis#033[00m
Jan 31 04:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:43.774 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 919288ff-a51c-4b6d-81b3-cc76704eca9e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:43.775 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[2bfe8bee-fe7c-4d32-bc0b-15dda0cade1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:43.776 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e namespace which is not needed anymore#033[00m
Jan 31 04:11:43 np0005603623 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000da.scope: Deactivated successfully.
Jan 31 04:11:43 np0005603623 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000da.scope: Consumed 12.852s CPU time.
Jan 31 04:11:43 np0005603623 systemd-machined[194379]: Machine qemu-100-instance-000000da terminated.
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.832 226239 INFO nova.virt.libvirt.driver [-] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Instance destroyed successfully.#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.833 226239 DEBUG nova.objects.instance [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'resources' on Instance uuid 94b8b94c-865c-4d70-af81-218fd54902b0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.853 226239 DEBUG nova.virt.libvirt.vif [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:11:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-1-1734004012',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-gen-1-1734004012',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ge',id=218,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKIc4iaayDmLhEo3vI6YGRzp7m9GW6fzslqwq++gP9ecVHJRq1tSjzVnTPtJw3RUxXTQDWiA7Ya9j/CawC++Id9BLZED+RHeJDZ4JXh3gvgziK3fUhGR6gajupFnKxcV3w==',key_name='tempest-TestSecurityGroupsBasicOps-762998316',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:11:23Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-pdb07s60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:11:23Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=94b8b94c-865c-4d70-af81-218fd54902b0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.853 226239 DEBUG nova.network.os_vif_util [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "address": "fa:16:3e:f1:58:48", "network": {"id": "919288ff-a51c-4b6d-81b3-cc76704eca9e", "bridge": "br-int", "label": "tempest-network-smoke--53537063", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ae39fe-70", "ovs_interfaceid": "d2ae39fe-704a-46ab-ae15-c8171d1d776f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.854 226239 DEBUG nova.network.os_vif_util [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f1:58:48,bridge_name='br-int',has_traffic_filtering=True,id=d2ae39fe-704a-46ab-ae15-c8171d1d776f,network=Network(919288ff-a51c-4b6d-81b3-cc76704eca9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ae39fe-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.854 226239 DEBUG os_vif [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f1:58:48,bridge_name='br-int',has_traffic_filtering=True,id=d2ae39fe-704a-46ab-ae15-c8171d1d776f,network=Network(919288ff-a51c-4b6d-81b3-cc76704eca9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ae39fe-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.856 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.856 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2ae39fe-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.857 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.859 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.861 226239 INFO os_vif [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f1:58:48,bridge_name='br-int',has_traffic_filtering=True,id=d2ae39fe-704a-46ab-ae15-c8171d1d776f,network=Network(919288ff-a51c-4b6d-81b3-cc76704eca9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ae39fe-70')#033[00m
Jan 31 04:11:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:43.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:43 np0005603623 neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e[328987]: [NOTICE]   (328991) : haproxy version is 2.8.14-c23fe91
Jan 31 04:11:43 np0005603623 neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e[328987]: [NOTICE]   (328991) : path to executable is /usr/sbin/haproxy
Jan 31 04:11:43 np0005603623 neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e[328987]: [WARNING]  (328991) : Exiting Master process...
Jan 31 04:11:43 np0005603623 neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e[328987]: [ALERT]    (328991) : Current worker (328993) exited with code 143 (Terminated)
Jan 31 04:11:43 np0005603623 neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e[328987]: [WARNING]  (328991) : All workers exited. Exiting... (0)
Jan 31 04:11:43 np0005603623 systemd[1]: libpod-f0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f.scope: Deactivated successfully.
Jan 31 04:11:43 np0005603623 podman[329463]: 2026-01-31 09:11:43.889692869 +0000 UTC m=+0.047046497 container died f0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:11:43 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f-userdata-shm.mount: Deactivated successfully.
Jan 31 04:11:43 np0005603623 systemd[1]: var-lib-containers-storage-overlay-74059a7eafa5d4fc0988a8197e0bb5d5cba2b0dde39c36a60f15956c7063c736-merged.mount: Deactivated successfully.
Jan 31 04:11:43 np0005603623 podman[329463]: 2026-01-31 09:11:43.923053125 +0000 UTC m=+0.080406753 container cleanup f0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:11:43 np0005603623 systemd[1]: libpod-conmon-f0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f.scope: Deactivated successfully.
Jan 31 04:11:43 np0005603623 podman[329509]: 2026-01-31 09:11:43.976927346 +0000 UTC m=+0.040102190 container remove f0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:43.979 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[afd02ee2-4999-4b90-895a-dbf9f7e5ef91]: (4, ('Sat Jan 31 09:11:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e (f0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f)\nf0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f\nSat Jan 31 09:11:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e (f0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f)\nf0dce35d566fb73f3a3b82591028c57b9964e1df5f095c1b0e08366bbddddf2f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:43.981 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd6be75-5351-4cf2-b0a0-cff63b767176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:43.982 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap919288ff-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.983 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:43 np0005603623 kernel: tap919288ff-a0: left promiscuous mode
Jan 31 04:11:43 np0005603623 nova_compute[226235]: 2026-01-31 09:11:43.989 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:43 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:43.992 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b9d847-318b-4f51-97bf-c68661b40ac8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:44.009 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[07d6d2a0-5cf9-4f25-9d88-e716d553ac40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:44.009 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc8c36d-d777-4c5d-b2c0-949fc2599f01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:44.021 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0a6e02-12f0-4caf-a1d6-398c4d124e57]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 982302, 'reachable_time': 37140, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 329527, 'error': None, 'target': 'ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:44.023 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-919288ff-a51c-4b6d-81b3-cc76704eca9e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:11:44 np0005603623 systemd[1]: run-netns-ovnmeta\x2d919288ff\x2da51c\x2d4b6d\x2d81b3\x2dcc76704eca9e.mount: Deactivated successfully.
Jan 31 04:11:44 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:11:44.023 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[1136fe7d-9cf4-469c-b770-f3eebe3c558b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:11:44 np0005603623 nova_compute[226235]: 2026-01-31 09:11:44.337 226239 DEBUG nova.compute.manager [req-f35eb655-b409-4ba2-879d-3295ca8962eb req-b9ca0a62-6136-439d-9b85-b2bd8ce1a8a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Received event network-vif-unplugged-d2ae39fe-704a-46ab-ae15-c8171d1d776f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:11:44 np0005603623 nova_compute[226235]: 2026-01-31 09:11:44.338 226239 DEBUG oslo_concurrency.lockutils [req-f35eb655-b409-4ba2-879d-3295ca8962eb req-b9ca0a62-6136-439d-9b85-b2bd8ce1a8a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:44 np0005603623 nova_compute[226235]: 2026-01-31 09:11:44.338 226239 DEBUG oslo_concurrency.lockutils [req-f35eb655-b409-4ba2-879d-3295ca8962eb req-b9ca0a62-6136-439d-9b85-b2bd8ce1a8a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:44 np0005603623 nova_compute[226235]: 2026-01-31 09:11:44.338 226239 DEBUG oslo_concurrency.lockutils [req-f35eb655-b409-4ba2-879d-3295ca8962eb req-b9ca0a62-6136-439d-9b85-b2bd8ce1a8a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:44 np0005603623 nova_compute[226235]: 2026-01-31 09:11:44.338 226239 DEBUG nova.compute.manager [req-f35eb655-b409-4ba2-879d-3295ca8962eb req-b9ca0a62-6136-439d-9b85-b2bd8ce1a8a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] No waiting events found dispatching network-vif-unplugged-d2ae39fe-704a-46ab-ae15-c8171d1d776f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:11:44 np0005603623 nova_compute[226235]: 2026-01-31 09:11:44.338 226239 DEBUG nova.compute.manager [req-f35eb655-b409-4ba2-879d-3295ca8962eb req-b9ca0a62-6136-439d-9b85-b2bd8ce1a8a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Received event network-vif-unplugged-d2ae39fe-704a-46ab-ae15-c8171d1d776f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:11:44 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:44Z|00120|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.4 does not match offer 10.100.0.10
Jan 31 04:11:44 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:44Z|00121|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:05:73:e5 10.100.0.10
Jan 31 04:11:45 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:45Z|00122|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:05:73:e5 10.100.0.10
Jan 31 04:11:45 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:45Z|00123|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:05:73:e5 10.100.0.10
Jan 31 04:11:45 np0005603623 nova_compute[226235]: 2026-01-31 09:11:45.486 226239 INFO nova.virt.libvirt.driver [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Deleting instance files /var/lib/nova/instances/94b8b94c-865c-4d70-af81-218fd54902b0_del#033[00m
Jan 31 04:11:45 np0005603623 nova_compute[226235]: 2026-01-31 09:11:45.487 226239 INFO nova.virt.libvirt.driver [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Deletion of /var/lib/nova/instances/94b8b94c-865c-4d70-af81-218fd54902b0_del complete#033[00m
Jan 31 04:11:45 np0005603623 nova_compute[226235]: 2026-01-31 09:11:45.542 226239 INFO nova.compute.manager [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Took 2.14 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:11:45 np0005603623 nova_compute[226235]: 2026-01-31 09:11:45.543 226239 DEBUG oslo.service.loopingcall [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:11:45 np0005603623 nova_compute[226235]: 2026-01-31 09:11:45.543 226239 DEBUG nova.compute.manager [-] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:11:45 np0005603623 nova_compute[226235]: 2026-01-31 09:11:45.544 226239 DEBUG nova.network.neutron [-] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:11:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:45.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:45.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:46 np0005603623 nova_compute[226235]: 2026-01-31 09:11:46.432 226239 DEBUG nova.compute.manager [req-b300ade3-870d-4648-909b-4febf4c1fd4c req-30c53129-3332-42f2-a438-78a7bf203c84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Received event network-vif-plugged-d2ae39fe-704a-46ab-ae15-c8171d1d776f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:11:46 np0005603623 nova_compute[226235]: 2026-01-31 09:11:46.433 226239 DEBUG oslo_concurrency.lockutils [req-b300ade3-870d-4648-909b-4febf4c1fd4c req-30c53129-3332-42f2-a438-78a7bf203c84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:46 np0005603623 nova_compute[226235]: 2026-01-31 09:11:46.433 226239 DEBUG oslo_concurrency.lockutils [req-b300ade3-870d-4648-909b-4febf4c1fd4c req-30c53129-3332-42f2-a438-78a7bf203c84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:46 np0005603623 nova_compute[226235]: 2026-01-31 09:11:46.434 226239 DEBUG oslo_concurrency.lockutils [req-b300ade3-870d-4648-909b-4febf4c1fd4c req-30c53129-3332-42f2-a438-78a7bf203c84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:46 np0005603623 nova_compute[226235]: 2026-01-31 09:11:46.434 226239 DEBUG nova.compute.manager [req-b300ade3-870d-4648-909b-4febf4c1fd4c req-30c53129-3332-42f2-a438-78a7bf203c84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] No waiting events found dispatching network-vif-plugged-d2ae39fe-704a-46ab-ae15-c8171d1d776f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:11:46 np0005603623 nova_compute[226235]: 2026-01-31 09:11:46.434 226239 WARNING nova.compute.manager [req-b300ade3-870d-4648-909b-4febf4c1fd4c req-30c53129-3332-42f2-a438-78a7bf203c84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Received unexpected event network-vif-plugged-d2ae39fe-704a-46ab-ae15-c8171d1d776f for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:11:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:47 np0005603623 nova_compute[226235]: 2026-01-31 09:11:47.151 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:47 np0005603623 nova_compute[226235]: 2026-01-31 09:11:47.337 226239 DEBUG nova.network.neutron [-] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:11:47 np0005603623 nova_compute[226235]: 2026-01-31 09:11:47.356 226239 INFO nova.compute.manager [-] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Took 1.81 seconds to deallocate network for instance.#033[00m
Jan 31 04:11:47 np0005603623 nova_compute[226235]: 2026-01-31 09:11:47.410 226239 DEBUG nova.compute.manager [req-206d7d83-ec49-4959-9ffc-972fbedbec10 req-6e94b3a0-9c87-4564-b356-f166cca6be38 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Received event network-vif-deleted-d2ae39fe-704a-46ab-ae15-c8171d1d776f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:11:47 np0005603623 nova_compute[226235]: 2026-01-31 09:11:47.416 226239 DEBUG oslo_concurrency.lockutils [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:47 np0005603623 nova_compute[226235]: 2026-01-31 09:11:47.417 226239 DEBUG oslo_concurrency.lockutils [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:47 np0005603623 nova_compute[226235]: 2026-01-31 09:11:47.495 226239 DEBUG oslo_concurrency.processutils [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:47.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:47.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:11:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/552452411' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:11:48 np0005603623 nova_compute[226235]: 2026-01-31 09:11:48.011 226239 DEBUG oslo_concurrency.processutils [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:48 np0005603623 nova_compute[226235]: 2026-01-31 09:11:48.019 226239 DEBUG nova.compute.provider_tree [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:11:48 np0005603623 nova_compute[226235]: 2026-01-31 09:11:48.046 226239 DEBUG nova.scheduler.client.report [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:11:48 np0005603623 nova_compute[226235]: 2026-01-31 09:11:48.071 226239 DEBUG oslo_concurrency.lockutils [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:48 np0005603623 nova_compute[226235]: 2026-01-31 09:11:48.101 226239 INFO nova.scheduler.client.report [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Deleted allocations for instance 94b8b94c-865c-4d70-af81-218fd54902b0#033[00m
Jan 31 04:11:48 np0005603623 nova_compute[226235]: 2026-01-31 09:11:48.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:48 np0005603623 nova_compute[226235]: 2026-01-31 09:11:48.162 226239 DEBUG oslo_concurrency.lockutils [None req-6514a13e-ad38-4fdb-8978-954399caebf1 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "94b8b94c-865c-4d70-af81-218fd54902b0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.762s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:48 np0005603623 nova_compute[226235]: 2026-01-31 09:11:48.858 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:48 np0005603623 nova_compute[226235]: 2026-01-31 09:11:48.972 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:49.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:49.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:51.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:51.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:52 np0005603623 nova_compute[226235]: 2026-01-31 09:11:52.192 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:53 np0005603623 nova_compute[226235]: 2026-01-31 09:11:53.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:53 np0005603623 nova_compute[226235]: 2026-01-31 09:11:53.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:11:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:53.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:53 np0005603623 nova_compute[226235]: 2026-01-31 09:11:53.860 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:53.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:54 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:11:54 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:11:54 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:11:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:55.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:55.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:57 np0005603623 nova_compute[226235]: 2026-01-31 09:11:57.195 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:11:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:57.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:11:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:57.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:58 np0005603623 nova_compute[226235]: 2026-01-31 09:11:58.831 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850703.8300385, 94b8b94c-865c-4d70-af81-218fd54902b0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:11:58 np0005603623 nova_compute[226235]: 2026-01-31 09:11:58.831 226239 INFO nova.compute.manager [-] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:11:58 np0005603623 nova_compute[226235]: 2026-01-31 09:11:58.856 226239 DEBUG nova.compute.manager [None req-1ef1fa89-e5d0-401d-94aa-c252944c7d8c - - - - - -] [instance: 94b8b94c-865c-4d70-af81-218fd54902b0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:11:58 np0005603623 nova_compute[226235]: 2026-01-31 09:11:58.863 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:59 np0005603623 ovn_controller[133449]: 2026-01-31T09:11:59Z|00920|binding|INFO|Releasing lport e8e83b87-bd31-4052-864a-9e5dbf11e897 from this chassis (sb_readonly=0)
Jan 31 04:11:59 np0005603623 nova_compute[226235]: 2026-01-31 09:11:59.577 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:59.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:11:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:59.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:00 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:12:00 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:12:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:12:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:01.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:12:01 np0005603623 nova_compute[226235]: 2026-01-31 09:12:01.783 226239 DEBUG nova.compute.manager [None req-4e2e1602-25fd-4c58-a6e1-768d04f84c34 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:12:01 np0005603623 nova_compute[226235]: 2026-01-31 09:12:01.866 226239 INFO nova.compute.manager [None req-4e2e1602-25fd-4c58-a6e1-768d04f84c34 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] instance snapshotting#033[00m
Jan 31 04:12:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:01.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:02 np0005603623 nova_compute[226235]: 2026-01-31 09:12:02.169 226239 INFO nova.virt.libvirt.driver [None req-4e2e1602-25fd-4c58-a6e1-768d04f84c34 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Beginning live snapshot process#033[00m
Jan 31 04:12:02 np0005603623 nova_compute[226235]: 2026-01-31 09:12:02.198 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:02 np0005603623 nova_compute[226235]: 2026-01-31 09:12:02.321 226239 DEBUG nova.storage.rbd_utils [None req-4e2e1602-25fd-4c58-a6e1-768d04f84c34 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] creating snapshot(dfcaec8199d74a25b1ca9c15dd9744c3) on rbd image(c893e608-08e4-4eab-a992-b241e484ea48_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 04:12:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e418 e418: 3 total, 3 up, 3 in
Jan 31 04:12:02 np0005603623 nova_compute[226235]: 2026-01-31 09:12:02.891 226239 DEBUG nova.storage.rbd_utils [None req-4e2e1602-25fd-4c58-a6e1-768d04f84c34 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] cloning vms/c893e608-08e4-4eab-a992-b241e484ea48_disk@dfcaec8199d74a25b1ca9c15dd9744c3 to images/a6036a30-758c-46f5-b0cb-e59cf3a4220b clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 04:12:03 np0005603623 nova_compute[226235]: 2026-01-31 09:12:03.037 226239 DEBUG nova.storage.rbd_utils [None req-4e2e1602-25fd-4c58-a6e1-768d04f84c34 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] flattening images/a6036a30-758c-46f5-b0cb-e59cf3a4220b flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 04:12:03 np0005603623 nova_compute[226235]: 2026-01-31 09:12:03.410 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:03.660 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=95, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=94) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:12:03 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:03.662 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:12:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:03.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:03 np0005603623 nova_compute[226235]: 2026-01-31 09:12:03.715 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:03 np0005603623 nova_compute[226235]: 2026-01-31 09:12:03.745 226239 DEBUG nova.storage.rbd_utils [None req-4e2e1602-25fd-4c58-a6e1-768d04f84c34 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] removing snapshot(dfcaec8199d74a25b1ca9c15dd9744c3) on rbd image(c893e608-08e4-4eab-a992-b241e484ea48_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 04:12:03 np0005603623 nova_compute[226235]: 2026-01-31 09:12:03.865 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:12:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:03.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:12:04 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e419 e419: 3 total, 3 up, 3 in
Jan 31 04:12:04 np0005603623 nova_compute[226235]: 2026-01-31 09:12:04.848 226239 DEBUG nova.storage.rbd_utils [None req-4e2e1602-25fd-4c58-a6e1-768d04f84c34 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] creating snapshot(snap) on rbd image(a6036a30-758c-46f5-b0cb-e59cf3a4220b) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 04:12:05 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:05.665 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '95'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:12:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:05.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:12:05 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e420 e420: 3 total, 3 up, 3 in
Jan 31 04:12:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:05.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:05 np0005603623 podman[329986]: 2026-01-31 09:12:05.982233065 +0000 UTC m=+0.065358961 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Jan 31 04:12:06 np0005603623 podman[329987]: 2026-01-31 09:12:06.024900284 +0000 UTC m=+0.108719871 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:12:06 np0005603623 nova_compute[226235]: 2026-01-31 09:12:06.951 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:06 np0005603623 nova_compute[226235]: 2026-01-31 09:12:06.969 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Triggering sync for uuid c893e608-08e4-4eab-a992-b241e484ea48 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 04:12:06 np0005603623 nova_compute[226235]: 2026-01-31 09:12:06.969 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "c893e608-08e4-4eab-a992-b241e484ea48" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:06 np0005603623 nova_compute[226235]: 2026-01-31 09:12:06.970 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "c893e608-08e4-4eab-a992-b241e484ea48" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:06 np0005603623 nova_compute[226235]: 2026-01-31 09:12:06.970 226239 INFO nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: c893e608-08e4-4eab-a992-b241e484ea48] During sync_power_state the instance has a pending task (image_uploading). Skip.#033[00m
Jan 31 04:12:06 np0005603623 nova_compute[226235]: 2026-01-31 09:12:06.970 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "c893e608-08e4-4eab-a992-b241e484ea48" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e420 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:07 np0005603623 nova_compute[226235]: 2026-01-31 09:12:07.176 226239 INFO nova.virt.libvirt.driver [None req-4e2e1602-25fd-4c58-a6e1-768d04f84c34 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Snapshot image upload complete#033[00m
Jan 31 04:12:07 np0005603623 nova_compute[226235]: 2026-01-31 09:12:07.177 226239 INFO nova.compute.manager [None req-4e2e1602-25fd-4c58-a6e1-768d04f84c34 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Took 5.31 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 31 04:12:07 np0005603623 nova_compute[226235]: 2026-01-31 09:12:07.201 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:07 np0005603623 nova_compute[226235]: 2026-01-31 09:12:07.388 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:07.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e421 e421: 3 total, 3 up, 3 in
Jan 31 04:12:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:07.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:08 np0005603623 nova_compute[226235]: 2026-01-31 09:12:08.867 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.246 226239 DEBUG nova.compute.manager [req-a2a9c4c1-7a98-4db2-ab82-889816441489 req-7968be3b-d6ba-4dcb-87b1-6d47f12b0756 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Received event network-changed-0b63d009-60f2-4cf2-afea-679373e18e95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.247 226239 DEBUG nova.compute.manager [req-a2a9c4c1-7a98-4db2-ab82-889816441489 req-7968be3b-d6ba-4dcb-87b1-6d47f12b0756 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Refreshing instance network info cache due to event network-changed-0b63d009-60f2-4cf2-afea-679373e18e95. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.247 226239 DEBUG oslo_concurrency.lockutils [req-a2a9c4c1-7a98-4db2-ab82-889816441489 req-7968be3b-d6ba-4dcb-87b1-6d47f12b0756 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c893e608-08e4-4eab-a992-b241e484ea48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.247 226239 DEBUG oslo_concurrency.lockutils [req-a2a9c4c1-7a98-4db2-ab82-889816441489 req-7968be3b-d6ba-4dcb-87b1-6d47f12b0756 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c893e608-08e4-4eab-a992-b241e484ea48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.247 226239 DEBUG nova.network.neutron [req-a2a9c4c1-7a98-4db2-ab82-889816441489 req-7968be3b-d6ba-4dcb-87b1-6d47f12b0756 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Refreshing network info cache for port 0b63d009-60f2-4cf2-afea-679373e18e95 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.372 226239 DEBUG oslo_concurrency.lockutils [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "c893e608-08e4-4eab-a992-b241e484ea48" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.374 226239 DEBUG oslo_concurrency.lockutils [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.374 226239 DEBUG oslo_concurrency.lockutils [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "c893e608-08e4-4eab-a992-b241e484ea48-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.374 226239 DEBUG oslo_concurrency.lockutils [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.375 226239 DEBUG oslo_concurrency.lockutils [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.376 226239 INFO nova.compute.manager [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Terminating instance#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.378 226239 DEBUG nova.compute.manager [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:12:09 np0005603623 kernel: tap0b63d009-60 (unregistering): left promiscuous mode
Jan 31 04:12:09 np0005603623 NetworkManager[48970]: <info>  [1769850729.4523] device (tap0b63d009-60): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:12:09 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:09Z|00921|binding|INFO|Releasing lport 0b63d009-60f2-4cf2-afea-679373e18e95 from this chassis (sb_readonly=0)
Jan 31 04:12:09 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:09Z|00922|binding|INFO|Setting lport 0b63d009-60f2-4cf2-afea-679373e18e95 down in Southbound
Jan 31 04:12:09 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:09Z|00923|binding|INFO|Removing iface tap0b63d009-60 ovn-installed in OVS
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.463 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.474 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:73:e5 10.100.0.10'], port_security=['fa:16:3e:05:73:e5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c893e608-08e4-4eab-a992-b241e484ea48', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08c50cf6-ab45-467a-a4d2-628200ead973', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c0be57039fd34aa9a2d05d9086ccff13', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1b8739db-40d8-4ddd-aaaf-640fa0a0d612', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=df4ff59b-6bd6-4dbb-9483-bf928652de1a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=0b63d009-60f2-4cf2-afea-679373e18e95) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.476 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.476 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 0b63d009-60f2-4cf2-afea-679373e18e95 in datapath 08c50cf6-ab45-467a-a4d2-628200ead973 unbound from our chassis#033[00m
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.479 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08c50cf6-ab45-467a-a4d2-628200ead973, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.481 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9a7a3c7e-5109-46a4-8cc6-184ce2910b3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.483 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973 namespace which is not needed anymore#033[00m
Jan 31 04:12:09 np0005603623 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000db.scope: Deactivated successfully.
Jan 31 04:12:09 np0005603623 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000db.scope: Consumed 15.399s CPU time.
Jan 31 04:12:09 np0005603623 systemd-machined[194379]: Machine qemu-101-instance-000000db terminated.
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.609 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:09 np0005603623 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[329270]: [NOTICE]   (329274) : haproxy version is 2.8.14-c23fe91
Jan 31 04:12:09 np0005603623 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[329270]: [NOTICE]   (329274) : path to executable is /usr/sbin/haproxy
Jan 31 04:12:09 np0005603623 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[329270]: [WARNING]  (329274) : Exiting Master process...
Jan 31 04:12:09 np0005603623 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[329270]: [WARNING]  (329274) : Exiting Master process...
Jan 31 04:12:09 np0005603623 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[329270]: [ALERT]    (329274) : Current worker (329276) exited with code 143 (Terminated)
Jan 31 04:12:09 np0005603623 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[329270]: [WARNING]  (329274) : All workers exited. Exiting... (0)
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.612 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:09 np0005603623 systemd[1]: libpod-994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f.scope: Deactivated successfully.
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.620 226239 INFO nova.virt.libvirt.driver [-] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Instance destroyed successfully.#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.620 226239 DEBUG nova.objects.instance [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lazy-loading 'resources' on Instance uuid c893e608-08e4-4eab-a992-b241e484ea48 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:12:09 np0005603623 podman[330059]: 2026-01-31 09:12:09.622622469 +0000 UTC m=+0.052263490 container died 994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.639 226239 DEBUG nova.virt.libvirt.vif [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:11:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1862064566',display_name='tempest-TestSnapshotPattern-server-1862064566',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1862064566',id=219,image_ref='f398889c-e272-449f-a032-36096e95d18f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKpOUB83OLuX57XNlwrNvNm9A+WqJpxKoMPAfeJaapb6yUdxSW7lKu+x7yQPy1sLXzgH0zh++G8qQAE8XC1z1HkX9voX2EUsKpZ4pSJBKy01SGRO5BQ6pTGTKexhMNezAw==',key_name='tempest-TestSnapshotPattern-475590621',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:11:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c0be57039fd34aa9a2d05d9086ccff13',ramdisk_id='',reservation_id='r-x7vjslr0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='f0b8c4bd-36b6-479c-9160-9adb3a86dc6f',image_min_disk='1',image_min_ram='0',image_owner_id='c0be57039fd34aa9a2d05d9086ccff13',image_owner_project_name='tempest-TestSnapshotPattern-418405266',image_owner_user_name='tempest-TestSnapshotPattern-418405266-project-member',image_user_id='b7233f93367f4dcd8eb2b6b115680192',image_version='8.0',owner_project_name='tempest-TestSnapshotPattern-418405266',owner_user_name='tempest-TestSnapshotPattern-418405266-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:12:07Z,user_data=None,user_id='b7233f93367f4dcd8eb2b6b115680192',uuid=c893e608-08e4-4eab-a992-b241e484ea48,vcpu_model=<?>
,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b63d009-60f2-4cf2-afea-679373e18e95", "address": "fa:16:3e:05:73:e5", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b63d009-60", "ovs_interfaceid": "0b63d009-60f2-4cf2-afea-679373e18e95", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.640 226239 DEBUG nova.network.os_vif_util [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Converting VIF {"id": "0b63d009-60f2-4cf2-afea-679373e18e95", "address": "fa:16:3e:05:73:e5", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b63d009-60", "ovs_interfaceid": "0b63d009-60f2-4cf2-afea-679373e18e95", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.640 226239 DEBUG nova.network.os_vif_util [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:73:e5,bridge_name='br-int',has_traffic_filtering=True,id=0b63d009-60f2-4cf2-afea-679373e18e95,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b63d009-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.641 226239 DEBUG os_vif [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:73:e5,bridge_name='br-int',has_traffic_filtering=True,id=0b63d009-60f2-4cf2-afea-679373e18e95,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b63d009-60') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.642 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.642 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b63d009-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.643 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.646 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.650 226239 INFO os_vif [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:73:e5,bridge_name='br-int',has_traffic_filtering=True,id=0b63d009-60f2-4cf2-afea-679373e18e95,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b63d009-60')#033[00m
Jan 31 04:12:09 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f-userdata-shm.mount: Deactivated successfully.
Jan 31 04:12:09 np0005603623 systemd[1]: var-lib-containers-storage-overlay-839732a2390952220ed89078acd0a6523bbdc7787aa1d760166b551047c11a06-merged.mount: Deactivated successfully.
Jan 31 04:12:09 np0005603623 podman[330059]: 2026-01-31 09:12:09.669404606 +0000 UTC m=+0.099045597 container cleanup 994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:12:09 np0005603623 systemd[1]: libpod-conmon-994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f.scope: Deactivated successfully.
Jan 31 04:12:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:12:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:09.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:12:09 np0005603623 podman[330112]: 2026-01-31 09:12:09.725709762 +0000 UTC m=+0.037913379 container remove 994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.729 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[101e65b6-3eb9-4f3f-a80b-628443530eee]: (4, ('Sat Jan 31 09:12:09 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973 (994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f)\n994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f\nSat Jan 31 09:12:09 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973 (994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f)\n994a1150a885234e7a907dcfd8485906279ab50e92a872125a290793b35e6c2f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.731 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b560cc44-477d-4358-97fd-a6123579e81c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.732 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08c50cf6-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.735 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:09 np0005603623 kernel: tap08c50cf6-a0: left promiscuous mode
Jan 31 04:12:09 np0005603623 nova_compute[226235]: 2026-01-31 09:12:09.742 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.745 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[62c37b7a-7a7d-42b8-b0e4-ad51973d053c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.761 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[1d0265b6-0baa-4c6b-9310-877d48eaaf7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.763 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[42fa7d55-ae3b-4062-b44b-90fe6ad940a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.774 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[6b78b5d7-e8b2-4673-b649-50ad47536290]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 982622, 'reachable_time': 31843, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 330127, 'error': None, 'target': 'ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:09 np0005603623 systemd[1]: run-netns-ovnmeta\x2d08c50cf6\x2dab45\x2d467a\x2da4d2\x2d628200ead973.mount: Deactivated successfully.
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.778 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:12:09 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:09.778 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[0acf15e2-9f7d-44ca-a970-1688d8a71f5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:09.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:10 np0005603623 nova_compute[226235]: 2026-01-31 09:12:10.132 226239 INFO nova.virt.libvirt.driver [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Deleting instance files /var/lib/nova/instances/c893e608-08e4-4eab-a992-b241e484ea48_del#033[00m
Jan 31 04:12:10 np0005603623 nova_compute[226235]: 2026-01-31 09:12:10.133 226239 INFO nova.virt.libvirt.driver [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Deletion of /var/lib/nova/instances/c893e608-08e4-4eab-a992-b241e484ea48_del complete#033[00m
Jan 31 04:12:10 np0005603623 nova_compute[226235]: 2026-01-31 09:12:10.235 226239 INFO nova.compute.manager [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Took 0.86 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:12:10 np0005603623 nova_compute[226235]: 2026-01-31 09:12:10.235 226239 DEBUG oslo.service.loopingcall [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:12:10 np0005603623 nova_compute[226235]: 2026-01-31 09:12:10.236 226239 DEBUG nova.compute.manager [-] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:12:10 np0005603623 nova_compute[226235]: 2026-01-31 09:12:10.236 226239 DEBUG nova.network.neutron [-] [instance: c893e608-08e4-4eab-a992-b241e484ea48] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:12:10 np0005603623 nova_compute[226235]: 2026-01-31 09:12:10.466 226239 DEBUG nova.compute.manager [req-1303998b-6a65-4b35-b1c4-e6ce53371981 req-85ed3187-e414-4716-8e45-328cf745dacd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Received event network-vif-unplugged-0b63d009-60f2-4cf2-afea-679373e18e95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:12:10 np0005603623 nova_compute[226235]: 2026-01-31 09:12:10.467 226239 DEBUG oslo_concurrency.lockutils [req-1303998b-6a65-4b35-b1c4-e6ce53371981 req-85ed3187-e414-4716-8e45-328cf745dacd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c893e608-08e4-4eab-a992-b241e484ea48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:10 np0005603623 nova_compute[226235]: 2026-01-31 09:12:10.468 226239 DEBUG oslo_concurrency.lockutils [req-1303998b-6a65-4b35-b1c4-e6ce53371981 req-85ed3187-e414-4716-8e45-328cf745dacd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:10 np0005603623 nova_compute[226235]: 2026-01-31 09:12:10.469 226239 DEBUG oslo_concurrency.lockutils [req-1303998b-6a65-4b35-b1c4-e6ce53371981 req-85ed3187-e414-4716-8e45-328cf745dacd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:10 np0005603623 nova_compute[226235]: 2026-01-31 09:12:10.469 226239 DEBUG nova.compute.manager [req-1303998b-6a65-4b35-b1c4-e6ce53371981 req-85ed3187-e414-4716-8e45-328cf745dacd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] No waiting events found dispatching network-vif-unplugged-0b63d009-60f2-4cf2-afea-679373e18e95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:12:10 np0005603623 nova_compute[226235]: 2026-01-31 09:12:10.469 226239 DEBUG nova.compute.manager [req-1303998b-6a65-4b35-b1c4-e6ce53371981 req-85ed3187-e414-4716-8e45-328cf745dacd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Received event network-vif-unplugged-0b63d009-60f2-4cf2-afea-679373e18e95 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:12:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:11.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:11 np0005603623 nova_compute[226235]: 2026-01-31 09:12:11.893 226239 DEBUG nova.compute.manager [req-e2d73922-08ab-413c-b510-11f433b42dd0 req-e29ec167-ceb1-4354-92dc-a3fcd8d9c82f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Received event network-vif-deleted-0b63d009-60f2-4cf2-afea-679373e18e95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:12:11 np0005603623 nova_compute[226235]: 2026-01-31 09:12:11.894 226239 INFO nova.compute.manager [req-e2d73922-08ab-413c-b510-11f433b42dd0 req-e29ec167-ceb1-4354-92dc-a3fcd8d9c82f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Neutron deleted interface 0b63d009-60f2-4cf2-afea-679373e18e95; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 04:12:11 np0005603623 nova_compute[226235]: 2026-01-31 09:12:11.894 226239 DEBUG nova.network.neutron [req-e2d73922-08ab-413c-b510-11f433b42dd0 req-e29ec167-ceb1-4354-92dc-a3fcd8d9c82f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:12:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:11.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:11 np0005603623 nova_compute[226235]: 2026-01-31 09:12:11.979 226239 DEBUG nova.network.neutron [-] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.003 226239 DEBUG nova.compute.manager [req-e2d73922-08ab-413c-b510-11f433b42dd0 req-e29ec167-ceb1-4354-92dc-a3fcd8d9c82f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Detach interface failed, port_id=0b63d009-60f2-4cf2-afea-679373e18e95, reason: Instance c893e608-08e4-4eab-a992-b241e484ea48 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.040 226239 INFO nova.compute.manager [-] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Took 1.80 seconds to deallocate network for instance.
Jan 31 04:12:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.115 226239 DEBUG oslo_concurrency.lockutils [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.115 226239 DEBUG oslo_concurrency.lockutils [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.186 226239 DEBUG oslo_concurrency.processutils [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.207 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.559 226239 DEBUG nova.compute.manager [req-1c4d69e4-8272-4b83-9833-11f3938f8a72 req-8b7368d0-2246-4a20-ac8d-dc254f9ba135 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Received event network-vif-plugged-0b63d009-60f2-4cf2-afea-679373e18e95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.559 226239 DEBUG oslo_concurrency.lockutils [req-1c4d69e4-8272-4b83-9833-11f3938f8a72 req-8b7368d0-2246-4a20-ac8d-dc254f9ba135 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c893e608-08e4-4eab-a992-b241e484ea48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.560 226239 DEBUG oslo_concurrency.lockutils [req-1c4d69e4-8272-4b83-9833-11f3938f8a72 req-8b7368d0-2246-4a20-ac8d-dc254f9ba135 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.560 226239 DEBUG oslo_concurrency.lockutils [req-1c4d69e4-8272-4b83-9833-11f3938f8a72 req-8b7368d0-2246-4a20-ac8d-dc254f9ba135 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.560 226239 DEBUG nova.compute.manager [req-1c4d69e4-8272-4b83-9833-11f3938f8a72 req-8b7368d0-2246-4a20-ac8d-dc254f9ba135 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] No waiting events found dispatching network-vif-plugged-0b63d009-60f2-4cf2-afea-679373e18e95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.560 226239 WARNING nova.compute.manager [req-1c4d69e4-8272-4b83-9833-11f3938f8a72 req-8b7368d0-2246-4a20-ac8d-dc254f9ba135 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Received unexpected event network-vif-plugged-0b63d009-60f2-4cf2-afea-679373e18e95 for instance with vm_state deleted and task_state None.
Jan 31 04:12:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:12:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1873459252' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.641 226239 DEBUG oslo_concurrency.processutils [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.646 226239 DEBUG nova.compute.provider_tree [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.856 226239 DEBUG nova.scheduler.client.report [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 04:12:12 np0005603623 nova_compute[226235]: 2026-01-31 09:12:12.903 226239 DEBUG oslo_concurrency.lockutils [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:12:13 np0005603623 nova_compute[226235]: 2026-01-31 09:12:13.000 226239 INFO nova.scheduler.client.report [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Deleted allocations for instance c893e608-08e4-4eab-a992-b241e484ea48
Jan 31 04:12:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e422 e422: 3 total, 3 up, 3 in
Jan 31 04:12:13 np0005603623 nova_compute[226235]: 2026-01-31 09:12:13.050 226239 DEBUG nova.network.neutron [req-a2a9c4c1-7a98-4db2-ab82-889816441489 req-7968be3b-d6ba-4dcb-87b1-6d47f12b0756 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Updated VIF entry in instance network info cache for port 0b63d009-60f2-4cf2-afea-679373e18e95. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 04:12:13 np0005603623 nova_compute[226235]: 2026-01-31 09:12:13.051 226239 DEBUG nova.network.neutron [req-a2a9c4c1-7a98-4db2-ab82-889816441489 req-7968be3b-d6ba-4dcb-87b1-6d47f12b0756 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Updating instance_info_cache with network_info: [{"id": "0b63d009-60f2-4cf2-afea-679373e18e95", "address": "fa:16:3e:05:73:e5", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b63d009-60", "ovs_interfaceid": "0b63d009-60f2-4cf2-afea-679373e18e95", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 04:12:13 np0005603623 nova_compute[226235]: 2026-01-31 09:12:13.098 226239 DEBUG oslo_concurrency.lockutils [req-a2a9c4c1-7a98-4db2-ab82-889816441489 req-7968be3b-d6ba-4dcb-87b1-6d47f12b0756 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c893e608-08e4-4eab-a992-b241e484ea48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 04:12:13 np0005603623 nova_compute[226235]: 2026-01-31 09:12:13.143 226239 DEBUG oslo_concurrency.lockutils [None req-0d1c8baf-a678-4aee-a2d0-0b41963da248 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "c893e608-08e4-4eab-a992-b241e484ea48" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:12:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:13.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:13 np0005603623 nova_compute[226235]: 2026-01-31 09:12:13.802 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "4efd9646-aadf-4138-9d36-d47416e0c6e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:12:13 np0005603623 nova_compute[226235]: 2026-01-31 09:12:13.803 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:12:13 np0005603623 nova_compute[226235]: 2026-01-31 09:12:13.825 226239 DEBUG nova.compute.manager [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 04:12:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:13.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:13 np0005603623 nova_compute[226235]: 2026-01-31 09:12:13.935 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:12:13 np0005603623 nova_compute[226235]: 2026-01-31 09:12:13.936 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:12:13 np0005603623 nova_compute[226235]: 2026-01-31 09:12:13.943 226239 DEBUG nova.virt.hardware [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 04:12:13 np0005603623 nova_compute[226235]: 2026-01-31 09:12:13.943 226239 INFO nova.compute.claims [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Claim successful on node compute-2.ctlplane.example.com
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.097 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:12:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:12:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/240522099' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.492 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.395s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.498 226239 DEBUG nova.compute.provider_tree [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.518 226239 DEBUG nova.scheduler.client.report [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.540 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.541 226239 DEBUG nova.compute.manager [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.647 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.656 226239 DEBUG nova.compute.manager [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.657 226239 DEBUG nova.network.neutron [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.686 226239 INFO nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.707 226239 DEBUG nova.compute.manager [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.842 226239 DEBUG nova.compute.manager [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.843 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.843 226239 INFO nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Creating image(s)
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.863 226239 DEBUG nova.storage.rbd_utils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 4efd9646-aadf-4138-9d36-d47416e0c6e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.889 226239 DEBUG nova.storage.rbd_utils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 4efd9646-aadf-4138-9d36-d47416e0c6e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.910 226239 DEBUG nova.storage.rbd_utils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 4efd9646-aadf-4138-9d36-d47416e0c6e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.913 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.983 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.984 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.984 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:12:14 np0005603623 nova_compute[226235]: 2026-01-31 09:12:14.985 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:12:15 np0005603623 nova_compute[226235]: 2026-01-31 09:12:15.009 226239 DEBUG nova.storage.rbd_utils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 4efd9646-aadf-4138-9d36-d47416e0c6e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 04:12:15 np0005603623 nova_compute[226235]: 2026-01-31 09:12:15.012 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4efd9646-aadf-4138-9d36-d47416e0c6e1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:12:15 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e423 e423: 3 total, 3 up, 3 in
Jan 31 04:12:15 np0005603623 nova_compute[226235]: 2026-01-31 09:12:15.184 226239 DEBUG nova.policy [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ebd43008d7a64b8bbf97a2304b1f78b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 04:12:15 np0005603623 nova_compute[226235]: 2026-01-31 09:12:15.295 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4efd9646-aadf-4138-9d36-d47416e0c6e1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.282s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:12:15 np0005603623 nova_compute[226235]: 2026-01-31 09:12:15.404 226239 DEBUG nova.storage.rbd_utils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] resizing rbd image 4efd9646-aadf-4138-9d36-d47416e0c6e1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 04:12:15 np0005603623 nova_compute[226235]: 2026-01-31 09:12:15.552 226239 DEBUG nova.objects.instance [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'migration_context' on Instance uuid 4efd9646-aadf-4138-9d36-d47416e0c6e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 04:12:15 np0005603623 nova_compute[226235]: 2026-01-31 09:12:15.568 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 04:12:15 np0005603623 nova_compute[226235]: 2026-01-31 09:12:15.568 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Ensure instance console log exists: /var/lib/nova/instances/4efd9646-aadf-4138-9d36-d47416e0c6e1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 04:12:15 np0005603623 nova_compute[226235]: 2026-01-31 09:12:15.569 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:12:15 np0005603623 nova_compute[226235]: 2026-01-31 09:12:15.569 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:12:15 np0005603623 nova_compute[226235]: 2026-01-31 09:12:15.570 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:12:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:15.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:15.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:16 np0005603623 nova_compute[226235]: 2026-01-31 09:12:16.531 226239 DEBUG nova.network.neutron [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Successfully created port: 2fa042eb-d400-4d66-9582-1916fd5ca4c0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 04:12:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e423 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:17 np0005603623 nova_compute[226235]: 2026-01-31 09:12:17.206 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:12:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:17.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:12:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:17.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:12:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e424 e424: 3 total, 3 up, 3 in
Jan 31 04:12:18 np0005603623 nova_compute[226235]: 2026-01-31 09:12:18.178 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:18 np0005603623 nova_compute[226235]: 2026-01-31 09:12:18.179 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:12:19 np0005603623 nova_compute[226235]: 2026-01-31 09:12:19.207 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:12:19 np0005603623 nova_compute[226235]: 2026-01-31 09:12:19.650 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:19.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:19 np0005603623 nova_compute[226235]: 2026-01-31 09:12:19.737 226239 DEBUG nova.network.neutron [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Successfully updated port: 2fa042eb-d400-4d66-9582-1916fd5ca4c0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:12:19 np0005603623 nova_compute[226235]: 2026-01-31 09:12:19.778 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:12:19 np0005603623 nova_compute[226235]: 2026-01-31 09:12:19.778 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquired lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:12:19 np0005603623 nova_compute[226235]: 2026-01-31 09:12:19.778 226239 DEBUG nova.network.neutron [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:12:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:19.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:20 np0005603623 nova_compute[226235]: 2026-01-31 09:12:20.293 226239 DEBUG nova.compute.manager [req-15237e1a-0381-4997-b740-0cb53036877e req-b0a2b330-fb1a-46d2-a3c0-17de63a5d331 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Received event network-changed-2fa042eb-d400-4d66-9582-1916fd5ca4c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:12:20 np0005603623 nova_compute[226235]: 2026-01-31 09:12:20.293 226239 DEBUG nova.compute.manager [req-15237e1a-0381-4997-b740-0cb53036877e req-b0a2b330-fb1a-46d2-a3c0-17de63a5d331 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Refreshing instance network info cache due to event network-changed-2fa042eb-d400-4d66-9582-1916fd5ca4c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:12:20 np0005603623 nova_compute[226235]: 2026-01-31 09:12:20.294 226239 DEBUG oslo_concurrency.lockutils [req-15237e1a-0381-4997-b740-0cb53036877e req-b0a2b330-fb1a-46d2-a3c0-17de63a5d331 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:12:20 np0005603623 nova_compute[226235]: 2026-01-31 09:12:20.345 226239 DEBUG nova.network.neutron [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:12:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:21.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.856 226239 DEBUG nova.network.neutron [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Updating instance_info_cache with network_info: [{"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:12:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:21.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.956 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Releasing lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.957 226239 DEBUG nova.compute.manager [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Instance network_info: |[{"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.957 226239 DEBUG oslo_concurrency.lockutils [req-15237e1a-0381-4997-b740-0cb53036877e req-b0a2b330-fb1a-46d2-a3c0-17de63a5d331 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.958 226239 DEBUG nova.network.neutron [req-15237e1a-0381-4997-b740-0cb53036877e req-b0a2b330-fb1a-46d2-a3c0-17de63a5d331 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Refreshing network info cache for port 2fa042eb-d400-4d66-9582-1916fd5ca4c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.961 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Start _get_guest_xml network_info=[{"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.966 226239 WARNING nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.973 226239 DEBUG nova.virt.libvirt.host [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.974 226239 DEBUG nova.virt.libvirt.host [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.986 226239 DEBUG nova.virt.libvirt.host [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.987 226239 DEBUG nova.virt.libvirt.host [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.989 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.989 226239 DEBUG nova.virt.hardware [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.990 226239 DEBUG nova.virt.hardware [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.990 226239 DEBUG nova.virt.hardware [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.990 226239 DEBUG nova.virt.hardware [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.991 226239 DEBUG nova.virt.hardware [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.991 226239 DEBUG nova.virt.hardware [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.991 226239 DEBUG nova.virt.hardware [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.991 226239 DEBUG nova.virt.hardware [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.992 226239 DEBUG nova.virt.hardware [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.992 226239 DEBUG nova.virt.hardware [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.992 226239 DEBUG nova.virt.hardware [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:12:21 np0005603623 nova_compute[226235]: 2026-01-31 09:12:21.995 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:12:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:22 np0005603623 nova_compute[226235]: 2026-01-31 09:12:22.209 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:12:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3370129134' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:12:22 np0005603623 nova_compute[226235]: 2026-01-31 09:12:22.437 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:12:22 np0005603623 nova_compute[226235]: 2026-01-31 09:12:22.466 226239 DEBUG nova.storage.rbd_utils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 4efd9646-aadf-4138-9d36-d47416e0c6e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:12:22 np0005603623 nova_compute[226235]: 2026-01-31 09:12:22.473 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:12:22 np0005603623 nova_compute[226235]: 2026-01-31 09:12:22.970 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:12:22 np0005603623 nova_compute[226235]: 2026-01-31 09:12:22.974 226239 DEBUG nova.virt.libvirt.vif [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:12:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-105635484',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-105635484',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ac',id=220,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFC9ERZJ6vqwSOhb+BxoLuCTPY4zXIPbOdYQjYf18qK5EvFlLu3Fd6dU0UfukMij7wWnpSqWAkqu0LocOazNCHHb52PIeAWKGpoQVtLv/Sw5DcBQogLeHH3fNNhS1TtHkw==',key_name='tempest-TestSecurityGroupsBasicOps-763844494',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-w3k0ayns',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:12:14Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=4efd9646-aadf-4138-9d36-d47416e0c6e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:12:22 np0005603623 nova_compute[226235]: 2026-01-31 09:12:22.975 226239 DEBUG nova.network.os_vif_util [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:12:22 np0005603623 nova_compute[226235]: 2026-01-31 09:12:22.976 226239 DEBUG nova.network.os_vif_util [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:d6:ba,bridge_name='br-int',has_traffic_filtering=True,id=2fa042eb-d400-4d66-9582-1916fd5ca4c0,network=Network(514e8c9e-2a14-4959-839a-40965c82f800),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fa042eb-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:12:22 np0005603623 nova_compute[226235]: 2026-01-31 09:12:22.977 226239 DEBUG nova.objects.instance [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'pci_devices' on Instance uuid 4efd9646-aadf-4138-9d36-d47416e0c6e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.002 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  <uuid>4efd9646-aadf-4138-9d36-d47416e0c6e1</uuid>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  <name>instance-000000dc</name>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  <memory>131072</memory>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  <vcpu>1</vcpu>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  <metadata>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-105635484</nova:name>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <nova:creationTime>2026-01-31 09:12:21</nova:creationTime>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <nova:flavor name="m1.nano">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <nova:memory>128</nova:memory>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <nova:disk>1</nova:disk>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <nova:swap>0</nova:swap>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      </nova:flavor>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <nova:owner>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <nova:user uuid="ebd43008d7a64b8bbf97a2304b1f78b6">tempest-TestSecurityGroupsBasicOps-1802479850-project-member</nova:user>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <nova:project uuid="0c7930b92fc3471f87d9fe78ee56e71e">tempest-TestSecurityGroupsBasicOps-1802479850</nova:project>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      </nova:owner>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <nova:ports>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <nova:port uuid="2fa042eb-d400-4d66-9582-1916fd5ca4c0">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        </nova:port>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      </nova:ports>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    </nova:instance>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  </metadata>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  <sysinfo type="smbios">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <system>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <entry name="serial">4efd9646-aadf-4138-9d36-d47416e0c6e1</entry>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <entry name="uuid">4efd9646-aadf-4138-9d36-d47416e0c6e1</entry>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    </system>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  </sysinfo>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  <os>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <boot dev="hd"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <smbios mode="sysinfo"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  </os>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  <features>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <acpi/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <apic/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <vmcoreinfo/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  </features>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  <clock offset="utc">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <timer name="hpet" present="no"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  </clock>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  <cpu mode="custom" match="exact">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <model>Nehalem</model>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  </cpu>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  <devices>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <disk type="network" device="disk">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4efd9646-aadf-4138-9d36-d47416e0c6e1_disk">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <target dev="vda" bus="virtio"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <disk type="network" device="cdrom">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <driver type="raw" cache="none"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <source protocol="rbd" name="vms/4efd9646-aadf-4138-9d36-d47416e0c6e1_disk.config">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      </source>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <auth username="openstack">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      </auth>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <target dev="sda" bus="sata"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    </disk>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <interface type="ethernet">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <mac address="fa:16:3e:78:d6:ba"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <mtu size="1442"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <target dev="tap2fa042eb-d4"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    </interface>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <serial type="pty">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <log file="/var/lib/nova/instances/4efd9646-aadf-4138-9d36-d47416e0c6e1/console.log" append="off"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    </serial>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <video>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <model type="virtio"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    </video>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <input type="tablet" bus="usb"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <rng model="virtio">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    </rng>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <controller type="usb" index="0"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    <memballoon model="virtio">
Jan 31 04:12:23 np0005603623 nova_compute[226235]:      <stats period="10"/>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:    </memballoon>
Jan 31 04:12:23 np0005603623 nova_compute[226235]:  </devices>
Jan 31 04:12:23 np0005603623 nova_compute[226235]: </domain>
Jan 31 04:12:23 np0005603623 nova_compute[226235]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.005 226239 DEBUG nova.compute.manager [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Preparing to wait for external event network-vif-plugged-2fa042eb-d400-4d66-9582-1916fd5ca4c0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.006 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.007 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.007 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.008 226239 DEBUG nova.virt.libvirt.vif [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:12:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-105635484',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-105635484',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ac',id=220,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFC9ERZJ6vqwSOhb+BxoLuCTPY4zXIPbOdYQjYf18qK5EvFlLu3Fd6dU0UfukMij7wWnpSqWAkqu0LocOazNCHHb52PIeAWKGpoQVtLv/Sw5DcBQogLeHH3fNNhS1TtHkw==',key_name='tempest-TestSecurityGroupsBasicOps-763844494',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-w3k0ayns',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:12:14Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=4efd9646-aadf-4138-9d36-d47416e0c6e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.008 226239 DEBUG nova.network.os_vif_util [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.009 226239 DEBUG nova.network.os_vif_util [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:d6:ba,bridge_name='br-int',has_traffic_filtering=True,id=2fa042eb-d400-4d66-9582-1916fd5ca4c0,network=Network(514e8c9e-2a14-4959-839a-40965c82f800),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fa042eb-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.010 226239 DEBUG os_vif [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:d6:ba,bridge_name='br-int',has_traffic_filtering=True,id=2fa042eb-d400-4d66-9582-1916fd5ca4c0,network=Network(514e8c9e-2a14-4959-839a-40965c82f800),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fa042eb-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.011 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.011 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.012 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.016 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.017 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2fa042eb-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.017 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2fa042eb-d4, col_values=(('external_ids', {'iface-id': '2fa042eb-d400-4d66-9582-1916fd5ca4c0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:d6:ba', 'vm-uuid': '4efd9646-aadf-4138-9d36-d47416e0c6e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.021 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:23 np0005603623 NetworkManager[48970]: <info>  [1769850743.0226] manager: (tap2fa042eb-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/439)
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.023 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.027 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.029 226239 INFO os_vif [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:d6:ba,bridge_name='br-int',has_traffic_filtering=True,id=2fa042eb-d400-4d66-9582-1916fd5ca4c0,network=Network(514e8c9e-2a14-4959-839a-40965c82f800),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fa042eb-d4')#033[00m
Jan 31 04:12:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 e425: 3 total, 3 up, 3 in
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.084 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.085 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.085 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No VIF found with MAC fa:16:3e:78:d6:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.085 226239 INFO nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Using config drive#033[00m
Jan 31 04:12:23 np0005603623 nova_compute[226235]: 2026-01-31 09:12:23.117 226239 DEBUG nova.storage.rbd_utils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 4efd9646-aadf-4138-9d36-d47416e0c6e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:12:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:23.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:12:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:23.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.269 226239 INFO nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Creating config drive at /var/lib/nova/instances/4efd9646-aadf-4138-9d36-d47416e0c6e1/disk.config#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.274 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4efd9646-aadf-4138-9d36-d47416e0c6e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmppszzd5iy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.416 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4efd9646-aadf-4138-9d36-d47416e0c6e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmppszzd5iy" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.446 226239 DEBUG nova.storage.rbd_utils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 4efd9646-aadf-4138-9d36-d47416e0c6e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.451 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4efd9646-aadf-4138-9d36-d47416e0c6e1/disk.config 4efd9646-aadf-4138-9d36-d47416e0c6e1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.619 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850729.617843, c893e608-08e4-4eab-a992-b241e484ea48 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.620 226239 INFO nova.compute.manager [-] [instance: c893e608-08e4-4eab-a992-b241e484ea48] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.630 226239 DEBUG oslo_concurrency.processutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4efd9646-aadf-4138-9d36-d47416e0c6e1/disk.config 4efd9646-aadf-4138-9d36-d47416e0c6e1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.178s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.630 226239 INFO nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Deleting local config drive /var/lib/nova/instances/4efd9646-aadf-4138-9d36-d47416e0c6e1/disk.config because it was imported into RBD.#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.639 226239 DEBUG nova.compute.manager [None req-bb1354e2-91ab-409b-b657-53cea6eef54f - - - - - -] [instance: c893e608-08e4-4eab-a992-b241e484ea48] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:12:24 np0005603623 kernel: tap2fa042eb-d4: entered promiscuous mode
Jan 31 04:12:24 np0005603623 NetworkManager[48970]: <info>  [1769850744.6723] manager: (tap2fa042eb-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/440)
Jan 31 04:12:24 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:24Z|00924|binding|INFO|Claiming lport 2fa042eb-d400-4d66-9582-1916fd5ca4c0 for this chassis.
Jan 31 04:12:24 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:24Z|00925|binding|INFO|2fa042eb-d400-4d66-9582-1916fd5ca4c0: Claiming fa:16:3e:78:d6:ba 10.100.0.3
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.672 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.680 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:d6:ba 10.100.0.3'], port_security=['fa:16:3e:78:d6:ba 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '4efd9646-aadf-4138-9d36-d47416e0c6e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-514e8c9e-2a14-4959-839a-40965c82f800', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1a7e4e75-e92f-4762-8e77-5a420647206a dcfda0b5-b009-486f-8bba-7f4dbc382096', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cde7baed-a23e-44c1-8411-520889d37122, chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=2fa042eb-d400-4d66-9582-1916fd5ca4c0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:12:24 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:24Z|00926|binding|INFO|Setting lport 2fa042eb-d400-4d66-9582-1916fd5ca4c0 ovn-installed in OVS
Jan 31 04:12:24 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:24Z|00927|binding|INFO|Setting lport 2fa042eb-d400-4d66-9582-1916fd5ca4c0 up in Southbound
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.682 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.683 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 2fa042eb-d400-4d66-9582-1916fd5ca4c0 in datapath 514e8c9e-2a14-4959-839a-40965c82f800 bound to our chassis#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.685 143258 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 514e8c9e-2a14-4959-839a-40965c82f800#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.687 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:24 np0005603623 systemd-udevd[330531]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.694 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[e6667470-89fa-4e13-89e6-f3ce367ae1d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.695 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap514e8c9e-21 in ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.697 229607 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap514e8c9e-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.697 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7249cb00-c495-45e0-a19a-65d532e53cd6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.698 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[0c547cff-b2a2-44c7-a832-9b754274275a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 NetworkManager[48970]: <info>  [1769850744.7057] device (tap2fa042eb-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:12:24 np0005603623 NetworkManager[48970]: <info>  [1769850744.7061] device (tap2fa042eb-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.708 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e54068-c91c-4c21-a906-ef9e192706ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 systemd-machined[194379]: New machine qemu-102-instance-000000dc.
Jan 31 04:12:24 np0005603623 systemd[1]: Started Virtual Machine qemu-102-instance-000000dc.
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.733 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a98a8d-9281-4c3b-a10b-c0b032bcdf3d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.761 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[dd0c8eb3-6434-4ad1-8841-1a0e82c8c97a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 NetworkManager[48970]: <info>  [1769850744.7670] manager: (tap514e8c9e-20): new Veth device (/org/freedesktop/NetworkManager/Devices/441)
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.766 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[8a3f4a27-172d-4eb9-8756-1d736232e4c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 systemd-udevd[330537]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.801 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[d3872cf3-1292-4929-9b02-5ec51bcaeb0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.805 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[30212463-5705-426a-92c6-1a8c510d8877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 NetworkManager[48970]: <info>  [1769850744.8224] device (tap514e8c9e-20): carrier: link connected
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.828 229627 DEBUG oslo.privsep.daemon [-] privsep: reply[ba5f9b21-1afd-444d-b62f-af7d388cf54d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.842 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e4ff04-c06d-4f05-8db1-a2759f124adc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap514e8c9e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:94:f6:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 272], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 988525, 'reachable_time': 20668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 330566, 'error': None, 'target': 'ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.855 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe65b8a-0675-4edb-a603-69842ba28257]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe94:f624'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 988525, 'tstamp': 988525}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 330567, 'error': None, 'target': 'ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.870 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5a45f6fa-ee8a-4e2e-b184-6c5035855893]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap514e8c9e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:94:f6:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 272], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 988525, 'reachable_time': 20668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 330568, 'error': None, 'target': 'ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.893 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[76b3c8db-cab9-4d85-a40c-29b631198c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.942 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[d3703c2b-f489-4981-9eec-23bcd16fa1de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.943 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap514e8c9e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.943 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.944 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap514e8c9e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:24 np0005603623 NetworkManager[48970]: <info>  [1769850744.9461] manager: (tap514e8c9e-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/442)
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.945 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:24 np0005603623 kernel: tap514e8c9e-20: entered promiscuous mode
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.950 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap514e8c9e-20, col_values=(('external_ids', {'iface-id': '170f7083-da86-4db5-bf6e-3dc3a556c3c4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.951 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.952 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:24 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:24Z|00928|binding|INFO|Releasing lport 170f7083-da86-4db5-bf6e-3dc3a556c3c4 from this chassis (sb_readonly=0)
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.953 143258 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/514e8c9e-2a14-4959-839a-40965c82f800.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/514e8c9e-2a14-4959-839a-40965c82f800.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.954 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[118417b8-d3cc-4139-bbcf-7fb9eec11bb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.955 143258 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: global
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    log         /dev/log local0 debug
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    log-tag     haproxy-metadata-proxy-514e8c9e-2a14-4959-839a-40965c82f800
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    user        root
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    group       root
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    maxconn     1024
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    pidfile     /var/lib/neutron/external/pids/514e8c9e-2a14-4959-839a-40965c82f800.pid.haproxy
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    daemon
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: defaults
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    log global
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    mode http
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    option httplog
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    option dontlognull
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    option http-server-close
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    option forwardfor
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    retries                 3
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    timeout http-request    30s
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    timeout connect         30s
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    timeout client          32s
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    timeout server          32s
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    timeout http-keep-alive 30s
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: listen listener
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    bind 169.254.169.254:80
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]:    http-request add-header X-OVN-Network-ID 514e8c9e-2a14-4959-839a-40965c82f800
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:12:24 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:24.955 143258 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800', 'env', 'PROCESS_TAG=haproxy-514e8c9e-2a14-4959-839a-40965c82f800', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/514e8c9e-2a14-4959-839a-40965c82f800.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:12:24 np0005603623 nova_compute[226235]: 2026-01-31 09:12:24.959 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.183 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.184 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:12:25 np0005603623 podman[330601]: 2026-01-31 09:12:25.276105039 +0000 UTC m=+0.045278891 container create 8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:12:25 np0005603623 systemd[1]: Started libpod-conmon-8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d.scope.
Jan 31 04:12:25 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:12:25 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/756ea5ae677f52e80f87f3740c41346442d7cd6e7f3b9e90e5868ff4faddbc7e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:12:25 np0005603623 podman[330601]: 2026-01-31 09:12:25.249406931 +0000 UTC m=+0.018580803 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:12:25 np0005603623 podman[330601]: 2026-01-31 09:12:25.355399286 +0000 UTC m=+0.124573148 container init 8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 04:12:25 np0005603623 podman[330601]: 2026-01-31 09:12:25.359835716 +0000 UTC m=+0.129009558 container start 8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 04:12:25 np0005603623 neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800[330617]: [NOTICE]   (330621) : New worker (330623) forked
Jan 31 04:12:25 np0005603623 neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800[330617]: [NOTICE]   (330621) : Loading success.
Jan 31 04:12:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:12:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.0 total, 600.0 interval#012Cumulative writes: 18K writes, 89K keys, 18K commit groups, 1.0 writes per commit group, ingest: 0.18 GB, 0.03 MB/s#012Cumulative WAL: 18K writes, 18K syncs, 1.00 writes per sync, written: 0.18 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1605 writes, 7949 keys, 1605 commit groups, 1.0 writes per commit group, ingest: 16.16 MB, 0.03 MB/s#012Interval WAL: 1605 writes, 1605 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     43.5      2.57              0.26        58    0.044       0      0       0.0       0.0#012  L6      1/0   11.69 MB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   5.3     64.1     54.8     10.83              1.26        57    0.190    445K    30K       0.0       0.0#012 Sum      1/0   11.69 MB   0.0      0.7     0.1      0.6       0.7      0.1       0.0   6.3     51.8     52.7     13.40              1.53       115    0.116    445K    30K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.5     79.4     78.7      1.04              0.17        12    0.087     65K   3078       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   0.0     64.1     54.8     10.83              1.26        57    0.190    445K    30K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     43.5      2.57              0.26        57    0.045       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6600.0 total, 600.0 interval#012Flush(GB): cumulative 0.109, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.69 GB write, 0.11 MB/s write, 0.68 GB read, 0.11 MB/s read, 13.4 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.14 MB/s read, 1.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 74.73 MB table_size: 0 occupancy: 18446744073709551615 collections: 12 last_copies: 0 last_secs: 0.0005 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(4240,71.55 MB,23.5352%) FilterBlock(115,1.20 MB,0.394776%) IndexBlock(115,1.98 MB,0.651485%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.554 226239 DEBUG nova.compute.manager [req-72f3cf4c-b64d-406c-8af8-0cbce5685015 req-4449d137-c5dc-4716-bd79-2f74dc6b4c27 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Received event network-vif-plugged-2fa042eb-d400-4d66-9582-1916fd5ca4c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.555 226239 DEBUG oslo_concurrency.lockutils [req-72f3cf4c-b64d-406c-8af8-0cbce5685015 req-4449d137-c5dc-4716-bd79-2f74dc6b4c27 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.556 226239 DEBUG oslo_concurrency.lockutils [req-72f3cf4c-b64d-406c-8af8-0cbce5685015 req-4449d137-c5dc-4716-bd79-2f74dc6b4c27 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.556 226239 DEBUG oslo_concurrency.lockutils [req-72f3cf4c-b64d-406c-8af8-0cbce5685015 req-4449d137-c5dc-4716-bd79-2f74dc6b4c27 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.557 226239 DEBUG nova.compute.manager [req-72f3cf4c-b64d-406c-8af8-0cbce5685015 req-4449d137-c5dc-4716-bd79-2f74dc6b4c27 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Processing event network-vif-plugged-2fa042eb-d400-4d66-9582-1916fd5ca4c0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.619 226239 DEBUG nova.compute.manager [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.620 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850745.6186411, 4efd9646-aadf-4138-9d36-d47416e0c6e1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.621 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] VM Started (Lifecycle Event)#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.623 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.627 226239 INFO nova.virt.libvirt.driver [-] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Instance spawned successfully.#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.627 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.640 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.643 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.652 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.653 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.654 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.655 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.655 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.656 226239 DEBUG nova.virt.libvirt.driver [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.673 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.674 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850745.619753, 4efd9646-aadf-4138-9d36-d47416e0c6e1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.675 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.700 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.705 226239 DEBUG nova.virt.driver [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] Emitting event <LifecycleEvent: 1769850745.6229084, 4efd9646-aadf-4138-9d36-d47416e0c6e1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.705 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:12:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:25.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.729 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.732 226239 DEBUG nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.740 226239 INFO nova.compute.manager [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Took 10.90 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.741 226239 DEBUG nova.compute.manager [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.752 226239 INFO nova.compute.manager [None req-b04c20eb-64f7-4c56-82b2-a491aa27901b - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.835 226239 INFO nova.compute.manager [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Took 11.94 seconds to build instance.#033[00m
Jan 31 04:12:25 np0005603623 nova_compute[226235]: 2026-01-31 09:12:25.889 226239 DEBUG oslo_concurrency.lockutils [None req-1e463b50-bf89-4fa5-8003-66a6f20493fd ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:12:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:25.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:12:26 np0005603623 nova_compute[226235]: 2026-01-31 09:12:26.192 226239 DEBUG nova.network.neutron [req-15237e1a-0381-4997-b740-0cb53036877e req-b0a2b330-fb1a-46d2-a3c0-17de63a5d331 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Updated VIF entry in instance network info cache for port 2fa042eb-d400-4d66-9582-1916fd5ca4c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:12:26 np0005603623 nova_compute[226235]: 2026-01-31 09:12:26.192 226239 DEBUG nova.network.neutron [req-15237e1a-0381-4997-b740-0cb53036877e req-b0a2b330-fb1a-46d2-a3c0-17de63a5d331 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Updating instance_info_cache with network_info: [{"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:12:26 np0005603623 nova_compute[226235]: 2026-01-31 09:12:26.212 226239 DEBUG oslo_concurrency.lockutils [req-15237e1a-0381-4997-b740-0cb53036877e req-b0a2b330-fb1a-46d2-a3c0-17de63a5d331 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:12:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.211 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.389 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.389 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.390 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.390 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4efd9646-aadf-4138-9d36-d47416e0c6e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.645 226239 DEBUG nova.compute.manager [req-8b27891c-4ec9-4830-bd96-ced3b28556ad req-32124b46-1387-4c52-8745-ae63dee68b60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Received event network-vif-plugged-2fa042eb-d400-4d66-9582-1916fd5ca4c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.647 226239 DEBUG oslo_concurrency.lockutils [req-8b27891c-4ec9-4830-bd96-ced3b28556ad req-32124b46-1387-4c52-8745-ae63dee68b60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.647 226239 DEBUG oslo_concurrency.lockutils [req-8b27891c-4ec9-4830-bd96-ced3b28556ad req-32124b46-1387-4c52-8745-ae63dee68b60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.647 226239 DEBUG oslo_concurrency.lockutils [req-8b27891c-4ec9-4830-bd96-ced3b28556ad req-32124b46-1387-4c52-8745-ae63dee68b60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.647 226239 DEBUG nova.compute.manager [req-8b27891c-4ec9-4830-bd96-ced3b28556ad req-32124b46-1387-4c52-8745-ae63dee68b60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] No waiting events found dispatching network-vif-plugged-2fa042eb-d400-4d66-9582-1916fd5ca4c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:12:27 np0005603623 nova_compute[226235]: 2026-01-31 09:12:27.648 226239 WARNING nova.compute.manager [req-8b27891c-4ec9-4830-bd96-ced3b28556ad req-32124b46-1387-4c52-8745-ae63dee68b60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Received unexpected event network-vif-plugged-2fa042eb-d400-4d66-9582-1916fd5ca4c0 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:12:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:27.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:27.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:28 np0005603623 nova_compute[226235]: 2026-01-31 09:12:28.022 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:29 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:29Z|00929|binding|INFO|Releasing lport 170f7083-da86-4db5-bf6e-3dc3a556c3c4 from this chassis (sb_readonly=0)
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.236 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:29 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:29Z|00930|binding|INFO|Releasing lport 170f7083-da86-4db5-bf6e-3dc3a556c3c4 from this chassis (sb_readonly=0)
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.279 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.464 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Updating instance_info_cache with network_info: [{"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.484 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.484 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.485 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.506 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.507 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.507 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.507 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.508 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:12:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:12:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:29.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:12:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:12:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2344061092' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.921 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:12:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:29.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.995 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000dc as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:12:29 np0005603623 nova_compute[226235]: 2026-01-31 09:12:29.995 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000dc as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:12:30 np0005603623 NetworkManager[48970]: <info>  [1769850750.0967] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/443)
Jan 31 04:12:30 np0005603623 NetworkManager[48970]: <info>  [1769850750.0977] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/444)
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.107 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:30 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:30Z|00931|binding|INFO|Releasing lport 170f7083-da86-4db5-bf6e-3dc3a556c3c4 from this chassis (sb_readonly=0)
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.127 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:30.164 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.164 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:12:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:30.164 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:30.165 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.165 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3944MB free_disk=20.96752166748047GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.166 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.166 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.276 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 4efd9646-aadf-4138-9d36-d47416e0c6e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.277 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.277 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.315 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.563 226239 DEBUG nova.compute.manager [req-0cabfbdf-6789-4166-b096-442f8cb6e7d6 req-bcacb063-b8b7-44b1-b8bc-1ee723b37761 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Received event network-changed-2fa042eb-d400-4d66-9582-1916fd5ca4c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.564 226239 DEBUG nova.compute.manager [req-0cabfbdf-6789-4166-b096-442f8cb6e7d6 req-bcacb063-b8b7-44b1-b8bc-1ee723b37761 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Refreshing instance network info cache due to event network-changed-2fa042eb-d400-4d66-9582-1916fd5ca4c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.565 226239 DEBUG oslo_concurrency.lockutils [req-0cabfbdf-6789-4166-b096-442f8cb6e7d6 req-bcacb063-b8b7-44b1-b8bc-1ee723b37761 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.566 226239 DEBUG oslo_concurrency.lockutils [req-0cabfbdf-6789-4166-b096-442f8cb6e7d6 req-bcacb063-b8b7-44b1-b8bc-1ee723b37761 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.566 226239 DEBUG nova.network.neutron [req-0cabfbdf-6789-4166-b096-442f8cb6e7d6 req-bcacb063-b8b7-44b1-b8bc-1ee723b37761 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Refreshing network info cache for port 2fa042eb-d400-4d66-9582-1916fd5ca4c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:12:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:12:30 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3935167101' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.734 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.740 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.779 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.821 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:12:30 np0005603623 nova_compute[226235]: 2026-01-31 09:12:30.822 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:31 np0005603623 nova_compute[226235]: 2026-01-31 09:12:31.491 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:12:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:31.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:12:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:31.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:32 np0005603623 nova_compute[226235]: 2026-01-31 09:12:32.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:32 np0005603623 nova_compute[226235]: 2026-01-31 09:12:32.254 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:32 np0005603623 nova_compute[226235]: 2026-01-31 09:12:32.313 226239 DEBUG nova.network.neutron [req-0cabfbdf-6789-4166-b096-442f8cb6e7d6 req-bcacb063-b8b7-44b1-b8bc-1ee723b37761 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Updated VIF entry in instance network info cache for port 2fa042eb-d400-4d66-9582-1916fd5ca4c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:12:32 np0005603623 nova_compute[226235]: 2026-01-31 09:12:32.314 226239 DEBUG nova.network.neutron [req-0cabfbdf-6789-4166-b096-442f8cb6e7d6 req-bcacb063-b8b7-44b1-b8bc-1ee723b37761 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Updating instance_info_cache with network_info: [{"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:12:32 np0005603623 nova_compute[226235]: 2026-01-31 09:12:32.340 226239 DEBUG oslo_concurrency.lockutils [req-0cabfbdf-6789-4166-b096-442f8cb6e7d6 req-bcacb063-b8b7-44b1-b8bc-1ee723b37761 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:12:33 np0005603623 nova_compute[226235]: 2026-01-31 09:12:33.024 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:33 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:33Z|00932|binding|INFO|Releasing lport 170f7083-da86-4db5-bf6e-3dc3a556c3c4 from this chassis (sb_readonly=0)
Jan 31 04:12:33 np0005603623 nova_compute[226235]: 2026-01-31 09:12:33.575 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:33.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:33.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:34 np0005603623 nova_compute[226235]: 2026-01-31 09:12:34.157 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:35.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:35.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:36 np0005603623 nova_compute[226235]: 2026-01-31 09:12:36.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:36 np0005603623 nova_compute[226235]: 2026-01-31 09:12:36.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:36 np0005603623 nova_compute[226235]: 2026-01-31 09:12:36.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #184. Immutable memtables: 0.
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.264222) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 117] Flushing memtable with next log file: 184
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850756264289, "job": 117, "event": "flush_started", "num_memtables": 1, "num_entries": 923, "num_deletes": 254, "total_data_size": 1726013, "memory_usage": 1753920, "flush_reason": "Manual Compaction"}
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 117] Level-0 flush table #185: started
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850756273539, "cf_name": "default", "job": 117, "event": "table_file_creation", "file_number": 185, "file_size": 1137546, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 89346, "largest_seqno": 90264, "table_properties": {"data_size": 1133183, "index_size": 2014, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 9995, "raw_average_key_size": 20, "raw_value_size": 1124337, "raw_average_value_size": 2275, "num_data_blocks": 88, "num_entries": 494, "num_filter_entries": 494, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850699, "oldest_key_time": 1769850699, "file_creation_time": 1769850756, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 185, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 117] Flush lasted 9384 microseconds, and 3003 cpu microseconds.
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.273597) [db/flush_job.cc:967] [default] [JOB 117] Level-0 flush table #185: 1137546 bytes OK
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.273623) [db/memtable_list.cc:519] [default] Level-0 commit table #185 started
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.275398) [db/memtable_list.cc:722] [default] Level-0 commit table #185: memtable #1 done
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.275409) EVENT_LOG_v1 {"time_micros": 1769850756275405, "job": 117, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.275426) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 117] Try to delete WAL files size 1721302, prev total WAL file size 1721302, number of live WAL files 2.
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000181.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.275866) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037373831' seq:72057594037927935, type:22 .. '7061786F730038303333' seq:0, type:0; will stop at (end)
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 118] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 117 Base level 0, inputs: [185(1110KB)], [183(11MB)]
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850756275922, "job": 118, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [185], "files_L6": [183], "score": -1, "input_data_size": 13398951, "oldest_snapshot_seqno": -1}
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 118] Generated table #186: 10900 keys, 11469497 bytes, temperature: kUnknown
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850756425437, "cf_name": "default", "job": 118, "event": "table_file_creation", "file_number": 186, "file_size": 11469497, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11402654, "index_size": 38506, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27269, "raw_key_size": 287978, "raw_average_key_size": 26, "raw_value_size": 11215955, "raw_average_value_size": 1028, "num_data_blocks": 1449, "num_entries": 10900, "num_filter_entries": 10900, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769850756, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 186, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.425868) [db/compaction/compaction_job.cc:1663] [default] [JOB 118] Compacted 1@0 + 1@6 files to L6 => 11469497 bytes
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.430404) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 89.5 rd, 76.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 11.7 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(21.9) write-amplify(10.1) OK, records in: 11423, records dropped: 523 output_compression: NoCompression
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.430425) EVENT_LOG_v1 {"time_micros": 1769850756430415, "job": 118, "event": "compaction_finished", "compaction_time_micros": 149671, "compaction_time_cpu_micros": 28864, "output_level": 6, "num_output_files": 1, "total_output_size": 11469497, "num_input_records": 11423, "num_output_records": 10900, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000185.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850756430721, "job": 118, "event": "table_file_deletion", "file_number": 185}
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000183.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850756432756, "job": 118, "event": "table_file_deletion", "file_number": 183}
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.275767) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.432786) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.432792) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.432794) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.432796) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:12:36 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:12:36.432820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:12:36 np0005603623 podman[330726]: 2026-01-31 09:12:36.966238837 +0000 UTC m=+0.056995179 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:12:37 np0005603623 podman[330727]: 2026-01-31 09:12:37.020929212 +0000 UTC m=+0.108783903 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 04:12:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:37 np0005603623 nova_compute[226235]: 2026-01-31 09:12:37.256 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:37.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:37.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:38 np0005603623 nova_compute[226235]: 2026-01-31 09:12:38.026 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:39 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:39Z|00124|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:78:d6:ba 10.100.0.3
Jan 31 04:12:39 np0005603623 ovn_controller[133449]: 2026-01-31T09:12:39Z|00125|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:78:d6:ba 10.100.0.3
Jan 31 04:12:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:39.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:12:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:39.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:12:40 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 04:12:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:41.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:41.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:42 np0005603623 nova_compute[226235]: 2026-01-31 09:12:42.259 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:43 np0005603623 nova_compute[226235]: 2026-01-31 09:12:43.029 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:12:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:43.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:12:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:43.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:45.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:45.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:47 np0005603623 nova_compute[226235]: 2026-01-31 09:12:47.304 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:47.450 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=96, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=95) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:12:47 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:47.451 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:12:47 np0005603623 nova_compute[226235]: 2026-01-31 09:12:47.451 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:47.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:47.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:48 np0005603623 nova_compute[226235]: 2026-01-31 09:12:48.032 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:49.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:49.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:12:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 77K writes, 327K keys, 77K commit groups, 1.0 writes per commit group, ingest: 0.33 GB, 0.05 MB/s#012Cumulative WAL: 77K writes, 26K syncs, 2.91 writes per sync, written: 0.33 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6778 writes, 27K keys, 6778 commit groups, 1.0 writes per commit group, ingest: 28.97 MB, 0.05 MB/s#012Interval WAL: 6778 writes, 2628 syncs, 2.58 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:12:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:51.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:51.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:52 np0005603623 nova_compute[226235]: 2026-01-31 09:12:52.308 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:53 np0005603623 nova_compute[226235]: 2026-01-31 09:12:53.034 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:53.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:53.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:55.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:55.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:56 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:12:56.454 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '96'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:57 np0005603623 nova_compute[226235]: 2026-01-31 09:12:57.309 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:57.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:57.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:58 np0005603623 nova_compute[226235]: 2026-01-31 09:12:58.036 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:59.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:12:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:59.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:01 np0005603623 podman[331007]: 2026-01-31 09:13:01.444428528 +0000 UTC m=+0.052269681 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 04:13:01 np0005603623 podman[331007]: 2026-01-31 09:13:01.53981929 +0000 UTC m=+0.147660453 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Jan 31 04:13:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:01.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:13:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:01.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:13:02 np0005603623 podman[331159]: 2026-01-31 09:13:02.047972742 +0000 UTC m=+0.045615753 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 04:13:02 np0005603623 podman[331159]: 2026-01-31 09:13:02.078576682 +0000 UTC m=+0.076219683 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 04:13:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:02 np0005603623 podman[331227]: 2026-01-31 09:13:02.254597284 +0000 UTC m=+0.042748573 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, architecture=x86_64, version=2.2.4, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, name=keepalived, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph)
Jan 31 04:13:02 np0005603623 podman[331227]: 2026-01-31 09:13:02.26912917 +0000 UTC m=+0.057280459 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, version=2.2.4, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9)
Jan 31 04:13:02 np0005603623 nova_compute[226235]: 2026-01-31 09:13:02.311 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:03 np0005603623 nova_compute[226235]: 2026-01-31 09:13:03.100 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:13:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:13:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:13:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:13:03 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:13:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:13:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:03.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:13:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:03.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:05.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:05.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:07 np0005603623 nova_compute[226235]: 2026-01-31 09:13:07.314 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:07.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:07 np0005603623 podman[331445]: 2026-01-31 09:13:07.981671557 +0000 UTC m=+0.071662768 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible)
Jan 31 04:13:07 np0005603623 podman[331444]: 2026-01-31 09:13:07.981668777 +0000 UTC m=+0.071733041 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:13:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:07.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:08 np0005603623 nova_compute[226235]: 2026-01-31 09:13:08.102 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:13:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:09.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:13:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:09.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:13:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:13:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:13:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:11.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:13:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:11.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:12 np0005603623 nova_compute[226235]: 2026-01-31 09:13:12.315 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:13 np0005603623 nova_compute[226235]: 2026-01-31 09:13:13.104 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:13.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:14.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:15.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:16.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:17 np0005603623 nova_compute[226235]: 2026-01-31 09:13:17.319 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:17.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:13:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:18.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:13:18 np0005603623 nova_compute[226235]: 2026-01-31 09:13:18.105 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:19.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:20.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:13:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:21.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:13:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:22.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:22 np0005603623 nova_compute[226235]: 2026-01-31 09:13:22.356 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:23 np0005603623 nova_compute[226235]: 2026-01-31 09:13:23.107 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:23.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:24.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:25 np0005603623 nova_compute[226235]: 2026-01-31 09:13:25.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:25 np0005603623 nova_compute[226235]: 2026-01-31 09:13:25.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:13:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:13:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:25.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:13:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:26.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:26 np0005603623 ovn_controller[133449]: 2026-01-31T09:13:26Z|00933|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 04:13:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.201 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.202 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.202 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.202 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.202 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.357 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:13:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3817974669' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.637 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.701 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000dc as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.701 226239 DEBUG nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] skipping disk for instance-000000dc as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:13:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:13:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:27.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.833 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.834 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3892MB free_disk=20.89727783203125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.834 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:13:27 np0005603623 nova_compute[226235]: 2026-01-31 09:13:27.835 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:13:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:28.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.110 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.135 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Instance 4efd9646-aadf-4138-9d36-d47416e0c6e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.135 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.136 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.160 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.177 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.177 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.191 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.212 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.244 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:13:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:13:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2639227182' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.747 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.756 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.817 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.821 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:13:28 np0005603623 nova_compute[226235]: 2026-01-31 09:13:28.821 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.987s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:13:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:29.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:13:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:30.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:13:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:30.164 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:13:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:30.165 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:13:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:30.165 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:13:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:30.749 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=97, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=96) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:13:30 np0005603623 nova_compute[226235]: 2026-01-31 09:13:30.750 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:30.751 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:13:30 np0005603623 nova_compute[226235]: 2026-01-31 09:13:30.823 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:30 np0005603623 nova_compute[226235]: 2026-01-31 09:13:30.823 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:13:30 np0005603623 nova_compute[226235]: 2026-01-31 09:13:30.824 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:13:31 np0005603623 nova_compute[226235]: 2026-01-31 09:13:31.147 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:13:31 np0005603623 nova_compute[226235]: 2026-01-31 09:13:31.148 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquired lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:13:31 np0005603623 nova_compute[226235]: 2026-01-31 09:13:31.148 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:13:31 np0005603623 nova_compute[226235]: 2026-01-31 09:13:31.148 226239 DEBUG nova.objects.instance [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4efd9646-aadf-4138-9d36-d47416e0c6e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:13:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:31.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:32.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:32 np0005603623 nova_compute[226235]: 2026-01-31 09:13:32.368 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:32 np0005603623 nova_compute[226235]: 2026-01-31 09:13:32.531 226239 DEBUG nova.network.neutron [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Updating instance_info_cache with network_info: [{"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:13:32 np0005603623 nova_compute[226235]: 2026-01-31 09:13:32.547 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Releasing lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:13:32 np0005603623 nova_compute[226235]: 2026-01-31 09:13:32.547 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:13:32 np0005603623 nova_compute[226235]: 2026-01-31 09:13:32.547 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:32.753 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '97'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:13:32 np0005603623 nova_compute[226235]: 2026-01-31 09:13:32.910 226239 DEBUG nova.compute.manager [req-318e126f-1970-4aef-b184-19460187580a req-8a8bf9c2-ccb6-4cb6-a104-e6c50d30203a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Received event network-changed-2fa042eb-d400-4d66-9582-1916fd5ca4c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:13:32 np0005603623 nova_compute[226235]: 2026-01-31 09:13:32.911 226239 DEBUG nova.compute.manager [req-318e126f-1970-4aef-b184-19460187580a req-8a8bf9c2-ccb6-4cb6-a104-e6c50d30203a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Refreshing instance network info cache due to event network-changed-2fa042eb-d400-4d66-9582-1916fd5ca4c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:13:32 np0005603623 nova_compute[226235]: 2026-01-31 09:13:32.911 226239 DEBUG oslo_concurrency.lockutils [req-318e126f-1970-4aef-b184-19460187580a req-8a8bf9c2-ccb6-4cb6-a104-e6c50d30203a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:13:32 np0005603623 nova_compute[226235]: 2026-01-31 09:13:32.911 226239 DEBUG oslo_concurrency.lockutils [req-318e126f-1970-4aef-b184-19460187580a req-8a8bf9c2-ccb6-4cb6-a104-e6c50d30203a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:13:32 np0005603623 nova_compute[226235]: 2026-01-31 09:13:32.911 226239 DEBUG nova.network.neutron [req-318e126f-1970-4aef-b184-19460187580a req-8a8bf9c2-ccb6-4cb6-a104-e6c50d30203a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Refreshing network info cache for port 2fa042eb-d400-4d66-9582-1916fd5ca4c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.020 226239 DEBUG oslo_concurrency.lockutils [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "4efd9646-aadf-4138-9d36-d47416e0c6e1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.021 226239 DEBUG oslo_concurrency.lockutils [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.021 226239 DEBUG oslo_concurrency.lockutils [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.022 226239 DEBUG oslo_concurrency.lockutils [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.022 226239 DEBUG oslo_concurrency.lockutils [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.023 226239 INFO nova.compute.manager [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Terminating instance#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.024 226239 DEBUG nova.compute.manager [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.112 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:33 np0005603623 kernel: tap2fa042eb-d4 (unregistering): left promiscuous mode
Jan 31 04:13:33 np0005603623 NetworkManager[48970]: <info>  [1769850813.1827] device (tap2fa042eb-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:13:33 np0005603623 ovn_controller[133449]: 2026-01-31T09:13:33Z|00934|binding|INFO|Releasing lport 2fa042eb-d400-4d66-9582-1916fd5ca4c0 from this chassis (sb_readonly=0)
Jan 31 04:13:33 np0005603623 ovn_controller[133449]: 2026-01-31T09:13:33Z|00935|binding|INFO|Setting lport 2fa042eb-d400-4d66-9582-1916fd5ca4c0 down in Southbound
Jan 31 04:13:33 np0005603623 ovn_controller[133449]: 2026-01-31T09:13:33Z|00936|binding|INFO|Removing iface tap2fa042eb-d4 ovn-installed in OVS
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.188 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.193 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:d6:ba 10.100.0.3'], port_security=['fa:16:3e:78:d6:ba 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '4efd9646-aadf-4138-9d36-d47416e0c6e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-514e8c9e-2a14-4959-839a-40965c82f800', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1a7e4e75-e92f-4762-8e77-5a420647206a dcfda0b5-b009-486f-8bba-7f4dbc382096', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cde7baed-a23e-44c1-8411-520889d37122, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>], logical_port=2fa042eb-d400-4d66-9582-1916fd5ca4c0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2985ce3820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.194 143258 INFO neutron.agent.ovn.metadata.agent [-] Port 2fa042eb-d400-4d66-9582-1916fd5ca4c0 in datapath 514e8c9e-2a14-4959-839a-40965c82f800 unbound from our chassis#033[00m
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.195 143258 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 514e8c9e-2a14-4959-839a-40965c82f800, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.197 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.197 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[90a7a014-3d41-46e9-a494-e8306b75da0b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.198 143258 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800 namespace which is not needed anymore#033[00m
Jan 31 04:13:33 np0005603623 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000dc.scope: Deactivated successfully.
Jan 31 04:13:33 np0005603623 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000dc.scope: Consumed 15.336s CPU time.
Jan 31 04:13:33 np0005603623 systemd-machined[194379]: Machine qemu-102-instance-000000dc terminated.
Jan 31 04:13:33 np0005603623 neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800[330617]: [NOTICE]   (330621) : haproxy version is 2.8.14-c23fe91
Jan 31 04:13:33 np0005603623 neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800[330617]: [NOTICE]   (330621) : path to executable is /usr/sbin/haproxy
Jan 31 04:13:33 np0005603623 neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800[330617]: [WARNING]  (330621) : Exiting Master process...
Jan 31 04:13:33 np0005603623 neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800[330617]: [WARNING]  (330621) : Exiting Master process...
Jan 31 04:13:33 np0005603623 neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800[330617]: [ALERT]    (330621) : Current worker (330623) exited with code 143 (Terminated)
Jan 31 04:13:33 np0005603623 neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800[330617]: [WARNING]  (330621) : All workers exited. Exiting... (0)
Jan 31 04:13:33 np0005603623 systemd[1]: libpod-8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d.scope: Deactivated successfully.
Jan 31 04:13:33 np0005603623 podman[331668]: 2026-01-31 09:13:33.323627172 +0000 UTC m=+0.047906775 container died 8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:13:33 np0005603623 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d-userdata-shm.mount: Deactivated successfully.
Jan 31 04:13:33 np0005603623 systemd[1]: var-lib-containers-storage-overlay-756ea5ae677f52e80f87f3740c41346442d7cd6e7f3b9e90e5868ff4faddbc7e-merged.mount: Deactivated successfully.
Jan 31 04:13:33 np0005603623 podman[331668]: 2026-01-31 09:13:33.356522506 +0000 UTC m=+0.080802099 container cleanup 8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:13:33 np0005603623 systemd[1]: libpod-conmon-8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d.scope: Deactivated successfully.
Jan 31 04:13:33 np0005603623 podman[331700]: 2026-01-31 09:13:33.412423511 +0000 UTC m=+0.041685891 container remove 8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.415 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7d26ed81-7534-4be0-a056-ec8915074f36]: (4, ('Sat Jan 31 09:13:33 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800 (8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d)\n8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d\nSat Jan 31 09:13:33 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800 (8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d)\n8c1b2ae08438d1127566a2aad85be61b7bc80ba620b92af16354daf69623437d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.418 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3ec052-4f2a-4cf9-854c-fdfb3fbcf2b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.420 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap514e8c9e-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:13:33 np0005603623 kernel: tap514e8c9e-20: left promiscuous mode
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.425 226239 DEBUG nova.compute.manager [req-c8822f8f-4a34-4f03-8eb3-74f774aeafe3 req-92744cc8-5b9f-4843-83f0-7e051a886dbb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Received event network-vif-unplugged-2fa042eb-d400-4d66-9582-1916fd5ca4c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.426 226239 DEBUG oslo_concurrency.lockutils [req-c8822f8f-4a34-4f03-8eb3-74f774aeafe3 req-92744cc8-5b9f-4843-83f0-7e051a886dbb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.426 226239 DEBUG oslo_concurrency.lockutils [req-c8822f8f-4a34-4f03-8eb3-74f774aeafe3 req-92744cc8-5b9f-4843-83f0-7e051a886dbb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.427 226239 DEBUG oslo_concurrency.lockutils [req-c8822f8f-4a34-4f03-8eb3-74f774aeafe3 req-92744cc8-5b9f-4843-83f0-7e051a886dbb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.427 226239 DEBUG nova.compute.manager [req-c8822f8f-4a34-4f03-8eb3-74f774aeafe3 req-92744cc8-5b9f-4843-83f0-7e051a886dbb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] No waiting events found dispatching network-vif-unplugged-2fa042eb-d400-4d66-9582-1916fd5ca4c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.427 226239 DEBUG nova.compute.manager [req-c8822f8f-4a34-4f03-8eb3-74f774aeafe3 req-92744cc8-5b9f-4843-83f0-7e051a886dbb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Received event network-vif-unplugged-2fa042eb-d400-4d66-9582-1916fd5ca4c0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.428 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.431 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.432 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.434 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[5abf3423-4774-489a-ac06-ad3a4fa2fe23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.446 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[9b362aa5-9fb7-4c40-9951-a30928becc73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.447 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[4d76c081-7e7a-4f72-81aa-60940f470743]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.459 229607 DEBUG oslo.privsep.daemon [-] privsep: reply[7c97beb2-3e7c-4283-9ce9-9d380ef99942]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 988518, 'reachable_time': 33211, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 331727, 'error': None, 'target': 'ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.461 226239 INFO nova.virt.libvirt.driver [-] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Instance destroyed successfully.#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.461 226239 DEBUG nova.objects.instance [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'resources' on Instance uuid 4efd9646-aadf-4138-9d36-d47416e0c6e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:13:33 np0005603623 systemd[1]: run-netns-ovnmeta\x2d514e8c9e\x2d2a14\x2d4959\x2d839a\x2d40965c82f800.mount: Deactivated successfully.
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.464 143522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-514e8c9e-2a14-4959-839a-40965c82f800 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:13:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:13:33.464 143522 DEBUG oslo.privsep.daemon [-] privsep: reply[39e250c2-3dcb-43eb-8068-b6fedbe99a81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.481 226239 DEBUG nova.virt.libvirt.vif [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:12:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-105635484',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-105635484',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ac',id=220,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFC9ERZJ6vqwSOhb+BxoLuCTPY4zXIPbOdYQjYf18qK5EvFlLu3Fd6dU0UfukMij7wWnpSqWAkqu0LocOazNCHHb52PIeAWKGpoQVtLv/Sw5DcBQogLeHH3fNNhS1TtHkw==',key_name='tempest-TestSecurityGroupsBasicOps-763844494',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:12:25Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-w3k0ayns',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:12:25Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=4efd9646-aadf-4138-9d36-d47416e0c6e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.481 226239 DEBUG nova.network.os_vif_util [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.482 226239 DEBUG nova.network.os_vif_util [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:78:d6:ba,bridge_name='br-int',has_traffic_filtering=True,id=2fa042eb-d400-4d66-9582-1916fd5ca4c0,network=Network(514e8c9e-2a14-4959-839a-40965c82f800),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fa042eb-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.483 226239 DEBUG os_vif [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:d6:ba,bridge_name='br-int',has_traffic_filtering=True,id=2fa042eb-d400-4d66-9582-1916fd5ca4c0,network=Network(514e8c9e-2a14-4959-839a-40965c82f800),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fa042eb-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.485 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.486 226239 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fa042eb-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.487 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.492 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:13:33 np0005603623 nova_compute[226235]: 2026-01-31 09:13:33.495 226239 INFO os_vif [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:d6:ba,bridge_name='br-int',has_traffic_filtering=True,id=2fa042eb-d400-4d66-9582-1916fd5ca4c0,network=Network(514e8c9e-2a14-4959-839a-40965c82f800),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fa042eb-d4')#033[00m
Jan 31 04:13:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:33.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:34.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:34 np0005603623 nova_compute[226235]: 2026-01-31 09:13:34.054 226239 DEBUG nova.network.neutron [req-318e126f-1970-4aef-b184-19460187580a req-8a8bf9c2-ccb6-4cb6-a104-e6c50d30203a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Updated VIF entry in instance network info cache for port 2fa042eb-d400-4d66-9582-1916fd5ca4c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:13:34 np0005603623 nova_compute[226235]: 2026-01-31 09:13:34.054 226239 DEBUG nova.network.neutron [req-318e126f-1970-4aef-b184-19460187580a req-8a8bf9c2-ccb6-4cb6-a104-e6c50d30203a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Updating instance_info_cache with network_info: [{"id": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "address": "fa:16:3e:78:d6:ba", "network": {"id": "514e8c9e-2a14-4959-839a-40965c82f800", "bridge": "br-int", "label": "tempest-network-smoke--422685355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fa042eb-d4", "ovs_interfaceid": "2fa042eb-d400-4d66-9582-1916fd5ca4c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:13:34 np0005603623 nova_compute[226235]: 2026-01-31 09:13:34.076 226239 DEBUG oslo_concurrency.lockutils [req-318e126f-1970-4aef-b184-19460187580a req-8a8bf9c2-ccb6-4cb6-a104-e6c50d30203a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4efd9646-aadf-4138-9d36-d47416e0c6e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:13:34 np0005603623 nova_compute[226235]: 2026-01-31 09:13:34.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:34 np0005603623 nova_compute[226235]: 2026-01-31 09:13:34.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:34 np0005603623 nova_compute[226235]: 2026-01-31 09:13:34.199 226239 INFO nova.virt.libvirt.driver [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Deleting instance files /var/lib/nova/instances/4efd9646-aadf-4138-9d36-d47416e0c6e1_del#033[00m
Jan 31 04:13:34 np0005603623 nova_compute[226235]: 2026-01-31 09:13:34.199 226239 INFO nova.virt.libvirt.driver [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Deletion of /var/lib/nova/instances/4efd9646-aadf-4138-9d36-d47416e0c6e1_del complete#033[00m
Jan 31 04:13:34 np0005603623 nova_compute[226235]: 2026-01-31 09:13:34.248 226239 INFO nova.compute.manager [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Took 1.22 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:13:34 np0005603623 nova_compute[226235]: 2026-01-31 09:13:34.248 226239 DEBUG oslo.service.loopingcall [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:13:34 np0005603623 nova_compute[226235]: 2026-01-31 09:13:34.249 226239 DEBUG nova.compute.manager [-] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:13:34 np0005603623 nova_compute[226235]: 2026-01-31 09:13:34.249 226239 DEBUG nova.network.neutron [-] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:13:35 np0005603623 nova_compute[226235]: 2026-01-31 09:13:35.504 226239 DEBUG nova.compute.manager [req-90cd4dde-d82d-4a64-81e2-aca9f5dac611 req-d9669338-8045-4a8b-a4ee-5e940aea4340 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Received event network-vif-plugged-2fa042eb-d400-4d66-9582-1916fd5ca4c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:13:35 np0005603623 nova_compute[226235]: 2026-01-31 09:13:35.505 226239 DEBUG oslo_concurrency.lockutils [req-90cd4dde-d82d-4a64-81e2-aca9f5dac611 req-d9669338-8045-4a8b-a4ee-5e940aea4340 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:13:35 np0005603623 nova_compute[226235]: 2026-01-31 09:13:35.505 226239 DEBUG oslo_concurrency.lockutils [req-90cd4dde-d82d-4a64-81e2-aca9f5dac611 req-d9669338-8045-4a8b-a4ee-5e940aea4340 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:13:35 np0005603623 nova_compute[226235]: 2026-01-31 09:13:35.505 226239 DEBUG oslo_concurrency.lockutils [req-90cd4dde-d82d-4a64-81e2-aca9f5dac611 req-d9669338-8045-4a8b-a4ee-5e940aea4340 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:13:35 np0005603623 nova_compute[226235]: 2026-01-31 09:13:35.505 226239 DEBUG nova.compute.manager [req-90cd4dde-d82d-4a64-81e2-aca9f5dac611 req-d9669338-8045-4a8b-a4ee-5e940aea4340 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] No waiting events found dispatching network-vif-plugged-2fa042eb-d400-4d66-9582-1916fd5ca4c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:13:35 np0005603623 nova_compute[226235]: 2026-01-31 09:13:35.505 226239 WARNING nova.compute.manager [req-90cd4dde-d82d-4a64-81e2-aca9f5dac611 req-d9669338-8045-4a8b-a4ee-5e940aea4340 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Received unexpected event network-vif-plugged-2fa042eb-d400-4d66-9582-1916fd5ca4c0 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:13:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:35.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:36.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:36 np0005603623 nova_compute[226235]: 2026-01-31 09:13:36.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.196 226239 DEBUG nova.network.neutron [-] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.220 226239 INFO nova.compute.manager [-] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Took 2.97 seconds to deallocate network for instance.#033[00m
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.258 226239 DEBUG oslo_concurrency.lockutils [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.259 226239 DEBUG oslo_concurrency.lockutils [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.277 226239 DEBUG nova.compute.manager [req-f1b02cdb-675a-4026-969c-35fa7e6db2fa req-c10c6b00-e796-45ab-bb97-1f89292e6e29 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Received event network-vif-deleted-2fa042eb-d400-4d66-9582-1916fd5ca4c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.302 226239 DEBUG oslo_concurrency.processutils [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.370 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:13:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2778288480' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.730 226239 DEBUG oslo_concurrency.processutils [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.736 226239 DEBUG nova.compute.provider_tree [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.754 226239 DEBUG nova.scheduler.client.report [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:13:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:37.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.797 226239 DEBUG oslo_concurrency.lockutils [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.834 226239 INFO nova.scheduler.client.report [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Deleted allocations for instance 4efd9646-aadf-4138-9d36-d47416e0c6e1#033[00m
Jan 31 04:13:37 np0005603623 nova_compute[226235]: 2026-01-31 09:13:37.922 226239 DEBUG oslo_concurrency.lockutils [None req-aa665c92-6cb8-42b4-986c-0a020129b7d7 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "4efd9646-aadf-4138-9d36-d47416e0c6e1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:13:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:38.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #187. Immutable memtables: 0.
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.066369) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 119] Flushing memtable with next log file: 187
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850818066443, "job": 119, "event": "flush_started", "num_memtables": 1, "num_entries": 850, "num_deletes": 257, "total_data_size": 1692815, "memory_usage": 1720464, "flush_reason": "Manual Compaction"}
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 119] Level-0 flush table #188: started
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850818073888, "cf_name": "default", "job": 119, "event": "table_file_creation", "file_number": 188, "file_size": 1117811, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 90269, "largest_seqno": 91114, "table_properties": {"data_size": 1113742, "index_size": 1784, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 8905, "raw_average_key_size": 19, "raw_value_size": 1105644, "raw_average_value_size": 2382, "num_data_blocks": 78, "num_entries": 464, "num_filter_entries": 464, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850757, "oldest_key_time": 1769850757, "file_creation_time": 1769850818, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 188, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 119] Flush lasted 7590 microseconds, and 2820 cpu microseconds.
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.073955) [db/flush_job.cc:967] [default] [JOB 119] Level-0 flush table #188: 1117811 bytes OK
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.073976) [db/memtable_list.cc:519] [default] Level-0 commit table #188 started
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.076264) [db/memtable_list.cc:722] [default] Level-0 commit table #188: memtable #1 done
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.076283) EVENT_LOG_v1 {"time_micros": 1769850818076277, "job": 119, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.076302) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 119] Try to delete WAL files size 1688420, prev total WAL file size 1688420, number of live WAL files 2.
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000184.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.076752) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353134' seq:72057594037927935, type:22 .. '6C6F676D0033373637' seq:0, type:0; will stop at (end)
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 120] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 119 Base level 0, inputs: [188(1091KB)], [186(10MB)]
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850818076796, "job": 120, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [188], "files_L6": [186], "score": -1, "input_data_size": 12587308, "oldest_snapshot_seqno": -1}
Jan 31 04:13:38 np0005603623 nova_compute[226235]: 2026-01-31 09:13:38.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 120] Generated table #189: 10835 keys, 12446582 bytes, temperature: kUnknown
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850818218808, "cf_name": "default", "job": 120, "event": "table_file_creation", "file_number": 189, "file_size": 12446582, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12378691, "index_size": 39682, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27141, "raw_key_size": 287562, "raw_average_key_size": 26, "raw_value_size": 12191764, "raw_average_value_size": 1125, "num_data_blocks": 1498, "num_entries": 10835, "num_filter_entries": 10835, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769850818, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 189, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.219198) [db/compaction/compaction_job.cc:1663] [default] [JOB 120] Compacted 1@0 + 1@6 files to L6 => 12446582 bytes
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.220561) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 88.6 rd, 87.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 10.9 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(22.4) write-amplify(11.1) OK, records in: 11364, records dropped: 529 output_compression: NoCompression
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.220594) EVENT_LOG_v1 {"time_micros": 1769850818220580, "job": 120, "event": "compaction_finished", "compaction_time_micros": 142120, "compaction_time_cpu_micros": 26981, "output_level": 6, "num_output_files": 1, "total_output_size": 12446582, "num_input_records": 11364, "num_output_records": 10835, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000188.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850818220949, "job": 120, "event": "table_file_deletion", "file_number": 188}
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000186.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850818223212, "job": 120, "event": "table_file_deletion", "file_number": 186}
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.076715) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.223328) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.223334) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.223335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.223337) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:13:38 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:13:38.223339) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:13:38 np0005603623 nova_compute[226235]: 2026-01-31 09:13:38.487 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:38 np0005603623 podman[331773]: 2026-01-31 09:13:38.952979918 +0000 UTC m=+0.044493578 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 04:13:38 np0005603623 podman[331774]: 2026-01-31 09:13:38.990059711 +0000 UTC m=+0.078985379 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:13:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:13:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:39.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:13:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:40.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:41.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:42.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:42 np0005603623 nova_compute[226235]: 2026-01-31 09:13:42.372 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:43 np0005603623 nova_compute[226235]: 2026-01-31 09:13:43.526 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:43.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:44.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:45.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:46.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:46 np0005603623 nova_compute[226235]: 2026-01-31 09:13:46.284 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:46 np0005603623 nova_compute[226235]: 2026-01-31 09:13:46.313 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:47 np0005603623 nova_compute[226235]: 2026-01-31 09:13:47.373 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:47.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:48.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:48 np0005603623 nova_compute[226235]: 2026-01-31 09:13:48.459 226239 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850813.4582722, 4efd9646-aadf-4138-9d36-d47416e0c6e1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:13:48 np0005603623 nova_compute[226235]: 2026-01-31 09:13:48.460 226239 INFO nova.compute.manager [-] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:13:48 np0005603623 nova_compute[226235]: 2026-01-31 09:13:48.528 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:49 np0005603623 nova_compute[226235]: 2026-01-31 09:13:49.437 226239 DEBUG nova.compute.manager [None req-1dccc4e4-0648-4c67-af60-24a459c0790a - - - - - -] [instance: 4efd9646-aadf-4138-9d36-d47416e0c6e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:13:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:13:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:49.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:13:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:13:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:50.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:13:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:51.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:13:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:52.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:13:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:52 np0005603623 nova_compute[226235]: 2026-01-31 09:13:52.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:52 np0005603623 nova_compute[226235]: 2026-01-31 09:13:52.376 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:53 np0005603623 nova_compute[226235]: 2026-01-31 09:13:53.530 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:13:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:53.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:13:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:54.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:55.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:13:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:56.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:13:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:57 np0005603623 nova_compute[226235]: 2026-01-31 09:13:57.378 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:57.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:13:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:58.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:13:58 np0005603623 nova_compute[226235]: 2026-01-31 09:13:58.532 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:13:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:59.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:00.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:01.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:02.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:02 np0005603623 nova_compute[226235]: 2026-01-31 09:14:02.380 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:03 np0005603623 nova_compute[226235]: 2026-01-31 09:14:03.534 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:03.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:04.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:14:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:05.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:14:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:06.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:07 np0005603623 nova_compute[226235]: 2026-01-31 09:14:07.382 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:07.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:08.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:08 np0005603623 nova_compute[226235]: 2026-01-31 09:14:08.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:09.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:09 np0005603623 podman[331936]: 2026-01-31 09:14:09.953501774 +0000 UTC m=+0.045384556 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 04:14:09 np0005603623 podman[331937]: 2026-01-31 09:14:09.971944573 +0000 UTC m=+0.063697330 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 04:14:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:10.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:14:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:14:11 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:14:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:14:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:11.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:14:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:12.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:12 np0005603623 nova_compute[226235]: 2026-01-31 09:14:12.399 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:13 np0005603623 nova_compute[226235]: 2026-01-31 09:14:13.541 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:13.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:14.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:14:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3145049656' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:14:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:14:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3145049656' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:14:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:15.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:15 np0005603623 nova_compute[226235]: 2026-01-31 09:14:15.946 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:14:15.946 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=98, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=97) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:14:15 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:14:15.947 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:14:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:16.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:14:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:14:16 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:14:16.950 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '98'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:14:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:17 np0005603623 nova_compute[226235]: 2026-01-31 09:14:17.401 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:17.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:18.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:18 np0005603623 nova_compute[226235]: 2026-01-31 09:14:18.542 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:19.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:20.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:14:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:21.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:14:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:22.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:22 np0005603623 nova_compute[226235]: 2026-01-31 09:14:22.402 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:23 np0005603623 nova_compute[226235]: 2026-01-31 09:14:23.583 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:14:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:23.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:14:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:14:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:24.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:14:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:25.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:14:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:26.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:14:26 np0005603623 nova_compute[226235]: 2026-01-31 09:14:26.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:26 np0005603623 nova_compute[226235]: 2026-01-31 09:14:26.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:14:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:27 np0005603623 nova_compute[226235]: 2026-01-31 09:14:27.406 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:14:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:27.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:14:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:28.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:28 np0005603623 nova_compute[226235]: 2026-01-31 09:14:28.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:28 np0005603623 nova_compute[226235]: 2026-01-31 09:14:28.584 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:28 np0005603623 nova_compute[226235]: 2026-01-31 09:14:28.723 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:14:28 np0005603623 nova_compute[226235]: 2026-01-31 09:14:28.723 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:14:28 np0005603623 nova_compute[226235]: 2026-01-31 09:14:28.723 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:14:28 np0005603623 nova_compute[226235]: 2026-01-31 09:14:28.723 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:14:28 np0005603623 nova_compute[226235]: 2026-01-31 09:14:28.724 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:14:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:14:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2623749212' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:14:29 np0005603623 nova_compute[226235]: 2026-01-31 09:14:29.197 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:14:29 np0005603623 nova_compute[226235]: 2026-01-31 09:14:29.380 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:14:29 np0005603623 nova_compute[226235]: 2026-01-31 09:14:29.381 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4109MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:14:29 np0005603623 nova_compute[226235]: 2026-01-31 09:14:29.382 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:14:29 np0005603623 nova_compute[226235]: 2026-01-31 09:14:29.382 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:14:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:14:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:29.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:14:29 np0005603623 nova_compute[226235]: 2026-01-31 09:14:29.860 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:14:29 np0005603623 nova_compute[226235]: 2026-01-31 09:14:29.860 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:14:29 np0005603623 nova_compute[226235]: 2026-01-31 09:14:29.953 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:14:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:14:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:30.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:14:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:14:30.165 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:14:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:14:30.166 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:14:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:14:30.166 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:14:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:14:30 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1970731840' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:14:30 np0005603623 nova_compute[226235]: 2026-01-31 09:14:30.391 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:14:30 np0005603623 nova_compute[226235]: 2026-01-31 09:14:30.398 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:14:30 np0005603623 nova_compute[226235]: 2026-01-31 09:14:30.429 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:14:30 np0005603623 nova_compute[226235]: 2026-01-31 09:14:30.496 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:14:30 np0005603623 nova_compute[226235]: 2026-01-31 09:14:30.497 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:14:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:31.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:32.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:32 np0005603623 nova_compute[226235]: 2026-01-31 09:14:32.408 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:32 np0005603623 nova_compute[226235]: 2026-01-31 09:14:32.498 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:32 np0005603623 nova_compute[226235]: 2026-01-31 09:14:32.498 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:14:32 np0005603623 nova_compute[226235]: 2026-01-31 09:14:32.498 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:14:33 np0005603623 nova_compute[226235]: 2026-01-31 09:14:33.585 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:33.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:34.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:35.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:36.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:37 np0005603623 nova_compute[226235]: 2026-01-31 09:14:37.410 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:37.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:37 np0005603623 nova_compute[226235]: 2026-01-31 09:14:37.925 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:14:37 np0005603623 nova_compute[226235]: 2026-01-31 09:14:37.926 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:37 np0005603623 nova_compute[226235]: 2026-01-31 09:14:37.926 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:37 np0005603623 nova_compute[226235]: 2026-01-31 09:14:37.927 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:37 np0005603623 nova_compute[226235]: 2026-01-31 09:14:37.927 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:38.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:38 np0005603623 nova_compute[226235]: 2026-01-31 09:14:38.586 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:39 np0005603623 nova_compute[226235]: 2026-01-31 09:14:39.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:39 np0005603623 nova_compute[226235]: 2026-01-31 09:14:39.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:39.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:14:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:40.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:14:40 np0005603623 podman[332272]: 2026-01-31 09:14:40.950082173 +0000 UTC m=+0.045818448 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 31 04:14:40 np0005603623 podman[332273]: 2026-01-31 09:14:40.972870988 +0000 UTC m=+0.070019158 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 04:14:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:41.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:42.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:42 np0005603623 nova_compute[226235]: 2026-01-31 09:14:42.412 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:43 np0005603623 nova_compute[226235]: 2026-01-31 09:14:43.587 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:43.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:44.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:45.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:46 np0005603623 ovn_controller[133449]: 2026-01-31T09:14:46Z|00937|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 31 04:14:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:14:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:46.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:14:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:47 np0005603623 nova_compute[226235]: 2026-01-31 09:14:47.413 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:14:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:47.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:14:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:48.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:48 np0005603623 nova_compute[226235]: 2026-01-31 09:14:48.588 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:14:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:49.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:14:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:50.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:51.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:52.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:52 np0005603623 nova_compute[226235]: 2026-01-31 09:14:52.415 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:53 np0005603623 nova_compute[226235]: 2026-01-31 09:14:53.590 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:53.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:54.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:14:55.676 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=99, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=98) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:14:55 np0005603623 nova_compute[226235]: 2026-01-31 09:14:55.676 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:55 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:14:55.677 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:14:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:14:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:55.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:14:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:56.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:57 np0005603623 nova_compute[226235]: 2026-01-31 09:14:57.416 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:14:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:57.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:14:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:58.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:58 np0005603623 nova_compute[226235]: 2026-01-31 09:14:58.591 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:14:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:59.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:00.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:01 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:15:01.679 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '99'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:15:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:01.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:15:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:02.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:15:02 np0005603623 nova_compute[226235]: 2026-01-31 09:15:02.418 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:03 np0005603623 nova_compute[226235]: 2026-01-31 09:15:03.594 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:03.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:04.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:05.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:15:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:06.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:15:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:07 np0005603623 nova_compute[226235]: 2026-01-31 09:15:07.420 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:15:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:07.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:15:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:15:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:08.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:15:08 np0005603623 nova_compute[226235]: 2026-01-31 09:15:08.595 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:09.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:10.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:11.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:11 np0005603623 podman[332431]: 2026-01-31 09:15:11.957327242 +0000 UTC m=+0.051137986 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 31 04:15:12 np0005603623 podman[332432]: 2026-01-31 09:15:12.032937135 +0000 UTC m=+0.119002666 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127)
Jan 31 04:15:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:12.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:12 np0005603623 nova_compute[226235]: 2026-01-31 09:15:12.422 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:13 np0005603623 nova_compute[226235]: 2026-01-31 09:15:13.597 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:13.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:14.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:15:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/295832057' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:15:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:15:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/295832057' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:15:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:15.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:16.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:15:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:15:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:15:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:17 np0005603623 nova_compute[226235]: 2026-01-31 09:15:17.423 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:17.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:18.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:18 np0005603623 nova_compute[226235]: 2026-01-31 09:15:18.598 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:19.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:20.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:21.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:22.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:22 np0005603623 nova_compute[226235]: 2026-01-31 09:15:22.425 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:15:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:15:23 np0005603623 nova_compute[226235]: 2026-01-31 09:15:23.599 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:23.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:15:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:24.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:15:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:25.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:26 np0005603623 nova_compute[226235]: 2026-01-31 09:15:26.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:26 np0005603623 nova_compute[226235]: 2026-01-31 09:15:26.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:15:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:26.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:27 np0005603623 nova_compute[226235]: 2026-01-31 09:15:27.426 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:27.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:28.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:28 np0005603623 nova_compute[226235]: 2026-01-31 09:15:28.601 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:29.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:15:30.166 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:15:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:15:30.167 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:15:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:15:30.167 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.181 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.182 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.182 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.182 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.182 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:15:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:15:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:30.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:15:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:15:30 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2219355343' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.613 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.740 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.742 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4107MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.742 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.742 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.899 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.900 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:15:30 np0005603623 nova_compute[226235]: 2026-01-31 09:15:30.916 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:15:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:15:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1717991137' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:15:31 np0005603623 nova_compute[226235]: 2026-01-31 09:15:31.331 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:15:31 np0005603623 nova_compute[226235]: 2026-01-31 09:15:31.337 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:15:31 np0005603623 nova_compute[226235]: 2026-01-31 09:15:31.361 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:15:31 np0005603623 nova_compute[226235]: 2026-01-31 09:15:31.363 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:15:31 np0005603623 nova_compute[226235]: 2026-01-31 09:15:31.363 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:15:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:31.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:32.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:32 np0005603623 nova_compute[226235]: 2026-01-31 09:15:32.363 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:32 np0005603623 nova_compute[226235]: 2026-01-31 09:15:32.364 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:15:32 np0005603623 nova_compute[226235]: 2026-01-31 09:15:32.364 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:15:32 np0005603623 nova_compute[226235]: 2026-01-31 09:15:32.385 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:15:32 np0005603623 nova_compute[226235]: 2026-01-31 09:15:32.427 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:33 np0005603623 nova_compute[226235]: 2026-01-31 09:15:33.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:33 np0005603623 nova_compute[226235]: 2026-01-31 09:15:33.603 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:33.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:34.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:35.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:36 np0005603623 nova_compute[226235]: 2026-01-31 09:15:36.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:36 np0005603623 nova_compute[226235]: 2026-01-31 09:15:36.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:36.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:37 np0005603623 nova_compute[226235]: 2026-01-31 09:15:37.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:37 np0005603623 nova_compute[226235]: 2026-01-31 09:15:37.567 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:37.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:15:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:38.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:15:38 np0005603623 nova_compute[226235]: 2026-01-31 09:15:38.604 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:39 np0005603623 nova_compute[226235]: 2026-01-31 09:15:39.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:39.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:40 np0005603623 nova_compute[226235]: 2026-01-31 09:15:40.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:40.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:41.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:42.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:42 np0005603623 nova_compute[226235]: 2026-01-31 09:15:42.570 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:42 np0005603623 podman[332768]: 2026-01-31 09:15:42.989949065 +0000 UTC m=+0.077709649 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 04:15:43 np0005603623 podman[332767]: 2026-01-31 09:15:43.004081179 +0000 UTC m=+0.089140008 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:15:43 np0005603623 nova_compute[226235]: 2026-01-31 09:15:43.605 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:43.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:44.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:15:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:45.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:15:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:46.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:47 np0005603623 nova_compute[226235]: 2026-01-31 09:15:47.572 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:47.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:48.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:48 np0005603623 nova_compute[226235]: 2026-01-31 09:15:48.606 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:15:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:49.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:15:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:50.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:51.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:52 np0005603623 nova_compute[226235]: 2026-01-31 09:15:52.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:52.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:52 np0005603623 nova_compute[226235]: 2026-01-31 09:15:52.619 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:53 np0005603623 nova_compute[226235]: 2026-01-31 09:15:53.607 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:53.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:54.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:55.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:15:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:56.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:15:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:57 np0005603623 nova_compute[226235]: 2026-01-31 09:15:57.620 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:15:57.817 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=100, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=99) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:15:57 np0005603623 nova_compute[226235]: 2026-01-31 09:15:57.818 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:57 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:15:57.819 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:15:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:57.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:58.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:58 np0005603623 nova_compute[226235]: 2026-01-31 09:15:58.609 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:15:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:15:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:59.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:00.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:01.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:02.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:02 np0005603623 nova_compute[226235]: 2026-01-31 09:16:02.621 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:03 np0005603623 nova_compute[226235]: 2026-01-31 09:16:03.610 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:16:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:04.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:16:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:04.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:16:04.821 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '100'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:16:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:06.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:06.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:07 np0005603623 nova_compute[226235]: 2026-01-31 09:16:07.622 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:08.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:08.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:08 np0005603623 nova_compute[226235]: 2026-01-31 09:16:08.612 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:10.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:10.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:12.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:12.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:12 np0005603623 nova_compute[226235]: 2026-01-31 09:16:12.623 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:13 np0005603623 nova_compute[226235]: 2026-01-31 09:16:13.613 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:13 np0005603623 podman[332925]: 2026-01-31 09:16:13.947057371 +0000 UTC m=+0.039584904 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 04:16:13 np0005603623 podman[332926]: 2026-01-31 09:16:13.971506818 +0000 UTC m=+0.060992625 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:16:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:14.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:14.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:16:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3153480839' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:16:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:16:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3153480839' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:16:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:16.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:16.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:17 np0005603623 nova_compute[226235]: 2026-01-31 09:16:17.625 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:18.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:18.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:18 np0005603623 nova_compute[226235]: 2026-01-31 09:16:18.635 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:20.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:20.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:22.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:22.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #190. Immutable memtables: 0.
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.625289) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 121] Flushing memtable with next log file: 190
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850982625325, "job": 121, "event": "flush_started", "num_memtables": 1, "num_entries": 1750, "num_deletes": 251, "total_data_size": 4250787, "memory_usage": 4306672, "flush_reason": "Manual Compaction"}
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 121] Level-0 flush table #191: started
Jan 31 04:16:22 np0005603623 nova_compute[226235]: 2026-01-31 09:16:22.626 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850982661000, "cf_name": "default", "job": 121, "event": "table_file_creation", "file_number": 191, "file_size": 2795093, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 91119, "largest_seqno": 92864, "table_properties": {"data_size": 2787734, "index_size": 4365, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 15032, "raw_average_key_size": 20, "raw_value_size": 2773206, "raw_average_value_size": 3707, "num_data_blocks": 192, "num_entries": 748, "num_filter_entries": 748, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850818, "oldest_key_time": 1769850818, "file_creation_time": 1769850982, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 191, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 121] Flush lasted 35755 microseconds, and 7181 cpu microseconds.
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.661043) [db/flush_job.cc:967] [default] [JOB 121] Level-0 flush table #191: 2795093 bytes OK
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.661059) [db/memtable_list.cc:519] [default] Level-0 commit table #191 started
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.664795) [db/memtable_list.cc:722] [default] Level-0 commit table #191: memtable #1 done
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.664808) EVENT_LOG_v1 {"time_micros": 1769850982664804, "job": 121, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.664823) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 121] Try to delete WAL files size 4242906, prev total WAL file size 4244832, number of live WAL files 2.
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000187.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.665508) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038303332' seq:72057594037927935, type:22 .. '7061786F730038323834' seq:0, type:0; will stop at (end)
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 122] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 121 Base level 0, inputs: [191(2729KB)], [189(11MB)]
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850982665546, "job": 122, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [191], "files_L6": [189], "score": -1, "input_data_size": 15241675, "oldest_snapshot_seqno": -1}
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 122] Generated table #192: 11066 keys, 13333870 bytes, temperature: kUnknown
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850982865890, "cf_name": "default", "job": 122, "event": "table_file_creation", "file_number": 192, "file_size": 13333870, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13263720, "index_size": 41365, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27717, "raw_key_size": 293024, "raw_average_key_size": 26, "raw_value_size": 13071909, "raw_average_value_size": 1181, "num_data_blocks": 1566, "num_entries": 11066, "num_filter_entries": 11066, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769850982, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 192, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.866121) [db/compaction/compaction_job.cc:1663] [default] [JOB 122] Compacted 1@0 + 1@6 files to L6 => 13333870 bytes
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.869410) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 76.0 rd, 66.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 11.9 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(10.2) write-amplify(4.8) OK, records in: 11583, records dropped: 517 output_compression: NoCompression
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.869429) EVENT_LOG_v1 {"time_micros": 1769850982869419, "job": 122, "event": "compaction_finished", "compaction_time_micros": 200420, "compaction_time_cpu_micros": 23881, "output_level": 6, "num_output_files": 1, "total_output_size": 13333870, "num_input_records": 11583, "num_output_records": 11066, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000191.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850982869762, "job": 122, "event": "table_file_deletion", "file_number": 191}
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000189.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850982870808, "job": 122, "event": "table_file_deletion", "file_number": 189}
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.665430) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.870865) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.870870) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.870872) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.870874) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:16:22.870876) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:23 np0005603623 nova_compute[226235]: 2026-01-31 09:16:23.664 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 04:16:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 04:16:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 04:16:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:16:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:23 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:16:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:24.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:24.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:26.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:26.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:27 np0005603623 nova_compute[226235]: 2026-01-31 09:16:27.628 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:28.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:28 np0005603623 nova_compute[226235]: 2026-01-31 09:16:28.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:28 np0005603623 nova_compute[226235]: 2026-01-31 09:16:28.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:16:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:28.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:28 np0005603623 nova_compute[226235]: 2026-01-31 09:16:28.667 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:30.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:16:30.168 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:16:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:16:30.169 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:16:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:16:30.169 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:16:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:30.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.182 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.182 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.182 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.183 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.183 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:16:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:16:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1628176122' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.579 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.716 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.718 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4094MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.718 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.718 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.811 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.811 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:16:31 np0005603623 nova_compute[226235]: 2026-01-31 09:16:31.853 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:16:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:32.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:16:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/24850775' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:16:32 np0005603623 nova_compute[226235]: 2026-01-31 09:16:32.273 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:16:32 np0005603623 nova_compute[226235]: 2026-01-31 09:16:32.278 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:16:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:32.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:32 np0005603623 nova_compute[226235]: 2026-01-31 09:16:32.295 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:16:32 np0005603623 nova_compute[226235]: 2026-01-31 09:16:32.297 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:16:32 np0005603623 nova_compute[226235]: 2026-01-31 09:16:32.297 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:16:32 np0005603623 nova_compute[226235]: 2026-01-31 09:16:32.629 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:33 np0005603623 nova_compute[226235]: 2026-01-31 09:16:33.668 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:34.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:34.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:34 np0005603623 nova_compute[226235]: 2026-01-31 09:16:34.297 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:34 np0005603623 nova_compute[226235]: 2026-01-31 09:16:34.298 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:16:34 np0005603623 nova_compute[226235]: 2026-01-31 09:16:34.298 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:16:34 np0005603623 nova_compute[226235]: 2026-01-31 09:16:34.316 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:16:34 np0005603623 nova_compute[226235]: 2026-01-31 09:16:34.316 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:36.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:36 np0005603623 nova_compute[226235]: 2026-01-31 09:16:36.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:36.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:37 np0005603623 nova_compute[226235]: 2026-01-31 09:16:37.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:37 np0005603623 nova_compute[226235]: 2026-01-31 09:16:37.631 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:38.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:38.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:38 np0005603623 nova_compute[226235]: 2026-01-31 09:16:38.671 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:39 np0005603623 nova_compute[226235]: 2026-01-31 09:16:39.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:40.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:40 np0005603623 nova_compute[226235]: 2026-01-31 09:16:40.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:40.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:16:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:42.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:16:42 np0005603623 nova_compute[226235]: 2026-01-31 09:16:42.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:42.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:42 np0005603623 nova_compute[226235]: 2026-01-31 09:16:42.632 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:43 np0005603623 nova_compute[226235]: 2026-01-31 09:16:43.672 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:44.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:44.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:44 np0005603623 podman[333385]: 2026-01-31 09:16:44.950139686 +0000 UTC m=+0.043543138 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:16:44 np0005603623 podman[333386]: 2026-01-31 09:16:44.992026461 +0000 UTC m=+0.084357578 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:16:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:46.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:46.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:47 np0005603623 nova_compute[226235]: 2026-01-31 09:16:47.634 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:48.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:48.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:48 np0005603623 nova_compute[226235]: 2026-01-31 09:16:48.674 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:50.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:50.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:52.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:52.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:52 np0005603623 nova_compute[226235]: 2026-01-31 09:16:52.635 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:53 np0005603623 nova_compute[226235]: 2026-01-31 09:16:53.676 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:54.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:16:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:54.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:16:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:56.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:56.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:57 np0005603623 nova_compute[226235]: 2026-01-31 09:16:57.638 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:58.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:16:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:58.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:58 np0005603623 nova_compute[226235]: 2026-01-31 09:16:58.678 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:00.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:17:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:00.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:17:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:02.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:02.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:02 np0005603623 nova_compute[226235]: 2026-01-31 09:17:02.639 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:03 np0005603623 nova_compute[226235]: 2026-01-31 09:17:03.680 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:04.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:04.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:05 np0005603623 nova_compute[226235]: 2026-01-31 09:17:05.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:05 np0005603623 nova_compute[226235]: 2026-01-31 09:17:05.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:17:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:17:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:06.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:17:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:06.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:07 np0005603623 nova_compute[226235]: 2026-01-31 09:17:07.641 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:08.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:08.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:08 np0005603623 nova_compute[226235]: 2026-01-31 09:17:08.682 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:17:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:10.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:17:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:10.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:12.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:17:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:12.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:17:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:17:12 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1323932115' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:17:12 np0005603623 nova_compute[226235]: 2026-01-31 09:17:12.642 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:13 np0005603623 nova_compute[226235]: 2026-01-31 09:17:13.725 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:14.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:14.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:15 np0005603623 podman[333545]: 2026-01-31 09:17:15.960021503 +0000 UTC m=+0.056295298 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 04:17:15 np0005603623 podman[333546]: 2026-01-31 09:17:15.990114167 +0000 UTC m=+0.081714054 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 04:17:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:17:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:16.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:17:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:16.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:17 np0005603623 nova_compute[226235]: 2026-01-31 09:17:17.695 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e426 e426: 3 total, 3 up, 3 in
Jan 31 04:17:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:17:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:18.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:17:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:18.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:18 np0005603623 nova_compute[226235]: 2026-01-31 09:17:18.778 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:20.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:20.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:21 np0005603623 nova_compute[226235]: 2026-01-31 09:17:21.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e427 e427: 3 total, 3 up, 3 in
Jan 31 04:17:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:22.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:22.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:22 np0005603623 nova_compute[226235]: 2026-01-31 09:17:22.697 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e428 e428: 3 total, 3 up, 3 in
Jan 31 04:17:23 np0005603623 nova_compute[226235]: 2026-01-31 09:17:23.780 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:24.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e429 e429: 3 total, 3 up, 3 in
Jan 31 04:17:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:24.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:26.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:26.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e429 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:27 np0005603623 nova_compute[226235]: 2026-01-31 09:17:27.166 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:27 np0005603623 nova_compute[226235]: 2026-01-31 09:17:27.166 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:17:27 np0005603623 nova_compute[226235]: 2026-01-31 09:17:27.186 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:17:27 np0005603623 nova_compute[226235]: 2026-01-31 09:17:27.699 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 e430: 3 total, 3 up, 3 in
Jan 31 04:17:28 np0005603623 nova_compute[226235]: 2026-01-31 09:17:28.175 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:28 np0005603623 nova_compute[226235]: 2026-01-31 09:17:28.175 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:17:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:28.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:28.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:28 np0005603623 nova_compute[226235]: 2026-01-31 09:17:28.782 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:17:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:17:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:17:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:17:30.170 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:17:30.170 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:17:30.170 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:30.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:30.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.174 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.174 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.174 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.175 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.175 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:17:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:32.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:32.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:17:32.576 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=101, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=100) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:17:32 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:17:32.577 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.579 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:17:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/535608558' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.606 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.699 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.741 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.742 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4128MB free_disk=20.94265365600586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.743 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:32 np0005603623 nova_compute[226235]: 2026-01-31 09:17:32.743 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:33 np0005603623 nova_compute[226235]: 2026-01-31 09:17:33.064 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:17:33 np0005603623 nova_compute[226235]: 2026-01-31 09:17:33.065 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:17:33 np0005603623 nova_compute[226235]: 2026-01-31 09:17:33.079 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:17:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:17:33 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1426851570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:17:33 np0005603623 nova_compute[226235]: 2026-01-31 09:17:33.490 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:17:33 np0005603623 nova_compute[226235]: 2026-01-31 09:17:33.497 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:17:33 np0005603623 nova_compute[226235]: 2026-01-31 09:17:33.516 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:17:33 np0005603623 nova_compute[226235]: 2026-01-31 09:17:33.543 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:17:33 np0005603623 nova_compute[226235]: 2026-01-31 09:17:33.544 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:33 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:17:33.579 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '101'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:17:33 np0005603623 nova_compute[226235]: 2026-01-31 09:17:33.808 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:34.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:17:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:34.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:17:35 np0005603623 nova_compute[226235]: 2026-01-31 09:17:35.544 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:35 np0005603623 nova_compute[226235]: 2026-01-31 09:17:35.545 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:17:35 np0005603623 nova_compute[226235]: 2026-01-31 09:17:35.545 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:17:35 np0005603623 nova_compute[226235]: 2026-01-31 09:17:35.560 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:17:35 np0005603623 nova_compute[226235]: 2026-01-31 09:17:35.561 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:36.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:17:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:17:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:36.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:37 np0005603623 nova_compute[226235]: 2026-01-31 09:17:37.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:37 np0005603623 nova_compute[226235]: 2026-01-31 09:17:37.700 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:38.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:38.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:38 np0005603623 nova_compute[226235]: 2026-01-31 09:17:38.810 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:39 np0005603623 nova_compute[226235]: 2026-01-31 09:17:39.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:40 np0005603623 nova_compute[226235]: 2026-01-31 09:17:40.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:17:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:40.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:17:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:40.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:41 np0005603623 nova_compute[226235]: 2026-01-31 09:17:41.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:42 np0005603623 nova_compute[226235]: 2026-01-31 09:17:42.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:42.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:42.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:42 np0005603623 nova_compute[226235]: 2026-01-31 09:17:42.704 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:43 np0005603623 nova_compute[226235]: 2026-01-31 09:17:43.812 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:44.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:17:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:44.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:17:46 np0005603623 podman[333901]: 2026-01-31 09:17:46.171126723 +0000 UTC m=+0.074237871 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 04:17:46 np0005603623 podman[333902]: 2026-01-31 09:17:46.183121419 +0000 UTC m=+0.082678405 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:17:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:17:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:46.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:17:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:46.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:47 np0005603623 nova_compute[226235]: 2026-01-31 09:17:47.740 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:48.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:48.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:48 np0005603623 nova_compute[226235]: 2026-01-31 09:17:48.849 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:50.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:50.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:52 np0005603623 nova_compute[226235]: 2026-01-31 09:17:52.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:52.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:52.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:52 np0005603623 nova_compute[226235]: 2026-01-31 09:17:52.741 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:53 np0005603623 nova_compute[226235]: 2026-01-31 09:17:53.851 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:17:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:54.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:17:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:54.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:56.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:56.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:57 np0005603623 nova_compute[226235]: 2026-01-31 09:17:57.775 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:58.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:17:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:58.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:58 np0005603623 nova_compute[226235]: 2026-01-31 09:17:58.888 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:00.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:00.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:02.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:02.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:02 np0005603623 nova_compute[226235]: 2026-01-31 09:18:02.778 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:04 np0005603623 nova_compute[226235]: 2026-01-31 09:18:04.210 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:04.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:04.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:06.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:06.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:07 np0005603623 nova_compute[226235]: 2026-01-31 09:18:07.811 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:08.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:08.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:09 np0005603623 nova_compute[226235]: 2026-01-31 09:18:09.212 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:10.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:18:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:10.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:18:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:18:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:12.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:18:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:12.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:12 np0005603623 nova_compute[226235]: 2026-01-31 09:18:12.838 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:14 np0005603623 nova_compute[226235]: 2026-01-31 09:18:14.215 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:18:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:14.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:18:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:14.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:18:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3007947053' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:18:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:18:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3007947053' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:18:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:16.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:18:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:16.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:18:16 np0005603623 podman[334038]: 2026-01-31 09:18:16.964115847 +0000 UTC m=+0.055942616 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Jan 31 04:18:16 np0005603623 podman[334039]: 2026-01-31 09:18:16.989295027 +0000 UTC m=+0.082647995 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 04:18:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:17 np0005603623 nova_compute[226235]: 2026-01-31 09:18:17.890 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:18:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:18.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:18:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:18.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:19 np0005603623 nova_compute[226235]: 2026-01-31 09:18:19.217 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:18:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:20.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:18:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:20.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:18:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:22.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:18:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:18:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:22.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:18:22 np0005603623 nova_compute[226235]: 2026-01-31 09:18:22.891 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:24 np0005603623 nova_compute[226235]: 2026-01-31 09:18:24.219 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:24.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:24.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:26.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:18:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:26.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:18:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:18:26.438 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=102, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=101) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:18:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:18:26.439 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:18:26 np0005603623 nova_compute[226235]: 2026-01-31 09:18:26.439 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:26 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:18:26.440 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '102'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:18:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:27 np0005603623 nova_compute[226235]: 2026-01-31 09:18:27.894 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:28.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:28.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:29 np0005603623 nova_compute[226235]: 2026-01-31 09:18:29.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:29 np0005603623 nova_compute[226235]: 2026-01-31 09:18:29.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:18:29 np0005603623 nova_compute[226235]: 2026-01-31 09:18:29.220 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:18:30.171 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:18:30.171 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:18:30.171 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:30.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:30.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:32.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:32.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:32 np0005603623 nova_compute[226235]: 2026-01-31 09:18:32.897 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.191 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.192 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.192 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.192 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.192 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.612 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.767 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.769 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4128MB free_disk=20.938796997070312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.769 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.770 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.943 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:18:33 np0005603623 nova_compute[226235]: 2026-01-31 09:18:33.944 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:18:34 np0005603623 nova_compute[226235]: 2026-01-31 09:18:34.029 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:18:34 np0005603623 nova_compute[226235]: 2026-01-31 09:18:34.064 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:18:34 np0005603623 nova_compute[226235]: 2026-01-31 09:18:34.065 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:18:34 np0005603623 nova_compute[226235]: 2026-01-31 09:18:34.114 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:18:34 np0005603623 nova_compute[226235]: 2026-01-31 09:18:34.158 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:18:34 np0005603623 nova_compute[226235]: 2026-01-31 09:18:34.222 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:34 np0005603623 nova_compute[226235]: 2026-01-31 09:18:34.224 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:18:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:18:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:34.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:18:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:34.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:18:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2869683668' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:18:34 np0005603623 nova_compute[226235]: 2026-01-31 09:18:34.918 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.694s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:18:34 np0005603623 nova_compute[226235]: 2026-01-31 09:18:34.923 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:18:34 np0005603623 nova_compute[226235]: 2026-01-31 09:18:34.961 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:18:34 np0005603623 nova_compute[226235]: 2026-01-31 09:18:34.963 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:18:34 np0005603623 nova_compute[226235]: 2026-01-31 09:18:34.963 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:18:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2001416669' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:18:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:18:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2001416669' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:18:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:18:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:36.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:18:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:36.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:18:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:18:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:18:36 np0005603623 nova_compute[226235]: 2026-01-31 09:18:36.964 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:36 np0005603623 nova_compute[226235]: 2026-01-31 09:18:36.965 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:18:36 np0005603623 nova_compute[226235]: 2026-01-31 09:18:36.965 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:18:36 np0005603623 nova_compute[226235]: 2026-01-31 09:18:36.983 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:18:36 np0005603623 nova_compute[226235]: 2026-01-31 09:18:36.983 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e431 e431: 3 total, 3 up, 3 in
Jan 31 04:18:37 np0005603623 nova_compute[226235]: 2026-01-31 09:18:37.926 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:38.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:38.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:39 np0005603623 nova_compute[226235]: 2026-01-31 09:18:39.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:39 np0005603623 nova_compute[226235]: 2026-01-31 09:18:39.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:39 np0005603623 nova_compute[226235]: 2026-01-31 09:18:39.223 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:40.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:40.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:41 np0005603623 nova_compute[226235]: 2026-01-31 09:18:41.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e431 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:42 np0005603623 nova_compute[226235]: 2026-01-31 09:18:42.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:42 np0005603623 nova_compute[226235]: 2026-01-31 09:18:42.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:42.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:18:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:42.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:18:42 np0005603623 nova_compute[226235]: 2026-01-31 09:18:42.927 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:44 np0005603623 nova_compute[226235]: 2026-01-31 09:18:44.224 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:44.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:44.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:18:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:18:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e432 e432: 3 total, 3 up, 3 in
Jan 31 04:18:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:46.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:46.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e432 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:47 np0005603623 nova_compute[226235]: 2026-01-31 09:18:47.929 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:47 np0005603623 podman[334423]: 2026-01-31 09:18:47.952095132 +0000 UTC m=+0.045821599 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 31 04:18:47 np0005603623 podman[334424]: 2026-01-31 09:18:47.995158453 +0000 UTC m=+0.087282190 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 31 04:18:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e433 e433: 3 total, 3 up, 3 in
Jan 31 04:18:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:18:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:48.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:18:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:18:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:48.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:18:49 np0005603623 nova_compute[226235]: 2026-01-31 09:18:49.225 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:50.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:50.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:52.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:52.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:52 np0005603623 nova_compute[226235]: 2026-01-31 09:18:52.931 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 e434: 3 total, 3 up, 3 in
Jan 31 04:18:54 np0005603623 nova_compute[226235]: 2026-01-31 09:18:54.227 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:54.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:54.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:56.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:18:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:56.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:18:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:57 np0005603623 nova_compute[226235]: 2026-01-31 09:18:57.934 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:18:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:58.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:18:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:18:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:58.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:59 np0005603623 nova_compute[226235]: 2026-01-31 09:18:59.229 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:00.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:00.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:02.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:02.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:02 np0005603623 nova_compute[226235]: 2026-01-31 09:19:02.968 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:04 np0005603623 nova_compute[226235]: 2026-01-31 09:19:04.230 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:04.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:04.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:19:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:06.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:19:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:06.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:07 np0005603623 nova_compute[226235]: 2026-01-31 09:19:07.969 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:19:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:08.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:19:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:08.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:09 np0005603623 nova_compute[226235]: 2026-01-31 09:19:09.231 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:10.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:10.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:12.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:12.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:12 np0005603623 nova_compute[226235]: 2026-01-31 09:19:12.971 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:14 np0005603623 nova_compute[226235]: 2026-01-31 09:19:14.232 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:14.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:14.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:16.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:16.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:17 np0005603623 nova_compute[226235]: 2026-01-31 09:19:17.972 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:19:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:18.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:19:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:18.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:18 np0005603623 podman[334531]: 2026-01-31 09:19:18.96673092 +0000 UTC m=+0.062131270 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:19:19 np0005603623 podman[334532]: 2026-01-31 09:19:19.036231982 +0000 UTC m=+0.125996065 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:19:19 np0005603623 nova_compute[226235]: 2026-01-31 09:19:19.233 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:20.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:19:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:20.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:19:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:22.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:22.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:22 np0005603623 nova_compute[226235]: 2026-01-31 09:19:22.975 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:24 np0005603623 nova_compute[226235]: 2026-01-31 09:19:24.234 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:24.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:24.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:26.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:26.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:27 np0005603623 nova_compute[226235]: 2026-01-31 09:19:27.976 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:28.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:28.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:29 np0005603623 nova_compute[226235]: 2026-01-31 09:19:29.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:29 np0005603623 nova_compute[226235]: 2026-01-31 09:19:29.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:19:29 np0005603623 nova_compute[226235]: 2026-01-31 09:19:29.236 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:19:30.171 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:19:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:19:30.172 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:19:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:19:30.172 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:19:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:30.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:30.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:32.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:32.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:32 np0005603623 nova_compute[226235]: 2026-01-31 09:19:32.979 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:34 np0005603623 nova_compute[226235]: 2026-01-31 09:19:34.238 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:34.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:19:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:34.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.227 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.228 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.228 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.228 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.229 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:19:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:19:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1817552660' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.638 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.785 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.786 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4135MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.786 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.786 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.989 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:19:35 np0005603623 nova_compute[226235]: 2026-01-31 09:19:35.989 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:19:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:19:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:36.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:19:36 np0005603623 nova_compute[226235]: 2026-01-31 09:19:36.408 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:19:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:19:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:36.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:19:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:19:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2593646254' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:19:36 np0005603623 nova_compute[226235]: 2026-01-31 09:19:36.826 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:19:36 np0005603623 nova_compute[226235]: 2026-01-31 09:19:36.831 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:19:36 np0005603623 nova_compute[226235]: 2026-01-31 09:19:36.862 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:19:36 np0005603623 nova_compute[226235]: 2026-01-31 09:19:36.895 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:19:36 np0005603623 nova_compute[226235]: 2026-01-31 09:19:36.895 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:19:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:37 np0005603623 nova_compute[226235]: 2026-01-31 09:19:37.980 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:38.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:19:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:38.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:19:38 np0005603623 nova_compute[226235]: 2026-01-31 09:19:38.896 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:38 np0005603623 nova_compute[226235]: 2026-01-31 09:19:38.897 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:19:38 np0005603623 nova_compute[226235]: 2026-01-31 09:19:38.897 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:19:38 np0005603623 nova_compute[226235]: 2026-01-31 09:19:38.934 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:19:38 np0005603623 nova_compute[226235]: 2026-01-31 09:19:38.936 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:39 np0005603623 nova_compute[226235]: 2026-01-31 09:19:39.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:39 np0005603623 nova_compute[226235]: 2026-01-31 09:19:39.239 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:40 np0005603623 nova_compute[226235]: 2026-01-31 09:19:40.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:19:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:40.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:19:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:40.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:42.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:42.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:42 np0005603623 nova_compute[226235]: 2026-01-31 09:19:42.985 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:43 np0005603623 nova_compute[226235]: 2026-01-31 09:19:43.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:43 np0005603623 nova_compute[226235]: 2026-01-31 09:19:43.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:44 np0005603623 nova_compute[226235]: 2026-01-31 09:19:44.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:44 np0005603623 nova_compute[226235]: 2026-01-31 09:19:44.241 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:44.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:19:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:44.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:19:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:46.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:46.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:19:47 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:19:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:48 np0005603623 nova_compute[226235]: 2026-01-31 09:19:48.019 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:19:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:19:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:19:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:48.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:19:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:48.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:19:49 np0005603623 nova_compute[226235]: 2026-01-31 09:19:49.242 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:49 np0005603623 podman[334867]: 2026-01-31 09:19:49.986099778 +0000 UTC m=+0.075022045 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:19:50 np0005603623 podman[334868]: 2026-01-31 09:19:50.014276543 +0000 UTC m=+0.099864005 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 04:19:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:50.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:50.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #193. Immutable memtables: 0.
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.602261) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 123] Flushing memtable with next log file: 193
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851190602345, "job": 123, "event": "flush_started", "num_memtables": 1, "num_entries": 2413, "num_deletes": 254, "total_data_size": 5800875, "memory_usage": 5888192, "flush_reason": "Manual Compaction"}
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 123] Level-0 flush table #194: started
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851190648216, "cf_name": "default", "job": 123, "event": "table_file_creation", "file_number": 194, "file_size": 3784825, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 92869, "largest_seqno": 95277, "table_properties": {"data_size": 3774968, "index_size": 6286, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20537, "raw_average_key_size": 20, "raw_value_size": 3755142, "raw_average_value_size": 3777, "num_data_blocks": 274, "num_entries": 994, "num_filter_entries": 994, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850982, "oldest_key_time": 1769850982, "file_creation_time": 1769851190, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 194, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 123] Flush lasted 46020 microseconds, and 9434 cpu microseconds.
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.648290) [db/flush_job.cc:967] [default] [JOB 123] Level-0 flush table #194: 3784825 bytes OK
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.648316) [db/memtable_list.cc:519] [default] Level-0 commit table #194 started
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.650862) [db/memtable_list.cc:722] [default] Level-0 commit table #194: memtable #1 done
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.650881) EVENT_LOG_v1 {"time_micros": 1769851190650876, "job": 123, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.650899) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 123] Try to delete WAL files size 5790234, prev total WAL file size 5790234, number of live WAL files 2.
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000190.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.651823) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038323833' seq:72057594037927935, type:22 .. '7061786F730038353335' seq:0, type:0; will stop at (end)
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 124] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 123 Base level 0, inputs: [194(3696KB)], [192(12MB)]
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851190651875, "job": 124, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [194], "files_L6": [192], "score": -1, "input_data_size": 17118695, "oldest_snapshot_seqno": -1}
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 124] Generated table #195: 11530 keys, 15134478 bytes, temperature: kUnknown
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851190842340, "cf_name": "default", "job": 124, "event": "table_file_creation", "file_number": 195, "file_size": 15134478, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15059623, "index_size": 44918, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28869, "raw_key_size": 303528, "raw_average_key_size": 26, "raw_value_size": 14858135, "raw_average_value_size": 1288, "num_data_blocks": 1712, "num_entries": 11530, "num_filter_entries": 11530, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769851190, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 195, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.842583) [db/compaction/compaction_job.cc:1663] [default] [JOB 124] Compacted 1@0 + 1@6 files to L6 => 15134478 bytes
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.845509) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 89.8 rd, 79.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 12.7 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(8.5) write-amplify(4.0) OK, records in: 12060, records dropped: 530 output_compression: NoCompression
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.845530) EVENT_LOG_v1 {"time_micros": 1769851190845520, "job": 124, "event": "compaction_finished", "compaction_time_micros": 190531, "compaction_time_cpu_micros": 32217, "output_level": 6, "num_output_files": 1, "total_output_size": 15134478, "num_input_records": 12060, "num_output_records": 11530, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000194.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851190845995, "job": 124, "event": "table_file_deletion", "file_number": 194}
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000192.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851190847284, "job": 124, "event": "table_file_deletion", "file_number": 192}
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.651742) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.847340) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.847344) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.847345) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.847346) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:50 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:50.847348) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:52.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:52.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:19:52 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:19:53 np0005603623 nova_compute[226235]: 2026-01-31 09:19:53.021 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:53 np0005603623 nova_compute[226235]: 2026-01-31 09:19:53.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:54 np0005603623 nova_compute[226235]: 2026-01-31 09:19:54.243 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:54.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:54.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:56.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:56.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:58 np0005603623 nova_compute[226235]: 2026-01-31 09:19:58.066 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #196. Immutable memtables: 0.
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.176659) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 125] Flushing memtable with next log file: 196
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851198176721, "job": 125, "event": "flush_started", "num_memtables": 1, "num_entries": 331, "num_deletes": 250, "total_data_size": 284081, "memory_usage": 291296, "flush_reason": "Manual Compaction"}
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 125] Level-0 flush table #197: started
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851198179880, "cf_name": "default", "job": 125, "event": "table_file_creation", "file_number": 197, "file_size": 187260, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 95282, "largest_seqno": 95608, "table_properties": {"data_size": 185088, "index_size": 335, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5833, "raw_average_key_size": 20, "raw_value_size": 180822, "raw_average_value_size": 634, "num_data_blocks": 14, "num_entries": 285, "num_filter_entries": 285, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851191, "oldest_key_time": 1769851191, "file_creation_time": 1769851198, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 197, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 125] Flush lasted 3256 microseconds, and 1049 cpu microseconds.
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.179926) [db/flush_job.cc:967] [default] [JOB 125] Level-0 flush table #197: 187260 bytes OK
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.179941) [db/memtable_list.cc:519] [default] Level-0 commit table #197 started
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.180897) [db/memtable_list.cc:722] [default] Level-0 commit table #197: memtable #1 done
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.180911) EVENT_LOG_v1 {"time_micros": 1769851198180905, "job": 125, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.180926) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 125] Try to delete WAL files size 281766, prev total WAL file size 281766, number of live WAL files 2.
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000193.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.181293) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033323630' seq:72057594037927935, type:22 .. '6D6772737461740033353131' seq:0, type:0; will stop at (end)
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 126] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 125 Base level 0, inputs: [197(182KB)], [195(14MB)]
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851198181325, "job": 126, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [197], "files_L6": [195], "score": -1, "input_data_size": 15321738, "oldest_snapshot_seqno": -1}
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 126] Generated table #198: 11304 keys, 11485897 bytes, temperature: kUnknown
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851198247819, "cf_name": "default", "job": 126, "event": "table_file_creation", "file_number": 198, "file_size": 11485897, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11417329, "index_size": 39207, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28293, "raw_key_size": 299008, "raw_average_key_size": 26, "raw_value_size": 11224473, "raw_average_value_size": 992, "num_data_blocks": 1473, "num_entries": 11304, "num_filter_entries": 11304, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769851198, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 198, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.248041) [db/compaction/compaction_job.cc:1663] [default] [JOB 126] Compacted 1@0 + 1@6 files to L6 => 11485897 bytes
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.249198) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 230.2 rd, 172.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 14.4 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(143.2) write-amplify(61.3) OK, records in: 11815, records dropped: 511 output_compression: NoCompression
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.249252) EVENT_LOG_v1 {"time_micros": 1769851198249230, "job": 126, "event": "compaction_finished", "compaction_time_micros": 66561, "compaction_time_cpu_micros": 21953, "output_level": 6, "num_output_files": 1, "total_output_size": 11485897, "num_input_records": 11815, "num_output_records": 11304, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000197.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851198249504, "job": 126, "event": "table_file_deletion", "file_number": 197}
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000195.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851198251474, "job": 126, "event": "table_file_deletion", "file_number": 195}
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.181221) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.251609) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.251616) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.251618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.251620) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:58 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:19:58.251622) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:58.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:19:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:19:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:58.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:19:59 np0005603623 nova_compute[226235]: 2026-01-31 09:19:59.244 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:00.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:20:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:00.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:20:00 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 04:20:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:02.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:02.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:03 np0005603623 nova_compute[226235]: 2026-01-31 09:20:03.105 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:04 np0005603623 nova_compute[226235]: 2026-01-31 09:20:04.247 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:04.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:04.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:06.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:06.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:08 np0005603623 nova_compute[226235]: 2026-01-31 09:20:08.107 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:20:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:08.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:20:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:08.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:09 np0005603623 nova_compute[226235]: 2026-01-31 09:20:09.283 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:09 np0005603623 systemd-logind[795]: New session 69 of user zuul.
Jan 31 04:20:09 np0005603623 systemd[1]: Started Session 69 of User zuul.
Jan 31 04:20:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:10.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:10.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:12.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:12.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:13 np0005603623 nova_compute[226235]: 2026-01-31 09:20:13.154 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 31 04:20:13 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2491038089' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 31 04:20:14 np0005603623 nova_compute[226235]: 2026-01-31 09:20:14.284 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:14.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:14.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:20:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3856377600' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:20:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:20:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3856377600' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:20:16 np0005603623 ovs-vsctl[335311]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 31 04:20:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:16.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:16.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:17 np0005603623 virtqemud[225858]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 31 04:20:17 np0005603623 virtqemud[225858]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 31 04:20:17 np0005603623 virtqemud[225858]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 31 04:20:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:17 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: cache status {prefix=cache status} (starting...)
Jan 31 04:20:17 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: client ls {prefix=client ls} (starting...)
Jan 31 04:20:17 np0005603623 lvm[335658]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 31 04:20:17 np0005603623 lvm[335658]: VG ceph_vg0 finished
Jan 31 04:20:18 np0005603623 nova_compute[226235]: 2026-01-31 09:20:18.154 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:18 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: damage ls {prefix=damage ls} (starting...)
Jan 31 04:20:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:18.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Jan 31 04:20:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4239498057' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 31 04:20:18 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: dump loads {prefix=dump loads} (starting...)
Jan 31 04:20:18 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 31 04:20:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:18.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:18 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 31 04:20:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Jan 31 04:20:18 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2324727117' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 31 04:20:18 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 31 04:20:18 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 31 04:20:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Jan 31 04:20:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3815120264' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 31 04:20:19 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 31 04:20:19 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 31 04:20:19 np0005603623 nova_compute[226235]: 2026-01-31 09:20:19.324 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:19 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: ops {prefix=ops} (starting...)
Jan 31 04:20:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Jan 31 04:20:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4108123027' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 31 04:20:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Jan 31 04:20:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3102421695' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 31 04:20:19 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 31 04:20:19 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3191906386' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 31 04:20:20 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: session ls {prefix=session ls} (starting...)
Jan 31 04:20:20 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: status {prefix=status} (starting...)
Jan 31 04:20:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:20.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 31 04:20:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2787856787' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 31 04:20:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:20.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Jan 31 04:20:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3786845859' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 31 04:20:20 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 31 04:20:20 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1693825433' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 31 04:20:20 np0005603623 podman[336103]: 2026-01-31 09:20:20.973046839 +0000 UTC m=+0.060199930 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 04:20:20 np0005603623 podman[336116]: 2026-01-31 09:20:20.993976076 +0000 UTC m=+0.079224157 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:20:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 31 04:20:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3752218524' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 31 04:20:21 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Jan 31 04:20:21 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/501393631' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 31 04:20:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Jan 31 04:20:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/804656387' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 31 04:20:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 31 04:20:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4020601169' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 31 04:20:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:20:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:22.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:20:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Jan 31 04:20:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/922251162' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 31 04:20:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:22.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 31 04:20:22 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/153149921' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 31 04:20:23 np0005603623 nova_compute[226235]: 2026-01-31 09:20:23.156 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 31 04:20:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1560162082' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5011561 data_alloc: 218103808 data_used: 10100736
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133edc00 session 0x55e61435eb40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e618454400 session 0x55e612044960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19d6e0000/0x0/0x1bfc00000, data 0x1b610c9/0x1d4e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e618cee000 session 0x55e61256e1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e613ade800 session 0x55e6153e63c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4750679 data_alloc: 218103808 data_used: 7720960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19d6e0000/0x0/0x1bfc00000, data 0x1b610a6/0x1d4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6170a8400 session 0x55e61186dc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4750679 data_alloc: 218103808 data_used: 7720960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133edc00 session 0x55e6117f4960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19d6e0000/0x0/0x1bfc00000, data 0x1b610a6/0x1d4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6178b1800 session 0x55e61170c780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6190b5c00 session 0x55e6153e7c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497631232 unmapped: 78094336 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.504285812s of 14.795516014s, submitted: 44
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19d6e0000/0x0/0x1bfc00000, data 0x1b610a6/0x1d4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133eec00 session 0x55e6112e63c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497639424 unmapped: 78086144 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4743463 data_alloc: 218103808 data_used: 7720960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497639424 unmapped: 78086144 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497639424 unmapped: 78086144 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497639424 unmapped: 78086144 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19d706000/0x0/0x1bfc00000, data 0x1b3d097/0x1d28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497639424 unmapped: 78086144 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e62d15bc00 session 0x55e6115cb860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e62d160400 session 0x55e61216ab40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133edc00 session 0x55e617123680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6178b1800 session 0x55e6143a94a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6190b5c00 session 0x55e61216cf00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 497639424 unmapped: 78086144 heap: 575725568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4743463 data_alloc: 218103808 data_used: 7720960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19d706000/0x0/0x1bfc00000, data 0x1b3d097/0x1d28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513302528 unmapped: 66625536 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e616102400 session 0x55e6112e6780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495476736 unmapped: 84451328 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133edc00 session 0x55e6115caf00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6178b1800 session 0x55e610b0bc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6190b5c00 session 0x55e61133bc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133f2400 session 0x55e61306bc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133eec00 session 0x55e6117f5e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495386624 unmapped: 84541440 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495386624 unmapped: 84541440 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495386624 unmapped: 84541440 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4785022 data_alloc: 218103808 data_used: 5353472
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495386624 unmapped: 84541440 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19cfb0000/0x0/0x1bfc00000, data 0x2293097/0x247e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495386624 unmapped: 84541440 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19cfb0000/0x0/0x1bfc00000, data 0x2293097/0x247e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495386624 unmapped: 84541440 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495386624 unmapped: 84541440 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e610cc8800 session 0x55e6114d6000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19cfb0000/0x0/0x1bfc00000, data 0x2293097/0x247e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495337472 unmapped: 84590592 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e622d0e000 session 0x55e61653b4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4785022 data_alloc: 218103808 data_used: 5353472
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495337472 unmapped: 84590592 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495337472 unmapped: 84590592 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e613adc000 session 0x55e61133a1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495337472 unmapped: 84590592 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6119cb800 session 0x55e6117f4b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.719934464s of 19.208599091s, submitted: 98
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e611824800 session 0x55e61256f2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495337472 unmapped: 84590592 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19cfae000/0x0/0x1bfc00000, data 0x22930ca/0x2480000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495337472 unmapped: 84590592 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4788821 data_alloc: 218103808 data_used: 5353472
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e611824800 session 0x55e61306b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495067136 unmapped: 84860928 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19cfac000/0x0/0x1bfc00000, data 0x229313c/0x2482000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495001600 unmapped: 84926464 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495001600 unmapped: 84926464 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e614329400 session 0x55e6153e6960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495001600 unmapped: 84926464 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19cfac000/0x0/0x1bfc00000, data 0x229313c/0x2482000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495017984 unmapped: 84910080 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4903409 data_alloc: 234881024 data_used: 20803584
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 495058944 unmapped: 84869120 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19cfac000/0x0/0x1bfc00000, data 0x229313c/0x2482000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x207cf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 496148480 unmapped: 83779584 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 496148480 unmapped: 83779584 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.272429466s of 10.070381165s, submitted: 242
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 496156672 unmapped: 83771392 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 496156672 unmapped: 83771392 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4903409 data_alloc: 234881024 data_used: 20803584
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 500457472 unmapped: 79470592 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e615e97800 session 0x55e6131a4b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6170a8800 session 0x55e6114d6000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133f0400 session 0x55e6153e6960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e611824800 session 0x55e61435e3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e618454000 session 0x55e61653a1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6178b0c00 session 0x55e61186cb40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e615e94400 session 0x55e6118e9c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e614333800 session 0x55e61653b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e611824800 session 0x55e61168c000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e614333800 session 0x55e6112e6f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e618348800 session 0x55e61306ab40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 504201216 unmapped: 75726848 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19bb83000/0x0/0x1bfc00000, data 0x32a413c/0x3493000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [0,0,0,2])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 504553472 unmapped: 75374592 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e615e94400 session 0x55e6131a41e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 504569856 unmapped: 75358208 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6178b0c00 session 0x55e614082000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503529472 unmapped: 76398592 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e611824800 session 0x55e61216a780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5113169 data_alloc: 234881024 data_used: 22212608
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e614333800 session 0x55e61216b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503529472 unmapped: 76398592 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503537664 unmapped: 76390400 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19b267000/0x0/0x1bfc00000, data 0x3bc90da/0x3db7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503545856 unmapped: 76382208 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19b267000/0x0/0x1bfc00000, data 0x3bc90da/0x3db7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503545856 unmapped: 76382208 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19b243000/0x0/0x1bfc00000, data 0x3bed0da/0x3ddb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503545856 unmapped: 76382208 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5116069 data_alloc: 234881024 data_used: 22282240
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19b243000/0x0/0x1bfc00000, data 0x3bed0da/0x3ddb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503545856 unmapped: 76382208 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503554048 unmapped: 76374016 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503554048 unmapped: 76374016 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.808023453s of 14.836380959s, submitted: 194
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19b243000/0x0/0x1bfc00000, data 0x3bed0da/0x3ddb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e615e96800 session 0x55e614349c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503414784 unmapped: 76513280 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503414784 unmapped: 76513280 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5116173 data_alloc: 234881024 data_used: 22286336
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503414784 unmapped: 76513280 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503414784 unmapped: 76513280 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503414784 unmapped: 76513280 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19b239000/0x0/0x1bfc00000, data 0x3bf70da/0x3de5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503414784 unmapped: 76513280 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503414784 unmapped: 76513280 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5179373 data_alloc: 234881024 data_used: 29233152
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503414784 unmapped: 76513280 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503414784 unmapped: 76513280 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503414784 unmapped: 76513280 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19b236000/0x0/0x1bfc00000, data 0x3bfa0da/0x3de8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503414784 unmapped: 76513280 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503414784 unmapped: 76513280 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5179273 data_alloc: 234881024 data_used: 29233152
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 503365632 unmapped: 76562432 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.238917351s of 13.271662712s, submitted: 6
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 506626048 unmapped: 73302016 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19a5be000/0x0/0x1bfc00000, data 0x48720da/0x4a60000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507674624 unmapped: 72253440 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507625472 unmapped: 72302592 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507625472 unmapped: 72302592 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5283849 data_alloc: 234881024 data_used: 29474816
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507633664 unmapped: 72294400 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19a5ae000/0x0/0x1bfc00000, data 0x48820da/0x4a70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507633664 unmapped: 72294400 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507633664 unmapped: 72294400 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507633664 unmapped: 72294400 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507633664 unmapped: 72294400 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5283113 data_alloc: 234881024 data_used: 29474816
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507633664 unmapped: 72294400 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507633664 unmapped: 72294400 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.793082237s of 11.057446480s, submitted: 77
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19a5ab000/0x0/0x1bfc00000, data 0x48830da/0x4a71000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507633664 unmapped: 72294400 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507641856 unmapped: 72286208 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133efc00 session 0x55e612723680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507641856 unmapped: 72286208 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5283421 data_alloc: 234881024 data_used: 29474816
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507641856 unmapped: 72286208 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19a5ab000/0x0/0x1bfc00000, data 0x48830da/0x4a71000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507641856 unmapped: 72286208 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507641856 unmapped: 72286208 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507641856 unmapped: 72286208 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507641856 unmapped: 72286208 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5283421 data_alloc: 234881024 data_used: 29474816
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e615e95000 session 0x55e611926960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e611824800 session 0x55e6116a43c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133efc00 session 0x55e611926b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e614333800 session 0x55e610b0a5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507576320 unmapped: 72351744 heap: 579928064 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19a5ab000/0x0/0x1bfc00000, data 0x48830da/0x4a71000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e615e95000 session 0x55e614082960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e618ceec00 session 0x55e6121a83c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e611824800 session 0x55e613084f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507691008 unmapped: 75907072 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133efc00 session 0x55e6114d6960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e614333800 session 0x55e6114d6b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507691008 unmapped: 75907072 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507691008 unmapped: 75907072 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x199b4e000/0x0/0x1bfc00000, data 0x52e014c/0x54d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507699200 unmapped: 75898880 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5364020 data_alloc: 234881024 data_used: 29478912
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507699200 unmapped: 75898880 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507699200 unmapped: 75898880 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507699200 unmapped: 75898880 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507699200 unmapped: 75898880 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x199b4e000/0x0/0x1bfc00000, data 0x52e014c/0x54d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507707392 unmapped: 75890688 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133f1c00 session 0x55e61170cb40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e61cb40c00 session 0x55e6118e8780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e611824800 session 0x55e6143a94a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e618ceec00 session 0x55e6112e63c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5364020 data_alloc: 234881024 data_used: 29478912
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.767076492s of 17.921913147s, submitted: 31
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133efc00 session 0x55e614082b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133f1c00 session 0x55e617123860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e614333800 session 0x55e6131a5860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e611824800 session 0x55e6112e7a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133efc00 session 0x55e617123860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x198fd0000/0x0/0x1bfc00000, data 0x5e5c185/0x604e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507944960 unmapped: 75653120 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507944960 unmapped: 75653120 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507944960 unmapped: 75653120 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x198fd0000/0x0/0x1bfc00000, data 0x5e5c1be/0x604e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507944960 unmapped: 75653120 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507944960 unmapped: 75653120 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5459652 data_alloc: 234881024 data_used: 29507584
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507944960 unmapped: 75653120 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 507944960 unmapped: 75653120 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e61c237800 session 0x55e6143a94a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 508280832 unmapped: 75317248 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 508280832 unmapped: 75317248 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x198fa6000/0x0/0x1bfc00000, data 0x5e861be/0x6078000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 511188992 unmapped: 72409088 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5538299 data_alloc: 251658240 data_used: 36384768
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 511188992 unmapped: 72409088 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.428407669s of 10.753164291s, submitted: 59
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133b5c00 session 0x55e6115cba40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 511221760 unmapped: 72376320 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e613449c00 session 0x55e6115cb680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 511221760 unmapped: 72376320 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x198fa4000/0x0/0x1bfc00000, data 0x5e861be/0x6078000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e611824800 session 0x55e614348d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 511229952 unmapped: 72368128 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e6133b5c00 session 0x55e614082d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 511229952 unmapped: 72368128 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5555079 data_alloc: 251658240 data_used: 37482496
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 511229952 unmapped: 72368128 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513359872 unmapped: 70238208 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513359872 unmapped: 70238208 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x198fa3000/0x0/0x1bfc00000, data 0x5e871f1/0x607b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513359872 unmapped: 70238208 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e614c33000 session 0x55e61186da40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513359872 unmapped: 70238208 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x198f9e000/0x0/0x1bfc00000, data 0x5e8c1f1/0x6080000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5589043 data_alloc: 251658240 data_used: 41521152
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x198f9e000/0x0/0x1bfc00000, data 0x5e8c1f1/0x6080000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.270610809s of 10.001121521s, submitted: 56
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515792896 unmapped: 67805184 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 514801664 unmapped: 68796416 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 ms_handle_reset con 0x55e613c76400 session 0x55e613319e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 514809856 unmapped: 68788224 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 heartbeat osd_stat(store_statfs(0x19a07f000/0x0/0x1bfc00000, data 0x4dac1e1/0x4f9f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 514834432 unmapped: 68763648 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 373 handle_osd_map epochs [373,374], i have 373, src has [1,374]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133eb400 session 0x55e613085860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 514850816 unmapped: 68747264 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5486802 data_alloc: 251658240 data_used: 35627008
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 514850816 unmapped: 68747264 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518799360 unmapped: 64798720 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133ea000 session 0x55e614082000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e622d0e000 session 0x55e61653b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518619136 unmapped: 64978944 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x199142000/0x0/0x1bfc00000, data 0x5ce4e80/0x5edb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e615e97800 session 0x55e61435fe00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518627328 unmapped: 64970752 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613d93400 session 0x55e61133b860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518635520 unmapped: 64962560 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5573530 data_alloc: 251658240 data_used: 36855808
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133edc00 session 0x55e6141fb860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133ea000 session 0x55e6121a92c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613d93400 session 0x55e6141fba40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518651904 unmapped: 64946176 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518660096 unmapped: 64937984 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x199121000/0x0/0x1bfc00000, data 0x5d00037/0x5efd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.567641258s of 11.468025208s, submitted: 151
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 64929792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 64929792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19911e000/0x0/0x1bfc00000, data 0x5d03037/0x5f00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 64929792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5579318 data_alloc: 251658240 data_used: 36855808
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 64929792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 64929792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61c236400 session 0x55e61133ad20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613ae0800 session 0x55e61216a5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523919360 unmapped: 59678720 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61d733000 session 0x55e613084f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61432d400 session 0x55e61256e1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e614328400 session 0x55e61133ba40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523919360 unmapped: 59678720 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133efc00 session 0x55e6153e70e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61c237800 session 0x55e6119270e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133ea000 session 0x55e61216b860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19a262000/0x0/0x1bfc00000, data 0x4a46f92/0x4c40000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519069696 unmapped: 64528384 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5367601 data_alloc: 234881024 data_used: 32301056
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519069696 unmapped: 64528384 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613b6a800 session 0x55e6132e23c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613addc00 session 0x55e6143a81e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519069696 unmapped: 64528384 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61194c400 session 0x55e6143a9860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519069696 unmapped: 64528384 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19a261000/0x0/0x1bfc00000, data 0x4a46fa2/0x4c41000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519069696 unmapped: 64528384 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519069696 unmapped: 64528384 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5370903 data_alloc: 234881024 data_used: 31715328
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.733942986s of 12.977799416s, submitted: 80
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61d732000 session 0x55e6119263c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e616103400 session 0x55e61306bc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519086080 unmapped: 64512000 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618455800 session 0x55e6116a43c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613b85000 session 0x55e612044960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e62d15c000 session 0x55e6153e61e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519184384 unmapped: 64413696 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133f0c00 session 0x55e613085a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61194c000 session 0x55e612045860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519184384 unmapped: 64413696 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x199d69000/0x0/0x1bfc00000, data 0x50ba004/0x52b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519184384 unmapped: 64413696 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133f0c00 session 0x55e61435fc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613b85000 session 0x55e61170cd20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6190b5c00 session 0x55e61435f4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e610cc8800 session 0x55e61653ba40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618455800 session 0x55e6143a85a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519192576 unmapped: 64405504 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5087833 data_alloc: 234881024 data_used: 19951616
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133f0c00 session 0x55e6143a81e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x199d69000/0x0/0x1bfc00000, data 0x50ba004/0x52b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e62d15c000 session 0x55e617122780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e610cc8800 session 0x55e6119263c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 70918144 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a9c00 session 0x55e613084f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 70918144 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19b90c000/0x0/0x1bfc00000, data 0x3518fd1/0x3712000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 70918144 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 70918144 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e611fa0c00 session 0x55e61170d0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e610cc8800 session 0x55e61133a960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 70918144 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5127100 data_alloc: 234881024 data_used: 23834624
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.516397476s of 10.063669205s, submitted: 133
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e611fa0c00 session 0x55e61306a780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e615e97000 session 0x55e614083e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512761856 unmapped: 70836224 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19b376000/0x0/0x1bfc00000, data 0x3aaefd1/0x3ca8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512761856 unmapped: 70836224 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512761856 unmapped: 70836224 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512761856 unmapped: 70836224 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512761856 unmapped: 70836224 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5167766 data_alloc: 234881024 data_used: 23834624
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512761856 unmapped: 70836224 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512761856 unmapped: 70836224 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19b376000/0x0/0x1bfc00000, data 0x3aaefd1/0x3ca8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512761856 unmapped: 70836224 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512761856 unmapped: 70836224 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512761856 unmapped: 70836224 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5171678 data_alloc: 234881024 data_used: 24379392
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19b376000/0x0/0x1bfc00000, data 0x3aaefd1/0x3ca8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e62d160000 session 0x55e6112e6780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512761856 unmapped: 70836224 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618cee000 session 0x55e613318780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512696320 unmapped: 70901760 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.440398216s of 12.474577904s, submitted: 2
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512696320 unmapped: 70901760 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e610cc8800 session 0x55e6117f50e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e611fa0c00 session 0x55e613084780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19b370000/0x0/0x1bfc00000, data 0x3ab4fd1/0x3cae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512999424 unmapped: 70598656 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19b34b000/0x0/0x1bfc00000, data 0x3ad8ff4/0x3cd3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e614332800 session 0x55e6133185a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 512999424 unmapped: 70598656 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6190b5c00 session 0x55e61216ab40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5179085 data_alloc: 234881024 data_used: 24395776
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613e76400 session 0x55e61216b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513007616 unmapped: 70590464 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618454800 session 0x55e611650f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613ade400 session 0x55e61435ed20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513187840 unmapped: 70410240 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61c236400 session 0x55e61168c1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e622d0f800 session 0x55e6130850e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6178ae400 session 0x55e61170c000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613ade400 session 0x55e6118e9e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6178ae400 session 0x55e610dd21e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513212416 unmapped: 70385664 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613b85000 session 0x55e6119270e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6167c7400 session 0x55e61216d680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513212416 unmapped: 70385664 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19a646000/0x0/0x1bfc00000, data 0x47dd004/0x49d8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513212416 unmapped: 70385664 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5349626 data_alloc: 251658240 data_used: 32202752
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513212416 unmapped: 70385664 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513212416 unmapped: 70385664 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513212416 unmapped: 70385664 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19a646000/0x0/0x1bfc00000, data 0x47dd004/0x49d8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513212416 unmapped: 70385664 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.114621162s of 11.316886902s, submitted: 42
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133f0800 session 0x55e6153e63c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613ade400 session 0x55e61216af00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513212416 unmapped: 70385664 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5349758 data_alloc: 251658240 data_used: 32202752
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613b85000 session 0x55e61216ba40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6167c7400 session 0x55e6115caf00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613adf000 session 0x55e6141fb2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 513220608 unmapped: 70377472 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133ebc00 session 0x55e614082960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613ade000 session 0x55e61435e3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516186112 unmapped: 67411968 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19a036000/0x0/0x1bfc00000, data 0x4deb037/0x4fe8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613b85000 session 0x55e61216ab40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516734976 unmapped: 66863104 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6167c7400 session 0x55e6171221e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518324224 unmapped: 65273856 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 518307840 unmapped: 65290240 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5445066 data_alloc: 251658240 data_used: 34373632
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519094272 unmapped: 64503808 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x199a23000/0x0/0x1bfc00000, data 0x53fc06a/0x55fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519725056 unmapped: 63873024 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519725056 unmapped: 63873024 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519725056 unmapped: 63873024 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519733248 unmapped: 63864832 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5540826 data_alloc: 268435456 data_used: 47882240
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519733248 unmapped: 63864832 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x199a20000/0x0/0x1bfc00000, data 0x53ff06a/0x55fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519733248 unmapped: 63864832 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519733248 unmapped: 63864832 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x199a20000/0x0/0x1bfc00000, data 0x53ff06a/0x55fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.134312630s of 14.601552010s, submitted: 154
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 519733248 unmapped: 63864832 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x199a20000/0x0/0x1bfc00000, data 0x53ff06a/0x55fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520396800 unmapped: 63201280 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5562744 data_alloc: 268435456 data_used: 47935488
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 521584640 unmapped: 62013440 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524304384 unmapped: 59293696 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1990d7000/0x0/0x1bfc00000, data 0x5d4806a/0x5f47000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [0,0,0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 522764288 unmapped: 60833792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1990d7000/0x0/0x1bfc00000, data 0x5d4806a/0x5f47000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523894784 unmapped: 59703296 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x199098000/0x0/0x1bfc00000, data 0x5d8006a/0x5f7f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [0,0,0,0,2])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6190b4000 session 0x55e6131a41e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618349000 session 0x55e614348780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613ade000 session 0x55e6132e2d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613b85000 session 0x55e61435fe00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6167c7400 session 0x55e6117f5a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19882d000/0x0/0x1bfc00000, data 0x65f1093/0x67f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [0,0,0,0,0,1,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524034048 unmapped: 59564032 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5704063 data_alloc: 268435456 data_used: 48656384
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524091392 unmapped: 59506688 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524099584 unmapped: 59498496 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613b84c00 session 0x55e6118e9c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133ed800 session 0x55e6133185a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618454800 session 0x55e61435e5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524099584 unmapped: 59498496 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 6.452570438s of 10.153909683s, submitted: 180
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524107776 unmapped: 59490304 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x198813000/0x0/0x1bfc00000, data 0x660c0bc/0x680b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133ed800 session 0x55e6116a45a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520478720 unmapped: 63119360 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5499947 data_alloc: 251658240 data_used: 36974592
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613ade000 session 0x55e614349a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520445952 unmapped: 63152128 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6170a8c00 session 0x55e61170cd20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e614c33000 session 0x55e614349680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520454144 unmapped: 63143936 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618a29400 session 0x55e61256f2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618454400 session 0x55e6118e83c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61ee30c00 session 0x55e61216cb40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520454144 unmapped: 63143936 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520454144 unmapped: 63143936 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61d733800 session 0x55e61435ed20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133ed800 session 0x55e611927a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618454400 session 0x55e613084960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520462336 unmapped: 63135744 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1996e0000/0x0/0x1bfc00000, data 0x573e08d/0x593e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5504050 data_alloc: 251658240 data_used: 36913152
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1996e0000/0x0/0x1bfc00000, data 0x573e08d/0x593e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520462336 unmapped: 63135744 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133f2400 session 0x55e6132e25a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520716288 unmapped: 62881792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e62d15ac00 session 0x55e612044f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520716288 unmapped: 62881792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520716288 unmapped: 62881792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1996de000/0x0/0x1bfc00000, data 0x574008d/0x5940000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520716288 unmapped: 62881792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5555690 data_alloc: 251658240 data_used: 40730624
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.896430016s of 11.582259178s, submitted: 94
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520716288 unmapped: 62881792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1996dc000/0x0/0x1bfc00000, data 0x574108d/0x5941000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520716288 unmapped: 62881792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520716288 unmapped: 62881792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520716288 unmapped: 62881792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520716288 unmapped: 62881792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5556114 data_alloc: 251658240 data_used: 40701952
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1996dc000/0x0/0x1bfc00000, data 0x574108d/0x5941000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520716288 unmapped: 62881792 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 522502144 unmapped: 61095936 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19924a000/0x0/0x1bfc00000, data 0x5bc608d/0x5dc6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e62101f800 session 0x55e6112e6780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e62101e000 session 0x55e6141fa000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523124736 unmapped: 60473344 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133f2400 session 0x55e61133ad20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523141120 unmapped: 60456960 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523141120 unmapped: 60456960 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5366893 data_alloc: 251658240 data_used: 30101504
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523141120 unmapped: 60456960 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523141120 unmapped: 60456960 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1999d9000/0x0/0x1bfc00000, data 0x49cf05a/0x4bcd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523141120 unmapped: 60456960 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.048564911s of 13.485887527s, submitted: 132
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523149312 unmapped: 60448768 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523173888 unmapped: 60424192 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5367641 data_alloc: 251658240 data_used: 30101504
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6125e7c00 session 0x55e61170c000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19a42e000/0x0/0x1bfc00000, data 0x49f105a/0x4bef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20bdf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613c29c00 session 0x55e611339860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61c82fc00 session 0x55e6121a9860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61cb41800 session 0x55e61653b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6125e7c00 session 0x55e611f143c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618348c00 session 0x55e6141faf00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61d732000 session 0x55e6117f45a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61ee31800 session 0x55e617122f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e611fa0800 session 0x55e6118e8780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524247040 unmapped: 59351040 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524247040 unmapped: 59351040 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524247040 unmapped: 59351040 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19991e000/0x0/0x1bfc00000, data 0x50ef0cc/0x52ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20fef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524247040 unmapped: 59351040 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524214272 unmapped: 59383808 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5431373 data_alloc: 251658240 data_used: 30101504
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524214272 unmapped: 59383808 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524214272 unmapped: 59383808 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6178b1000 session 0x55e61216dc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x199918000/0x0/0x1bfc00000, data 0x50f60cc/0x52f6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20fef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6170a8400 session 0x55e6116a5680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524222464 unmapped: 59375616 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133f1c00 session 0x55e61133af00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.974376678s of 10.153779030s, submitted: 53
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524222464 unmapped: 59375616 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61b1ff800 session 0x55e614083c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524222464 unmapped: 59375616 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5434375 data_alloc: 251658240 data_used: 30101504
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524222464 unmapped: 59375616 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1998ee000/0x0/0x1bfc00000, data 0x51200cc/0x5320000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20fef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524222464 unmapped: 59375616 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524230656 unmapped: 59367424 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525295616 unmapped: 58302464 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525295616 unmapped: 58302464 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5486135 data_alloc: 251658240 data_used: 37150720
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525295616 unmapped: 58302464 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525295616 unmapped: 58302464 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1998eb000/0x0/0x1bfc00000, data 0x51230cc/0x5323000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20fef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525295616 unmapped: 58302464 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1998eb000/0x0/0x1bfc00000, data 0x51230cc/0x5323000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20fef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525295616 unmapped: 58302464 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525295616 unmapped: 58302464 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1998eb000/0x0/0x1bfc00000, data 0x51230cc/0x5323000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20fef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5486135 data_alloc: 251658240 data_used: 37150720
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525295616 unmapped: 58302464 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525295616 unmapped: 58302464 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525295616 unmapped: 58302464 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1998eb000/0x0/0x1bfc00000, data 0x51230cc/0x5323000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x20fef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.645236015s of 14.358864784s, submitted: 5
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528580608 unmapped: 55017472 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 55263232 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5590501 data_alloc: 251658240 data_used: 37695488
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618454400 session 0x55e6132e25a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 55263232 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 55263232 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6170a8c00 session 0x55e61196e960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133f0400 session 0x55e613084780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 55263232 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x197d96000/0x0/0x1bfc00000, data 0x5ad10bc/0x5cd0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 55255040 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613ade000 session 0x55e612045860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528359424 unmapped: 55238656 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5573411 data_alloc: 251658240 data_used: 37695488
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618a29400 session 0x55e610b0bc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61d733800 session 0x55e61170d4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528367616 unmapped: 55230464 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613ae0800 session 0x55e6112e6f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618cef000 session 0x55e6127221e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e610e23000 session 0x55e6119270e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525303808 unmapped: 58294272 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525303808 unmapped: 58294272 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x198a02000/0x0/0x1bfc00000, data 0x4d55fc5/0x4f51000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525303808 unmapped: 58294272 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525303808 unmapped: 58294272 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5411916 data_alloc: 251658240 data_used: 30318592
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525303808 unmapped: 58294272 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.102174759s of 13.145416260s, submitted: 234
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618454800 session 0x55e612044000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525303808 unmapped: 58294272 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x198b1c000/0x0/0x1bfc00000, data 0x4d55fd5/0x4f52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525303808 unmapped: 58294272 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525303808 unmapped: 58294272 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x198b1c000/0x0/0x1bfc00000, data 0x4d55fd5/0x4f52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525303808 unmapped: 58294272 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5414822 data_alloc: 251658240 data_used: 30318592
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525312000 unmapped: 58286080 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x198b17000/0x0/0x1bfc00000, data 0x4d5afd5/0x4f57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a8400 session 0x55e61133bc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525312000 unmapped: 58286080 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525312000 unmapped: 58286080 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525312000 unmapped: 58286080 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x198b17000/0x0/0x1bfc00000, data 0x4d5afd5/0x4f57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525320192 unmapped: 58277888 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5415502 data_alloc: 251658240 data_used: 30326784
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525320192 unmapped: 58277888 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525320192 unmapped: 58277888 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525320192 unmapped: 58277888 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525320192 unmapped: 58277888 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.831439018s of 12.860879898s, submitted: 7
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133f0c00 session 0x55e61133be00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525320192 unmapped: 58277888 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5415370 data_alloc: 251658240 data_used: 30326784
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x198b17000/0x0/0x1bfc00000, data 0x4d5afd5/0x4f57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525320192 unmapped: 58277888 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a8400 session 0x55e617123e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525320192 unmapped: 58277888 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525320192 unmapped: 58277888 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6190b5c00 session 0x55e61170c3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e615e94000 session 0x55e6140823c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e62d160c00 session 0x55e61256f2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e614332400 session 0x55e61216be00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525443072 unmapped: 58155008 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a8400 session 0x55e6143490e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x197e99000/0x0/0x1bfc00000, data 0x59d8037/0x5bd5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523616256 unmapped: 59981824 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5326509 data_alloc: 234881024 data_used: 22740992
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523616256 unmapped: 59981824 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523616256 unmapped: 59981824 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 523616256 unmapped: 59981824 heap: 583598080 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613c76400 session 0x55e6143a8960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618348c00 session 0x55e61216d0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133ee400 session 0x55e61216cb40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a8400 session 0x55e6153e6000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x198510000/0x0/0x1bfc00000, data 0x48e3fc5/0x4adf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 533356544 unmapped: 51298304 heap: 584654848 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.306050301s of 10.110964775s, submitted: 101
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133ee400 session 0x55e6143494a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524918784 unmapped: 59736064 heap: 584654848 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e613c76400 session 0x55e613084f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e614332400 session 0x55e611f14b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618348c00 session 0x55e6153e6960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5406205 data_alloc: 234881024 data_used: 22740992
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a8400 session 0x55e617123680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a9c00 session 0x55e61216d2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x1986b7000/0x0/0x1bfc00000, data 0x51bb027/0x53b7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524951552 unmapped: 59703296 heap: 584654848 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618cef800 session 0x55e613318780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133f3800 session 0x55e6153e63c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525910016 unmapped: 58744832 heap: 584654848 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133f0400 session 0x55e61170cf00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a8400 session 0x55e611f150e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a9c00 session 0x55e6141fa3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618cef800 session 0x55e6143a9860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e610e23000 session 0x55e6119265a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133f3800 session 0x55e6120454a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e610e23000 session 0x55e6140832c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 526999552 unmapped: 57655296 heap: 584654848 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a8400 session 0x55e614349c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a9c00 session 0x55e6117f4000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618cef800 session 0x55e61133a780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133efc00 session 0x55e6153e6780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a8400 session 0x55e61216a5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e622d0f800 session 0x55e6171223c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a9c00 session 0x55e617122d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133efc00 session 0x55e6171232c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 526057472 unmapped: 58597376 heap: 584654848 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 527302656 unmapped: 57352192 heap: 584654848 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e62d163c00 session 0x55e612044f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5568111 data_alloc: 251658240 data_used: 35209216
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a8400 session 0x55e61133b860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x197ec9000/0x0/0x1bfc00000, data 0x59a4298/0x5ba3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e62d15a800 session 0x55e6141fa960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 527343616 unmapped: 57311232 heap: 584654848 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61c82fc00 session 0x55e6115caf00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6190b4c00 session 0x55e61435fa40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6178afc00 session 0x55e61256f2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a8400 session 0x55e612723680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528039936 unmapped: 61939712 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61c237800 session 0x55e614349e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19775b000/0x0/0x1bfc00000, data 0x611604a/0x6313000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 527097856 unmapped: 62881792 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61c82fc00 session 0x55e61653b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529211392 unmapped: 60768256 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.421557426s of 10.240062714s, submitted: 203
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529211392 unmapped: 60768256 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5701665 data_alloc: 251658240 data_used: 44101632
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529276928 unmapped: 60702720 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e62d15a800 session 0x55e6141fb4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61c236000 session 0x55e6121a8780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529285120 unmapped: 60694528 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529285120 unmapped: 60694528 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61c237800 session 0x55e61216dc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x19775b000/0x0/0x1bfc00000, data 0x6116fd8/0x6312000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532062208 unmapped: 57917440 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61c82fc00 session 0x55e613084780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 537305088 unmapped: 52674560 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5818934 data_alloc: 268435456 data_used: 51105792
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e62d15a800 session 0x55e612045a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e618454800 session 0x55e6118e9c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #58. Immutable memtables: 14.
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 538959872 unmapped: 51019776 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61c236000 session 0x55e6153e7860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6133a8400 session 0x55e614082780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542384128 unmapped: 47595520 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543383552 unmapped: 46596096 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e61c236000 session 0x55e61653b680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 heartbeat osd_stat(store_statfs(0x195e58000/0x0/0x1bfc00000, data 0x6368fd9/0x6564000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547536896 unmapped: 42442752 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549650432 unmapped: 40329216 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5960709 data_alloc: 268435456 data_used: 55222272
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.738165855s of 11.317887306s, submitted: 308
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6190b4c00 session 0x55e61256fa40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549756928 unmapped: 40222720 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e62d161c00 session 0x55e61170dc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 ms_handle_reset con 0x55e6178b1000 session 0x55e613084f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543719424 unmapped: 46260224 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 374 handle_osd_map epochs [374,375], i have 374, src has [1,375]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 375 ms_handle_reset con 0x55e6178b1000 session 0x55e6117f52c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543727616 unmapped: 46252032 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 375 heartbeat osd_stat(store_statfs(0x196c43000/0x0/0x1bfc00000, data 0x5a91ef2/0x5c8a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543727616 unmapped: 46252032 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 375 ms_handle_reset con 0x55e6133a8400 session 0x55e6117f4960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 375 handle_osd_map epochs [375,376], i have 375, src has [1,376]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 heartbeat osd_stat(store_statfs(0x196c3e000/0x0/0x1bfc00000, data 0x5a93b4b/0x5c8d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 ms_handle_reset con 0x55e6190b4c00 session 0x55e6141faf00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 ms_handle_reset con 0x55e618cef800 session 0x55e61170c3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 ms_handle_reset con 0x55e610e23000 session 0x55e6141fba40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543752192 unmapped: 46227456 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5672700 data_alloc: 251658240 data_used: 44556288
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 ms_handle_reset con 0x55e610e23000 session 0x55e61256e780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539131904 unmapped: 50847744 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 ms_handle_reset con 0x55e6190b4800 session 0x55e6115cb860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 ms_handle_reset con 0x55e616103c00 session 0x55e61435e1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541220864 unmapped: 48758784 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 ms_handle_reset con 0x55e613ade400 session 0x55e61653a3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 ms_handle_reset con 0x55e613adf000 session 0x55e613079680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543498240 unmapped: 46481408 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 ms_handle_reset con 0x55e610e23000 session 0x55e6171232c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543522816 unmapped: 46456832 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 heartbeat osd_stat(store_statfs(0x196ebd000/0x0/0x1bfc00000, data 0x580f753/0x5a06000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 heartbeat osd_stat(store_statfs(0x196ebd000/0x0/0x1bfc00000, data 0x580f753/0x5a06000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543522816 unmapped: 46456832 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5579748 data_alloc: 251658240 data_used: 33116160
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543522816 unmapped: 46456832 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543522816 unmapped: 46456832 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 ms_handle_reset con 0x55e61c82c800 session 0x55e61168c000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.023643494s of 11.694922447s, submitted: 329
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 ms_handle_reset con 0x55e6178b1800 session 0x55e61186d860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543367168 unmapped: 46612480 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 ms_handle_reset con 0x55e6133eb000 session 0x55e61216a5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543367168 unmapped: 46612480 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 heartbeat osd_stat(store_statfs(0x196ecc000/0x0/0x1bfc00000, data 0x580c743/0x5a02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543367168 unmapped: 46612480 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5565567 data_alloc: 251658240 data_used: 33009664
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 376 handle_osd_map epochs [376,377], i have 376, src has [1,377]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544423936 unmapped: 45555712 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544423936 unmapped: 45555712 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 ms_handle_reset con 0x55e613ae0800 session 0x55e61133be00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 ms_handle_reset con 0x55e613adc000 session 0x55e6112e7a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542744576 unmapped: 47235072 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 ms_handle_reset con 0x55e615e97000 session 0x55e61306bc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 ms_handle_reset con 0x55e62d160000 session 0x55e61133b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 ms_handle_reset con 0x55e610e23000 session 0x55e6153e6f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542752768 unmapped: 47226880 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 heartbeat osd_stat(store_statfs(0x198a66000/0x0/0x1bfc00000, data 0x3c72210/0x3e67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542752768 unmapped: 47226880 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5268542 data_alloc: 234881024 data_used: 21991424
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 ms_handle_reset con 0x55e6133eb000 session 0x55e6143485a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542752768 unmapped: 47226880 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 heartbeat osd_stat(store_statfs(0x198aae000/0x0/0x1bfc00000, data 0x3c2a1ed/0x3e1e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542760960 unmapped: 47218688 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542760960 unmapped: 47218688 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 heartbeat osd_stat(store_statfs(0x198aae000/0x0/0x1bfc00000, data 0x3c2a1ed/0x3e1e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542760960 unmapped: 47218688 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542760960 unmapped: 47218688 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5268542 data_alloc: 234881024 data_used: 21991424
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 ms_handle_reset con 0x55e62d160400 session 0x55e614349860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542760960 unmapped: 47218688 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.013575554s of 14.527601242s, submitted: 71
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 heartbeat osd_stat(store_statfs(0x198aae000/0x0/0x1bfc00000, data 0x3c2a1ed/0x3e1e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 534700032 unmapped: 55279616 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 ms_handle_reset con 0x55e611824400 session 0x55e6112e72c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 534700032 unmapped: 55279616 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 ms_handle_reset con 0x55e613444800 session 0x55e6119263c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 534700032 unmapped: 55279616 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 heartbeat osd_stat(store_statfs(0x19947f000/0x0/0x1bfc00000, data 0x325b1ed/0x344f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 ms_handle_reset con 0x55e6178b1000 session 0x55e6153e6780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529227776 unmapped: 60751872 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5046868 data_alloc: 234881024 data_used: 11862016
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 heartbeat osd_stat(store_statfs(0x19947f000/0x0/0x1bfc00000, data 0x325b1ed/0x344f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 ms_handle_reset con 0x55e62d160000 session 0x55e6121a81e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529227776 unmapped: 60751872 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 heartbeat osd_stat(store_statfs(0x199b98000/0x0/0x1bfc00000, data 0x28c218b/0x2ab5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520323072 unmapped: 69656576 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 ms_handle_reset con 0x55e611824400 session 0x55e61170c5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520323072 unmapped: 69656576 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520323072 unmapped: 69656576 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 heartbeat osd_stat(store_statfs(0x19a8c3000/0x0/0x1bfc00000, data 0x1b9817b/0x1d8a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520323072 unmapped: 69656576 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4893152 data_alloc: 218103808 data_used: 6811648
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 520323072 unmapped: 69656576 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 377 handle_osd_map epochs [377,378], i have 377, src has [1,378]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.845713615s of 10.014044762s, submitted: 69
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 378 ms_handle_reset con 0x55e61432d400 session 0x55e61170cb40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 514924544 unmapped: 75055104 heap: 589979648 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 378 ms_handle_reset con 0x55e61c82f800 session 0x55e61306ab40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515686400 unmapped: 77447168 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515686400 unmapped: 77447168 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 378 heartbeat osd_stat(store_statfs(0x19a985000/0x0/0x1bfc00000, data 0x1d55de2/0x1f47000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515686400 unmapped: 77447168 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4910322 data_alloc: 218103808 data_used: 1376256
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515686400 unmapped: 77447168 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515686400 unmapped: 77447168 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515686400 unmapped: 77447168 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 378 ms_handle_reset con 0x55e62d15a800 session 0x55e6140823c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515686400 unmapped: 77447168 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 378 ms_handle_reset con 0x55e6190b5c00 session 0x55e6117f45a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 378 heartbeat osd_stat(store_statfs(0x19a985000/0x0/0x1bfc00000, data 0x1d55de2/0x1f47000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515686400 unmapped: 77447168 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 378 handle_osd_map epochs [378,379], i have 378, src has [1,379]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4911168 data_alloc: 218103808 data_used: 1376256
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e614332800 session 0x55e61435f860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e61b1fe800 session 0x55e61133a1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515547136 unmapped: 77586432 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515547136 unmapped: 77586432 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515563520 unmapped: 77570048 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515563520 unmapped: 77570048 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515563520 unmapped: 77570048 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4911328 data_alloc: 218103808 data_used: 1380352
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 heartbeat osd_stat(store_statfs(0x19a983000/0x0/0x1bfc00000, data 0x1d57921/0x1f4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515563520 unmapped: 77570048 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 515244032 unmapped: 77889536 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516759552 unmapped: 76374016 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516759552 unmapped: 76374016 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516759552 unmapped: 76374016 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4974368 data_alloc: 218103808 data_used: 10158080
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 heartbeat osd_stat(store_statfs(0x19a983000/0x0/0x1bfc00000, data 0x1d57921/0x1f4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516759552 unmapped: 76374016 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.586288452s of 19.746097565s, submitted: 48
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 heartbeat osd_stat(store_statfs(0x19a983000/0x0/0x1bfc00000, data 0x1d57921/0x1f4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e6167c7c00 session 0x55e6112e65a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516759552 unmapped: 76374016 heap: 593133568 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 87293952 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e613e77400 session 0x55e6117f52c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e613d93400 session 0x55e61196f2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e613e77400 session 0x55e6143481e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e614332800 session 0x55e6153e74a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e6167c7c00 session 0x55e6121a92c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 87293952 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 87293952 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5075122 data_alloc: 218103808 data_used: 10158080
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 87293952 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 87293952 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 heartbeat osd_stat(store_statfs(0x199cd8000/0x0/0x1bfc00000, data 0x2a02983/0x2bf6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 87293952 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 87293952 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 87293952 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5075122 data_alloc: 218103808 data_used: 10158080
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 87293952 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e61d733400 session 0x55e613084780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 heartbeat osd_stat(store_statfs(0x199cd8000/0x0/0x1bfc00000, data 0x2a02983/0x2bf6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 87293952 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e618a28800 session 0x55e61216dc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e6167c7400 session 0x55e6121a94a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 87293952 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e613e77400 session 0x55e613318960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.665700912s of 11.822876930s, submitted: 42
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e614332800 session 0x55e61186da40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 517062656 unmapped: 87146496 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 517062656 unmapped: 87146496 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5079007 data_alloc: 218103808 data_used: 10158080
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 517070848 unmapped: 87138304 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 heartbeat osd_stat(store_statfs(0x199cb3000/0x0/0x1bfc00000, data 0x2a269a6/0x2c1b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 521478144 unmapped: 82731008 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 521478144 unmapped: 82731008 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 521478144 unmapped: 82731008 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 521478144 unmapped: 82731008 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5172127 data_alloc: 234881024 data_used: 22810624
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 521478144 unmapped: 82731008 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 heartbeat osd_stat(store_statfs(0x199cb3000/0x0/0x1bfc00000, data 0x2a269a6/0x2c1b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 521478144 unmapped: 82731008 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 521478144 unmapped: 82731008 heap: 604209152 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.596400261s of 10.035444260s, submitted: 8
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e61c236400 session 0x55e6132e3a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e61c82c800 session 0x55e61653b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 521494528 unmapped: 86392832 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e613e77400 session 0x55e6153e63c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 521494528 unmapped: 86392832 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5099137 data_alloc: 234881024 data_used: 14577664
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524722176 unmapped: 83165184 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525434880 unmapped: 82452480 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 heartbeat osd_stat(store_statfs(0x198eb9000/0x0/0x1bfc00000, data 0x38189a6/0x3a0d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 525467648 unmapped: 82419712 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 heartbeat osd_stat(store_statfs(0x198eb9000/0x0/0x1bfc00000, data 0x38189a6/0x3a0d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524804096 unmapped: 83083264 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524812288 unmapped: 83075072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5233471 data_alloc: 234881024 data_used: 16523264
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524812288 unmapped: 83075072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524812288 unmapped: 83075072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 heartbeat osd_stat(store_statfs(0x198ebf000/0x0/0x1bfc00000, data 0x381a9a6/0x3a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524812288 unmapped: 83075072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524812288 unmapped: 83075072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.190343857s of 11.043982506s, submitted: 118
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 ms_handle_reset con 0x55e611824c00 session 0x55e6127234a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 379 handle_osd_map epochs [379,380], i have 379, src has [1,380]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524820480 unmapped: 83066880 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e6133ee400 session 0x55e6131a4d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5269938 data_alloc: 234881024 data_used: 16547840
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 524820480 unmapped: 83066880 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 526016512 unmapped: 81870848 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 heartbeat osd_stat(store_statfs(0x198a4a000/0x0/0x1bfc00000, data 0x3c8c622/0x3e83000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 526016512 unmapped: 81870848 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 526016512 unmapped: 81870848 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 526016512 unmapped: 81870848 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5325154 data_alloc: 234881024 data_used: 24285184
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 526016512 unmapped: 81870848 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 526016512 unmapped: 81870848 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e614c33800 session 0x55e61196f0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e61ee31400 session 0x55e6121a8b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e611824c00 session 0x55e6130785a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 526016512 unmapped: 81870848 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 heartbeat osd_stat(store_statfs(0x198a4a000/0x0/0x1bfc00000, data 0x3c8c622/0x3e83000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e6133ee400 session 0x55e61256f0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 526016512 unmapped: 81870848 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e613e77400 session 0x55e61170c3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529489920 unmapped: 78397440 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5336514 data_alloc: 251658240 data_used: 28938240
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529498112 unmapped: 78389248 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e618a28c00 session 0x55e6171223c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.951934814s of 12.034827232s, submitted: 9
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529776640 unmapped: 78110720 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 heartbeat osd_stat(store_statfs(0x197ff6000/0x0/0x1bfc00000, data 0x46e0632/0x48d8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529776640 unmapped: 78110720 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e6133f1000 session 0x55e61133ba40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e611824c00 session 0x55e61435fc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e6133ee400 session 0x55e611f150e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529776640 unmapped: 78110720 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529776640 unmapped: 78110720 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5433360 data_alloc: 251658240 data_used: 28979200
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e613e77400 session 0x55e61133a780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e61cb41000 session 0x55e611927680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e613448400 session 0x55e613078780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529907712 unmapped: 77979648 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 heartbeat osd_stat(store_statfs(0x197f5c000/0x0/0x1bfc00000, data 0x4776632/0x496e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e611824c00 session 0x55e6112e63c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530956288 unmapped: 76931072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 heartbeat osd_stat(store_statfs(0x197f39000/0x0/0x1bfc00000, data 0x4785632/0x497d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e613e77400 session 0x55e61435ef00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e61cb41000 session 0x55e6140821e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532406272 unmapped: 75481088 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532406272 unmapped: 75481088 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532406272 unmapped: 75481088 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5506737 data_alloc: 251658240 data_used: 29069312
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e6133ee400 session 0x55e61256f860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532422656 unmapped: 75464704 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e61432dc00 session 0x55e6143492c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e611824c00 session 0x55e614349860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 heartbeat osd_stat(store_statfs(0x19774c000/0x0/0x1bfc00000, data 0x4f876c7/0x5182000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532758528 unmapped: 75128832 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e6133ee400 session 0x55e6143485a0
Jan 31 04:20:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 31 04:20:23 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3491807509' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532758528 unmapped: 75128832 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e613e77400 session 0x55e6119263c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.862941742s of 12.237458229s, submitted: 106
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532914176 unmapped: 74973184 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e61cb41000 session 0x55e61170c5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532914176 unmapped: 74973184 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5573136 data_alloc: 251658240 data_used: 34869248
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e617e22400 session 0x55e6131a4d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532914176 unmapped: 74973184 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 535044096 unmapped: 72843264 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 heartbeat osd_stat(store_statfs(0x197726000/0x0/0x1bfc00000, data 0x4fab739/0x51a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 heartbeat osd_stat(store_statfs(0x197316000/0x0/0x1bfc00000, data 0x4fab739/0x51a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 536018944 unmapped: 71868416 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 536018944 unmapped: 71868416 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 ms_handle_reset con 0x55e61b382400 session 0x55e617123e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 536018944 unmapped: 71868416 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5627516 data_alloc: 251658240 data_used: 41881600
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 536018944 unmapped: 71868416 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539877376 unmapped: 68009984 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 380 handle_osd_map epochs [380,381], i have 380, src has [1,381]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 381 ms_handle_reset con 0x55e611824400 session 0x55e611f143c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 381 ms_handle_reset con 0x55e613add400 session 0x55e613318960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 381 ms_handle_reset con 0x55e616d5d800 session 0x55e6130850e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544088064 unmapped: 63799296 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.560070038s of 10.009545326s, submitted: 125
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 381 handle_osd_map epochs [381,382], i have 381, src has [1,382]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 382 ms_handle_reset con 0x55e617e22800 session 0x55e613318780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 382 heartbeat osd_stat(store_statfs(0x195d49000/0x0/0x1bfc00000, data 0x657503f/0x6774000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544112640 unmapped: 63774720 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 382 handle_osd_map epochs [382,383], i have 382, src has [1,383]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 383 ms_handle_reset con 0x55e611824400 session 0x55e61435ed20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544112640 unmapped: 63774720 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5828690 data_alloc: 251658240 data_used: 43208704
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544112640 unmapped: 63774720 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 383 handle_osd_map epochs [383,384], i have 383, src has [1,384]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 384 ms_handle_reset con 0x55e62101fc00 session 0x55e6121a8000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 384 ms_handle_reset con 0x55e615e95800 session 0x55e61168c5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541155328 unmapped: 66732032 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 384 ms_handle_reset con 0x55e6133ef800 session 0x55e614082f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 384 ms_handle_reset con 0x55e62d164c00 session 0x55e6112e7860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543039488 unmapped: 64847872 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 384 heartbeat osd_stat(store_statfs(0x1953ef000/0x0/0x1bfc00000, data 0x6ecc929/0x70ce000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543170560 unmapped: 64716800 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 384 handle_osd_map epochs [384,385], i have 384, src has [1,385]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544227328 unmapped: 63660032 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5917134 data_alloc: 251658240 data_used: 44285952
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544227328 unmapped: 63660032 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544227328 unmapped: 63660032 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 heartbeat osd_stat(store_statfs(0x19534c000/0x0/0x1bfc00000, data 0x6f6e468/0x7171000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544227328 unmapped: 63660032 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 heartbeat osd_stat(store_statfs(0x19534c000/0x0/0x1bfc00000, data 0x6f6e468/0x7171000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544227328 unmapped: 63660032 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.605742455s of 11.151481628s, submitted: 148
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544235520 unmapped: 63651840 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5918018 data_alloc: 251658240 data_used: 44285952
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544243712 unmapped: 63643648 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e618a29400 session 0x55e6118e9c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e611824c00 session 0x55e6153e7c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e6133ee400 session 0x55e6171225a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539148288 unmapped: 68739072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e62d164400 session 0x55e6131a41e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539148288 unmapped: 68739072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 heartbeat osd_stat(store_statfs(0x196d8b000/0x0/0x1bfc00000, data 0x535a3d3/0x555a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539148288 unmapped: 68739072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539148288 unmapped: 68739072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5573209 data_alloc: 234881024 data_used: 26791936
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539148288 unmapped: 68739072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539148288 unmapped: 68739072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 heartbeat osd_stat(store_statfs(0x196d8b000/0x0/0x1bfc00000, data 0x535a3d3/0x555a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539148288 unmapped: 68739072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539148288 unmapped: 68739072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539148288 unmapped: 68739072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5573209 data_alloc: 234881024 data_used: 26791936
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 heartbeat osd_stat(store_statfs(0x196d8b000/0x0/0x1bfc00000, data 0x535a3d3/0x555a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539148288 unmapped: 68739072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539148288 unmapped: 68739072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 heartbeat osd_stat(store_statfs(0x196d8b000/0x0/0x1bfc00000, data 0x535a3d3/0x555a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539148288 unmapped: 68739072 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.872932434s of 14.068326950s, submitted: 62
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e6167c7c00 session 0x55e617123c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e61d733400 session 0x55e6141fb4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 heartbeat osd_stat(store_statfs(0x196d8b000/0x0/0x1bfc00000, data 0x535a3d3/0x555a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532250624 unmapped: 75636736 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e611824c00 session 0x55e61196e960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532250624 unmapped: 75636736 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5260719 data_alloc: 234881024 data_used: 15089664
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532250624 unmapped: 75636736 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532250624 unmapped: 75636736 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 heartbeat osd_stat(store_statfs(0x198b7c000/0x0/0x1bfc00000, data 0x374234e/0x3940000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532250624 unmapped: 75636736 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532250624 unmapped: 75636736 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e61b1fe800 session 0x55e6121a81e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e62101f400 session 0x55e6131a5e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e6178b1c00 session 0x55e6132e2f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e611824c00 session 0x55e612723680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e618454c00 session 0x55e6153e61e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e6178b1c00 session 0x55e6112e6d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532258816 unmapped: 75628544 heap: 607887360 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5262013 data_alloc: 234881024 data_used: 15089664
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e61b1fe800 session 0x55e6115caf00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 heartbeat osd_stat(store_statfs(0x198b7e000/0x0/0x1bfc00000, data 0x374234e/0x3940000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e61d733400 session 0x55e611926b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542384128 unmapped: 75612160 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 ms_handle_reset con 0x55e611824c00 session 0x55e6153e74a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542400512 unmapped: 75595776 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 385 handle_osd_map epochs [385,386], i have 385, src has [1,386]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 386 ms_handle_reset con 0x55e6178b1c00 session 0x55e6117f50e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 386 ms_handle_reset con 0x55e618454c00 session 0x55e61653bc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543465472 unmapped: 74530816 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543465472 unmapped: 74530816 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 386 handle_osd_map epochs [386,387], i have 386, src has [1,387]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.488714218s of 11.146036148s, submitted: 98
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 387 ms_handle_reset con 0x55e62101f400 session 0x55e610b0b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5409996 data_alloc: 234881024 data_used: 24891392
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 387 heartbeat osd_stat(store_statfs(0x198283000/0x0/0x1bfc00000, data 0x4039cc4/0x423a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 387 ms_handle_reset con 0x55e61432f400 session 0x55e6117f5e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 387 ms_handle_reset con 0x55e61dc78800 session 0x55e61256e960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 387 ms_handle_reset con 0x55e614328000 session 0x55e61186cb40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 387 ms_handle_reset con 0x55e611824c00 session 0x55e613318780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 387 handle_osd_map epochs [387,388], i have 387, src has [1,388]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5413610 data_alloc: 234881024 data_used: 24907776
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x198280000/0x0/0x1bfc00000, data 0x403b83b/0x423d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.952500343s of 10.057047844s, submitted: 43
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5439046 data_alloc: 234881024 data_used: 27287552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x198281000/0x0/0x1bfc00000, data 0x403b83b/0x423d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539820032 unmapped: 78176256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539688960 unmapped: 78307328 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5446620 data_alloc: 234881024 data_used: 28024832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539688960 unmapped: 78307328 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539688960 unmapped: 78307328 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539729920 unmapped: 78266368 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x19827c000/0x0/0x1bfc00000, data 0x404083b/0x4242000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539729920 unmapped: 78266368 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539729920 unmapped: 78266368 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5461180 data_alloc: 251658240 data_used: 29663232
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.942104340s of 10.985748291s, submitted: 26
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539729920 unmapped: 78266368 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61b1fe800 session 0x55e6112e6f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539729920 unmapped: 78266368 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61b382400 session 0x55e61435f680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528015360 unmapped: 89980928 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61b1ff800 session 0x55e61216da40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528015360 unmapped: 89980928 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x1995f7000/0x0/0x1bfc00000, data 0x2cc484b/0x2ec7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528015360 unmapped: 89980928 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5217180 data_alloc: 234881024 data_used: 16388096
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528015360 unmapped: 89980928 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528015360 unmapped: 89980928 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528015360 unmapped: 89980928 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e614333c00 session 0x55e6121a85a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61c82f800 session 0x55e61216a780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e615e95000 session 0x55e61216d2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61dc78000 session 0x55e6112e65a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e614333c00 session 0x55e610dd34a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e615e95000 session 0x55e613319e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61b1ff800 session 0x55e617123a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61c82f800 session 0x55e61216a3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e614328c00 session 0x55e613319e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 89808896 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x198dc0000/0x0/0x1bfc00000, data 0x34f98bd/0x36fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 89808896 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5285964 data_alloc: 234881024 data_used: 16388096
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 89808896 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 89808896 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x198dc0000/0x0/0x1bfc00000, data 0x34f98bd/0x36fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 89808896 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 89808896 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 89808896 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5285964 data_alloc: 234881024 data_used: 16388096
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x198dc0000/0x0/0x1bfc00000, data 0x34f98bd/0x36fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x198dc0000/0x0/0x1bfc00000, data 0x34f98bd/0x36fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 89808896 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x198dc0000/0x0/0x1bfc00000, data 0x34f98bd/0x36fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x198dc0000/0x0/0x1bfc00000, data 0x34f98bd/0x36fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e613c77c00 session 0x55e61216a780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 89808896 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e613e76c00 session 0x55e6121a85a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 89808896 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e614328800 session 0x55e61435f680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 89808896 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.675891876s of 18.822942734s, submitted: 76
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61432d000 session 0x55e6112e6f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61b1ff400 session 0x55e610b0b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 89808896 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5292308 data_alloc: 234881024 data_used: 17338368
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x198d9c000/0x0/0x1bfc00000, data 0x351d8bd/0x3722000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 528228352 unmapped: 89767936 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e613c77c00 session 0x55e61306ab40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529367040 unmapped: 88629248 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e613e76c00 session 0x55e613085860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x198d9c000/0x0/0x1bfc00000, data 0x351d8bd/0x3722000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61432d000 session 0x55e6140821e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 529367040 unmapped: 88629248 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e614328800 session 0x55e617122f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61b1ff800 session 0x55e61256f860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530587648 unmapped: 87408640 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530587648 unmapped: 87408640 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5361572 data_alloc: 234881024 data_used: 25075712
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530587648 unmapped: 87408640 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530587648 unmapped: 87408640 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x198d74000/0x0/0x1bfc00000, data 0x35428f0/0x3749000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e616102000 session 0x55e612044960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530587648 unmapped: 87408640 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530587648 unmapped: 87408640 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5362564 data_alloc: 234881024 data_used: 25333760
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530587648 unmapped: 87408640 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530587648 unmapped: 87408640 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.021357536s of 12.116014481s, submitted: 19
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 533176320 unmapped: 84819968 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 533176320 unmapped: 84819968 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x19879c000/0x0/0x1bfc00000, data 0x3b0d8f0/0x3d14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e613c77c00 session 0x55e61306a780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e613e76c00 session 0x55e61170d860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61dc79c00 session 0x55e6116a5680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532594688 unmapped: 85401600 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61dc78c00 session 0x55e6116a5a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e6119cac00 session 0x55e61216cd20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5421563 data_alloc: 234881024 data_used: 25587712
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532652032 unmapped: 85344256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e616102000 session 0x55e6118e90e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532856832 unmapped: 85139456 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e61dc78c00 session 0x55e6118e94a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532856832 unmapped: 85139456 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x198154000/0x0/0x1bfc00000, data 0x415b8bd/0x4360000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532856832 unmapped: 85139456 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532856832 unmapped: 85139456 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e618454c00 session 0x55e613084f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5475265 data_alloc: 234881024 data_used: 25587712
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532856832 unmapped: 85139456 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532856832 unmapped: 85139456 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x19815e000/0x0/0x1bfc00000, data 0x415b8bd/0x4360000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e62d165000 session 0x55e6118e8780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532856832 unmapped: 85139456 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.469054222s of 11.163727760s, submitted: 119
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e616102000 session 0x55e61133a5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e618454c00 session 0x55e61133ab40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e6119cac00 session 0x55e6118e81e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532881408 unmapped: 85114880 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e6178b1c00 session 0x55e611f143c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e62101f400 session 0x55e6131a4d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 531628032 unmapped: 86368256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5226684 data_alloc: 234881024 data_used: 14073856
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 531644416 unmapped: 86351872 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x19960d000/0x0/0x1bfc00000, data 0x2cb082b/0x2eb1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e62101e800 session 0x55e6121a8f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 531644416 unmapped: 86351872 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 531644416 unmapped: 86351872 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 531644416 unmapped: 86351872 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 531644416 unmapped: 86351872 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 heartbeat osd_stat(store_statfs(0x19960d000/0x0/0x1bfc00000, data 0x2cb0808/0x2eb0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e613442400 session 0x55e613085c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 ms_handle_reset con 0x55e613e77000 session 0x55e614083680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5225068 data_alloc: 234881024 data_used: 14073856
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 531644416 unmapped: 86351872 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 388 handle_osd_map epochs [388,389], i have 388, src has [1,389]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 532701184 unmapped: 85295104 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 389 ms_handle_reset con 0x55e615e97800 session 0x55e6116a4d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530718720 unmapped: 87277568 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.612535477s of 10.763399124s, submitted: 76
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530685952 unmapped: 87310336 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 389 ms_handle_reset con 0x55e6133ee400 session 0x55e61196f2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530694144 unmapped: 87302144 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 389 handle_osd_map epochs [389,390], i have 389, src has [1,390]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 390 ms_handle_reset con 0x55e6133eb800 session 0x55e61133bc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5304088 data_alloc: 218103808 data_used: 10223616
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530702336 unmapped: 87293952 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 390 heartbeat osd_stat(store_statfs(0x198a76000/0x0/0x1bfc00000, data 0x384418c/0x3a47000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 390 ms_handle_reset con 0x55e614c33c00 session 0x55e6112e7a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530726912 unmapped: 87269376 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530726912 unmapped: 87269376 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530726912 unmapped: 87269376 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 390 ms_handle_reset con 0x55e614c33800 session 0x55e6153e7c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 390 heartbeat osd_stat(store_statfs(0x198a76000/0x0/0x1bfc00000, data 0x384418c/0x3a47000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 390 ms_handle_reset con 0x55e61dc78800 session 0x55e614082780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530726912 unmapped: 87269376 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 390 handle_osd_map epochs [390,391], i have 390, src has [1,391]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5353965 data_alloc: 234881024 data_used: 14389248
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530726912 unmapped: 87269376 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 391 ms_handle_reset con 0x55e6170a8800 session 0x55e6112e63c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530726912 unmapped: 87269376 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 391 ms_handle_reset con 0x55e6133eb800 session 0x55e611650f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530726912 unmapped: 87269376 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530726912 unmapped: 87269376 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 391 ms_handle_reset con 0x55e614c33800 session 0x55e614083680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 391 ms_handle_reset con 0x55e614c33c00 session 0x55e6116a4d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 391 ms_handle_reset con 0x55e61dc78800 session 0x55e6115cb860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530726912 unmapped: 87269376 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 391 heartbeat osd_stat(store_statfs(0x19862d000/0x0/0x1bfc00000, data 0x3c8dc69/0x3e91000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.465101242s of 11.790494919s, submitted: 51
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 391 ms_handle_reset con 0x55e614329800 session 0x55e6153e61e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5386247 data_alloc: 234881024 data_used: 14389248
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 530726912 unmapped: 87269376 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 391 handle_osd_map epochs [391,392], i have 391, src has [1,392]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e6133eb800 session 0x55e6132e2f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543948800 unmapped: 74047488 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543948800 unmapped: 74047488 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x198627000/0x0/0x1bfc00000, data 0x3c908c2/0x3e95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e610cc8800 session 0x55e61306ab40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x198295000/0x0/0x1bfc00000, data 0x40228c2/0x4227000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,0,0,0,0,0,5,6,55])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552837120 unmapped: 65159168 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e614329800 session 0x55e614083c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x198295000/0x0/0x1bfc00000, data 0x40228c2/0x4227000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1,0,19,20])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548364288 unmapped: 69632000 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e614c33800 session 0x55e61133b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5562547 data_alloc: 251658240 data_used: 29335552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548372480 unmapped: 69623808 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548372480 unmapped: 69623808 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x197570000/0x0/0x1bfc00000, data 0x4d498c2/0x4f4e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,0,0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548372480 unmapped: 69623808 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x1974f0000/0x0/0x1bfc00000, data 0x4dc98c2/0x4fce000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548012032 unmapped: 69984256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548012032 unmapped: 69984256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5568279 data_alloc: 251658240 data_used: 29392896
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548012032 unmapped: 69984256 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.241108894s of 10.892267227s, submitted: 153
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548315136 unmapped: 69681152 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548315136 unmapped: 69681152 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x1974c7000/0x0/0x1bfc00000, data 0x4df28c2/0x4ff7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548315136 unmapped: 69681152 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548315136 unmapped: 69681152 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x1974bd000/0x0/0x1bfc00000, data 0x4dfc8c2/0x5001000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5570859 data_alloc: 251658240 data_used: 29401088
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548323328 unmapped: 69672960 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548323328 unmapped: 69672960 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548323328 unmapped: 69672960 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e613d92c00 session 0x55e612723860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e613448000 session 0x55e613318000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x1974b8000/0x0/0x1bfc00000, data 0x4e018c2/0x5006000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548413440 unmapped: 69582848 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e61c82fc00 session 0x55e617122d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548413440 unmapped: 69582848 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5574376 data_alloc: 251658240 data_used: 29405184
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548413440 unmapped: 69582848 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548413440 unmapped: 69582848 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.658015728s of 10.949478149s, submitted: 23
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548421632 unmapped: 69574656 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e6143a6400 session 0x55e61653bc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e615e97800 session 0x55e611926d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e616d5c400 session 0x55e6117f50e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e613448000 session 0x55e6143a8780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e6143a6400 session 0x55e6116a43c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e615e97800 session 0x55e61133a960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e61c82fc00 session 0x55e6116a4960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548421632 unmapped: 69574656 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x197493000/0x0/0x1bfc00000, data 0x4e25924/0x502b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,0,0,0,0,14])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559964160 unmapped: 58032128 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e62d15c000 session 0x55e614082f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5667282 data_alloc: 251658240 data_used: 30195712
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548708352 unmapped: 69287936 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548716544 unmapped: 69279744 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548724736 unmapped: 69271552 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548724736 unmapped: 69271552 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548724736 unmapped: 69271552 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e61b383000 session 0x55e6115ca1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x196a4a000/0x0/0x1bfc00000, data 0x586e924/0x5a74000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e613448000 session 0x55e61435f4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5666340 data_alloc: 251658240 data_used: 30195712
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548741120 unmapped: 69255168 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x196a4a000/0x0/0x1bfc00000, data 0x586e924/0x5a74000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e6133b5800 session 0x55e61168c000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548765696 unmapped: 69230592 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e614329000 session 0x55e612045e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x196a4a000/0x0/0x1bfc00000, data 0x586e924/0x5a74000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e615e94800 session 0x55e6121a8780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.239238739s of 10.188981056s, submitted: 44
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e6133b5800 session 0x55e6121a8d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548765696 unmapped: 69230592 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549797888 unmapped: 68198400 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e61b383000 session 0x55e6143481e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e6133b5000 session 0x55e6121a8d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551247872 unmapped: 66748416 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5770586 data_alloc: 251658240 data_used: 34619392
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551108608 unmapped: 66887680 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e61194dc00 session 0x55e613319c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e613442400 session 0x55e6131a41e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551108608 unmapped: 66887680 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x19626f000/0x0/0x1bfc00000, data 0x6046967/0x624f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551108608 unmapped: 66887680 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551108608 unmapped: 66887680 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551116800 unmapped: 66879488 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5818380 data_alloc: 251658240 data_used: 40497152
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553738240 unmapped: 64258048 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e613448000 session 0x55e6141fad20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e614329000 session 0x55e6121a8780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e618349800 session 0x55e61216a3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555450368 unmapped: 62545920 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x19626c000/0x0/0x1bfc00000, data 0x6049967/0x6252000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x19626d000/0x0/0x1bfc00000, data 0x6049934/0x6250000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555450368 unmapped: 62545920 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 6000.1 total, 600.0 interval
Cumulative writes: 70K writes, 299K keys, 70K commit groups, 1.0 writes per commit group, ingest: 0.31 GB, 0.05 MB/s
Cumulative WAL: 70K writes, 23K syncs, 2.95 writes per sync, written: 0.31 GB, 0.05 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 7633 writes, 29K keys, 7633 commit groups, 1.0 writes per commit group, ingest: 30.71 MB, 0.05 MB/s
Interval WAL: 7633 writes, 2972 syncs, 2.57 writes per sync, written: 0.03 GB, 0.05 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55e60fd28f30#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55e60fd28f30#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction,
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555450368 unmapped: 62545920 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x19626d000/0x0/0x1bfc00000, data 0x6049934/0x6250000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555450368 unmapped: 62545920 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.596310616s of 13.258410454s, submitted: 122
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5844235 data_alloc: 251658240 data_used: 43044864
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555450368 unmapped: 62545920 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e618a29400 session 0x55e614349c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e617e23400 session 0x55e612045860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555458560 unmapped: 62537728 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555458560 unmapped: 62537728 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555458560 unmapped: 62537728 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x19626f000/0x0/0x1bfc00000, data 0x60498d2/0x624f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555458560 unmapped: 62537728 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x19626f000/0x0/0x1bfc00000, data 0x60498d2/0x624f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5872340 data_alloc: 251658240 data_used: 43069440
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558866432 unmapped: 59129856 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558571520 unmapped: 59424768 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 heartbeat osd_stat(store_statfs(0x195778000/0x0/0x1bfc00000, data 0x6b388d2/0x6d3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,18])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 ms_handle_reset con 0x55e614332800 session 0x55e617122780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561012736 unmapped: 56983552 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 handle_osd_map epochs [392,393], i have 392, src has [1,393]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 392 handle_osd_map epochs [393,393], i have 393, src has [1,393]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 57499648 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 393 ms_handle_reset con 0x55e61c82f800 session 0x55e61256eb40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 57499648 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 393 heartbeat osd_stat(store_statfs(0x194188000/0x0/0x1bfc00000, data 0x6f8e57f/0x7195000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 57491456 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5973050 data_alloc: 251658240 data_used: 43585536
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 393 ms_handle_reset con 0x55e61799fc00 session 0x55e613318f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.183924675s of 11.158607483s, submitted: 145
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 393 ms_handle_reset con 0x55e61432fc00 session 0x55e61306b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 57491456 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 393 heartbeat osd_stat(store_statfs(0x1945d1000/0x0/0x1bfc00000, data 0x6b4657f/0x6d4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 57491456 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 393 heartbeat osd_stat(store_statfs(0x1945d1000/0x0/0x1bfc00000, data 0x6b4657f/0x6d4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 57491456 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 393 ms_handle_reset con 0x55e617e22c00 session 0x55e61256f0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 393 handle_osd_map epochs [393,394], i have 393, src has [1,394]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 57466880 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 57466880 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5913639 data_alloc: 251658240 data_used: 39305216
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e611825000 session 0x55e61653ab40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e61c82d000 session 0x55e61653b860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 57466880 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 57466880 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x1945cb000/0x0/0x1bfc00000, data 0x6b4a0be/0x6d52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 57466880 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 57466880 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 57466880 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5912835 data_alloc: 251658240 data_used: 39440384
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 57466880 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x1945c9000/0x0/0x1bfc00000, data 0x6b4c0be/0x6d54000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.351765633s of 10.682572365s, submitted: 27
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e614328800 session 0x55e6117f45a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 57458688 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e614330400 session 0x55e61133a1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e61194dc00 session 0x55e61133be00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 57458688 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e611825000 session 0x55e6117f4780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554852352 unmapped: 63143936 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554852352 unmapped: 63143936 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5641364 data_alloc: 234881024 data_used: 28295168
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554852352 unmapped: 63143936 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x195f7d000/0x0/0x1bfc00000, data 0x5199110/0x53a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554868736 unmapped: 63127552 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e61ee30000 session 0x55e6125cc780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554876928 unmapped: 63119360 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554876928 unmapped: 63119360 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554876928 unmapped: 63119360 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5653378 data_alloc: 251658240 data_used: 29442048
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e61dc78400 session 0x55e6121a81e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559071232 unmapped: 58925056 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x195f7e000/0x0/0x1bfc00000, data 0x51990ae/0x53a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [0,0,0,0,0,4,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.596742630s of 10.129597664s, submitted: 46
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e61dc78c00 session 0x55e6114d6b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e614329000 session 0x55e6143a8960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554876928 unmapped: 63119360 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554876928 unmapped: 63119360 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x195971000/0x0/0x1bfc00000, data 0x57a60ae/0x59ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554876928 unmapped: 63119360 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554876928 unmapped: 63119360 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5700800 data_alloc: 251658240 data_used: 29458432
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554876928 unmapped: 63119360 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e61b1fe400 session 0x55e61186cb40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e617e23800 session 0x55e6116a5c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e610cc8800 session 0x55e6118e8b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e6133eb800 session 0x55e61435f2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e6178ae000 session 0x55e613085a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554876928 unmapped: 63119360 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554901504 unmapped: 63094784 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x195970000/0x0/0x1bfc00000, data 0x57a70ae/0x59ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e616d5fc00 session 0x55e61653b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e615e95000 session 0x55e61435fc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e6178ae000 session 0x55e617122960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554934272 unmapped: 63062016 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e61d732400 session 0x55e61653b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554983424 unmapped: 63012864 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5542880 data_alloc: 234881024 data_used: 27549696
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555024384 unmapped: 62971904 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 6.739014149s of 10.050796509s, submitted: 280
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555048960 unmapped: 62947328 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e611fa0800 session 0x55e61196dc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555048960 unmapped: 62947328 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e615e95000 session 0x55e61196da40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547586048 unmapped: 70410240 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x197980000/0x0/0x1bfc00000, data 0x338903c/0x358e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547586048 unmapped: 70410240 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5313968 data_alloc: 234881024 data_used: 17797120
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547586048 unmapped: 70410240 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x197980000/0x0/0x1bfc00000, data 0x338903c/0x358e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547586048 unmapped: 70410240 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547586048 unmapped: 70410240 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x197980000/0x0/0x1bfc00000, data 0x338903c/0x358e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547586048 unmapped: 70410240 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x197980000/0x0/0x1bfc00000, data 0x338903c/0x358e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547586048 unmapped: 70410240 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5313968 data_alloc: 234881024 data_used: 17797120
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547594240 unmapped: 70402048 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547594240 unmapped: 70402048 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547594240 unmapped: 70402048 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x19797e000/0x0/0x1bfc00000, data 0x338903c/0x358e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547594240 unmapped: 70402048 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547594240 unmapped: 70402048 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5324992 data_alloc: 234881024 data_used: 18862080
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.834159851s of 14.406176567s, submitted: 36
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547766272 unmapped: 70230016 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e61ee30c00 session 0x55e6117f4b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547766272 unmapped: 70230016 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548503552 unmapped: 69492736 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x197955000/0x0/0x1bfc00000, data 0x33b305f/0x35b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548503552 unmapped: 69492736 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548503552 unmapped: 69492736 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5374927 data_alloc: 234881024 data_used: 23969792
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x197955000/0x0/0x1bfc00000, data 0x33b305f/0x35b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548503552 unmapped: 69492736 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548503552 unmapped: 69492736 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x197955000/0x0/0x1bfc00000, data 0x33b305f/0x35b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548503552 unmapped: 69492736 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548503552 unmapped: 69492736 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548503552 unmapped: 69492736 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5374927 data_alloc: 234881024 data_used: 23969792
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x197955000/0x0/0x1bfc00000, data 0x33b305f/0x35b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548503552 unmapped: 69492736 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548503552 unmapped: 69492736 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.820829391s of 12.133041382s, submitted: 6
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 546570240 unmapped: 71426048 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547471360 unmapped: 70524928 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x19775c000/0x0/0x1bfc00000, data 0x35ac05f/0x37b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547471360 unmapped: 70524928 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5392045 data_alloc: 234881024 data_used: 24006656
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547479552 unmapped: 70516736 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x19775c000/0x0/0x1bfc00000, data 0x35ac05f/0x37b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [0,0,0,0,0,0,0,3])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e61432c400 session 0x55e61133be00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 70582272 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 70582272 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e614329800 session 0x55e6115cb860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 70582272 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 70582272 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5397049 data_alloc: 234881024 data_used: 24104960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x19774f000/0x0/0x1bfc00000, data 0x35b605f/0x37bc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 70582272 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 70582272 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e62d164000 session 0x55e6116a5e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 70582272 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e616d5f000 session 0x55e61216b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.579778671s of 11.296989441s, submitted: 26
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 70582272 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 ms_handle_reset con 0x55e6143a6000 session 0x55e61168c5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 70582272 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5398527 data_alloc: 234881024 data_used: 24109056
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 heartbeat osd_stat(store_statfs(0x19774e000/0x0/0x1bfc00000, data 0x35ba05f/0x37c0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 70582272 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 70582272 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 70582272 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553443328 unmapped: 64552960 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 394 handle_osd_map epochs [394,395], i have 394, src has [1,395]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 395 ms_handle_reset con 0x55e6119cbc00 session 0x55e6127223c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 395 ms_handle_reset con 0x55e615e94800 session 0x55e612723860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 395 ms_handle_reset con 0x55e61432e000 session 0x55e61186d0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548683776 unmapped: 69312512 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5349811 data_alloc: 234881024 data_used: 21016576
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 395 heartbeat osd_stat(store_statfs(0x197b3b000/0x0/0x1bfc00000, data 0x31cacb8/0x33d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 395 heartbeat osd_stat(store_statfs(0x197b3b000/0x0/0x1bfc00000, data 0x31cacb8/0x33d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548683776 unmapped: 69312512 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548683776 unmapped: 69312512 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 395 handle_osd_map epochs [395,396], i have 395, src has [1,396]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 396 ms_handle_reset con 0x55e6119cbc00 session 0x55e6116385a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548683776 unmapped: 69312512 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 396 heartbeat osd_stat(store_statfs(0x197b37000/0x0/0x1bfc00000, data 0x31cc921/0x33d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548691968 unmapped: 69304320 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548691968 unmapped: 69304320 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5353543 data_alloc: 234881024 data_used: 21016576
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 396 heartbeat osd_stat(store_statfs(0x197b37000/0x0/0x1bfc00000, data 0x31cc921/0x33d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548691968 unmapped: 69304320 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548691968 unmapped: 69304320 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 396 heartbeat osd_stat(store_statfs(0x197b37000/0x0/0x1bfc00000, data 0x31cc921/0x33d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 396 handle_osd_map epochs [396,397], i have 396, src has [1,397]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.941195488s of 13.586594582s, submitted: 44
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548708352 unmapped: 69287936 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 397 ms_handle_reset con 0x55e6133ef400 session 0x55e61306ab40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 76939264 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 397 ms_handle_reset con 0x55e62d164000 session 0x55e61435f4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 76939264 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5208229 data_alloc: 218103808 data_used: 7974912
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 397 ms_handle_reset con 0x55e6143a6400 session 0x55e617122b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 76939264 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 397 ms_handle_reset con 0x55e613e76000 session 0x55e6171221e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 397 ms_handle_reset con 0x55e6119cbc00 session 0x55e6117f4d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 76939264 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 76939264 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 397 heartbeat osd_stat(store_statfs(0x198b34000/0x0/0x1bfc00000, data 0x21ce5ce/0x23d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 397 handle_osd_map epochs [397,398], i have 397, src has [1,398]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 397 handle_osd_map epochs [398,398], i have 398, src has [1,398]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5253923 data_alloc: 234881024 data_used: 13910016
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 heartbeat osd_stat(store_statfs(0x198b31000/0x0/0x1bfc00000, data 0x21d010d/0x23dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 heartbeat osd_stat(store_statfs(0x198b31000/0x0/0x1bfc00000, data 0x21d010d/0x23dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5253923 data_alloc: 234881024 data_used: 13910016
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e6133ef400 session 0x55e6117f4f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 heartbeat osd_stat(store_statfs(0x198b31000/0x0/0x1bfc00000, data 0x21d010d/0x23dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.487635612s of 15.706043243s, submitted: 25
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 heartbeat osd_stat(store_statfs(0x19910c000/0x0/0x1bfc00000, data 0x1bf610d/0x1e02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e6133ed800 session 0x55e614349680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e614328c00 session 0x55e6117f5e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e613adc400 session 0x55e6143a8780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5164439 data_alloc: 218103808 data_used: 7974912
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 heartbeat osd_stat(store_statfs(0x19910c000/0x0/0x1bfc00000, data 0x1bf610d/0x1e02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 76685312 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 heartbeat osd_stat(store_statfs(0x19910c000/0x0/0x1bfc00000, data 0x1bf610d/0x1e02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541319168 unmapped: 76677120 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541319168 unmapped: 76677120 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e6190b5c00 session 0x55e61216d860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 heartbeat osd_stat(store_statfs(0x19910c000/0x0/0x1bfc00000, data 0x1bf610d/0x1e02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541319168 unmapped: 76677120 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5164439 data_alloc: 218103808 data_used: 7974912
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e61c82d000 session 0x55e613085860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541319168 unmapped: 76677120 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e610e23000 session 0x55e6125cd860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e613b6b000 session 0x55e6140830e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e6167c6c00 session 0x55e6153e6780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541671424 unmapped: 76324864 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e6133eac00 session 0x55e6118e9e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541679616 unmapped: 76316672 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 76308480 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.340457916s of 11.689073563s, submitted: 17
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e61194dc00 session 0x55e6140825a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541696000 unmapped: 76300288 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5175014 data_alloc: 218103808 data_used: 8007680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 heartbeat osd_stat(store_statfs(0x1990e1000/0x0/0x1bfc00000, data 0x1c2011d/0x1e2d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541696000 unmapped: 76300288 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541696000 unmapped: 76300288 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541696000 unmapped: 76300288 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e61ee30c00 session 0x55e6153e63c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e6119cb400 session 0x55e61196f2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 heartbeat osd_stat(store_statfs(0x1990df000/0x0/0x1bfc00000, data 0x1c2018f/0x1e2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541712384 unmapped: 76283904 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e61799ec00 session 0x55e6112e7a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e61ee30000 session 0x55e6132e2f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541917184 unmapped: 76079104 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5220916 data_alloc: 218103808 data_used: 8011776
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541917184 unmapped: 76079104 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541917184 unmapped: 76079104 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e61194dc00 session 0x55e6118e9e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e6119cb400 session 0x55e6153e6780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e6133eac00 session 0x55e6140830e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e61ee30c00 session 0x55e6125cd860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541917184 unmapped: 76079104 heap: 617996288 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e61194dc00 session 0x55e613085860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e6119cb400 session 0x55e6116385a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e6133eac00 session 0x55e6127223c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e613e77000 session 0x55e61216b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 ms_handle_reset con 0x55e613448400 session 0x55e6117f5e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541204480 unmapped: 80994304 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 heartbeat osd_stat(store_statfs(0x199284000/0x0/0x1bfc00000, data 0x2aba19f/0x2cca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23caf9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 398 handle_osd_map epochs [398,399], i have 398, src has [1,399]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541212672 unmapped: 80986112 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5343784 data_alloc: 218103808 data_used: 9302016
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 399 handle_osd_map epochs [399,400], i have 399, src has [1,400]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.149781227s of 10.657603264s, submitted: 91
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 399 handle_osd_map epochs [400,400], i have 400, src has [1,400]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 400 ms_handle_reset con 0x55e61194dc00 session 0x55e6131a4000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541089792 unmapped: 81108992 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 400 handle_osd_map epochs [400,401], i have 400, src has [1,401]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541712384 unmapped: 80486400 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 401 ms_handle_reset con 0x55e6119cb400 session 0x55e612722780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 401 handle_osd_map epochs [401,402], i have 401, src has [1,402]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 402 ms_handle_reset con 0x55e6133eac00 session 0x55e614082000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539394048 unmapped: 82804736 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 402 ms_handle_reset con 0x55e61432c800 session 0x55e611338f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 539402240 unmapped: 82796544 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #59. Immutable memtables: 15.
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 402 ms_handle_reset con 0x55e6133eb400 session 0x55e612044000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 402 ms_handle_reset con 0x55e62101f000 session 0x55e6117f52c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 402 heartbeat osd_stat(store_statfs(0x197bf0000/0x0/0x1bfc00000, data 0x2fa938f/0x31be000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541990912 unmapped: 80207872 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5395001 data_alloc: 218103808 data_used: 9310208
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541999104 unmapped: 80199680 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 402 ms_handle_reset con 0x55e61432c800 session 0x55e611650d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 402 ms_handle_reset con 0x55e6133eb400 session 0x55e6143a9860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543580160 unmapped: 78618624 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 402 ms_handle_reset con 0x55e61194cc00 session 0x55e61256fa40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 543580160 unmapped: 78618624 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 402 heartbeat osd_stat(store_statfs(0x197825000/0x0/0x1bfc00000, data 0x33733b2/0x3589000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 402 handle_osd_map epochs [402,403], i have 402, src has [1,403]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 handle_osd_map epochs [403,403], i have 403, src has [1,403]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 heartbeat osd_stat(store_statfs(0x1973d8000/0x0/0x1bfc00000, data 0x37c03b2/0x39d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e62d165400 session 0x55e610dd34a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542949376 unmapped: 79249408 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e61194cc00 session 0x55e61216a000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e6133eac00 session 0x55e613078000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542949376 unmapped: 79249408 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5528018 data_alloc: 234881024 data_used: 18087936
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e61194dc00 session 0x55e61653b860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e6119cb400 session 0x55e6141faf00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.144738197s of 10.207188606s, submitted: 95
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541679616 unmapped: 80519168 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e6133eb400 session 0x55e61170d4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 80510976 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 80510976 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 80510976 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 heartbeat osd_stat(store_statfs(0x197d88000/0x0/0x1bfc00000, data 0x2deb010/0x3000000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 80510976 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5386020 data_alloc: 218103808 data_used: 9318400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 80510976 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 80510976 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 80510976 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e613448800 session 0x55e6116a52c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541982720 unmapped: 80216064 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541982720 unmapped: 80216064 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5461976 data_alloc: 218103808 data_used: 9318400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 heartbeat osd_stat(store_statfs(0x197437000/0x0/0x1bfc00000, data 0x3762010/0x3977000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541982720 unmapped: 80216064 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541982720 unmapped: 80216064 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 heartbeat osd_stat(store_statfs(0x197437000/0x0/0x1bfc00000, data 0x3762010/0x3977000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541982720 unmapped: 80216064 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541982720 unmapped: 80216064 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 heartbeat osd_stat(store_statfs(0x197437000/0x0/0x1bfc00000, data 0x3762010/0x3977000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542007296 unmapped: 80191488 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5461976 data_alloc: 218103808 data_used: 9318400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542007296 unmapped: 80191488 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542007296 unmapped: 80191488 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.566255569s of 16.626279831s, submitted: 77
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e618a29c00 session 0x55e61216a780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e611825000 session 0x55e6131a4d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542007296 unmapped: 80191488 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e610e23000 session 0x55e6141fa780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e618454c00 session 0x55e6121a90e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e613b6b000 session 0x55e6112e7860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e613448000 session 0x55e611927680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542015488 unmapped: 80183296 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542195712 unmapped: 80003072 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5521357 data_alloc: 234881024 data_used: 18829312
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e613adf000 session 0x55e612722b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 heartbeat osd_stat(store_statfs(0x197462000/0x0/0x1bfc00000, data 0x3738000/0x394c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e61c82f400 session 0x55e61435ef00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542195712 unmapped: 80003072 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542195712 unmapped: 80003072 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e61432d000 session 0x55e611650f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542883840 unmapped: 79314944 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e613448000 session 0x55e6141fbe00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542883840 unmapped: 79314944 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e6133eb800 session 0x55e61133b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542883840 unmapped: 79314944 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5561404 data_alloc: 234881024 data_used: 25071616
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 403 handle_osd_map epochs [403,404], i have 403, src has [1,404]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 404 ms_handle_reset con 0x55e614328000 session 0x55e6118e8780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542875648 unmapped: 79323136 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 404 heartbeat osd_stat(store_statfs(0x197560000/0x0/0x1bfc00000, data 0x3637cad/0x384d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 404 ms_handle_reset con 0x55e6133f1000 session 0x55e611f150e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 404 ms_handle_reset con 0x55e6178b1c00 session 0x55e6116a4960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542875648 unmapped: 79323136 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 404 ms_handle_reset con 0x55e610e23000 session 0x55e6132e3a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542875648 unmapped: 79323136 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542875648 unmapped: 79323136 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.773722649s of 11.937841415s, submitted: 52
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 546652160 unmapped: 75546624 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5651376 data_alloc: 234881024 data_used: 26652672
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 404 ms_handle_reset con 0x55e61b1fe400 session 0x55e6143a9680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 404 heartbeat osd_stat(store_statfs(0x1969ea000/0x0/0x1bfc00000, data 0x41adc8a/0x43c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 404 ms_handle_reset con 0x55e611825000 session 0x55e6153e65a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548446208 unmapped: 73752576 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548446208 unmapped: 73752576 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549371904 unmapped: 72826880 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 404 handle_osd_map epochs [404,405], i have 404, src has [1,405]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549380096 unmapped: 72818688 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 405 handle_osd_map epochs [405,406], i have 405, src has [1,406]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 heartbeat osd_stat(store_statfs(0x196df2000/0x0/0x1bfc00000, data 0x41cd7c9/0x3fbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e613445800 session 0x55e6119272c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549396480 unmapped: 72802304 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5628565 data_alloc: 234881024 data_used: 20672512
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 heartbeat osd_stat(store_statfs(0x196df2000/0x0/0x1bfc00000, data 0x41cd7c9/0x3fbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549404672 unmapped: 72794112 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549404672 unmapped: 72794112 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 heartbeat osd_stat(store_statfs(0x196dd0000/0x0/0x1bfc00000, data 0x41f0466/0x3fde000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549404672 unmapped: 72794112 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549404672 unmapped: 72794112 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 heartbeat osd_stat(store_statfs(0x196dd0000/0x0/0x1bfc00000, data 0x41f0466/0x3fde000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e6178b0400 session 0x55e6118e8960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e610e23000 session 0x55e6125ccd20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e611825000 session 0x55e6132e3c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e613445800 session 0x55e613318780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.295885086s of 10.920019150s, submitted: 202
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 72777728 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5628989 data_alloc: 234881024 data_used: 20672512
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e61b1fe400 session 0x55e61133b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e617e22800 session 0x55e6141fa1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e610e23000 session 0x55e61133af00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e611825000 session 0x55e6116a54a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e613445800 session 0x55e6114d6960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 heartbeat osd_stat(store_statfs(0x196ba1000/0x0/0x1bfc00000, data 0x441e476/0x420d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 72777728 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e6119cb400 session 0x55e611638780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e617e22400 session 0x55e6140830e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549429248 unmapped: 72769536 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e610e23000 session 0x55e6153e6000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549453824 unmapped: 72744960 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 heartbeat osd_stat(store_statfs(0x196b95000/0x0/0x1bfc00000, data 0x442a476/0x4219000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 406 handle_osd_map epochs [406,407], i have 406, src has [1,407]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549453824 unmapped: 72744960 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e618a28400 session 0x55e612045e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e6133f1000 session 0x55e6143a8780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549453824 unmapped: 72744960 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5650588 data_alloc: 234881024 data_used: 20692992
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e6119cbc00 session 0x55e6125cd860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549453824 unmapped: 72744960 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e6133f3800 session 0x55e61216a000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e610e23000 session 0x55e6118e9e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549453824 unmapped: 72744960 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e6119cbc00 session 0x55e612044000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e6133f1000 session 0x55e61256fa40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549462016 unmapped: 72736768 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 407 handle_osd_map epochs [407,408], i have 407, src has [1,408]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 408 ms_handle_reset con 0x55e62d160800 session 0x55e6171223c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 408 heartbeat osd_stat(store_statfs(0x196b8c000/0x0/0x1bfc00000, data 0x4430f86/0x4222000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549642240 unmapped: 72556544 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 408 ms_handle_reset con 0x55e618349800 session 0x55e61133be00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 408 handle_osd_map epochs [408,409], i have 408, src has [1,409]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e61c236400 session 0x55e61653b680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.503706932s of 10.003719330s, submitted: 144
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 546979840 unmapped: 75218944 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5472238 data_alloc: 234881024 data_used: 15331328
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e6133ef400 session 0x55e6153e7a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e610e23000 session 0x55e6118e9e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e618349800 session 0x55e6127223c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 546988032 unmapped: 75210752 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e6119cbc00 session 0x55e6141fb680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e610e23000 session 0x55e6140834a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 409 heartbeat osd_stat(store_statfs(0x19790a000/0x0/0x1bfc00000, data 0x3289a92/0x34a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e6133ef400 session 0x55e61256f680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5484816 data_alloc: 234881024 data_used: 15331328
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 409 heartbeat osd_stat(store_statfs(0x19790a000/0x0/0x1bfc00000, data 0x3289a92/0x34a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 409 handle_osd_map epochs [409,410], i have 409, src has [1,410]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.488860130s of 10.026491165s, submitted: 79
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548405248 unmapped: 73793536 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5551436 data_alloc: 234881024 data_used: 15331328
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547684352 unmapped: 74514432 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613adf000 session 0x55e612723e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613c77800 session 0x55e6121a83c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e618349c00 session 0x55e6117f4f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e610e23000 session 0x55e61196e000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555769856 unmapped: 76931072 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e6133ef400 session 0x55e61306b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613adf000 session 0x55e61216be00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613c77800 session 0x55e613085a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e6170a8400 session 0x55e6143a9e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e610e23000 session 0x55e6116a5a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196465000/0x0/0x1bfc00000, data 0x472b67b/0x4949000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5659670 data_alloc: 234881024 data_used: 15462400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196465000/0x0/0x1bfc00000, data 0x472b67b/0x4949000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e61b382400 session 0x55e617122f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613b6b000 session 0x55e6153e7e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5659846 data_alloc: 234881024 data_used: 15462400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e616d5fc00 session 0x55e61170c3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.580759048s of 10.413801193s, submitted: 81
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e6133b5000 session 0x55e61170d4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e610e23000 session 0x55e6117f4000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544948224 unmapped: 87752704 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196465000/0x0/0x1bfc00000, data 0x472b67b/0x4949000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e61b382400 session 0x55e6114d6000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544948224 unmapped: 87752704 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544956416 unmapped: 87744512 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e614328800 session 0x55e611338d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e61c236400 session 0x55e6127232c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544972800 unmapped: 87728128 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196464000/0x0/0x1bfc00000, data 0x472b68b/0x494a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547135488 unmapped: 85565440 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5773973 data_alloc: 251658240 data_used: 30351360
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550354944 unmapped: 82345984 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550354944 unmapped: 82345984 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e618a28400 session 0x55e612723680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613448800 session 0x55e611f15680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550371328 unmapped: 82329600 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196462000/0x0/0x1bfc00000, data 0x472b6be/0x494c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e610e23000 session 0x55e6141fa1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550371328 unmapped: 82329600 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550371328 unmapped: 82329600 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5708012 data_alloc: 251658240 data_used: 32235520
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196e98000/0x0/0x1bfc00000, data 0x3cf767b/0x3f15000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550371328 unmapped: 82329600 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550371328 unmapped: 82329600 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.847383499s of 12.084917068s, submitted: 57
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551985152 unmapped: 80715776 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552067072 unmapped: 80633856 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557391872 unmapped: 75309056 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5907494 data_alloc: 251658240 data_used: 33472512
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555376640 unmapped: 77324288 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x195637000/0x0/0x1bfc00000, data 0x555067b/0x576e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 77291520 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x195633000/0x0/0x1bfc00000, data 0x555467b/0x5772000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 77291520 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 77291520 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 77963264 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5927272 data_alloc: 251658240 data_used: 34787328
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x19563a000/0x0/0x1bfc00000, data 0x555667b/0x5774000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 77963264 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 77963264 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 77963264 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 77963264 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 77963264 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5929384 data_alloc: 251658240 data_used: 34775040
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.419394493s of 13.173292160s, submitted: 225
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554745856 unmapped: 77955072 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x195636000/0x0/0x1bfc00000, data 0x555967b/0x5777000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613b6b000 session 0x55e6121a8b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e616d5fc00 session 0x55e61216a000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x195636000/0x0/0x1bfc00000, data 0x555967b/0x5777000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554745856 unmapped: 77955072 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x195637000/0x0/0x1bfc00000, data 0x555966b/0x5776000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e611fa0000 session 0x55e6121a9e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5926763 data_alloc: 251658240 data_used: 34852864
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x195638000/0x0/0x1bfc00000, data 0x555966b/0x5776000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613adf000 session 0x55e61435e3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e615e94800 session 0x55e6116a4d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e618349400 session 0x55e6141fa960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5635240 data_alloc: 234881024 data_used: 21491712
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550789120 unmapped: 81911808 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196974000/0x0/0x1bfc00000, data 0x3c465c6/0x3e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550789120 unmapped: 81911808 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550789120 unmapped: 81911808 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196974000/0x0/0x1bfc00000, data 0x3c465c6/0x3e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196974000/0x0/0x1bfc00000, data 0x3c465c6/0x3e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5635240 data_alloc: 234881024 data_used: 21491712
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196974000/0x0/0x1bfc00000, data 0x3c465c6/0x3e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5635240 data_alloc: 234881024 data_used: 21491712
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550805504 unmapped: 81895424 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550805504 unmapped: 81895424 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196974000/0x0/0x1bfc00000, data 0x3c465c6/0x3e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550805504 unmapped: 81895424 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.592836380s of 23.012874603s, submitted: 96
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e62101f800 session 0x55e611f15680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e615e97800 session 0x55e611927680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e615e94800 session 0x55e6143492c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e62101f800 session 0x55e6114d6000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613addc00 session 0x55e6171234a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550952960 unmapped: 81747968 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e62d15c000 session 0x55e6117f4780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613443400 session 0x55e6125cc5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613443400 session 0x55e6112e65a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e618349400 session 0x55e614082b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5469668 data_alloc: 218103808 data_used: 9863168
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550977536 unmapped: 81723392 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x19719c000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5470788 data_alloc: 218103808 data_used: 9940992
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x19719c000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613442400 session 0x55e6121a90e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e6133eec00 session 0x55e61196e1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e616d5d800 session 0x55e614348960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.309727669s of 10.510506630s, submitted: 53
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e6133eec00 session 0x55e6171232c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x1976fd000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5471361 data_alloc: 218103808 data_used: 9949184
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551010304 unmapped: 81690624 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x1976fd000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552771584 unmapped: 79929344 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x1976fd000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552771584 unmapped: 79929344 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x1976fb000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5552337 data_alloc: 234881024 data_used: 20381696
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x1976fd000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5552337 data_alloc: 234881024 data_used: 20377600
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.627647400s of 11.764071465s, submitted: 16
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557334528 unmapped: 75366400 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196e61000/0x0/0x1bfc00000, data 0x391a64b/0x3b35000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556695552 unmapped: 76005376 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556695552 unmapped: 76005376 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556695552 unmapped: 76005376 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196e42000/0x0/0x1bfc00000, data 0x394164b/0x3b5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5637295 data_alloc: 234881024 data_used: 21131264
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556695552 unmapped: 76005376 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196e42000/0x0/0x1bfc00000, data 0x394164b/0x3b5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556695552 unmapped: 76005376 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556703744 unmapped: 75997184 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 410 handle_osd_map epochs [410,411], i have 410, src has [1,411]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e618cee800 session 0x55e61133a000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556711936 unmapped: 75988992 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556711936 unmapped: 75988992 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6190b4800 session 0x55e61133ba40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e6117f50e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x196e3b000/0x0/0x1bfc00000, data 0x39462a4/0x3b62000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61b383000 session 0x55e611638780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6127232c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647841 data_alloc: 234881024 data_used: 22560768
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556711936 unmapped: 75988992 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6190b5400 session 0x55e6153e7a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e62101fc00 session 0x55e612722000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.020457268s of 10.069975853s, submitted: 100
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556728320 unmapped: 75972608 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e6116a5860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e618cee800 session 0x55e6117f5e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6115ca1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e61256e5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6190b5400 session 0x55e61133a5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557891584 unmapped: 74809344 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557891584 unmapped: 74809344 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557891584 unmapped: 74809344 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x1963a9000/0x0/0x1bfc00000, data 0x4488326/0x45f5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5748520 data_alloc: 234881024 data_used: 22564864
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557891584 unmapped: 74809344 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e618a28c00 session 0x55e613319e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6170a8000 session 0x55e61170cd20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6170a8000 session 0x55e61170c5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6117f52c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557907968 unmapped: 74792960 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e6114d6b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e618a28c00 session 0x55e61196d680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6190b5400 session 0x55e614082780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558358528 unmapped: 74342400 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6190b5400 session 0x55e61435f4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6140830e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195afc000/0x0/0x1bfc00000, data 0x4d3335f/0x4ea2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e6140821e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558252032 unmapped: 74448896 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558596096 unmapped: 74104832 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5905852 data_alloc: 251658240 data_used: 32215040
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558882816 unmapped: 73818112 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e616d5dc00 session 0x55e61653bc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558882816 unmapped: 73818112 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61cb41800 session 0x55e6116a54a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.029963493s of 11.044583321s, submitted: 109
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558923776 unmapped: 73777152 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195ad0000/0x0/0x1bfc00000, data 0x4f8d3bb/0x4ece000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6153e63c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e6125ccd20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: mgrc ms_handle_reset ms_handle_reset con 0x55e61c82c400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3835187053
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3835187053,v1:192.168.122.100:6801/3835187053]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: mgrc handle_mgr_configure stats_period=5
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e616d5dc00 session 0x55e61133b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e616102400 session 0x55e614349680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558923776 unmapped: 73777152 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195ace000/0x0/0x1bfc00000, data 0x4f8d3ee/0x4ed0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195ace000/0x0/0x1bfc00000, data 0x4f8d3ee/0x4ed0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559030272 unmapped: 73670656 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6178af400 session 0x55e6127223c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6130785a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5995167 data_alloc: 251658240 data_used: 39555072
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 71712768 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195aa8000/0x0/0x1bfc00000, data 0x4fb1421/0x4ef6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560996352 unmapped: 71704576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561070080 unmapped: 71630848 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561070080 unmapped: 71630848 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561070080 unmapped: 71630848 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6082947 data_alloc: 251658240 data_used: 39837696
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565739520 unmapped: 66961408 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195aa8000/0x0/0x1bfc00000, data 0x4fb1421/0x4ef6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565501952 unmapped: 67198976 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566640640 unmapped: 66060288 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.704203606s of 11.106727600s, submitted: 186
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566640640 unmapped: 66060288 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194faa000/0x0/0x1bfc00000, data 0x5aa9421/0x59ee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566640640 unmapped: 66060288 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6102057 data_alloc: 251658240 data_used: 40452096
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566640640 unmapped: 66060288 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194faa000/0x0/0x1bfc00000, data 0x5aa9421/0x59ee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572850176 unmapped: 59850752 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572481536 unmapped: 60219392 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572481536 unmapped: 60219392 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572481536 unmapped: 60219392 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6211937 data_alloc: 251658240 data_used: 43184128
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572481536 unmapped: 60219392 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572481536 unmapped: 60219392 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x1944f0000/0x0/0x1bfc00000, data 0x6561421/0x64a6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574726144 unmapped: 57974784 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574726144 unmapped: 57974784 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194407000/0x0/0x1bfc00000, data 0x6652421/0x6597000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574726144 unmapped: 57974784 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6218149 data_alloc: 251658240 data_used: 43196416
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574726144 unmapped: 57974784 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194407000/0x0/0x1bfc00000, data 0x6652421/0x6597000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574734336 unmapped: 57966592 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.703801155s of 14.044802666s, submitted: 170
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194407000/0x0/0x1bfc00000, data 0x6652421/0x6597000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573898752 unmapped: 58802176 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573898752 unmapped: 58802176 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573906944 unmapped: 58793984 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61ee31c00 session 0x55e61186cb40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e614329800 session 0x55e61133be00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5975263 data_alloc: 251658240 data_used: 32772096
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566861824 unmapped: 65839104 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e614331c00 session 0x55e614083680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566861824 unmapped: 65839104 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x19562e000/0x0/0x1bfc00000, data 0x509438c/0x4fd6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e615e96800 session 0x55e61133ab40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6115ca1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e614329800 session 0x55e611638780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e614331c00 session 0x55e61186c5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x19562e000/0x0/0x1bfc00000, data 0x509438c/0x4fd6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567926784 unmapped: 72654848 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61ee31c00 session 0x55e61216da40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e62d165800 session 0x55e61256e5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e62d165800 session 0x55e613078d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e612723860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61b383000 session 0x55e613318f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567803904 unmapped: 72777728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567803904 unmapped: 72777728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194d84000/0x0/0x1bfc00000, data 0x5fda3fe/0x5c1a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194d84000/0x0/0x1bfc00000, data 0x5fda3fe/0x5c1a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6092614 data_alloc: 251658240 data_used: 32661504
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567812096 unmapped: 72769536 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e613442400 session 0x55e61168c000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e613443400 session 0x55e61133ad20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567754752 unmapped: 72826880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.431325912s of 10.056238174s, submitted: 138
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 77881344 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e61196e1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 77881344 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195ffa000/0x0/0x1bfc00000, data 0x4d6439c/0x49a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 77881344 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5877892 data_alloc: 234881024 data_used: 23572480
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 77881344 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61d733000 session 0x55e6121a9a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e616d5e000 session 0x55e61653a3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 77881344 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562708480 unmapped: 77873152 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6178af800 session 0x55e6112e6780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e615e97800 session 0x55e617122960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562716672 unmapped: 77864960 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195ff9000/0x0/0x1bfc00000, data 0x4d643ac/0x49a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562716672 unmapped: 77864960 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e6121a9c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e616102400 session 0x55e613318960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5949039 data_alloc: 251658240 data_used: 33120256
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563740672 unmapped: 76840960 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195ffa000/0x0/0x1bfc00000, data 0x4d643ac/0x49a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6178af800 session 0x55e6114d7680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.212591171s of 10.545234680s, submitted: 49
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6170a8400 session 0x55e61196f4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x19601e000/0x0/0x1bfc00000, data 0x4d41379/0x497f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e615e95800 session 0x55e617123e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61b1fe800 session 0x55e61653b4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 411 handle_osd_map epochs [411,412], i have 411, src has [1,412]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 412 ms_handle_reset con 0x55e6133f1000 session 0x55e61196da40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5942760 data_alloc: 251658240 data_used: 34693120
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 412 ms_handle_reset con 0x55e613adf000 session 0x55e611338d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 412 ms_handle_reset con 0x55e6143a7000 session 0x55e61170d4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 412 ms_handle_reset con 0x55e61432f400 session 0x55e61653a960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 412 heartbeat osd_stat(store_statfs(0x19613e000/0x0/0x1bfc00000, data 0x463d016/0x4860000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5969678 data_alloc: 251658240 data_used: 34799616
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564723712 unmapped: 75857920 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 412 heartbeat osd_stat(store_statfs(0x195a38000/0x0/0x1bfc00000, data 0x4d43ff3/0x4f66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 74752000 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 74752000 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 412 ms_handle_reset con 0x55e6133f1000 session 0x55e611927e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.994777679s of 10.394090652s, submitted: 118
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 412 ms_handle_reset con 0x55e613adf000 session 0x55e6116501e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 412 handle_osd_map epochs [412,413], i have 412, src has [1,413]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 412 handle_osd_map epochs [413,413], i have 413, src has [1,413]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564609024 unmapped: 75972608 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564609024 unmapped: 75972608 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196c80000/0x0/0x1bfc00000, data 0x3af9b32/0x3d1d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5732762 data_alloc: 234881024 data_used: 22024192
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564617216 unmapped: 75964416 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564617216 unmapped: 75964416 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133eec00 session 0x55e6119270e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e616d5e000 session 0x55e6125cd860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557277184 unmapped: 83304448 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e611825800 session 0x55e6143a9e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557285376 unmapped: 83296256 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198074000/0x0/0x1bfc00000, data 0x2709ab0/0x292a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557285376 unmapped: 83296256 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5497221 data_alloc: 218103808 data_used: 10317824
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557285376 unmapped: 83296256 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557285376 unmapped: 83296256 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613448400 session 0x55e6143a83c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61799ec00 session 0x55e61133a960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613b6a000 session 0x55e61196e000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e6117f4b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6178af000 session 0x55e613085a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e61133b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198071000/0x0/0x1bfc00000, data 0x270cab0/0x292d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [1,1,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551403520 unmapped: 89178112 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613448400 session 0x55e6116a4d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61c82dc00 session 0x55e613318f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613ae0400 session 0x55e613078d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.453991890s of 10.083424568s, submitted: 155
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e62d15b800 session 0x55e61216da40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e614083680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613b6a000 session 0x55e61170c000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5352100 data_alloc: 218103808 data_used: 1597440
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19899d000/0x0/0x1bfc00000, data 0x1d2ba6d/0x1f49000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19899d000/0x0/0x1bfc00000, data 0x1d2ba6d/0x1f49000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133f1000 session 0x55e6130785a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5352100 data_alloc: 218103808 data_used: 1597440
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551387136 unmapped: 89194496 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e615e94400 session 0x55e6127223c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551387136 unmapped: 89194496 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613449c00 session 0x55e614349680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e61133b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551501824 unmapped: 89079808 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551501824 unmapped: 89079808 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5427247 data_alloc: 234881024 data_used: 11382784
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198a2f000/0x0/0x1bfc00000, data 0x1d4faa0/0x1f6f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198a2f000/0x0/0x1bfc00000, data 0x1d4faa0/0x1f6f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5427247 data_alloc: 234881024 data_used: 11382784
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198a2f000/0x0/0x1bfc00000, data 0x1d4faa0/0x1f6f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.247579575s of 20.317701340s, submitted: 24
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554565632 unmapped: 86016000 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198536000/0x0/0x1bfc00000, data 0x2248aa0/0x2468000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553852928 unmapped: 86728704 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485855 data_alloc: 234881024 data_used: 11407360
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554696704 unmapped: 85884928 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553992192 unmapped: 86589440 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553992192 unmapped: 86589440 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554008576 unmapped: 86573056 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19832d000/0x0/0x1bfc00000, data 0x2451aa0/0x2671000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e62d162800 session 0x55e61216d2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555302912 unmapped: 85278720 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5557113 data_alloc: 234881024 data_used: 12337152
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555343872 unmapped: 85237760 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555343872 unmapped: 85237760 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x197c87000/0x0/0x1bfc00000, data 0x2af7aa0/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555343872 unmapped: 85237760 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555343872 unmapped: 85237760 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x197c87000/0x0/0x1bfc00000, data 0x2af7aa0/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e616d5d000 session 0x55e6171234a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555343872 unmapped: 85237760 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6170a9000 session 0x55e6113383c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5557113 data_alloc: 234881024 data_used: 12337152
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555343872 unmapped: 85237760 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.936045647s of 12.311985016s, submitted: 93
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e618a29800 session 0x55e610dd21e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e613319e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555491328 unmapped: 85090304 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x197c60000/0x0/0x1bfc00000, data 0x2b1eaa0/0x2d3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555630592 unmapped: 84951040 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x197c60000/0x0/0x1bfc00000, data 0x2b1eaa0/0x2d3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557629440 unmapped: 82952192 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557629440 unmapped: 82952192 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133f1000 session 0x55e6116a54a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613b6a000 session 0x55e61435f0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5596581 data_alloc: 234881024 data_used: 17735680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557629440 unmapped: 82952192 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ed800 session 0x55e6132e2f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556122112 unmapped: 84459520 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556122112 unmapped: 84459520 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198adc000/0x0/0x1bfc00000, data 0x1a78a0b/0x1c95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556122112 unmapped: 84459520 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556122112 unmapped: 84459520 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5393568 data_alloc: 218103808 data_used: 6991872
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556122112 unmapped: 84459520 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556122112 unmapped: 84459520 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556130304 unmapped: 84451328 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.426653862s of 12.647393227s, submitted: 51
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557793280 unmapped: 82788352 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19850c000/0x0/0x1bfc00000, data 0x2275a0b/0x2492000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559030272 unmapped: 81551360 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5471068 data_alloc: 218103808 data_used: 8425472
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559284224 unmapped: 81297408 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19720e000/0x0/0x1bfc00000, data 0x23d2a0b/0x25ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1971cc000/0x0/0x1bfc00000, data 0x2414a0b/0x2631000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1971cc000/0x0/0x1bfc00000, data 0x2414a0b/0x2631000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5486942 data_alloc: 218103808 data_used: 8507392
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1971ac000/0x0/0x1bfc00000, data 0x2435a0b/0x2652000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5480786 data_alloc: 218103808 data_used: 8507392
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e616d5d000 session 0x55e610dd34a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6170a9000 session 0x55e61306ba40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.785085678s of 13.056034088s, submitted: 101
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553402368 unmapped: 87179264 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553402368 unmapped: 87179264 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19820a000/0x0/0x1bfc00000, data 0x13d7a0b/0x15f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e6125cc5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553402368 unmapped: 87179264 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553402368 unmapped: 87179264 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5305271 data_alloc: 218103808 data_used: 1597440
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553402368 unmapped: 87179264 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553402368 unmapped: 87179264 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19820a000/0x0/0x1bfc00000, data 0x13d7a0b/0x15f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19820a000/0x0/0x1bfc00000, data 0x13d7a0b/0x15f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5305271 data_alloc: 218103808 data_used: 1597440
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19820a000/0x0/0x1bfc00000, data 0x13d7a0b/0x15f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5305271 data_alloc: 218103808 data_used: 1597440
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19820a000/0x0/0x1bfc00000, data 0x13d7a0b/0x15f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5305271 data_alloc: 218103808 data_used: 1597440
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19820a000/0x0/0x1bfc00000, data 0x13d7a0b/0x15f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553426944 unmapped: 87154688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553426944 unmapped: 87154688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e618cee800 session 0x55e6140834a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613b85400 session 0x55e6121a94a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61b383400 session 0x55e6141fb2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e61256e5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.977979660s of 24.250816345s, submitted: 14
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5402723 data_alloc: 218103808 data_used: 1597440
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555606016 unmapped: 84975616 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613b85400 session 0x55e61653a3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6170a9000 session 0x55e61216dc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e618cee800 session 0x55e61168cd20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61c82f800 session 0x55e61435e1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e61653af00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1975af000/0x0/0x1bfc00000, data 0x2030a44/0x224f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553467904 unmapped: 87113728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553467904 unmapped: 87113728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553467904 unmapped: 87113728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1975af000/0x0/0x1bfc00000, data 0x2030a7d/0x224f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553467904 unmapped: 87113728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613ae1c00 session 0x55e6127232c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61c82c400 session 0x55e617123c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613b84c00 session 0x55e61216d4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e62d15c000 session 0x55e6120452c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1975af000/0x0/0x1bfc00000, data 0x2030a7d/0x224f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5406427 data_alloc: 218103808 data_used: 1597440
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553320448 unmapped: 87261184 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554729472 unmapped: 85852160 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1975ae000/0x0/0x1bfc00000, data 0x2030aa6/0x2250000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613ae1c00 session 0x55e613079680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e6117f4d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554729472 unmapped: 85852160 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f56000/0x0/0x1bfc00000, data 0x2688adf/0x28a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613443400 session 0x55e61653ad20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 85843968 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61b383000 session 0x55e61170c5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554745856 unmapped: 85835776 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5459544 data_alloc: 218103808 data_used: 1597440
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554745856 unmapped: 85835776 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613e77400 session 0x55e6141fa3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554745856 unmapped: 85835776 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e61256e960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f56000/0x0/0x1bfc00000, data 0x2688adf/0x28a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554745856 unmapped: 85835776 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613c77000 session 0x55e612723e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.082229614s of 12.501278877s, submitted: 75
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554491904 unmapped: 86089728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f55000/0x0/0x1bfc00000, data 0x2688aef/0x28a9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86081536 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61d732400 session 0x55e6116a5860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6178b0400 session 0x55e6115cb680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613447c00 session 0x55e61186da40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5463197 data_alloc: 218103808 data_used: 1601536
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86081536 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f55000/0x0/0x1bfc00000, data 0x2688aef/0x28a9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86081536 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f54000/0x0/0x1bfc00000, data 0x2688aff/0x28aa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554196992 unmapped: 86384640 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f54000/0x0/0x1bfc00000, data 0x2688aff/0x28aa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5601381 data_alloc: 234881024 data_used: 20992000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f54000/0x0/0x1bfc00000, data 0x2688aff/0x28aa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5601381 data_alloc: 234881024 data_used: 20992000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.077384949s of 14.277094841s, submitted: 11
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #60. Immutable memtables: 16.
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563732480 unmapped: 76849152 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x194885000/0x0/0x1bfc00000, data 0x3bb1aff/0x3dd3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [2])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566116352 unmapped: 74465280 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567279616 unmapped: 73302016 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x194730000/0x0/0x1bfc00000, data 0x3d04aff/0x3f26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5819139 data_alloc: 234881024 data_used: 23150592
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19471a000/0x0/0x1bfc00000, data 0x3d1aaff/0x3f3c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19471a000/0x0/0x1bfc00000, data 0x3d1aaff/0x3f3c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5812007 data_alloc: 234881024 data_used: 23158784
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1946fe000/0x0/0x1bfc00000, data 0x3d3eaff/0x3f60000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567296000 unmapped: 73285632 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567296000 unmapped: 73285632 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1946fe000/0x0/0x1bfc00000, data 0x3d3eaff/0x3f60000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567296000 unmapped: 73285632 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.453927994s of 13.398723602s, submitted: 223
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5812475 data_alloc: 234881024 data_used: 23162880
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567296000 unmapped: 73285632 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567296000 unmapped: 73285632 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613e76000 session 0x55e6132e3a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567304192 unmapped: 73277440 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 handle_osd_map epochs [413,414], i have 413, src has [1,414]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 413 handle_osd_map epochs [414,414], i have 414, src has [1,414]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 414 ms_handle_reset con 0x55e6143a7400 session 0x55e6117f5e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 414 ms_handle_reset con 0x55e618cee800 session 0x55e61216ba40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 414 ms_handle_reset con 0x55e61194cc00 session 0x55e6153e74a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570646528 unmapped: 69935104 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 414 ms_handle_reset con 0x55e615e94400 session 0x55e61170c3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575864832 unmapped: 75849728 heap: 651714560 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 414 heartbeat osd_stat(store_statfs(0x19452b000/0x0/0x1bfc00000, data 0x3f0f758/0x4132000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 414 handle_osd_map epochs [414,415], i have 414, src has [1,415]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 415 ms_handle_reset con 0x55e615e94400 session 0x55e6125cc1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5992797 data_alloc: 251658240 data_used: 31043584
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575881216 unmapped: 75833344 heap: 651714560 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 415 handle_osd_map epochs [415,416], i have 415, src has [1,416]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e61194cc00 session 0x55e61186d860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576937984 unmapped: 74776576 heap: 651714560 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6143a7c00 session 0x55e611650f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576937984 unmapped: 74776576 heap: 651714560 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6143a6800 session 0x55e611927a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6170a8800 session 0x55e612723e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6170a8800 session 0x55e6141fab40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e61194cc00 session 0x55e61186dc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6143a6800 session 0x55e612722000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6143a7c00 session 0x55e6114d7680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576937984 unmapped: 74776576 heap: 651714560 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e615e94400 session 0x55e61216d4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e615e94400 session 0x55e6116a52c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e61194cc00 session 0x55e6171232c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6143a6800 session 0x55e61196f0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6143a7c00 session 0x55e6121a8000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570064896 unmapped: 85852160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 heartbeat osd_stat(store_statfs(0x192d18000/0x0/0x1bfc00000, data 0x572107a/0x5946000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6055984 data_alloc: 251658240 data_used: 31043584
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570064896 unmapped: 85852160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570064896 unmapped: 85852160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568016896 unmapped: 87900160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 416 handle_osd_map epochs [416,417], i have 416, src has [1,417]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.550370216s of 12.453164101s, submitted: 68
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568025088 unmapped: 87891968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e62d160800 session 0x55e612044f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e62d160800 session 0x55e613084960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e61194cc00 session 0x55e61216b2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e6143a6800 session 0x55e61196f2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568025088 unmapped: 87891968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e615e94400 session 0x55e613319c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e6143a7c00 session 0x55e61306a000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e6143a7c00 session 0x55e61256e1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e61194cc00 session 0x55e6116a5c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e6143a6800 session 0x55e6114d7e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x192d13000/0x0/0x1bfc00000, data 0x5722c2a/0x594b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6062002 data_alloc: 251658240 data_used: 31051776
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568025088 unmapped: 87891968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568025088 unmapped: 87891968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e615e94400 session 0x55e6140821e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568025088 unmapped: 87891968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568025088 unmapped: 87891968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x192d13000/0x0/0x1bfc00000, data 0x5722c2a/0x594b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [0,0,0,0,0,2])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572891136 unmapped: 83025920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e611824000 session 0x55e6114d6b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6138499 data_alloc: 251658240 data_used: 41205760
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573194240 unmapped: 82722816 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573194240 unmapped: 82722816 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573341696 unmapped: 82575360 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x192ced000/0x0/0x1bfc00000, data 0x5747c4d/0x5971000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574242816 unmapped: 81674240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.508294106s of 11.667946815s, submitted: 31
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574242816 unmapped: 81674240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6168755 data_alloc: 251658240 data_used: 42401792
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574242816 unmapped: 81674240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574291968 unmapped: 81625088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574291968 unmapped: 81625088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x192ce6000/0x0/0x1bfc00000, data 0x574ec4d/0x5978000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574291968 unmapped: 81625088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575840256 unmapped: 80076800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x191f18000/0x0/0x1bfc00000, data 0x6104c4d/0x632e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6253085 data_alloc: 251658240 data_used: 42610688
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575840256 unmapped: 80076800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575864832 unmapped: 80052224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575193088 unmapped: 80723968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x191e9a000/0x0/0x1bfc00000, data 0x618ac4d/0x63b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 578838528 unmapped: 77078528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 579026944 unmapped: 76890112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6333918 data_alloc: 251658240 data_used: 44892160
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 579026944 unmapped: 76890112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.080519676s of 12.241555214s, submitted: 105
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575832064 unmapped: 80084992 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x1918ab000/0x0/0x1bfc00000, data 0x6779c4d/0x69a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575848448 unmapped: 80068608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x1918ab000/0x0/0x1bfc00000, data 0x6779c4d/0x69a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e6119cac00 session 0x55e611927680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575848448 unmapped: 80068608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576946176 unmapped: 78970880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e616d5cc00 session 0x55e6112e6780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6102700 data_alloc: 251658240 data_used: 34942976
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576946176 unmapped: 78970880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576946176 unmapped: 78970880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x192c36000/0x0/0x1bfc00000, data 0x5327c4d/0x5551000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576946176 unmapped: 78970880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576954368 unmapped: 78962688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x192cfa000/0x0/0x1bfc00000, data 0x532ac4d/0x5554000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e61d732400 session 0x55e61256f0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e6133ea000 session 0x55e61168c1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576954368 unmapped: 78962688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e62101f400 session 0x55e6116a43c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5895507 data_alloc: 234881024 data_used: 27729920
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576970752 unmapped: 78946304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x1940e2000/0x0/0x1bfc00000, data 0x3efdbdb/0x4125000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576970752 unmapped: 78946304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576970752 unmapped: 78946304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576978944 unmapped: 78938112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576978944 unmapped: 78938112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5895507 data_alloc: 234881024 data_used: 27729920
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576978944 unmapped: 78938112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.050031662s of 14.386358261s, submitted: 69
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 577060864 unmapped: 78856192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x194129000/0x0/0x1bfc00000, data 0x3efdbdb/0x4125000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 577060864 unmapped: 78856192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x194129000/0x0/0x1bfc00000, data 0x3efdbdb/0x4125000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 577069056 unmapped: 78848000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 577077248 unmapped: 78839808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5895155 data_alloc: 234881024 data_used: 27729920
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 577536000 unmapped: 78381056 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x194129000/0x0/0x1bfc00000, data 0x3efdbdb/0x4125000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 577536000 unmapped: 78381056 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x194129000/0x0/0x1bfc00000, data 0x3efdbdb/0x4125000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e61432fc00 session 0x55e6153e6780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 417 handle_osd_map epochs [417,418], i have 417, src has [1,418]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 418 ms_handle_reset con 0x55e613e77800 session 0x55e61216af00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 418 ms_handle_reset con 0x55e6133b5400 session 0x55e6121a81e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 418 ms_handle_reset con 0x55e6170a9000 session 0x55e617123a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 598065152 unmapped: 57851904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 418 ms_handle_reset con 0x55e6119cac00 session 0x55e6171234a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 418 heartbeat osd_stat(store_statfs(0x194124000/0x0/0x1bfc00000, data 0x3eff844/0x4129000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 589946880 unmapped: 65970176 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 418 handle_osd_map epochs [418,419], i have 418, src has [1,419]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 419 ms_handle_reset con 0x55e6119cac00 session 0x55e611638780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 589946880 unmapped: 65970176 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 419 handle_osd_map epochs [419,420], i have 419, src has [1,420]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 420 ms_handle_reset con 0x55e6133b5400 session 0x55e6112e7860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6184245 data_alloc: 251658240 data_used: 39505920
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590020608 unmapped: 65896448 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.693154335s of 10.000839233s, submitted: 57
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 420 ms_handle_reset con 0x55e613e77800 session 0x55e61435e3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 420 ms_handle_reset con 0x55e61432fc00 session 0x55e612723680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 420 ms_handle_reset con 0x55e6170a9000 session 0x55e6140823c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590036992 unmapped: 65880064 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 420 heartbeat osd_stat(store_statfs(0x192067000/0x0/0x1bfc00000, data 0x5fb9166/0x61e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 420 handle_osd_map epochs [420,421], i have 420, src has [1,421]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 421 ms_handle_reset con 0x55e6119cac00 session 0x55e6125cc780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590086144 unmapped: 65830912 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590086144 unmapped: 65830912 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 421 ms_handle_reset con 0x55e61194cc00 session 0x55e61133a5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 421 ms_handle_reset con 0x55e6143a6800 session 0x55e6116501e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 421 ms_handle_reset con 0x55e6143a7000 session 0x55e61306b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 592330752 unmapped: 63586304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 421 heartbeat osd_stat(store_statfs(0x1941b2000/0x0/0x1bfc00000, data 0x3938dae/0x3b63000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5893292 data_alloc: 251658240 data_used: 33304576
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 592330752 unmapped: 63586304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 592330752 unmapped: 63586304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 421 handle_osd_map epochs [421,422], i have 421, src has [1,422]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590045184 unmapped: 65871872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590045184 unmapped: 65871872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 422 heartbeat osd_stat(store_statfs(0x19470c000/0x0/0x1bfc00000, data 0x39168e6/0x3b41000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 422 ms_handle_reset con 0x55e62d15bc00 session 0x55e6121a9e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 422 ms_handle_reset con 0x55e61194cc00 session 0x55e61196e780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 422 ms_handle_reset con 0x55e613c29800 session 0x55e6117f4000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 422 ms_handle_reset con 0x55e6119cac00 session 0x55e6115caf00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 422 handle_osd_map epochs [422,423], i have 422, src has [1,423]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590249984 unmapped: 65667072 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 423 ms_handle_reset con 0x55e6143a6800 session 0x55e6141fa1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 423 ms_handle_reset con 0x55e6143a7000 session 0x55e611338d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 423 ms_handle_reset con 0x55e61b1fe000 session 0x55e6116a5c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 423 heartbeat osd_stat(store_statfs(0x19470c000/0x0/0x1bfc00000, data 0x39168e6/0x3b41000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5731086 data_alloc: 234881024 data_used: 16670720
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 580632576 unmapped: 75284480 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.230168343s of 10.558722496s, submitted: 112
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 423 ms_handle_reset con 0x55e613c77000 session 0x55e6141fbe00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 423 ms_handle_reset con 0x55e6178b0400 session 0x55e612045860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 580648960 unmapped: 75268096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 423 ms_handle_reset con 0x55e61194cc00 session 0x55e61196f2c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 423 heartbeat osd_stat(store_statfs(0x19527b000/0x0/0x1bfc00000, data 0x2da65f5/0x2fd3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 423 handle_osd_map epochs [423,424], i have 423, src has [1,424]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5454505 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 424 heartbeat osd_stat(store_statfs(0x1963d1000/0x0/0x1bfc00000, data 0x18560ce/0x1a81000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 424 ms_handle_reset con 0x55e6133ee800 session 0x55e61216d680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 424 handle_osd_map epochs [424,425], i have 424, src has [1,425]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61c82e400 session 0x55e6143a8d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6178af800 session 0x55e6143a8780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e62101f000 session 0x55e611926d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566353920 unmapped: 89563136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x1967a4000/0x0/0x1bfc00000, data 0x187bc30/0x1aa9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x1967a4000/0x0/0x1bfc00000, data 0x187bc30/0x1aa9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5462528 data_alloc: 218103808 data_used: 1675264
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566353920 unmapped: 89563136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5494208 data_alloc: 218103808 data_used: 6172672
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x1967a4000/0x0/0x1bfc00000, data 0x187bc30/0x1aa9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5494688 data_alloc: 218103808 data_used: 6184960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566370304 unmapped: 89546752 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.329175949s of 19.560520172s, submitted: 87
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x1967a4000/0x0/0x1bfc00000, data 0x187bc30/0x1aa9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [0,0,0,4,3])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572686336 unmapped: 83230720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569131008 unmapped: 86786048 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195f2d000/0x0/0x1bfc00000, data 0x20f3c30/0x2321000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195f0b000/0x0/0x1bfc00000, data 0x2115c30/0x2343000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569237504 unmapped: 86679552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569237504 unmapped: 86679552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195eff000/0x0/0x1bfc00000, data 0x2121c30/0x234f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5576460 data_alloc: 218103808 data_used: 6492160
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569237504 unmapped: 86679552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569237504 unmapped: 86679552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195eff000/0x0/0x1bfc00000, data 0x2121c30/0x234f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569237504 unmapped: 86679552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574204 data_alloc: 218103808 data_used: 6496256
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195efc000/0x0/0x1bfc00000, data 0x2124c30/0x2352000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 77K writes, 327K keys, 77K commit groups, 1.0 writes per commit group, ingest: 0.33 GB, 0.05 MB/s#012Cumulative WAL: 77K writes, 26K syncs, 2.91 writes per sync, written: 0.33 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6778 writes, 27K keys, 6778 commit groups, 1.0 writes per commit group, ingest: 28.97 MB, 0.05 MB/s#012Interval WAL: 6778 writes, 2628 syncs, 2.58 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574204 data_alloc: 218103808 data_used: 6496256
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.822758675s of 16.085189819s, submitted: 84
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61dc78400 session 0x55e6127234a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613c77800 session 0x55e6143485a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570294272 unmapped: 85622784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570294272 unmapped: 85622784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19553f000/0x0/0x1bfc00000, data 0x2ae0c92/0x2d0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570294272 unmapped: 85622784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19553f000/0x0/0x1bfc00000, data 0x2ae0c92/0x2d0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5649897 data_alloc: 218103808 data_used: 6496256
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570294272 unmapped: 85622784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570294272 unmapped: 85622784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570294272 unmapped: 85622784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19553f000/0x0/0x1bfc00000, data 0x2ae0c92/0x2d0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570294272 unmapped: 85622784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570294272 unmapped: 85622784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5649897 data_alloc: 218103808 data_used: 6496256
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570245120 unmapped: 85671936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61b383c00 session 0x55e6121a81e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19551b000/0x0/0x1bfc00000, data 0x2b04c92/0x2d33000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5717114 data_alloc: 234881024 data_used: 15491072
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19551b000/0x0/0x1bfc00000, data 0x2b04c92/0x2d33000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.217922211s of 17.367380142s, submitted: 38
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5717486 data_alloc: 234881024 data_used: 15491072
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195519000/0x0/0x1bfc00000, data 0x2b05c92/0x2d34000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570728448 unmapped: 85188608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574160896 unmapped: 81756160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x194ed4000/0x0/0x1bfc00000, data 0x3143c92/0x3372000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x194ed4000/0x0/0x1bfc00000, data 0x3143c92/0x3372000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5787944 data_alloc: 234881024 data_used: 15544320
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574160896 unmapped: 81756160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573489152 unmapped: 82427904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573497344 unmapped: 82419712 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573497344 unmapped: 82419712 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.052712440s of 10.270314217s, submitted: 69
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573497344 unmapped: 82419712 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x194c7c000/0x0/0x1bfc00000, data 0x33a3c92/0x35d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5806950 data_alloc: 234881024 data_used: 16617472
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573497344 unmapped: 82419712 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x194c7c000/0x0/0x1bfc00000, data 0x33a3c92/0x35d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573497344 unmapped: 82419712 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573497344 unmapped: 82419712 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573497344 unmapped: 82419712 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61432d000 session 0x55e6116a4780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613ae1c00 session 0x55e6112e6780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573464576 unmapped: 82452480 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5594594 data_alloc: 218103808 data_used: 6606848
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572858368 unmapped: 83058688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195ed6000/0x0/0x1bfc00000, data 0x2149c30/0x2377000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572858368 unmapped: 83058688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572858368 unmapped: 83058688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e618348400 session 0x55e614082780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572866560 unmapped: 83050496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195efb000/0x0/0x1bfc00000, data 0x2125c30/0x2353000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572866560 unmapped: 83050496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.619760513s of 11.074169159s, submitted: 54
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5587286 data_alloc: 218103808 data_used: 6496256
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572866560 unmapped: 83050496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572866560 unmapped: 83050496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e618a28c00 session 0x55e6143492c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6170a8800 session 0x55e61216a780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572923904 unmapped: 82993152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e618349000 session 0x55e61435e3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565714944 unmapped: 90202112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565764096 unmapped: 90152960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565764096 unmapped: 90152960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565788672 unmapped: 90128384 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565788672 unmapped: 90128384 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565788672 unmapped: 90128384 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565788672 unmapped: 90128384 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565788672 unmapped: 90128384 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565796864 unmapped: 90120192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565796864 unmapped: 90120192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565796864 unmapped: 90120192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565796864 unmapped: 90120192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565805056 unmapped: 90112000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565805056 unmapped: 90112000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565821440 unmapped: 90095616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565821440 unmapped: 90095616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565821440 unmapped: 90095616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565821440 unmapped: 90095616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565854208 unmapped: 90062848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565854208 unmapped: 90062848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 109.116950989s of 109.828414917s, submitted: 277
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565854208 unmapped: 90062848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e618349800 session 0x55e613318960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e617e23c00 session 0x55e613078000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613448800 session 0x55e6132e2780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613448800 session 0x55e61256fc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6170a8800 session 0x55e61216a1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5494573 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 89686016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 89686016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 89686016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 89686016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19668e000/0x0/0x1bfc00000, data 0x1994bab/0x1bc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 89686016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5494573 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19668e000/0x0/0x1bfc00000, data 0x1994bab/0x1bc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 89686016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566239232 unmapped: 89677824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6133b5800 session 0x55e6120452c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566239232 unmapped: 89677824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e616d5c000 session 0x55e6116a54a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e615e95000 session 0x55e6131a4d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6133b5800 session 0x55e61256f0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565190656 unmapped: 90726400 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565198848 unmapped: 90718208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19668d000/0x0/0x1bfc00000, data 0x1994bbb/0x1bc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5528735 data_alloc: 218103808 data_used: 6242304
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19668d000/0x0/0x1bfc00000, data 0x1994bbb/0x1bc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19668d000/0x0/0x1bfc00000, data 0x1994bbb/0x1bc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5528735 data_alloc: 218103808 data_used: 6242304
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19668d000/0x0/0x1bfc00000, data 0x1994bbb/0x1bc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.850133896s of 20.080587387s, submitted: 28
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565592064 unmapped: 90324992 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5604607 data_alloc: 218103808 data_used: 6541312
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567631872 unmapped: 88285184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195c83000/0x0/0x1bfc00000, data 0x239dbbb/0x25ca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567853056 unmapped: 88064000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567853056 unmapped: 88064000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567853056 unmapped: 88064000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567853056 unmapped: 88064000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195c40000/0x0/0x1bfc00000, data 0x23e0bbb/0x260d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5619547 data_alloc: 218103808 data_used: 6529024
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567853056 unmapped: 88064000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195c40000/0x0/0x1bfc00000, data 0x23e0bbb/0x260d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5613859 data_alloc: 218103808 data_used: 6529024
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195c1f000/0x0/0x1bfc00000, data 0x2402bbb/0x262f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195c1f000/0x0/0x1bfc00000, data 0x2402bbb/0x262f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.083731651s of 13.471039772s, submitted: 88
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613448800 session 0x55e6141fb4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e616d5c000 session 0x55e6113392c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6133ea000 session 0x55e611650d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567582720 unmapped: 88334336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567582720 unmapped: 88334336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567607296 unmapped: 88309760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e614331400 session 0x55e6118e83c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567623680 unmapped: 88293376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567623680 unmapped: 88293376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567623680 unmapped: 88293376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61c82c000 session 0x55e61653b4a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61d732400 session 0x55e61435f680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613ae1c00 session 0x55e6171230e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61dc79000 session 0x55e6132e3c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 47.185108185s of 47.276470184s, submitted: 30
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6170a8800 session 0x55e611926000
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5519113 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613ae1c00 session 0x55e6132e3a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61c82c000 session 0x55e61196e780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [0,0,0,0,1])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61d732400 session 0x55e6120454a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61dc79000 session 0x55e6121a9a40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196495000/0x0/0x1bfc00000, data 0x1b8cbbb/0x1db9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5519113 data_alloc: 218103808 data_used: 1638400
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196495000/0x0/0x1bfc00000, data 0x1b8cbbb/0x1db9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196495000/0x0/0x1bfc00000, data 0x1b8cbbb/0x1db9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613c77c00 session 0x55e611339680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613ae1c00 session 0x55e61216af00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61c82c000 session 0x55e6143494a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61d732400 session 0x55e6140823c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196470000/0x0/0x1bfc00000, data 0x1bb0bca/0x1dde000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5581896 data_alloc: 218103808 data_used: 9551872
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196470000/0x0/0x1bfc00000, data 0x1bb0bca/0x1dde000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5581896 data_alloc: 218103808 data_used: 9551872
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196470000/0x0/0x1bfc00000, data 0x1bb0bca/0x1dde000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.383367538s of 19.576660156s, submitted: 29
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5609840 data_alloc: 218103808 data_used: 9568256
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568942592 unmapped: 86974464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196142000/0x0/0x1bfc00000, data 0x1edebca/0x210c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196142000/0x0/0x1bfc00000, data 0x1edebca/0x210c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5621382 data_alloc: 218103808 data_used: 9846784
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196132000/0x0/0x1bfc00000, data 0x1eeebca/0x211c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196132000/0x0/0x1bfc00000, data 0x1eeebca/0x211c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5621398 data_alloc: 218103808 data_used: 9846784
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196132000/0x0/0x1bfc00000, data 0x1eeebca/0x211c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568098816 unmapped: 87818240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.063075066s of 13.349035263s, submitted: 47
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568098816 unmapped: 87818240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568098816 unmapped: 87818240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61dc79c00 session 0x55e6141fb860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5635212 data_alloc: 218103808 data_used: 9846784
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568098816 unmapped: 87818240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 425 handle_osd_map epochs [425,426], i have 425, src has [1,426]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 426 ms_handle_reset con 0x55e616d5c000 session 0x55e6143a8d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568107008 unmapped: 87810048 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 426 heartbeat osd_stat(store_statfs(0x195f30000/0x0/0x1bfc00000, data 0x20efdca/0x231e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568107008 unmapped: 87810048 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 426 ms_handle_reset con 0x55e613ae1c00 session 0x55e6127234a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568107008 unmapped: 87810048 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 426 heartbeat osd_stat(store_statfs(0x195f2c000/0x0/0x1bfc00000, data 0x20f1a23/0x2321000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 426 heartbeat osd_stat(store_statfs(0x195f2c000/0x0/0x1bfc00000, data 0x20f1a23/0x2321000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 426 handle_osd_map epochs [426,427], i have 426, src has [1,427]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 426 handle_osd_map epochs [427,427], i have 427, src has [1,427]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 427 ms_handle_reset con 0x55e613b85400 session 0x55e6143492c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 427 ms_handle_reset con 0x55e61c82c000 session 0x55e612722780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 427 ms_handle_reset con 0x55e616d5c000 session 0x55e61196e780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568115200 unmapped: 87801856 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 427 heartbeat osd_stat(store_statfs(0x195f2c000/0x0/0x1bfc00000, data 0x20f1a23/0x2321000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [0,0,0,0,0,0,3])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 427 heartbeat osd_stat(store_statfs(0x1952e4000/0x0/0x1bfc00000, data 0x2d3768c/0x2f69000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 427 ms_handle_reset con 0x55e61d732400 session 0x55e611f14b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5762274 data_alloc: 234881024 data_used: 13156352
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567771136 unmapped: 88145920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 427 handle_osd_map epochs [427,428], i have 427, src has [1,428]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 427 handle_osd_map epochs [428,428], i have 428, src has [1,428]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 428 ms_handle_reset con 0x55e613ae1c00 session 0x55e6153e7860
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567771136 unmapped: 88145920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 428 handle_osd_map epochs [428,429], i have 428, src has [1,429]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 429 ms_handle_reset con 0x55e614c33c00 session 0x55e6140830e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567779328 unmapped: 88137728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 429 ms_handle_reset con 0x55e61194d000 session 0x55e614083e00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 429 ms_handle_reset con 0x55e613448400 session 0x55e614082d20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567779328 unmapped: 88137728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 429 ms_handle_reset con 0x55e613e76000 session 0x55e614082f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 429 heartbeat osd_stat(store_statfs(0x1952dd000/0x0/0x1bfc00000, data 0x2d3afae/0x2f6f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567869440 unmapped: 88047616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 429 ms_handle_reset con 0x55e613448400 session 0x55e61435f680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 429 ms_handle_reset con 0x55e61194d000 session 0x55e61196dc20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5770122 data_alloc: 234881024 data_used: 13168640
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567869440 unmapped: 88047616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 429 handle_osd_map epochs [429,430], i have 429, src has [1,430]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.447452545s of 13.320432663s, submitted: 50
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952db000/0x0/0x1bfc00000, data 0x2d3caed/0x2f72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e61dc78800 session 0x55e610dd34a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e611fa0c00 session 0x55e610dd2780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e611fa1000 session 0x55e6121a8780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e61194d000 session 0x55e6121a8f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e611fa1000 session 0x55e6132e3c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e611fa0c00 session 0x55e61133b680
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e613448400 session 0x55e61133b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e61dc78800 session 0x55e61653a3c0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e61194d000 session 0x55e61653af00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952db000/0x0/0x1bfc00000, data 0x2d3caed/0x2f72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5758008 data_alloc: 234881024 data_used: 13168640
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562118656 unmapped: 93798400 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562126848 unmapped: 93790208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952db000/0x0/0x1bfc00000, data 0x2d3cafd/0x2f73000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5758168 data_alloc: 234881024 data_used: 13172736
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562126848 unmapped: 93790208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562126848 unmapped: 93790208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.108816147s of 11.132290840s, submitted: 17
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562126848 unmapped: 93790208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e62d164400 session 0x55e61653a780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 93642752 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 93642752 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5779177 data_alloc: 234881024 data_used: 15343616
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952b7000/0x0/0x1bfc00000, data 0x2d60afd/0x2f97000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952b7000/0x0/0x1bfc00000, data 0x2d60afd/0x2f97000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5807017 data_alloc: 234881024 data_used: 19283968
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952b7000/0x0/0x1bfc00000, data 0x2d60afd/0x2f97000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952b7000/0x0/0x1bfc00000, data 0x2d60afd/0x2f97000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5807017 data_alloc: 234881024 data_used: 19283968
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.592308044s of 13.788862228s, submitted: 9
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 91815936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 91815936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x19526c000/0x0/0x1bfc00000, data 0x2dabafd/0x2fe2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 91815936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 91815936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x19526c000/0x0/0x1bfc00000, data 0x2dabafd/0x2fe2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5830915 data_alloc: 234881024 data_used: 21471232
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 91815936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 91815936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5839683 data_alloc: 234881024 data_used: 21471232
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951fe000/0x0/0x1bfc00000, data 0x2e19afd/0x3050000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.524675369s of 11.724849701s, submitted: 9
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5839959 data_alloc: 234881024 data_used: 21471232
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951fd000/0x0/0x1bfc00000, data 0x2e1aafd/0x3051000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5839659 data_alloc: 234881024 data_used: 21471232
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5839659 data_alloc: 234881024 data_used: 21471232
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5840939 data_alloc: 234881024 data_used: 21725184
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566075392 unmapped: 89841664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e6133efc00 session 0x55e61256e5a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566075392 unmapped: 89841664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.332700729s of 19.351327896s, submitted: 12
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566075392 unmapped: 89841664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566075392 unmapped: 89841664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566075392 unmapped: 89841664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e618349400 session 0x55e61216c960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e614329c00 session 0x55e6120454a0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5843485 data_alloc: 234881024 data_used: 23822336
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566083584 unmapped: 89833472 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952b8000/0x0/0x1bfc00000, data 0x2d60aed/0x2f96000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566083584 unmapped: 89833472 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566099968 unmapped: 89817088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e61194d000 session 0x55e61435e1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952dc000/0x0/0x1bfc00000, data 0x2d3caed/0x2f72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566099968 unmapped: 89817088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952dc000/0x0/0x1bfc00000, data 0x2d3caed/0x2f72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5823139 data_alloc: 234881024 data_used: 23355392
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e62d15a800 session 0x55e6117f4b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e6170a9800 session 0x55e61168cd20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952dc000/0x0/0x1bfc00000, data 0x2d3caed/0x2f72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5823139 data_alloc: 234881024 data_used: 23355392
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 handle_osd_map epochs [430,431], i have 430, src has [1,431]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.423069000s of 13.489458084s, submitted: 23
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952dc000/0x0/0x1bfc00000, data 0x2d3caed/0x2f72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 430 handle_osd_map epochs [431,431], i have 431, src has [1,431]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566140928 unmapped: 89776128 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566140928 unmapped: 89776128 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 431 heartbeat osd_stat(store_statfs(0x1952d8000/0x0/0x1bfc00000, data 0x2d3e79a/0x2f75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566140928 unmapped: 89776128 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566140928 unmapped: 89776128 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5826440 data_alloc: 234881024 data_used: 23359488
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566149120 unmapped: 89767936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563675136 unmapped: 92241920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 431 ms_handle_reset con 0x55e6143a6400 session 0x55e61435e1e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 431 heartbeat osd_stat(store_statfs(0x195f1e000/0x0/0x1bfc00000, data 0x20fa78a/0x2330000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 431 heartbeat osd_stat(store_statfs(0x195f1e000/0x0/0x1bfc00000, data 0x20fa78a/0x2330000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 431 ms_handle_reset con 0x55e614c33000 session 0x55e61133b0e0
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563675136 unmapped: 92241920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563675136 unmapped: 92241920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 431 handle_osd_map epochs [431,432], i have 431, src has [1,432]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 432 ms_handle_reset con 0x55e61194d000 session 0x55e6132e3c20
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563683328 unmapped: 92233728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5686406 data_alloc: 234881024 data_used: 15290368
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563683328 unmapped: 92233728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 432 handle_osd_map epochs [432,433], i have 432, src has [1,433]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.539867401s of 10.603461266s, submitted: 27
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 433 heartbeat osd_stat(store_statfs(0x195f1a000/0x0/0x1bfc00000, data 0x20fc453/0x2333000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563691520 unmapped: 92225536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 433 heartbeat osd_stat(store_statfs(0x195f17000/0x0/0x1bfc00000, data 0x20fdfae/0x2336000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563691520 unmapped: 92225536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 433 ms_handle_reset con 0x55e61dc79000 session 0x55e61196c780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 433 ms_handle_reset con 0x55e61432cc00 session 0x55e617122b40
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 92217344 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 433 ms_handle_reset con 0x55e610e23000 session 0x55e6121a8f00
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 433 heartbeat osd_stat(store_statfs(0x196a1c000/0x0/0x1bfc00000, data 0x15fbf8f/0x1832000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5528311 data_alloc: 218103808 data_used: 3780608
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 433 handle_osd_map epochs [433,434], i have 433, src has [1,434]
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 ms_handle_reset con 0x55e613add000 session 0x55e617122780
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 ms_handle_reset con 0x55e610e23000 session 0x55e6121a8960
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196a18000/0x0/0x1bfc00000, data 0x15fdace/0x1835000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5512982 data_alloc: 218103808 data_used: 1683456
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5512982 data_alloc: 218103808 data_used: 1683456
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560611328 unmapped: 95305728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560611328 unmapped: 95305728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560619520 unmapped: 95297536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: do_command 'config diff' '{prefix=config diff}'
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: do_command 'config show' '{prefix=config show}'
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: do_command 'counter dump' '{prefix=counter dump}'
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: do_command 'counter schema' '{prefix=counter schema}'
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560652288 unmapped: 95264768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560660480 unmapped: 95256576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:20:23 np0005603623 ceph-osd[79732]: do_command 'log dump' '{prefix=log dump}'
Jan 31 04:20:24 np0005603623 nova_compute[226235]: 2026-01-31 09:20:24.325 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:24.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:24 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 31 04:20:24 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1215613742' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 31 04:20:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:24.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Jan 31 04:20:25 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3743708388' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 31 04:20:25 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Jan 31 04:20:25 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/809628699' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 31 04:20:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Jan 31 04:20:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2212239015' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 31 04:20:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:26.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Jan 31 04:20:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3838157604' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 31 04:20:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:20:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:26.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:20:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Jan 31 04:20:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1440770780' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 31 04:20:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Jan 31 04:20:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3458416767' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3900881572' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/156361118' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1860882146' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2049797207' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2360769313' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Jan 31 04:20:27 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3732784118' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 31 04:20:28 np0005603623 nova_compute[226235]: 2026-01-31 09:20:28.157 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Jan 31 04:20:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3733659211' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 31 04:20:28 np0005603623 systemd[1]: Starting Hostname Service...
Jan 31 04:20:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Jan 31 04:20:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4291353756' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 31 04:20:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:28.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:28 np0005603623 systemd[1]: Started Hostname Service.
Jan 31 04:20:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:28.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Jan 31 04:20:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1458730007' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 31 04:20:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Jan 31 04:20:28 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1027628581' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 31 04:20:29 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Jan 31 04:20:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/105075060' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 31 04:20:29 np0005603623 nova_compute[226235]: 2026-01-31 09:20:29.327 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 31 04:20:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 31 04:20:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 31 04:20:29 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 31 04:20:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:20:30.173 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:20:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:20:30.174 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:20:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:20:30.174 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:20:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:30.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:30.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:30 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Jan 31 04:20:30 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/711825567' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 31 04:20:31 np0005603623 nova_compute[226235]: 2026-01-31 09:20:31.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:31 np0005603623 nova_compute[226235]: 2026-01-31 09:20:31.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:20:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0) v1
Jan 31 04:20:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2589931041' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 31 04:20:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 31 04:20:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/526710554' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 31 04:20:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Jan 31 04:20:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/889626092' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 31 04:20:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:20:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:32.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:20:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:32.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 31 04:20:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 31 04:20:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 31 04:20:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 31 04:20:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 31 04:20:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 31 04:20:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 31 04:20:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 31 04:20:33 np0005603623 nova_compute[226235]: 2026-01-31 09:20:33.160 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0) v1
Jan 31 04:20:33 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/843923156' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 31 04:20:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0) v1
Jan 31 04:20:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/855631414' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 31 04:20:34 np0005603623 nova_compute[226235]: 2026-01-31 09:20:34.329 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:34.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0) v1
Jan 31 04:20:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2060324192' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 31 04:20:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:34.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0) v1
Jan 31 04:20:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2069470758' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 31 04:20:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0) v1
Jan 31 04:20:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/725891389' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 31 04:20:36 np0005603623 nova_compute[226235]: 2026-01-31 09:20:36.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0) v1
Jan 31 04:20:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/941510922' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 31 04:20:36 np0005603623 nova_compute[226235]: 2026-01-31 09:20:36.246 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:20:36 np0005603623 nova_compute[226235]: 2026-01-31 09:20:36.246 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:20:36 np0005603623 nova_compute[226235]: 2026-01-31 09:20:36.247 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:20:36 np0005603623 nova_compute[226235]: 2026-01-31 09:20:36.247 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:20:36 np0005603623 nova_compute[226235]: 2026-01-31 09:20:36.247 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:20:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:20:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:36.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:20:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:36.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0) v1
Jan 31 04:20:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2719580731' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 31 04:20:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:20:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3222069048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:20:36 np0005603623 nova_compute[226235]: 2026-01-31 09:20:36.702 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:20:36 np0005603623 nova_compute[226235]: 2026-01-31 09:20:36.867 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:20:36 np0005603623 nova_compute[226235]: 2026-01-31 09:20:36.870 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3893MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:20:36 np0005603623 nova_compute[226235]: 2026-01-31 09:20:36.871 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:20:36 np0005603623 nova_compute[226235]: 2026-01-31 09:20:36.871 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:20:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:37 np0005603623 nova_compute[226235]: 2026-01-31 09:20:37.270 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:20:37 np0005603623 nova_compute[226235]: 2026-01-31 09:20:37.270 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:20:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0) v1
Jan 31 04:20:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1492412004' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 31 04:20:37 np0005603623 nova_compute[226235]: 2026-01-31 09:20:37.708 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:20:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:20:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3743242811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:20:38 np0005603623 nova_compute[226235]: 2026-01-31 09:20:38.154 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:20:38 np0005603623 nova_compute[226235]: 2026-01-31 09:20:38.161 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:38 np0005603623 nova_compute[226235]: 2026-01-31 09:20:38.170 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:20:38 np0005603623 nova_compute[226235]: 2026-01-31 09:20:38.241 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:20:38 np0005603623 nova_compute[226235]: 2026-01-31 09:20:38.242 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:20:38 np0005603623 nova_compute[226235]: 2026-01-31 09:20:38.243 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.371s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:20:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:38.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0) v1
Jan 31 04:20:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/394698352' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 31 04:20:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:38.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0) v1
Jan 31 04:20:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/407307699' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 31 04:20:39 np0005603623 nova_compute[226235]: 2026-01-31 09:20:39.243 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:39 np0005603623 nova_compute[226235]: 2026-01-31 09:20:39.244 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:20:39 np0005603623 nova_compute[226235]: 2026-01-31 09:20:39.245 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:20:39 np0005603623 nova_compute[226235]: 2026-01-31 09:20:39.269 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:20:39 np0005603623 nova_compute[226235]: 2026-01-31 09:20:39.269 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:39 np0005603623 nova_compute[226235]: 2026-01-31 09:20:39.331 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0) v1
Jan 31 04:20:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1553580151' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 31 04:20:40 np0005603623 ovs-appctl[339761]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 31 04:20:40 np0005603623 ovs-appctl[339766]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 31 04:20:40 np0005603623 ovs-appctl[339787]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 31 04:20:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:40.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0) v1
Jan 31 04:20:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2884718864' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 31 04:20:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:40.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:41 np0005603623 nova_compute[226235]: 2026-01-31 09:20:41.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:41 np0005603623 nova_compute[226235]: 2026-01-31 09:20:41.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 31 04:20:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2080527757' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 31 04:20:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0) v1
Jan 31 04:20:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3203779151' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 31 04:20:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:20:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:42.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:20:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0) v1
Jan 31 04:20:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3876395925' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 31 04:20:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:20:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:42.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:20:43 np0005603623 nova_compute[226235]: 2026-01-31 09:20:43.164 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3992823637' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #199. Immutable memtables: 0.
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.300191) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 127] Flushing memtable with next log file: 199
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851243300385, "job": 127, "event": "flush_started", "num_memtables": 1, "num_entries": 1064, "num_deletes": 255, "total_data_size": 1616730, "memory_usage": 1644464, "flush_reason": "Manual Compaction"}
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 127] Level-0 flush table #200: started
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851243306899, "cf_name": "default", "job": 127, "event": "table_file_creation", "file_number": 200, "file_size": 1065671, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 95613, "largest_seqno": 96672, "table_properties": {"data_size": 1060077, "index_size": 2604, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 16513, "raw_average_key_size": 22, "raw_value_size": 1047493, "raw_average_value_size": 1409, "num_data_blocks": 113, "num_entries": 743, "num_filter_entries": 743, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851198, "oldest_key_time": 1769851198, "file_creation_time": 1769851243, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 200, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 127] Flush lasted 6779 microseconds, and 3560 cpu microseconds.
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.306983) [db/flush_job.cc:967] [default] [JOB 127] Level-0 flush table #200: 1065671 bytes OK
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.307016) [db/memtable_list.cc:519] [default] Level-0 commit table #200 started
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.308535) [db/memtable_list.cc:722] [default] Level-0 commit table #200: memtable #1 done
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.308551) EVENT_LOG_v1 {"time_micros": 1769851243308545, "job": 127, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.308570) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 127] Try to delete WAL files size 1610551, prev total WAL file size 1610551, number of live WAL files 2.
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000196.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.309121) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373636' seq:72057594037927935, type:22 .. '6C6F676D0034303137' seq:0, type:0; will stop at (end)
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 128] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 127 Base level 0, inputs: [200(1040KB)], [198(10MB)]
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851243309187, "job": 128, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [200], "files_L6": [198], "score": -1, "input_data_size": 12551568, "oldest_snapshot_seqno": -1}
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 128] Generated table #201: 11528 keys, 12426936 bytes, temperature: kUnknown
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851243370776, "cf_name": "default", "job": 128, "event": "table_file_creation", "file_number": 201, "file_size": 12426936, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12355680, "index_size": 41319, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28869, "raw_key_size": 306560, "raw_average_key_size": 26, "raw_value_size": 12157746, "raw_average_value_size": 1054, "num_data_blocks": 1557, "num_entries": 11528, "num_filter_entries": 11528, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769851243, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 201, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.371013) [db/compaction/compaction_job.cc:1663] [default] [JOB 128] Compacted 1@0 + 1@6 files to L6 => 12426936 bytes
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.372155) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.5 rd, 201.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 11.0 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(23.4) write-amplify(11.7) OK, records in: 12047, records dropped: 519 output_compression: NoCompression
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.372172) EVENT_LOG_v1 {"time_micros": 1769851243372163, "job": 128, "event": "compaction_finished", "compaction_time_micros": 61671, "compaction_time_cpu_micros": 33451, "output_level": 6, "num_output_files": 1, "total_output_size": 12426936, "num_input_records": 12047, "num_output_records": 11528, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000200.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851243372424, "job": 128, "event": "table_file_deletion", "file_number": 200}
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000198.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851243373716, "job": 128, "event": "table_file_deletion", "file_number": 198}
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.308979) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.373819) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.373829) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.373831) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.373833) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:20:43.373834) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0) v1
Jan 31 04:20:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1540694020' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 31 04:20:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0) v1
Jan 31 04:20:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3620290963' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 31 04:20:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0) v1
Jan 31 04:20:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1357864426' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 31 04:20:44 np0005603623 nova_compute[226235]: 2026-01-31 09:20:44.333 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:20:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:44.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0) v1
Jan 31 04:20:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2222041201' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 31 04:20:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:20:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:44.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:20:45 np0005603623 nova_compute[226235]: 2026-01-31 09:20:45.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:20:45 np0005603623 nova_compute[226235]: 2026-01-31 09:20:45.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:20:45 np0005603623 nova_compute[226235]: 2026-01-31 09:20:45.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:20:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0) v1
Jan 31 04:20:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1834683808' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 31 04:20:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:46.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0) v1
Jan 31 04:20:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1590406639' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 31 04:20:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:46.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0) v1
Jan 31 04:20:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2638799518' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 31 04:20:48 np0005603623 nova_compute[226235]: 2026-01-31 09:20:48.165 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:20:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:20:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:48.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:20:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0) v1
Jan 31 04:20:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/555650559' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 31 04:20:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:20:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:48.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:20:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0) v1
Jan 31 04:20:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4014528098' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 31 04:20:49 np0005603623 nova_compute[226235]: 2026-01-31 09:20:49.335 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:20:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 31 04:20:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2851454912' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 31 04:20:50 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0) v1
Jan 31 04:20:50 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4113132484' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 31 04:20:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:50.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:50.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:51 np0005603623 systemd[1]: Starting Time & Date Service...
Jan 31 04:20:51 np0005603623 podman[341752]: 2026-01-31 09:20:51.075305255 +0000 UTC m=+0.045761196 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127)
Jan 31 04:20:51 np0005603623 systemd[1]: Started Time & Date Service.
Jan 31 04:20:51 np0005603623 podman[341754]: 2026-01-31 09:20:51.114203797 +0000 UTC m=+0.084392130 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 04:20:51 np0005603623 virtqemud[225858]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 31 04:20:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Jan 31 04:20:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3033443478' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 31 04:20:51 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0) v1
Jan 31 04:20:51 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3438737465' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 31 04:20:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:20:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:52.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:20:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:52.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:53 np0005603623 nova_compute[226235]: 2026-01-31 09:20:53.166 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:20:54 np0005603623 nova_compute[226235]: 2026-01-31 09:20:54.338 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:20:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:20:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:54.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:20:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:54.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:20:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:20:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:20:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:20:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:20:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:56.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:56.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:58 np0005603623 nova_compute[226235]: 2026-01-31 09:20:58.169 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:20:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:20:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:58.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:20:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:20:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:20:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:58.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:20:59 np0005603623 nova_compute[226235]: 2026-01-31 09:20:59.341 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:21:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:00.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:21:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:00.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:21:01 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:21:01 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:21:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:21:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:02.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:21:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:02.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:03 np0005603623 nova_compute[226235]: 2026-01-31 09:21:03.193 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:04 np0005603623 nova_compute[226235]: 2026-01-31 09:21:04.344 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:04.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:04.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:21:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:06.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:21:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:21:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:06.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:21:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:08 np0005603623 nova_compute[226235]: 2026-01-31 09:21:08.194 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:08.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:21:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:08.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:21:09 np0005603623 nova_compute[226235]: 2026-01-31 09:21:09.347 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:10.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:21:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:10.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:21:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:12.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:21:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:12.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:21:13 np0005603623 nova_compute[226235]: 2026-01-31 09:21:13.197 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:14 np0005603623 nova_compute[226235]: 2026-01-31 09:21:14.349 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:14.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:14.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:21:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:16.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:21:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:16.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:18 np0005603623 nova_compute[226235]: 2026-01-31 09:21:18.199 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:21:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:18.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:21:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:21:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:18.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:21:19 np0005603623 nova_compute[226235]: 2026-01-31 09:21:19.352 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:21:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:20.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:21:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:21:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:20.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:21:21 np0005603623 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 31 04:21:21 np0005603623 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 04:21:21 np0005603623 podman[342282]: 2026-01-31 09:21:21.237297206 +0000 UTC m=+0.072287150 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:21:21 np0005603623 podman[342283]: 2026-01-31 09:21:21.257138969 +0000 UTC m=+0.093421023 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:21:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:22.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:21:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:22.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:21:23 np0005603623 nova_compute[226235]: 2026-01-31 09:21:23.201 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:24 np0005603623 nova_compute[226235]: 2026-01-31 09:21:24.354 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:24.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:24.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:26.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:26.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:28 np0005603623 nova_compute[226235]: 2026-01-31 09:21:28.202 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:21:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:28.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:21:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:21:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:28.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:21:29 np0005603623 nova_compute[226235]: 2026-01-31 09:21:29.356 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:21:30.174 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:21:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:21:30.174 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:21:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:21:30.174 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:21:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:30.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:30.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:32.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:32.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:33 np0005603623 nova_compute[226235]: 2026-01-31 09:21:33.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:33 np0005603623 nova_compute[226235]: 2026-01-31 09:21:33.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:21:33 np0005603623 nova_compute[226235]: 2026-01-31 09:21:33.205 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:34 np0005603623 systemd[1]: session-69.scope: Deactivated successfully.
Jan 31 04:21:34 np0005603623 systemd[1]: session-69.scope: Consumed 2min 37.752s CPU time, 1.0G memory peak, read 505.4M from disk, written 274.0M to disk.
Jan 31 04:21:34 np0005603623 systemd-logind[795]: Session 69 logged out. Waiting for processes to exit.
Jan 31 04:21:34 np0005603623 systemd-logind[795]: Removed session 69.
Jan 31 04:21:34 np0005603623 systemd-logind[795]: New session 70 of user zuul.
Jan 31 04:21:34 np0005603623 systemd[1]: Started Session 70 of User zuul.
Jan 31 04:21:34 np0005603623 nova_compute[226235]: 2026-01-31 09:21:34.359 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:21:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:34.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:21:34 np0005603623 systemd[1]: session-70.scope: Deactivated successfully.
Jan 31 04:21:34 np0005603623 systemd-logind[795]: Session 70 logged out. Waiting for processes to exit.
Jan 31 04:21:34 np0005603623 systemd-logind[795]: Removed session 70.
Jan 31 04:21:34 np0005603623 systemd-logind[795]: New session 71 of user zuul.
Jan 31 04:21:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:21:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:34.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:21:34 np0005603623 systemd[1]: Started Session 71 of User zuul.
Jan 31 04:21:34 np0005603623 systemd[1]: session-71.scope: Deactivated successfully.
Jan 31 04:21:34 np0005603623 systemd-logind[795]: Session 71 logged out. Waiting for processes to exit.
Jan 31 04:21:34 np0005603623 systemd-logind[795]: Removed session 71.
Jan 31 04:21:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:21:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:36.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:21:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:36.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.185 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.186 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.186 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.186 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.186 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.206 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:38.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:21:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2299859033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.585 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.399s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:21:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:38.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.720 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.722 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4036MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.722 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.723 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.851 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.852 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:21:38 np0005603623 nova_compute[226235]: 2026-01-31 09:21:38.879 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:21:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:21:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4221360420' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:21:39 np0005603623 nova_compute[226235]: 2026-01-31 09:21:39.289 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:21:39 np0005603623 nova_compute[226235]: 2026-01-31 09:21:39.294 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:21:39 np0005603623 nova_compute[226235]: 2026-01-31 09:21:39.360 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:39 np0005603623 nova_compute[226235]: 2026-01-31 09:21:39.686 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:21:39 np0005603623 nova_compute[226235]: 2026-01-31 09:21:39.688 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:21:39 np0005603623 nova_compute[226235]: 2026-01-31 09:21:39.688 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.966s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:21:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:21:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:40.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:21:40 np0005603623 nova_compute[226235]: 2026-01-31 09:21:40.690 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:40 np0005603623 nova_compute[226235]: 2026-01-31 09:21:40.690 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:21:40 np0005603623 nova_compute[226235]: 2026-01-31 09:21:40.690 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:21:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:40.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:40 np0005603623 nova_compute[226235]: 2026-01-31 09:21:40.709 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:21:40 np0005603623 nova_compute[226235]: 2026-01-31 09:21:40.710 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:41 np0005603623 nova_compute[226235]: 2026-01-31 09:21:41.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:42.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:42.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:43 np0005603623 nova_compute[226235]: 2026-01-31 09:21:43.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:43 np0005603623 nova_compute[226235]: 2026-01-31 09:21:43.249 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:44 np0005603623 nova_compute[226235]: 2026-01-31 09:21:44.362 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:44.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:44.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:45 np0005603623 nova_compute[226235]: 2026-01-31 09:21:45.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:46 np0005603623 nova_compute[226235]: 2026-01-31 09:21:46.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:21:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:46.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:21:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:46.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:47 np0005603623 nova_compute[226235]: 2026-01-31 09:21:47.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:48 np0005603623 nova_compute[226235]: 2026-01-31 09:21:48.251 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:48.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:48.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:49 np0005603623 nova_compute[226235]: 2026-01-31 09:21:49.364 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:50.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:50.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:51 np0005603623 podman[342551]: 2026-01-31 09:21:51.974155792 +0000 UTC m=+0.057361302 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible)
Jan 31 04:21:52 np0005603623 podman[342552]: 2026-01-31 09:21:52.001367245 +0000 UTC m=+0.084572065 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:21:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:52.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:52.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:53 np0005603623 nova_compute[226235]: 2026-01-31 09:21:53.285 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:54 np0005603623 nova_compute[226235]: 2026-01-31 09:21:54.405 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:54.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:54.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:56 np0005603623 nova_compute[226235]: 2026-01-31 09:21:56.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:56.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:56.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:58 np0005603623 nova_compute[226235]: 2026-01-31 09:21:58.287 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:58.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:21:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:58.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:59 np0005603623 nova_compute[226235]: 2026-01-31 09:21:59.408 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:00.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:00.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:02 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:22:02 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:22:02 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:22:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:02.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:02.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:03 np0005603623 nova_compute[226235]: 2026-01-31 09:22:03.288 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:04 np0005603623 nova_compute[226235]: 2026-01-31 09:22:04.410 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:04.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:04.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:06.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:06.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:08 np0005603623 nova_compute[226235]: 2026-01-31 09:22:08.290 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:08.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:08.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:09 np0005603623 nova_compute[226235]: 2026-01-31 09:22:09.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:09 np0005603623 nova_compute[226235]: 2026-01-31 09:22:09.156 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:22:09 np0005603623 nova_compute[226235]: 2026-01-31 09:22:09.412 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:10.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:10.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:22:10 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:22:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:12.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:12.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:13 np0005603623 nova_compute[226235]: 2026-01-31 09:22:13.339 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:14 np0005603623 nova_compute[226235]: 2026-01-31 09:22:14.454 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:14.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:14.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:16.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:16.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:18 np0005603623 nova_compute[226235]: 2026-01-31 09:22:18.341 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:18.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:18.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:19 np0005603623 nova_compute[226235]: 2026-01-31 09:22:19.456 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:20.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:20.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:22.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:22.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:22 np0005603623 podman[342842]: 2026-01-31 09:22:22.947685 +0000 UTC m=+0.041346608 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:22:22 np0005603623 podman[342843]: 2026-01-31 09:22:22.973698016 +0000 UTC m=+0.067607462 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:22:23 np0005603623 nova_compute[226235]: 2026-01-31 09:22:23.386 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:24.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:24 np0005603623 nova_compute[226235]: 2026-01-31 09:22:24.506 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:22:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:24.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:22:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.0 total, 600.0 interval#012Cumulative writes: 19K writes, 97K keys, 19K commit groups, 1.0 writes per commit group, ingest: 0.19 GB, 0.03 MB/s#012Cumulative WAL: 19K writes, 19K syncs, 1.00 writes per sync, written: 0.19 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1507 writes, 7815 keys, 1507 commit groups, 1.0 writes per commit group, ingest: 15.96 MB, 0.03 MB/s#012Interval WAL: 1507 writes, 1507 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     45.3      2.68              0.29        64    0.042       0      0       0.0       0.0#012  L6      1/0   11.85 MB   0.0      0.8     0.1      0.6       0.7      0.0       0.0   5.5     66.7     57.3     11.64              1.43        63    0.185    515K    33K       0.0       0.0#012 Sum      1/0   11.85 MB   0.0      0.8     0.1      0.6       0.8      0.1       0.0   6.5     54.2     55.0     14.32              1.72       127    0.113    515K    33K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.6     89.4     89.6      0.92              0.19        12    0.077     70K   3129       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) 
Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.8     0.1      0.6       0.7      0.0       0.0   0.0     66.7     57.3     11.64              1.43        63    0.185    515K    33K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     45.4      2.68              0.29        63    0.043       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 7200.0 total, 600.0 interval#012Flush(GB): cumulative 0.119, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.77 GB write, 0.11 MB/s write, 0.76 GB read, 0.11 MB/s read, 14.3 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.14 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 85.30 MB table_size: 0 occupancy: 18446744073709551615 collections: 13 last_copies: 0 last_secs: 0.000803 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(5262,81.65 MB,26.8578%) FilterBlock(127,1.40 MB,0.459786%) IndexBlock(127,2.26 MB,0.742325%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 04:22:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:26.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:26.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:28 np0005603623 nova_compute[226235]: 2026-01-31 09:22:28.387 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:28.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:28.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:29 np0005603623 nova_compute[226235]: 2026-01-31 09:22:29.507 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:22:30.174 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:22:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:22:30.175 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:22:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:22:30.175 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:22:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:30.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:30.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:32.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:22:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:32.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:22:33 np0005603623 nova_compute[226235]: 2026-01-31 09:22:33.388 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:34 np0005603623 nova_compute[226235]: 2026-01-31 09:22:34.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:34 np0005603623 nova_compute[226235]: 2026-01-31 09:22:34.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:22:34 np0005603623 nova_compute[226235]: 2026-01-31 09:22:34.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:34.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:34 np0005603623 nova_compute[226235]: 2026-01-31 09:22:34.508 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:34.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:34 np0005603623 nova_compute[226235]: 2026-01-31 09:22:34.980 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:36.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:36.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.199 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.200 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.200 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.200 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.200 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.390 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:38.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:22:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/987162534' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.621 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:22:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:22:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:38.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.776 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.777 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4096MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.777 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.777 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.909 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.909 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 31 04:22:38 np0005603623 nova_compute[226235]: 2026-01-31 09:22:38.947 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:22:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:22:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/890461639' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:22:39 np0005603623 nova_compute[226235]: 2026-01-31 09:22:39.383 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:22:39 np0005603623 nova_compute[226235]: 2026-01-31 09:22:39.387 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 04:22:39 np0005603623 nova_compute[226235]: 2026-01-31 09:22:39.404 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 04:22:39 np0005603623 nova_compute[226235]: 2026-01-31 09:22:39.405 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 04:22:39 np0005603623 nova_compute[226235]: 2026-01-31 09:22:39.406 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:22:39 np0005603623 nova_compute[226235]: 2026-01-31 09:22:39.510 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:22:40 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 04:22:40 np0005603623 nova_compute[226235]: 2026-01-31 09:22:40.406 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:22:40 np0005603623 nova_compute[226235]: 2026-01-31 09:22:40.407 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 04:22:40 np0005603623 nova_compute[226235]: 2026-01-31 09:22:40.407 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 04:22:40 np0005603623 nova_compute[226235]: 2026-01-31 09:22:40.444 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 31 04:22:40 np0005603623 nova_compute[226235]: 2026-01-31 09:22:40.444 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:22:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:40.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:22:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:40.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:22:41 np0005603623 nova_compute[226235]: 2026-01-31 09:22:41.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:22:41 np0005603623 nova_compute[226235]: 2026-01-31 09:22:41.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 31 04:22:41 np0005603623 nova_compute[226235]: 2026-01-31 09:22:41.174 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 31 04:22:42 np0005603623 nova_compute[226235]: 2026-01-31 09:22:42.174 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:22:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:42.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:42.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:43 np0005603623 nova_compute[226235]: 2026-01-31 09:22:43.427 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:22:44 np0005603623 nova_compute[226235]: 2026-01-31 09:22:44.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:22:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:44.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:44 np0005603623 nova_compute[226235]: 2026-01-31 09:22:44.512 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:22:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:44.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:45 np0005603623 nova_compute[226235]: 2026-01-31 09:22:45.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:22:46 np0005603623 nova_compute[226235]: 2026-01-31 09:22:46.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:22:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:46.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:46.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:48 np0005603623 nova_compute[226235]: 2026-01-31 09:22:48.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:22:48 np0005603623 nova_compute[226235]: 2026-01-31 09:22:48.429 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:22:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:48.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:48.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:49 np0005603623 nova_compute[226235]: 2026-01-31 09:22:49.515 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:22:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:50.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:50.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:22:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 7200.1 total, 600.0 interval
Cumulative writes: 78K writes, 332K keys, 78K commit groups, 1.0 writes per commit group, ingest: 0.34 GB, 0.05 MB/s
Cumulative WAL: 78K writes, 27K syncs, 2.90 writes per sync, written: 0.34 GB, 0.05 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1564 writes, 4848 keys, 1564 commit groups, 1.0 writes per commit group, ingest: 4.11 MB, 0.01 MB/s
Interval WAL: 1564 writes, 689 syncs, 2.27 writes per sync, written: 0.00 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:22:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:22:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:52.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:22:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:22:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:52.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:22:53 np0005603623 nova_compute[226235]: 2026-01-31 09:22:53.477 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:22:53 np0005603623 podman[343045]: 2026-01-31 09:22:53.958128486 +0000 UTC m=+0.051197548 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 31 04:22:53 np0005603623 podman[343046]: 2026-01-31 09:22:53.990163802 +0000 UTC m=+0.078794164 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:22:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:22:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:54.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:22:54 np0005603623 nova_compute[226235]: 2026-01-31 09:22:54.549 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:22:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:54.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:56.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:22:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:56.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:22:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:57 np0005603623 ceph-mgr[77391]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 04:22:58 np0005603623 nova_compute[226235]: 2026-01-31 09:22:58.479 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:22:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:58.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:22:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:58.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:59 np0005603623 nova_compute[226235]: 2026-01-31 09:22:59.551 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:23:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:00.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:00.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:23:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:02.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:23:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:02.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:03 np0005603623 nova_compute[226235]: 2026-01-31 09:23:03.511 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:23:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:04.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:04 np0005603623 nova_compute[226235]: 2026-01-31 09:23:04.621 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:23:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:04.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:06.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:06.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:08 np0005603623 nova_compute[226235]: 2026-01-31 09:23:08.513 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:23:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:08.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000064s ======
Jan 31 04:23:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:08.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 31 04:23:09 np0005603623 nova_compute[226235]: 2026-01-31 09:23:09.622 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:23:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:10.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:10.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:11 np0005603623 podman[343321]: 2026-01-31 09:23:11.702424716 +0000 UTC m=+0.061256293 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef)
Jan 31 04:23:11 np0005603623 podman[343321]: 2026-01-31 09:23:11.781962571 +0000 UTC m=+0.140794128 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 31 04:23:12 np0005603623 podman[343475]: 2026-01-31 09:23:12.257857295 +0000 UTC m=+0.045837789 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 04:23:12 np0005603623 podman[343475]: 2026-01-31 09:23:12.269130099 +0000 UTC m=+0.057110563 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 04:23:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:12 np0005603623 podman[343541]: 2026-01-31 09:23:12.421328706 +0000 UTC m=+0.047026227 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, release=1793, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, architecture=x86_64, distribution-scope=public, description=keepalived for Ceph, vendor=Red Hat, Inc., name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=)
Jan 31 04:23:12 np0005603623 podman[343541]: 2026-01-31 09:23:12.453096593 +0000 UTC m=+0.078794144 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, release=1793, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, architecture=x86_64, description=keepalived for Ceph, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, vendor=Red Hat, Inc.)
Jan 31 04:23:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:12.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:12.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:13 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:13 np0005603623 nova_compute[226235]: 2026-01-31 09:23:13.514 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:14 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:23:14 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:14 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:23:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:14.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:14 np0005603623 nova_compute[226235]: 2026-01-31 09:23:14.624 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:23:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/231197208' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:23:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:23:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/231197208' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:23:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:14.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:16.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:16.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #202. Immutable memtables: 0.
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.156973) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 129] Flushing memtable with next log file: 202
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851397157080, "job": 129, "event": "flush_started", "num_memtables": 1, "num_entries": 1819, "num_deletes": 251, "total_data_size": 4311178, "memory_usage": 4367448, "flush_reason": "Manual Compaction"}
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 129] Level-0 flush table #203: started
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851397177635, "cf_name": "default", "job": 129, "event": "table_file_creation", "file_number": 203, "file_size": 2812756, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 96677, "largest_seqno": 98491, "table_properties": {"data_size": 2804975, "index_size": 4658, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16803, "raw_average_key_size": 20, "raw_value_size": 2789238, "raw_average_value_size": 3426, "num_data_blocks": 204, "num_entries": 814, "num_filter_entries": 814, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851244, "oldest_key_time": 1769851244, "file_creation_time": 1769851397, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 203, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 129] Flush lasted 20726 microseconds, and 4997 cpu microseconds.
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.177705) [db/flush_job.cc:967] [default] [JOB 129] Level-0 flush table #203: 2812756 bytes OK
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.177725) [db/memtable_list.cc:519] [default] Level-0 commit table #203 started
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.180185) [db/memtable_list.cc:722] [default] Level-0 commit table #203: memtable #1 done
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.180201) EVENT_LOG_v1 {"time_micros": 1769851397180196, "job": 129, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.180220) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 129] Try to delete WAL files size 4302814, prev total WAL file size 4302814, number of live WAL files 2.
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000199.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.181052) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038353334' seq:72057594037927935, type:22 .. '7061786F730038373836' seq:0, type:0; will stop at (end)
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 130] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 129 Base level 0, inputs: [203(2746KB)], [201(11MB)]
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851397181112, "job": 130, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [203], "files_L6": [201], "score": -1, "input_data_size": 15239692, "oldest_snapshot_seqno": -1}
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 130] Generated table #204: 11823 keys, 13219758 bytes, temperature: kUnknown
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851397330237, "cf_name": "default", "job": 130, "event": "table_file_creation", "file_number": 204, "file_size": 13219758, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13145898, "index_size": 43219, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29573, "raw_key_size": 313685, "raw_average_key_size": 26, "raw_value_size": 12942209, "raw_average_value_size": 1094, "num_data_blocks": 1633, "num_entries": 11823, "num_filter_entries": 11823, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769851397, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 204, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.330506) [db/compaction/compaction_job.cc:1663] [default] [JOB 130] Compacted 1@0 + 1@6 files to L6 => 13219758 bytes
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.332050) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 102.1 rd, 88.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 11.9 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(10.1) write-amplify(4.7) OK, records in: 12342, records dropped: 519 output_compression: NoCompression
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.332066) EVENT_LOG_v1 {"time_micros": 1769851397332058, "job": 130, "event": "compaction_finished", "compaction_time_micros": 149203, "compaction_time_cpu_micros": 24874, "output_level": 6, "num_output_files": 1, "total_output_size": 13219758, "num_input_records": 12342, "num_output_records": 11823, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000203.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851397332350, "job": 130, "event": "table_file_deletion", "file_number": 203}
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000201.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851397333381, "job": 130, "event": "table_file_deletion", "file_number": 201}
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.180950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.333441) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.333445) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.333446) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.333448) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:23:17 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:23:17.333450) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:23:18 np0005603623 nova_compute[226235]: 2026-01-31 09:23:18.516 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:23:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:18.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:23:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:18 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:18.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:19 np0005603623 nova_compute[226235]: 2026-01-31 09:23:19.666 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:20.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:20.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:22.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:22.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:23 np0005603623 nova_compute[226235]: 2026-01-31 09:23:23.518 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:24.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:24 np0005603623 nova_compute[226235]: 2026-01-31 09:23:24.669 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:24.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:24 np0005603623 podman[343763]: 2026-01-31 09:23:24.966229162 +0000 UTC m=+0.053750458 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:23:24 np0005603623 podman[343764]: 2026-01-31 09:23:24.982796222 +0000 UTC m=+0.069371848 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 31 04:23:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:26.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:26.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:28 np0005603623 nova_compute[226235]: 2026-01-31 09:23:28.519 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:28.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:28.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:29 np0005603623 nova_compute[226235]: 2026-01-31 09:23:29.671 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:23:30.176 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:23:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:23:30.176 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:23:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:23:30.176 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:23:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:23:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:30.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:23:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:30.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:32.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:23:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:32.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:23:33 np0005603623 nova_compute[226235]: 2026-01-31 09:23:33.522 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:34.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:34 np0005603623 nova_compute[226235]: 2026-01-31 09:23:34.674 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:34.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:36 np0005603623 nova_compute[226235]: 2026-01-31 09:23:36.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:36 np0005603623 nova_compute[226235]: 2026-01-31 09:23:36.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:23:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.003000095s ======
Jan 31 04:23:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:36.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 31 04:23:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:36.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:38 np0005603623 nova_compute[226235]: 2026-01-31 09:23:38.523 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:38.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:38.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.179 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.180 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.180 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.180 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.180 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:23:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:23:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3533933567' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.585 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.675 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.708 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.709 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4086MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.709 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.709 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.833 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.834 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.859 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.878 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.878 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.894 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.930 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:23:39 np0005603623 nova_compute[226235]: 2026-01-31 09:23:39.944 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:23:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:23:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/991607286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:23:40 np0005603623 nova_compute[226235]: 2026-01-31 09:23:40.376 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:23:40 np0005603623 nova_compute[226235]: 2026-01-31 09:23:40.382 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:23:40 np0005603623 nova_compute[226235]: 2026-01-31 09:23:40.403 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:23:40 np0005603623 nova_compute[226235]: 2026-01-31 09:23:40.405 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:23:40 np0005603623 nova_compute[226235]: 2026-01-31 09:23:40.405 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:23:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:23:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:40.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:23:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:40.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:41 np0005603623 nova_compute[226235]: 2026-01-31 09:23:41.405 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:41 np0005603623 nova_compute[226235]: 2026-01-31 09:23:41.406 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:23:41 np0005603623 nova_compute[226235]: 2026-01-31 09:23:41.406 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:23:41 np0005603623 nova_compute[226235]: 2026-01-31 09:23:41.424 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:23:41 np0005603623 nova_compute[226235]: 2026-01-31 09:23:41.424 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:42 np0005603623 nova_compute[226235]: 2026-01-31 09:23:42.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:42.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:42.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:43 np0005603623 nova_compute[226235]: 2026-01-31 09:23:43.526 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:44 np0005603623 nova_compute[226235]: 2026-01-31 09:23:44.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:44.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:44 np0005603623 nova_compute[226235]: 2026-01-31 09:23:44.677 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:23:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:44.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:23:45 np0005603623 nova_compute[226235]: 2026-01-31 09:23:45.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:46.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:46.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:48 np0005603623 nova_compute[226235]: 2026-01-31 09:23:48.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:48 np0005603623 nova_compute[226235]: 2026-01-31 09:23:48.528 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:48.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:48.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:49 np0005603623 nova_compute[226235]: 2026-01-31 09:23:49.680 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:50 np0005603623 nova_compute[226235]: 2026-01-31 09:23:50.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:50.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:50.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:52.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:52.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:53 np0005603623 nova_compute[226235]: 2026-01-31 09:23:53.550 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:23:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:54.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:23:54 np0005603623 nova_compute[226235]: 2026-01-31 09:23:54.714 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:54.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:55 np0005603623 podman[343964]: 2026-01-31 09:23:55.961374545 +0000 UTC m=+0.050618550 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 31 04:23:55 np0005603623 podman[343965]: 2026-01-31 09:23:55.975215239 +0000 UTC m=+0.064625179 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:23:56 np0005603623 nova_compute[226235]: 2026-01-31 09:23:56.480 226239 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 2.22 sec#033[00m
Jan 31 04:23:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:56.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:56.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:58 np0005603623 nova_compute[226235]: 2026-01-31 09:23:58.552 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:58.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:23:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:23:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:58.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:23:59 np0005603623 nova_compute[226235]: 2026-01-31 09:23:59.716 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:00.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:00.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:01 np0005603623 nova_compute[226235]: 2026-01-31 09:24:01.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:02.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:24:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:02.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:24:03 np0005603623 nova_compute[226235]: 2026-01-31 09:24:03.600 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:04.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:04 np0005603623 nova_compute[226235]: 2026-01-31 09:24:04.751 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:04.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:06.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:06.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:08.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:08 np0005603623 nova_compute[226235]: 2026-01-31 09:24:08.601 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:08.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:09 np0005603623 nova_compute[226235]: 2026-01-31 09:24:09.754 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:10.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:10.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:12.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:12.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:13 np0005603623 nova_compute[226235]: 2026-01-31 09:24:13.645 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:14.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:24:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/191521311' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:24:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:24:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/191521311' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:24:14 np0005603623 nova_compute[226235]: 2026-01-31 09:24:14.790 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:14.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:16.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:16.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:18.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:18 np0005603623 nova_compute[226235]: 2026-01-31 09:24:18.644 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:24:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:18.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:24:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:19 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:19 np0005603623 podman[344455]: 2026-01-31 09:24:19.504417643 +0000 UTC m=+0.033690088 container create ec72b2052222834cb5fe9f461965f22b52e8242d61eeba722d3e997ad799e7e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_haibt, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 31 04:24:19 np0005603623 systemd[1]: Started libpod-conmon-ec72b2052222834cb5fe9f461965f22b52e8242d61eeba722d3e997ad799e7e3.scope.
Jan 31 04:24:19 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:24:19 np0005603623 podman[344455]: 2026-01-31 09:24:19.580929254 +0000 UTC m=+0.110201789 container init ec72b2052222834cb5fe9f461965f22b52e8242d61eeba722d3e997ad799e7e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_haibt, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 04:24:19 np0005603623 podman[344455]: 2026-01-31 09:24:19.488987099 +0000 UTC m=+0.018259554 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 04:24:19 np0005603623 podman[344455]: 2026-01-31 09:24:19.588801671 +0000 UTC m=+0.118074106 container start ec72b2052222834cb5fe9f461965f22b52e8242d61eeba722d3e997ad799e7e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_haibt, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 04:24:19 np0005603623 podman[344455]: 2026-01-31 09:24:19.591887888 +0000 UTC m=+0.121160433 container attach ec72b2052222834cb5fe9f461965f22b52e8242d61eeba722d3e997ad799e7e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_haibt, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:24:19 np0005603623 upbeat_haibt[344471]: 167 167
Jan 31 04:24:19 np0005603623 systemd[1]: libpod-ec72b2052222834cb5fe9f461965f22b52e8242d61eeba722d3e997ad799e7e3.scope: Deactivated successfully.
Jan 31 04:24:19 np0005603623 podman[344455]: 2026-01-31 09:24:19.59513083 +0000 UTC m=+0.124403295 container died ec72b2052222834cb5fe9f461965f22b52e8242d61eeba722d3e997ad799e7e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_haibt, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Jan 31 04:24:19 np0005603623 systemd[1]: var-lib-containers-storage-overlay-1b4f53eaf457da37d740e732320f7774194261529892d1e56760a1abc64f8ce1-merged.mount: Deactivated successfully.
Jan 31 04:24:19 np0005603623 podman[344455]: 2026-01-31 09:24:19.626939668 +0000 UTC m=+0.156212103 container remove ec72b2052222834cb5fe9f461965f22b52e8242d61eeba722d3e997ad799e7e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_haibt, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507)
Jan 31 04:24:19 np0005603623 systemd[1]: libpod-conmon-ec72b2052222834cb5fe9f461965f22b52e8242d61eeba722d3e997ad799e7e3.scope: Deactivated successfully.
Jan 31 04:24:19 np0005603623 podman[344495]: 2026-01-31 09:24:19.779724543 +0000 UTC m=+0.058455366 container create 8b9b6cf99e463315f3fa684143717cd90b82b136c9e6d20ed15d3e1ff6db69b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_noether, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 04:24:19 np0005603623 nova_compute[226235]: 2026-01-31 09:24:19.791 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:19 np0005603623 systemd[1]: Started libpod-conmon-8b9b6cf99e463315f3fa684143717cd90b82b136c9e6d20ed15d3e1ff6db69b8.scope.
Jan 31 04:24:19 np0005603623 podman[344495]: 2026-01-31 09:24:19.764144633 +0000 UTC m=+0.042875436 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 04:24:19 np0005603623 systemd[1]: Started libcrun container.
Jan 31 04:24:19 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b571dee821cd51f14c55fb3ab6a5a12fc16af51a1208cf8cf26658950c926cc/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 04:24:19 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b571dee821cd51f14c55fb3ab6a5a12fc16af51a1208cf8cf26658950c926cc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 04:24:19 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b571dee821cd51f14c55fb3ab6a5a12fc16af51a1208cf8cf26658950c926cc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 04:24:19 np0005603623 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b571dee821cd51f14c55fb3ab6a5a12fc16af51a1208cf8cf26658950c926cc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 04:24:19 np0005603623 podman[344495]: 2026-01-31 09:24:19.89053014 +0000 UTC m=+0.169260983 container init 8b9b6cf99e463315f3fa684143717cd90b82b136c9e6d20ed15d3e1ff6db69b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_noether, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 04:24:19 np0005603623 podman[344495]: 2026-01-31 09:24:19.896528998 +0000 UTC m=+0.175259781 container start 8b9b6cf99e463315f3fa684143717cd90b82b136c9e6d20ed15d3e1ff6db69b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_noether, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3)
Jan 31 04:24:19 np0005603623 podman[344495]: 2026-01-31 09:24:19.900357708 +0000 UTC m=+0.179088531 container attach 8b9b6cf99e463315f3fa684143717cd90b82b136c9e6d20ed15d3e1ff6db69b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_noether, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef)
Jan 31 04:24:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:20.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:20.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:21 np0005603623 great_noether[344511]: [
Jan 31 04:24:21 np0005603623 great_noether[344511]:    {
Jan 31 04:24:21 np0005603623 great_noether[344511]:        "available": false,
Jan 31 04:24:21 np0005603623 great_noether[344511]:        "ceph_device": false,
Jan 31 04:24:21 np0005603623 great_noether[344511]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 31 04:24:21 np0005603623 great_noether[344511]:        "lsm_data": {},
Jan 31 04:24:21 np0005603623 great_noether[344511]:        "lvs": [],
Jan 31 04:24:21 np0005603623 great_noether[344511]:        "path": "/dev/sr0",
Jan 31 04:24:21 np0005603623 great_noether[344511]:        "rejected_reasons": [
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "Has a FileSystem",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "Insufficient space (<5GB)"
Jan 31 04:24:21 np0005603623 great_noether[344511]:        ],
Jan 31 04:24:21 np0005603623 great_noether[344511]:        "sys_api": {
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "actuators": null,
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "device_nodes": "sr0",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "devname": "sr0",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "human_readable_size": "482.00 KB",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "id_bus": "ata",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "model": "QEMU DVD-ROM",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "nr_requests": "2",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "parent": "/dev/sr0",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "partitions": {},
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "path": "/dev/sr0",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "removable": "1",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "rev": "2.5+",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "ro": "0",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "rotational": "1",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "sas_address": "",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "sas_device_handle": "",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "scheduler_mode": "mq-deadline",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "sectors": 0,
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "sectorsize": "2048",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "size": 493568.0,
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "support_discard": "2048",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "type": "disk",
Jan 31 04:24:21 np0005603623 great_noether[344511]:            "vendor": "QEMU"
Jan 31 04:24:21 np0005603623 great_noether[344511]:        }
Jan 31 04:24:21 np0005603623 great_noether[344511]:    }
Jan 31 04:24:21 np0005603623 great_noether[344511]: ]
Jan 31 04:24:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:21 np0005603623 systemd[1]: libpod-8b9b6cf99e463315f3fa684143717cd90b82b136c9e6d20ed15d3e1ff6db69b8.scope: Deactivated successfully.
Jan 31 04:24:21 np0005603623 podman[344495]: 2026-01-31 09:24:21.191418723 +0000 UTC m=+1.470149496 container died 8b9b6cf99e463315f3fa684143717cd90b82b136c9e6d20ed15d3e1ff6db69b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_noether, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 04:24:21 np0005603623 systemd[1]: libpod-8b9b6cf99e463315f3fa684143717cd90b82b136c9e6d20ed15d3e1ff6db69b8.scope: Consumed 1.300s CPU time.
Jan 31 04:24:21 np0005603623 systemd[1]: var-lib-containers-storage-overlay-2b571dee821cd51f14c55fb3ab6a5a12fc16af51a1208cf8cf26658950c926cc-merged.mount: Deactivated successfully.
Jan 31 04:24:21 np0005603623 podman[344495]: 2026-01-31 09:24:21.236951282 +0000 UTC m=+1.515682065 container remove 8b9b6cf99e463315f3fa684143717cd90b82b136c9e6d20ed15d3e1ff6db69b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_noether, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0)
Jan 31 04:24:21 np0005603623 systemd[1]: libpod-conmon-8b9b6cf99e463315f3fa684143717cd90b82b136c9e6d20ed15d3e1ff6db69b8.scope: Deactivated successfully.
Jan 31 04:24:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:24:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:24:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:22.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:22.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:23 np0005603623 nova_compute[226235]: 2026-01-31 09:24:23.645 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:24:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:24.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:24:24 np0005603623 nova_compute[226235]: 2026-01-31 09:24:24.792 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:24.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:26 np0005603623 podman[345737]: 2026-01-31 09:24:26.390741125 +0000 UTC m=+0.076144200 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Jan 31 04:24:26 np0005603623 podman[345738]: 2026-01-31 09:24:26.455504198 +0000 UTC m=+0.141099868 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 04:24:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:26.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:26.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:26 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:28.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:28 np0005603623 nova_compute[226235]: 2026-01-31 09:24:28.648 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:28.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:29 np0005603623 nova_compute[226235]: 2026-01-31 09:24:29.794 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:24:30.177 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:24:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:24:30.177 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:24:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:24:30.177 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:24:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:30.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:30.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:32.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:32.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:33 np0005603623 nova_compute[226235]: 2026-01-31 09:24:33.650 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:34.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:34 np0005603623 nova_compute[226235]: 2026-01-31 09:24:34.796 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:34.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:36 np0005603623 nova_compute[226235]: 2026-01-31 09:24:36.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:36 np0005603623 nova_compute[226235]: 2026-01-31 09:24:36.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:24:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:36.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:36.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:38.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:38 np0005603623 nova_compute[226235]: 2026-01-31 09:24:38.652 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:38.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:39 np0005603623 nova_compute[226235]: 2026-01-31 09:24:39.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:39 np0005603623 nova_compute[226235]: 2026-01-31 09:24:39.188 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:24:39 np0005603623 nova_compute[226235]: 2026-01-31 09:24:39.188 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:24:39 np0005603623 nova_compute[226235]: 2026-01-31 09:24:39.188 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:24:39 np0005603623 nova_compute[226235]: 2026-01-31 09:24:39.188 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:24:39 np0005603623 nova_compute[226235]: 2026-01-31 09:24:39.189 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:24:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:24:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3374617450' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:24:39 np0005603623 nova_compute[226235]: 2026-01-31 09:24:39.614 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:24:39 np0005603623 nova_compute[226235]: 2026-01-31 09:24:39.770 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:24:39 np0005603623 nova_compute[226235]: 2026-01-31 09:24:39.772 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4093MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:24:39 np0005603623 nova_compute[226235]: 2026-01-31 09:24:39.773 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:24:39 np0005603623 nova_compute[226235]: 2026-01-31 09:24:39.773 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:24:39 np0005603623 nova_compute[226235]: 2026-01-31 09:24:39.799 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:40 np0005603623 nova_compute[226235]: 2026-01-31 09:24:40.046 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:24:40 np0005603623 nova_compute[226235]: 2026-01-31 09:24:40.047 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:24:40 np0005603623 nova_compute[226235]: 2026-01-31 09:24:40.080 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:24:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:24:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3778444742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:24:40 np0005603623 nova_compute[226235]: 2026-01-31 09:24:40.479 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.399s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:24:40 np0005603623 nova_compute[226235]: 2026-01-31 09:24:40.484 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:24:40 np0005603623 nova_compute[226235]: 2026-01-31 09:24:40.503 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:24:40 np0005603623 nova_compute[226235]: 2026-01-31 09:24:40.504 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:24:40 np0005603623 nova_compute[226235]: 2026-01-31 09:24:40.505 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:24:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:40.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:40.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:42.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:42.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:43 np0005603623 nova_compute[226235]: 2026-01-31 09:24:43.506 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:24:43 np0005603623 nova_compute[226235]: 2026-01-31 09:24:43.506 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 04:24:43 np0005603623 nova_compute[226235]: 2026-01-31 09:24:43.506 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 04:24:43 np0005603623 nova_compute[226235]: 2026-01-31 09:24:43.540 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 31 04:24:43 np0005603623 nova_compute[226235]: 2026-01-31 09:24:43.540 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:24:43 np0005603623 nova_compute[226235]: 2026-01-31 09:24:43.540 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:24:43 np0005603623 nova_compute[226235]: 2026-01-31 09:24:43.697 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:24:44 np0005603623 nova_compute[226235]: 2026-01-31 09:24:44.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:24:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:44.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:44 np0005603623 nova_compute[226235]: 2026-01-31 09:24:44.828 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:24:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:44.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:46 np0005603623 nova_compute[226235]: 2026-01-31 09:24:46.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:24:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:46.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:46.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:48 np0005603623 nova_compute[226235]: 2026-01-31 09:24:48.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:24:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:48.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:48 np0005603623 nova_compute[226235]: 2026-01-31 09:24:48.698 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:24:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:48.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:49 np0005603623 nova_compute[226235]: 2026-01-31 09:24:49.886 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:24:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:50.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:50.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:51 np0005603623 nova_compute[226235]: 2026-01-31 09:24:51.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:24:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:52.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:52.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:53 np0005603623 nova_compute[226235]: 2026-01-31 09:24:53.740 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:24:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:54.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:54 np0005603623 nova_compute[226235]: 2026-01-31 09:24:54.947 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:24:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:24:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:54.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:24:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:56.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:56.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:56 np0005603623 podman[345967]: 2026-01-31 09:24:56.984664889 +0000 UTC m=+0.069462331 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 04:24:57 np0005603623 podman[345968]: 2026-01-31 09:24:57.018583773 +0000 UTC m=+0.102823367 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 04:24:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:58.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:58 np0005603623 nova_compute[226235]: 2026-01-31 09:24:58.743 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:24:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:24:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:58.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:59 np0005603623 nova_compute[226235]: 2026-01-31 09:24:59.949 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:25:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:00.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:25:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:25:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:00.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:25:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:02.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:02.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:03 np0005603623 nova_compute[226235]: 2026-01-31 09:25:03.794 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:04.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:04.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:05 np0005603623 nova_compute[226235]: 2026-01-31 09:25:05.188 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:06.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:06.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:08.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:08 np0005603623 nova_compute[226235]: 2026-01-31 09:25:08.795 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:08.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:10 np0005603623 nova_compute[226235]: 2026-01-31 09:25:10.190 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:10.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:11.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:12.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:25:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:13.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:25:13 np0005603623 nova_compute[226235]: 2026-01-31 09:25:13.797 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:14.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:25:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3644713675' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:25:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:25:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3644713675' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:25:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.002000064s ======
Jan 31 04:25:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:15.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 31 04:25:15 np0005603623 nova_compute[226235]: 2026-01-31 09:25:15.192 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:16.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:17.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:18.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:18 np0005603623 nova_compute[226235]: 2026-01-31 09:25:18.800 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:19.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:20 np0005603623 nova_compute[226235]: 2026-01-31 09:25:20.238 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:20.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:21.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:22.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:25:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:23.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:25:23 np0005603623 nova_compute[226235]: 2026-01-31 09:25:23.801 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:25:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:24.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:25:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:25.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:25 np0005603623 nova_compute[226235]: 2026-01-31 09:25:25.241 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:26.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:27.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:27 np0005603623 podman[346210]: 2026-01-31 09:25:27.960377004 +0000 UTC m=+0.053361445 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 04:25:27 np0005603623 podman[346211]: 2026-01-31 09:25:27.986517734 +0000 UTC m=+0.079695761 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:25:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:25:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:25:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:25:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:25:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:28.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:25:28 np0005603623 nova_compute[226235]: 2026-01-31 09:25:28.801 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:29.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:25:30.178 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:25:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:25:30.178 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:25:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:25:30.178 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:25:30 np0005603623 nova_compute[226235]: 2026-01-31 09:25:30.242 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:25:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:30.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:25:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:25:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:31.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:25:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:32.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:33.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:33 np0005603623 nova_compute[226235]: 2026-01-31 09:25:33.843 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:25:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:34.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:25:35 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:25:35 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:25:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:35.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:35 np0005603623 nova_compute[226235]: 2026-01-31 09:25:35.245 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:36.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:37.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:37 np0005603623 nova_compute[226235]: 2026-01-31 09:25:37.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:37 np0005603623 nova_compute[226235]: 2026-01-31 09:25:37.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:25:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:38.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:38 np0005603623 nova_compute[226235]: 2026-01-31 09:25:38.845 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:25:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:39.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:25:40 np0005603623 nova_compute[226235]: 2026-01-31 09:25:40.246 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:25:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:40.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:25:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:41.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:41 np0005603623 nova_compute[226235]: 2026-01-31 09:25:41.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:41 np0005603623 nova_compute[226235]: 2026-01-31 09:25:41.307 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:25:41 np0005603623 nova_compute[226235]: 2026-01-31 09:25:41.308 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:25:41 np0005603623 nova_compute[226235]: 2026-01-31 09:25:41.308 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:25:41 np0005603623 nova_compute[226235]: 2026-01-31 09:25:41.308 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:25:41 np0005603623 nova_compute[226235]: 2026-01-31 09:25:41.308 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:25:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:25:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3392057005' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:25:41 np0005603623 nova_compute[226235]: 2026-01-31 09:25:41.776 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:25:41 np0005603623 nova_compute[226235]: 2026-01-31 09:25:41.923 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:25:41 np0005603623 nova_compute[226235]: 2026-01-31 09:25:41.925 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4096MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:25:41 np0005603623 nova_compute[226235]: 2026-01-31 09:25:41.925 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:25:41 np0005603623 nova_compute[226235]: 2026-01-31 09:25:41.925 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:25:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:42 np0005603623 nova_compute[226235]: 2026-01-31 09:25:42.380 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:25:42 np0005603623 nova_compute[226235]: 2026-01-31 09:25:42.381 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:25:42 np0005603623 nova_compute[226235]: 2026-01-31 09:25:42.400 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:25:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:42.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:25:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3974155708' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:25:42 np0005603623 nova_compute[226235]: 2026-01-31 09:25:42.794 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.394s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:25:42 np0005603623 nova_compute[226235]: 2026-01-31 09:25:42.799 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:25:42 np0005603623 nova_compute[226235]: 2026-01-31 09:25:42.894 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:25:42 np0005603623 nova_compute[226235]: 2026-01-31 09:25:42.896 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:25:42 np0005603623 nova_compute[226235]: 2026-01-31 09:25:42.896 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.971s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:25:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:43.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:43 np0005603623 nova_compute[226235]: 2026-01-31 09:25:43.846 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:43 np0005603623 nova_compute[226235]: 2026-01-31 09:25:43.897 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:43 np0005603623 nova_compute[226235]: 2026-01-31 09:25:43.897 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:25:43 np0005603623 nova_compute[226235]: 2026-01-31 09:25:43.898 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:25:43 np0005603623 nova_compute[226235]: 2026-01-31 09:25:43.943 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:25:43 np0005603623 nova_compute[226235]: 2026-01-31 09:25:43.943 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:43 np0005603623 nova_compute[226235]: 2026-01-31 09:25:43.944 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:44 np0005603623 nova_compute[226235]: 2026-01-31 09:25:44.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:25:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:44.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:25:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:25:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:45.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:25:45 np0005603623 nova_compute[226235]: 2026-01-31 09:25:45.247 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:46.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:47.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:47 np0005603623 nova_compute[226235]: 2026-01-31 09:25:47.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:48.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:48 np0005603623 nova_compute[226235]: 2026-01-31 09:25:48.883 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:49.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:50 np0005603623 nova_compute[226235]: 2026-01-31 09:25:50.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:50 np0005603623 nova_compute[226235]: 2026-01-31 09:25:50.248 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:25:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:50.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:25:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:51.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:52 np0005603623 nova_compute[226235]: 2026-01-31 09:25:52.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:52 np0005603623 nova_compute[226235]: 2026-01-31 09:25:52.685 226239 DEBUG oslo_concurrency.processutils [None req-3fbad158-704d-4546-a900-43e27b0e5be8 94836483675641d9846c5768c3b91eed 89e274acfc5c4097be7194f5ef1fabd3 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:25:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:52.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:52 np0005603623 nova_compute[226235]: 2026-01-31 09:25:52.710 226239 DEBUG oslo_concurrency.processutils [None req-3fbad158-704d-4546-a900-43e27b0e5be8 94836483675641d9846c5768c3b91eed 89e274acfc5c4097be7194f5ef1fabd3 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:25:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:53.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:53 np0005603623 nova_compute[226235]: 2026-01-31 09:25:53.885 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:54.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:25:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:55.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:25:55 np0005603623 nova_compute[226235]: 2026-01-31 09:25:55.250 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:56.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:57.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:58.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:58 np0005603623 nova_compute[226235]: 2026-01-31 09:25:58.929 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:59 np0005603623 podman[346464]: 2026-01-31 09:25:59.02562146 +0000 UTC m=+0.062571464 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 04:25:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:25:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:59.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:59 np0005603623 podman[346465]: 2026-01-31 09:25:59.074905827 +0000 UTC m=+0.108557289 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 31 04:26:00 np0005603623 nova_compute[226235]: 2026-01-31 09:26:00.252 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:26:00.510 143258 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=103, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=102) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:26:00 np0005603623 nova_compute[226235]: 2026-01-31 09:26:00.510 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:00 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:26:00.511 143258 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:26:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:00.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:01.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:02.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:03.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:03 np0005603623 nova_compute[226235]: 2026-01-31 09:26:03.974 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:04 np0005603623 nova_compute[226235]: 2026-01-31 09:26:04.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:04 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:26:04.512 143258 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7ec8bf38-9571-4400-a85c-6bd5ac54bdf3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '103'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:26:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:04.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:05.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:05 np0005603623 nova_compute[226235]: 2026-01-31 09:26:05.254 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:06.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:07.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:08.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:08 np0005603623 nova_compute[226235]: 2026-01-31 09:26:08.978 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:09.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:10 np0005603623 nova_compute[226235]: 2026-01-31 09:26:10.257 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:10.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:11.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:12.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:13.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:13 np0005603623 nova_compute[226235]: 2026-01-31 09:26:13.979 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:14.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:15.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:15 np0005603623 nova_compute[226235]: 2026-01-31 09:26:15.258 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:16.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:17.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:18.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:18 np0005603623 nova_compute[226235]: 2026-01-31 09:26:18.980 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:19.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:20 np0005603623 nova_compute[226235]: 2026-01-31 09:26:20.271 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:20.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:21.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:22.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:23.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:23 np0005603623 nova_compute[226235]: 2026-01-31 09:26:23.981 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:24.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:25.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:25 np0005603623 nova_compute[226235]: 2026-01-31 09:26:25.273 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:26.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:27.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:28.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:28 np0005603623 nova_compute[226235]: 2026-01-31 09:26:28.983 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:26:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:29.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:26:29 np0005603623 podman[346595]: 2026-01-31 09:26:29.23629904 +0000 UTC m=+0.049679329 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:26:29 np0005603623 podman[346596]: 2026-01-31 09:26:29.28981018 +0000 UTC m=+0.101486356 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 04:26:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:26:30.179 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:26:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:26:30.179 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:26:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:26:30.179 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:26:30 np0005603623 nova_compute[226235]: 2026-01-31 09:26:30.274 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:30.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:31.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:32.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:33.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:33 np0005603623 nova_compute[226235]: 2026-01-31 09:26:33.984 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:34.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:35.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:35 np0005603623 nova_compute[226235]: 2026-01-31 09:26:35.276 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:36.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 04:26:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 04:26:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:26:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:37.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:26:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 04:26:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:26:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:37 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:26:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:38.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:38 np0005603623 nova_compute[226235]: 2026-01-31 09:26:38.985 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:39.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:39 np0005603623 nova_compute[226235]: 2026-01-31 09:26:39.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:39 np0005603623 nova_compute[226235]: 2026-01-31 09:26:39.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:26:40 np0005603623 nova_compute[226235]: 2026-01-31 09:26:40.278 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:40.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:41.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:41 np0005603623 nova_compute[226235]: 2026-01-31 09:26:41.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:42.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:43.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:43 np0005603623 nova_compute[226235]: 2026-01-31 09:26:43.752 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:26:43 np0005603623 nova_compute[226235]: 2026-01-31 09:26:43.752 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:26:43 np0005603623 nova_compute[226235]: 2026-01-31 09:26:43.753 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:26:43 np0005603623 nova_compute[226235]: 2026-01-31 09:26:43.753 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:26:43 np0005603623 nova_compute[226235]: 2026-01-31 09:26:43.753 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:26:43 np0005603623 nova_compute[226235]: 2026-01-31 09:26:43.987 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:26:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/78309093' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.145 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.391s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.280 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.281 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4090MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.282 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.282 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.369 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.369 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.434 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:26:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:44 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:44.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:26:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1421262946' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.840 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.845 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.877 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.879 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:26:44 np0005603623 nova_compute[226235]: 2026-01-31 09:26:44.879 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:26:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:45.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:45 np0005603623 nova_compute[226235]: 2026-01-31 09:26:45.320 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:45 np0005603623 nova_compute[226235]: 2026-01-31 09:26:45.879 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:45 np0005603623 nova_compute[226235]: 2026-01-31 09:26:45.880 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:26:45 np0005603623 nova_compute[226235]: 2026-01-31 09:26:45.880 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:26:45 np0005603623 nova_compute[226235]: 2026-01-31 09:26:45.901 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:26:45 np0005603623 nova_compute[226235]: 2026-01-31 09:26:45.901 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:45 np0005603623 nova_compute[226235]: 2026-01-31 09:26:45.902 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:45 np0005603623 nova_compute[226235]: 2026-01-31 09:26:45.902 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:46.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:47.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:48.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:48 np0005603623 nova_compute[226235]: 2026-01-31 09:26:48.989 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:49.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:49 np0005603623 nova_compute[226235]: 2026-01-31 09:26:49.173 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:50 np0005603623 nova_compute[226235]: 2026-01-31 09:26:50.321 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:50.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:51.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:51 np0005603623 nova_compute[226235]: 2026-01-31 09:26:51.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:52 np0005603623 nova_compute[226235]: 2026-01-31 09:26:52.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #205. Immutable memtables: 0.
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.308645) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 131] Flushing memtable with next log file: 205
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851612308726, "job": 131, "event": "flush_started", "num_memtables": 1, "num_entries": 2357, "num_deletes": 251, "total_data_size": 6096328, "memory_usage": 6182304, "flush_reason": "Manual Compaction"}
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 131] Level-0 flush table #206: started
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851612350300, "cf_name": "default", "job": 131, "event": "table_file_creation", "file_number": 206, "file_size": 3939867, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 98496, "largest_seqno": 100848, "table_properties": {"data_size": 3930184, "index_size": 6176, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19324, "raw_average_key_size": 20, "raw_value_size": 3911098, "raw_average_value_size": 4108, "num_data_blocks": 270, "num_entries": 952, "num_filter_entries": 952, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851397, "oldest_key_time": 1769851397, "file_creation_time": 1769851612, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 206, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 131] Flush lasted 41691 microseconds, and 5776 cpu microseconds.
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.350345) [db/flush_job.cc:967] [default] [JOB 131] Level-0 flush table #206: 3939867 bytes OK
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.350362) [db/memtable_list.cc:519] [default] Level-0 commit table #206 started
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.352200) [db/memtable_list.cc:722] [default] Level-0 commit table #206: memtable #1 done
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.352212) EVENT_LOG_v1 {"time_micros": 1769851612352209, "job": 131, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.352228) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 131] Try to delete WAL files size 6085986, prev total WAL file size 6085986, number of live WAL files 2.
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000202.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.353078) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038373835' seq:72057594037927935, type:22 .. '7061786F730039303337' seq:0, type:0; will stop at (end)
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 132] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 131 Base level 0, inputs: [206(3847KB)], [204(12MB)]
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851612353155, "job": 132, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [206], "files_L6": [204], "score": -1, "input_data_size": 17159625, "oldest_snapshot_seqno": -1}
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 132] Generated table #207: 12256 keys, 15188326 bytes, temperature: kUnknown
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851612471155, "cf_name": "default", "job": 132, "event": "table_file_creation", "file_number": 207, "file_size": 15188326, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15109977, "index_size": 46620, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30661, "raw_key_size": 323331, "raw_average_key_size": 26, "raw_value_size": 14897067, "raw_average_value_size": 1215, "num_data_blocks": 1775, "num_entries": 12256, "num_filter_entries": 12256, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769851612, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 207, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.471390) [db/compaction/compaction_job.cc:1663] [default] [JOB 132] Compacted 1@0 + 1@6 files to L6 => 15188326 bytes
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.473122) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 145.3 rd, 128.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 12.6 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(8.2) write-amplify(3.9) OK, records in: 12775, records dropped: 519 output_compression: NoCompression
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.473138) EVENT_LOG_v1 {"time_micros": 1769851612473130, "job": 132, "event": "compaction_finished", "compaction_time_micros": 118082, "compaction_time_cpu_micros": 24854, "output_level": 6, "num_output_files": 1, "total_output_size": 15188326, "num_input_records": 12775, "num_output_records": 12256, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000206.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851612473953, "job": 132, "event": "table_file_deletion", "file_number": 206}
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000204.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851612475280, "job": 132, "event": "table_file_deletion", "file_number": 204}
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.352958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.475391) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.475398) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.475400) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.475402) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:26:52 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:26:52.475403) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:26:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:52.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:53.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:53 np0005603623 nova_compute[226235]: 2026-01-31 09:26:53.991 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:26:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:54.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:26:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:55.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:55 np0005603623 nova_compute[226235]: 2026-01-31 09:26:55.323 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:56.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:57.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:58.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:58 np0005603623 nova_compute[226235]: 2026-01-31 09:26:58.992 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:26:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:26:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:59.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:26:59 np0005603623 podman[347074]: 2026-01-31 09:26:59.977762563 +0000 UTC m=+0.070026308 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 04:26:59 np0005603623 podman[347075]: 2026-01-31 09:26:59.979702095 +0000 UTC m=+0.072066454 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Jan 31 04:27:00 np0005603623 nova_compute[226235]: 2026-01-31 09:27:00.324 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:00.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:01.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:02.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:03.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:03 np0005603623 nova_compute[226235]: 2026-01-31 09:27:03.994 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:04.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:05.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:05 np0005603623 nova_compute[226235]: 2026-01-31 09:27:05.326 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:06.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:07.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:08.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:08 np0005603623 nova_compute[226235]: 2026-01-31 09:27:08.996 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:27:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:09.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:27:10 np0005603623 nova_compute[226235]: 2026-01-31 09:27:10.367 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:10.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:11.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:12.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:13.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:14 np0005603623 nova_compute[226235]: 2026-01-31 09:27:14.047 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:14.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:15.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:15 np0005603623 nova_compute[226235]: 2026-01-31 09:27:15.368 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:16 np0005603623 nova_compute[226235]: 2026-01-31 09:27:16.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:16 np0005603623 nova_compute[226235]: 2026-01-31 09:27:16.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:27:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:27:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:16.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:27:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:17.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:18.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:19 np0005603623 nova_compute[226235]: 2026-01-31 09:27:19.050 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:19.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:20 np0005603623 nova_compute[226235]: 2026-01-31 09:27:20.370 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:20.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:21.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:22.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:23.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:24 np0005603623 nova_compute[226235]: 2026-01-31 09:27:24.051 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:24.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:27:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:25.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:27:25 np0005603623 nova_compute[226235]: 2026-01-31 09:27:25.417 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:26.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:27.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:27:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:28.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:27:29 np0005603623 nova_compute[226235]: 2026-01-31 09:27:29.054 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:29.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:27:30.179 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:27:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:27:30.180 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:27:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:27:30.180 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:27:30 np0005603623 nova_compute[226235]: 2026-01-31 09:27:30.419 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:27:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:30.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:27:30 np0005603623 podman[347237]: 2026-01-31 09:27:30.958154188 +0000 UTC m=+0.050553167 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:27:30 np0005603623 podman[347238]: 2026-01-31 09:27:30.996324277 +0000 UTC m=+0.075378147 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 04:27:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:27:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:31.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:27:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:32.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:33.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:34 np0005603623 nova_compute[226235]: 2026-01-31 09:27:34.055 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:34.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:27:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:35.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:27:35 np0005603623 nova_compute[226235]: 2026-01-31 09:27:35.449 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:36.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:37.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:38.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:39 np0005603623 nova_compute[226235]: 2026-01-31 09:27:39.057 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:39.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:40 np0005603623 nova_compute[226235]: 2026-01-31 09:27:40.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:40 np0005603623 nova_compute[226235]: 2026-01-31 09:27:40.451 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:27:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:40.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:27:41 np0005603623 nova_compute[226235]: 2026-01-31 09:27:41.174 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:41 np0005603623 nova_compute[226235]: 2026-01-31 09:27:41.175 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:27:41 np0005603623 nova_compute[226235]: 2026-01-31 09:27:41.175 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:41.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:41 np0005603623 nova_compute[226235]: 2026-01-31 09:27:41.916 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:27:41 np0005603623 nova_compute[226235]: 2026-01-31 09:27:41.917 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:27:41 np0005603623 nova_compute[226235]: 2026-01-31 09:27:41.917 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:27:41 np0005603623 nova_compute[226235]: 2026-01-31 09:27:41.917 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:27:41 np0005603623 nova_compute[226235]: 2026-01-31 09:27:41.917 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:27:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:27:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2682941791' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:27:42 np0005603623 nova_compute[226235]: 2026-01-31 09:27:42.337 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:27:42 np0005603623 nova_compute[226235]: 2026-01-31 09:27:42.463 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:27:42 np0005603623 nova_compute[226235]: 2026-01-31 09:27:42.464 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4097MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:27:42 np0005603623 nova_compute[226235]: 2026-01-31 09:27:42.464 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:27:42 np0005603623 nova_compute[226235]: 2026-01-31 09:27:42.465 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:27:42 np0005603623 nova_compute[226235]: 2026-01-31 09:27:42.569 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:27:42 np0005603623 nova_compute[226235]: 2026-01-31 09:27:42.569 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:27:42 np0005603623 nova_compute[226235]: 2026-01-31 09:27:42.597 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:27:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:42.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:27:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3569290171' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:27:43 np0005603623 nova_compute[226235]: 2026-01-31 09:27:43.111 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:27:43 np0005603623 nova_compute[226235]: 2026-01-31 09:27:43.117 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:27:43 np0005603623 nova_compute[226235]: 2026-01-31 09:27:43.144 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:27:43 np0005603623 nova_compute[226235]: 2026-01-31 09:27:43.146 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:27:43 np0005603623 nova_compute[226235]: 2026-01-31 09:27:43.146 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:27:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:43.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:44 np0005603623 nova_compute[226235]: 2026-01-31 09:27:44.059 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:44.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:45 np0005603623 nova_compute[226235]: 2026-01-31 09:27:45.126 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:45 np0005603623 nova_compute[226235]: 2026-01-31 09:27:45.126 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:27:45 np0005603623 nova_compute[226235]: 2026-01-31 09:27:45.126 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:27:45 np0005603623 nova_compute[226235]: 2026-01-31 09:27:45.151 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:27:45 np0005603623 nova_compute[226235]: 2026-01-31 09:27:45.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:45 np0005603623 nova_compute[226235]: 2026-01-31 09:27:45.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:45 np0005603623 nova_compute[226235]: 2026-01-31 09:27:45.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:27:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:45.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:27:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:27:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:27:45 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:27:45 np0005603623 nova_compute[226235]: 2026-01-31 09:27:45.452 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:46.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:27:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:47.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:27:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:48.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:49 np0005603623 nova_compute[226235]: 2026-01-31 09:27:49.061 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:49.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:50 np0005603623 nova_compute[226235]: 2026-01-31 09:27:50.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:50 np0005603623 nova_compute[226235]: 2026-01-31 09:27:50.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:50 np0005603623 nova_compute[226235]: 2026-01-31 09:27:50.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:27:50 np0005603623 nova_compute[226235]: 2026-01-31 09:27:50.494 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:50 np0005603623 nova_compute[226235]: 2026-01-31 09:27:50.727 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:27:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:50.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:51.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:52 np0005603623 nova_compute[226235]: 2026-01-31 09:27:52.726 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:52 np0005603623 nova_compute[226235]: 2026-01-31 09:27:52.727 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:52.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:53.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:54 np0005603623 nova_compute[226235]: 2026-01-31 09:27:54.062 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:54 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:27:54 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:27:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:27:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:54.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:27:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:55.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:55 np0005603623 nova_compute[226235]: 2026-01-31 09:27:55.496 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:56.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:27:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:57.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:27:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:58.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:59 np0005603623 nova_compute[226235]: 2026-01-31 09:27:59.066 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:27:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:59.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:00 np0005603623 nova_compute[226235]: 2026-01-31 09:28:00.498 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:00.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:01.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:01 np0005603623 podman[347575]: 2026-01-31 09:28:01.947088289 +0000 UTC m=+0.040827102 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:28:01 np0005603623 podman[347576]: 2026-01-31 09:28:01.969561264 +0000 UTC m=+0.061708547 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:28:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:02.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:28:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:03.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:28:04 np0005603623 nova_compute[226235]: 2026-01-31 09:28:04.067 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:04 np0005603623 nova_compute[226235]: 2026-01-31 09:28:04.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:04.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:05.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:05 np0005603623 nova_compute[226235]: 2026-01-31 09:28:05.531 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:06.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:07.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:08.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:09 np0005603623 nova_compute[226235]: 2026-01-31 09:28:09.067 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:09.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:10 np0005603623 nova_compute[226235]: 2026-01-31 09:28:10.533 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:10.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:11.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:12.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:13.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:14 np0005603623 nova_compute[226235]: 2026-01-31 09:28:14.068 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:14.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:15.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:15 np0005603623 nova_compute[226235]: 2026-01-31 09:28:15.534 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:16.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:17.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #208. Immutable memtables: 0.
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:18.623100) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 133] Flushing memtable with next log file: 208
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851698623206, "job": 133, "event": "flush_started", "num_memtables": 1, "num_entries": 1007, "num_deletes": 250, "total_data_size": 2189826, "memory_usage": 2224408, "flush_reason": "Manual Compaction"}
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 133] Level-0 flush table #209: started
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851698692306, "cf_name": "default", "job": 133, "event": "table_file_creation", "file_number": 209, "file_size": 880301, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 100853, "largest_seqno": 101855, "table_properties": {"data_size": 876638, "index_size": 1378, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9702, "raw_average_key_size": 20, "raw_value_size": 868855, "raw_average_value_size": 1856, "num_data_blocks": 62, "num_entries": 468, "num_filter_entries": 468, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851613, "oldest_key_time": 1769851613, "file_creation_time": 1769851698, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 209, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 133] Flush lasted 69306 microseconds, and 4486 cpu microseconds.
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:18.692421) [db/flush_job.cc:967] [default] [JOB 133] Level-0 flush table #209: 880301 bytes OK
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:18.692496) [db/memtable_list.cc:519] [default] Level-0 commit table #209 started
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:18.750527) [db/memtable_list.cc:722] [default] Level-0 commit table #209: memtable #1 done
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:18.750612) EVENT_LOG_v1 {"time_micros": 1769851698750595, "job": 133, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:18.750653) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 133] Try to delete WAL files size 2184867, prev total WAL file size 2184867, number of live WAL files 2.
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000205.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:18.751793) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033353130' seq:72057594037927935, type:22 .. '6D6772737461740033373631' seq:0, type:0; will stop at (end)
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 134] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 133 Base level 0, inputs: [209(859KB)], [207(14MB)]
Jan 31 04:28:18 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851698751916, "job": 134, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [209], "files_L6": [207], "score": -1, "input_data_size": 16068627, "oldest_snapshot_seqno": -1}
Jan 31 04:28:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:28:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:18.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 134] Generated table #210: 12248 keys, 12786017 bytes, temperature: kUnknown
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851699012516, "cf_name": "default", "job": 134, "event": "table_file_creation", "file_number": 210, "file_size": 12786017, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12711396, "index_size": 42892, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30661, "raw_key_size": 323345, "raw_average_key_size": 26, "raw_value_size": 12502235, "raw_average_value_size": 1020, "num_data_blocks": 1622, "num_entries": 12248, "num_filter_entries": 12248, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769851698, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 210, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:19.012801) [db/compaction/compaction_job.cc:1663] [default] [JOB 134] Compacted 1@0 + 1@6 files to L6 => 12786017 bytes
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:19.036323) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 61.6 rd, 49.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 14.5 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(32.8) write-amplify(14.5) OK, records in: 12724, records dropped: 476 output_compression: NoCompression
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:19.036397) EVENT_LOG_v1 {"time_micros": 1769851699036368, "job": 134, "event": "compaction_finished", "compaction_time_micros": 260677, "compaction_time_cpu_micros": 28738, "output_level": 6, "num_output_files": 1, "total_output_size": 12786017, "num_input_records": 12724, "num_output_records": 12248, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000209.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851699036975, "job": 134, "event": "table_file_deletion", "file_number": 209}
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000207.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851699040398, "job": 134, "event": "table_file_deletion", "file_number": 207}
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:18.751634) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:19.040530) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:19.040537) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:19.040539) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:19.040541) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:19 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:19.040542) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:19 np0005603623 nova_compute[226235]: 2026-01-31 09:28:19.072 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:19.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:20 np0005603623 nova_compute[226235]: 2026-01-31 09:28:20.536 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:20.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:28:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:21.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:28:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:22.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:23.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #211. Immutable memtables: 0.
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.507287) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 135] Flushing memtable with next log file: 211
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851703507316, "job": 135, "event": "flush_started", "num_memtables": 1, "num_entries": 304, "num_deletes": 257, "total_data_size": 119322, "memory_usage": 125328, "flush_reason": "Manual Compaction"}
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 135] Level-0 flush table #212: started
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851703536631, "cf_name": "default", "job": 135, "event": "table_file_creation", "file_number": 212, "file_size": 78285, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 101860, "largest_seqno": 102159, "table_properties": {"data_size": 76361, "index_size": 151, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4897, "raw_average_key_size": 17, "raw_value_size": 72497, "raw_average_value_size": 259, "num_data_blocks": 7, "num_entries": 279, "num_filter_entries": 279, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851699, "oldest_key_time": 1769851699, "file_creation_time": 1769851703, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 212, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 135] Flush lasted 29397 microseconds, and 788 cpu microseconds.
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.536678) [db/flush_job.cc:967] [default] [JOB 135] Level-0 flush table #212: 78285 bytes OK
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.536698) [db/memtable_list.cc:519] [default] Level-0 commit table #212 started
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.622543) [db/memtable_list.cc:722] [default] Level-0 commit table #212: memtable #1 done
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.622588) EVENT_LOG_v1 {"time_micros": 1769851703622578, "job": 135, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.622612) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 135] Try to delete WAL files size 117093, prev total WAL file size 117093, number of live WAL files 2.
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000208.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.623061) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303136' seq:72057594037927935, type:22 .. '6C6F676D0034323639' seq:0, type:0; will stop at (end)
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 136] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 135 Base level 0, inputs: [212(76KB)], [210(12MB)]
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851703623112, "job": 136, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [212], "files_L6": [210], "score": -1, "input_data_size": 12864302, "oldest_snapshot_seqno": -1}
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 136] Generated table #213: 12005 keys, 12746950 bytes, temperature: kUnknown
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851703782317, "cf_name": "default", "job": 136, "event": "table_file_creation", "file_number": 213, "file_size": 12746950, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12673404, "index_size": 42457, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30021, "raw_key_size": 319237, "raw_average_key_size": 26, "raw_value_size": 12467846, "raw_average_value_size": 1038, "num_data_blocks": 1601, "num_entries": 12005, "num_filter_entries": 12005, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769851703, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 213, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.782617) [db/compaction/compaction_job.cc:1663] [default] [JOB 136] Compacted 1@0 + 1@6 files to L6 => 12746950 bytes
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.806331) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 80.8 rd, 80.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 12.2 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(327.2) write-amplify(162.8) OK, records in: 12527, records dropped: 522 output_compression: NoCompression
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.806368) EVENT_LOG_v1 {"time_micros": 1769851703806354, "job": 136, "event": "compaction_finished", "compaction_time_micros": 159279, "compaction_time_cpu_micros": 22898, "output_level": 6, "num_output_files": 1, "total_output_size": 12746950, "num_input_records": 12527, "num_output_records": 12005, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000212.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851703806569, "job": 136, "event": "table_file_deletion", "file_number": 212}
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000210.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851703807613, "job": 136, "event": "table_file_deletion", "file_number": 210}
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.622933) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.807640) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.807644) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.807646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.807647) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:23 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:28:23.807649) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:24 np0005603623 nova_compute[226235]: 2026-01-31 09:28:24.073 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:24.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:25.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:25 np0005603623 nova_compute[226235]: 2026-01-31 09:28:25.537 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:26.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:27.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:28.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:29 np0005603623 nova_compute[226235]: 2026-01-31 09:28:29.074 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:29.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:28:30.181 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:28:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:28:30.181 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:28:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:28:30.181 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:28:30 np0005603623 nova_compute[226235]: 2026-01-31 09:28:30.539 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:30.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:31.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:28:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:32.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:28:32 np0005603623 podman[347735]: 2026-01-31 09:28:32.953130118 +0000 UTC m=+0.046624925 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 04:28:32 np0005603623 podman[347736]: 2026-01-31 09:28:32.968874392 +0000 UTC m=+0.058543939 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 31 04:28:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:33.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:34 np0005603623 nova_compute[226235]: 2026-01-31 09:28:34.075 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:34.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:35.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:35 np0005603623 nova_compute[226235]: 2026-01-31 09:28:35.540 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:36.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:37.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:38.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:39 np0005603623 nova_compute[226235]: 2026-01-31 09:28:39.108 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:28:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:39.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:28:40 np0005603623 nova_compute[226235]: 2026-01-31 09:28:40.542 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:28:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:40.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:28:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:28:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:41.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:28:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:42.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.190 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.190 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.190 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.191 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.191 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:28:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:43.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:28:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/75100032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.599 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.732 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.734 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4108MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.734 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.734 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.860 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.861 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.875 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.892 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.893 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.907 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.934 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:28:43 np0005603623 nova_compute[226235]: 2026-01-31 09:28:43.967 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:28:44 np0005603623 nova_compute[226235]: 2026-01-31 09:28:44.109 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:28:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2688154072' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:28:44 np0005603623 nova_compute[226235]: 2026-01-31 09:28:44.387 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:28:44 np0005603623 nova_compute[226235]: 2026-01-31 09:28:44.393 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:28:44 np0005603623 nova_compute[226235]: 2026-01-31 09:28:44.421 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:28:44 np0005603623 nova_compute[226235]: 2026-01-31 09:28:44.423 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:28:44 np0005603623 nova_compute[226235]: 2026-01-31 09:28:44.423 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.689s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:28:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:44.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:45.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:45 np0005603623 nova_compute[226235]: 2026-01-31 09:28:45.423 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:45 np0005603623 nova_compute[226235]: 2026-01-31 09:28:45.424 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:28:45 np0005603623 nova_compute[226235]: 2026-01-31 09:28:45.424 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:28:45 np0005603623 nova_compute[226235]: 2026-01-31 09:28:45.455 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:28:45 np0005603623 nova_compute[226235]: 2026-01-31 09:28:45.455 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:45 np0005603623 nova_compute[226235]: 2026-01-31 09:28:45.456 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:45 np0005603623 nova_compute[226235]: 2026-01-31 09:28:45.456 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:45 np0005603623 nova_compute[226235]: 2026-01-31 09:28:45.592 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:46.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:47.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:48.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:49 np0005603623 nova_compute[226235]: 2026-01-31 09:28:49.147 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:49.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:50 np0005603623 nova_compute[226235]: 2026-01-31 09:28:50.594 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:28:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:50.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:28:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:51.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:52 np0005603623 nova_compute[226235]: 2026-01-31 09:28:52.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:52 np0005603623 nova_compute[226235]: 2026-01-31 09:28:52.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:52.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:53.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:54 np0005603623 nova_compute[226235]: 2026-01-31 09:28:54.148 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:54 np0005603623 nova_compute[226235]: 2026-01-31 09:28:54.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:54.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:55.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:28:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:28:55 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:28:55 np0005603623 nova_compute[226235]: 2026-01-31 09:28:55.596 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:56.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:57.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:28:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:58.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:28:59 np0005603623 nova_compute[226235]: 2026-01-31 09:28:59.150 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:28:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:59.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:00 np0005603623 nova_compute[226235]: 2026-01-31 09:29:00.597 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:00.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:01.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:02.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:03.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:03 np0005603623 podman[348017]: 2026-01-31 09:29:03.977957587 +0000 UTC m=+0.071004289 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 04:29:03 np0005603623 podman[348016]: 2026-01-31 09:29:03.98601254 +0000 UTC m=+0.080312491 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:29:04 np0005603623 nova_compute[226235]: 2026-01-31 09:29:04.151 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:04.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:05.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:05 np0005603623 nova_compute[226235]: 2026-01-31 09:29:05.598 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:06.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:29:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:07.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:29:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:29:08 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:29:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:08.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:09 np0005603623 nova_compute[226235]: 2026-01-31 09:29:09.153 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:09.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:10 np0005603623 nova_compute[226235]: 2026-01-31 09:29:10.601 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:10.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:29:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:11.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:29:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:12.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:29:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:13.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:29:14 np0005603623 nova_compute[226235]: 2026-01-31 09:29:14.155 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:14.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:29:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:15.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:29:15 np0005603623 nova_compute[226235]: 2026-01-31 09:29:15.602 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:16.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:17.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:18.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:19 np0005603623 nova_compute[226235]: 2026-01-31 09:29:19.156 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:19.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:20 np0005603623 nova_compute[226235]: 2026-01-31 09:29:20.604 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:20.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:21.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:22.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:29:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:23.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:29:24 np0005603623 nova_compute[226235]: 2026-01-31 09:29:24.158 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:24.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:25.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:25 np0005603623 nova_compute[226235]: 2026-01-31 09:29:25.605 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:26.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:27.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:28.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:29 np0005603623 nova_compute[226235]: 2026-01-31 09:29:29.161 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:29.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:29:30.181 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:29:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:29:30.182 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:29:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:29:30.182 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:29:30 np0005603623 nova_compute[226235]: 2026-01-31 09:29:30.607 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:30.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:31.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:32.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:33.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:34 np0005603623 nova_compute[226235]: 2026-01-31 09:29:34.162 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:34.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:34 np0005603623 podman[348224]: 2026-01-31 09:29:34.9401237 +0000 UTC m=+0.040022577 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 04:29:34 np0005603623 podman[348225]: 2026-01-31 09:29:34.968356047 +0000 UTC m=+0.063916157 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:29:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:35.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:35 np0005603623 nova_compute[226235]: 2026-01-31 09:29:35.608 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:36.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:29:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:37.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:29:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:38.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:39 np0005603623 nova_compute[226235]: 2026-01-31 09:29:39.164 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:39.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:40 np0005603623 nova_compute[226235]: 2026-01-31 09:29:40.609 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:40.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:41.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:42.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:29:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:43.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:29:44 np0005603623 nova_compute[226235]: 2026-01-31 09:29:44.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:44 np0005603623 nova_compute[226235]: 2026-01-31 09:29:44.166 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:44.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.154 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.188 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.188 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.188 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.188 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.189 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.226 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.226 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.226 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.226 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.227 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:29:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:45.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.611 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:29:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3156215945' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.679 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.822 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.823 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4090MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.823 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:29:45 np0005603623 nova_compute[226235]: 2026-01-31 09:29:45.824 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:29:46 np0005603623 nova_compute[226235]: 2026-01-31 09:29:46.180 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:29:46 np0005603623 nova_compute[226235]: 2026-01-31 09:29:46.180 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:29:46 np0005603623 nova_compute[226235]: 2026-01-31 09:29:46.201 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:29:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:29:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/72677669' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:29:46 np0005603623 nova_compute[226235]: 2026-01-31 09:29:46.914 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.713s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:29:46 np0005603623 nova_compute[226235]: 2026-01-31 09:29:46.919 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:29:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:46.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:46 np0005603623 nova_compute[226235]: 2026-01-31 09:29:46.940 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:29:46 np0005603623 nova_compute[226235]: 2026-01-31 09:29:46.941 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:29:46 np0005603623 nova_compute[226235]: 2026-01-31 09:29:46.942 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:29:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:47.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:47 np0005603623 nova_compute[226235]: 2026-01-31 09:29:47.908 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:48.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:49 np0005603623 nova_compute[226235]: 2026-01-31 09:29:49.167 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:49.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:50 np0005603623 nova_compute[226235]: 2026-01-31 09:29:50.612 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:50.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:29:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:51.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:29:52 np0005603623 nova_compute[226235]: 2026-01-31 09:29:52.151 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:29:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:52.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:29:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:53.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:54 np0005603623 nova_compute[226235]: 2026-01-31 09:29:54.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:54 np0005603623 nova_compute[226235]: 2026-01-31 09:29:54.168 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:54.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:55 np0005603623 nova_compute[226235]: 2026-01-31 09:29:55.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:55.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:55 np0005603623 nova_compute[226235]: 2026-01-31 09:29:55.614 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:29:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:56.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:29:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:29:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:57.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:29:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:29:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:58.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:29:59 np0005603623 nova_compute[226235]: 2026-01-31 09:29:59.228 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:29:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:59.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:00 np0005603623 ceph-mon[77037]: overall HEALTH_OK
Jan 31 04:30:00 np0005603623 nova_compute[226235]: 2026-01-31 09:30:00.615 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:00.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:01.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:30:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:02.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:30:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:30:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:03.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:30:04 np0005603623 nova_compute[226235]: 2026-01-31 09:30:04.230 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:30:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:04.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:30:05 np0005603623 nova_compute[226235]: 2026-01-31 09:30:05.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:30:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:05.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:30:05 np0005603623 nova_compute[226235]: 2026-01-31 09:30:05.616 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:05 np0005603623 podman[348508]: 2026-01-31 09:30:05.950564269 +0000 UTC m=+0.041499513 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 04:30:06 np0005603623 podman[348509]: 2026-01-31 09:30:06.003268904 +0000 UTC m=+0.091849764 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 04:30:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:30:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:30:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:30:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:30:06 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:30:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:06.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:07.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:07 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:08.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:09 np0005603623 nova_compute[226235]: 2026-01-31 09:30:09.231 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:30:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:09.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:30:10 np0005603623 nova_compute[226235]: 2026-01-31 09:30:10.617 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:10 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:10 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:30:10 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:10.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:30:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:11.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:12 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:12 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:12 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:12.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:12 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:13.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:14 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:30:14 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:30:14 np0005603623 nova_compute[226235]: 2026-01-31 09:30:14.232 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:14 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:14 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:14 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:14.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:15.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:15 np0005603623 nova_compute[226235]: 2026-01-31 09:30:15.619 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:16 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:16 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:16 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:16.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:17.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:17 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:18 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:18 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:18 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:18.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:19 np0005603623 nova_compute[226235]: 2026-01-31 09:30:19.233 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:30:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:19.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:30:20 np0005603623 nova_compute[226235]: 2026-01-31 09:30:20.620 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:20 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:20 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:20 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:20.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:21.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:22 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:22 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:22 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:22.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:22 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:23.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:24 np0005603623 nova_compute[226235]: 2026-01-31 09:30:24.235 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:24 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:24 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:24 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:24.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:25.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:25 np0005603623 nova_compute[226235]: 2026-01-31 09:30:25.621 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:26 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:26 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:26 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:26.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:27.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:27 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:28 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:28 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:28 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:28.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:29 np0005603623 nova_compute[226235]: 2026-01-31 09:30:29.237 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:29.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:30:30.182 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:30:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:30:30.183 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:30:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:30:30.183 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:30:30 np0005603623 nova_compute[226235]: 2026-01-31 09:30:30.624 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:30 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:30 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:30 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:30.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:31.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:32 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:32 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:32 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:32.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:33.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:34 np0005603623 nova_compute[226235]: 2026-01-31 09:30:34.237 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:34 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:34 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:34 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:34.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:30:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:35.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:30:35 np0005603623 nova_compute[226235]: 2026-01-31 09:30:35.626 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:36 np0005603623 podman[348723]: 2026-01-31 09:30:36.95439436 +0000 UTC m=+0.041835954 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 04:30:36 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:36 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:36 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:36.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:36 np0005603623 podman[348724]: 2026-01-31 09:30:36.977732353 +0000 UTC m=+0.061998757 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Jan 31 04:30:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:30:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:37.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:30:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:38 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:38 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:38 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:38.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:39 np0005603623 nova_compute[226235]: 2026-01-31 09:30:39.238 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:30:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:39.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:30:40 np0005603623 nova_compute[226235]: 2026-01-31 09:30:40.627 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:40 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:40 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:40 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:40.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:41.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:42 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:42 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:42 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:42.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:30:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:43.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:30:44 np0005603623 nova_compute[226235]: 2026-01-31 09:30:44.241 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:44 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:44 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:44 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:44.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.191 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.191 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.191 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.191 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.191 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:30:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:30:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:45.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:30:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:30:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2847566898' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.583 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.392s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.628 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.721 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.723 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4085MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.723 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.723 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.935 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.935 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:30:45 np0005603623 nova_compute[226235]: 2026-01-31 09:30:45.955 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #214. Immutable memtables: 0.
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.307878) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 137] Flushing memtable with next log file: 214
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851846307945, "job": 137, "event": "flush_started", "num_memtables": 1, "num_entries": 1538, "num_deletes": 251, "total_data_size": 3641131, "memory_usage": 3674064, "flush_reason": "Manual Compaction"}
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 137] Level-0 flush table #215: started
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851846329130, "cf_name": "default", "job": 137, "event": "table_file_creation", "file_number": 215, "file_size": 2392709, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 102164, "largest_seqno": 103697, "table_properties": {"data_size": 2386157, "index_size": 3750, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13683, "raw_average_key_size": 20, "raw_value_size": 2373102, "raw_average_value_size": 3479, "num_data_blocks": 166, "num_entries": 682, "num_filter_entries": 682, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851704, "oldest_key_time": 1769851704, "file_creation_time": 1769851846, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 215, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 137] Flush lasted 21272 microseconds, and 4537 cpu microseconds.
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.329180) [db/flush_job.cc:967] [default] [JOB 137] Level-0 flush table #215: 2392709 bytes OK
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.329200) [db/memtable_list.cc:519] [default] Level-0 commit table #215 started
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.330509) [db/memtable_list.cc:722] [default] Level-0 commit table #215: memtable #1 done
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.330526) EVENT_LOG_v1 {"time_micros": 1769851846330521, "job": 137, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.330543) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 137] Try to delete WAL files size 3634062, prev total WAL file size 3634062, number of live WAL files 2.
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000211.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.331006) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730039303336' seq:72057594037927935, type:22 .. '7061786F730039323838' seq:0, type:0; will stop at (end)
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 138] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 137 Base level 0, inputs: [215(2336KB)], [213(12MB)]
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851846331045, "job": 138, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [215], "files_L6": [213], "score": -1, "input_data_size": 15139659, "oldest_snapshot_seqno": -1}
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1919113529' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:30:46 np0005603623 nova_compute[226235]: 2026-01-31 09:30:46.364 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:30:46 np0005603623 nova_compute[226235]: 2026-01-31 09:30:46.368 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:30:46 np0005603623 nova_compute[226235]: 2026-01-31 09:30:46.383 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:30:46 np0005603623 nova_compute[226235]: 2026-01-31 09:30:46.385 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:30:46 np0005603623 nova_compute[226235]: 2026-01-31 09:30:46.385 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 138] Generated table #216: 12170 keys, 13038355 bytes, temperature: kUnknown
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851846498396, "cf_name": "default", "job": 138, "event": "table_file_creation", "file_number": 216, "file_size": 13038355, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12963864, "index_size": 42954, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30469, "raw_key_size": 323350, "raw_average_key_size": 26, "raw_value_size": 12755581, "raw_average_value_size": 1048, "num_data_blocks": 1614, "num_entries": 12170, "num_filter_entries": 12170, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769851846, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 216, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.498696) [db/compaction/compaction_job.cc:1663] [default] [JOB 138] Compacted 1@0 + 1@6 files to L6 => 13038355 bytes
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.500515) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 90.4 rd, 77.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 12.2 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(11.8) write-amplify(5.4) OK, records in: 12687, records dropped: 517 output_compression: NoCompression
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.500552) EVENT_LOG_v1 {"time_micros": 1769851846500538, "job": 138, "event": "compaction_finished", "compaction_time_micros": 167465, "compaction_time_cpu_micros": 26202, "output_level": 6, "num_output_files": 1, "total_output_size": 13038355, "num_input_records": 12687, "num_output_records": 12170, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000215.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851846500900, "job": 138, "event": "table_file_deletion", "file_number": 215}
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000213.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851846502476, "job": 138, "event": "table_file_deletion", "file_number": 213}
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.330963) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.502601) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.502607) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.502610) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.502611) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:30:46 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:30:46.502613) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:30:46 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:46 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:30:46 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:46.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:30:47 np0005603623 nova_compute[226235]: 2026-01-31 09:30:47.386 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:47 np0005603623 nova_compute[226235]: 2026-01-31 09:30:47.386 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:30:47 np0005603623 nova_compute[226235]: 2026-01-31 09:30:47.387 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:30:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:47.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:47 np0005603623 nova_compute[226235]: 2026-01-31 09:30:47.401 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:30:47 np0005603623 nova_compute[226235]: 2026-01-31 09:30:47.402 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:47 np0005603623 nova_compute[226235]: 2026-01-31 09:30:47.403 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:47 np0005603623 nova_compute[226235]: 2026-01-31 09:30:47.403 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:30:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:48 np0005603623 nova_compute[226235]: 2026-01-31 09:30:48.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:48 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:48 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:48 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:48.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:49 np0005603623 nova_compute[226235]: 2026-01-31 09:30:49.243 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:49.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:50 np0005603623 nova_compute[226235]: 2026-01-31 09:30:50.630 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:50 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:50 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:50 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:50.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:51.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:52 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:52 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:52 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:52.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:52 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:30:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:53.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:30:54 np0005603623 nova_compute[226235]: 2026-01-31 09:30:54.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:54 np0005603623 nova_compute[226235]: 2026-01-31 09:30:54.243 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:54 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:54 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:54 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:54.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:55 np0005603623 nova_compute[226235]: 2026-01-31 09:30:55.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:30:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:55.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:30:55 np0005603623 nova_compute[226235]: 2026-01-31 09:30:55.631 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:56 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:56 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:56 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:56.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:57 np0005603623 nova_compute[226235]: 2026-01-31 09:30:57.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:57.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:57 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:58 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:58 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:58 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:58.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:59 np0005603623 nova_compute[226235]: 2026-01-31 09:30:59.288 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:30:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:30:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:59.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:31:00 np0005603623 nova_compute[226235]: 2026-01-31 09:31:00.635 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:00 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:00 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:00 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:00.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:01.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:02 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:02 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:02 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:02.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:02 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:03.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:04 np0005603623 nova_compute[226235]: 2026-01-31 09:31:04.290 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:04 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:04 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:04 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:04.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:05.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:05 np0005603623 nova_compute[226235]: 2026-01-31 09:31:05.638 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:06 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:06 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:06 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:06.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:07.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:07 np0005603623 podman[348877]: 2026-01-31 09:31:07.945610859 +0000 UTC m=+0.044701314 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:31:07 np0005603623 podman[348878]: 2026-01-31 09:31:07.969068104 +0000 UTC m=+0.065758424 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, org.label-schema.build-date=20260127)
Jan 31 04:31:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:08 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:08 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:08 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:08.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:09 np0005603623 nova_compute[226235]: 2026-01-31 09:31:09.292 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:09.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:10 np0005603623 nova_compute[226235]: 2026-01-31 09:31:10.639 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:10.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:11.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:13.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:31:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:13.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:31:14 np0005603623 nova_compute[226235]: 2026-01-31 09:31:14.294 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:31:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3529266027' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:31:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:31:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3529266027' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:31:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:15.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:15.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:15 np0005603623 nova_compute[226235]: 2026-01-31 09:31:15.642 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:31:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:31:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:31:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:31:16 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:31:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:17.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:17.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:19.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:19 np0005603623 nova_compute[226235]: 2026-01-31 09:31:19.344 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:19.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:20 np0005603623 nova_compute[226235]: 2026-01-31 09:31:20.645 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:21.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:31:21 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:31:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:31:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:21.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:31:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:23.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:23.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:24 np0005603623 nova_compute[226235]: 2026-01-31 09:31:24.346 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:25.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:31:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:25.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:31:25 np0005603623 nova_compute[226235]: 2026-01-31 09:31:25.647 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:27.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:27.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:31:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:29.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:31:29 np0005603623 nova_compute[226235]: 2026-01-31 09:31:29.347 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:29.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:31:30.184 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:31:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:31:30.184 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:31:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:31:30.184 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:31:30 np0005603623 nova_compute[226235]: 2026-01-31 09:31:30.649 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:31.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:31.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:33.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:33.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:34 np0005603623 nova_compute[226235]: 2026-01-31 09:31:34.348 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:35.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:31:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:35.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:31:35 np0005603623 nova_compute[226235]: 2026-01-31 09:31:35.651 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:31:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:37.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:31:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:31:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:37.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:31:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:38 np0005603623 podman[349220]: 2026-01-31 09:31:38.957146358 +0000 UTC m=+0.045105396 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 04:31:38 np0005603623 podman[349221]: 2026-01-31 09:31:38.97730299 +0000 UTC m=+0.064291599 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 31 04:31:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:39.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:39 np0005603623 nova_compute[226235]: 2026-01-31 09:31:39.350 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:31:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:39.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:31:40 np0005603623 nova_compute[226235]: 2026-01-31 09:31:40.652 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:31:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:41.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:31:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:41.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:43.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:43.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:44 np0005603623 nova_compute[226235]: 2026-01-31 09:31:44.352 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:45.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.184 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.184 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.185 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.185 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.185 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:31:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:31:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:45.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:31:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:31:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3713697666' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.609 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.653 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.746 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.747 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4097MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.748 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.748 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.856 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.857 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:31:45 np0005603623 nova_compute[226235]: 2026-01-31 09:31:45.958 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:31:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:31:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/255829661' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:31:46 np0005603623 nova_compute[226235]: 2026-01-31 09:31:46.347 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.389s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:31:46 np0005603623 nova_compute[226235]: 2026-01-31 09:31:46.353 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 04:31:46 np0005603623 nova_compute[226235]: 2026-01-31 09:31:46.392 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 04:31:46 np0005603623 nova_compute[226235]: 2026-01-31 09:31:46.394 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 04:31:46 np0005603623 nova_compute[226235]: 2026-01-31 09:31:46.395 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:31:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:31:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:47.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:31:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:31:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:47.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:31:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:49.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:49 np0005603623 nova_compute[226235]: 2026-01-31 09:31:49.354 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:31:49 np0005603623 nova_compute[226235]: 2026-01-31 09:31:49.394 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:31:49 np0005603623 nova_compute[226235]: 2026-01-31 09:31:49.395 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 04:31:49 np0005603623 nova_compute[226235]: 2026-01-31 09:31:49.395 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 04:31:49 np0005603623 nova_compute[226235]: 2026-01-31 09:31:49.415 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 31 04:31:49 np0005603623 nova_compute[226235]: 2026-01-31 09:31:49.415 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:31:49 np0005603623 nova_compute[226235]: 2026-01-31 09:31:49.416 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:31:49 np0005603623 nova_compute[226235]: 2026-01-31 09:31:49.416 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:31:49 np0005603623 nova_compute[226235]: 2026-01-31 09:31:49.417 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 04:31:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:49.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:50 np0005603623 nova_compute[226235]: 2026-01-31 09:31:50.659 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:31:50 np0005603623 systemd[1]: Starting dnf makecache...
Jan 31 04:31:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:51.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:51 np0005603623 dnf[349339]: Metadata cache refreshed recently.
Jan 31 04:31:51 np0005603623 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 31 04:31:51 np0005603623 systemd[1]: Finished dnf makecache.
Jan 31 04:31:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:31:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:51.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:31:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:31:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:53.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:31:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:31:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:53.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:31:54 np0005603623 nova_compute[226235]: 2026-01-31 09:31:54.172 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:31:54 np0005603623 nova_compute[226235]: 2026-01-31 09:31:54.356 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:31:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:55.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:55.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:55 np0005603623 nova_compute[226235]: 2026-01-31 09:31:55.661 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:31:56 np0005603623 nova_compute[226235]: 2026-01-31 09:31:56.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:31:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:31:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:57.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:31:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:57.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:31:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:59.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:31:59 np0005603623 nova_compute[226235]: 2026-01-31 09:31:59.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:31:59 np0005603623 nova_compute[226235]: 2026-01-31 09:31:59.358 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:31:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:31:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:59.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:00 np0005603623 nova_compute[226235]: 2026-01-31 09:32:00.662 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:32:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:01.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:01.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:03.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:03.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:04 np0005603623 nova_compute[226235]: 2026-01-31 09:32:04.360 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:32:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:05.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:05.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:05 np0005603623 nova_compute[226235]: 2026-01-31 09:32:05.664 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:32:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:07.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:07.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:09.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:09 np0005603623 nova_compute[226235]: 2026-01-31 09:32:09.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:32:09 np0005603623 nova_compute[226235]: 2026-01-31 09:32:09.362 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:32:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:09.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:09 np0005603623 podman[349375]: 2026-01-31 09:32:09.945402456 +0000 UTC m=+0.043782966 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:32:09 np0005603623 podman[349376]: 2026-01-31 09:32:09.964818795 +0000 UTC m=+0.063199215 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:32:10 np0005603623 nova_compute[226235]: 2026-01-31 09:32:10.666 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:32:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:32:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:11.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:32:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:11.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:13.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:13.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:14 np0005603623 nova_compute[226235]: 2026-01-31 09:32:14.363 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:32:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:32:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/367454158' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:32:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:32:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/367454158' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:32:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:15.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:15.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:15 np0005603623 nova_compute[226235]: 2026-01-31 09:32:15.667 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:32:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:17.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:17.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:19.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:19 np0005603623 nova_compute[226235]: 2026-01-31 09:32:19.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:32:19 np0005603623 nova_compute[226235]: 2026-01-31 09:32:19.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 31 04:32:19 np0005603623 nova_compute[226235]: 2026-01-31 09:32:19.365 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:32:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:19.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:20 np0005603623 nova_compute[226235]: 2026-01-31 09:32:20.679 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:32:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:21.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:21.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:32:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:32:22 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:32:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:23.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:23.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:24 np0005603623 nova_compute[226235]: 2026-01-31 09:32:24.367 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:25.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:25.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:32:25 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.0 total, 600.0 interval#012Cumulative writes: 20K writes, 104K keys, 20K commit groups, 1.0 writes per commit group, ingest: 0.21 GB, 0.03 MB/s#012Cumulative WAL: 20K writes, 20K syncs, 1.00 writes per sync, written: 0.21 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1438 writes, 6904 keys, 1438 commit groups, 1.0 writes per commit group, ingest: 15.33 MB, 0.03 MB/s#012Interval WAL: 1438 writes, 1438 syncs, 1.00 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     45.8      2.86              0.31        69    0.041       0      0       0.0       0.0#012  L6      1/0   12.43 MB   0.0      0.8     0.1      0.7       0.7      0.0       0.0   5.6     68.0     58.5     12.49              1.56        68    0.184    578K    36K       0.0       0.0#012 Sum      1/0   12.43 MB   0.0      0.8     0.1      0.7       0.8      0.1       0.0   6.6     55.3     56.1     15.35              1.87       137    0.112    578K    36K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.6     70.3     70.9      1.04              0.15        10    0.104     63K   2553       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.8     0.1      0.7       0.7      0.0       0.0   0.0     68.0     58.5     12.49              1.56        68    0.184    578K    36K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     45.8      2.86              0.31        68    0.042       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 7800.0 total, 600.0 interval#012Flush(GB): cumulative 0.128, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.84 GB write, 0.11 MB/s write, 0.83 GB read, 0.11 MB/s read, 15.4 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557fc5f1b1f0#2 capacity: 304.00 MB usage: 93.36 MB table_size: 0 occupancy: 18446744073709551615 collections: 14 last_copies: 0 last_secs: 0.000586 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(5738,89.29 MB,29.371%) FilterBlock(137,1.56 MB,0.514015%) IndexBlock(137,2.51 MB,0.825144%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 04:32:25 np0005603623 nova_compute[226235]: 2026-01-31 09:32:25.682 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:27.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:27.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:32:28 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:32:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:29.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:29 np0005603623 nova_compute[226235]: 2026-01-31 09:32:29.369 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:29.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:32:30.184 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:32:30.185 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:32:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:32:30.185 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:32:30 np0005603623 nova_compute[226235]: 2026-01-31 09:32:30.685 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:31.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:31.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:33.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:33.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:34 np0005603623 nova_compute[226235]: 2026-01-31 09:32:34.379 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:35.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:35.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:35 np0005603623 nova_compute[226235]: 2026-01-31 09:32:35.688 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:37.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:37.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:39.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:39 np0005603623 nova_compute[226235]: 2026-01-31 09:32:39.380 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:39.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:40 np0005603623 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-2[77740]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 31 04:32:40 np0005603623 nova_compute[226235]: 2026-01-31 09:32:40.695 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:40 np0005603623 podman[349720]: 2026-01-31 09:32:40.954102875 +0000 UTC m=+0.048697689 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Jan 31 04:32:40 np0005603623 podman[349721]: 2026-01-31 09:32:40.974072492 +0000 UTC m=+0.067990874 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 31 04:32:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:41.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:41.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:43.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:43.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:44 np0005603623 nova_compute[226235]: 2026-01-31 09:32:44.384 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:45.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:45 np0005603623 nova_compute[226235]: 2026-01-31 09:32:45.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:45.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:45 np0005603623 nova_compute[226235]: 2026-01-31 09:32:45.698 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.175 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.175 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.202 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.203 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.203 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.203 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.203 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:32:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:32:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4277126305' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.612 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.732 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.733 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4096MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.733 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.733 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.800 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.801 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:32:46 np0005603623 nova_compute[226235]: 2026-01-31 09:32:46.818 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:32:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:47.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:47 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:32:47 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2176464123' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:32:47 np0005603623 nova_compute[226235]: 2026-01-31 09:32:47.205 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.387s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:32:47 np0005603623 nova_compute[226235]: 2026-01-31 09:32:47.210 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:32:47 np0005603623 nova_compute[226235]: 2026-01-31 09:32:47.243 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:32:47 np0005603623 nova_compute[226235]: 2026-01-31 09:32:47.244 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:32:47 np0005603623 nova_compute[226235]: 2026-01-31 09:32:47.244 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.511s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:32:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:47.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:49.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:49 np0005603623 nova_compute[226235]: 2026-01-31 09:32:49.224 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:49 np0005603623 nova_compute[226235]: 2026-01-31 09:32:49.385 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:49.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:50 np0005603623 nova_compute[226235]: 2026-01-31 09:32:50.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:50 np0005603623 nova_compute[226235]: 2026-01-31 09:32:50.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:32:50 np0005603623 nova_compute[226235]: 2026-01-31 09:32:50.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:32:50 np0005603623 nova_compute[226235]: 2026-01-31 09:32:50.168 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:32:50 np0005603623 nova_compute[226235]: 2026-01-31 09:32:50.700 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:32:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:51.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:32:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:32:51 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 79K writes, 333K keys, 79K commit groups, 1.0 writes per commit group, ingest: 0.34 GB, 0.04 MB/s#012Cumulative WAL: 79K writes, 27K syncs, 2.89 writes per sync, written: 0.34 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 664 writes, 1020 keys, 664 commit groups, 1.0 writes per commit group, ingest: 0.33 MB, 0.00 MB/s#012Interval WAL: 664 writes, 328 syncs, 2.02 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:32:51 np0005603623 nova_compute[226235]: 2026-01-31 09:32:51.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:51 np0005603623 nova_compute[226235]: 2026-01-31 09:32:51.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:51 np0005603623 nova_compute[226235]: 2026-01-31 09:32:51.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:32:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:51.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:53.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:53.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:54 np0005603623 nova_compute[226235]: 2026-01-31 09:32:54.387 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:55.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:55 np0005603623 nova_compute[226235]: 2026-01-31 09:32:55.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:55 np0005603623 nova_compute[226235]: 2026-01-31 09:32:55.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:32:55 np0005603623 nova_compute[226235]: 2026-01-31 09:32:55.180 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:32:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:32:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:55.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:32:55 np0005603623 nova_compute[226235]: 2026-01-31 09:32:55.702 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:56 np0005603623 nova_compute[226235]: 2026-01-31 09:32:56.175 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:57.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:57 np0005603623 nova_compute[226235]: 2026-01-31 09:32:57.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:57.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:32:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:59.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:32:59 np0005603623 nova_compute[226235]: 2026-01-31 09:32:59.388 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:32:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:59.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:00 np0005603623 nova_compute[226235]: 2026-01-31 09:33:00.704 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:01.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:01 np0005603623 nova_compute[226235]: 2026-01-31 09:33:01.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:33:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:01.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:33:01 np0005603623 nova_compute[226235]: 2026-01-31 09:33:01.939 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:03.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:03.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:04 np0005603623 nova_compute[226235]: 2026-01-31 09:33:04.390 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:05.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:05.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:05 np0005603623 nova_compute[226235]: 2026-01-31 09:33:05.706 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:07.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:33:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:07.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:33:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:09.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:09 np0005603623 nova_compute[226235]: 2026-01-31 09:33:09.393 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:33:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:09.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:33:10 np0005603623 nova_compute[226235]: 2026-01-31 09:33:10.708 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:11.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:11 np0005603623 podman[349900]: 2026-01-31 09:33:11.490005103 +0000 UTC m=+0.075277883 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 31 04:33:11 np0005603623 podman[349901]: 2026-01-31 09:33:11.490688535 +0000 UTC m=+0.073394125 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 31 04:33:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:11.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:13.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:13.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:14 np0005603623 nova_compute[226235]: 2026-01-31 09:33:14.395 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:33:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3459192910' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:33:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:33:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3459192910' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:33:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:15.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:15.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:15 np0005603623 nova_compute[226235]: 2026-01-31 09:33:15.714 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:17.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:17.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:19.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:19 np0005603623 nova_compute[226235]: 2026-01-31 09:33:19.397 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:19.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:20 np0005603623 nova_compute[226235]: 2026-01-31 09:33:20.717 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:21.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:33:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:21.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:33:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:33:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:23.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:33:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:23.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:24 np0005603623 nova_compute[226235]: 2026-01-31 09:33:24.400 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:25.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:25.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:25 np0005603623 nova_compute[226235]: 2026-01-31 09:33:25.719 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:33:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:27.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:33:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:27.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:28 np0005603623 podman[350151]: 2026-01-31 09:33:28.27132687 +0000 UTC m=+0.055026027 container exec 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Jan 31 04:33:28 np0005603623 podman[350151]: 2026-01-31 09:33:28.363784262 +0000 UTC m=+0.147483409 container exec_died 8c9a65d328a174950b89565262ace439b25a568a3ccea40fb4eedb3707a86a35 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-2, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 04:33:28 np0005603623 podman[350307]: 2026-01-31 09:33:28.903934935 +0000 UTC m=+0.063601987 container exec dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 04:33:28 np0005603623 podman[350307]: 2026-01-31 09:33:28.937846379 +0000 UTC m=+0.097513371 container exec_died dd4a6866d9f6c016cbeff00273160852b7bcd5426839ae13cd84770d1cb64720 (image=quay.io/ceph/haproxy:2.3, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-haproxy-rgw-default-compute-2-yyrexo)
Jan 31 04:33:29 np0005603623 podman[350373]: 2026-01-31 09:33:29.11949761 +0000 UTC m=+0.047829592 container exec 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=keepalived-container, release=1793, vendor=Red Hat, Inc., version=2.2.4, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Jan 31 04:33:29 np0005603623 podman[350373]: 2026-01-31 09:33:29.130832676 +0000 UTC m=+0.059164638 container exec_died 430b1efc3b251885fa4032d083a43092d2bf3393e26bbbdbea03b4540e781466 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-keepalived-rgw-default-compute-2-voilty, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, release=1793, version=2.2.4, io.openshift.expose-services=, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, description=keepalived for Ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 31 04:33:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:29.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:33:29 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:33:29 np0005603623 nova_compute[226235]: 2026-01-31 09:33:29.401 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:33:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:29.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:33:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:33:30.185 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:33:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:33:30.185 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:33:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:33:30.186 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:33:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:33:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:33:30 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:33:30 np0005603623 nova_compute[226235]: 2026-01-31 09:33:30.723 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:33:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:31.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:33:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:33:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:31.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:33:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:33.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:33.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:34 np0005603623 nova_compute[226235]: 2026-01-31 09:33:34.404 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:33:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:35.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:33:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:35.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:35 np0005603623 nova_compute[226235]: 2026-01-31 09:33:35.725 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:33:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:33:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:37.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:37.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:39.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:39 np0005603623 nova_compute[226235]: 2026-01-31 09:33:39.405 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:39.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:40 np0005603623 nova_compute[226235]: 2026-01-31 09:33:40.727 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:41.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:41.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:41 np0005603623 podman[350645]: 2026-01-31 09:33:41.954148687 +0000 UTC m=+0.045958112 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:33:41 np0005603623 podman[350646]: 2026-01-31 09:33:41.997415646 +0000 UTC m=+0.087193518 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:33:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:43.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:43.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:44 np0005603623 nova_compute[226235]: 2026-01-31 09:33:44.407 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:45.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:45.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:45 np0005603623 nova_compute[226235]: 2026-01-31 09:33:45.729 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:47.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:47.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.182 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.183 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.183 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.183 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.183 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:33:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:33:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4037942183' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.618 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.761 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.762 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4082MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.762 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.762 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.845 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.846 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.860 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing inventories for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.878 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating ProviderTree inventory for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.879 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Updating inventory in ProviderTree for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.892 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing aggregate associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.912 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Refreshing trait associations for resource provider 492dc482-9d1e-49ca-87f3-0104a8508b72, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:33:48 np0005603623 nova_compute[226235]: 2026-01-31 09:33:48.927 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:33:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:49.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:33:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1574344321' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:33:49 np0005603623 nova_compute[226235]: 2026-01-31 09:33:49.383 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:33:49 np0005603623 nova_compute[226235]: 2026-01-31 09:33:49.389 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:33:49 np0005603623 nova_compute[226235]: 2026-01-31 09:33:49.408 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:49 np0005603623 nova_compute[226235]: 2026-01-31 09:33:49.451 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:33:49 np0005603623 nova_compute[226235]: 2026-01-31 09:33:49.453 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:33:49 np0005603623 nova_compute[226235]: 2026-01-31 09:33:49.453 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:33:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:49.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:50 np0005603623 nova_compute[226235]: 2026-01-31 09:33:50.732 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:51.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:51 np0005603623 nova_compute[226235]: 2026-01-31 09:33:51.453 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:51 np0005603623 nova_compute[226235]: 2026-01-31 09:33:51.454 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:33:51 np0005603623 nova_compute[226235]: 2026-01-31 09:33:51.454 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:33:51 np0005603623 nova_compute[226235]: 2026-01-31 09:33:51.488 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:33:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:51.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:52 np0005603623 nova_compute[226235]: 2026-01-31 09:33:52.157 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:52 np0005603623 nova_compute[226235]: 2026-01-31 09:33:52.158 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:33:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:53 np0005603623 nova_compute[226235]: 2026-01-31 09:33:53.156 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:53.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:53.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:54 np0005603623 nova_compute[226235]: 2026-01-31 09:33:54.411 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:55.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:33:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:55.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:33:55 np0005603623 nova_compute[226235]: 2026-01-31 09:33:55.733 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:33:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:57.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:33:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:57.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:58 np0005603623 nova_compute[226235]: 2026-01-31 09:33:58.149 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:59 np0005603623 nova_compute[226235]: 2026-01-31 09:33:59.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:33:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:59.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:33:59 np0005603623 nova_compute[226235]: 2026-01-31 09:33:59.412 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:33:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:59.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:00 np0005603623 nova_compute[226235]: 2026-01-31 09:34:00.735 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:01.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:01 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 04:34:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:01.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:01 np0005603623 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 04:34:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:03 np0005603623 nova_compute[226235]: 2026-01-31 09:34:03.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:34:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:03.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:34:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:03.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:04 np0005603623 nova_compute[226235]: 2026-01-31 09:34:04.414 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:05.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:05.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:05 np0005603623 nova_compute[226235]: 2026-01-31 09:34:05.736 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:07.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:07.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:09.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:09 np0005603623 nova_compute[226235]: 2026-01-31 09:34:09.414 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:09.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:10 np0005603623 nova_compute[226235]: 2026-01-31 09:34:10.739 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:11.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:11.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:12 np0005603623 podman[350852]: 2026-01-31 09:34:12.965239082 +0000 UTC m=+0.056844074 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:34:12 np0005603623 podman[350853]: 2026-01-31 09:34:12.992377445 +0000 UTC m=+0.082073508 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:34:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:13 np0005603623 nova_compute[226235]: 2026-01-31 09:34:13.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:13.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:13.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:14 np0005603623 nova_compute[226235]: 2026-01-31 09:34:14.416 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:34:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1893673374' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:34:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:34:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1893673374' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:34:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:15.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:15.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:15 np0005603623 nova_compute[226235]: 2026-01-31 09:34:15.741 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:17.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:17.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:19.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:19 np0005603623 nova_compute[226235]: 2026-01-31 09:34:19.420 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:19.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:20 np0005603623 nova_compute[226235]: 2026-01-31 09:34:20.743 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:21.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:21.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:23.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:23.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:24 np0005603623 nova_compute[226235]: 2026-01-31 09:34:24.420 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:25.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:25.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:25 np0005603623 nova_compute[226235]: 2026-01-31 09:34:25.745 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:27.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:27.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:29.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:29 np0005603623 nova_compute[226235]: 2026-01-31 09:34:29.422 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:29.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:34:30.186 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:34:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:34:30.186 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:34:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:34:30.186 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:34:30 np0005603623 nova_compute[226235]: 2026-01-31 09:34:30.747 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:31.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:31.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #217. Immutable memtables: 0.
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.324828) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 139] Flushing memtable with next log file: 217
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852072324877, "job": 139, "event": "flush_started", "num_memtables": 1, "num_entries": 2345, "num_deletes": 251, "total_data_size": 5905216, "memory_usage": 5981408, "flush_reason": "Manual Compaction"}
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 139] Level-0 flush table #218: started
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852072391225, "cf_name": "default", "job": 139, "event": "table_file_creation", "file_number": 218, "file_size": 3865367, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 103702, "largest_seqno": 106042, "table_properties": {"data_size": 3855846, "index_size": 6078, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 19097, "raw_average_key_size": 20, "raw_value_size": 3836977, "raw_average_value_size": 4073, "num_data_blocks": 266, "num_entries": 942, "num_filter_entries": 942, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851847, "oldest_key_time": 1769851847, "file_creation_time": 1769852072, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 218, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 139] Flush lasted 66458 microseconds, and 6706 cpu microseconds.
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.391282) [db/flush_job.cc:967] [default] [JOB 139] Level-0 flush table #218: 3865367 bytes OK
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.391308) [db/memtable_list.cc:519] [default] Level-0 commit table #218 started
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.398165) [db/memtable_list.cc:722] [default] Level-0 commit table #218: memtable #1 done
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.398183) EVENT_LOG_v1 {"time_micros": 1769852072398177, "job": 139, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.398201) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 139] Try to delete WAL files size 5894982, prev total WAL file size 5894982, number of live WAL files 2.
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000214.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.399522) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730039323837' seq:72057594037927935, type:22 .. '7061786F730039353339' seq:0, type:0; will stop at (end)
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 140] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 139 Base level 0, inputs: [218(3774KB)], [216(12MB)]
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852072399586, "job": 140, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [218], "files_L6": [216], "score": -1, "input_data_size": 16903722, "oldest_snapshot_seqno": -1}
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 140] Generated table #219: 12595 keys, 14843143 bytes, temperature: kUnknown
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852072527607, "cf_name": "default", "job": 140, "event": "table_file_creation", "file_number": 219, "file_size": 14843143, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14764310, "index_size": 46272, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31493, "raw_key_size": 332877, "raw_average_key_size": 26, "raw_value_size": 14546997, "raw_average_value_size": 1154, "num_data_blocks": 1752, "num_entries": 12595, "num_filter_entries": 12595, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769852072, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 219, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.527926) [db/compaction/compaction_job.cc:1663] [default] [JOB 140] Compacted 1@0 + 1@6 files to L6 => 14843143 bytes
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.549527) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 132.0 rd, 116.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 12.4 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(8.2) write-amplify(3.8) OK, records in: 13112, records dropped: 517 output_compression: NoCompression
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.549556) EVENT_LOG_v1 {"time_micros": 1769852072549543, "job": 140, "event": "compaction_finished", "compaction_time_micros": 128012, "compaction_time_cpu_micros": 27272, "output_level": 6, "num_output_files": 1, "total_output_size": 14843143, "num_input_records": 13112, "num_output_records": 12595, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000218.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852072550007, "job": 140, "event": "table_file_deletion", "file_number": 218}
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000216.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852072551196, "job": 140, "event": "table_file_deletion", "file_number": 216}
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.399365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.551384) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.551393) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.551395) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.551397) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:34:32 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:34:32.551398) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:34:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:33.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:33.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:34 np0005603623 nova_compute[226235]: 2026-01-31 09:34:34.423 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:35.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:35.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:35 np0005603623 nova_compute[226235]: 2026-01-31 09:34:35.748 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:34:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:34:36 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:34:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:37.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:37.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:39.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:39 np0005603623 nova_compute[226235]: 2026-01-31 09:34:39.426 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:39.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:40 np0005603623 nova_compute[226235]: 2026-01-31 09:34:40.750 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:34:41 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:34:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:41.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:41.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:43.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:43.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:43 np0005603623 podman[351141]: 2026-01-31 09:34:43.967505488 +0000 UTC m=+0.063402130 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 04:34:43 np0005603623 podman[351140]: 2026-01-31 09:34:43.973147715 +0000 UTC m=+0.071794474 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 31 04:34:44 np0005603623 nova_compute[226235]: 2026-01-31 09:34:44.427 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:45.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:45.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:45 np0005603623 nova_compute[226235]: 2026-01-31 09:34:45.752 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:34:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:47.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:34:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:47.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.179 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.179 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.179 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.180 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.180 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:34:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:34:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3408528579' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.596 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.751 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.752 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4064MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.752 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.753 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.943 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.944 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:34:48 np0005603623 nova_compute[226235]: 2026-01-31 09:34:48.959 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:34:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:49.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:34:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3027687024' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:34:49 np0005603623 nova_compute[226235]: 2026-01-31 09:34:49.413 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:34:49 np0005603623 nova_compute[226235]: 2026-01-31 09:34:49.418 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:34:49 np0005603623 nova_compute[226235]: 2026-01-31 09:34:49.429 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:49 np0005603623 nova_compute[226235]: 2026-01-31 09:34:49.433 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:34:49 np0005603623 nova_compute[226235]: 2026-01-31 09:34:49.435 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:34:49 np0005603623 nova_compute[226235]: 2026-01-31 09:34:49.436 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:34:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:49.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:50 np0005603623 nova_compute[226235]: 2026-01-31 09:34:50.754 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:51.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:51 np0005603623 nova_compute[226235]: 2026-01-31 09:34:51.437 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:51 np0005603623 nova_compute[226235]: 2026-01-31 09:34:51.438 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:34:51 np0005603623 nova_compute[226235]: 2026-01-31 09:34:51.438 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:34:51 np0005603623 nova_compute[226235]: 2026-01-31 09:34:51.457 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:34:51 np0005603623 nova_compute[226235]: 2026-01-31 09:34:51.457 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:51.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:53 np0005603623 nova_compute[226235]: 2026-01-31 09:34:53.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:53 np0005603623 nova_compute[226235]: 2026-01-31 09:34:53.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:34:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:53.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:53.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:54 np0005603623 nova_compute[226235]: 2026-01-31 09:34:54.431 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:55 np0005603623 nova_compute[226235]: 2026-01-31 09:34:55.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:34:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:55.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:34:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:55.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:55 np0005603623 nova_compute[226235]: 2026-01-31 09:34:55.756 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:57.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:57.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:59.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:59 np0005603623 nova_compute[226235]: 2026-01-31 09:34:59.434 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:34:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:59.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:00 np0005603623 nova_compute[226235]: 2026-01-31 09:35:00.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:00 np0005603623 nova_compute[226235]: 2026-01-31 09:35:00.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:00 np0005603623 nova_compute[226235]: 2026-01-31 09:35:00.758 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:01.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:01.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:03 np0005603623 nova_compute[226235]: 2026-01-31 09:35:03.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:03.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:03.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:04 np0005603623 nova_compute[226235]: 2026-01-31 09:35:04.479 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:05.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:05.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:05 np0005603623 nova_compute[226235]: 2026-01-31 09:35:05.761 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:07.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:07.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:09.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:09 np0005603623 nova_compute[226235]: 2026-01-31 09:35:09.522 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:09.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:10 np0005603623 nova_compute[226235]: 2026-01-31 09:35:10.763 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:11.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:11.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:13.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:13.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:14 np0005603623 podman[351344]: 2026-01-31 09:35:14.250431008 +0000 UTC m=+0.092199995 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:35:14 np0005603623 podman[351345]: 2026-01-31 09:35:14.281568675 +0000 UTC m=+0.120203733 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 04:35:14 np0005603623 nova_compute[226235]: 2026-01-31 09:35:14.526 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:15.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:15.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:15 np0005603623 nova_compute[226235]: 2026-01-31 09:35:15.764 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:17.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:17.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:19.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:19 np0005603623 nova_compute[226235]: 2026-01-31 09:35:19.527 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:19.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:20 np0005603623 nova_compute[226235]: 2026-01-31 09:35:20.766 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:21.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:21.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:23.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:23.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:24 np0005603623 nova_compute[226235]: 2026-01-31 09:35:24.528 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:25.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:25.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:25 np0005603623 nova_compute[226235]: 2026-01-31 09:35:25.767 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:35:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:27.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:35:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:27.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:29.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:29 np0005603623 nova_compute[226235]: 2026-01-31 09:35:29.531 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:29.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:35:30.187 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:35:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:35:30.187 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:35:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:35:30.187 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:35:30 np0005603623 nova_compute[226235]: 2026-01-31 09:35:30.769 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:31.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:31.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:33.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:33.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:34 np0005603623 nova_compute[226235]: 2026-01-31 09:35:34.561 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:35.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:35.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:35 np0005603623 nova_compute[226235]: 2026-01-31 09:35:35.770 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:37.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:37.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:39.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:39 np0005603623 nova_compute[226235]: 2026-01-31 09:35:39.564 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:39.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:40 np0005603623 nova_compute[226235]: 2026-01-31 09:35:40.771 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:41.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:41.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:35:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:35:42 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:35:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:43.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:43.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:44 np0005603623 nova_compute[226235]: 2026-01-31 09:35:44.566 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:44 np0005603623 podman[351587]: 2026-01-31 09:35:44.964241385 +0000 UTC m=+0.055079949 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:35:44 np0005603623 podman[351588]: 2026-01-31 09:35:44.983318625 +0000 UTC m=+0.073591771 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:35:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:45.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:45.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:45 np0005603623 nova_compute[226235]: 2026-01-31 09:35:45.773 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:35:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:47.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:47 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:47 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:47 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:47.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:35:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:35:48 np0005603623 ceph-mon[77037]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.328 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.328 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.329 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.329 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.330 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:35:48 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:35:48 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1113668854' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.734 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.858 226239 WARNING nova.virt.libvirt.driver [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.860 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4061MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.860 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.860 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.928 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:35:48 np0005603623 nova_compute[226235]: 2026-01-31 09:35:48.928 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:35:49 np0005603623 nova_compute[226235]: 2026-01-31 09:35:49.040 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:35:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:49.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:49 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:35:49 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3886216282' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:35:49 np0005603623 nova_compute[226235]: 2026-01-31 09:35:49.450 226239 DEBUG oslo_concurrency.processutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:35:49 np0005603623 nova_compute[226235]: 2026-01-31 09:35:49.455 226239 DEBUG nova.compute.provider_tree [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed in ProviderTree for provider: 492dc482-9d1e-49ca-87f3-0104a8508b72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:35:49 np0005603623 nova_compute[226235]: 2026-01-31 09:35:49.473 226239 DEBUG nova.scheduler.client.report [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Inventory has not changed for provider 492dc482-9d1e-49ca-87f3-0104a8508b72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:35:49 np0005603623 nova_compute[226235]: 2026-01-31 09:35:49.475 226239 DEBUG nova.compute.resource_tracker [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:35:49 np0005603623 nova_compute[226235]: 2026-01-31 09:35:49.475 226239 DEBUG oslo_concurrency.lockutils [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:35:49 np0005603623 nova_compute[226235]: 2026-01-31 09:35:49.568 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:49 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:49 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:49 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:49.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:50 np0005603623 nova_compute[226235]: 2026-01-31 09:35:50.476 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:50 np0005603623 nova_compute[226235]: 2026-01-31 09:35:50.775 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:51 np0005603623 nova_compute[226235]: 2026-01-31 09:35:51.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:51 np0005603623 nova_compute[226235]: 2026-01-31 09:35:51.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:35:51 np0005603623 nova_compute[226235]: 2026-01-31 09:35:51.156 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:35:51 np0005603623 nova_compute[226235]: 2026-01-31 09:35:51.235 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:35:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:51.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:51 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:51 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:51 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:51.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:52 np0005603623 nova_compute[226235]: 2026-01-31 09:35:52.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:53 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:53.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:53 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:53 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:53 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:53.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:54 np0005603623 nova_compute[226235]: 2026-01-31 09:35:54.611 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:55 np0005603623 nova_compute[226235]: 2026-01-31 09:35:55.154 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:55 np0005603623 nova_compute[226235]: 2026-01-31 09:35:55.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:55 np0005603623 nova_compute[226235]: 2026-01-31 09:35:55.155 226239 DEBUG nova.compute.manager [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:35:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:55.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:55 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:55 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:55 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:55.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:55 np0005603623 nova_compute[226235]: 2026-01-31 09:35:55.778 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:57.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:57 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:57 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:35:57 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:57.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:35:58 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:59.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:59 np0005603623 nova_compute[226235]: 2026-01-31 09:35:59.610 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:59 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:35:59 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:35:59 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:59.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:36:00 np0005603623 nova_compute[226235]: 2026-01-31 09:36:00.155 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:36:00 np0005603623 nova_compute[226235]: 2026-01-31 09:36:00.780 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:01.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:01 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:01 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:36:01 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:01.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:36:02 np0005603623 nova_compute[226235]: 2026-01-31 09:36:02.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:03.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #220. Immutable memtables: 0.
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.553073) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 141] Flushing memtable with next log file: 220
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852163553223, "job": 141, "event": "flush_started", "num_memtables": 1, "num_entries": 1119, "num_deletes": 256, "total_data_size": 2376981, "memory_usage": 2414888, "flush_reason": "Manual Compaction"}
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 141] Level-0 flush table #221: started
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852163562483, "cf_name": "default", "job": 141, "event": "table_file_creation", "file_number": 221, "file_size": 1557690, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 106047, "largest_seqno": 107161, "table_properties": {"data_size": 1552709, "index_size": 2504, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10776, "raw_average_key_size": 19, "raw_value_size": 1542693, "raw_average_value_size": 2799, "num_data_blocks": 109, "num_entries": 551, "num_filter_entries": 551, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769852073, "oldest_key_time": 1769852073, "file_creation_time": 1769852163, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 221, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 141] Flush lasted 9473 microseconds, and 6071 cpu microseconds.
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.562531) [db/flush_job.cc:967] [default] [JOB 141] Level-0 flush table #221: 1557690 bytes OK
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.562577) [db/memtable_list.cc:519] [default] Level-0 commit table #221 started
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.565292) [db/memtable_list.cc:722] [default] Level-0 commit table #221: memtable #1 done
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.565313) EVENT_LOG_v1 {"time_micros": 1769852163565305, "job": 141, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.565338) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 141] Try to delete WAL files size 2371532, prev total WAL file size 2371532, number of live WAL files 2.
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000217.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.566268) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323638' seq:72057594037927935, type:22 .. '6C6F676D0034353230' seq:0, type:0; will stop at (end)
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 142] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 141 Base level 0, inputs: [221(1521KB)], [219(14MB)]
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852163566380, "job": 142, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [221], "files_L6": [219], "score": -1, "input_data_size": 16400833, "oldest_snapshot_seqno": -1}
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 142] Generated table #222: 12621 keys, 16203128 bytes, temperature: kUnknown
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852163665448, "cf_name": "default", "job": 142, "event": "table_file_creation", "file_number": 222, "file_size": 16203128, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16122504, "index_size": 47935, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31621, "raw_key_size": 334474, "raw_average_key_size": 26, "raw_value_size": 15903052, "raw_average_value_size": 1260, "num_data_blocks": 1818, "num_entries": 12621, "num_filter_entries": 12621, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769852163, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 222, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.665694) [db/compaction/compaction_job.cc:1663] [default] [JOB 142] Compacted 1@0 + 1@6 files to L6 => 16203128 bytes
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.667471) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 165.4 rd, 163.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 14.2 +0.0 blob) out(15.5 +0.0 blob), read-write-amplify(20.9) write-amplify(10.4) OK, records in: 13146, records dropped: 525 output_compression: NoCompression
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.667489) EVENT_LOG_v1 {"time_micros": 1769852163667481, "job": 142, "event": "compaction_finished", "compaction_time_micros": 99161, "compaction_time_cpu_micros": 33909, "output_level": 6, "num_output_files": 1, "total_output_size": 16203128, "num_input_records": 13146, "num_output_records": 12621, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000221.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852163667833, "job": 142, "event": "table_file_deletion", "file_number": 221}
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000219.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852163669542, "job": 142, "event": "table_file_deletion", "file_number": 219}
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.566075) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.669617) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.669753) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.669756) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.669758) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:03 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:03.669760) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:03 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:03 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:03 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:03.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:04 np0005603623 nova_compute[226235]: 2026-01-31 09:36:04.153 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:36:04 np0005603623 nova_compute[226235]: 2026-01-31 09:36:04.614 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:05.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:05 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:05 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:05 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:05.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:05 np0005603623 nova_compute[226235]: 2026-01-31 09:36:05.833 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:07.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:07 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:07 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:07 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:07.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:08 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:09.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:09 np0005603623 nova_compute[226235]: 2026-01-31 09:36:09.614 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:09 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:09 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:09 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:09.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:10 np0005603623 nova_compute[226235]: 2026-01-31 09:36:10.836 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:11.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:11 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:11 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:11 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:11.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:13 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:36:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:13.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:36:13 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:13 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:13 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:13.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:14 np0005603623 nova_compute[226235]: 2026-01-31 09:36:14.150 226239 DEBUG oslo_service.periodic_task [None req-3ef0505c-5b3f-4773-a237-2da593a00519 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:36:14 np0005603623 nova_compute[226235]: 2026-01-31 09:36:14.617 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:36:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1572591649' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:36:14 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:36:14 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1572591649' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:36:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:15.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:15 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:15 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:36:15 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:15.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:36:15 np0005603623 nova_compute[226235]: 2026-01-31 09:36:15.838 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:15 np0005603623 podman[351843]: 2026-01-31 09:36:15.977222698 +0000 UTC m=+0.067696866 container health_status 1b89470e7fa821066838b4a0ebcfef38464fd487ebb6276880534c1de33c574e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:36:15 np0005603623 podman[351844]: 2026-01-31 09:36:15.986693764 +0000 UTC m=+0.080455106 container health_status cc9ee73a2ce09cf88d6e0fd154655ee361e311b83cd5ab85d0571d05a45444c6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823-1d031a0e0eb300539bab3d3c8fb5f3999292e0b3e94944293979adf8189e0823'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 04:36:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:17.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:17 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:17 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:17 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:17.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:18 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:19.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:19 np0005603623 nova_compute[226235]: 2026-01-31 09:36:19.620 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:19 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:19 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:19 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:19.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:20 np0005603623 nova_compute[226235]: 2026-01-31 09:36:20.840 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:21.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:21 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:21 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:21 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:21.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:23 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:23.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:23 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:23 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:23 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:23.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:23 np0005603623 systemd-logind[795]: New session 72 of user zuul.
Jan 31 04:36:23 np0005603623 systemd[1]: Started Session 72 of User zuul.
Jan 31 04:36:24 np0005603623 nova_compute[226235]: 2026-01-31 09:36:24.657 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:25.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:25 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:25 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:25 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:25.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:25 np0005603623 nova_compute[226235]: 2026-01-31 09:36:25.842 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:26 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 31 04:36:26 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3587171946' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 31 04:36:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:27.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:27 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:27 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:27 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:27.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:28 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:29.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:29 np0005603623 ovs-vsctl[352180]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 31 04:36:29 np0005603623 nova_compute[226235]: 2026-01-31 09:36:29.658 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:29 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:29 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:29 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:29.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:30 np0005603623 virtqemud[225858]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 31 04:36:30 np0005603623 virtqemud[225858]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 31 04:36:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:36:30.188 143258 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:36:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:36:30.188 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:36:30 np0005603623 ovn_metadata_agent[143253]: 2026-01-31 09:36:30.189 143258 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:36:30 np0005603623 virtqemud[225858]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 31 04:36:30 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: cache status {prefix=cache status} (starting...)
Jan 31 04:36:30 np0005603623 lvm[352496]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 31 04:36:30 np0005603623 lvm[352496]: VG ceph_vg0 finished
Jan 31 04:36:30 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: client ls {prefix=client ls} (starting...)
Jan 31 04:36:30 np0005603623 nova_compute[226235]: 2026-01-31 09:36:30.844 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:31.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:31 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: damage ls {prefix=damage ls} (starting...)
Jan 31 04:36:31 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: dump loads {prefix=dump loads} (starting...)
Jan 31 04:36:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Jan 31 04:36:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3705711625' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 31 04:36:31 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 31 04:36:31 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 31 04:36:31 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:31 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:31 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:31.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:31 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 31 04:36:31 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Jan 31 04:36:31 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1771978098' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 31 04:36:32 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 31 04:36:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Jan 31 04:36:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3074270701' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 31 04:36:32 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 31 04:36:32 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 31 04:36:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Jan 31 04:36:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3026629194' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 31 04:36:32 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: ops {prefix=ops} (starting...)
Jan 31 04:36:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Jan 31 04:36:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/588415764' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 31 04:36:32 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Jan 31 04:36:32 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/165674052' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 31 04:36:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 31 04:36:33 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1615663054' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 31 04:36:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:33.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:33 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: session ls {prefix=session ls} (starting...)
Jan 31 04:36:33 np0005603623 ceph-mds[84161]: mds.cephfs.compute-2.asgtzy asok_command: status {prefix=status} (starting...)
Jan 31 04:36:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 31 04:36:33 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2691658362' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 31 04:36:33 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:33 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:33 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:33.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:33 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Jan 31 04:36:33 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2387219852' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 31 04:36:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 31 04:36:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1540748622' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 31 04:36:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Jan 31 04:36:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1207885738' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 31 04:36:34 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 31 04:36:34 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2247374267' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 31 04:36:34 np0005603623 nova_compute[226235]: 2026-01-31 09:36:34.660 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Jan 31 04:36:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/562058712' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 31 04:36:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Jan 31 04:36:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/956952527' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 31 04:36:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:35.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 31 04:36:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/179222275' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 31 04:36:35 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Jan 31 04:36:35 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/852058476' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 31 04:36:35 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:35 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:35 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:35.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:35 np0005603623 nova_compute[226235]: 2026-01-31 09:36:35.846 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 31 04:36:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/557860913' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 31 04:36:36 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 31 04:36:36 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1321273465' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 heartbeat osd_stat(store_statfs(0x197437000/0x0/0x1bfc00000, data 0x3762010/0x3977000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541982720 unmapped: 80216064 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 541982720 unmapped: 80216064 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 heartbeat osd_stat(store_statfs(0x197437000/0x0/0x1bfc00000, data 0x3762010/0x3977000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542007296 unmapped: 80191488 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5461976 data_alloc: 218103808 data_used: 9318400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542007296 unmapped: 80191488 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542007296 unmapped: 80191488 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.566255569s of 16.626279831s, submitted: 77
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e618a29c00 session 0x55e61216a780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e611825000 session 0x55e6131a4d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542007296 unmapped: 80191488 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e610e23000 session 0x55e6141fa780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e618454c00 session 0x55e6121a90e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e613b6b000 session 0x55e6112e7860
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e613448000 session 0x55e611927680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542015488 unmapped: 80183296 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542195712 unmapped: 80003072 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5521357 data_alloc: 234881024 data_used: 18829312
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e613adf000 session 0x55e612722b40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 heartbeat osd_stat(store_statfs(0x197462000/0x0/0x1bfc00000, data 0x3738000/0x394c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e61c82f400 session 0x55e61435ef00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542195712 unmapped: 80003072 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542195712 unmapped: 80003072 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e61432d000 session 0x55e611650f00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542883840 unmapped: 79314944 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e613448000 session 0x55e6141fbe00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542883840 unmapped: 79314944 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 ms_handle_reset con 0x55e6133eb800 session 0x55e61133b2c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542883840 unmapped: 79314944 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5561404 data_alloc: 234881024 data_used: 25071616
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 403 handle_osd_map epochs [403,404], i have 403, src has [1,404]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 404 ms_handle_reset con 0x55e614328000 session 0x55e6118e8780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542875648 unmapped: 79323136 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 404 heartbeat osd_stat(store_statfs(0x197560000/0x0/0x1bfc00000, data 0x3637cad/0x384d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 404 ms_handle_reset con 0x55e6133f1000 session 0x55e611f150e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 404 ms_handle_reset con 0x55e6178b1c00 session 0x55e6116a4960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542875648 unmapped: 79323136 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 404 ms_handle_reset con 0x55e610e23000 session 0x55e6132e3a40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542875648 unmapped: 79323136 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 542875648 unmapped: 79323136 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.773722649s of 11.937841415s, submitted: 52
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 546652160 unmapped: 75546624 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5651376 data_alloc: 234881024 data_used: 26652672
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 404 ms_handle_reset con 0x55e61b1fe400 session 0x55e6143a9680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 404 heartbeat osd_stat(store_statfs(0x1969ea000/0x0/0x1bfc00000, data 0x41adc8a/0x43c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [1])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 404 ms_handle_reset con 0x55e611825000 session 0x55e6153e65a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548446208 unmapped: 73752576 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548446208 unmapped: 73752576 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549371904 unmapped: 72826880 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 404 handle_osd_map epochs [404,405], i have 404, src has [1,405]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549380096 unmapped: 72818688 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 405 handle_osd_map epochs [405,406], i have 405, src has [1,406]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 heartbeat osd_stat(store_statfs(0x196df2000/0x0/0x1bfc00000, data 0x41cd7c9/0x3fbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e613445800 session 0x55e6119272c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549396480 unmapped: 72802304 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5628565 data_alloc: 234881024 data_used: 20672512
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 heartbeat osd_stat(store_statfs(0x196df2000/0x0/0x1bfc00000, data 0x41cd7c9/0x3fbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549404672 unmapped: 72794112 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549404672 unmapped: 72794112 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 heartbeat osd_stat(store_statfs(0x196dd0000/0x0/0x1bfc00000, data 0x41f0466/0x3fde000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549404672 unmapped: 72794112 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549404672 unmapped: 72794112 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 heartbeat osd_stat(store_statfs(0x196dd0000/0x0/0x1bfc00000, data 0x41f0466/0x3fde000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e6178b0400 session 0x55e6118e8960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e610e23000 session 0x55e6125ccd20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e611825000 session 0x55e6132e3c20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e613445800 session 0x55e613318780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.295885086s of 10.920019150s, submitted: 202
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 72777728 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5628989 data_alloc: 234881024 data_used: 20672512
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e61b1fe400 session 0x55e61133b0e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e617e22800 session 0x55e6141fa1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e610e23000 session 0x55e61133af00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e611825000 session 0x55e6116a54a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e613445800 session 0x55e6114d6960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 heartbeat osd_stat(store_statfs(0x196ba1000/0x0/0x1bfc00000, data 0x441e476/0x420d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 72777728 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e6119cb400 session 0x55e611638780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e617e22400 session 0x55e6140830e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549429248 unmapped: 72769536 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 ms_handle_reset con 0x55e610e23000 session 0x55e6153e6000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549453824 unmapped: 72744960 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 heartbeat osd_stat(store_statfs(0x196b95000/0x0/0x1bfc00000, data 0x442a476/0x4219000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 406 handle_osd_map epochs [406,407], i have 406, src has [1,407]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549453824 unmapped: 72744960 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e618a28400 session 0x55e612045e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e6133f1000 session 0x55e6143a8780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549453824 unmapped: 72744960 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5650588 data_alloc: 234881024 data_used: 20692992
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e6119cbc00 session 0x55e6125cd860
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549453824 unmapped: 72744960 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e6133f3800 session 0x55e61216a000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e610e23000 session 0x55e6118e9e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549453824 unmapped: 72744960 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e6119cbc00 session 0x55e612044000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 407 ms_handle_reset con 0x55e6133f1000 session 0x55e61256fa40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549462016 unmapped: 72736768 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 407 handle_osd_map epochs [407,408], i have 407, src has [1,408]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 408 ms_handle_reset con 0x55e62d160800 session 0x55e6171223c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 408 heartbeat osd_stat(store_statfs(0x196b8c000/0x0/0x1bfc00000, data 0x4430f86/0x4222000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 549642240 unmapped: 72556544 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 408 ms_handle_reset con 0x55e618349800 session 0x55e61133be00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 408 handle_osd_map epochs [408,409], i have 408, src has [1,409]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e61c236400 session 0x55e61653b680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.503706932s of 10.003719330s, submitted: 144
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 546979840 unmapped: 75218944 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5472238 data_alloc: 234881024 data_used: 15331328
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e6133ef400 session 0x55e6153e7a40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e610e23000 session 0x55e6118e9e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e618349800 session 0x55e6127223c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 546988032 unmapped: 75210752 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e6119cbc00 session 0x55e6141fb680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e610e23000 session 0x55e6140834a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 409 heartbeat osd_stat(store_statfs(0x19790a000/0x0/0x1bfc00000, data 0x3289a92/0x34a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 409 ms_handle_reset con 0x55e6133ef400 session 0x55e61256f680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5484816 data_alloc: 234881024 data_used: 15331328
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 409 heartbeat osd_stat(store_statfs(0x19790a000/0x0/0x1bfc00000, data 0x3289a92/0x34a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 409 handle_osd_map epochs [409,410], i have 409, src has [1,410]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 75055104 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.488860130s of 10.026491165s, submitted: 79
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 548405248 unmapped: 73793536 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5551436 data_alloc: 234881024 data_used: 15331328
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547684352 unmapped: 74514432 heap: 622198784 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613adf000 session 0x55e612723e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613c77800 session 0x55e6121a83c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e618349c00 session 0x55e6117f4f00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e610e23000 session 0x55e61196e000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555769856 unmapped: 76931072 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e6133ef400 session 0x55e61306b0e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613adf000 session 0x55e61216be00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613c77800 session 0x55e613085a40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e6170a8400 session 0x55e6143a9e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e610e23000 session 0x55e6116a5a40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196465000/0x0/0x1bfc00000, data 0x472b67b/0x4949000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5659670 data_alloc: 234881024 data_used: 15462400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196465000/0x0/0x1bfc00000, data 0x472b67b/0x4949000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e61b382400 session 0x55e617122f00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613b6b000 session 0x55e6153e7e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544923648 unmapped: 87777280 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5659846 data_alloc: 234881024 data_used: 15462400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e616d5fc00 session 0x55e61170c3c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.580759048s of 10.413801193s, submitted: 81
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e6133b5000 session 0x55e61170d4a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e610e23000 session 0x55e6117f4000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544948224 unmapped: 87752704 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196465000/0x0/0x1bfc00000, data 0x472b67b/0x4949000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e61b382400 session 0x55e6114d6000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544948224 unmapped: 87752704 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544956416 unmapped: 87744512 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e614328800 session 0x55e611338d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e61c236400 session 0x55e6127232c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 544972800 unmapped: 87728128 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196464000/0x0/0x1bfc00000, data 0x472b68b/0x494a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 547135488 unmapped: 85565440 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5773973 data_alloc: 251658240 data_used: 30351360
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550354944 unmapped: 82345984 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550354944 unmapped: 82345984 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e618a28400 session 0x55e612723680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613448800 session 0x55e611f15680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550371328 unmapped: 82329600 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196462000/0x0/0x1bfc00000, data 0x472b6be/0x494c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e610e23000 session 0x55e6141fa1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550371328 unmapped: 82329600 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550371328 unmapped: 82329600 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5708012 data_alloc: 251658240 data_used: 32235520
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196e98000/0x0/0x1bfc00000, data 0x3cf767b/0x3f15000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550371328 unmapped: 82329600 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550371328 unmapped: 82329600 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.847383499s of 12.084917068s, submitted: 57
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551985152 unmapped: 80715776 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552067072 unmapped: 80633856 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557391872 unmapped: 75309056 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5907494 data_alloc: 251658240 data_used: 33472512
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555376640 unmapped: 77324288 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x195637000/0x0/0x1bfc00000, data 0x555067b/0x576e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 77291520 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x195633000/0x0/0x1bfc00000, data 0x555467b/0x5772000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 77291520 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 77291520 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 77963264 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5927272 data_alloc: 251658240 data_used: 34787328
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x19563a000/0x0/0x1bfc00000, data 0x555667b/0x5774000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 77963264 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 77963264 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 77963264 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 77963264 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 77963264 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5929384 data_alloc: 251658240 data_used: 34775040
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.419394493s of 13.173292160s, submitted: 225
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554745856 unmapped: 77955072 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x195636000/0x0/0x1bfc00000, data 0x555967b/0x5777000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613b6b000 session 0x55e6121a8b40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e616d5fc00 session 0x55e61216a000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x195636000/0x0/0x1bfc00000, data 0x555967b/0x5777000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554745856 unmapped: 77955072 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x195637000/0x0/0x1bfc00000, data 0x555966b/0x5776000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e611fa0000 session 0x55e6121a9e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5926763 data_alloc: 251658240 data_used: 34852864
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x195638000/0x0/0x1bfc00000, data 0x555966b/0x5776000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554770432 unmapped: 77930496 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613adf000 session 0x55e61435e3c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e615e94800 session 0x55e6116a4d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e618349400 session 0x55e6141fa960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5635240 data_alloc: 234881024 data_used: 21491712
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550789120 unmapped: 81911808 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196974000/0x0/0x1bfc00000, data 0x3c465c6/0x3e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550789120 unmapped: 81911808 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550789120 unmapped: 81911808 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196974000/0x0/0x1bfc00000, data 0x3c465c6/0x3e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196974000/0x0/0x1bfc00000, data 0x3c465c6/0x3e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5635240 data_alloc: 234881024 data_used: 21491712
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196974000/0x0/0x1bfc00000, data 0x3c465c6/0x3e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5635240 data_alloc: 234881024 data_used: 21491712
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550797312 unmapped: 81903616 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550805504 unmapped: 81895424 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550805504 unmapped: 81895424 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196974000/0x0/0x1bfc00000, data 0x3c465c6/0x3e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550805504 unmapped: 81895424 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.592836380s of 23.012874603s, submitted: 96
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e62101f800 session 0x55e611f15680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e615e97800 session 0x55e611927680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e615e94800 session 0x55e6143492c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e62101f800 session 0x55e6114d6000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613addc00 session 0x55e6171234a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550952960 unmapped: 81747968 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e62d15c000 session 0x55e6117f4780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613443400 session 0x55e6125cc5a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613443400 session 0x55e6112e65a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e618349400 session 0x55e614082b40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5469668 data_alloc: 218103808 data_used: 9863168
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 550977536 unmapped: 81723392 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x19719c000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5470788 data_alloc: 218103808 data_used: 9940992
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x19719c000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e613442400 session 0x55e6121a90e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e6133eec00 session 0x55e61196e1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e616d5d800 session 0x55e614348960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.309727669s of 10.510506630s, submitted: 53
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 ms_handle_reset con 0x55e6133eec00 session 0x55e6171232c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551002112 unmapped: 81698816 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x1976fd000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5471361 data_alloc: 218103808 data_used: 9949184
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551010304 unmapped: 81690624 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x1976fd000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552771584 unmapped: 79929344 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x1976fd000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552771584 unmapped: 79929344 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x1976fb000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5552337 data_alloc: 234881024 data_used: 20381696
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x1976fd000/0x0/0x1bfc00000, data 0x308664b/0x32a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5552337 data_alloc: 234881024 data_used: 20377600
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 552804352 unmapped: 79896576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.627647400s of 11.764071465s, submitted: 16
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557334528 unmapped: 75366400 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196e61000/0x0/0x1bfc00000, data 0x391a64b/0x3b35000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556695552 unmapped: 76005376 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556695552 unmapped: 76005376 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556695552 unmapped: 76005376 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196e42000/0x0/0x1bfc00000, data 0x394164b/0x3b5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5637295 data_alloc: 234881024 data_used: 21131264
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556695552 unmapped: 76005376 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 heartbeat osd_stat(store_statfs(0x196e42000/0x0/0x1bfc00000, data 0x394164b/0x3b5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556695552 unmapped: 76005376 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556703744 unmapped: 75997184 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 410 handle_osd_map epochs [410,411], i have 410, src has [1,411]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e618cee800 session 0x55e61133a000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556711936 unmapped: 75988992 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556711936 unmapped: 75988992 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6190b4800 session 0x55e61133ba40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e6117f50e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x196e3b000/0x0/0x1bfc00000, data 0x39462a4/0x3b62000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61b383000 session 0x55e611638780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6127232c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647841 data_alloc: 234881024 data_used: 22560768
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556711936 unmapped: 75988992 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6190b5400 session 0x55e6153e7a40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e62101fc00 session 0x55e612722000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.020457268s of 10.069975853s, submitted: 100
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556728320 unmapped: 75972608 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e6116a5860
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e618cee800 session 0x55e6117f5e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6115ca1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e61256e5a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6190b5400 session 0x55e61133a5a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557891584 unmapped: 74809344 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557891584 unmapped: 74809344 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557891584 unmapped: 74809344 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x1963a9000/0x0/0x1bfc00000, data 0x4488326/0x45f5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5748520 data_alloc: 234881024 data_used: 22564864
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557891584 unmapped: 74809344 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e618a28c00 session 0x55e613319e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6170a8000 session 0x55e61170cd20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6170a8000 session 0x55e61170c5a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6117f52c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557907968 unmapped: 74792960 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e6114d6b40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e618a28c00 session 0x55e61196d680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6190b5400 session 0x55e614082780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558358528 unmapped: 74342400 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6190b5400 session 0x55e61435f4a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6140830e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195afc000/0x0/0x1bfc00000, data 0x4d3335f/0x4ea2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e6140821e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558252032 unmapped: 74448896 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558596096 unmapped: 74104832 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5905852 data_alloc: 251658240 data_used: 32215040
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558882816 unmapped: 73818112 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e616d5dc00 session 0x55e61653bc20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558882816 unmapped: 73818112 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61cb41800 session 0x55e6116a54a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.029963493s of 11.044583321s, submitted: 109
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558923776 unmapped: 73777152 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195ad0000/0x0/0x1bfc00000, data 0x4f8d3bb/0x4ece000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6153e63c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e6125ccd20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: mgrc ms_handle_reset ms_handle_reset con 0x55e61c82c400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3835187053
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3835187053,v1:192.168.122.100:6801/3835187053]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: mgrc handle_mgr_configure stats_period=5
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e616d5dc00 session 0x55e61133b2c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e616102400 session 0x55e614349680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 558923776 unmapped: 73777152 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195ace000/0x0/0x1bfc00000, data 0x4f8d3ee/0x4ed0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195ace000/0x0/0x1bfc00000, data 0x4f8d3ee/0x4ed0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559030272 unmapped: 73670656 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6178af400 session 0x55e6127223c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6130785a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5995167 data_alloc: 251658240 data_used: 39555072
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 71712768 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195aa8000/0x0/0x1bfc00000, data 0x4fb1421/0x4ef6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560996352 unmapped: 71704576 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561070080 unmapped: 71630848 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561070080 unmapped: 71630848 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561070080 unmapped: 71630848 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6082947 data_alloc: 251658240 data_used: 39837696
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565739520 unmapped: 66961408 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195aa8000/0x0/0x1bfc00000, data 0x4fb1421/0x4ef6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565501952 unmapped: 67198976 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566640640 unmapped: 66060288 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.704203606s of 11.106727600s, submitted: 186
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566640640 unmapped: 66060288 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194faa000/0x0/0x1bfc00000, data 0x5aa9421/0x59ee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566640640 unmapped: 66060288 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6102057 data_alloc: 251658240 data_used: 40452096
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566640640 unmapped: 66060288 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194faa000/0x0/0x1bfc00000, data 0x5aa9421/0x59ee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572850176 unmapped: 59850752 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572481536 unmapped: 60219392 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572481536 unmapped: 60219392 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572481536 unmapped: 60219392 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6211937 data_alloc: 251658240 data_used: 43184128
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572481536 unmapped: 60219392 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572481536 unmapped: 60219392 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x1944f0000/0x0/0x1bfc00000, data 0x6561421/0x64a6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574726144 unmapped: 57974784 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574726144 unmapped: 57974784 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194407000/0x0/0x1bfc00000, data 0x6652421/0x6597000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574726144 unmapped: 57974784 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6218149 data_alloc: 251658240 data_used: 43196416
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574726144 unmapped: 57974784 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194407000/0x0/0x1bfc00000, data 0x6652421/0x6597000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574734336 unmapped: 57966592 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.703801155s of 14.044802666s, submitted: 170
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194407000/0x0/0x1bfc00000, data 0x6652421/0x6597000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573898752 unmapped: 58802176 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573898752 unmapped: 58802176 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573906944 unmapped: 58793984 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61ee31c00 session 0x55e61186cb40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e614329800 session 0x55e61133be00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5975263 data_alloc: 251658240 data_used: 32772096
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566861824 unmapped: 65839104 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e614331c00 session 0x55e614083680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566861824 unmapped: 65839104 heap: 632700928 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x19562e000/0x0/0x1bfc00000, data 0x509438c/0x4fd6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e615e96800 session 0x55e61133ab40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e6115ca1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e614329800 session 0x55e611638780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e614331c00 session 0x55e61186c5a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x19562e000/0x0/0x1bfc00000, data 0x509438c/0x4fd6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567926784 unmapped: 72654848 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61ee31c00 session 0x55e61216da40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e62d165800 session 0x55e61256e5a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e62d165800 session 0x55e613078d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e612723860
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61b383000 session 0x55e613318f00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567803904 unmapped: 72777728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567803904 unmapped: 72777728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194d84000/0x0/0x1bfc00000, data 0x5fda3fe/0x5c1a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x194d84000/0x0/0x1bfc00000, data 0x5fda3fe/0x5c1a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6092614 data_alloc: 251658240 data_used: 32661504
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567812096 unmapped: 72769536 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e613442400 session 0x55e61168c000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e613443400 session 0x55e61133ad20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567754752 unmapped: 72826880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.431325912s of 10.056238174s, submitted: 138
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 77881344 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133eec00 session 0x55e61196e1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 77881344 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195ffa000/0x0/0x1bfc00000, data 0x4d6439c/0x49a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 77881344 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5877892 data_alloc: 234881024 data_used: 23572480
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 77881344 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61d733000 session 0x55e6121a9a40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e616d5e000 session 0x55e61653a3c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 77881344 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562708480 unmapped: 77873152 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6178af800 session 0x55e6112e6780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e615e97800 session 0x55e617122960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562716672 unmapped: 77864960 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195ff9000/0x0/0x1bfc00000, data 0x4d643ac/0x49a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562716672 unmapped: 77864960 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6133f1000 session 0x55e6121a9c20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e616102400 session 0x55e613318960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5949039 data_alloc: 251658240 data_used: 33120256
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563740672 unmapped: 76840960 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x195ffa000/0x0/0x1bfc00000, data 0x4d643ac/0x49a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6178af800 session 0x55e6114d7680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.212591171s of 10.545234680s, submitted: 49
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e6170a8400 session 0x55e61196f4a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 heartbeat osd_stat(store_statfs(0x19601e000/0x0/0x1bfc00000, data 0x4d41379/0x497f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e615e95800 session 0x55e617123e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 ms_handle_reset con 0x55e61b1fe800 session 0x55e61653b4a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 411 handle_osd_map epochs [411,412], i have 411, src has [1,412]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 412 ms_handle_reset con 0x55e6133f1000 session 0x55e61196da40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5942760 data_alloc: 251658240 data_used: 34693120
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 412 ms_handle_reset con 0x55e613adf000 session 0x55e611338d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 412 ms_handle_reset con 0x55e6143a7000 session 0x55e61170d4a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 412 ms_handle_reset con 0x55e61432f400 session 0x55e61653a960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 412 heartbeat osd_stat(store_statfs(0x19613e000/0x0/0x1bfc00000, data 0x463d016/0x4860000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 76832768 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5969678 data_alloc: 251658240 data_used: 34799616
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564723712 unmapped: 75857920 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 412 heartbeat osd_stat(store_statfs(0x195a38000/0x0/0x1bfc00000, data 0x4d43ff3/0x4f66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 74752000 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 74752000 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 412 ms_handle_reset con 0x55e6133f1000 session 0x55e611927e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.994777679s of 10.394090652s, submitted: 118
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 412 ms_handle_reset con 0x55e613adf000 session 0x55e6116501e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 412 handle_osd_map epochs [412,413], i have 412, src has [1,413]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 412 handle_osd_map epochs [413,413], i have 413, src has [1,413]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564609024 unmapped: 75972608 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564609024 unmapped: 75972608 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196c80000/0x0/0x1bfc00000, data 0x3af9b32/0x3d1d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5732762 data_alloc: 234881024 data_used: 22024192
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564617216 unmapped: 75964416 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564617216 unmapped: 75964416 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133eec00 session 0x55e6119270e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e616d5e000 session 0x55e6125cd860
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557277184 unmapped: 83304448 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e611825800 session 0x55e6143a9e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557285376 unmapped: 83296256 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198074000/0x0/0x1bfc00000, data 0x2709ab0/0x292a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557285376 unmapped: 83296256 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5497221 data_alloc: 218103808 data_used: 10317824
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557285376 unmapped: 83296256 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557285376 unmapped: 83296256 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613448400 session 0x55e6143a83c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61799ec00 session 0x55e61133a960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613b6a000 session 0x55e61196e000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e6117f4b40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6178af000 session 0x55e613085a40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e61133b0e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198071000/0x0/0x1bfc00000, data 0x270cab0/0x292d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [1,1,0,1])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551403520 unmapped: 89178112 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613448400 session 0x55e6116a4d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61c82dc00 session 0x55e613318f00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613ae0400 session 0x55e613078d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.453991890s of 10.083424568s, submitted: 155
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e62d15b800 session 0x55e61216da40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e614083680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613b6a000 session 0x55e61170c000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5352100 data_alloc: 218103808 data_used: 1597440
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19899d000/0x0/0x1bfc00000, data 0x1d2ba6d/0x1f49000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551378944 unmapped: 89202688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19899d000/0x0/0x1bfc00000, data 0x1d2ba6d/0x1f49000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133f1000 session 0x55e6130785a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5352100 data_alloc: 218103808 data_used: 1597440
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551387136 unmapped: 89194496 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e615e94400 session 0x55e6127223c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551387136 unmapped: 89194496 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613449c00 session 0x55e614349680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e61133b2c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551501824 unmapped: 89079808 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551501824 unmapped: 89079808 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5427247 data_alloc: 234881024 data_used: 11382784
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198a2f000/0x0/0x1bfc00000, data 0x1d4faa0/0x1f6f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198a2f000/0x0/0x1bfc00000, data 0x1d4faa0/0x1f6f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5427247 data_alloc: 234881024 data_used: 11382784
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198a2f000/0x0/0x1bfc00000, data 0x1d4faa0/0x1f6f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 551829504 unmapped: 88752128 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.247579575s of 20.317701340s, submitted: 24
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554565632 unmapped: 86016000 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198536000/0x0/0x1bfc00000, data 0x2248aa0/0x2468000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553852928 unmapped: 86728704 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485855 data_alloc: 234881024 data_used: 11407360
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554696704 unmapped: 85884928 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553992192 unmapped: 86589440 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553992192 unmapped: 86589440 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554008576 unmapped: 86573056 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19832d000/0x0/0x1bfc00000, data 0x2451aa0/0x2671000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [0,0,1])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e62d162800 session 0x55e61216d2c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555302912 unmapped: 85278720 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5557113 data_alloc: 234881024 data_used: 12337152
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555343872 unmapped: 85237760 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555343872 unmapped: 85237760 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x197c87000/0x0/0x1bfc00000, data 0x2af7aa0/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555343872 unmapped: 85237760 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555343872 unmapped: 85237760 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x197c87000/0x0/0x1bfc00000, data 0x2af7aa0/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e616d5d000 session 0x55e6171234a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555343872 unmapped: 85237760 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6170a9000 session 0x55e6113383c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5557113 data_alloc: 234881024 data_used: 12337152
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555343872 unmapped: 85237760 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.936045647s of 12.311985016s, submitted: 93
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e618a29800 session 0x55e610dd21e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e613319e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555491328 unmapped: 85090304 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x197c60000/0x0/0x1bfc00000, data 0x2b1eaa0/0x2d3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555630592 unmapped: 84951040 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x197c60000/0x0/0x1bfc00000, data 0x2b1eaa0/0x2d3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557629440 unmapped: 82952192 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557629440 unmapped: 82952192 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133f1000 session 0x55e6116a54a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613b6a000 session 0x55e61435f0e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5596581 data_alloc: 234881024 data_used: 17735680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557629440 unmapped: 82952192 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ed800 session 0x55e6132e2f00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556122112 unmapped: 84459520 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556122112 unmapped: 84459520 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x198adc000/0x0/0x1bfc00000, data 0x1a78a0b/0x1c95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556122112 unmapped: 84459520 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556122112 unmapped: 84459520 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5393568 data_alloc: 218103808 data_used: 6991872
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556122112 unmapped: 84459520 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556122112 unmapped: 84459520 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 556130304 unmapped: 84451328 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.426653862s of 12.647393227s, submitted: 51
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557793280 unmapped: 82788352 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19850c000/0x0/0x1bfc00000, data 0x2275a0b/0x2492000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559030272 unmapped: 81551360 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5471068 data_alloc: 218103808 data_used: 8425472
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559284224 unmapped: 81297408 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19720e000/0x0/0x1bfc00000, data 0x23d2a0b/0x25ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1971cc000/0x0/0x1bfc00000, data 0x2414a0b/0x2631000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1971cc000/0x0/0x1bfc00000, data 0x2414a0b/0x2631000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5486942 data_alloc: 218103808 data_used: 8507392
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1971ac000/0x0/0x1bfc00000, data 0x2435a0b/0x2652000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5480786 data_alloc: 218103808 data_used: 8507392
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 559456256 unmapped: 81125376 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e616d5d000 session 0x55e610dd34a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6170a9000 session 0x55e61306ba40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.785085678s of 13.056034088s, submitted: 101
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553402368 unmapped: 87179264 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553402368 unmapped: 87179264 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19820a000/0x0/0x1bfc00000, data 0x13d7a0b/0x15f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e6125cc5a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553402368 unmapped: 87179264 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553402368 unmapped: 87179264 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5305271 data_alloc: 218103808 data_used: 1597440
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553402368 unmapped: 87179264 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553402368 unmapped: 87179264 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19820a000/0x0/0x1bfc00000, data 0x13d7a0b/0x15f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19820a000/0x0/0x1bfc00000, data 0x13d7a0b/0x15f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5305271 data_alloc: 218103808 data_used: 1597440
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19820a000/0x0/0x1bfc00000, data 0x13d7a0b/0x15f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553410560 unmapped: 87171072 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5305271 data_alloc: 218103808 data_used: 1597440
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19820a000/0x0/0x1bfc00000, data 0x13d7a0b/0x15f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5305271 data_alloc: 218103808 data_used: 1597440
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19820a000/0x0/0x1bfc00000, data 0x13d7a0b/0x15f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553418752 unmapped: 87162880 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553426944 unmapped: 87154688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553426944 unmapped: 87154688 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e618cee800 session 0x55e6140834a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613b85400 session 0x55e6121a94a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61b383400 session 0x55e6141fb2c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e61256e5a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.977979660s of 24.250816345s, submitted: 14
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5402723 data_alloc: 218103808 data_used: 1597440
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 555606016 unmapped: 84975616 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613b85400 session 0x55e61653a3c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6170a9000 session 0x55e61216dc20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e618cee800 session 0x55e61168cd20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61c82f800 session 0x55e61435e1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e61653af00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1975af000/0x0/0x1bfc00000, data 0x2030a44/0x224f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553467904 unmapped: 87113728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553467904 unmapped: 87113728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553467904 unmapped: 87113728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1975af000/0x0/0x1bfc00000, data 0x2030a7d/0x224f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553467904 unmapped: 87113728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613ae1c00 session 0x55e6127232c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61c82c400 session 0x55e617123c20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613b84c00 session 0x55e61216d4a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e62d15c000 session 0x55e6120452c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1975af000/0x0/0x1bfc00000, data 0x2030a7d/0x224f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5406427 data_alloc: 218103808 data_used: 1597440
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 553320448 unmapped: 87261184 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554729472 unmapped: 85852160 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1975ae000/0x0/0x1bfc00000, data 0x2030aa6/0x2250000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613ae1c00 session 0x55e613079680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e6117f4d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554729472 unmapped: 85852160 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f56000/0x0/0x1bfc00000, data 0x2688adf/0x28a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613443400 session 0x55e61653ad20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554737664 unmapped: 85843968 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61b383000 session 0x55e61170c5a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554745856 unmapped: 85835776 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5459544 data_alloc: 218103808 data_used: 1597440
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554745856 unmapped: 85835776 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613e77400 session 0x55e6141fa3c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554745856 unmapped: 85835776 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6133ea000 session 0x55e61256e960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f56000/0x0/0x1bfc00000, data 0x2688adf/0x28a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554745856 unmapped: 85835776 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613c77000 session 0x55e612723e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.082229614s of 12.501278877s, submitted: 75
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554491904 unmapped: 86089728 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f55000/0x0/0x1bfc00000, data 0x2688aef/0x28a9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86081536 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e61d732400 session 0x55e6116a5860
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e6178b0400 session 0x55e6115cb680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613447c00 session 0x55e61186da40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5463197 data_alloc: 218103808 data_used: 1601536
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86081536 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f55000/0x0/0x1bfc00000, data 0x2688aef/0x28a9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86081536 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f54000/0x0/0x1bfc00000, data 0x2688aff/0x28aa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 554196992 unmapped: 86384640 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f54000/0x0/0x1bfc00000, data 0x2688aff/0x28aa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5601381 data_alloc: 234881024 data_used: 20992000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x196f54000/0x0/0x1bfc00000, data 0x2688aff/0x28aa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5601381 data_alloc: 234881024 data_used: 20992000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 557113344 unmapped: 83468288 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.077384949s of 14.277094841s, submitted: 11
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #60. Immutable memtables: 16.
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563732480 unmapped: 76849152 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x194885000/0x0/0x1bfc00000, data 0x3bb1aff/0x3dd3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [2])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566116352 unmapped: 74465280 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567279616 unmapped: 73302016 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x194730000/0x0/0x1bfc00000, data 0x3d04aff/0x3f26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5819139 data_alloc: 234881024 data_used: 23150592
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19471a000/0x0/0x1bfc00000, data 0x3d1aaff/0x3f3c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x19471a000/0x0/0x1bfc00000, data 0x3d1aaff/0x3f3c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5812007 data_alloc: 234881024 data_used: 23158784
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567287808 unmapped: 73293824 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1946fe000/0x0/0x1bfc00000, data 0x3d3eaff/0x3f60000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567296000 unmapped: 73285632 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567296000 unmapped: 73285632 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 heartbeat osd_stat(store_statfs(0x1946fe000/0x0/0x1bfc00000, data 0x3d3eaff/0x3f60000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567296000 unmapped: 73285632 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.453927994s of 13.398723602s, submitted: 223
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5812475 data_alloc: 234881024 data_used: 23162880
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567296000 unmapped: 73285632 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567296000 unmapped: 73285632 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 ms_handle_reset con 0x55e613e76000 session 0x55e6132e3a40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567304192 unmapped: 73277440 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 handle_osd_map epochs [413,414], i have 413, src has [1,414]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 413 handle_osd_map epochs [414,414], i have 414, src has [1,414]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 414 ms_handle_reset con 0x55e6143a7400 session 0x55e6117f5e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 414 ms_handle_reset con 0x55e618cee800 session 0x55e61216ba40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 414 ms_handle_reset con 0x55e61194cc00 session 0x55e6153e74a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570646528 unmapped: 69935104 heap: 640581632 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 414 ms_handle_reset con 0x55e615e94400 session 0x55e61170c3c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575864832 unmapped: 75849728 heap: 651714560 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 414 heartbeat osd_stat(store_statfs(0x19452b000/0x0/0x1bfc00000, data 0x3f0f758/0x4132000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 414 handle_osd_map epochs [414,415], i have 414, src has [1,415]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 415 ms_handle_reset con 0x55e615e94400 session 0x55e6125cc1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5992797 data_alloc: 251658240 data_used: 31043584
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575881216 unmapped: 75833344 heap: 651714560 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 415 handle_osd_map epochs [415,416], i have 415, src has [1,416]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e61194cc00 session 0x55e61186d860
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576937984 unmapped: 74776576 heap: 651714560 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6143a7c00 session 0x55e611650f00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576937984 unmapped: 74776576 heap: 651714560 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6143a6800 session 0x55e611927a40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6170a8800 session 0x55e612723e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6170a8800 session 0x55e6141fab40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e61194cc00 session 0x55e61186dc20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6143a6800 session 0x55e612722000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6143a7c00 session 0x55e6114d7680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576937984 unmapped: 74776576 heap: 651714560 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e615e94400 session 0x55e61216d4a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e615e94400 session 0x55e6116a52c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e61194cc00 session 0x55e6171232c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6143a6800 session 0x55e61196f0e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 ms_handle_reset con 0x55e6143a7c00 session 0x55e6121a8000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570064896 unmapped: 85852160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 heartbeat osd_stat(store_statfs(0x192d18000/0x0/0x1bfc00000, data 0x572107a/0x5946000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6055984 data_alloc: 251658240 data_used: 31043584
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570064896 unmapped: 85852160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570064896 unmapped: 85852160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568016896 unmapped: 87900160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 416 handle_osd_map epochs [416,417], i have 416, src has [1,417]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.550370216s of 12.453164101s, submitted: 68
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568025088 unmapped: 87891968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e62d160800 session 0x55e612044f00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e62d160800 session 0x55e613084960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e61194cc00 session 0x55e61216b2c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e6143a6800 session 0x55e61196f2c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568025088 unmapped: 87891968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e615e94400 session 0x55e613319c20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e6143a7c00 session 0x55e61306a000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e6143a7c00 session 0x55e61256e1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e61194cc00 session 0x55e6116a5c20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e6143a6800 session 0x55e6114d7e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x192d13000/0x0/0x1bfc00000, data 0x5722c2a/0x594b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6062002 data_alloc: 251658240 data_used: 31051776
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568025088 unmapped: 87891968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568025088 unmapped: 87891968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e615e94400 session 0x55e6140821e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568025088 unmapped: 87891968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568025088 unmapped: 87891968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x192d13000/0x0/0x1bfc00000, data 0x5722c2a/0x594b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [0,0,0,0,0,2])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572891136 unmapped: 83025920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e611824000 session 0x55e6114d6b40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6138499 data_alloc: 251658240 data_used: 41205760
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573194240 unmapped: 82722816 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573194240 unmapped: 82722816 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573341696 unmapped: 82575360 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x192ced000/0x0/0x1bfc00000, data 0x5747c4d/0x5971000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574242816 unmapped: 81674240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.508294106s of 11.667946815s, submitted: 31
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574242816 unmapped: 81674240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6168755 data_alloc: 251658240 data_used: 42401792
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574242816 unmapped: 81674240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574291968 unmapped: 81625088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574291968 unmapped: 81625088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x192ce6000/0x0/0x1bfc00000, data 0x574ec4d/0x5978000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574291968 unmapped: 81625088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575840256 unmapped: 80076800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x191f18000/0x0/0x1bfc00000, data 0x6104c4d/0x632e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6253085 data_alloc: 251658240 data_used: 42610688
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575840256 unmapped: 80076800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575864832 unmapped: 80052224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575193088 unmapped: 80723968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x191e9a000/0x0/0x1bfc00000, data 0x618ac4d/0x63b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 578838528 unmapped: 77078528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 579026944 unmapped: 76890112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6333918 data_alloc: 251658240 data_used: 44892160
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 579026944 unmapped: 76890112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.080519676s of 12.241555214s, submitted: 105
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575832064 unmapped: 80084992 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x1918ab000/0x0/0x1bfc00000, data 0x6779c4d/0x69a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575848448 unmapped: 80068608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x1918ab000/0x0/0x1bfc00000, data 0x6779c4d/0x69a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e6119cac00 session 0x55e611927680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 575848448 unmapped: 80068608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576946176 unmapped: 78970880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e616d5cc00 session 0x55e6112e6780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6102700 data_alloc: 251658240 data_used: 34942976
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576946176 unmapped: 78970880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576946176 unmapped: 78970880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x192c36000/0x0/0x1bfc00000, data 0x5327c4d/0x5551000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576946176 unmapped: 78970880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576954368 unmapped: 78962688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x192cfa000/0x0/0x1bfc00000, data 0x532ac4d/0x5554000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e61d732400 session 0x55e61256f0e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e6133ea000 session 0x55e61168c1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576954368 unmapped: 78962688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e62101f400 session 0x55e6116a43c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5895507 data_alloc: 234881024 data_used: 27729920
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576970752 unmapped: 78946304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x1940e2000/0x0/0x1bfc00000, data 0x3efdbdb/0x4125000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576970752 unmapped: 78946304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576970752 unmapped: 78946304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576978944 unmapped: 78938112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576978944 unmapped: 78938112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5895507 data_alloc: 234881024 data_used: 27729920
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 576978944 unmapped: 78938112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.050031662s of 14.386358261s, submitted: 69
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 577060864 unmapped: 78856192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x194129000/0x0/0x1bfc00000, data 0x3efdbdb/0x4125000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 577060864 unmapped: 78856192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x194129000/0x0/0x1bfc00000, data 0x3efdbdb/0x4125000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 577069056 unmapped: 78848000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 577077248 unmapped: 78839808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5895155 data_alloc: 234881024 data_used: 27729920
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 577536000 unmapped: 78381056 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x194129000/0x0/0x1bfc00000, data 0x3efdbdb/0x4125000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 577536000 unmapped: 78381056 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 heartbeat osd_stat(store_statfs(0x194129000/0x0/0x1bfc00000, data 0x3efdbdb/0x4125000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 ms_handle_reset con 0x55e61432fc00 session 0x55e6153e6780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 417 handle_osd_map epochs [417,418], i have 417, src has [1,418]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 418 ms_handle_reset con 0x55e613e77800 session 0x55e61216af00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 418 ms_handle_reset con 0x55e6133b5400 session 0x55e6121a81e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 418 ms_handle_reset con 0x55e6170a9000 session 0x55e617123a40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 598065152 unmapped: 57851904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 418 ms_handle_reset con 0x55e6119cac00 session 0x55e6171234a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 418 heartbeat osd_stat(store_statfs(0x194124000/0x0/0x1bfc00000, data 0x3eff844/0x4129000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 589946880 unmapped: 65970176 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 418 handle_osd_map epochs [418,419], i have 418, src has [1,419]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 419 ms_handle_reset con 0x55e6119cac00 session 0x55e611638780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 589946880 unmapped: 65970176 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 419 handle_osd_map epochs [419,420], i have 419, src has [1,420]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 420 ms_handle_reset con 0x55e6133b5400 session 0x55e6112e7860
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6184245 data_alloc: 251658240 data_used: 39505920
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590020608 unmapped: 65896448 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.693154335s of 10.000839233s, submitted: 57
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 420 ms_handle_reset con 0x55e613e77800 session 0x55e61435e3c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 420 ms_handle_reset con 0x55e61432fc00 session 0x55e612723680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 420 ms_handle_reset con 0x55e6170a9000 session 0x55e6140823c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590036992 unmapped: 65880064 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 420 heartbeat osd_stat(store_statfs(0x192067000/0x0/0x1bfc00000, data 0x5fb9166/0x61e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 420 handle_osd_map epochs [420,421], i have 420, src has [1,421]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 421 ms_handle_reset con 0x55e6119cac00 session 0x55e6125cc780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590086144 unmapped: 65830912 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590086144 unmapped: 65830912 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 421 ms_handle_reset con 0x55e61194cc00 session 0x55e61133a5a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 421 ms_handle_reset con 0x55e6143a6800 session 0x55e6116501e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 421 ms_handle_reset con 0x55e6143a7000 session 0x55e61306b0e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 592330752 unmapped: 63586304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 421 heartbeat osd_stat(store_statfs(0x1941b2000/0x0/0x1bfc00000, data 0x3938dae/0x3b63000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5893292 data_alloc: 251658240 data_used: 33304576
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 592330752 unmapped: 63586304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 592330752 unmapped: 63586304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 421 handle_osd_map epochs [421,422], i have 421, src has [1,422]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590045184 unmapped: 65871872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590045184 unmapped: 65871872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 422 heartbeat osd_stat(store_statfs(0x19470c000/0x0/0x1bfc00000, data 0x39168e6/0x3b41000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 422 ms_handle_reset con 0x55e62d15bc00 session 0x55e6121a9e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 422 ms_handle_reset con 0x55e61194cc00 session 0x55e61196e780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 422 ms_handle_reset con 0x55e613c29800 session 0x55e6117f4000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 422 ms_handle_reset con 0x55e6119cac00 session 0x55e6115caf00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 422 handle_osd_map epochs [422,423], i have 422, src has [1,423]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 590249984 unmapped: 65667072 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 423 ms_handle_reset con 0x55e6143a6800 session 0x55e6141fa1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 423 ms_handle_reset con 0x55e6143a7000 session 0x55e611338d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 423 ms_handle_reset con 0x55e61b1fe000 session 0x55e6116a5c20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 423 heartbeat osd_stat(store_statfs(0x19470c000/0x0/0x1bfc00000, data 0x39168e6/0x3b41000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5731086 data_alloc: 234881024 data_used: 16670720
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 580632576 unmapped: 75284480 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.230168343s of 10.558722496s, submitted: 112
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 423 ms_handle_reset con 0x55e613c77000 session 0x55e6141fbe00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 423 ms_handle_reset con 0x55e6178b0400 session 0x55e612045860
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 580648960 unmapped: 75268096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 423 ms_handle_reset con 0x55e61194cc00 session 0x55e61196f2c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 423 heartbeat osd_stat(store_statfs(0x19527b000/0x0/0x1bfc00000, data 0x2da65f5/0x2fd3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 423 handle_osd_map epochs [423,424], i have 423, src has [1,424]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5454505 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 424 heartbeat osd_stat(store_statfs(0x1963d1000/0x0/0x1bfc00000, data 0x18560ce/0x1a81000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 424 ms_handle_reset con 0x55e6133ee800 session 0x55e61216d680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 424 handle_osd_map epochs [424,425], i have 424, src has [1,425]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61c82e400 session 0x55e6143a8d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566198272 unmapped: 89718784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6178af800 session 0x55e6143a8780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e62101f000 session 0x55e611926d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566353920 unmapped: 89563136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x1967a4000/0x0/0x1bfc00000, data 0x187bc30/0x1aa9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x1967a4000/0x0/0x1bfc00000, data 0x187bc30/0x1aa9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5462528 data_alloc: 218103808 data_used: 1675264
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566353920 unmapped: 89563136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5494208 data_alloc: 218103808 data_used: 6172672
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x1967a4000/0x0/0x1bfc00000, data 0x187bc30/0x1aa9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566362112 unmapped: 89554944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5494688 data_alloc: 218103808 data_used: 6184960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566370304 unmapped: 89546752 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.329175949s of 19.560520172s, submitted: 87
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x1967a4000/0x0/0x1bfc00000, data 0x187bc30/0x1aa9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [0,0,0,4,3])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572686336 unmapped: 83230720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569131008 unmapped: 86786048 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195f2d000/0x0/0x1bfc00000, data 0x20f3c30/0x2321000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195f0b000/0x0/0x1bfc00000, data 0x2115c30/0x2343000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569237504 unmapped: 86679552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569237504 unmapped: 86679552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195eff000/0x0/0x1bfc00000, data 0x2121c30/0x234f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5576460 data_alloc: 218103808 data_used: 6492160
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569237504 unmapped: 86679552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569237504 unmapped: 86679552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195eff000/0x0/0x1bfc00000, data 0x2121c30/0x234f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569237504 unmapped: 86679552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574204 data_alloc: 218103808 data_used: 6496256
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195efc000/0x0/0x1bfc00000, data 0x2124c30/0x2352000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 569073664 unmapped: 86843392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 77K writes, 327K keys, 77K commit groups, 1.0 writes per commit group, ingest: 0.33 GB, 0.05 MB/s#012Cumulative WAL: 77K writes, 26K syncs, 2.91 writes per sync, written: 0.33 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6778 writes, 27K keys, 6778 commit groups, 1.0 writes per commit group, ingest: 28.97 MB, 0.05 MB/s#012Interval WAL: 6778 writes, 2628 syncs, 2.58 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.822758675s of 16.085189819s, submitted: 84
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61dc78400 session 0x55e6127234a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613c77800 session 0x55e6143485a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570294272 unmapped: 85622784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19553f000/0x0/0x1bfc00000, data 0x2ae0c92/0x2d0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5649897 data_alloc: 218103808 data_used: 6496256
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570245120 unmapped: 85671936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61b383c00 session 0x55e6121a81e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570417152 unmapped: 85499904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19551b000/0x0/0x1bfc00000, data 0x2b04c92/0x2d33000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5717114 data_alloc: 234881024 data_used: 15491072
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.217922211s of 17.367380142s, submitted: 38
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5717486 data_alloc: 234881024 data_used: 15491072
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195519000/0x0/0x1bfc00000, data 0x2b05c92/0x2d34000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 570728448 unmapped: 85188608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 574160896 unmapped: 81756160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x194ed4000/0x0/0x1bfc00000, data 0x3143c92/0x3372000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5787944 data_alloc: 234881024 data_used: 15544320
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573489152 unmapped: 82427904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573497344 unmapped: 82419712 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.052712440s of 10.270314217s, submitted: 69
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x194c7c000/0x0/0x1bfc00000, data 0x33a3c92/0x35d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5806950 data_alloc: 234881024 data_used: 16617472
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61432d000 session 0x55e6116a4780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613ae1c00 session 0x55e6112e6780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 573464576 unmapped: 82452480 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5594594 data_alloc: 218103808 data_used: 6606848
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572858368 unmapped: 83058688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195ed6000/0x0/0x1bfc00000, data 0x2149c30/0x2377000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e618348400 session 0x55e614082780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572866560 unmapped: 83050496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195efb000/0x0/0x1bfc00000, data 0x2125c30/0x2353000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.619760513s of 11.074169159s, submitted: 54
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5587286 data_alloc: 218103808 data_used: 6496256
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e618a28c00 session 0x55e6143492c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6170a8800 session 0x55e61216a780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 572923904 unmapped: 82993152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e618349000 session 0x55e61435e3c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565714944 unmapped: 90202112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565764096 unmapped: 90152960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565780480 unmapped: 90136576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565788672 unmapped: 90128384 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565788672 unmapped: 90128384 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565788672 unmapped: 90128384 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565788672 unmapped: 90128384 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565788672 unmapped: 90128384 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565796864 unmapped: 90120192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565796864 unmapped: 90120192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565796864 unmapped: 90120192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565796864 unmapped: 90120192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565805056 unmapped: 90112000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565805056 unmapped: 90112000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565813248 unmapped: 90103808 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565821440 unmapped: 90095616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565821440 unmapped: 90095616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565821440 unmapped: 90095616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565821440 unmapped: 90095616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565829632 unmapped: 90087424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565837824 unmapped: 90079232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5444212 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565846016 unmapped: 90071040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565854208 unmapped: 90062848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565854208 unmapped: 90062848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 109.116950989s of 109.828414917s, submitted: 277
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565854208 unmapped: 90062848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e618349800 session 0x55e613318960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e617e23c00 session 0x55e613078000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613448800 session 0x55e6132e2780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613448800 session 0x55e61256fc20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6170a8800 session 0x55e61216a1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5494573 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 89686016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 89686016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 89686016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 89686016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19668e000/0x0/0x1bfc00000, data 0x1994bab/0x1bc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 89686016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5494573 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19668e000/0x0/0x1bfc00000, data 0x1994bab/0x1bc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 89686016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566239232 unmapped: 89677824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6133b5800 session 0x55e6120452c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566239232 unmapped: 89677824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e616d5c000 session 0x55e6116a54a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e615e95000 session 0x55e6131a4d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6133b5800 session 0x55e61256f0e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565190656 unmapped: 90726400 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565198848 unmapped: 90718208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19668d000/0x0/0x1bfc00000, data 0x1994bbb/0x1bc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5528735 data_alloc: 218103808 data_used: 6242304
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19668d000/0x0/0x1bfc00000, data 0x1994bbb/0x1bc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19668d000/0x0/0x1bfc00000, data 0x1994bbb/0x1bc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5528735 data_alloc: 218103808 data_used: 6242304
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x19668d000/0x0/0x1bfc00000, data 0x1994bbb/0x1bc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565207040 unmapped: 90710016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.850133896s of 20.080587387s, submitted: 28
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 565592064 unmapped: 90324992 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5604607 data_alloc: 218103808 data_used: 6541312
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567631872 unmapped: 88285184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195c83000/0x0/0x1bfc00000, data 0x239dbbb/0x25ca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567853056 unmapped: 88064000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567853056 unmapped: 88064000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567853056 unmapped: 88064000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567853056 unmapped: 88064000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195c40000/0x0/0x1bfc00000, data 0x23e0bbb/0x260d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5619547 data_alloc: 218103808 data_used: 6529024
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567853056 unmapped: 88064000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195c40000/0x0/0x1bfc00000, data 0x23e0bbb/0x260d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5613859 data_alloc: 218103808 data_used: 6529024
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195c1f000/0x0/0x1bfc00000, data 0x2402bbb/0x262f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x195c1f000/0x0/0x1bfc00000, data 0x2402bbb/0x262f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.083731651s of 13.471039772s, submitted: 88
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613448800 session 0x55e6141fb4a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e616d5c000 session 0x55e6113392c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567672832 unmapped: 88244224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6133ea000 session 0x55e611650d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567574528 unmapped: 88342528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567582720 unmapped: 88334336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567582720 unmapped: 88334336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567590912 unmapped: 88326144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567599104 unmapped: 88317952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567607296 unmapped: 88309760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5456814 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e614331400 session 0x55e6118e83c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567615488 unmapped: 88301568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567623680 unmapped: 88293376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567623680 unmapped: 88293376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567623680 unmapped: 88293376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61c82c000 session 0x55e61653b4a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61d732400 session 0x55e61435f680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613ae1c00 session 0x55e6171230e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61dc79000 session 0x55e6132e3c20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 47.185108185s of 47.276470184s, submitted: 30
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e6170a8800 session 0x55e611926000
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5519113 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613ae1c00 session 0x55e6132e3a40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61c82c000 session 0x55e61196e780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196c36000/0x0/0x1bfc00000, data 0x13ecbab/0x1618000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [0,0,0,0,1])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61d732400 session 0x55e6120454a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61dc79000 session 0x55e6121a9a40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196495000/0x0/0x1bfc00000, data 0x1b8cbbb/0x1db9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5519113 data_alloc: 218103808 data_used: 1638400
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196495000/0x0/0x1bfc00000, data 0x1b8cbbb/0x1db9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196495000/0x0/0x1bfc00000, data 0x1b8cbbb/0x1db9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613c77c00 session 0x55e611339680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e613ae1c00 session 0x55e61216af00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567762944 unmapped: 88154112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61c82c000 session 0x55e6143494a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61d732400 session 0x55e6140823c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196470000/0x0/0x1bfc00000, data 0x1bb0bca/0x1dde000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5581896 data_alloc: 218103808 data_used: 9551872
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196470000/0x0/0x1bfc00000, data 0x1bb0bca/0x1dde000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5581896 data_alloc: 218103808 data_used: 9551872
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567918592 unmapped: 87998464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196470000/0x0/0x1bfc00000, data 0x1bb0bca/0x1dde000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.383367538s of 19.576660156s, submitted: 29
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5609840 data_alloc: 218103808 data_used: 9568256
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568942592 unmapped: 86974464 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196142000/0x0/0x1bfc00000, data 0x1edebca/0x210c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196142000/0x0/0x1bfc00000, data 0x1edebca/0x210c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5621382 data_alloc: 218103808 data_used: 9846784
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196132000/0x0/0x1bfc00000, data 0x1eeebca/0x211c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196132000/0x0/0x1bfc00000, data 0x1eeebca/0x211c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5621398 data_alloc: 218103808 data_used: 9846784
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 heartbeat osd_stat(store_statfs(0x196132000/0x0/0x1bfc00000, data 0x1eeebca/0x211c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568082432 unmapped: 87834624 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568098816 unmapped: 87818240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.063075066s of 13.349035263s, submitted: 47
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568098816 unmapped: 87818240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568098816 unmapped: 87818240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 ms_handle_reset con 0x55e61dc79c00 session 0x55e6141fb860
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5635212 data_alloc: 218103808 data_used: 9846784
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568098816 unmapped: 87818240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 425 handle_osd_map epochs [425,426], i have 425, src has [1,426]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 426 ms_handle_reset con 0x55e616d5c000 session 0x55e6143a8d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568107008 unmapped: 87810048 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 426 heartbeat osd_stat(store_statfs(0x195f30000/0x0/0x1bfc00000, data 0x20efdca/0x231e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568107008 unmapped: 87810048 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 426 ms_handle_reset con 0x55e613ae1c00 session 0x55e6127234a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568107008 unmapped: 87810048 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 426 heartbeat osd_stat(store_statfs(0x195f2c000/0x0/0x1bfc00000, data 0x20f1a23/0x2321000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 426 heartbeat osd_stat(store_statfs(0x195f2c000/0x0/0x1bfc00000, data 0x20f1a23/0x2321000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 426 handle_osd_map epochs [426,427], i have 426, src has [1,427]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 426 handle_osd_map epochs [427,427], i have 427, src has [1,427]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 427 ms_handle_reset con 0x55e613b85400 session 0x55e6143492c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 427 ms_handle_reset con 0x55e61c82c000 session 0x55e612722780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 427 ms_handle_reset con 0x55e616d5c000 session 0x55e61196e780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 568115200 unmapped: 87801856 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 427 heartbeat osd_stat(store_statfs(0x195f2c000/0x0/0x1bfc00000, data 0x20f1a23/0x2321000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [0,0,0,0,0,0,3])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 427 heartbeat osd_stat(store_statfs(0x1952e4000/0x0/0x1bfc00000, data 0x2d3768c/0x2f69000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 427 ms_handle_reset con 0x55e61d732400 session 0x55e611f14b40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5762274 data_alloc: 234881024 data_used: 13156352
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567771136 unmapped: 88145920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 427 handle_osd_map epochs [427,428], i have 427, src has [1,428]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 427 handle_osd_map epochs [428,428], i have 428, src has [1,428]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 428 ms_handle_reset con 0x55e613ae1c00 session 0x55e6153e7860
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567771136 unmapped: 88145920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 428 handle_osd_map epochs [428,429], i have 428, src has [1,429]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 429 ms_handle_reset con 0x55e614c33c00 session 0x55e6140830e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567779328 unmapped: 88137728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 429 ms_handle_reset con 0x55e61194d000 session 0x55e614083e00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 429 ms_handle_reset con 0x55e613448400 session 0x55e614082d20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567779328 unmapped: 88137728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 429 ms_handle_reset con 0x55e613e76000 session 0x55e614082f00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 429 heartbeat osd_stat(store_statfs(0x1952dd000/0x0/0x1bfc00000, data 0x2d3afae/0x2f6f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567869440 unmapped: 88047616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 429 ms_handle_reset con 0x55e613448400 session 0x55e61435f680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 429 ms_handle_reset con 0x55e61194d000 session 0x55e61196dc20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5770122 data_alloc: 234881024 data_used: 13168640
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 567869440 unmapped: 88047616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 429 handle_osd_map epochs [429,430], i have 429, src has [1,430]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.447452545s of 13.320432663s, submitted: 50
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952db000/0x0/0x1bfc00000, data 0x2d3caed/0x2f72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e61dc78800 session 0x55e610dd34a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e611fa0c00 session 0x55e610dd2780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e611fa1000 session 0x55e6121a8780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e61194d000 session 0x55e6121a8f00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e611fa1000 session 0x55e6132e3c20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e611fa0c00 session 0x55e61133b680
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e613448400 session 0x55e61133b0e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e61dc78800 session 0x55e61653a3c0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e61194d000 session 0x55e61653af00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952db000/0x0/0x1bfc00000, data 0x2d3caed/0x2f72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5758008 data_alloc: 234881024 data_used: 13168640
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562118656 unmapped: 93798400 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562126848 unmapped: 93790208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952db000/0x0/0x1bfc00000, data 0x2d3cafd/0x2f73000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5758168 data_alloc: 234881024 data_used: 13172736
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562126848 unmapped: 93790208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562126848 unmapped: 93790208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.108816147s of 11.132290840s, submitted: 17
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562126848 unmapped: 93790208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e62d164400 session 0x55e61653a780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 93642752 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 93642752 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5779177 data_alloc: 234881024 data_used: 15343616
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952b7000/0x0/0x1bfc00000, data 0x2d60afd/0x2f97000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952b7000/0x0/0x1bfc00000, data 0x2d60afd/0x2f97000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5807017 data_alloc: 234881024 data_used: 19283968
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952b7000/0x0/0x1bfc00000, data 0x2d60afd/0x2f97000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952b7000/0x0/0x1bfc00000, data 0x2d60afd/0x2f97000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5807017 data_alloc: 234881024 data_used: 19283968
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.592308044s of 13.788862228s, submitted: 9
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 91815936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 91815936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x19526c000/0x0/0x1bfc00000, data 0x2dabafd/0x2fe2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 91815936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 91815936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x19526c000/0x0/0x1bfc00000, data 0x2dabafd/0x2fe2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5830915 data_alloc: 234881024 data_used: 21471232
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 91815936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 91815936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5839683 data_alloc: 234881024 data_used: 21471232
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951fe000/0x0/0x1bfc00000, data 0x2e19afd/0x3050000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.524675369s of 11.724849701s, submitted: 9
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5839959 data_alloc: 234881024 data_used: 21471232
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 91332608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951fd000/0x0/0x1bfc00000, data 0x2e1aafd/0x3051000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5839659 data_alloc: 234881024 data_used: 21471232
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5839659 data_alloc: 234881024 data_used: 21471232
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 564240384 unmapped: 91676672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5840939 data_alloc: 234881024 data_used: 21725184
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566075392 unmapped: 89841664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e6133efc00 session 0x55e61256e5a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566075392 unmapped: 89841664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.332700729s of 19.351327896s, submitted: 12
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951f5000/0x0/0x1bfc00000, data 0x2e1fafd/0x3056000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566075392 unmapped: 89841664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566075392 unmapped: 89841664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566075392 unmapped: 89841664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e618349400 session 0x55e61216c960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e614329c00 session 0x55e6120454a0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5843485 data_alloc: 234881024 data_used: 23822336
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566083584 unmapped: 89833472 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952b8000/0x0/0x1bfc00000, data 0x2d60aed/0x2f96000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566083584 unmapped: 89833472 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566099968 unmapped: 89817088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e61194d000 session 0x55e61435e1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952dc000/0x0/0x1bfc00000, data 0x2d3caed/0x2f72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566099968 unmapped: 89817088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952dc000/0x0/0x1bfc00000, data 0x2d3caed/0x2f72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5823139 data_alloc: 234881024 data_used: 23355392
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e62d15a800 session 0x55e6117f4b40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 ms_handle_reset con 0x55e6170a9800 session 0x55e61168cd20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952dc000/0x0/0x1bfc00000, data 0x2d3caed/0x2f72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5823139 data_alloc: 234881024 data_used: 23355392
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 handle_osd_map epochs [430,431], i have 430, src has [1,431]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.423069000s of 13.489458084s, submitted: 23
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 heartbeat osd_stat(store_statfs(0x1952dc000/0x0/0x1bfc00000, data 0x2d3caed/0x2f72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 430 handle_osd_map epochs [431,431], i have 431, src has [1,431]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566124544 unmapped: 89792512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566140928 unmapped: 89776128 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566140928 unmapped: 89776128 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 431 heartbeat osd_stat(store_statfs(0x1952d8000/0x0/0x1bfc00000, data 0x2d3e79a/0x2f75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566140928 unmapped: 89776128 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566140928 unmapped: 89776128 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5826440 data_alloc: 234881024 data_used: 23359488
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 566149120 unmapped: 89767936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563675136 unmapped: 92241920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 431 ms_handle_reset con 0x55e6143a6400 session 0x55e61435e1e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 431 heartbeat osd_stat(store_statfs(0x195f1e000/0x0/0x1bfc00000, data 0x20fa78a/0x2330000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 431 heartbeat osd_stat(store_statfs(0x195f1e000/0x0/0x1bfc00000, data 0x20fa78a/0x2330000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 431 ms_handle_reset con 0x55e614c33000 session 0x55e61133b0e0
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563675136 unmapped: 92241920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563675136 unmapped: 92241920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 431 handle_osd_map epochs [431,432], i have 431, src has [1,432]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 432 ms_handle_reset con 0x55e61194d000 session 0x55e6132e3c20
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563683328 unmapped: 92233728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5686406 data_alloc: 234881024 data_used: 15290368
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563683328 unmapped: 92233728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 432 handle_osd_map epochs [432,433], i have 432, src has [1,433]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.539867401s of 10.603461266s, submitted: 27
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 433 heartbeat osd_stat(store_statfs(0x195f1a000/0x0/0x1bfc00000, data 0x20fc453/0x2333000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563691520 unmapped: 92225536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 433 heartbeat osd_stat(store_statfs(0x195f17000/0x0/0x1bfc00000, data 0x20fdfae/0x2336000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563691520 unmapped: 92225536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 433 ms_handle_reset con 0x55e61dc79000 session 0x55e61196c780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 433 ms_handle_reset con 0x55e61432cc00 session 0x55e617122b40
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 92217344 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 433 ms_handle_reset con 0x55e610e23000 session 0x55e6121a8f00
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 433 heartbeat osd_stat(store_statfs(0x196a1c000/0x0/0x1bfc00000, data 0x15fbf8f/0x1832000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5528311 data_alloc: 218103808 data_used: 3780608
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 433 handle_osd_map epochs [433,434], i have 433, src has [1,434]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 ms_handle_reset con 0x55e613add000 session 0x55e617122780
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 ms_handle_reset con 0x55e610e23000 session 0x55e6121a8960
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196a18000/0x0/0x1bfc00000, data 0x15fdace/0x1835000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5512982 data_alloc: 218103808 data_used: 1683456
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5512982 data_alloc: 218103808 data_used: 1683456
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560513024 unmapped: 95404032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560570368 unmapped: 95346688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560611328 unmapped: 95305728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560611328 unmapped: 95305728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560619520 unmapped: 95297536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560562176 unmapped: 95354880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'config diff' '{prefix=config diff}'
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'config show' '{prefix=config show}'
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'counter dump' '{prefix=counter dump}'
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'counter schema' '{prefix=counter schema}'
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560652288 unmapped: 95264768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560660480 unmapped: 95256576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'log dump' '{prefix=log dump}'
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560660480 unmapped: 95256576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'perf dump' '{prefix=perf dump}'
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'perf schema' '{prefix=perf schema}'
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560332800 unmapped: 95584256 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560332800 unmapped: 95584256 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560332800 unmapped: 95584256 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560332800 unmapped: 95584256 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560332800 unmapped: 95584256 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560332800 unmapped: 95584256 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560340992 unmapped: 95576064 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560349184 unmapped: 95567872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560349184 unmapped: 95567872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560349184 unmapped: 95567872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560349184 unmapped: 95567872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560349184 unmapped: 95567872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560349184 unmapped: 95567872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560349184 unmapped: 95567872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560365568 unmapped: 95551488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560365568 unmapped: 95551488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560365568 unmapped: 95551488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560365568 unmapped: 95551488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560365568 unmapped: 95551488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560365568 unmapped: 95551488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560365568 unmapped: 95551488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560365568 unmapped: 95551488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560373760 unmapped: 95543296 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560373760 unmapped: 95543296 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560381952 unmapped: 95535104 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560381952 unmapped: 95535104 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560390144 unmapped: 95526912 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560390144 unmapped: 95526912 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560390144 unmapped: 95526912 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560390144 unmapped: 95526912 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560398336 unmapped: 95518720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560398336 unmapped: 95518720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560398336 unmapped: 95518720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560398336 unmapped: 95518720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560398336 unmapped: 95518720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560398336 unmapped: 95518720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560398336 unmapped: 95518720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560398336 unmapped: 95518720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560406528 unmapped: 95510528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560406528 unmapped: 95510528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560406528 unmapped: 95510528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560406528 unmapped: 95510528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560414720 unmapped: 95502336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560414720 unmapped: 95502336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560414720 unmapped: 95502336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560414720 unmapped: 95502336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560431104 unmapped: 95485952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560431104 unmapped: 95485952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560439296 unmapped: 95477760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560439296 unmapped: 95477760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560439296 unmapped: 95477760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560439296 unmapped: 95477760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560439296 unmapped: 95477760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560439296 unmapped: 95477760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560447488 unmapped: 95469568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560447488 unmapped: 95469568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560447488 unmapped: 95469568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560455680 unmapped: 95461376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560455680 unmapped: 95461376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560455680 unmapped: 95461376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560463872 unmapped: 95453184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560463872 unmapped: 95453184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560480256 unmapped: 95436800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560480256 unmapped: 95436800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560480256 unmapped: 95436800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560480256 unmapped: 95436800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560480256 unmapped: 95436800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560480256 unmapped: 95436800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560480256 unmapped: 95436800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560480256 unmapped: 95436800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560488448 unmapped: 95428608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560488448 unmapped: 95428608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560496640 unmapped: 95420416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560504832 unmapped: 95412224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560521216 unmapped: 95395840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 95387648 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 95387648 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 95387648 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 95387648 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560529408 unmapped: 95387648 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560537600 unmapped: 95379456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560545792 unmapped: 95371264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560553984 unmapped: 95363072 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560553984 unmapped: 95363072 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560553984 unmapped: 95363072 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560553984 unmapped: 95363072 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560553984 unmapped: 95363072 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560553984 unmapped: 95363072 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560553984 unmapped: 95363072 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560578560 unmapped: 95338496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560578560 unmapped: 95338496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560578560 unmapped: 95338496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560578560 unmapped: 95338496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560578560 unmapped: 95338496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560586752 unmapped: 95330304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560594944 unmapped: 95322112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560603136 unmapped: 95313920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560611328 unmapped: 95305728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560619520 unmapped: 95297536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560619520 unmapped: 95297536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560619520 unmapped: 95297536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560619520 unmapped: 95297536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560619520 unmapped: 95297536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560619520 unmapped: 95297536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560619520 unmapped: 95297536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560635904 unmapped: 95281152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560635904 unmapped: 95281152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560635904 unmapped: 95281152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560635904 unmapped: 95281152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560635904 unmapped: 95281152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560635904 unmapped: 95281152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560635904 unmapped: 95281152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560635904 unmapped: 95281152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 95272960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 95272960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 95272960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 95272960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 95272960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 95272960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 95272960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 95272960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560660480 unmapped: 95256576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560660480 unmapped: 95256576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560660480 unmapped: 95256576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560660480 unmapped: 95256576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 7200.1 total, 600.0 interval
Cumulative writes: 78K writes, 332K keys, 78K commit groups, 1.0 writes per commit group, ingest: 0.34 GB, 0.05 MB/s
Cumulative WAL: 78K writes, 27K syncs, 2.90 writes per sync, written: 0.34 GB, 0.05 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1564 writes, 4848 keys, 1564 commit groups, 1.0 writes per commit group, ingest: 4.11 MB, 0.01 MB/s
Interval WAL: 1564 writes, 689 syncs, 2.27 writes per sync, written: 0.00 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560660480 unmapped: 95256576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560660480 unmapped: 95256576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560660480 unmapped: 95256576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560660480 unmapped: 95256576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560676864 unmapped: 95240192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560676864 unmapped: 95240192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: mgrc ms_handle_reset ms_handle_reset con 0x55e61cb41800
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3835187053
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3835187053,v1:192.168.122.100:6801/3835187053]
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: mgrc handle_mgr_configure stats_period=5
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560939008 unmapped: 94978048 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560947200 unmapped: 94969856 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560947200 unmapped: 94969856 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560947200 unmapped: 94969856 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560947200 unmapped: 94969856 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560947200 unmapped: 94969856 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560947200 unmapped: 94969856 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 94961664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 94961664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 94961664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560947200 unmapped: 94969856 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560947200 unmapped: 94969856 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560947200 unmapped: 94969856 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560947200 unmapped: 94969856 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 94961664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 94961664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 94961664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 94961664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560963584 unmapped: 94953472 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560963584 unmapped: 94953472 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560963584 unmapped: 94953472 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560963584 unmapped: 94953472 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560979968 unmapped: 94937088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560979968 unmapped: 94937088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 94928896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 94928896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513142 data_alloc: 218103808 data_used: 1687552
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 94928896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 94928896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 94928896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1a000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 94928896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 279.219055176s of 279.595733643s, submitted: 68
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560930816 unmapped: 94986240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x196c1b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x279af9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560963584 unmapped: 94953472 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560979968 unmapped: 94937088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560979968 unmapped: 94937088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560979968 unmapped: 94937088 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 560996352 unmapped: 94920704 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [0,0,0,1])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561061888 unmapped: 94855168 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561086464 unmapped: 94830592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561111040 unmapped: 94806016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561111040 unmapped: 94806016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561111040 unmapped: 94806016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561111040 unmapped: 94806016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561111040 unmapped: 94806016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561111040 unmapped: 94806016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561111040 unmapped: 94806016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561111040 unmapped: 94806016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561111040 unmapped: 94806016 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561119232 unmapped: 94797824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561127424 unmapped: 94789632 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561127424 unmapped: 94789632 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561127424 unmapped: 94789632 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561127424 unmapped: 94789632 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561127424 unmapped: 94789632 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561127424 unmapped: 94789632 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561127424 unmapped: 94789632 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561127424 unmapped: 94789632 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561135616 unmapped: 94781440 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561135616 unmapped: 94781440 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561135616 unmapped: 94781440 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561135616 unmapped: 94781440 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561135616 unmapped: 94781440 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561135616 unmapped: 94781440 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561135616 unmapped: 94781440 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561135616 unmapped: 94781440 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561143808 unmapped: 94773248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561143808 unmapped: 94773248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561143808 unmapped: 94773248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561143808 unmapped: 94773248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561143808 unmapped: 94773248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561143808 unmapped: 94773248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561143808 unmapped: 94773248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561143808 unmapped: 94773248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:36 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561152000 unmapped: 94765056 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561152000 unmapped: 94765056 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561152000 unmapped: 94765056 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561152000 unmapped: 94765056 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561152000 unmapped: 94765056 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561152000 unmapped: 94765056 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561152000 unmapped: 94765056 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561152000 unmapped: 94765056 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561160192 unmapped: 94756864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561160192 unmapped: 94756864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561160192 unmapped: 94756864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561160192 unmapped: 94756864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561160192 unmapped: 94756864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561168384 unmapped: 94748672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561168384 unmapped: 94748672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561168384 unmapped: 94748672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561168384 unmapped: 94748672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561168384 unmapped: 94748672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561168384 unmapped: 94748672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561168384 unmapped: 94748672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561168384 unmapped: 94748672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561176576 unmapped: 94740480 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561176576 unmapped: 94740480 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561176576 unmapped: 94740480 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561184768 unmapped: 94732288 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561192960 unmapped: 94724096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561192960 unmapped: 94724096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561192960 unmapped: 94724096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561192960 unmapped: 94724096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561192960 unmapped: 94724096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561192960 unmapped: 94724096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561192960 unmapped: 94724096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561201152 unmapped: 94715904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561201152 unmapped: 94715904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561201152 unmapped: 94715904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561201152 unmapped: 94715904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561201152 unmapped: 94715904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561201152 unmapped: 94715904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561201152 unmapped: 94715904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561201152 unmapped: 94715904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561217536 unmapped: 94699520 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561217536 unmapped: 94699520 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561217536 unmapped: 94699520 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561217536 unmapped: 94699520 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561217536 unmapped: 94699520 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561217536 unmapped: 94699520 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561217536 unmapped: 94699520 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561217536 unmapped: 94699520 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561225728 unmapped: 94691328 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561225728 unmapped: 94691328 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561225728 unmapped: 94691328 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561225728 unmapped: 94691328 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561225728 unmapped: 94691328 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561225728 unmapped: 94691328 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561225728 unmapped: 94691328 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561225728 unmapped: 94691328 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561233920 unmapped: 94683136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561242112 unmapped: 94674944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561242112 unmapped: 94674944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561242112 unmapped: 94674944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561250304 unmapped: 94666752 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561250304 unmapped: 94666752 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561250304 unmapped: 94666752 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561250304 unmapped: 94666752 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561258496 unmapped: 94658560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561258496 unmapped: 94658560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561258496 unmapped: 94658560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561258496 unmapped: 94658560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561258496 unmapped: 94658560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561258496 unmapped: 94658560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561266688 unmapped: 94650368 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561266688 unmapped: 94650368 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561266688 unmapped: 94650368 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561266688 unmapped: 94650368 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561266688 unmapped: 94650368 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561274880 unmapped: 94642176 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561274880 unmapped: 94642176 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561274880 unmapped: 94642176 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561274880 unmapped: 94642176 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561274880 unmapped: 94642176 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561283072 unmapped: 94633984 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561283072 unmapped: 94633984 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561283072 unmapped: 94633984 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561283072 unmapped: 94633984 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561283072 unmapped: 94633984 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561283072 unmapped: 94633984 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561283072 unmapped: 94633984 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561283072 unmapped: 94633984 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561291264 unmapped: 94625792 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561291264 unmapped: 94625792 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561291264 unmapped: 94625792 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561291264 unmapped: 94625792 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561291264 unmapped: 94625792 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561291264 unmapped: 94625792 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561291264 unmapped: 94625792 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561291264 unmapped: 94625792 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561299456 unmapped: 94617600 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561307648 unmapped: 94609408 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561307648 unmapped: 94609408 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561315840 unmapped: 94601216 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561315840 unmapped: 94601216 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561315840 unmapped: 94601216 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561315840 unmapped: 94601216 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561315840 unmapped: 94601216 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561315840 unmapped: 94601216 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561315840 unmapped: 94601216 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561315840 unmapped: 94601216 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561324032 unmapped: 94593024 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561324032 unmapped: 94593024 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561324032 unmapped: 94593024 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561324032 unmapped: 94593024 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561324032 unmapped: 94593024 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561332224 unmapped: 94584832 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561332224 unmapped: 94584832 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561332224 unmapped: 94584832 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561348608 unmapped: 94568448 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561348608 unmapped: 94568448 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561348608 unmapped: 94568448 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561348608 unmapped: 94568448 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561348608 unmapped: 94568448 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561364992 unmapped: 94552064 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561364992 unmapped: 94552064 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561364992 unmapped: 94552064 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561373184 unmapped: 94543872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561381376 unmapped: 94535680 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561381376 unmapped: 94535680 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561381376 unmapped: 94535680 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561381376 unmapped: 94535680 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561389568 unmapped: 94527488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561389568 unmapped: 94527488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561389568 unmapped: 94527488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561389568 unmapped: 94527488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561389568 unmapped: 94527488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561389568 unmapped: 94527488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561389568 unmapped: 94527488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561389568 unmapped: 94527488 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561397760 unmapped: 94519296 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561397760 unmapped: 94519296 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561397760 unmapped: 94519296 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561397760 unmapped: 94519296 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561397760 unmapped: 94519296 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561397760 unmapped: 94519296 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561397760 unmapped: 94519296 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561397760 unmapped: 94519296 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561405952 unmapped: 94511104 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561414144 unmapped: 94502912 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561414144 unmapped: 94502912 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561414144 unmapped: 94502912 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561422336 unmapped: 94494720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561422336 unmapped: 94494720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561422336 unmapped: 94494720 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561430528 unmapped: 94486528 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561438720 unmapped: 94478336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561438720 unmapped: 94478336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561438720 unmapped: 94478336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561438720 unmapped: 94478336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561438720 unmapped: 94478336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561438720 unmapped: 94478336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561438720 unmapped: 94478336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561438720 unmapped: 94478336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561446912 unmapped: 94470144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561446912 unmapped: 94470144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561455104 unmapped: 94461952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561463296 unmapped: 94453760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561463296 unmapped: 94453760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561463296 unmapped: 94453760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561463296 unmapped: 94453760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561463296 unmapped: 94453760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561471488 unmapped: 94445568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561471488 unmapped: 94445568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561471488 unmapped: 94445568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561479680 unmapped: 94437376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561479680 unmapped: 94437376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561479680 unmapped: 94437376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561479680 unmapped: 94437376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561479680 unmapped: 94437376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561487872 unmapped: 94429184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561487872 unmapped: 94429184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561487872 unmapped: 94429184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561496064 unmapped: 94420992 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561496064 unmapped: 94420992 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561496064 unmapped: 94420992 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561496064 unmapped: 94420992 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561496064 unmapped: 94420992 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561504256 unmapped: 94412800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561504256 unmapped: 94412800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561512448 unmapped: 94404608 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561528832 unmapped: 94388224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561528832 unmapped: 94388224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561528832 unmapped: 94388224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561528832 unmapped: 94388224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561528832 unmapped: 94388224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561528832 unmapped: 94388224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561528832 unmapped: 94388224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561528832 unmapped: 94388224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561537024 unmapped: 94380032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561537024 unmapped: 94380032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561537024 unmapped: 94380032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561537024 unmapped: 94380032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561545216 unmapped: 94371840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561545216 unmapped: 94371840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561545216 unmapped: 94371840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561545216 unmapped: 94371840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561553408 unmapped: 94363648 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561553408 unmapped: 94363648 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561561600 unmapped: 94355456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561561600 unmapped: 94355456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561561600 unmapped: 94355456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561569792 unmapped: 94347264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561569792 unmapped: 94347264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561569792 unmapped: 94347264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561577984 unmapped: 94339072 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561586176 unmapped: 94330880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561586176 unmapped: 94330880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561586176 unmapped: 94330880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561586176 unmapped: 94330880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561586176 unmapped: 94330880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561586176 unmapped: 94330880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561586176 unmapped: 94330880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561594368 unmapped: 94322688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561594368 unmapped: 94322688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561594368 unmapped: 94322688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561594368 unmapped: 94322688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561610752 unmapped: 94306304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561618944 unmapped: 94298112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561618944 unmapped: 94298112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561618944 unmapped: 94298112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561627136 unmapped: 94289920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561627136 unmapped: 94289920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561627136 unmapped: 94289920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561627136 unmapped: 94289920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561627136 unmapped: 94289920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561627136 unmapped: 94289920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561627136 unmapped: 94289920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561627136 unmapped: 94289920 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561635328 unmapped: 94281728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561635328 unmapped: 94281728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561635328 unmapped: 94281728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561635328 unmapped: 94281728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561635328 unmapped: 94281728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561635328 unmapped: 94281728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561635328 unmapped: 94281728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561635328 unmapped: 94281728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561659904 unmapped: 94257152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561668096 unmapped: 94248960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561668096 unmapped: 94248960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561668096 unmapped: 94248960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561668096 unmapped: 94248960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561668096 unmapped: 94248960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561668096 unmapped: 94248960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561668096 unmapped: 94248960 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561676288 unmapped: 94240768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561676288 unmapped: 94240768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561676288 unmapped: 94240768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561676288 unmapped: 94240768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561676288 unmapped: 94240768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561676288 unmapped: 94240768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561676288 unmapped: 94240768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561684480 unmapped: 94232576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561700864 unmapped: 94216192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561700864 unmapped: 94216192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561700864 unmapped: 94216192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561700864 unmapped: 94216192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561709056 unmapped: 94208000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561709056 unmapped: 94208000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561709056 unmapped: 94208000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561709056 unmapped: 94208000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561709056 unmapped: 94208000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561709056 unmapped: 94208000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561709056 unmapped: 94208000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561709056 unmapped: 94208000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561709056 unmapped: 94208000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561709056 unmapped: 94208000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561709056 unmapped: 94208000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561709056 unmapped: 94208000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561725440 unmapped: 94191616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561725440 unmapped: 94191616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561725440 unmapped: 94191616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561725440 unmapped: 94191616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561733632 unmapped: 94183424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561733632 unmapped: 94183424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561733632 unmapped: 94183424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561733632 unmapped: 94183424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561741824 unmapped: 94175232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561741824 unmapped: 94175232 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561750016 unmapped: 94167040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561750016 unmapped: 94167040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561750016 unmapped: 94167040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561750016 unmapped: 94167040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561750016 unmapped: 94167040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561750016 unmapped: 94167040 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561758208 unmapped: 94158848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561758208 unmapped: 94158848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561758208 unmapped: 94158848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561758208 unmapped: 94158848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561758208 unmapped: 94158848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561758208 unmapped: 94158848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561758208 unmapped: 94158848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561758208 unmapped: 94158848 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561782784 unmapped: 94134272 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561782784 unmapped: 94134272 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561782784 unmapped: 94134272 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561782784 unmapped: 94134272 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561782784 unmapped: 94134272 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561782784 unmapped: 94134272 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561782784 unmapped: 94134272 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561790976 unmapped: 94126080 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561790976 unmapped: 94126080 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561790976 unmapped: 94126080 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561790976 unmapped: 94126080 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561790976 unmapped: 94126080 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561790976 unmapped: 94126080 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561790976 unmapped: 94126080 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561799168 unmapped: 94117888 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561799168 unmapped: 94117888 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561815552 unmapped: 94101504 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561815552 unmapped: 94101504 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561815552 unmapped: 94101504 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561831936 unmapped: 94085120 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561831936 unmapped: 94085120 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561831936 unmapped: 94085120 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561831936 unmapped: 94085120 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561831936 unmapped: 94085120 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561831936 unmapped: 94085120 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561831936 unmapped: 94085120 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561831936 unmapped: 94085120 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561831936 unmapped: 94085120 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561840128 unmapped: 94076928 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561840128 unmapped: 94076928 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561840128 unmapped: 94076928 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561840128 unmapped: 94076928 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561856512 unmapped: 94060544 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561864704 unmapped: 94052352 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561864704 unmapped: 94052352 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561872896 unmapped: 94044160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561872896 unmapped: 94044160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561872896 unmapped: 94044160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561872896 unmapped: 94044160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561872896 unmapped: 94044160 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561881088 unmapped: 94035968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561881088 unmapped: 94035968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561881088 unmapped: 94035968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561881088 unmapped: 94035968 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561889280 unmapped: 94027776 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561889280 unmapped: 94027776 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561889280 unmapped: 94027776 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561889280 unmapped: 94027776 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561913856 unmapped: 94003200 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561913856 unmapped: 94003200 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561913856 unmapped: 94003200 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561913856 unmapped: 94003200 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561913856 unmapped: 94003200 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561913856 unmapped: 94003200 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561913856 unmapped: 94003200 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561913856 unmapped: 94003200 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561922048 unmapped: 93995008 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561922048 unmapped: 93995008 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561922048 unmapped: 93995008 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561922048 unmapped: 93995008 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561922048 unmapped: 93995008 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561930240 unmapped: 93986816 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561930240 unmapped: 93986816 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561930240 unmapped: 93986816 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561946624 unmapped: 93970432 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561946624 unmapped: 93970432 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561946624 unmapped: 93970432 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561946624 unmapped: 93970432 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561946624 unmapped: 93970432 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561946624 unmapped: 93970432 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561946624 unmapped: 93970432 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561946624 unmapped: 93970432 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561954816 unmapped: 93962240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561954816 unmapped: 93962240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561954816 unmapped: 93962240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561954816 unmapped: 93962240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561954816 unmapped: 93962240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561954816 unmapped: 93962240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561954816 unmapped: 93962240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561954816 unmapped: 93962240 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561979392 unmapped: 93937664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561979392 unmapped: 93937664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561979392 unmapped: 93937664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561979392 unmapped: 93937664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561979392 unmapped: 93937664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561979392 unmapped: 93937664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561979392 unmapped: 93937664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561979392 unmapped: 93937664 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561987584 unmapped: 93929472 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561995776 unmapped: 93921280 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561995776 unmapped: 93921280 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561995776 unmapped: 93921280 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561995776 unmapped: 93921280 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561995776 unmapped: 93921280 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561995776 unmapped: 93921280 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 561995776 unmapped: 93921280 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562012160 unmapped: 93904896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562012160 unmapped: 93904896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562012160 unmapped: 93904896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562012160 unmapped: 93904896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562012160 unmapped: 93904896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562012160 unmapped: 93904896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562012160 unmapped: 93904896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562012160 unmapped: 93904896 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562020352 unmapped: 93896704 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562020352 unmapped: 93896704 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562028544 unmapped: 93888512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562028544 unmapped: 93888512 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562036736 unmapped: 93880320 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562036736 unmapped: 93880320 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562036736 unmapped: 93880320 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562036736 unmapped: 93880320 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562053120 unmapped: 93863936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562053120 unmapped: 93863936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562053120 unmapped: 93863936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562053120 unmapped: 93863936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562053120 unmapped: 93863936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562053120 unmapped: 93863936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562053120 unmapped: 93863936 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562069504 unmapped: 93847552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562069504 unmapped: 93847552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562069504 unmapped: 93847552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562069504 unmapped: 93847552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562069504 unmapped: 93847552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562069504 unmapped: 93847552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562069504 unmapped: 93847552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562069504 unmapped: 93847552 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562085888 unmapped: 93831168 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562094080 unmapped: 93822976 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562094080 unmapped: 93822976 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562094080 unmapped: 93822976 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562094080 unmapped: 93822976 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562094080 unmapped: 93822976 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562094080 unmapped: 93822976 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562094080 unmapped: 93822976 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562102272 unmapped: 93814784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562102272 unmapped: 93814784 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562110464 unmapped: 93806592 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562118656 unmapped: 93798400 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562118656 unmapped: 93798400 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562118656 unmapped: 93798400 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562118656 unmapped: 93798400 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562118656 unmapped: 93798400 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562126848 unmapped: 93790208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562126848 unmapped: 93790208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562126848 unmapped: 93790208 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562143232 unmapped: 93773824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562143232 unmapped: 93773824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562143232 unmapped: 93773824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562143232 unmapped: 93773824 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562151424 unmapped: 93765632 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562151424 unmapped: 93765632 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562151424 unmapped: 93765632 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562159616 unmapped: 93757440 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562167808 unmapped: 93749248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562167808 unmapped: 93749248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562167808 unmapped: 93749248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562167808 unmapped: 93749248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562167808 unmapped: 93749248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562167808 unmapped: 93749248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562167808 unmapped: 93749248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562167808 unmapped: 93749248 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562184192 unmapped: 93732864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562184192 unmapped: 93732864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562184192 unmapped: 93732864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562184192 unmapped: 93732864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562184192 unmapped: 93732864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562184192 unmapped: 93732864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562184192 unmapped: 93732864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562184192 unmapped: 93732864 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562192384 unmapped: 93724672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562192384 unmapped: 93724672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562192384 unmapped: 93724672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562200576 unmapped: 93716480 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562200576 unmapped: 93716480 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562200576 unmapped: 93716480 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562200576 unmapped: 93716480 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562200576 unmapped: 93716480 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562208768 unmapped: 93708288 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562208768 unmapped: 93708288 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562208768 unmapped: 93708288 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562208768 unmapped: 93708288 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562216960 unmapped: 93700096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562216960 unmapped: 93700096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562216960 unmapped: 93700096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562216960 unmapped: 93700096 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562225152 unmapped: 93691904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562225152 unmapped: 93691904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562225152 unmapped: 93691904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562225152 unmapped: 93691904 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 79K writes, 333K keys, 79K commit groups, 1.0 writes per commit group, ingest: 0.34 GB, 0.04 MB/s#012Cumulative WAL: 79K writes, 27K syncs, 2.89 writes per sync, written: 0.34 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 664 writes, 1020 keys, 664 commit groups, 1.0 writes per commit group, ingest: 0.33 MB, 0.00 MB/s#012Interval WAL: 664 writes, 328 syncs, 2.02 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562249728 unmapped: 93667328 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562249728 unmapped: 93667328 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562249728 unmapped: 93667328 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562249728 unmapped: 93667328 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562257920 unmapped: 93659136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562257920 unmapped: 93659136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562257920 unmapped: 93659136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562257920 unmapped: 93659136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562257920 unmapped: 93659136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562257920 unmapped: 93659136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562257920 unmapped: 93659136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562257920 unmapped: 93659136 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562266112 unmapped: 93650944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562266112 unmapped: 93650944 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 93642752 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 93642752 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 31 04:36:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/850169218' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562282496 unmapped: 93634560 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562290688 unmapped: 93626368 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562290688 unmapped: 93626368 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562290688 unmapped: 93626368 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562290688 unmapped: 93626368 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562290688 unmapped: 93626368 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562290688 unmapped: 93626368 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562298880 unmapped: 93618176 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562298880 unmapped: 93618176 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562307072 unmapped: 93609984 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562307072 unmapped: 93609984 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562307072 unmapped: 93609984 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562307072 unmapped: 93609984 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562315264 unmapped: 93601792 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562323456 unmapped: 93593600 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562323456 unmapped: 93593600 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562323456 unmapped: 93593600 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562323456 unmapped: 93593600 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562323456 unmapped: 93593600 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562323456 unmapped: 93593600 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562323456 unmapped: 93593600 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562323456 unmapped: 93593600 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 602.856994629s of 604.210449219s, submitted: 336
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562339840 unmapped: 93577216 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562339840 unmapped: 93577216 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562397184 unmapped: 93519872 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562462720 unmapped: 93454336 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562470912 unmapped: 93446144 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562479104 unmapped: 93437952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562479104 unmapped: 93437952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562479104 unmapped: 93437952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562479104 unmapped: 93437952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562479104 unmapped: 93437952 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562487296 unmapped: 93429760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562487296 unmapped: 93429760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562487296 unmapped: 93429760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562487296 unmapped: 93429760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562487296 unmapped: 93429760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562487296 unmapped: 93429760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562487296 unmapped: 93429760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562487296 unmapped: 93429760 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562495488 unmapped: 93421568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562495488 unmapped: 93421568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562495488 unmapped: 93421568 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562503680 unmapped: 93413376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562503680 unmapped: 93413376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562503680 unmapped: 93413376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562503680 unmapped: 93413376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562503680 unmapped: 93413376 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562511872 unmapped: 93405184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562511872 unmapped: 93405184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562511872 unmapped: 93405184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562511872 unmapped: 93405184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562511872 unmapped: 93405184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562511872 unmapped: 93405184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562511872 unmapped: 93405184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562511872 unmapped: 93405184 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562520064 unmapped: 93396992 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562520064 unmapped: 93396992 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562520064 unmapped: 93396992 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562528256 unmapped: 93388800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562528256 unmapped: 93388800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562528256 unmapped: 93388800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562528256 unmapped: 93388800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562528256 unmapped: 93388800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562528256 unmapped: 93388800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562528256 unmapped: 93388800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562528256 unmapped: 93388800 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562544640 unmapped: 93372416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562544640 unmapped: 93372416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562544640 unmapped: 93372416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562544640 unmapped: 93372416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562544640 unmapped: 93372416 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562552832 unmapped: 93364224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562552832 unmapped: 93364224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562552832 unmapped: 93364224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562552832 unmapped: 93364224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562552832 unmapped: 93364224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562552832 unmapped: 93364224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562552832 unmapped: 93364224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562552832 unmapped: 93364224 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562561024 unmapped: 93356032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562561024 unmapped: 93356032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562561024 unmapped: 93356032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562561024 unmapped: 93356032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562561024 unmapped: 93356032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562561024 unmapped: 93356032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562561024 unmapped: 93356032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562561024 unmapped: 93356032 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562569216 unmapped: 93347840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562569216 unmapped: 93347840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562569216 unmapped: 93347840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562569216 unmapped: 93347840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562569216 unmapped: 93347840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562569216 unmapped: 93347840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562569216 unmapped: 93347840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562569216 unmapped: 93347840 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562577408 unmapped: 93339648 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562585600 unmapped: 93331456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562585600 unmapped: 93331456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562585600 unmapped: 93331456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562585600 unmapped: 93331456 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562593792 unmapped: 93323264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562593792 unmapped: 93323264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562593792 unmapped: 93323264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562593792 unmapped: 93323264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562593792 unmapped: 93323264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562593792 unmapped: 93323264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562593792 unmapped: 93323264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562593792 unmapped: 93323264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562593792 unmapped: 93323264 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562601984 unmapped: 93315072 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562601984 unmapped: 93315072 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562610176 unmapped: 93306880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562610176 unmapped: 93306880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562610176 unmapped: 93306880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562610176 unmapped: 93306880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562610176 unmapped: 93306880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562610176 unmapped: 93306880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562610176 unmapped: 93306880 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562618368 unmapped: 93298688 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562626560 unmapped: 93290496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562626560 unmapped: 93290496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562626560 unmapped: 93290496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562626560 unmapped: 93290496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562626560 unmapped: 93290496 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562634752 unmapped: 93282304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562634752 unmapped: 93282304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562634752 unmapped: 93282304 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562642944 unmapped: 93274112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562642944 unmapped: 93274112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562642944 unmapped: 93274112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562642944 unmapped: 93274112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562642944 unmapped: 93274112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562642944 unmapped: 93274112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562642944 unmapped: 93274112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562642944 unmapped: 93274112 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562659328 unmapped: 93257728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562659328 unmapped: 93257728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562659328 unmapped: 93257728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562659328 unmapped: 93257728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562659328 unmapped: 93257728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562659328 unmapped: 93257728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562659328 unmapped: 93257728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562659328 unmapped: 93257728 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562667520 unmapped: 93249536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562667520 unmapped: 93249536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562667520 unmapped: 93249536 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562675712 unmapped: 93241344 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562675712 unmapped: 93241344 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562675712 unmapped: 93241344 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562683904 unmapped: 93233152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562683904 unmapped: 93233152 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 93216768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 93216768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 93216768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 93216768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 93216768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 93216768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 93216768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 93216768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 93216768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 93216768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562700288 unmapped: 93216768 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562708480 unmapped: 93208576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562708480 unmapped: 93208576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562708480 unmapped: 93208576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562708480 unmapped: 93208576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562708480 unmapped: 93208576 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562716672 unmapped: 93200384 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562724864 unmapped: 93192192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562724864 unmapped: 93192192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562724864 unmapped: 93192192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562724864 unmapped: 93192192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562724864 unmapped: 93192192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562724864 unmapped: 93192192 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562733056 unmapped: 93184000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562733056 unmapped: 93184000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562733056 unmapped: 93184000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562733056 unmapped: 93184000 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562749440 unmapped: 93167616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562749440 unmapped: 93167616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562749440 unmapped: 93167616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562749440 unmapped: 93167616 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562757632 unmapped: 93159424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562757632 unmapped: 93159424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562757632 unmapped: 93159424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562757632 unmapped: 93159424 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: do_command 'config diff' '{prefix=config diff}'
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: do_command 'config show' '{prefix=config show}'
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: do_command 'counter dump' '{prefix=counter dump}'
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: do_command 'counter schema' '{prefix=counter schema}'
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 562929664 unmapped: 92987392 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: osd.2 434 heartbeat osd_stat(store_statfs(0x19680b000/0x0/0x1bfc00000, data 0x13fc8ce/0x1633000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x27dbf9c6), peers [0,1] op hist [])
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: prioritycache tune_memory target: 4294967296 mapped: 563216384 unmapped: 92700672 heap: 655917056 old mem: 2845415832 new mem: 2845415832
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: bluestore.MempoolThread(0x55e60fe07b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513062 data_alloc: 218103808 data_used: 1712128
Jan 31 04:36:37 np0005603623 ceph-osd[79732]: do_command 'log dump' '{prefix=log dump}'
Jan 31 04:36:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:37.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 31 04:36:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2621335609' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 31 04:36:37 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:37 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:37 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:37.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:37 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 31 04:36:37 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1767978105' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 31 04:36:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:38 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Jan 31 04:36:38 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1274631869' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 31 04:36:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Jan 31 04:36:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1053508630' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 31 04:36:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:36:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:39.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:36:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Jan 31 04:36:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/111715297' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 31 04:36:39 np0005603623 nova_compute[226235]: 2026-01-31 09:36:39.663 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:39 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:39 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 04:36:39 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:39.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 04:36:39 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Jan 31 04:36:39 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3840994702' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1956805902' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/164996097' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1525848609' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2296246173' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1055284830' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2103828846' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 31 04:36:40 np0005603623 nova_compute[226235]: 2026-01-31 09:36:40.877 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3332195484' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Jan 31 04:36:40 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2252184208' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 31 04:36:40 np0005603623 systemd[1]: Starting Hostname Service...
Jan 31 04:36:41 np0005603623 systemd[1]: Started Hostname Service.
Jan 31 04:36:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Jan 31 04:36:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3019928767' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 31 04:36:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Jan 31 04:36:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/294626957' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 31 04:36:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:41.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Jan 31 04:36:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/952797843' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 31 04:36:41 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Jan 31 04:36:41 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/106328942' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 31 04:36:41 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:41 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:41 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:41.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Jan 31 04:36:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2991664313' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 31 04:36:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Jan 31 04:36:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3512499561' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 31 04:36:42 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Jan 31 04:36:42 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/152863261' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:43.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #223. Immutable memtables: 0.
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.571057) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:856] [default] [JOB 143] Flushing memtable with next log file: 223
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852203571170, "job": 143, "event": "flush_started", "num_memtables": 1, "num_entries": 812, "num_deletes": 250, "total_data_size": 1274847, "memory_usage": 1297624, "flush_reason": "Manual Compaction"}
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:885] [default] [JOB 143] Level-0 flush table #224: started
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852203578374, "cf_name": "default", "job": 143, "event": "table_file_creation", "file_number": 224, "file_size": 626441, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 107166, "largest_seqno": 107973, "table_properties": {"data_size": 622473, "index_size": 1491, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 12178, "raw_average_key_size": 22, "raw_value_size": 613722, "raw_average_value_size": 1136, "num_data_blocks": 63, "num_entries": 540, "num_filter_entries": 540, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769852164, "oldest_key_time": 1769852164, "file_creation_time": 1769852203, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 224, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 143] Flush lasted 7400 microseconds, and 3166 cpu microseconds.
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.578447) [db/flush_job.cc:967] [default] [JOB 143] Level-0 flush table #224: 626441 bytes OK
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.578507) [db/memtable_list.cc:519] [default] Level-0 commit table #224 started
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.581438) [db/memtable_list.cc:722] [default] Level-0 commit table #224: memtable #1 done
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.581506) EVENT_LOG_v1 {"time_micros": 1769852203581493, "job": 143, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.581542) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 143] Try to delete WAL files size 1270137, prev total WAL file size 1270137, number of live WAL files 2.
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000220.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.582275) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373630' seq:72057594037927935, type:22 .. '6D6772737461740034303131' seq:0, type:0; will stop at (end)
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 144] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 143 Base level 0, inputs: [224(611KB)], [222(15MB)]
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852203582359, "job": 144, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [224], "files_L6": [222], "score": -1, "input_data_size": 16829569, "oldest_snapshot_seqno": -1}
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 144] Generated table #225: 12667 keys, 13220274 bytes, temperature: kUnknown
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852203712418, "cf_name": "default", "job": 144, "event": "table_file_creation", "file_number": 225, "file_size": 13220274, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13143680, "index_size": 43841, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31685, "raw_key_size": 336712, "raw_average_key_size": 26, "raw_value_size": 12927814, "raw_average_value_size": 1020, "num_data_blocks": 1645, "num_entries": 12667, "num_filter_entries": 12667, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844145, "oldest_key_time": 0, "file_creation_time": 1769852203, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "bff4eee8-7924-415a-b54a-245461f42486", "db_session_id": "GEDCETBIRNCTO5ATJU9U", "orig_file_number": 225, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.712764) [db/compaction/compaction_job.cc:1663] [default] [JOB 144] Compacted 1@0 + 1@6 files to L6 => 13220274 bytes
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.716909) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 129.3 rd, 101.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 15.5 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(48.0) write-amplify(21.1) OK, records in: 13161, records dropped: 494 output_compression: NoCompression
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.716951) EVENT_LOG_v1 {"time_micros": 1769852203716935, "job": 144, "event": "compaction_finished", "compaction_time_micros": 130205, "compaction_time_cpu_micros": 26587, "output_level": 6, "num_output_files": 1, "total_output_size": 13220274, "num_input_records": 13161, "num_output_records": 12667, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000224.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852203717182, "job": 144, "event": "table_file_deletion", "file_number": 224}
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000222.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852203718391, "job": 144, "event": "table_file_deletion", "file_number": 222}
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.582166) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.718438) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.718443) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.718446) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.718448) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: rocksdb: (Original Log Time 2026/01/31-09:36:43.718450) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2312663840' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 31 04:36:43 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:43 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:43 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:43.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 31 04:36:43 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 31 04:36:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0) v1
Jan 31 04:36:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/830429409' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 31 04:36:44 np0005603623 nova_compute[226235]: 2026-01-31 09:36:44.667 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:44 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 31 04:36:44 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/277477150' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 31 04:36:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 31 04:36:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 31 04:36:45 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Jan 31 04:36:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2516485541' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 31 04:36:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 31 04:36:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 31 04:36:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:45.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 31 04:36:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 31 04:36:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 31 04:36:45 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 31 04:36:45 np0005603623 radosgw[83781]: ====== starting new request req=0x7f2c8f8d56f0 =====
Jan 31 04:36:45 np0005603623 radosgw[83781]: ====== req done req=0x7f2c8f8d56f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 04:36:45 np0005603623 radosgw[83781]: beast: 0x7f2c8f8d56f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:45.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 04:36:45 np0005603623 nova_compute[226235]: 2026-01-31 09:36:45.887 226239 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:36:46 np0005603623 ceph-mon[77037]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0) v1
Jan 31 04:36:46 np0005603623 ceph-mon[77037]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2009577670' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
